Pathfinder, a lightweight, small test rover, roams an ash heap in Point Marion, Pennsylvania, for research conducted by Cagri Kilic, a WVU postdoctoral researcher.
(Photo courtesy of Jonas Bredu)
West Virginia University scientists have devised a way for extraplanetary rovers to use non-visual information to maneuver over treacherous terrain. The research aims to prevent losses like that of the Mars exploration rover Spirit, which lost communications in 2010 after its wheels became mired in unseen drifting sands.
Cagri Kilic, a space robotics postdoctoral fellow in the Statler College of Engineering's Department of Mechanical and Aerospace Engineering, led the research into avoiding slips and trips in planetary rovers at the WVU Navigation Laboratory. The work is presented in a Field Robotics paper he co-authored with aerospace engineering professors Yu Gu and Jason Gross.
Aided by funding from NASA’s Established Program to Stimulate Competitive Research, Kilic, Gu and Gross found a way to help a rover move forward using only its existing sensors when visual data isn’t available or reliable.
Darkness and extreme light can make it difficult for a rover to rely on visual data for navigation, but Kilic’s work also focuses on helping the rover in situations where aspects of the physical terrain are difficult to discern visually: steep slopes, loose rubble, layers of different sands, soft soil or salt flats like those of Europa, a moon of Jupiter.
Many of these terrain features are found on the burnt coal ash heaps at Point Marion, Pennsylvania, where Kilic’s team is testing its software on WVU’s Pathfinder rover.
“The area was actually found when we were doing some testing for the Mars Society’s University Rover Challenge,” he said. “As soon as I saw the area, I wanted to look at the chemistry of the area because it looked like Mars.”
At Point Marion, Kilic’s team puts Pathfinder, a lightweight, small test rover, through its paces, testing algorithms that allow it to adjust things like its course or speed based on the information it gets from onboard instruments like accelerometers, gyroscopes, magnetometers and wheel odometers, rather than what it can see through its camera lens. These instruments inform Kilic’s software about orientation, speed and position, helping the rover and the engineers guiding it to understand and respond to the environment.
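The idea of estimating position from onboard instruments alone is classic dead reckoning. The following is a minimal sketch of that concept, not Kilic's actual software: it integrates a gyroscope yaw rate and a wheel-odometry speed to propagate a 2D pose, with no camera input. All names and values here are illustrative assumptions.

```python
import math

def dead_reckon(pose, gyro_yaw_rate, wheel_speed, dt):
    """Propagate a 2D pose (x, y, heading) from proprioceptive sensors
    only: a gyroscope yaw rate (rad/s) and wheel-odometry speed (m/s).
    No visual data is used. Illustrative sketch, not the paper's method."""
    x, y, heading = pose
    heading += gyro_yaw_rate * dt                 # integrate gyro for heading
    x += wheel_speed * math.cos(heading) * dt     # advance along heading
    y += wheel_speed * math.sin(heading) * dt
    return (x, y, heading)

# Drive straight for 10 s at 0.05 m/s (a rover-like crawl), no turning.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = dead_reckon(pose, gyro_yaw_rate=0.0, wheel_speed=0.05, dt=0.1)
print(pose)  # ≈ (0.5, 0.0, 0.0): about half a meter travelled along x
```

In practice such estimates drift over time, which is why the accuracy figures later in the article are quoted over a bounded distance.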
“Mars rovers can tell if there’s an obstacle in front of them,” Kilic said. “They can detect wheel slip with their cameras, they can tell if a wheel is spinning on a rock and so on. And they can adjust their navigation by altering their path, changing individual wheel speeds, or stopping to await orders from engineers on Earth.”
Kilic emphasized that when visual data is available, the rovers’ current visual navigation system is “almost perfect – 99% success rate. The problem is that it can only work if there are enough features in the environment.” The uniformity of a landscape poses problems for a rover when it relies on sight to get around.
According to Kilic, it is “homogeneous environments with low visual features, resembling deserts, oceans or tundra on our planet” that pose a problem for rovers not only on Mars but also on Earth’s moon and possibly Europa, whose ice has stimulated scientific speculation about habitability. Kilic said he was trying to make the technology “as general as possible for use in any robot on any alien body.”
Wherever a rover can go in our solar system, Kilic’s algorithms can help protect it from falling or getting stuck.
“Of course, the software has to be tuned to a specific rover by adapting to the wheel dimensions and the properties of the inertial measurement unit, but no additional sensors are needed,” he said.
Still, Kilic’s research specifically aims to benefit the rovers currently exploring Mars: Curiosity, Perseverance, and Zhurong. Mars is a priority for Kilic because “the Martian soil presents an exceptional challenge to traverse. Even during a single trip, Mars rovers traverse different terrains with different inclinations.”
To achieve this goal, Kilic will now conduct additional tests with different rovers. His method already has more than 92% slip detection accuracy for distances of around 150 meters and uses fewer computational resources than visual-based navigation, allowing rovers using Kilic’s software to travel faster and stop less often than if they relied on visual signals.
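Slip detection of the general kind described here can be illustrated by comparing how fast the wheels claim to be moving with how fast the rover body is actually moving (as estimated inertially). This is a simplified sketch under assumed names and an assumed threshold, not the detection algorithm from the paper:

```python
def slip_ratio(wheel_speed, body_speed):
    """Longitudinal slip: 0 means the wheels grip fully; values near 1
    mean the wheels are spinning in place. Speeds in m/s. Illustrative
    formulation, not the published method."""
    if wheel_speed <= 1e-6:          # stationary: define slip as zero
        return 0.0
    return max(0.0, (wheel_speed - body_speed) / wheel_speed)

SLIP_THRESHOLD = 0.3  # assumed tuning value, not from the research

def check_traction(wheel_speed, body_speed):
    """Flag slip when the ratio exceeds the (assumed) threshold."""
    s = slip_ratio(wheel_speed, body_speed)
    return ("slip detected" if s > SLIP_THRESHOLD else "ok", s)

print(check_traction(0.05, 0.048))  # wheels and body agree: ok, slip ~0.04
print(check_traction(0.05, 0.02))   # body lagging the wheels: slip detected
```

Because this check needs only arithmetic on sensor readings rather than image processing, it suggests why a proprioceptive approach can be lighter on computational resources than camera-based slip detection.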
Though the research still has some time to go, Kilic said the results so far “show us that we” – and the rovers – “are on the right track.”
-WVU-
mm/8/10/22
MEDIA CONTACT: Paige Nesbit
Marketing and Communications Director
Statler College of Engineering and Natural Resources
304-293-4135; [email protected]
Call 1-855-WVU-NEWS for the latest West Virginia University news and information from WVUToday.
Follow @WVUToday on Twitter.