Our experiment involves a ground vehicle planning a path through unknown terrain. To explore the terrain, the ground vehicle tasks an air vehicle, which flies over the terrain and acquires a ground map. The ground map is then radioed back to the ground vehicle, and the path is modified accordingly.
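The plan-then-replan loop described above can be sketched in a few lines: plan an initial path on a map that assumes unknown terrain is free, mark obstacles when the aerial map arrives, and replan. This is only an illustrative sketch (grid, breadth-first planner, and obstacle coordinates are all made up here, not taken from the experiment):

```python
from collections import deque

def plan(grid, start, goal):
    """Breadth-first search over a 2-D occupancy grid.
    grid[r][c] == 1 means the cell is believed blocked."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:           # walk back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and nxt not in came_from:
                came_from[nxt] = cell
                frontier.append(nxt)
    return None  # no path in the current map

# Initial plan: terrain unknown, optimistically assumed free.
grid = [[0] * 5 for _ in range(5)]
initial = plan(grid, (0, 0), (4, 4))

# Aerial map arrives: mark newly discovered obstacles, then replan.
for r, c in [(2, 1), (2, 2), (2, 3), (2, 4)]:
    grid[r][c] = 1
revised = plan(grid, (0, 0), (4, 4))
```

The revised path routes around the obstacle row discovered by the air vehicle while the start and goal stay fixed.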
Results of the experiment performed at the DARPA-MARS PI meeting in Tyson's Corner, VA:
- The initial plan by the ground vehicle, used to task the air vehicle
- Video of the autonomous mapping flight
- Animation of the map building during this flight
- The final ground model (in VRML; viewer can be downloaded here)
- A 2-D map with the path planned by the ground vehicle
Segbot Ground Navigation at the Experiment:
- Video footage of the navigating Segbot
- Two Segbot navigation maps acquired from the ground map: map 1 and map 2
- 3-D map acquired by the Segbot from the ground: VRML model (get viewer here), a brief animation of the 3-D model, and a more detailed animation
An earlier practice run at Stanford University:
- Movie of the autonomous mapping flight
- The raw data (in VRML; viewer can be downloaded here)
- The final ground model with navigation analysis (VRML file)
- Animation of this mapping run
- Extracting the 2-D navigation grid map (animation)
- Helicopter on the ground and in the air
- The ground vehicle is a Segway RMP (see segbot.com): Image 1, Image 2, Image 3, Image 4, Image 5, Image 6, and Image 7
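The step of extracting a 2-D navigation grid map from 3-D terrain data, shown in the animation above, can be illustrated by collapsing 3-D points into grid cells and thresholding on height. This is a minimal sketch under assumed parameters (the function name, cell size, and height threshold are illustrative, not those used in the experiment):

```python
def to_grid(points, cell_size, max_height):
    """Collapse 3-D points (x, y, z) into a 2-D traversability grid:
    a cell is marked blocked (1) when any point falling in it rises
    above max_height, and free (0) otherwise."""
    heights = {}
    for x, y, z in points:
        key = (int(x // cell_size), int(y // cell_size))
        heights[key] = max(heights.get(key, float("-inf")), z)
    return {key: (1 if z > max_height else 0) for key, z in heights.items()}

# Toy terrain: one tall point at roughly (1.4, 0.1).
points = [(0.2, 0.3, 0.05), (1.4, 0.1, 0.9), (0.6, 1.8, 0.1)]
grid = to_grid(points, cell_size=1.0, max_height=0.5)
# cell (1, 0) contains the 0.9 m point and is marked blocked
```

A real pipeline would also handle empty cells (unobserved terrain) and sensor noise; the max-height rule here is the simplest traversability test.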
If the AVIs don't play...
There are copies with other codecs at Dirk Haehnel's home page.
Some background movies (by Andrew Ng):
- Stable hover with reinforcement learning
- Failure of conditional control
- Four-legged robot trained by reinforcement learning