Current out-of-the-box navigation solutions lack the robustness and flexibility to leave a controlled laboratory environment and perform navigation with small aerial vehicles. Truly autonomous flight in general environments is not possible while relying on unrealistic assumptions such as uninterrupted GPS reception, a perfect communication link to a ground station for data processing and control, or pose measurements from an external motion capture system. Higher-level tasks, such as autonomous exploration, swarm operation and the planning of long trajectories, can only be tackled once these issues are solved.
Computer vision techniques are commonly used in research for real-time tracking and navigation. High-performing stereo-based systems have demonstrated successful operation on ground vehicles. However, stereo setups are unsuitable for navigation with small aerial vehicles: when the scene is viewed from a large distance, or from very close by (e.g., during landing), the stereo image pair effectively degenerates to a monocular image. We therefore focus our study on monocular-based methodologies.
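The degeneration at large distances follows from the standard stereo depth model: depth is z = f·b/d for focal length f (pixels), baseline b and disparity d, so depth uncertainty grows quadratically with distance, σ_z ≈ z²·σ_d/(f·b). The sketch below illustrates this with hypothetical numbers (the focal length, baseline and noise values are illustrative assumptions, not parameters from this work); with the small baseline that fits on a small aerial vehicle, the expected disparity at range quickly drops below the measurement noise.

```python
# Stereo depth z = f*b/d and its uncertainty sigma_z ~= z**2 * sigma_d / (f*b).
# All numbers below are illustrative assumptions, not values from the project.
f = 400.0        # focal length in pixels
b = 0.12         # stereo baseline in metres (small MAV-sized rig)
sigma_d = 0.5    # disparity measurement noise in pixels

for z in (2.0, 10.0, 50.0):
    d = f * b / z                        # expected disparity in pixels
    sigma_z = z**2 * sigma_d / (f * b)   # resulting depth standard deviation
    print(f"z = {z:5.1f} m   disparity = {d:6.2f} px   sigma_z = {sigma_z:7.2f} m")
```

At 50 m the expected disparity is below one pixel, i.e. smaller than the assumed disparity noise, so the stereo pair carries essentially no more depth information than a single camera.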
We have implemented a high-performance parallel tracking and mapping system to achieve monocular self-localisation and mapping on board a small aerial vehicle. Because of the long trajectories flown, the highly dynamic motion of the vehicle, and dynamic objects in the scene, we fuse the visual estimates with additional sensors: an Inertial Measurement Unit featuring accelerometers and gyroscopes, a magnetometer, air-pressure measurements, and GPS information where available.
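A common loosely-coupled way to combine such sensors is a Kalman filter that propagates the state at the IMU rate and corrects it with the slower visual pose estimates. The following is a deliberately minimal single-axis sketch of that pattern, not the actual ethzasl_sensor_fusion implementation; all rates, noise parameters and the constant-acceleration test motion are assumptions made for illustration.

```python
import numpy as np

# Minimal 1-axis loosely-coupled filter: propagate position/velocity with
# IMU acceleration, correct with slower visual position fixes.
# Simplified illustration only; parameters below are assumed, not tuned.

dt = 0.01                                  # assumed IMU rate: 100 Hz
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity transition
B = np.array([[0.5 * dt**2], [dt]])        # acceleration input matrix
Q = np.diag([1e-4, 1e-3])                  # process noise (illustrative)
H = np.array([[1.0, 0.0]])                 # vision observes position only
R = np.array([[1e-2]])                     # vision measurement noise

x = np.zeros((2, 1))                       # state: [position; velocity]
P = np.eye(2)                              # state covariance

def predict(x, P, accel):
    """IMU-driven propagation step."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z_pos):
    """Correction with a visual position fix."""
    y = np.array([[z_pos]]) - H @ x        # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulated flight: constant 1 m/s^2 for 1 s, vision fix every 10 IMU samples.
for k in range(100):
    x, P = predict(x, P, accel=1.0)
    if (k + 1) % 10 == 0:
        true_pos = 0.5 * 1.0 * ((k + 1) * dt)**2
        x, P = update(x, P, true_pos)
```

The real system fuses full 6-DoF pose with IMU biases and scale, but the structure is the same: high-rate inertial prediction, low-rate correction from the vision pipeline.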
The framework that we have developed is publicly available at
http://www.asl.ethz.ch/research/software. The packages
ethzasl_ptam,
ethzasl_sensor_fusion, and
asctec_mav_framework have been developed partly within the myCopter project.