M. Sanfourche, J. Delaune, G. Le Besnerais, H. de Plinval, J. Israel, Ph. Cornic,
A. Treil, Y. Watanabe, A. Plyer
Nowadays, cameras and other exteroceptive sensors are carried on board a wide variety of automated platforms, such as Unmanned Aerial Vehicles (UAVs), space exploration probes and missiles. However, apart from this last application, they mostly serve as payload sensors rather than to pilot the vehicle itself. In this paper, we focus on the use of computer vision for UAV perception, both to navigate through the environment and to model it. This capability is typically needed at low altitude, in unknown or GPS-denied conditions.
The measurements from exteroceptive sensors can then be processed to obtain information about the motion of the UAV or the 3D structure of the environment. Our contribution is presented starting with the vision-based closed control loop, where image-based navigation is integrated into UAV control. We then focus on ego-motion estimation techniques, such as mapless relative terrain navigation and map-based alternatives to GPS. Finally, environment mapping solutions are proposed. In most cases, validation is performed on real image sequences acquired from an aircraft or a hand-held sensor. Our research underlines the need for new co-designed 3D sensors and massively parallel computation technologies to push vision-based UAV navigation further.