Visual/Inertial Motion Estimation

Fundamental to the successful operation of unmanned aerial vehicles (UAVs) in service and inspection tasks is the capability to sense ego-motion and to compute a basic map of the environment in real time. These applications demand a high degree of robustness and accuracy, yet payload, power, and size constraints limit the choice of onboard sensors. We developed a framework that tightly couples visual and inertial cues to achieve the required robustness and accuracy.
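
To make the idea of tight coupling concrete, here is a minimal, self-contained sketch (not our implementation, and deliberately simplified to a planar state with a single known landmark) of how raw IMU readings and raw image-feature measurements can enter one EKF: the accelerometer propagates the state at high rate, and each pixel observation corrects it directly, instead of fusing a separately computed vision-only pose. All numbers (focal length, noise levels, landmark position) are illustrative assumptions.

```python
# Toy tightly coupled visual/inertial EKF sketch (illustrative only).
# State: [x, z, vx, vz] in a vertical plane; one known 2D landmark.
import numpy as np

dt = 0.005                        # 200 Hz IMU (assumed rate)
landmark = np.array([5.0, 2.0])   # known landmark position (assumption)
f = 400.0                         # focal length in pixels (assumption)

x = np.zeros(4)                   # state estimate
P = np.eye(4) * 0.1               # state covariance

def propagate(x, P, accel):
    """EKF prediction with a raw accelerometer reading (gravity removed)."""
    F = np.eye(4)
    F[0, 2] = F[1, 3] = dt        # constant-velocity kinematics
    x = F @ x
    x[2:] += accel * dt           # velocity update from the IMU sample
    Q = np.eye(4) * 1e-4          # process noise (tuning assumption)
    return x, F @ P @ F.T + Q

def visual_update(x, P, u_meas):
    """EKF update with a raw 1D pixel coordinate of the known landmark."""
    dx, dz = landmark - x[:2]
    u_pred = f * dz / dx                               # pinhole projection
    # Jacobian of u = f*dz/dx w.r.t. [x, z, vx, vz]
    H = np.array([[f * dz / dx**2, -f / dx, 0.0, 0.0]])
    R = np.array([[1.0]])                              # pixel noise (assumed)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ np.array([u_meas - u_pred])).ravel()
    return x, (np.eye(4) - K @ H) @ P

# Usage: propagate at IMU rate, correct whenever a tracked feature arrives.
x, P = propagate(x, P, accel=np.array([0.1, 0.0]))
x, P = visual_update(x, P, u_meas=162.0)   # pixel coordinate from a tracker
```

The actual system additionally estimates orientation, sensor biases, and the landmark positions themselves; the sketch only shows where the coupling between raw inertial and raw visual measurements happens.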

This page presents our efforts towards a general-purpose Simultaneous Localization and Mapping (SLAM) device, with many applications that go far beyond aerial vehicles and other mobile robotic systems. Please contact us if you are interested in a collaboration: christoph.huerzeler@mavt.ethz.ch, janosch.nikolic@mavt.ethz.ch.

Visual/Inertial SLAM Sensor

Our new second-generation SLAM sensor (2012). The prototype integrates inertial sensors of different grades (all MEMS), global-shutter machine-vision cameras, a Xilinx Spartan-6 FPGA, and a dual-core Intel Atom processor. The FPGA performs sensor time-synchronization and visual feature detection, while the Atom runs the real-time visual/inertial SLAM and sensor fusion.

“General-purpose SLAM in a box” system developed to enable motion estimation and mapping in any GPS-denied (e.g. indoor) environment.
Sensor mounted on test platform.
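
The hardware time-synchronization performed on the FPGA matters for the fusion step: camera frames and IMU samples are stamped against a common clock, so the host can pick exactly the inertial readings that fall between two images before propagating the filter. The snippet below is an illustrative sketch of that bookkeeping on the host side; the timestamps, rates, and function name are assumptions, not the actual firmware interface.

```python
# Illustrative sketch of host-side use of hardware-synchronized timestamps.
from bisect import bisect_right

def imu_between_frames(imu_stamps, t_prev_frame, t_frame):
    """Return indices of IMU samples with t_prev_frame < t <= t_frame.

    imu_stamps must be sorted; all timestamps share the common FPGA clock.
    """
    lo = bisect_right(imu_stamps, t_prev_frame)
    hi = bisect_right(imu_stamps, t_frame)
    return range(lo, hi)

# Example: 200 Hz IMU, 20 Hz camera, all stamps in microseconds (assumed).
imu_stamps = [k * 5000 for k in range(100)]          # one IMU sample per 5 ms
idx = imu_between_frames(imu_stamps, 50000, 100000)  # one camera interval
print(list(idx))   # the ten IMU samples to integrate for this image
```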

Visual/Inertial Motion Estimation Prototype

The first-generation visual/inertial motion estimation system, used in the Narcea field tests in 2012. It has since been replaced by the second-generation SLAM sensor above.

First prototype of the visual/inertial motion estimation system.

Visual/inertial motion estimation performed in real time (Rainer Voigt):

First successful flight tests in 2011 (Michael Burri). Have a look at our latest flight experiments here: System Tests Narcea Power Plant.
