
Extended sensor fusion for embedded video applications

Abstract: This thesis deals with sensor fusion between camera and inertial sensor measurements in order to provide a robust motion estimation algorithm for embedded video applications, targeting mainly smartphones and tablets. We present a real-time, online 2D camera motion estimation algorithm combining inertial and visual measurements. The proposed algorithm extends the preemptive RANSAC motion estimation procedure with inertial sensor data, introducing a dynamic Lagrangian hybrid scoring of the motion models that makes the approach adaptive to varying image and motion content. These improvements come at little computational cost, keeping the complexity of the algorithm low enough for embedded platforms. The approach is compared with purely inertial and purely visual procedures. A novel approach to real-time hybrid monocular visual-inertial odometry for embedded platforms is then introduced, maximizing the interaction between vision and inertial sensors by performing fusion at multiple levels of the algorithm. Through tests on sequences specifically acquired with ground-truth data, we show that our method outperforms classical hybrid techniques in ego-motion estimation.
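The abstract describes extending preemptive RANSAC with a hybrid score that blends a visual inlier count with a penalty against the inertial motion prediction. The following is a minimal sketch of that idea, not the thesis's actual implementation: the motion model is simplified to a 2D translation, the hypothetical function name `preemptive_ransac_hybrid` and the fixed weight `lam` (a stand-in for the dynamic Lagrangian weight described above) are assumptions for illustration.

```python
import math
import random

def preemptive_ransac_hybrid(correspondences, gyro_prediction,
                             n_hypotheses=16, block_size=10,
                             inlier_thresh=2.0, lam=0.5):
    """Preemptive RANSAC sketch with a hybrid visual/inertial score.

    correspondences : list of ((x, y), (x', y')) point matches.
    gyro_prediction : 2D translation predicted from inertial data.

    Each hypothesis is a 2D translation drawn from one correspondence.
    Hypotheses are scored on successive blocks of correspondences; after
    each block the lower-scoring half is discarded (the preemption step),
    so cost stays bounded regardless of the number of matches.
    """
    # Draw hypotheses: a 2D translation needs only one correspondence.
    hyps = []
    for _ in range(n_hypotheses):
        p, q = random.choice(correspondences)
        hyps.append(((q[0] - p[0], q[1] - p[1]), 0.0))  # (model, score)

    idx = 0
    while len(hyps) > 1 and idx < len(correspondences):
        block = correspondences[idx:idx + block_size]
        idx += block_size
        rescored = []
        for model, score in hyps:
            # Visual term: how many matches in this block the model explains.
            inliers = sum(
                1 for p, q in block
                if math.hypot(p[0] + model[0] - q[0],
                              p[1] + model[1] - q[1]) < inlier_thresh)
            # Inertial term: penalize disagreement with the gyro prediction.
            penalty = lam * math.hypot(model[0] - gyro_prediction[0],
                                       model[1] - gyro_prediction[1])
            rescored.append((model, score + inliers - penalty))
        # Preemption: keep only the better-scoring half of the hypotheses.
        rescored.sort(key=lambda ms: ms[1], reverse=True)
        hyps = rescored[:max(1, len(rescored) // 2)]
    return hyps[0][0]
```

In the thesis the penalty weight is adapted dynamically to the image and motion content; here it is fixed for brevity. The inertial term lets the estimator reject visually plausible but physically inconsistent models, which is what makes the scoring "hybrid".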
Submitted on : Thursday, March 31, 2016 - 12:31:25 PM
Last modification on : Wednesday, November 17, 2021 - 12:31:03 PM
Long-term archiving on: Friday, July 1, 2016 - 12:31:06 PM


Version validated by the jury (STAR)


  • HAL Id : tel-01295570, version 1


Manu Alibay. Extended sensor fusion for embedded video applications. Other. Ecole Nationale Supérieure des Mines de Paris, 2015. English. ⟨NNT : 2015ENMP0032⟩. ⟨tel-01295570⟩


