Vision-aided Inertial Navigation System Design for Indoor Quadrotors

Author: Lianfeng Hou
Publisher:
Total Pages: 97
Release: 2015
Genre: Global Positioning System
ISBN:



The navigation task for unmanned aerial vehicles (UAVs), such as quadrotors, becomes challenging in an indoor environment, as the global positioning system (GPS) and the magnetometer may provide inaccurate aiding measurements and the signals may get jammed. The navigation system design in this thesis integrates a visual navigation block with an inertial navigation system block, adding aiding measurements for indoor navigation. The direct visual measurements are feature coordinates obtained from images taken by an onboard monocular camera at different positions in 3D world space. Scaled relative pose measurements are generated through the vision algorithm implementations presented in this thesis. The vehicle states are estimated using an extended Kalman filter (EKF) with inputs from a gyroscope and an accelerometer. The EKF sensor fusion process combines the inertial measurements and the visual aiding measurements to obtain an optimal estimate. This thesis provides two design results: one navigation system assumes that the 3D world feature coordinates are known, so that feature extraction is map-based. The other requires no prior knowledge of the feature locations and captures features using map-less vision algorithms with geometry constraints.
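The predict-then-correct EKF cycle described above can be sketched in miniature. The following is a hypothetical one-dimensional example (the state layout, noise parameters, and step sizes are invented for illustration and are not taken from the thesis): accelerometer readings propagate a position/velocity state, and a vision-derived position fix corrects it.

```python
import numpy as np

def ekf_predict(x, P, accel, dt, q=1e-3):
    """Propagate a [position, velocity] state with an accelerometer input."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    u = np.array([0.5 * accel * dt ** 2, accel * dt])
    Q = q * np.array([[dt ** 3 / 3, dt ** 2 / 2], [dt ** 2 / 2, dt]])
    return F @ x + u, F @ P @ F.T + Q

def ekf_update(x, P, z_pos, r=1e-2):
    """Correct the state with a vision-derived position measurement."""
    H = np.array([[1.0, 0.0]])           # vision observes position only
    S = H @ P @ H.T + r                  # innovation covariance (1x1)
    K = P @ H.T / S                      # Kalman gain
    x = x + (K * (z_pos - H @ x)).ravel()
    return x, (np.eye(2) - K @ H) @ P

x, P = np.zeros(2), np.eye(2)
for _ in range(10):                      # 1 s of constant 1 m/s^2 acceleration
    x, P = ekf_predict(x, P, accel=1.0, dt=0.1)
x, P = ekf_update(x, P, z_pos=0.5)       # visual position fix: 0.5 m
```

In the thesis's full design the state includes attitude and the measurement model is the camera projection, but the predict/update structure is the same.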

Vision-Aided Inertial Navigation

Author: Chiara Troiani
Publisher:
Total Pages: 0
Release: 2014
Genre:
ISBN:



Accurate egomotion estimation is of utmost importance for any navigation system. Nowadays, different sensors are adopted to localize and navigate in unknown environments, such as GPS, range sensors, cameras, magnetic field sensors, and inertial sensors (IMUs). To obtain a robust egomotion estimate, the information from multiple sensors is fused. Despite the improvements of technology in providing more accurate sensors, and the efforts of the mobile robotics community in developing better-performing navigation algorithms, open challenges remain. Furthermore, the growing interest of the robotics community in micro robots and swarms of robots pushes towards low-weight, low-cost sensors and algorithms of low computational complexity. In this context, inertial sensors and monocular cameras, thanks to their complementary characteristics, low weight, low cost, and widespread use, represent an interesting sensor suite. This dissertation contributes to the framework of vision-aided inertial navigation and tackles the problems of data association and pose estimation, aiming for low-computational-complexity algorithms applied to MAVs. Regarding data association, a novel method to estimate the relative motion between two consecutive camera views is proposed. It requires only the observation of a single feature in the scene and the knowledge of the angular rates from an IMU, under the assumption that the local camera motion lies in a plane perpendicular to the gravity vector. Two very efficient algorithms to remove outliers from the feature-matching process are provided under the above motion assumption. To generalize the approach to 6DoF motion, two feature correspondences and gyroscopic data from IMU measurements are necessary; in this case, two algorithms are provided to remove wrong data associations in the feature-matching process. In the case of a monocular camera mounted on a quadrotor vehicle, motion priors from the IMU are used to discard wrong estimations.
Regarding pose estimation, this thesis provides a closed-form solution that gives the system pose from three natural features observed in a single camera image, once the roll and pitch angles are obtained from the inertial measurements under the planar-ground assumption. To tackle pose estimation in dark or featureless environments, a system equipped with a monocular camera, inertial sensors, and a laser pointer is considered. The system moves in the surroundings of a planar surface, and the laser pointer produces a laser spot on that surface. The laser spot is observed by the monocular camera and is the only point feature considered. Through an observability analysis, it is demonstrated that the physical quantities that can be determined from the measurements provided by this sensor suite during a short time interval are: the distance of the system from the planar surface; the component of the system velocity orthogonal to the planar surface; the relative orientation of the system with respect to the planar surface; and the orientation of the planar surface with respect to gravity. A simple recursive method to estimate all of these observable quantities is provided. All contributions of this thesis are validated through experimental results using both simulated and real data. Thanks to their low computational complexity, the proposed algorithms are well suited to real-time implementation on systems with limited on-board computation resources. The considered sensor suite is mounted on a quadrotor vehicle, but the contributions of this dissertation can be applied to any mobile device.
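The two-correspondence idea can be illustrated with a small sketch (the conventions and synthetic data below are my own, not the dissertation's implementation): with the inter-frame rotation R taken from integrated gyro rates, two bearing correspondences determine the translation direction, and any further match can be screened against the epipolar constraint x2ᵀ[t]ₓR x1 = 0.

```python
import numpy as np

def skew(v):
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def translation_from_two(R, x1a, x2a, x1b, x2b):
    # Each match constrains t: t . (x2 x (R @ x1)) = 0, so t is the
    # common normal of the two constraint vectors.
    na = np.cross(x2a, R @ x1a)
    nb = np.cross(x2b, R @ x1b)
    t = np.cross(na, nb)
    return t / np.linalg.norm(t)

def epipolar_residual(R, t, x1, x2):
    # |x2^T [t]_x R x1|: near zero for a correct correspondence.
    return abs(x2 @ skew(t) @ R @ x1)

# Synthetic scene: identity rotation, small translation along x.
R = np.eye(3)
t_true = np.array([1.0, 0.0, 0.0])
pts = [np.array([0.1, 0.2, 1.0]),
       np.array([-0.3, 0.1, 1.0]),
       np.array([0.2, -0.1, 1.0])]
x1 = [p / np.linalg.norm(p) for p in pts]
x2 = [(p - 0.1 * t_true) / np.linalg.norm(p - 0.1 * t_true) for p in pts]

t_est = translation_from_two(R, x1[0], x2[0], x1[1], x2[1])
res_inlier = epipolar_residual(R, t_est, x1[2], x2[2])
bad = np.array([0.7, 0.7, 0.14])                 # a deliberately wrong match
res_outlier = epipolar_residual(R, t_est, x1[2], bad / np.linalg.norm(bad))
```

A correct third correspondence gives a near-zero residual, while the wrong match produces a large one, which is the basis of the outlier-rejection step.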

Aided Navigation: GPS with High Rate Sensors

Author: Jay A. Farrell
Publisher: McGraw Hill Professional
Total Pages: 554
Release: 2008-04-03
Genre: Technology & Engineering
ISBN: 0071642668



Design Cutting-Edge Aided Navigation Systems for Advanced Commercial & Military Applications. Aided Navigation is a design-oriented textbook and guide to building aided navigation systems for smart cars, precision farming vehicles, smart weapons, unmanned aircraft, mobile robots, and other advanced applications. The guide contains two parts, covering the essential theory, concepts, and tools, as well as the methodology in aided navigation case studies with sufficient detail to serve as the basis for application-oriented analysis and design. Filled with detailed illustrations and examples, this expert design tool takes you step by step through coordinate systems, deterministic and stochastic modeling, optimal estimation, and navigation system design. Authoritative and comprehensive, Aided Navigation features:

• End-of-chapter exercises throughout Part I
• In-depth case studies of aided navigation systems
• Numerous Matlab-based examples
• Appendices that define notation, review linear algebra, and discuss GPS receiver interfacing
• Source code and sensor data to support the examples, available through the publisher-supported website

Inside this Complete Guide to Designing Aided Navigation Systems:

• Aided Navigation Theory: Introduction to Aided Navigation • Coordinate Systems • Deterministic Modeling • Stochastic Modeling • Optimal Estimation • Navigation System Design
• Navigation Case Studies: Global Positioning System (GPS) • GPS-Aided Encoder • Attitude and Heading Reference System • GPS-Aided Inertial Navigation System (INS) • Acoustic Ranging and Doppler-Aided INS

Inertial Navigation Aided by Simultaneous Localization and Mapping

Author: V. Sazdovski
Publisher:
Total Pages:
Release: 2012
Genre:
ISBN:



Unmanned aerial vehicle technologies are getting smaller and cheaper to use, and the challenges of payload limitation in unmanned aerial vehicles are being overcome. Integrated navigation system design requires selecting a set of sensors and computation power that provides reliable and accurate navigation parameters (position, velocity, and attitude) with high update rates and bandwidth in a small and cost-effective manner. Many of today's operational unmanned aerial vehicle navigation systems rely on inertial sensors as the primary measurement source. Inertial Navigation alone, however, suffers from slow divergence over time. This divergence is often compensated for by employing some additional source of navigation information external to the Inertial Navigation. From the 1990s to the present day, the Global Positioning System has been the dominant navigation aid for Inertial Navigation. In a number of scenarios, however, Global Positioning System measurements may be completely unavailable, or they may simply not be precise (or reliable) enough to adequately update the Inertial Navigation, hence alternative methods have seen great attention. Aiding Inertial Navigation with vision sensors has been the favoured solution over the past several years. Inertial and vision sensors, with their complementary characteristics, have the potential to meet the requirements for reliable and accurate navigation parameters. In this thesis we address Inertial Navigation position divergence. The information for updating the position comes from a combination of vision and motion. When using such a combination, many of the difficulties of vision sensors (relative depth, geometry and size of objects, image blur, etc.) can be circumvented. Motion provides the vision sensors with many cues that help to better acquire information about the environment, for instance creating a precise map of the environment and localizing within it.
We propose changes to the Simultaneous Localization and Mapping augmented state vector in order to take repeated measurements of the map point. We show that these repeated measurements, combined with certain manoeuvres (motion) around or by the map point, are crucial for constraining the Inertial Navigation position divergence (bounded estimation error) while manoeuvring in the vicinity of the map point. This eliminates some of the uncertainty of the map point estimates, i.e., it reduces the covariance of the map point estimates. This concept brings a different parameterization (feature initialisation) of the map points in Simultaneous Localization and Mapping, and we refer to it as the concept of aiding Inertial Navigation by Simultaneous Localization and Mapping. We show that building such an integrated navigation system requires coordination with the guidance and control measurements and with the vehicle task itself in order to perform the required vehicle manoeuvres (motion) and achieve better navigation accuracy. This fact brings new challenges to the practical design of these modern jam-proof, Global Positioning System-free autonomous navigation systems. Further to the concept of aiding Inertial Navigation by Simultaneous Localization and Mapping, we have investigated how a bearing-only sensor such as a single camera can be used for aiding Inertial Navigation. The results of the concept of Inertial Navigation aided by Simultaneous Localization and Mapping were used, and a new parameterization of the map point in Bearing Only Simultaneous Localization and Mapping is proposed. Because of the number of significant problems that appear when implementing the Extended Kalman Filter in Inertial Navigation aided by Bearing Only Simultaneous Localization and Mapping, other algorithms such as the Iterated Extended Kalman Filter, the Unscented Kalman Filter and Particle Filters were implemented.
From the results obtained, the conclusion can be drawn that nonlinear filters should be the estimators of choice for this application.
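A toy one-dimensional version of this idea (entirely illustrative; the thesis works in full 3D with an INS error state) shows how repeated measurements of a map point in the augmented state vector shrink its covariance:

```python
import numpy as np

# State: [vehicle position p, map point m]; measurement z = m - p.
x = np.array([0.0, 5.0])
P = np.diag([0.1, 4.0])           # the map point starts highly uncertain
H = np.array([[-1.0, 1.0]])
R = 0.05                          # measurement noise variance

for z in [5.1, 4.9, 5.05, 4.95]:  # repeated sightings of the same point
    S = H @ P @ H.T + R           # innovation covariance (1x1)
    K = P @ H.T / S               # Kalman gain
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P

map_var = P[1, 1]                 # shrinks with every repeated measurement
```

Each update also builds correlation between the vehicle and map-point states, which is what lets the map point constrain the vehicle's position error in turn.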

Pedestrian Inertial Navigation with Self-Contained Aiding

Author: Andrei M. Shkel
Publisher: John Wiley & Sons
Total Pages: 194
Release: 2021-08-10
Genre: Technology & Engineering
ISBN: 1119699894



Explore an insightful summary of the major self-contained aiding technologies for pedestrian navigation from established and emerging leaders in the field. Pedestrian Inertial Navigation with Self-Contained Aiding delivers a comprehensive and broad treatment of self-contained aiding techniques in pedestrian inertial navigation. The book combines an introduction to the general concept of navigation and major navigation and aiding techniques with more specific discussions of topics central to the field, as well as an exploration of the future of the field: Ultimate Navigation Chip (uNavChip) technology. The most commonly used implementation of pedestrian inertial navigation, strapdown inertial navigation, is discussed at length, as are the mechanization, implementation, error analysis, and adaptivity of zero-velocity update aided inertial navigation algorithms. The book demonstrates the implementation of ultrasonic sensors, ultra-wideband (UWB) sensors, and magnetic sensors. Ranging techniques are considered as well, including both foot-to-foot ranging and inter-agent ranging, and learning algorithms, navigation with signals of opportunity, and cooperative localization are discussed.
Readers will also benefit from the inclusion of: A thorough introduction to the general concept of navigation as well as major navigation and aiding techniques An exploration of inertial navigation implementation, Inertial Measurement Units, and strapdown inertial navigation A discussion of error analysis in strapdown inertial navigation, as well as the motivation of aiding techniques for pedestrian inertial navigation A treatment of the zero-velocity update (ZUPT) aided inertial navigation algorithm, including its mechanization, implementation, error analysis, and adaptivity Perfect for students and researchers in the field who seek a broad understanding of the subject, Pedestrian Inertial Navigation with Self-Contained Aiding will also earn a place in the libraries of industrial researchers and industrial marketing analysts who need a self-contained summary of the foundational elements of the field.
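The zero-velocity update (ZUPT) aiding discussed above rests on detecting the stance phase of each step; a common sketch of such a detector flags a window as stationary when the accelerometer magnitude barely varies. The window length and threshold below are illustrative choices, not values from the book.

```python
import numpy as np

def zupt_detect(accel_norms, window=5, var_threshold=0.03):
    """Flag samples as stationary when a sliding window of accelerometer
    magnitudes has variance below the threshold."""
    flags = np.zeros(len(accel_norms), dtype=bool)
    for i in range(len(accel_norms) - window + 1):
        if np.var(accel_norms[i:i + window]) < var_threshold:
            flags[i:i + window] = True
    return flags

g = 9.81
rng = np.random.default_rng(0)
stance = g + 0.01 * rng.standard_normal(20)          # foot on the ground
swing = g + 3.0 * np.sin(np.linspace(0.0, 6.0, 20))  # foot in motion
flags = zupt_detect(np.concatenate([stance, swing]))
```

During flagged intervals, a filter can apply a pseudo-measurement of zero velocity, which is what bounds the velocity error between steps.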

Vision-aided Navigation for Autonomous Vehicles Using Tracked Feature Points

Author: Ahmed Saber Soliman Sayem
Publisher:
Total Pages: 164
Release: 2016
Genre: Aids to navigation
ISBN:



This thesis discusses the evaluation, implementation, and testing of several navigation and feature-extraction algorithms using an inertial measurement unit (IMU) and an image-capture device (camera) mounted on a ground robot and a quadrotor UAV. The vision-aided navigation algorithms are run on data collected from sensors on an unmanned ground vehicle and a quadrotor, and the results are validated by comparison with GPS data. The thesis investigates sensor fusion techniques for integrating measured IMU data with information extracted by image processing algorithms in order to provide accurate vehicle state estimation. This image-based information takes the form of features, such as corners, that are tracked over multiple image frames. An extended Kalman filter (EKF) is implemented to fuse the vision and IMU data. The main goal of the work is to enable navigation of mobile robots in GPS-denied environments such as indoor environments, cluttered urban environments, or space environments such as asteroids, other planets, or the moon. The experimental results show that combining pose information extracted from IMU readings with pose information extracted from a vision-based algorithm solves both the drift problem that arises from using the IMU alone and the scale ambiguity that arises from using a monocular vision-based algorithm alone.
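Corner features of the kind tracked here are often scored with a Harris-style response; the minimal sketch below (window size and sensitivity k are conventional defaults, not values from the thesis) computes that response on a synthetic image and locates its strongest corner.

```python
import numpy as np

def harris_response(img, k=0.04):
    # Image gradients by central differences.
    Ix = np.gradient(img, axis=1)
    Iy = np.gradient(img, axis=0)

    # Structure-tensor entries smoothed with a 3x3 box filter.
    def box(a):
        p = np.pad(a, 1, mode='edge')
        h, w = a.shape
        return sum(p[i:i + h, j:j + w]
                   for i in range(3) for j in range(3)) / 9.0

    Sxx, Syy, Sxy = box(Ix * Ix), box(Iy * Iy), box(Ix * Iy)
    det = Sxx * Syy - Sxy ** 2
    trace = Sxx + Syy
    return det - k * trace ** 2       # large positive response at corners

img = np.zeros((20, 20))
img[8:, 8:] = 1.0                     # bright square with a corner at (8, 8)
resp = harris_response(img)
peak = np.unravel_index(np.argmax(resp), resp.shape)
```

Points with a high response are then tracked frame to frame and fed to the EKF as measurements.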

Stochastic Constraints for Vision-aided Inertial Navigation

Author: David D. Diel
Publisher:
Total Pages: 110
Release: 2005
Genre:
ISBN:



This thesis describes a new method to improve inertial navigation using feature-based constraints from one or more video cameras. The proposed method lengthens the period of time during which a human or vehicle can navigate in GPS-deprived environments. Our approach integrates well with existing navigation systems, because we invoke general sensor models that represent a wide range of available hardware. The inertial model includes errors in bias, scale, and random walk. Any camera and tracking algorithm may be used, as long as the visual output can be expressed as ray vectors extending from known locations on the sensor body. A modified linear Kalman filter performs the data fusion. Unlike traditional Simultaneous Localization and Mapping (SLAM/CML), our state vector contains only inertial sensor errors related to position. This choice allows uncertainty to be properly represented by a covariance matrix. We do not augment the state with feature coordinates. Instead, image data contributes stochastic epipolar constraints over a broad baseline in time and space, resulting in improved observability of the IMU error states. The constraints lead to a relative residual and associated relative covariance, defined partly by the state history. Navigation results are presented using high-quality synthetic data and real fisheye imagery.
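The inertial error model named above (bias, scale factor, and random walk) can be illustrated on a single simulated gyro channel; all numeric values below are invented for illustration and are not from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
n, dt = 1000, 0.01                 # 10 s of data at 100 Hz
true_rate = 0.2 * np.ones(n)       # rad/s, constant true turn rate

bias = 0.01                        # rad/s constant bias
scale = 1.002                      # scale-factor error
walk = 1e-4 * np.cumsum(rng.standard_normal(n))  # random-walk term

measured = scale * true_rate + bias + walk

# Integrating the corrupted rate shows the heading drift that external
# (e.g. visual) constraints are meant to bound.
angle_true = float(np.sum(true_rate) * dt)
angle_meas = float(np.sum(measured) * dt)
drift = angle_meas - angle_true
```

Even these small error terms accumulate into a noticeable angle error within seconds, which is why the stochastic epipolar constraints improve observability of the IMU error states.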