Please use this identifier to cite or link to this item: http://hdl.handle.net/10603/10152
Title: Vision aided strapdown inertial navigation system for autonomous landing of unmanned aerial vehicles
Researcher: Anitha G
Guide(s): Shanmugam, J.
Keywords: Inertial navigation system, autonomous landing, Unmanned aerial vehicle, Strapdown Inertial Navigation System, Micro Electro-Mechanical Systems
Upload Date: 29-Jul-2013
University: Anna University
Completed Date: 
Abstract: An autonomous Unmanned Aerial Vehicle (UAV) must operate without human intervention, yet must meet the rigorous requirements associated with any airborne platform. To eliminate the need for human supervision, a UAV must be capable of carrying out at least three essential tasks autonomously: take-off, navigation and landing. The proposed work is therefore to design a vision aided Strapdown Inertial Navigation System (SDINS) for the autonomous landing of UAVs. The main objectives of this thesis are: to describe the theory underlying the SDINS and vision based navigation, in order to highlight the advantages and limitations of each system and establish the need for and significance of integrating them; to simulate the SDINS using quaternions and validate it with real-time data; and to validate the algorithm using the FlightGear simulation software. The body angular rates and accelerations of an aircraft are simulated from a six-degrees-of-freedom trajectory using a kinematic approach, and these simulated rates and accelerations are used in the error modelling of Micro Electro-Mechanical System (MEMS) inertial sensors.
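The quaternion-based strapdown attitude propagation that the abstract refers to can be sketched as follows. This is a generic first-order quaternion integration of body angular rates, not the thesis's own MATLAB/Simulink implementation; the rates, time step and quaternion convention (Hamilton, scalar-first) are illustrative assumptions.

```python
import numpy as np

def quat_mult(p, q):
    # Hamilton product of two quaternions in [w, x, y, z] order
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw*qw - px*qx - py*qy - pz*qz,
        pw*qx + px*qw + py*qz - pz*qy,
        pw*qy - px*qz + py*qw + pz*qx,
        pw*qz + px*qy - py*qx + pz*qw,
    ])

def propagate_attitude(q, omega_body, dt):
    # Advance the attitude quaternion by one step of body angular rate (rad/s),
    # using the exact rotation for a constant rate over the interval dt
    rate = np.linalg.norm(omega_body)
    angle = rate * dt
    if angle < 1e-12:
        return q
    axis = omega_body / rate
    dq = np.concatenate([[np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis])
    q_new = quat_mult(q, dq)
    return q_new / np.linalg.norm(q_new)  # renormalize against drift

# Example: a constant 10 deg/s roll rate integrated for 1 s at 100 Hz
q = np.array([1.0, 0.0, 0.0, 0.0])
omega = np.array([np.deg2rad(10.0), 0.0, 0.0])
for _ in range(100):
    q = propagate_attitude(q, omega, 0.01)
```

After 1 s the quaternion encodes a 10-degree roll, i.e. q is approximately [cos 5°, sin 5°, 0, 0]. In a full SDINS mechanization this attitude would also rotate the accelerometer outputs into the navigation frame for velocity and position integration.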
An advantage of this method is that no camera calibration is required: neither the focal length of the camera lens nor the mounting angles relative to the aircraft need to be known. The most significant conclusion of this work is the demonstration of the viability of fusing imaging and inertial sensors for navigation. The technique is incorporated into an automated vision aided SDINS using an Unscented Particle Filter. The SDINS/vision integration algorithm was implemented in MATLAB/Simulink, and its performance was verified by interfacing the algorithm with the XPLANE flight simulator. The Root Mean Square (RMS) position accuracy is improved by the integration of inertial and vision data. This development holds a great deal of promise for producing a low-cost, navigation-grade optical-inertial sensor for passive environments.
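The inertial/vision fusion idea can be illustrated with a minimal one-dimensional bootstrap particle filter: inertial dead reckoning drives the prediction step, and noisy vision position fixes reweight the particles. The thesis uses an Unscented Particle Filter; this plain bootstrap variant, with assumed noise levels and a constant-velocity motion model, only sketches the predict/weight/resample cycle.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D bootstrap particle filter: inertial velocity prediction + vision fixes.
# All noise parameters below are illustrative assumptions.
N, dt = 500, 0.1
true_pos, vel = 0.0, 2.0                # assumed constant velocity (m/s)
particles = rng.normal(0.0, 1.0, N)     # position hypotheses (m)
weights = np.full(N, 1.0 / N)

for step in range(50):
    true_pos += vel * dt
    # Predict: propagate each particle with the inertial velocity + process noise
    particles += vel * dt + rng.normal(0.0, 0.05, N)
    # Update: reweight by the likelihood of a vision position fix (sigma = 0.5 m)
    z = true_pos + rng.normal(0.0, 0.5)
    weights *= np.exp(-0.5 * ((z - particles) / 0.5) ** 2)
    weights /= weights.sum()
    # Resample when the effective sample size collapses below N/2
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)

estimate = np.sum(weights * particles)  # posterior mean position
```

The weighted posterior mean tracks the true position far more tightly than either dead reckoning (which drifts) or raw vision fixes (which are noisy) alone, which is the mechanism behind the RMS improvement reported above.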
Pagination: xxvii, 209
URI: http://hdl.handle.net/10603/10152
Appears in Departments:Faculty of Electrical and Electronics Engineering

Files in This Item:
File                         Size       Format
01_title.pdf                 49.36 kB   Adobe PDF
02_certificates.pdf          930.59 kB  Adobe PDF
03_abstract.pdf              22.49 kB   Adobe PDF
04_acknowledgement.pdf       13.89 kB   Adobe PDF
05_contents.pdf              67.81 kB   Adobe PDF
06_chapter 1.pdf             248.63 kB  Adobe PDF
07_chapter 2.pdf             224.38 kB  Adobe PDF
08_chapter 3.pdf             1.41 MB    Adobe PDF
09_chapter 4.pdf             1 MB       Adobe PDF
10_chapter 5.pdf             306.77 kB  Adobe PDF
11_chapter 6.pdf             873.14 kB  Adobe PDF
12_chapter 7.pdf             25 kB      Adobe PDF
13_appendices 1 to 4.pdf     1.27 MB    Adobe PDF
14_publications.pdf          17.02 kB   Adobe PDF
15_vitae.pdf                 10.79 kB   Adobe PDF


Items in Shodhganga are protected by copyright, with all rights reserved, unless otherwise indicated.