An autonomous robot acting in an unknown, dynamic environment requires a detailed understanding of its surroundings. This information is provided by mapping algorithms, which build a sensory representation of the environment and estimate the vehicle states. This enables the robot to avoid collisions with complex obstacles and to localise in six degrees of freedom, i.e. x, y, z, roll, pitch and yaw. The process whereby a robot builds a sensory representation of the environment while estimating its own position and orientation relative to those sensory landmarks is known as Simultaneous Localisation and Mapping (SLAM).

A common means of gauging environments is the laser scanner, which enables mobile robots to scan objects in a non-contact way. The use of laser scanners for SLAM has been studied and successfully implemented. In this project, sensor fusion combining laser scanning and real-time image processing is investigated. The project therefore deals with the implementation of a Visual SLAM algorithm, followed by the design and development of a quadrotor platform equipped with a camera, a short-range laser scanner and an on-board PC for autonomous navigation and mapping of unstructured indoor environments.

This report presents a thorough account of the work done within the scope of this project. It gives a brief summary of related work in the domain of vision-based navigation and mapping before presenting a real-time monocular Visual SLAM algorithm. A C++ implementation of the Visual SLAM algorithm, based on the Extended Kalman Filter, is described. This is followed by the design and development of the quadrotor platform: first the baseline specifications are described, followed by component selection, dynamics modelling, simulation and control. The autonomous navigation algorithm is then presented, along with simulation results that show its suitability for real-time application in dynamic environments. Finally, the complete system architecture and flight test results are described.
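The predict/update cycle at the core of an EKF-based filter can be sketched as follows. This is a minimal illustrative example only, not the C++ implementation described in this report: it assumes a scalar state with a random-walk motion model and a range-style nonlinear measurement h(x) = sqrt(x² + d²) to a landmark at an assumed known offset d, so that the EKF's characteristic step, linearising h via its Jacobian, is visible.

```cpp
#include <cassert>
#include <cmath>

// Illustrative scalar EKF (assumed models, not the report's implementation).
struct Ekf1D {
    double x;  // state estimate
    double P;  // state covariance
    double Q;  // process noise variance
    double R;  // measurement noise variance
    double d;  // landmark lateral offset (assumed known)

    // Predict: identity (random-walk) motion model x_k = x_{k-1} + w.
    void predict() {
        P += Q;  // uncertainty grows by the process noise
    }

    // Update with a range measurement z ≈ h(x) = sqrt(x^2 + d^2).
    void update(double z) {
        double h = std::sqrt(x * x + d * d);
        double H = x / h;              // Jacobian dh/dx: the EKF linearisation
        double S = H * P * H + R;      // innovation covariance
        double K = P * H / S;          // Kalman gain
        x += K * (z - h);              // correct the state with the innovation
        P *= (1.0 - K * H);            // shrink the covariance
    }
};
```

Repeated predict/update calls with a consistent range measurement drive the estimate toward the state that explains that range; in the full Visual SLAM filter the same structure applies with a vector state (camera pose plus landmarks) and matrix-valued Jacobians.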