The objective of this project was to create an autonomous system that searches for a desired object in a foreign environment while displaying, in real time, a two-dimensional map of the system's current and past positions, the terrain of the environment, and the object's location.
To accomplish this, a programmable robot, the iRobot Create Roomba, was used in conjunction with a TRENDnet Wireless Camera. The camera was rigidly mounted to the Roomba using an 80/20 aluminum extrusion structure and powered by a battery placed atop the Roomba. As the robot navigates the environment, images are captured and analyzed using two different methods: (i) blob analysis compares features in the image to the unique aspects of an image of the desired object, and (ii) edge detection and the Hough transform identify terrain features near the robot's path. Based on the results of these methods, the map of the robot's current location and surroundings is updated. The robot continues to map the environment until at least one of three conditions is met: (i) the object is found, (ii) the entire room is considered mapped, or (iii) the allotted mapping time is exceeded.
The most notable challenges included implementing a robust algorithm for efficiently navigating an environment containing obstacles, distinguishing the orientation of those obstacles, accurately reporting the robot's motion relative to them, and locating the desired object within the environment.
Copyright © Aida Yoguely Cortés-Peña. All Rights Reserved.