
The Challenges of Rapidly Developing an Autonomous Robotic System

Thu, 09/23/2010 - 9:51am
Frank van Gennip, Ben van Seggelen, Bart Bouten, Daan Verstegen, Erik Hoedemaekers, Jeroen van de Mortel, and Rik Boonen - Fontys University of Applied Sciences

The Challenge: To rapidly develop an autonomous robotic system to perform complex tasks quickly and accurately for the Field Robot Event 2010 competition with team members of various programming skill levels.
The Solution: Using multicore and parallel loop programming with NI LabVIEW and NI Embedded Vision System to build a winning robot.

"With LabVIEW, we built a robot that could outperform the competition more rapidly and with more ease than other software options." Fontys University of Applied Sciences focuses on turning students into well-rounded mechatronics and robotics engineers. A key element to this education includes hands-on opportunities to apply engineering principles to real-world problems.

Third-year engineering students at Fontys took on the challenge of developing a robot from the ground up within four months to compete in the Field Robot Event 2010 in Braunschweig, Germany. The Field Robot Event is a pan-European competition with 20 teams from major universities.

Each team builds a robot to perform three key tasks:
1. Driving independently through long, curving corn field rows to cover as much distance as possible within three minutes
2. Navigating between straight corn rows in which there may be up to 1 m gaps between plants, traveling as far as possible in three minutes
3. Detecting weeds between cornstalks and spraying or treating the weeds effectively


Robot Development
We named our robot Ceres, after the Roman goddess of agriculture. Ceres contains two motors to power its four wheels and two motors for steering control. Because of the robot's symmetry, reversing is exactly the same as driving forward, which simplified the software development.
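
As a rough illustration of why the symmetry helps, a single driving routine can serve both directions if reversing only swaps which end of the robot leads. The sketch below is hypothetical (the class, method names, and controller interface are invented for illustration); on Ceres the equivalent logic was implemented in LabVIEW.

    # Minimal sketch of how a symmetric chassis lets one driving routine serve
    # both directions. The SymmetricDrive class and controller interface are
    # hypothetical, not the team's actual code.

    class SymmetricDrive:
        """Two drive motors, two steering motors: reversing mirrors forward driving."""

        def __init__(self, controller):
            self.controller = controller  # hypothetical interface to the motor controllers

        def drive(self, speed, steer_angle):
            """Positive speed drives one way, negative speed the other.

            Because front and rear are interchangeable, the only change when
            reversing is which steering motor leads; the control law is reused.
            """
            leading = "front" if speed >= 0 else "rear"
            self.controller.set_steering(leading, steer_angle)
            self.controller.set_speed(speed)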

We selected the LabVIEW Real-Time Module and an NI Embedded Vision System to build Ceres from the ground up. The vision system is a rugged embedded computer running a real-time OS. We attached an IEEE 1394 color camera to the embedded system; the camera looks upward at a mounted parabolic mirror, providing a 360-degree view of the area around the robot. With this approach, we could detect objects all around the robot with a single camera, greatly simplifying signal monitoring and processing. The embedded computer uses a serial port to communicate with motor controllers developed in-house.
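
The in-house motor-controller protocol is not described here, but serial communication generally reduces to writing command strings and reading replies. Below is a minimal Python sketch using pyserial; the port name, baud rate, and command format are placeholder assumptions, not the team's actual protocol.

    import serial  # pyserial

    # Minimal sketch of talking to a motor controller over a serial port. The
    # in-house protocol on Ceres is not documented in this article, so the
    # port name, baud rate, and command strings below are placeholders.

    def open_controller(port="/dev/ttyS0", baud=115200):
        """Open the serial link to the motor controller."""
        return serial.Serial(port, baudrate=baud, timeout=0.1)

    def send_command(link, command):
        """Send one newline-terminated ASCII command and return the reply, if any."""
        link.write((command + "\n").encode("ascii"))
        return link.readline().decode("ascii", errors="replace").strip()

    if __name__ == "__main__":
        # Hypothetical commands: set drive speed and steering angle.
        with open_controller() as link:
            print(send_command(link, "SPEED 0.8"))
            print(send_command(link, "STEER -12"))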

LabVIEW for Parallel Processing of Obstacle Detection and Path Planning
In the 2009 competition, our robot was based on a Linux computer and used ANSI C algorithms. This year, however, we wanted to implement more complex functionality than we could code in ANSI C within our short development time.
For the 2010 competition, we chose LabVIEW graphical system design software for faster, easier development. Team members without programming experience could contribute to the project through the LabVIEW GUI, and we could implement parallel processing in our code without extensive experience in parallel programming.
Real-time processing and vision integration in LabVIEW also made programming more straightforward than in previous years, saving precious time in our short development window. We also used a webcam during prototyping so we could begin developing our code before the robot hardware was available.
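
As a rough text-based analogue of what LabVIEW's parallel For Loop does (described in the next section), the sketch below splits a per-pixel image filter into horizontal strips and processes them on separate cores. The strip filter is a stand-in for the excess-green computation; the function names and worker count are illustrative, not taken from the team's code.

    from concurrent.futures import ProcessPoolExecutor

    import numpy as np

    # Rough analogue of a parallelised For Loop: the image is split into
    # horizontal strips and each strip is filtered on its own core. The
    # per-strip work must be independent, just as LabVIEW requires for a
    # parallel For Loop.

    def filter_strip(strip):
        """Per-pixel work on one strip, independent of every other strip."""
        r, g, b = (strip[..., c].astype(np.int16) for c in range(3))
        return 2 * g - r - b

    def filter_parallel(image, workers=4):
        """Divide the image rows over worker processes and reassemble the result."""
        strips = np.array_split(image, workers, axis=0)
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return np.concatenate(list(pool.map(filter_strip, strips)), axis=0)

    if __name__ == "__main__":  # guard needed because worker processes are spawned
        frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
        print(filter_parallel(frame).shape)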

The camera acquires images at 30 fps. For each frame, the LabVIEW Vision Development Module overlays a curved region of interest (ROI) and applies an excess-green filter, a computationally intensive operation that recalculates each pixel value from its RGB components. To improve computation speed, we used the parallel For Loop, a feature introduced in LabVIEW 2009 that automatically divides independent For Loop iterations over the processor cores so they run simultaneously in multiple threads. The program then thresholds the ROI and calculates a cumulative histogram over the x-axis. Using the histogram, the software searches for the edges where the corn plants are and steers the robot accordingly.
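
A minimal Python/NumPy sketch of this pipeline is shown below. The team's actual implementation runs in LabVIEW, and the article does not give the exact excess-green formula, threshold, or edge rule, so the formula 2G - R - B, the constants, and the steering-error calculation are common-practice assumptions; the curved ROI is also omitted for brevity.

    import numpy as np

    # Illustrative Python/NumPy version of the pipeline described above. The
    # constants below are placeholders, and the curved ROI is replaced by the
    # full frame for brevity.

    def excess_green(rgb):
        """Per-pixel excess-green index, commonly computed as 2G - R - B."""
        r = rgb[..., 0].astype(np.int16)
        g = rgb[..., 1].astype(np.int16)
        b = rgb[..., 2].astype(np.int16)
        return 2 * g - r - b

    def plant_mask(rgb, threshold=20):
        """Threshold the excess-green image into a binary plant mask."""
        return excess_green(rgb) > threshold

    def column_histogram(mask):
        """Count plant pixels per image column (the histogram over the x-axis)."""
        return mask.sum(axis=0)

    def row_edges(histogram, min_pixels=10):
        """Leftmost and rightmost columns with enough plant pixels: the row edges."""
        columns = np.flatnonzero(histogram >= min_pixels)
        if columns.size == 0:
            return None  # no plants in view; keep the previous steering command
        return columns[0], columns[-1]

    def steering_error(rgb):
        """Signed offset of the row centre from the image centre, fed to steering."""
        edges = row_edges(column_histogram(plant_mask(rgb)))
        if edges is None:
            return 0.0
        left, right = edges
        return (left + right) / 2.0 - rgb.shape[1] / 2.0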

Results
With LabVIEW, we built a robot that could outperform the competition more rapidly and with more ease than other software options. At the Field Robot Event, Ceres competed against 16 teams in three matches and won first place in two of the three matches to claim the top prize overall. The robot performed especially well in the first match, driving three times further than its closest competitor.

Next year, a new group of students will focus on further improving Ceres. Proposed enhancements include better weed detection, wheel and turning radius improvements, and further vision code optimization. 
