Structure Based Color Learning on a Mobile Robot under Changing Illumination.
Mohan Sridharan and Peter Stone.
Autonomous Robots, 23(3):161–182, 2007.
Official version from the Autonomous Robots publisher's webpage.
A central goal of robotics and AI is to be able to deploy an agent to act autonomously in the real world over an extended period of time. To operate in the real world, autonomous robots rely on sensory information. Despite the potential richness of visual information from on-board cameras, many mobile robots continue to rely on non-visual sensors such as tactile sensors, sonar, and laser. This preference for relatively low-fidelity sensors can be attributed to, among other things, the characteristic requirement of real-time operation under limited computational resources. Illumination changes pose another significant challenge. For true extended autonomy, an agent must be able to recognize for itself when to abandon its current model in favor of learning a new one, and how to learn in its current situation. We describe a self-contained vision system that works on-board a vision-based autonomous robot under varying illumination conditions. First, we present a baseline system capable of color segmentation and object recognition within the computational and memory constraints of the robot. This system relies on manually labeled data and operates under constant and reasonably uniform illumination conditions. We then relax these limitations by introducing algorithms for (i) autonomous planned color learning, where the robot uses the knowledge of its environment (position, size, and shape of objects) to automatically generate a suitable motion sequence and learn the desired colors, and (ii) illumination change detection and adaptation, where the robot recognizes for itself when the illumination conditions have changed sufficiently to warrant revising its knowledge of colors. Our algorithms are fully implemented and tested on the Sony ERS-7 Aibo robots.
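To make the illumination-change detection step concrete, the following is a minimal sketch (in Python) of one common way such a check can be realized: compare a normalized color-space histogram of the current camera frame against a reference histogram and flag a change when a divergence measure exceeds a tuned threshold. This is an illustrative assumption, not the authors' implementation; the function names, color space, bin counts, and threshold are placeholders.

# Illustrative sketch, not the paper's exact method: detect an illumination
# change by comparing color-space histograms, as described in the abstract
# ("the robot recognizes for itself when the illumination conditions have
# changed sufficiently"). Bin counts and threshold are assumptions.
import numpy as np

def color_histogram(image_yuv: np.ndarray, bins: int = 16) -> np.ndarray:
    """Normalized 3-D histogram over the (Y, U, V) values of one frame."""
    hist, _ = np.histogramdd(
        image_yuv.reshape(-1, 3),
        bins=(bins, bins, bins),
        range=((0, 256), (0, 256), (0, 256)),
    )
    hist = hist.flatten() + 1e-9          # avoid zero bins before normalizing
    return hist / hist.sum()

def kl_divergence(p: np.ndarray, q: np.ndarray) -> float:
    """KL(p || q) between two normalized histograms."""
    return float(np.sum(p * np.log(p / q)))

def illumination_changed(reference: np.ndarray, current: np.ndarray,
                         threshold: float = 0.5) -> bool:
    """Flag a change when the symmetric KL distance exceeds the threshold."""
    distance = 0.5 * (kl_divergence(reference, current) +
                      kl_divergence(current, reference))
    return distance > threshold

In use, the reference histogram would be rebuilt, and the color map re-learned, whenever illumination_changed returns True; the threshold would be tuned per environment.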
@Article{AURO07-mohan,
  author    = "Mohan Sridharan and Peter Stone",
  title     = "Structure Based Color Learning on a Mobile Robot under Changing Illumination",
  journal   = "Autonomous Robots",
  year      = "2007",
  volume    = "23",
  number    = "3",
  pages     = "161--182",
  bibauthor = "smohan",
  abstract  = {A central goal of robotics and AI is to be able to deploy an agent to act autonomously in the real world over an extended period of time. To operate in the real world, autonomous robots rely on sensory information. Despite the potential richness of visual information from on-board cameras, many mobile robots continue to rely on non-visual sensors such as tactile sensors, sonar, and laser. This preference for relatively low-fidelity sensors can be attributed to, among other things, the characteristic requirement of real-time operation under limited computational resources. Illumination changes pose another significant challenge. For true extended autonomy, an agent must be able to recognize for itself when to abandon its current model in favor of learning a new one, and how to learn in its current situation. We describe a self-contained vision system that works on-board a vision-based autonomous robot under varying illumination conditions. First, we present a baseline system capable of color segmentation and object recognition within the computational and memory constraints of the robot. This system relies on manually labeled data and operates under constant and reasonably uniform illumination conditions. We then relax these limitations by introducing algorithms for (i) autonomous planned color learning, where the robot uses the knowledge of its environment (position, size, and shape of objects) to automatically generate a suitable motion sequence and learn the desired colors, and (ii) illumination change detection and adaptation, where the robot recognizes for itself when the illumination conditions have changed sufficiently to warrant revising its knowledge of colors. Our algorithms are fully implemented and tested on the Sony ERS-7 Aibo robots.},
  wwwnote   = {<a href="http://springer.r.delivery.net/r/r?2.1.Ee.2Tp.1gRdFJ.BvfH9M..N.Dp4u.2rz0.DMQEcX00">Official version</a> from the <a href="http://www.springer.com/10514/">Autonomous Robots</a> publisher's webpage.},
}