When we think of robots, we envision the future. Intelligent mobile robots that can answer questions, give directions, complete tasks, and guide us through an ever-changing world could one day make more static technologies like Alexa, Siri, and Google Home look outdated.
Unbeknownst to many, robots of this caliber are being developed by our very own Texas Computer Science faculty and students in the Gates Dell Complex (GDC) right now! These are the robots that will shape the future of the Texas Computer Science experience, all while contributing to the growing field of artificial intelligence.
More specifically, this pursuit, called the Building-Wide Intelligence (BWI) Project, is housed and maintained by a group of researchers in the Artificial Intelligence Lab on the third floor of the GDC. Established in 2014 by Texas Computer Science Professor Peter Stone, the BWI Project merges the fields of robotics and artificial intelligence with the aim of creating robots that are both helpful guides for visitors on campus and useful platforms for groundbreaking research. The researchers working on the project are a diverse group, ranging from faculty overseers to postdoctoral researchers and graduate and undergraduate students. Together, they work toward a common goal: to develop fully autonomous mobile robots that will eventually become a permanent part of the GDC environment.
In other words, Stone states, “We want people to just walk into the GDC and expect to see robots—getting more used to being around them, having them not just see robots as novelties—with this project, we can see that slowly happening.”
Currently, these robots, also called BWIBots, can perform a variety of tasks. For example, they can answer a multitude of questions about the Computer Science Department or the GDC building. They can also roam any given area without collision, avoiding both walls and moving obstacles; this may sound simple, but it is actually a complex process in which the robots use navigational planners to compute and follow collision-free paths. They also act as guides for visitors to the building, giving directions using an internal map or even leading people directly to the rooms they are looking for. To do this, the BWIBots localize themselves using information from the environment around them. The bots can also identify common objects and fetch them for visitors, and they can follow people around and recognize familiar faces.
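To give a flavor of what such a navigational planner involves, here is a minimal, illustrative sketch of grid-based A* path planning in Python. This is not the BWI Project's actual code: the hallway grid, start, and goal are invented for the example, and a real BWIBot layers localization, sensing, and dynamic obstacle avoidance on top of planning like this.

```python
import heapq

def astar(grid, start, goal):
    """Minimal A* path planner on a 2D occupancy grid (0 = free, 1 = obstacle).

    Illustrative only: a real robot plans over continuous sensor maps,
    accounts for its physical footprint, and replans as obstacles move.
    """
    def heuristic(cell):
        # Manhattan distance to the goal, an admissible estimate on a grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    frontier = [(heuristic(start), 0, start)]   # (estimated total cost, cost so far, cell)
    came_from = {start: None}
    cost_so_far = {start: 0}

    while frontier:
        _, cost, cell = heapq.heappop(frontier)
        if cell == goal:
            # Walk the parent links backwards to recover the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        row, col = cell
        for nxt in ((row + 1, col), (row - 1, col), (row, col + 1), (row, col - 1)):
            r, c = nxt
            in_bounds = 0 <= r < len(grid) and 0 <= c < len(grid[0])
            if in_bounds and grid[r][c] == 0 and cost + 1 < cost_so_far.get(nxt, float("inf")):
                cost_so_far[nxt] = cost + 1
                came_from[nxt] = cell
                heapq.heappush(frontier, (cost + 1 + heuristic(nxt), cost + 1, nxt))
    return None  # no collision-free path exists


# Toy hallway map: the planner routes around the blocked cells in the middle.
hallway = [[0, 0, 0, 0],
           [0, 1, 1, 0],
           [0, 0, 0, 0]]
print(astar(hallway, (0, 0), (2, 3)))
```

In practice, a plan like this would be recomputed continuously as the robot's sensors report new or moving obstacles.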
Stone is most excited about the natural-language capabilities of the BWIBots. He is collaborating with other computer scientists who specialize in natural language processing, such as Prof. Raymond Mooney, to create more genuine interactions between the robots and humans.
Stone says, “It’s really amazing because not many groups in the world have access to both capable robots and world-class expertise in natural language.”
Aside from natural-language learning, the robots’ functionalities draw on a wide array of other computational research areas, including but not limited to reasoning, planning, human-robot interaction, computer vision, localization, and multi-robot coordination. All of these areas intertwine and work together in a complex and beautiful way to make even the most seemingly simple tasks possible.
The robots are also unique in that they are, as Stone says, “custom built from a hardware perspective. They have very specific capabilities like [the earlier mentioned] activity recognition, semantic mapping, and others.”
In the past, the BWIBots have participated in a variety of events, including ExploreUT (the largest open house in Texas) and the robotics exhibition at the 2015 AAAI Conference. They are also used in the Autonomous Intelligent Robots Stream of the Freshman Research Initiative (FRI) Program, which aims to provide undergraduate students with a real-world research experience. Additionally, since the start of the BWI Project, the robots have been used for experiments and demonstrations in 24 research publications.
Stone notes that since the BWI Project “doubles as a research project and educational program, the capabilities [of the robots] really depend on the particular set of people working on it currently and their interests.” Right now, Stone is working on multi-agent systems research; in other words, getting the robots to communicate and interact with one another.
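As a rough sketch of what coordinating multiple robots can look like, the toy Python example below runs a single-round auction: each simulated robot bids its estimated travel cost for a task, and the cheapest bidder wins. The robot names, positions, and tasks are invented for illustration; the BWIBots’ actual multi-agent research is far richer than this.

```python
# Toy illustration of multi-robot coordination via a single-round auction:
# each simulated robot bids its estimated travel cost for a task, and the
# lowest bidder is assigned it. All names, positions, and tasks are invented.

def manhattan(a, b):
    """Grid distance used as a stand-in for real travel cost."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

robots = {"bwibot_a": (0, 0), "bwibot_b": (5, 2)}                     # hypothetical positions
tasks = {"deliver_marker_to_3.404": (4, 1), "guide_visitor": (1, 3)}  # hypothetical tasks

assignments = {}
for task, location in tasks.items():
    # Every robot announces a bid; a coordinator awards the task to the cheapest one.
    bids = {name: manhattan(position, location) for name, position in robots.items()}
    winner = min(bids, key=bids.get)
    assignments[task] = winner
    robots[winner] = location  # the winner will finish at the task location

print(assignments)
```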
Social experiments are also being performed with the BWIBots to investigate how people perceive robots and to study human-robot interaction. Results and feedback from these experiments will then be used to further improve the BWIBots.
From elevating our understanding of artificial intelligence to paving the way for the emerging future of assistive mobile robots, the BWIBots are here to stay. These robots will continue to improve, growing in functionality and efficiency, as they gradually become integrated into the GDC environment and, eventually, the overall Texas Computer Science experience.