Dinesh Manocha Receives Google Faculty Award
Story by Samuel Malede Zewdu, CS Communications
University of Maryland Distinguished University Professor of Computer Science Dinesh Manocha recently received a Google Faculty Award to further his research utilizing large language models (LLMs) for autonomous mobile robot navigation.
Manocha's research sits at the vital intersection of human language and robotics.
“To make household robots truly indispensable, they must seamlessly comprehend human instructions,” Manocha said. “Large language models can address this need by absorbing vast amounts of data about the logical structures of various environments, particularly domestic settings.”
Consider instructing a robot to ensure the front door is closed. The robot should be able to understand, navigate, check and report back without needing complex commands or interfaces.
Manocha believes LLMs can revolutionize how robots navigate in real-world settings. By leveraging LLMs, robots can intuitively discern the appropriate placement of objects within a household, leading to more reliable robotic systems. With the Google award, Manocha aims to use LLMs to enhance robot navigation efficiency, enable them to give wayfinding instructions, and ensure consistent navigation across different simulation platforms. In addition, the study focuses on instilling long-term memory in embodied agents, improving their environmental understanding and decision-making abilities.
In tandem with these objectives, Manocha will also work to generate new datasets designed to equip domestic robots to identify and address potentially hazardous or unsanitary conditions in households, ensuring user safety.
“Receiving support for the research, especially from a leading company like Google, which is at the forefront of AI research and development, is encouraging,” Manocha said. “We aim to design new methods for robot navigation in intricate, unstructured environments, leveraging the latest developments in large language models to enhance the navigation capabilities of these robots.”
Published November 2, 2023