
Patent Issued for Robot navigation and robot-IoT interactive task planning using augmented reality (USPTO 11878421)

From Alexandria, Virginia, NewsRx journalists report that a patent by the inventors Cao, Yuanzhi (West Lafayette, IN, US), Ramani, Karthik (West Lafayette, IN, US), and Xu, Zhuangying (Redmond, WA, US), filed on April 23, 2019, was published online on January 23, 2024. The patent’s assignee for patent number 11878421 is Purdue Research Foundation (West Lafayette, Indiana, United States).

News editors obtained the following quote from the background information supplied by the inventors:

“Technical Field

“Embodiments described in this disclosure relate generally to a system for robot navigation and task planning, and more particularly to an authoring system for robot navigation and task planning using a mobile device with augmented-reality simultaneous localization and mapping (“AR-SLAM”) capabilities.

“Brief Description of the Related Art

“The concept of the Internet of Robotic Things has not been widely explored in practice across the IoT and robotics communities, and as such, authoring systems for robot-IoT interactive task planning have remained underdeveloped. Due to limited perception capabilities and current limitations in artificial intelligence (“AI”), ad-hoc tasks that humans take for granted remain challenging for robots. Previous work in this field introduced external vision systems that track robots through a live camera view in a user interface, but this approach limits the authoring scene to the perspective of the camera, which is usually fixed. Another prior system, “Magic Cards,” proposed an implicit command-authoring workflow in which humans manually and spatially place tags in the physical environment to facilitate robot navigation; in this method, however, tracking from an overhead camera was prone to occlusion, especially in cluttered scenes such as household environments. More recent research has employed augmented-reality (“AR”) interfaces and associated robots within an AR scene, for example using hand-held or head-mounted devices. Although this mobility allowed users to move around and author distributed tasks from different perspectives, the limited field of view of the hand-held or head-mounted devices constrained the robot’s navigation range.”
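The Technical Field paragraph above describes authoring robot navigation and task plans on a mobile device that tracks its own pose with AR-SLAM. As a minimal, hypothetical sketch of that general workflow (the class and function names, and the calibration numbers, are assumptions for illustration and are not taken from the patent), points authored in the AR device’s world frame can be re-expressed in the robot’s navigation frame through a single rigid-body transform and collected into an ordered plan:

```python
# Hypothetical sketch (not from the patent): authoring robot waypoints with a
# mobile AR-SLAM device. The device reports poses in an AR "world" frame; a
# one-time calibration transform maps that frame into the robot's navigation
# frame so authored points become navigation goals.

import numpy as np


def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T


def ar_point_to_robot_frame(p_ar: np.ndarray, T_robot_from_ar: np.ndarray) -> np.ndarray:
    """Re-express a 3D point authored in the AR world frame in the robot frame."""
    p_h = np.append(p_ar, 1.0)  # homogeneous coordinates
    return (T_robot_from_ar @ p_h)[:3]


class TaskAuthor:
    """Collects AR-authored waypoints and emits an ordered navigation plan."""

    def __init__(self, T_robot_from_ar: np.ndarray):
        self.T = T_robot_from_ar
        self.waypoints = []  # list of (label, point in the robot frame)

    def add_waypoint(self, label: str, p_ar: np.ndarray) -> None:
        self.waypoints.append((label, ar_point_to_robot_frame(p_ar, self.T)))

    def plan(self):
        """Return the plan as a list of simple 'go-to' steps for the robot."""
        return [{"action": "go_to", "label": lbl, "xyz": pt.round(3).tolist()}
                for lbl, pt in self.waypoints]


if __name__ == "__main__":
    # Assumed calibration: AR world frame and robot frame share orientation,
    # offset by 2 m along x (purely illustrative numbers).
    T = make_transform(np.eye(3), np.array([2.0, 0.0, 0.0]))
    author = TaskAuthor(T)
    author.add_waypoint("lamp (IoT)", np.array([0.5, 1.2, 0.0]))
    author.add_waypoint("charging dock", np.array([-1.0, 0.3, 0.0]))
    print(author.plan())
```

In practice the calibration between the AR frame and the robot frame would have to come from some shared reference (for example, a marker or a common map), a detail the news summary does not cover.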

Keywords: Emerging Technologies, Machine Learning, Nano-robot, Purdue Research Foundation, Robot, Robotics


Robotics & Machine Learning Daily News

ISSN:
Year, Volume (Issue): 2024 (Feb. 9)