
Robots are guided by humans in the loop.

Like humans, robots can’t see past barriers. To get where they’re going, they sometimes need a little help.

Rice University engineers have developed a method that lets humans help robots “see” their surroundings and complete tasks.

The method, Bayesian Learning IN the Dark (BLIND), is a novel solution to the long-standing problem of motion planning for robots that work in environments where not everything is clearly visible all the time.

The research, led by computer scientists Vaibhav Unhelkar and Lydia Kavraki with co-lead authors Carlos Quintero-Peña and Constantinos Chamzas, all of Rice’s George R. Brown School of Engineering, was presented in late May at the Institute of Electrical and Electronics Engineers’ International Conference on Robotics and Automation.

According to the study, the algorithm, developed primarily by Quintero-Peña and Chamzas, graduate students working with Kavraki, keeps a human in the loop to “augment robot perception and, significantly, prevent the execution of unsafe motion.”

To help robots that have “large degrees of freedom,” that is, many moving parts, they combined Bayesian inverse reinforcement learning, a method by which a system learns from continually updated information and experience, with tried-and-true motion-planning techniques.
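To make the combination concrete, here is a minimal sketch of how binary human feedback could drive a Bayesian-style update of a planner’s confidence in candidate trajectory segments. It illustrates the general idea only, not the BLIND implementation: the segment names, the Bernoulli likelihoods, and the `human_critique` stand-in are all assumptions introduced for this example.

```python
# Illustrative only: a Bayesian-style update of the belief that a candidate
# trajectory segment is "good", driven by binary (accept/reject) human feedback.
# This is not the BLIND codebase; all names and numbers are assumptions.
import random

def human_critique(segment: str) -> bool:
    """Stand-in for the human's binary label: True = accept, False = reject."""
    return random.random() > 0.5

def update_belief(prior: float, accepted: bool,
                  p_accept_if_good: float = 0.9,
                  p_accept_if_bad: float = 0.2) -> float:
    """One Bayes update with a simple Bernoulli feedback model."""
    like_good = p_accept_if_good if accepted else 1 - p_accept_if_good
    like_bad = p_accept_if_bad if accepted else 1 - p_accept_if_bad
    evidence = like_good * prior + like_bad * (1 - prior)
    return like_good * prior / evidence

# Candidate segments a sampling-based planner might propose (names invented).
beliefs = {f"segment_{i}": 0.5 for i in range(5)}      # uninformative priors
for segment in beliefs:
    beliefs[segment] = update_belief(beliefs[segment], human_critique(segment))

best = max(beliefs, key=beliefs.get)
print(f"Most promising segment after one round of feedback: {best}")
```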

The Rice lab tested BLIND with a Fetch robot, an articulated arm with seven joints, asking it to grab a small cylinder from one table and move it to another, a task that required moving the arm past a barrier.

“Instructions to the robot get complicated if you have more joints,” Quintero-Peña said. When directing a person, you only need to say, “Lift up your hand.”

But when barriers obstruct the machine’s “view” of its target, a robot’s programmers must be precise about the movement of each joint at every point in its trajectory.
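To give a sense of what “precise about the movement of each joint” means for a seven-joint arm like Fetch’s, a pre-programmed trajectory amounts to a list of full joint configurations, one per waypoint. The sketch below is purely illustrative; the joint order and every angle value are invented for the example.

```python
# Purely illustrative joint-space trajectory for a seven-joint arm.
# Each waypoint specifies an angle (in radians) for every joint; the joint
# order and all values here are invented for illustration.
trajectory = [
    # [shoulder pan, shoulder lift, upper-arm roll, elbow flex, forearm roll, wrist flex, wrist roll]
    [0.00, -0.50, 0.00, 1.20, 0.00, 0.80, 0.00],   # start: gripper over the first table
    [0.30, -0.30, 0.10, 1.00, 0.05, 0.70, 0.10],   # lift the cylinder clear of the table
    [0.90,  0.10, 0.20, 0.60, 0.10, 0.40, 0.20],   # swing wide enough to clear the barrier
    [1.40, -0.20, 0.10, 0.90, 0.00, 0.60, 0.00],   # lower toward the second table
]

for i, waypoint in enumerate(trajectory):
    print(f"waypoint {i}: {waypoint}")
```

Every extra joint adds another number that has to be right at every waypoint, which is why hand-specifying such trajectories quickly becomes impractical.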

Rather than programming a trajectory in advance, BLIND inserts a human mid-process to refine the choreographed options, or best guesses, suggested by the robot’s algorithm. “BLIND allows us to take information in the human head and compute our trajectories in this high-degree-of-freedom space,” Quintero-Peña explained.

The team uses a particular form of feedback called critique: essentially binary feedback in which the human labels segments of the proposed trajectory.

Candidate routes appear as series of interconnected green dots, and the human accepts or rejects each step BLIND takes as it moves from dot to dot, fine-tuning the course and steering it around obstructions.
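Read literally, that interaction could look something like the loop sketched below: the planner proposes one step at a time, and a step is committed only once the human approves it. The function names (`next_candidate_step`, `ask_human`) are placeholders invented for this illustration, not part of any published interface.

```python
# Hypothetical sketch of the dot-to-dot critique loop described above.
# The planner proposes one step at a time; the human accepts or rejects it.
# All names are placeholders, not the actual BLIND interface.

def next_candidate_step(position: int, goal: int) -> int:
    """Stand-in for the planner's next best guess toward the goal."""
    return min(position + 1, goal)

def ask_human(step: int) -> bool:
    """Stand-in for the binary critique: accept (y) or reject (anything else)."""
    return input(f"Accept step to waypoint {step}? [y/n] ").strip().lower() == "y"

def plan_with_critique(start: int, goal: int, max_attempts: int = 20) -> list[int]:
    path, position, attempts = [start], start, 0
    while position != goal and attempts < max_attempts:
        attempts += 1
        step = next_candidate_step(position, goal)
        if ask_human(step):          # human approves: commit this step
            path.append(step)
            position = step
        # on rejection a real planner would propose an alternative segment;
        # this toy version simply tries again
    return path

if __name__ == "__main__":
    print(plan_with_critique(start=0, goal=3))
```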

The interface is easy for people to use, Chamzas said, because the human can simply tell the robot, “I like this” or “I don’t like that.” Once the robot receives approval for a set of moves, it can complete its task, he said.

“One of the most important things here is that human preferences are hard to describe with a mathematical formula,” Quintero-Peña said. “Our work simplifies human-robot relationships by incorporating human preferences. That’s how I think applications will get the most benefit from this work.”

“This work wonderfully exemplifies how a little, but targeted, human intervention can significantly enhance the capabilities of robots to execute complex tasks in environments where some parts are completely unknown to the robot but known to the human,” said Kavraki, a robotics pioneer whose resume includes advanced programming for NASA’s humanoid Robonaut aboard the International Space Station.

“It shows how methods for human-robot interaction, the topic of research of my colleague Professor Unhelkar, and automated planning pioneered for years at my laboratory can blend to deliver reliable solutions that also respect human preferences.”
