
Teleoperating robots with VR: MIT gets inside a robot’s head

Remote operators can now get inside a robot’s head from any distance, making it easier for factory workers to telecommute

By Gary Elinoff, contributing writer

Progress in robotics in recent years has been staggering: present-day robots operate with considerable autonomy and are capable of real local decision-making. But we haven't yet reached the point at which actual R2-D2s or C-3POs can be thrust out alone into the cold, cruel world. We humans can, however, begin to take a small step back from the action, because it is now becoming possible to control our mechanical minions at a distance.

Just like a video game

Scientists at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have demonstrated a system that allows robots to be controlled via well-known, well-documented virtual reality (VR) gear such as hand controllers and the now-ubiquitous Oculus Rift headset. The system was described in a paper presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems in Vancouver, British Columbia, by CSAIL director Daniela Rus and postdoc Jeffrey Lipton.

Below, we see an operator controlling a Baxter humanoid robot from Rethink Robotics.


Controlling a robot with VR components. Image source: Jason Dorfman, MIT CSAIL.

According to Lipton, "A system like this could eventually help humans supervise robots from a distance." He added that "by teleoperating robots from home, blue-collar workers would be able to telecommute and benefit from the IT revolution just as white-collar workers do now."

As described in MIT News, previous approaches to controlling robots remotely have followed one of two models:

The direct model. The operator's vision is coupled directly to the robot's cameras. The user is limited to a single visual perspective, and inevitable signal delays can produce nausea (for the user, not the robot).

The cyber-physical model. Here, the user works with a virtual copy of the robot and its operating environment. The problem is that much more data must be generated and transmitted, which becomes a handicap when large distances separate the robot and the operator.

The CSAIL system

The CSAIL system calls to mind the homunculus model, whereby it is as though a tiny creature lives inside our heads, seeing with our eyes and controlling our actions. Through the wonders of virtual reality, the human operator becomes the homunculus residing inside the robot's head, directing Baxter's actions by manipulating controls that appear in the virtual space. The MIT News article points out that "the human's space is mapped into the virtual space, and the virtual space is then mapped into the robot space to provide a sense of co-location."
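The chain of mappings described above can be sketched as a pair of coordinate transforms. The transforms and offsets below are purely illustrative assumptions (the paper's actual calibration is not given here); the point is only that a controller pose passes through virtual space on its way to robot coordinates.

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Hypothetical calibration transforms (identity rotations, offsets chosen
# purely for illustration).
T_virtual_from_human = make_transform(np.eye(3), [0.0, 0.0, 1.2])   # human -> virtual
T_robot_from_virtual = make_transform(np.eye(3), [0.5, 0.0, -0.2])  # virtual -> robot

def human_to_robot(point_h):
    """Map a hand-controller point through virtual space into robot coordinates."""
    p = np.append(point_h, 1.0)  # homogeneous coordinates
    return (T_robot_from_virtual @ T_virtual_from_human @ p)[:3]

# A controller position in the operator's space becomes a target in robot space.
target = human_to_robot(np.array([0.1, 0.0, 0.9]))
```

Chaining the two transforms is what gives the operator the "sense of co-location" the article describes: the hand moves in human space, and the mapped point lands in the corresponding spot of the robot's workspace.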

Instead of combining the 2D images from two cameras into a computed 3D view, the CSAIL system bypasses that step by presenting each camera's 2D image to one of the operator's eyes. The operator's brain then automatically constructs the 3D scene that the operator perceives and works with.

Benchmarks

The CSAIL system was tested against other systems and showed clear advantages in performance. More impressively still, during these tests, Baxter was not operated by a user in the same room; the user was physically located hundreds of miles away. And finally, yes, users with gaming experience were better at controlling Baxter than were non-gamers.

Potentially

It's hard to imagine that these developments will benefit workers on the factory floor. If robots can be built to perform all the necessary mechanical movements, and if the task can be understood well enough to be described in ones and zeros, a bit of artificial intelligence is all it would take to obviate the need for a human at all. It is easier to see benefits for operating in toxic environments too dangerous for people, such as nuclear-accident cleanups. The technology would also make it possible to control military drone aircraft from the safety of a ship offshore, and if a drone gets shot down, well, nobody mourns a dead drone.
