22 Jul 2024


Robot uses mocap technology to track human guide

Madlab researcher Madeline Gannon guides a robot using gestures which it tracks using a motion capture system

A US research organisation called Madlab.CC is developing software that allows humans to interact with industrial robots using intuitive gestures.

The Quipt software uses motion capture (“mocap”) technology to track reflective markers worn on a human trainer’s hands and body. This allows the robot to follow, mirror, respond to and avoid the trainer as they collaborate with each other.

Madlab is using Quipt to give “eyes” to an ABB IRB 6700 industrial robot. The software receives data from a Vicon mocap system and reformats it into movement commands for the robot. Quipt can also display the data in an Android app, giving the human collaborator a continuous mobile view of what the robot is seeing.
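The reformatting step can be pictured as a coordinate transform: a marker position reported in the mocap system’s frame has to be re-expressed in the robot’s base frame before it can become a movement target. The following Python sketch is illustrative only — the calibration values and the `marker_to_robot_target` helper are assumptions, not Quipt’s actual API:

```python
import numpy as np

# Hypothetical calibration between the Vicon (mocap) frame and the robot's
# base frame. In practice this would be measured from shared reference points.
R_MOCAP_TO_ROBOT = np.eye(3)                   # rotation (frames assumed aligned here)
T_MOCAP_TO_ROBOT = np.array([1.5, 0.0, 0.0])   # translation in metres

def marker_to_robot_target(marker_pos_mocap):
    """Re-express a tracked marker position (mocap frame) in the robot's base frame."""
    p = np.asarray(marker_pos_mocap, dtype=float)
    return R_MOCAP_TO_ROBOT @ p + T_MOCAP_TO_ROBOT

# Example: a hand marker tracked at (0.2, 0.1, 1.0) in the mocap frame
target = marker_to_robot_target([0.2, 0.1, 1.0])
```

The resulting target position could then be packaged as a movement command for the robot controller.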

Industrial robots have little to no awareness of the environment outside of their programmed tasks. This is a key reason why they have thrived only in controlled environments, such as factories. Their work zones have traditionally needed to be separated from unpredictable objects – people, in particular.

This limits their use in less controlled environments. Industrial robots can be dangerous, and because programming them requires technical skills, they are difficult to use.

Quipt is intended to help overcome these limitations, and to show how robots could be safer and easier to use in uncontrolled settings. Instead of using code to program a robot, Quipt gives it spatial awareness and the ability to interact with people.

A human guides the robot’s movements by pointing, posturing and similar actions. The motion capture system tracks their position and orientation, and Quipt converts this data into a format that the robot can handle.

This gives the robot an awareness of where the person is. The software can tell the robot how the person is moving and how it should respond.
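Behaviours like “follow” and “avoid” can be imagined as simple distance rules applied to the tracked position. This is a minimal sketch under assumed thresholds — the radii, step size and `react` function are made up for illustration, not Quipt’s actual logic:

```python
import numpy as np

SAFETY_RADIUS = 0.5   # metres: retreat if the person comes closer (assumed value)
FOLLOW_RADIUS = 1.5   # metres: approach if the person is farther away (assumed value)
STEP = 0.05           # metres moved per control cycle (assumed value)

def react(robot_pos, person_pos):
    """Return the robot's next position: follow at a distance, avoid up close."""
    robot_pos = np.asarray(robot_pos, dtype=float)
    person_pos = np.asarray(person_pos, dtype=float)
    offset = person_pos - robot_pos
    dist = np.linalg.norm(offset)
    if dist > FOLLOW_RADIUS:              # too far: step toward the person
        return robot_pos + STEP * offset / dist
    if dist < SAFETY_RADIUS:              # too close: step away from the person
        return robot_pos - STEP * offset / dist
    return robot_pos                      # comfortable range: hold position
```

Running this rule on every mocap update would make the robot trail the person at a respectful distance while backing off whenever they step inside its safety radius.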

Madlab foresees a future where industrial robots will move out of the factory. The next step is to create ways for machines to augment our abilities, not replace them. Reinventing the interfaces that link humans to robots will not only change how we use them, but could also open up new uses for them.