22 Jul 2024


UK dual-arm robot is sensitive enough to handle a crisp

Bristol University’s dual-arm robot holding a crisp

UK researchers have developed a dual-arm robot system that is so sensitive that it can safely handle fragile items such as individual potato crisps. The AI-guided bi-touch system, devised by scientists at the University of Bristol, is claimed to display a tactile sensitivity approaching that of humans. The researchers believe it could revolutionise delicate applications such as picking and handling fruit, and could help to create a sense of touch in artificial limbs.

The technology uses an AI (artificial intelligence) agent to interpret its environment through tactile feedback, and then control the robot’s behaviour to provide precise sensing and gentle object manipulation.

“With our bi-touch system, we can easily train AI agents in a virtual world within a couple of hours to achieve bimanual tasks that are tailored towards the touch,” explains Yijiong Lin from the University’s Faculty of Engineering. “And more importantly, we can directly apply these agents from the virtual world to the real world without further training. The tactile bimanual agent can solve tasks even under unexpected perturbations and manipulate delicate objects in a gentle way.”

Yijiong Lin is the lead author of a report* on the development.

The researchers, from the University’s Robotics Laboratory, believe that bimanual manipulation with tactile feedback will be key to human-level robot dexterity. Using twin robot arms in this way has not been explored widely, partly because of the limited availability of suitable hardware, combined with the complexity of designing effective controllers for such tasks. The team used recent advances in AI and robotic tactile sensing to develop their dual-arm robot system.

They created a simulation containing two robot arms with tactile sensors. They then designed a mechanism that encouraged the robot agents to learn to perform the bimanual tasks, and developed a real-world dual-arm robot system with tactile capabilities.
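To make the set-up concrete, here is a minimal, hypothetical sketch of the kind of interface such a simulation might expose. The class and variable names are invented for illustration (they are not from the Bristol team’s code): each step applies a grip force per arm and returns tactile readings for both arms, plus a reward for lifting the object gently.

```python
import random

class DualArmTactileEnv:
    """Toy stand-in for a simulated two-arm lifting task (illustrative only)."""

    def reset(self):
        self.height = 0.0            # how far the object has been lifted
        self.broken = False
        return self._tactile()

    def _tactile(self):
        # one contact-pressure reading per arm, with simulated sensor noise
        return [random.gauss(0.5, 0.05), random.gauss(0.5, 0.05)]

    def step(self, forces):
        left, right = forces
        if max(left, right) > 0.8:   # squeezing too hard breaks the object
            self.broken = True
        elif min(left, right) > 0.3: # a firm, balanced grip lifts it
            self.height += 0.1
        done = self.broken or self.height >= 0.5
        reward = -1.0 if self.broken else (1.0 if self.height >= 0.5 else 0.0)
        return self._tactile(), reward, done

env = DualArmTactileEnv()
obs = env.reset()
for _ in range(20):
    obs, reward, done = env.step((0.5, 0.5))   # a gentle, balanced grip
    if done:
        break
print(reward)  # a gentle grip lifts the object without breaking it
```

An agent trained against an interface like this sees only tactile observations and rewards, which is what makes transfer to a real robot with the same sensor layout plausible.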

The robot learns its bimanual skills through deep reinforcement learning (Deep-RL), one of the most advanced techniques in the field of robot learning. It teaches robots to do things by letting them learn from trial and error – similar to training a dog with rewards and punishments.

To manipulate objects, the robot learns to make decisions by attempting various behaviours to achieve tasks such as lifting objects without dropping or breaking them. If it fails, it learns what not to do; if it succeeds, it gets a reward. Over time, it figures out the best ways to grasp things. The AI agent relies entirely on proprioceptive feedback (a body’s ability to sense movement, action and location) and tactile feedback.
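The trial-and-error loop described above can be sketched in miniature. This is not the authors’ Deep-RL system, just a toy epsilon-greedy learner on an invented problem: the agent must discover a grip force that lifts a fragile object, where too little force drops it and too much crushes it.

```python
import random

FORCES = range(10)           # discrete grip-force levels 0..9 (invented scale)
SAFE_BAND = (3, 5)           # forces 3-5 lift the object without damage

def reward(force):
    # +1 for a safe lift, -1 for dropping or crushing the object
    return 1.0 if SAFE_BAND[0] <= force <= SAFE_BAND[1] else -1.0

# Epsilon-greedy learning: mostly exploit the best force found so far,
# occasionally explore a random one -- trial and error with rewards.
q = {f: 0.0 for f in FORCES}      # estimated value of each force
counts = {f: 0 for f in FORCES}
random.seed(0)

for episode in range(2000):
    if random.random() < 0.1:
        f = random.choice(list(FORCES))   # explore
    else:
        f = max(q, key=q.get)             # exploit
    counts[f] += 1
    q[f] += (reward(f) - q[f]) / counts[f]   # incremental average update

best = max(q, key=q.get)
print(best)  # the learned force falls inside the safe band
```

The real system replaces this lookup table with a deep neural network and the toy reward with signals computed from tactile and proprioceptive feedback, but the learning principle is the same.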

The Bristol researchers’ dual-arm robot has safely held and lifted items as fragile as single Pringles crisps.

“Our bi-touch system showcases a promising approach with affordable software and hardware for learning bimanual behaviours with touch in simulation, which can be directly applied to the real world,” says Professor Nathan Lepora, who co-authored the report. “Our tactile dual-arm robot simulation enables further research on a wider range of tasks, since the code will be open-source, which is ideal for developing other downstream tasks.”

“Our bi-touch system allows a tactile dual-arm robot to learn purely from simulation, and to achieve various manipulation tasks in a gentle way in the real world,” adds Yijiong Lin. “And now we can easily train AI agents in a virtual world within a couple of hours to achieve bimanual tasks that are tailored towards touch.”

* Bi-Touch: Bimanual Tactile Manipulation With Sim-to-Real Deep Reinforcement Learning, by Yijiong Lin, Nathan Lepora et al., published in IEEE Robotics and Automation Letters.
