Allowing robots to feel

With the help of machine learning, ETH researchers have developed a novel yet low-cost tactile sensor. The sensor measures force distribution at high resolution and with great accuracy, enabling robot arms to grasp sensitive or fragile objects.
Don’t drop it – a conventional robot gripper at work. (Photograph: Shutterstock)

We humans have no problem picking up fragile or slippery objects with our hands. Our sense of touch lets us feel whether we have a firm grasp on the object or if it’s about to slip through our fingers, so we can adjust the strength of our grip accordingly. Robot gripper arms tasked with picking up objects that are fragile or slippery or have a complex surface also require this kind of feedback.

Robotics researchers at ETH Zurich have now developed a tactile sensor that could come in handy in just such an instance – and marks what they see as a significant step towards “robotic skin”. The sensor’s extremely simple design makes it inexpensive to produce, as the engineers point out. Essentially, it consists of an elastic silicone “skin” with coloured plastic microbeads and a regular camera affixed to the underside.

Measurements using purely optical input

The sensor is vision-based: when it comes into contact with an object, an indentation appears in the silicone skin. This changes the pattern of the microbeads, which is registered by the fisheye camera on the underside of the sensor. From these changes in the pattern, the force distribution on the sensor can be calculated.
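To give a rough idea of how such a pipeline could be put together, the following Python sketch tracks the bead pattern between a reference image and the current camera image using optical flow and feeds the resulting displacement field to a previously trained regression model. The function names, array shapes and the choice of OpenCV-based flow tracking are illustrative assumptions, not the ETH researchers' implementation.

```python
# Hypothetical sketch of a vision-based tactile pipeline (not the ETH code):
# 1) track how the microbead pattern moves relative to a reference frame,
# 2) map that displacement field to a force distribution with a trained model.
import cv2
import numpy as np


def bead_displacements(reference_img, current_img):
    """Per-pixel displacement of the bead pattern (in pixels), via dense optical flow."""
    ref_gray = cv2.cvtColor(reference_img, cv2.COLOR_BGR2GRAY)
    cur_gray = cv2.cvtColor(current_img, cv2.COLOR_BGR2GRAY)
    # Farneback optical flow returns an (H, W, 2) array of x/y motion per pixel.
    flow = cv2.calcOpticalFlowFarneback(
        ref_gray, cur_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0,
    )
    return flow


def estimate_force_map(flow, model):
    """Map the displacement field to a per-pixel three-axis force estimate
    (one normal and two shear components) using a previously trained regressor."""
    features = flow.reshape(1, -1)      # flatten the field into one feature vector
    forces = model.predict(features)    # assumed output shape: (1, H * W * 3)
    return forces.reshape(flow.shape[0], flow.shape[1], 3)
```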

“Conventional sensors register the applied force at only a single point. By contrast, our robotic skin lets us distinguish between several forces acting on the sensor surface and calculate them with high resolution and accuracy,” says Carlo Sferrazza, a doctoral student in the group led by Raffaello D’Andrea, Professor of Dynamic Systems and Control at ETH Zurich. “We can even determine the direction from which a force is acting,” Sferrazza says. In other words, the researchers can identify not only forces that exert vertical pressure on the sensor, but also shear forces, which act laterally.

Data-driven development

To calculate which forces push the microbeads in which directions, the engineers use a comprehensive set of experimental data: in standardised, machine-controlled tests, they examined a variety of different kinds of contact with the sensor, precisely controlling and systematically varying the location of the contact, the force distribution and the size of the object making contact. The researchers recorded several thousand instances of contact and, with the help of machine learning, precisely matched them with the corresponding changes in the bead pattern.
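As a hedged illustration of what such a data-driven mapping could look like, the sketch below trains a small neural-network regressor to predict a force distribution from a flattened bead-displacement field. The dataset here is random placeholder data standing in for the real contact recordings, and the feature layout, network size and use of scikit-learn are assumptions for illustration only, not the authors' actual setup.

```python
# Illustrative training sketch (not the authors' code): learn the mapping from
# bead-pattern changes to force distributions using labelled contact examples.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_contacts = 5000                 # "several thousand instances of contact"
n_features = 16 * 16 * 2          # flattened bead displacement field (x and y)
n_outputs = 16 * 16 * 3           # force distribution: normal + two shear components

# Placeholder data; in reality these would be measured displacements and
# ground-truth force distributions from the standardised contact experiments.
X = rng.normal(size=(n_contacts, n_features))
y = rng.normal(size=(n_contacts, n_outputs))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = MLPRegressor(hidden_layer_sizes=(256, 256), max_iter=200, random_state=0)
model.fit(X_train, y_train)
print("held-out R^2:", model.score(X_test, y_test))
```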

(Video: ETH Zurich)

The thinnest sensor prototype the researchers have built so far is 1.7 centimetres thick and covers a measurement surface of 5 by 5 centimetres. However, the researchers are working on using the same technique to realise larger sensor surfaces that are equipped with several cameras, and can thus also recognise objects of complex shape. In addition, they aim to make the sensor thinner – they believe it is possible to achieve a thickness of just 0.5 centimetres using existing technology.

Robotics, sport and virtual reality

Because the elastic silicone is non-slip and the sensor can measure shear forces, it is well suited for use in robot gripper arms. “The sensor would recognise when an object threatens to slip out of the arm’s grasp so the robot can adjust its grip strength,” Sferrazza explains.

Researchers could also use such a sensor to test the hardness of materials or to map touches digitally. Integrated into wearables, the sensors could let cyclists measure how much force they apply to the pedals, or runners measure the force exerted inside their shoes while jogging. Lastly, such sensors could provide information important for developing tactile feedback, for example in virtual reality games.

References

Sferrazza C, Wahlsten A, Trueeb C, D’Andrea R: Ground Truth Force Distribution for Learning-Based Tactile Sensing: A Finite Element Approach. IEEE Access 2019, doi: 10.1109/ACCESS.2019.2956882

Sferrazza C, D’Andrea R: Design, Motivation and Evaluation of a Full-Resolution Optical Tactile Sensor. Sensors 2019, 19: 928, doi: 10.3390/s19040928