Robots can now identify objects and learn how to grip them with the help of a sensor-packed glove developed by a team of MIT scientists. Touch, one of the five human senses, can now be incorporated into robots: the MIT team compiled a massive dataset that enables an AI system to recognize objects through touch alone. This could help robots handle objects better, and it may also aid in designing prosthetics.
The sensor-packed glove is called the “scalable tactile glove,” or STAG, and is very economical: it is a simple knit glove fitted with about 550 tiny sensors and costs roughly $10 to produce. The glove is laminated with a conductive polymer that reacts to applied pressure. The researchers tested the glove on 26 different objects, including a soda can, scissors, a tennis ball, a spoon, a pen, and a mug.
As the wearer held each object, the sensors gathered pressure-signal data, which was then interpreted by a neural network. The system could identify objects with up to 76 percent accuracy and estimate their weights to within about 60 grams. The sensor data also allows researchers to see how different regions of the hand work together during a grasp. In a report, MIT researcher Subramanian Sundaram said, “We’ve always wanted robots to do what humans can do, like doing the dishes or other chores. If you want robots to do these things, they must be able to manipulate objects really well.”
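To give a rough sense of how touch-based identification from pressure data can work, here is a minimal sketch. It is not MIT's actual model (the team used a neural network trained on their STAG dataset); instead it classifies synthetic 548-element pressure readings with a simple nearest-centroid rule, and the object names, pressure signatures, and helper functions are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical object set; STAG was tested on 26 real objects.
OBJECTS = ["mug", "scissors", "tennis ball"]
NUM_SENSORS = 548  # roughly the sensor count reported for STAG

def make_pressure_map(obj_idx, noise=0.1):
    """Simulate one grasp: a flattened pressure reading for an object.

    Each synthetic object presses on a different block of sensors,
    standing in for the distinct contact patterns a real grasp produces.
    """
    base = np.zeros(NUM_SENSORS)
    base[obj_idx * 100 : obj_idx * 100 + 100] = 1.0
    return base + rng.normal(0.0, noise, size=NUM_SENSORS)

# "Training": average 20 simulated grasps per object into a centroid.
centroids = np.stack([
    np.mean([make_pressure_map(i) for _ in range(20)], axis=0)
    for i in range(len(OBJECTS))
])

def identify(reading):
    """Return the object whose training centroid is closest to the reading."""
    dists = np.linalg.norm(centroids - reading, axis=1)
    return OBJECTS[int(np.argmin(dists))]

print(identify(make_pressure_map(1)))  # prints "scissors"
```

The real system replaces the nearest-centroid rule with a learned neural network, which can pick up far subtler correlations between sensor regions than this toy signature matching.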