Researchers have compiled a massive dataset that enables an AI system to recognize objects through touch alone. The signals were collected by a user wearing a sensor-packed glove while handling a variety of objects. The data could help robots identify and manipulate objects and may inform the design of prosthetics.
The low-cost knitted glove, called the scalable tactile glove (STAG), is equipped with about 550 tiny sensors across nearly the entire hand. Each sensor captures pressure signals as a human interacts with objects in various ways. A neural network processes these signals and learns to associate pressure-signal patterns with specific objects. The trained system can then classify objects and estimate their weights by feel alone, with no visual input needed.
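As a rough illustration of that pipeline, the sketch below shows one way such a model could be structured. It is not the researchers' published architecture: the 32×32 pressure-map layout, the class count, and the layer sizes are all illustrative assumptions. The model uses a shared convolutional trunk with two heads, one for object classification and one for weight regression.

```python
# A minimal sketch (not the authors' published model) of classifying
# objects and regressing weight from glove pressure maps.
# Assumptions: readings are arranged as a single-channel 32x32 pressure
# image, and NUM_CLASSES and all layer sizes are placeholders.
import torch
import torch.nn as nn

NUM_CLASSES = 26  # hypothetical number of object categories

class TactileNet(nn.Module):
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        # Small convolutional trunk over the pressure map.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 16x16 -> 8x8
        )
        self.flatten = nn.Flatten()
        # Two heads: object classification and scalar weight regression.
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)
        self.weight_head = nn.Linear(32 * 8 * 8, 1)

    def forward(self, x: torch.Tensor):
        h = self.flatten(self.features(x))
        return self.classifier(h), self.weight_head(h)

# Example: a batch of 4 pressure frames, shape (batch, channel, H, W).
frames = torch.rand(4, 1, 32, 32)
logits, weights = TactileNet()(frames)
print(logits.shape, weights.shape)  # torch.Size([4, 26]) torch.Size([4, 1])
```

Sharing one trunk between the two heads lets both tasks reuse the same learned tactile features, mirroring how the system both identifies an object and estimates its weight from a single set of pressure signals.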
Prosthetics manufacturers could potentially use this information to, say, choose optimal spots for placing pressure sensors, and to help customize prosthetics to the tasks and objects a wearer regularly handles.
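To make that idea concrete, here is a toy sketch of one way recorded glove data might guide sensor placement: rank grid positions by how much their readings vary across grasps, on the assumption that near-constant sensors carry little information. The data array, grid size, and variance criterion are all hypothetical simplifications, not the researchers' method.

```python
# A toy sketch of picking candidate sensor sites from glove recordings
# by ranking positions on reading variance. The dataset here is random
# placeholder data; a real analysis would use recorded grasp frames.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical dataset: 1000 recorded frames from a 32x32 sensor grid.
frames = rng.random((1000, 32, 32))

# Per-sensor variance across all frames; sensors whose readings barely
# change tell us little about what the hand is holding.
variance = frames.var(axis=0)

# Indices of the k most variable (most informative) sensor positions.
k = 20
flat_order = np.argsort(variance, axis=None)[::-1][:k]
rows, cols = np.unravel_index(flat_order, variance.shape)
for r, c in zip(rows, cols):
    print(f"candidate sensor site: row {r}, col {c}")
```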