Humans can sense, weigh, and grasp diverse objects, deducing their physical properties while exerting appropriate forces, a task that remains challenging for modern robots. Studying the mechanics of human grasping can complement vision-based approaches to robotic object manipulation. Such approaches require large-scale tactile datasets with high spatial resolution. However, no large-scale tactile dataset of human grasping covers the whole hand, because densely instrumenting the human hand with tactile sensors is challenging. The long-term goal is therefore to observe and learn from successful everyday human-object interactions in order to aid the development of robots and prosthetics.