In a world where industry demands ever more complex automated machines, robot grasping still falls far short of human capabilities. Despite recent advances in computer vision and grasp planning, picking never-before-seen objects remains a challenge for robots. Researchers are therefore combining vision with tactile sensing to improve the performance of modern intelligent machines. In this thesis, we present a novel way to improve robotic grasping using tactile sensors and an unsupervised feature-learning algorithm. Using a test bench and sensors at the Control and Robotics (CoRo) laboratory of the ÉTS, we developed and tested a series of classifiers to predict the outcome of a robotic grasp. Our method improves upon the results obtained with hand-crafted features. We collected data from 100 different everyday objects, executing 10 grasping attempts per object, for a total of 1000 grasping attempts. The best system we developed recognized grasp failures 84.23% of the time.
Keywords:
Tactile sensors; Unsupervised learning; Sparse Coding; Grasp assessment