Robots are expected to operate autonomously in unstructured, real-world environments. For effective physical interaction with the world, robots must build and refine their understanding of the environment through sensory feedback. When robots operate within opaque granular materials, tactile and proprioceptive feedback can be more informative than visual feedback; to date, however, tactile feedback has been used primarily in open-air environments rather than within granular materials. Our long-term objective is to leverage tactile sensors to develop efficient algorithms that enable robots to infer environmental conditions and to plan exploratory movements that reduce uncertainty in their models of the world. Motivated by the need to keep humans out of harm’s way in search and rescue and other field operations, we address the challenge of using tactile feedback to locate objects buried in granular materials.
In study #1, we designed a tactile perception pipeline for sensorized robot fingertips that directly interact with granular materials in teleoperated systems. We proposed the Sparse-Fusion Recurrent Neural Network (SF-RNN), an architecture that leverages multimodal tactile sensor data to detect contact with an object buried within granular materials, and used it to classify contact states in five different granular materials. We also constructed a belief map that combined probabilistic contact state estimates with fingertip location.
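To make the multimodal classification step concrete, the sketch below shows a minimal sparse-fusion recurrent contact classifier in PyTorch. It is a hypothetical illustration rather than the actual SF-RNN: the modality choices (pressure taxels and an IMU), layer sizes, L1 sparsity penalty, and number of contact classes are all assumptions.

```python
# Hypothetical sketch of a sparse-fusion recurrent contact classifier,
# not the actual SF-RNN from study #1. Modality dimensions (19 pressure
# taxels, 3 IMU channels), layer sizes, and class count are assumptions.
import torch
import torch.nn as nn

class SparseFusionRNN(nn.Module):
    def __init__(self, modality_dims=(19, 3), hidden_size=64, num_classes=2):
        super().__init__()
        # One small encoder per tactile modality.
        self.encoders = nn.ModuleList(
            nn.Sequential(nn.Linear(d, 32), nn.ReLU()) for d in modality_dims
        )
        # Recurrent layer fuses the per-modality embeddings over time.
        self.rnn = nn.GRU(input_size=32 * len(modality_dims),
                          hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, inputs):
        # inputs: list of tensors, each of shape (batch, time, modality_dim)
        fused = torch.cat([enc(x) for enc, x in zip(self.encoders, inputs)],
                          dim=-1)
        out, _ = self.rnn(fused)
        # Classify the contact state from the final hidden state.
        return self.head(out[:, -1, :])

    def l1_penalty(self):
        # Added to the training loss to encourage sparse encoder parameters
        # (one plausible reading of "sparse fusion").
        return sum(p.abs().sum()
                   for enc in self.encoders for p in enc.parameters())

# Example usage with random stand-ins for tactile sequences:
model = SparseFusionRNN()
pressure = torch.randn(8, 50, 19)          # (batch, time, pressure taxels)
imu = torch.randn(8, 50, 3)                # (batch, time, IMU channels)
logits = model([pressure, imu])            # (8, num_classes)
labels = torch.randint(0, 2, (8,))         # dummy contact/no-contact labels
loss = nn.functional.cross_entropy(logits, labels) + 1e-4 * model.l1_penalty()
```

In this style of model, each modality is embedded separately before fusion, so the recurrent layer operates on comparable learned features rather than raw, mismatched sensor channels.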
In study #2, we developed a framework for tactile perception, mapping, and haptic exploration that enables the autonomous localization of objects buried in granular materials. The robot explored densely packed sand mixtures using sensor models that account for granular material characteristics and aid in interpreting the interaction forces between the robot and its environment. The haptic exploration strategy was designed to efficiently locate and refine the outline of a buried object while minimizing potentially damaging physical interactions with it. Sparse, local tactile measurements were fused into global, continuous occupancy maps.
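As a rough illustration of fusing sparse tactile evidence into a global map, the sketch below implements a standard log-odds occupancy grid update from probabilistic contact detections. Study #2 produced continuous occupancy maps informed by granular-material sensor models; the discrete grid, cell resolution, and probability clipping here are simplifying assumptions.

```python
# Minimal sketch: fusing sparse, probabilistic contact detections into a
# global occupancy grid with log-odds updates. Grid shape, resolution, and
# clipping bounds are illustrative assumptions, not values from study #2.
import numpy as np

class OccupancyMap:
    def __init__(self, shape=(100, 100), resolution=0.005):
        self.log_odds = np.zeros(shape)   # zero log-odds = prior P(occ) = 0.5
        self.resolution = resolution      # meters per grid cell

    def update(self, xy, p_contact):
        """Fuse one fingertip measurement at world position xy (meters)
        with contact probability p_contact from the tactile classifier."""
        p = np.clip(p_contact, 1e-3, 1.0 - 1e-3)
        i, j = (np.asarray(xy) / self.resolution).astype(int)  # cell indices
        # Bayesian update in log-odds form: independent evidence adds.
        self.log_odds[i, j] += np.log(p / (1.0 - p))

    def probability(self):
        # Recover per-cell P(occupied) from the accumulated log-odds.
        return 1.0 / (1.0 + np.exp(-self.log_odds))

# Example: repeated probes sharpen the belief at a probed cell.
omap = OccupancyMap()
omap.update((0.10, 0.12), p_contact=0.9)  # likely contact
omap.update((0.10, 0.12), p_contact=0.8)  # same region, more evidence
omap.update((0.20, 0.05), p_contact=0.1)  # likely free space
belief = omap.probability()               # (100, 100) array of probabilities
```

Because independent measurements add in log-odds space, repeated fingertip probes of the same region progressively sharpen the map's estimate of a buried object's outline.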
In summary, we developed tactile-based frameworks for perception, planning, and mapping to address the challenging task of localizing objects buried within granular materials. Our work can serve as a foundation for more complex, autonomous robotic behaviors such as the excavation and bimanual retrieval of fragile, buried objects.