Soft robotics is changing how we think about machines, replacing traditional rigid components with flexible, pliable materials. These robots, however, have been held back by limited sensory capabilities. Effective robotic manipulation requires both tactile sensing, the ability to feel what's being touched, and proprioception, the awareness of one's own position in space. Until recently, both capabilities have been largely absent from soft robotic systems.
In research published recently, scientists from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) developed technologies that markedly improve robots' ability to perceive and interact with their environment. These include vision systems for object identification and classification, along with touch-sensitive components that enable new levels of dexterity.
"Our goal is to create robots that can experience the world through touch, much like humans do," explains Daniela Rus, CSAIL Director and the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science. "These soft robotic hands feature sensorized skins that enable them to handle everything from fragile items like potato chips to heavier objects such as milk bottles with remarkable precision."
The research was funded by the Toyota Research Institute, reflecting its potential relevance to future automotive and manufacturing applications.
One study builds on previous work by MIT and Harvard University, in which scientists developed a soft robotic gripper inspired by origami. The cone-shaped structure collapses around objects much like a Venus flytrap and can lift items up to 100 times its own weight.
To give this system human-like sensitivity, the research team added tactile sensors made from latex "bladders" connected to pressure transducers. The sensors let the gripper not only handle extremely delicate objects such as potato chips without damage, but also classify them, giving the robot a sense of what it's manipulating while it maintains a gentle touch.
In object classification tests, the sensors correctly identified ten different objects with over 90 percent accuracy, even when objects slipped during manipulation.
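As a rough illustration of how bladder-pressure signals might support this kind of classification (the paper's actual pipeline isn't detailed here, so the sensor count, features, and choice of classifier below are all assumptions), one could summarize each grasp's pressure traces into simple features and train an off-the-shelf classifier:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical setup: 4 latex-bladder sensors, each sampled for 200 time
# steps per grasp; shapes and feature choices are illustrative, not the
# paper's actual pipeline.
N_SENSORS, N_STEPS = 4, 200

def grasp_features(pressure):
    """Summarize one grasp's pressure traces (N_SENSORS x N_STEPS array)."""
    return np.concatenate([
        pressure.mean(axis=1),                            # steady contact pressure
        pressure.max(axis=1),                             # peak pressure per bladder
        np.abs(np.diff(pressure, axis=1)).mean(axis=1),   # vibration/slip proxy
    ])

# Placeholder training data: 500 recorded grasps, labels 0-9 for ten objects.
rng = np.random.default_rng(0)
X_raw = rng.random((500, N_SENSORS, N_STEPS))
y = rng.integers(0, 10, size=500)

X = np.array([grasp_features(g) for g in X_raw])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Classify a new grasp from its raw pressure traces.
new_grasp = rng.random((N_SENSORS, N_STEPS))
print("predicted object id:", clf.predict([grasp_features(new_grasp)])[0])
```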
"Unlike many alternative soft tactile sensors, our design can be rapidly manufactured, easily integrated into existing grippers, and demonstrates exceptional sensitivity and reliability," notes Josie Hughes, MIT postdoc and lead author of the sensor research paper. "We believe this technology opens new avenues for soft sensing across numerous manufacturing applications, including packaging and material handling operations."
In a second paper, researchers introduced "GelFlex," a soft robotic finger that uses embedded cameras and deep learning to enable high-resolution tactile sensing and proprioception.
The gripper, which resembles the two-fingered cup grippers found at beverage dispensers, uses a tendon-driven mechanism to actuate its fingers. When tested on metal objects of various shapes, the system recognized them with over 96 percent accuracy.
"Our GelFlex finger delivers exceptional accuracy in proprioception and can reliably predict the characteristics of grasped objects while withstanding significant impacts without damage to itself or the environment," explains Yu She, lead author of the GelFlex paper. "By combining flexible exoskeleton constraints with high-resolution internal camera sensing, we're unlocking tremendous potential for soft manipulators in numerous applications."
The Magic Ball Gripper
The magic ball gripper consists of a soft origami structure encased in a flexible balloon. When vacuum pressure is applied to the balloon, the origami structure closes around the target object, and the gripper conforms to its shape.
While this mechanism lets the gripper handle a wide variety of objects, from soup cans and hammers to wine glasses, drones, and even single broccoli florets, the finer points of delicate manipulation and object understanding remained challenging until the sensors were integrated.
When the sensors encounter force or strain, internal pressure changes occur, allowing the research team to measure these variations and create recognition patterns for future interactions.
Beyond the latex sensors, the team developed algorithms that use this pressure feedback to give the gripper a human-like combination of strength and precision; 80 percent of tested objects were grasped without any damage.
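A minimal sketch of what such a pressure-feedback loop could look like (the actual control law isn't described here; the target pressure, gain, and the simulated sensor below are assumptions): the controller tightens the vacuum-driven grip until the bladders report a target contact pressure, easing off if it overshoots.

```python
import time

# Illustrative constants; real values would come from calibration.
TARGET_PRESSURE = 5.0   # contact pressure (kPa) that holds without crushing
GAIN = 0.5              # proportional gain on the vacuum command
DT = 0.02               # 50 Hz control loop

def read_contact_pressure(vacuum):
    """Stand-in for the latex-bladder transducers: here we pretend contact
    pressure grows linearly with how hard the vacuum pulls the grip closed."""
    return 8.0 * vacuum

def grasp_gently(steps=200):
    """Proportional control: tighten the vacuum-driven grip until the sensed
    contact pressure reaches the target, easing off if it overshoots."""
    vacuum = 0.0  # 0.0 = fully open, 1.0 = fully closed
    for _ in range(steps):
        error = TARGET_PRESSURE - read_contact_pressure(vacuum)
        vacuum = min(max(vacuum + GAIN * error * DT, 0.0), 1.0)
        time.sleep(DT)
    return read_contact_pressure(vacuum)

print(f"settled contact pressure: {grasp_gently():.2f} kPa")
```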
The researchers extensively tested the sensor-equipped gripper on various household items, ranging from heavy bottles to small, fragile objects, including cans, apples, toothbrushes, water bottles, and bags of cookies.
Looking ahead, the team aims to scale the approach, using computational design and reconstruction methods to improve the sensors' resolution and coverage. Their long-term vision is fluidic sensing skins that are both scalable and highly sensitive.
Hughes co-authored the research paper with Rus, with their findings presented virtually at the 2020 International Conference on Robotics and Automation.
The GelFlex Finger
In the second paper, a CSAIL team focused on giving soft robotic grippers more nuanced, human-like senses. Soft fingers allow a wide range of deformations, but controlled manipulation requires tactile and proprioceptive sensing. The team used embedded cameras with wide-angle "fisheye" lenses to capture detailed information about the finger's deformations.
To create GelFlex, the researchers fabricated a soft, transparent silicone finger, placing one camera near the fingertip and another in the middle of the finger. They painted reflective ink on the finger's front and side surfaces and mounted LED lights on the back, allowing the internal fisheye cameras to observe the state of the finger's surfaces in real time.
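The optics side can be sketched as follows, assuming off-the-shelf fisheye handling (the camera indices and calibration matrices here are placeholders, not values from the paper): each internal camera's frame is undistorted so that downstream processing sees a rectified view of the painted finger surface.

```python
import cv2
import numpy as np

# Hypothetical camera indices for the fingertip and mid-finger cameras;
# on real hardware these depend on how the devices enumerate.
TIP_CAM, MID_CAM = 0, 1

# Placeholder fisheye intrinsics (K) and distortion coefficients (D); a real
# system would obtain these from a calibration step (e.g., cv2.fisheye.calibrate).
K = np.array([[300.0, 0.0, 320.0],
              [0.0, 300.0, 240.0],
              [0.0, 0.0, 1.0]])
D = np.zeros((4, 1))

def capture_undistorted(cam_index):
    """Grab one frame from an internal fisheye camera and undistort it so
    downstream processing sees a rectified view of the finger surface."""
    cap = cv2.VideoCapture(cam_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError(f"camera {cam_index} returned no frame")
    return cv2.fisheye.undistortImage(frame, K, D, Knew=K)

tip_view = capture_undistorted(TIP_CAM)   # camera near the fingertip
mid_view = capture_undistorted(MID_CAM)   # camera in the middle of the finger
```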
The team trained neural networks to extract information from the internal camera feeds for feedback. One network learned to predict GelFlex's bending angle, while another was trained to estimate the shape and size of the grasped object. The system enabled the gripper to pick up various items, including Rubik's cubes, DVD cases, and aluminum blocks.
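As a hedged sketch of what the bending-angle network might look like (the paper's actual architecture, input resolution, and training setup are not given here, so everything below is an assumption), a small convolutional network can regress a single angle from an internal camera frame:

```python
import torch
import torch.nn as nn

class BendAngleNet(nn.Module):
    """Toy CNN that maps one internal camera frame to a bending angle.
    Layer sizes and the 3x128x128 input resolution are illustrative."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 1)  # single output: bending angle (degrees)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = BendAngleNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One training step on a placeholder batch of frames and measured angles.
frames = torch.rand(8, 3, 128, 128)   # internal camera images
angles = torch.rand(8, 1) * 90.0      # ground-truth bend, e.g. from an encoder
loss = loss_fn(model(frames), angles)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.3f}")
```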
During testing, the average positional error while gripping was less than 0.77 millimeters, better than that of a human finger. In a second set of tests, in which the gripper had to grasp and recognize cylinders and boxes of various sizes, only three of 80 trials were misclassified, an accuracy of about 96 percent.
Future work includes improving the proprioception and tactile-sensing algorithms, and extending vision-based sensors to estimate more complex finger configurations, such as twisting or lateral bending, movements that challenge conventional sensors but appear achievable with embedded cameras.
Yu She co-authored the GelFlex research paper with MIT graduate student Sandra Q. Liu, Peiyu Yu of Tsinghua University, and MIT Professor Edward Adelson. Their findings were presented virtually at the 2020 International Conference on Robotics and Automation.