“Sensorised” skin for robot’s automated orientation and self-control

By OEM Update Editorial April 1, 2020 1:52 pm

Flexible sensors and an artificial intelligence model tell deformable robots how their bodies are positioned in a 3D environment.
For the first time, MIT researchers have enabled a soft robotic arm to understand its configuration in 3D space by leveraging only motion and position data from its own “sensorised” skin.

Soft robots, constructed from highly compliant materials similar to those found in living organisms, are being championed as safer, more adaptable, more resilient, and bioinspired alternatives to traditional rigid robots. But giving autonomous control to these deformable robots is a monumental task, as they can move in a virtually infinite number of directions at any given moment. That makes it difficult to train the planning and control models that drive automation.

Traditional methods to achieve autonomous control use large systems of multiple motion-capture cameras that provide the robots with feedback about 3D movement and positions. But those are impractical for soft robots in real-world applications.

In a paper published in the journal IEEE Robotics and Automation Letters, the researchers describe a system of soft sensors that cover a robot’s body to provide “proprioception”, meaning awareness of the motion and position of its body. That feedback runs into a novel deep-learning model that sifts through the noise and captures clear signals to estimate the robot’s 3D configuration. The researchers validated their system on a soft robotic arm resembling an elephant trunk that can predict its own position as it autonomously swings around and extends.

The sensors can be fabricated using off-the-shelf materials, meaning any lab can develop their own systems, says Ryan Truby, a postdoc in the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) who is co-first author on the paper along with CSAIL postdoc Cosimo Della Santina.

“We’re sensorising soft robots to get feedback for control from sensors, not vision systems, using a very easy, rapid method for fabrication,” he says. “We want to use these soft robotic trunks, for instance, to orient and control themselves automatically, to pick things up and interact with the world. This is a first step toward that type of more sophisticated automated control.”

One future aim is to help make artificial limbs that can more dexterously handle and manipulate objects in the environment. “Think of your own body: You can close your eyes and reconstruct the world based on feedback from your skin,” says co-author Daniela Rus, Director of CSAIL and the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science. “We want to design those same capabilities for soft robots.”

Shaping soft sensors
A longtime goal in soft robotics has been fully integrated body sensors. Traditional rigid sensors detract from a soft robot body’s natural compliance, complicate its design and fabrication, and can cause various mechanical failures. Soft-material-based sensors are a more suitable alternative, but they require specialised materials and methods for their design, making them difficult for many robotics labs to fabricate and integrate into soft robots.

“Learning” configurations
As hypothesised, the sensors did capture the trunk’s general movement. But they were really noisy. “Essentially, they’re nonideal sensors in many ways,” Truby says. “But that’s just a common fact of making sensors from soft conductive materials. Higher-performing and more reliable sensors require specialised tools that most robotics labs do not have.”

To estimate the soft robot’s configuration using only the sensors, the researchers built a deep neural network to do most of the heavy lifting, sifting through the noise to capture meaningful feedback signals. The researchers also developed a new model to kinematically describe the soft robot’s shape that vastly reduces the number of variables the model needs to process.
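One widely used way to cut the variable count for a trunk-like soft arm is a piecewise-constant-curvature parameterisation, which describes each segment with just three numbers: a curvature, a bending-plane angle, and an arc length. The sketch below is a hypothetical illustration of that general idea, not the paper’s actual kinematic model; it computes the tip position of a single constant-curvature segment.

```python
import numpy as np

def pcc_tip_position(kappa, phi, length):
    """Tip position of one constant-curvature segment.

    kappa: curvature (1/m), phi: bending-plane angle (rad),
    length: arc length (m). Returns the tip as an (x, y, z) array.
    """
    if abs(kappa) < 1e-9:                    # straight segment: tip lies on the base axis
        return np.array([0.0, 0.0, length])
    r = 1.0 / kappa                          # bending radius
    theta = kappa * length                   # total bend angle swept by the arc
    x_plane = r * (1.0 - np.cos(theta))      # in-plane offset from the base axis
    z = r * np.sin(theta)                    # height along the base axis
    # Rotate the in-plane offset by phi to place the bend in 3D.
    return np.array([np.cos(phi) * x_plane,
                     np.sin(phi) * x_plane,
                     z])

# A 0.3 m segment bent into a quarter circle (bend angle pi/2) in the x-z plane:
tip = pcc_tip_position(kappa=np.pi / (2 * 0.3), phi=0.0, length=0.3)
```

With three parameters per segment instead of a dense cloud of tracked points, a learning model has far fewer outputs to estimate, which is the spirit of the dimensionality reduction described above.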

In experiments, the researchers had the trunk swing around and extend itself in random configurations over approximately an hour and a half. They used a traditional motion-capture system for ground-truth data. In training, the model analysed data from the sensors to predict a configuration and compared its predictions to the ground-truth data being collected simultaneously. In doing so, the model “learns” to map signal patterns from its sensors to real-world configurations. Results indicated that, for steadier configurations, the robot’s estimated shape matched the ground truth.
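In outline, this kind of supervised mapping from noisy sensor signals to a configuration can be sketched as a small regression network trained against ground-truth poses. Everything below is a self-contained, simulated stand-in: the sensor channel count, network size, learning rate, and the linear “true” sensor-to-pose relation are illustrative assumptions, not the paper’s setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 12 skin-sensor channels per sample; the "ground truth"
# is a 3-D configuration such as a motion-capture system would provide.
true_map = rng.normal(size=(12, 3))        # stand-in sensor-to-pose relation
X = rng.normal(size=(500, 12))             # clean sensor readings
Y = X @ true_map                           # simulated mocap ground truth
X = X + 0.1 * rng.normal(size=X.shape)     # add sensor noise

# One-hidden-layer network trained by gradient descent on mean squared error.
W1 = 0.1 * rng.normal(size=(12, 32)); b1 = np.zeros(32)
W2 = 0.1 * rng.normal(size=(32, 3));  b2 = np.zeros(3)
lr, losses = 1e-2, []

for step in range(2000):
    H = np.tanh(X @ W1 + b1)               # hidden activations
    P = H @ W2 + b2                        # predicted configuration
    err = P - Y                            # compare prediction to ground truth
    losses.append((err ** 2).mean())
    dP = 2 * err / len(X)                  # gradient of MSE w.r.t. predictions
    dW2 = H.T @ dP; db2 = dP.sum(axis=0)
    dH = (dP @ W2.T) * (1 - H ** 2)        # backprop through tanh
    dW1 = X.T @ dH; db1 = dH.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

Training drives the loss down as the network learns to map noisy signal patterns to poses, mirroring the learning loop described above, though the real system operates on far messier soft-sensor data.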

Next, the researchers aim to explore new sensor designs for improved sensitivity and to develop new models and deep-learning methods to reduce the training required for every new soft robot. They also hope to refine the system to better capture the robot’s full dynamic motions.

Currently, the neural network and sensor skin are not sensitive enough to capture subtle motions or dynamic movements. But for now, this is an important first step for learning-based approaches to soft robotic control, Truby says: “Like our soft robots, living systems don’t have to be totally precise. Humans are not precise machines, compared to our rigid robotic counterparts, and we do just fine.”
Authored by: Rob Matheson, MIT News Office
