Laundry assistance might come from robots that can sense layers of clothing.


Robots can now feel layers of clothing instead of just seeing them, thanks to new research from Carnegie Mellon University’s Robotics Institute. The work could one day help robots assist people with household chores such as folding laundry.

Humans pick up objects using their senses of sight and touch together, and give it little thought because it is so natural. For robots, however, these activities are quite challenging. Until recently, touch was difficult to replicate in robotics because the data the sense collects is hard to quantify.

“Humans look at something, we reach for it, then we use touch to make sure that we’re in the right position to grab it,” said David Held, an assistant professor in the School of Computer Science and director of the Robots Perceiving and Doing (R-Pad) Lab. “Tactile perception is something most people do naturally. We don’t give it much thought, so we are unaware of how important it is.”

“How can this be fixed?” Held asked. “Perhaps tactile sensing is what we need.”

The answer was ReSkin, developed by researchers at Carnegie Mellon and Meta AI. The free, open-source touch-sensing “skin” is a thin, elastic polymer embedded with magnetic particles that measures three-axis tactile signals. Researchers recently used ReSkin to let a robot feel layers of cloth rather than relying on vision sensors alone.

Thomas Weng, a Ph.D. student in the R-Pad Lab, worked on the project alongside RI postdoc Daniel Seita and graduate student Sashank Tirumala. “We can achieve tactile sensing by measuring the changes in the magnetic fields caused by depressions or movement of the skin,” Weng said. “By pinching with the sensor, we can use this tactile sensing to determine how many layers of cloth we’ve picked up.”
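The layer-counting idea can be sketched in code as a classifier over three-axis magnetic readings. The sketch below is purely illustrative: the sensor values are simulated with an assumed "more layers, more deformation" model, and the actual system learns from real ReSkin signals rather than a toy nearest-centroid rule.

```python
import random

def simulate_reading(n_layers, noise=0.05):
    """Hypothetical ReSkin-style reading: pinching more layers deforms the
    skin more, shifting the measured magnetic field on each of three axes.
    This linear model is an assumption for illustration, not real data."""
    base = 0.3 * n_layers
    return [base + random.gauss(0, noise) for _ in range(3)]  # (Bx, By, Bz)

def train_centroids(n_classes, samples_per_class=50):
    """Average simulated readings per layer count (nearest-centroid model)."""
    centroids = {}
    for n in range(n_classes):
        readings = [simulate_reading(n) for _ in range(samples_per_class)]
        centroids[n] = [sum(axis) / samples_per_class for axis in zip(*readings)]
    return centroids

def predict_layers(reading, centroids):
    """Classify a reading as the layer count with the closest centroid."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda n: sq_dist(reading, centroids[n]))

random.seed(0)
centroids = train_centroids(n_classes=3)  # grasping 0, 1, or 2 layers
print(predict_layers(simulate_reading(2), centroids))
```

In this toy setup the classifier recovers the layer count because the simulated field shift per layer is large relative to the noise; with real cloth the signal is far messier, which is why the researchers turn to learned models.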

Unlike previous work that used tactile sensing to grasp rigid objects, this task is harder because cloth is “deformable”: it changes shape when you touch it. Adjusting the robot’s grasp on the fabric also changes the signals the sensor reports.

“Because the profile of this sensor is so thin, we were able to execute this very fine maneuver, sliding it between cloth layers, which we can’t do with other sensors, particularly optical-based sensors,” Weng said. “We used it to do tasks that were not achievable before.”

But there is plenty of research to be done before the laundry basket can be handed over to a robot. Smoothing crumpled cloth, choosing the correct number of layers to fold and folding the cloth in the right direction are the first steps.

“It’s basically an exploration of what we can do with this new sensor,” Weng said. “We’re exploring how to get robots to feel soft objects with this magnetic skin, and simple strategies to manipulate cloth that we’ll need for robots to eventually be able to do our laundry.”

The team will present its study, “Learning to Singulate Cloth Layers Using Tactile Feedback,” at the 2022 International Conference on Intelligent Robots and Systems, which takes place in Kyoto, Japan, from October 23–27. Additionally, it won the Best Paper prize at the 2022 RoMaDO-SI workshop of the conference.



Laundry assistance from robots may not be far off, and it could make an everyday chore much easier.
