
New Robot Mimics Human Dressing Actions

In a groundbreaking development, scientists have engineered a new robot that mirrors the two-handed movements of caregivers, significantly enhancing the dressing process for the elderly and disabled population. This innovation, inspired by healthcare professionals, marks a departure from previous one-armed assistive dressing robots that often proved uncomfortable or impractical for individuals in care.

The research, led by Dr. Jihong Zhu from the University of York’s Institute for Safe Autonomy, introduced a bimanual cooperative scheme for robotic dressing assistance. This scheme involves an interactive robot working in tandem with the human, providing support and guidance during the dressing process, while the dressing robot performs the task.

The team identified a key feature affecting the dressing action: the elbow angle. Using this feature, they developed an optimal strategy for the interactive robot. They also defined a dressing coordinate based on the arm’s posture to better encode the dressing policy.
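The paper does not spell out how the elbow angle is computed, but it is commonly obtained from tracked 3-D joint positions. A minimal sketch (the joint positions and the helper name `elbow_angle` are illustrative, not from the paper):

```python
import numpy as np

def elbow_angle(shoulder, elbow, wrist):
    """Interior angle at the elbow (radians) from three 3-D joint positions."""
    upper = np.asarray(shoulder, float) - np.asarray(elbow, float)  # elbow -> shoulder
    fore = np.asarray(wrist, float) - np.asarray(elbow, float)      # elbow -> wrist
    cos_a = np.dot(upper, fore) / (np.linalg.norm(upper) * np.linalg.norm(fore))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))  # clip guards against rounding error

# A fully extended arm gives an interior angle of pi (180 degrees):
angle = elbow_angle([0, 0, 0], [0.3, 0, 0], [0.6, 0, 0])
```

Tracking this single scalar over time gives the interactive robot a compact signal for deciding when and how to support the arm.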

The bimanual dressing scheme was validated through extensive experiments and an ablation study. The results showed that two hands were indeed necessary for dressing, and that specific angles and movements were required to reduce discomfort and distress for the individual in care.

Dr. Zhu and his team used a method called “learning from demonstration” to teach the robot. The robot observed human movements and, through AI, generated a model that mimics how human helpers perform their tasks. This approach gave the researchers enough data to show that dressing requires two hands rather than one, to characterize the angles the arms make, and to establish when a human needs to intervene to stop or alter a movement.
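In its simplest form, learning from demonstration means recording (state, action) pairs while a human guides the robot and fitting a policy to reproduce them. The sketch below illustrates the idea with a synthetic dataset and a linear least-squares fit; the state features, data, and policy form are all illustrative assumptions, not the method from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical demonstration data: each state is (elbow angle, garment progress);
# each action is a 2-D end-effector velocity recorded while a human guides the arm.
states = rng.uniform(0.0, 1.0, size=(200, 2))
true_W = np.array([[0.8, -0.2], [0.1, 0.5]])          # unknown "human" mapping
actions = states @ true_W + rng.normal(scale=0.01, size=(200, 2))

# Behavioural cloning in its simplest form: least-squares fit of a linear policy.
W, *_ = np.linalg.lstsq(states, actions, rcond=None)

def policy(state):
    """Predict the dressing action for a new state."""
    return np.asarray(state) @ W
```

Real systems typically use richer policy classes (e.g., dynamic movement primitives or neural networks), but the workflow — demonstrate, record, fit, replay — is the same.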

The team also developed algorithms that make the robotic arm flexible enough to perform the pulling and lifting actions of dressing, while letting the gentle touch of a human hand stop an action. The robot can be guided out of an action by a hand moving it left or right, up or down, without the robot resisting.
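This kind of yielding behaviour is commonly achieved with an admittance (or impedance) control law, in which the arm behaves like a virtual mass-damper that external forces can push around. A minimal sketch of one such law, with illustrative gains not taken from the paper:

```python
import numpy as np

def admittance_step(vel, f_ext, mass=2.0, damping=8.0, dt=0.01):
    """One Euler step of a virtual mass-damper: M*a + D*v = F_ext.
    An external push accelerates the virtual mass; with no push,
    damping brings the arm to rest instead of resisting the human."""
    acc = (np.asarray(f_ext, float) - damping * np.asarray(vel, float)) / mass
    return vel + acc * dt

# A steady sideways push of 4 N settles at v = F / D = 0.5 m/s:
v = np.zeros(3)
for _ in range(5000):
    v = admittance_step(v, [4.0, 0.0, 0.0])
```

Because the commanded velocity tracks the sensed external force, a light human touch redirects the arm smoothly, and releasing it lets the motion damp out.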

This development could revolutionize the social care system, allowing care-workers to spend less time on practical tasks and more time on the health and mental well-being of individuals. However, Dr. Zhu emphasized the importance of ensuring the robot performs the task efficiently and safely, and that it can be halted or changed mid-action should an individual desire it. Trust is a significant part of this process, and the next step in this research is testing the robot’s safety limitations and whether it will be accepted by those who need it most.

The research was funded by the Honda Research Institute Europe and was a collaboration with researchers from TU Delft and Honda Research Institute Europe. The findings were published in the IEEE Transactions on Robotics.

In an era where the integration of technology and healthcare is becoming increasingly prevalent, this new dressing robot could significantly improve the lives of the elderly and disabled, providing a practical, comfortable, and dignified solution to a daily challenge.

Source: Jihong Zhu, Michael Gienger, Giovanni Franzese, Jens Kober. Do You Need a Hand? – A Bimanual Robotic Dressing Assistance Scheme. IEEE Transactions on Robotics, 2024; 40: 1906. DOI: 10.1109/TRO.2024.3366008

