Humanoid Robot Learns to Waltz by Mimicking Human Movements

An AI system called ExBody2 enables humanoid robots to replicate human movements with remarkable fluidity, from dancing to walking and even throwing punches. Developed by researchers at the University of California, San Diego, the system uses motion capture data and reinforcement learning to teach robots to perform a wide range of lifelike actions.

By Alex Wilkins 16 January 2025

A humanoid robot gracefully waltzes, guided by an AI trained on human motion capture recordings.

An innovative AI system is empowering humanoid robots to mirror human movements with unprecedented smoothness, paving the way for robots that can walk, dance, and even throw punches in ways that feel authentically human.

While robots like those from Boston Dynamics showcase impressive acrobatics, these feats are often limited to pre-programmed sequences. Expanding a robot’s repertoire to include a broader range of lifelike movements remains a significant challenge.

To address this, Xuanbin Peng and his team at the University of California, San Diego, have developed ExBody2, an AI system designed to help robots replicate a wide array of human actions with natural fluidity.

How ExBody2 Works

Peng and his colleagues began by compiling a database of movements that a humanoid robot could potentially perform. This included everything from basic actions like standing and walking to more intricate maneuvers, such as complex dance steps. The database was built using motion capture recordings from hundreds of human volunteers, gathered during previous research initiatives.

“Humanoid robots share a similar physical structure with humans, so it’s logical to leverage the vast amount of existing human motion data,” explains Peng. “By learning to mimic these movements, robots can quickly adopt a wide range of human-like behaviors. Essentially, if a human can do it, a robot can learn it.”

Training the AI

The team employed reinforcement learning to teach a simulated humanoid robot how to move. In this approach, the AI is provided with examples of successful movements and then tasked with figuring out how to replicate them through trial and error. Initially, ExBody2 was trained with full access to data about the virtual robot, such as the coordinates of each joint, to ensure it could mimic human actions as closely as possible. Later, the AI was trained using only the data it would have access to in real-world scenarios, such as sensor measurements of inertia or speed from a physical robot.
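The two-stage setup described above, where a policy is first trained with privileged simulation data and then a second policy learns to act from realistic sensor readings alone, is a common "teacher-student" pattern in robot learning. The following is a toy sketch of that idea, not the researchers' actual method: the dimensions, the linear "policies," and the least-squares fit are all illustrative stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: the full privileged state (e.g. every joint
# coordinate in simulation) vs. the smaller slice a real robot can
# sense (e.g. inertia and speed measurements).
FULL_DIM, SENSOR_DIM, ACT_DIM = 12, 4, 3

# Stage 1 stand-in: a "teacher" policy mapping the full privileged
# state to actions. A fixed random linear map takes the place of a
# policy that would really be trained with reinforcement learning.
W_teacher = rng.normal(size=(FULL_DIM, ACT_DIM))

def teacher(full_state):
    return full_state @ W_teacher

# Stage 2: train a "student" that sees only the sensor slice of the
# state and learns to reproduce the teacher's actions (distillation).
states = rng.normal(size=(2000, FULL_DIM))
sensor_obs = states[:, :SENSOR_DIM]   # what the deployed robot observes
targets = teacher(states)             # teacher actions to imitate

# Least-squares fit of the student policy to the teacher's actions.
W_student, *_ = np.linalg.lstsq(sensor_obs, targets, rcond=None)

def student(obs):
    return obs @ W_student

# The student now acts from sensors alone; its gap to the teacher
# reflects how much the missing privileged information mattered.
err = float(np.mean((student(sensor_obs) - targets) ** 2))
print(f"student-teacher mean squared error: {err:.3f}")
```

In the real system the student cannot match the teacher perfectly, precisely because it lacks the privileged state; the residual error here is a crude proxy for that gap.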

Real-World Applications

After completing its training, ExBody2 was tested on two commercial humanoid robots. The AI successfully enabled the robots to perform a variety of movements, from simple tasks like walking in a straight line and crouching to more complex actions like executing a 40-second dance routine, throwing punches, and even waltzing with a human partner.

“Humanoid robots excel when they can coordinate all their limbs and joints seamlessly,” says Peng. “Many tasks require the arms, legs, and torso to work in harmony, and full-body coordination significantly enhances a robot’s capabilities.”

Reference: arXiv DOI: 10.48550/arXiv.2412.13196

Published At: Jan. 25, 2025, 10:30 a.m.
Original Source: Humanoid robot learns to waltz by mirroring people's movements (Author: Alex Wilkins)
Note: This publication was rewritten using AI. The content was based on the original source linked above.