In the below video, researchers enjoy a delightful dance with ExBody2, a humanoid robot developed by UC San Diego, UC Berkeley, MIT, and Nvidia.
This demonstration isn't just a waltz in the park, though. ExBody2 showcases an approach to improving robot mobility: the robot tracks human motion in fine detail and mimics those movements.
From New Scientist:
This is all taught using reinforcement learning, a subset of machine learning in which the robot is fed large amounts of data to ensure it takes the optimal route in any given situation. Outputs, simulated by researchers, are assigned positive or negative scores to "reward" the model for desirable outcomes, which here meant replicating motions precisely without compromising the bot's stability.
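The reward scheme described above, which favors precise motion tracking while penalizing instability, can be sketched as a toy reward function. This is a minimal illustration of the general idea only; the function name, pose representation, and weights are assumptions, not details from the ExBody2 work:

```python
import numpy as np

def tracking_reward(robot_pose, reference_pose, tilt_deg):
    """Toy motion-imitation reward: high when the robot's pose matches
    the human reference, reduced when the torso tilts (instability).
    Weights here are illustrative, not from the actual system."""
    # Tracking term: decays from 1 toward 0 as pose error grows.
    tracking_error = np.linalg.norm(robot_pose - reference_pose)
    tracking_term = np.exp(-2.0 * tracking_error)
    # Stability term: penalize torso tilt away from upright.
    stability_penalty = 0.1 * abs(tilt_deg)
    return tracking_term - stability_penalty
```

A policy trained to maximize a reward of this shape is pushed toward imitating the reference motion exactly while staying upright: perfect tracking with no tilt scores 1.0, and either sloppy imitation or a leaning torso lowers the score.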
Previously:
• Factory robot convinces 12 other robots to go on strike
• Cornell researchers study 'trash barrel robots' in New York City
• Cyborg-Insect Factory is a real machine that turns cockroaches into robots