{"id":869103,"date":"2022-08-25T09:00:00","date_gmt":"2022-08-25T16:00:00","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=869103"},"modified":"2023-07-19T10:04:11","modified_gmt":"2023-07-19T17:04:11","slug":"mocapact-training-humanoid-robots-to-move-like-jagger","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/mocapact-training-humanoid-robots-to-move-like-jagger\/","title":{"rendered":"MoCapAct: Training humanoid robots to \u201cMove Like Jagger\u201d"},"content":{"rendered":"\n

What would it take to get humanoid, bipedal robots to dance like Mick Jagger? Or, for something more mundane, what does it take to get them to simply stand still? Sit down? Walk? Move in the myriad other ways many people take for granted? Bipedalism provides unparalleled versatility in an environment designed for and by humans. By mixing and matching a wide range of basic motor skills, from walking to jumping to balancing on one foot, people routinely dance, play soccer, carry heavy objects, and perform other complex high-level motions. If robots are ever to reach their full potential as an assistive technology, mastery of diverse bipedal motion is a requirement, not a luxury. However, even the simplest of these skills requires a fine orchestration of dozens of joints. Sophisticated engineering can rein in some of this complexity, but endowing bipedal robots with the generality to cope with our messy, weakly structured world, or a metaverse that takes after it, requires learning. Training AI agents with humanoid morphology to match human performance across the entire diversity of human motion is one of the biggest challenges of artificial physical intelligence. Due to the vagaries of experimentation on physical robots, research in this direction is currently done mostly in simulation.

Unfortunately, simulation-based research on humanoid control involves computationally intensive methods, effectively restricting participation to research institutions with large compute budgets. In an effort to level the playing field and make this critical research area more inclusive, Microsoft Research's Robot Learning group is releasing MoCapAct, a large library of pre-trained humanoid control models along with enriched data for training new ones. This will enable advanced research on artificial humanoid control at a fraction of the compute resources currently required.
