Atlas, the humanoid robot famous for its parkour and dance routines, has recently begun demonstrating something subtler but far more significant: it has learned to both walk and grab things using a single artificial intelligence model.

What is more, the robot’s single learning model is showing some tantalizingly “emergent” skills, such as instinctively recovering an item it drops, a behavior it was never explicitly trained to perform.

Boston Dynamics, the company that makes Atlas, together with the Toyota Research Institute (TRI), developed a generalist model that learns to control both arms and legs from a range of example actions. This is a departure from the norm: robots that learn typically rely on one model for locomotion, such as walking and jumping, and another for manipulation, such as grasping objects.
