
Dancing Boston Dynamics robots are an impressive showcase of robot capabilities

A humanoid robot of this complexity is unparalleled, says ASU expert


Boston Dynamics dancing robots captivate audiences | January 11, 2021

To kick off the new year, Boston Dynamics posted a video of four robots performing a choreographed dance to The Contours’ "Do You Love Me." It has garnered about 25 million views on the company’s YouTube channel alone.

In a time when we all need something to smile about, the dancers have inspired a lot of guessing about how the choreography was accomplished.

It wasn’t CGI, as many viewers speculated. And it wasn’t the hand-coding of traditional robot programs, either.

According to Arizona State University's Heni Ben Amor, an assistant professor in the Ira A. Fulton Schools of Engineering who specializes in machine learning and human-robot interaction, the technology involved is not new.

“However, Boston Dynamics has taken this technology and the underlying methods to new heights,” Ben Amor said. “In combination with the artistry of the choreography, this results in an impressive showcase of robot capabilities.”

A critical challenge is maintaining balance.

“There are a lot of explosive motions in the dance,” Ben Amor said. “Arm movement in the air may be easier to orchestrate than landing on one leg, but that arm movement generates momentum that affects landing. Does the floor have some bounce, or is it rigid?”

These are the kinds of assessments the robot is making as it goes through the routine, making 10 or more environment interaction predictions per second, he said.

“One mistake and a very expensive robot can crash.”

The routines likely were choreographed and performed by dancers, who worked with the roboticists to make adjustments to the movements that remained within the robots’ capabilities, Ben Amor explained.

While the mimicking of the dancers’ choreography has an element of puppetry, maintaining physical stability is part of the equation. “The robot has to think about how to actuate the motors so it can generate its own actions in space.”

Ben Amor emphasizes that the robots were not hand-coded to perform the exact same routines again and again. More specifically, the robots use a technique called model-predictive control to ask many “What if?” questions. Control signals are then chosen to maintain balance and stay as close as possible to the original choreography.
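The “What if?” questioning of model-predictive control can be illustrated with a minimal sketch. This is a hypothetical toy example, assuming a simple 1D point-mass model and random-sampling (shooting) MPC, not Boston Dynamics’ actual controller: at each timestep the controller imagines many candidate control sequences, scores each simulated rollout against the choreographed reference, applies only the first control of the best sequence, and then re-plans.

```python
# A minimal model-predictive control (MPC) sketch: a toy 1D point mass
# tracks a choreographed reference position by asking "what if?" about
# many candidate control sequences each step. Illustrative only.
import random

random.seed(0)    # reproducible sampling for this sketch

DT = 0.1          # control timestep (seconds)
HORIZON = 10      # number of "what if?" steps to look ahead
CANDIDATES = 200  # candidate control sequences sampled per step

def simulate(pos, vel, controls):
    """Roll the point-mass model forward under one control sequence."""
    for u in controls:
        vel += u * DT
        pos += vel * DT
    return pos, vel

def mpc_step(pos, vel, reference):
    """Sample candidate control sequences, score each rollout against
    the reference pose, and return only the first control of the best
    sequence -- the rest is thrown away and re-planned next step."""
    best_cost, best_u = float("inf"), 0.0
    for _ in range(CANDIDATES):
        controls = [random.uniform(-1.0, 1.0) for _ in range(HORIZON)]
        end_pos, end_vel = simulate(pos, vel, controls)
        # Penalize deviation from the choreography and leftover momentum
        # (the "arm movement generates momentum" problem, in miniature).
        cost = (end_pos - reference) ** 2 + 0.1 * end_vel ** 2
        if cost < best_cost:
            best_cost, best_u = cost, controls[0]
    return best_u

# Track a fixed reference pose over a few seconds of simulated time.
pos, vel = 0.0, 0.0
for _ in range(50):
    u = mpc_step(pos, vel, reference=1.0)
    vel += u * DT
    pos += vel * DT
print(round(pos, 2))
```

Because only the first control of each winning plan is executed before re-planning, the controller continually corrects for surprises, which is how a short-horizon scheme can keep a robot balanced without long-term autonomous decision-making.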

“Where normally a human head would be, you can detect 3D cameras,” Ben Amor said. “The cameras are providing that environmental data to the robots. The cameras, along with gyro-sensors, help the robot execute the routine as choreographed.” 

Robots must learn the complexity of human movement

ASU robotics Professor Heni Ben Amor discusses the complexity of human movement in a video by Deanna Dent, ASU, featuring Herberger Institute for Design and the Arts dance alumna Margaret Waller.

While the robots are able to produce the choreography, it’s not what we traditionally think of as AI, he added. 

“They aren’t making long-term autonomous decisions. They are making short-term decisions that enable them to reproduce what they’ve been shown in the space in which they are acting.”

And though the robots’ performances are definitely captivating, they won’t be choreographing their own recitals any time soon – or applying these skills in a surgical theater, for example.

“Even without any learning or AI, it is still difficult to overstate the quality of this achievement,” Ben Amor said. “A humanoid robot of this complexity, being controlled with this level of fluidity and grace, is unparalleled.”

There will come a time when robots can make the kinds of split-second decisions necessary to handle the challenges of assessing a surgical situation and adjusting actions accordingly. But for now, they don’t have the cognitive skills.

“Even making breakfast is a challenge at this point,” Ben Amor said. Steps like knowing eggs are needed, finding and retrieving them from the refrigerator, and choosing the cooking process and equipment are the kinds of decision-making skills that remain part of the ongoing research process.

“At this point, machine learning is at the stage where Netflix algorithms let the 'recommendation robot' evaluate the movies you are streaming to suggest what might next be on your viewing list,” he said. 

However, much more research needs to take place before machine learning finds a larger foothold in the field of robotics.

Ben Amor also notes that household robots will likely look nothing like the Jetsons' Rosey the Robot. In our homes, legs are much more useful than wheels; even vacuuming robots cannot navigate stairs.

“The future robot assistants we have in our homes will likely be of human size and shape,” he said. “They will operate door handles, climb stairs, seamlessly make the transition from tile to carpet, and perform dozens of other actions that, as humans, we do without consciously thinking about them.”

But there are psychological considerations to giving robots humanoid forms.

“People have an emotional response when robots seem like zombies,” Ben Amor said. “Sociologists and psychologists are researching whether it’s better or worse for a robot to emulate breathing and eye blinking behaviors, for example.”

But he believes that ultimately, people will begin to feel comfortable with realistic, humanoid robots. 

“Disney has been using realistic animatronics in theme parks for decades and they aren’t threatening,” he said. “Now they are adding robots that aren’t confined to a ride. Children will be able to have physical interactions with Buzz Lightyear and Olaf from Frozen.”

Ben Amor points out that Disney "imagineers" are developing “Stuntronics” for theme park rides and attractions. 

The focus of Ben Amor’s research is machine learning methods that enable physical, human interaction. “I envision that the next stage will be developing stunt robots capable of directly interacting with humans.”

When comic conventions resume in-person gatherings, the biggest attractions will likely be photo-ops with life-size, interactive Star Wars action heroes.

“Spider-Man jumping in real life will bring robotics closer to humans," Ben Amor said. “The day will come when the annual Boston Dynamics release will feature partnering with human dancers and performing graceful ballet lifts.”

Top photo: Boston Dynamics Dancing Robots, courtesy of Boston Dynamics.
