Human Vs. AI Drone Racing At The University Of Zurich

[Image: AI-simulated drone flight track]

[Thomas Bitmatta] and two other champion drone pilots visited the Robotics and Perception Group at the University of Zurich. The human pilots accepted the challenge to race drones against artificial intelligence “pilots” developed by the UZH research group.

The human pilots took on two different types of AI challengers. The first type leverages 36 tracking cameras positioned above the flight arena, each capturing video at 400 frames per second. The AI-piloted drone is fitted with at least four tracking markers that can be identified in the captured frames. A computer vision and navigation system analyzes the video to compute flight commands, which are then transmitted to the drone over the same wireless control channel a human pilot’s remote controller would use.
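
For a feel of how that first setup hangs together, here is a minimal Python sketch of an external-tracking control loop. The motion-capture feed, pose estimation, and radio link are hypothetical placeholders of our own (not the UZH code), but the overall shape is the point: read marker positions at the camera rate, estimate where the drone is, compare against the next gate, and push stick commands out over the RC link.

```python
# A minimal sketch of an external-tracking control loop, assuming hypothetical
# stand-ins for the motion-capture feed (get_marker_positions) and the RC radio
# link (send_rc_command). The simple proportional controller is illustrative
# only; it is not the UZH group's actual navigation code.

import time
import numpy as np

LOOP_HZ = 400  # matches the 400 frames-per-second capture rate of the cameras

def get_marker_positions():
    """Placeholder: return Nx3 coordinates of the drone's tracking markers."""
    return np.zeros((4, 3))

def estimate_pose(markers):
    """Placeholder: a real system fits the known marker layout for a full 6-DoF
    pose; here we just use the marker centroid as a crude position estimate."""
    return markers.mean(axis=0)

def send_rc_command(roll, pitch, yaw_rate, throttle):
    """Placeholder: push stick values over the same RC link a human would use."""
    pass

def control_loop(waypoints, kp=0.8, max_steps=1000):
    """Fly through a list of 3D waypoints (e.g. gate centers) one at a time."""
    target_idx, period = 0, 1.0 / LOOP_HZ
    for _ in range(max_steps):
        if target_idx >= len(waypoints):
            break                                   # course complete
        start = time.time()
        position = estimate_pose(get_marker_positions())
        error = waypoints[target_idx] - position
        if np.linalg.norm(error) < 0.3:             # within 30 cm: next gate
            target_idx += 1
            continue
        # Very rough proportional mapping from position error to stick inputs.
        roll, pitch = kp * error[0], kp * error[1]
        throttle = float(np.clip(0.5 + kp * error[2], 0.0, 1.0))
        send_rc_command(roll, pitch, 0.0, throttle)
        time.sleep(max(0.0, period - (time.time() - start)))

control_loop([np.array([2.0, 0.0, 1.5]), np.array([4.0, 3.0, 1.5])])
```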

The second type of AI pilot utilizes an onboard camera and autonomous machine vision processing. The “vision drone” is designed to leverage visual perception from the camera with little or no assistance from external computational power.
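
The onboard approach moves all of that perception onto the drone itself. Here's an equally rough sketch of the perceive-and-act loop, again with made-up placeholder functions for the camera and the learned gate detector rather than the group's real pipeline:

```python
# A minimal sketch of an onboard "vision drone" loop, where both perception and
# control run on the drone's own computer. The camera read and gate detector
# are hypothetical stand-ins, not the UZH system.

import numpy as np

def read_onboard_camera():
    """Placeholder: grab one frame from the forward-facing camera."""
    return np.zeros((480, 640, 3), dtype=np.uint8)

def detect_next_gate(frame):
    """Placeholder: a learned detector returning the gate center in normalized
    image coordinates plus a rough distance estimate in meters."""
    return np.array([0.0, 0.0]), 5.0

def fly_toward(gate_center, distance, gain=1.5):
    """Steer so the gate stays centered in the image, easing off when close."""
    yaw_rate = gain * gate_center[0]            # horizontal image error -> yaw
    pitch = gain * gate_center[1]               # vertical image error -> pitch
    throttle = 0.6 if distance > 1.0 else 0.4
    return yaw_rate, pitch, throttle

for _ in range(10):  # a few iterations of the perceive-and-act loop
    center, dist = detect_next_gate(read_onboard_camera())
    yaw_rate, pitch, throttle = fly_toward(center, dist)
```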

Ultimately, the human pilots were victorious over both types of AI pilots. The AI systems do not (yet) robustly accommodate unexpected deviations from optimal conditions. Small variations in operating conditions often lead to mistakes and fatal crashes for the AI pilots.

Both of the AI pilot systems utilize some of the latest research in machine learning and neural networks to learn how to fly a given track. The systems train for a track using a combination of simulated environments and real-world flight deployments. In their final hours together, the university research team invited the human pilots to set up a new course for a final race. In less than two hours, the AI system trained to fly the new course. Its performance in the resulting real-world flight was quite impressive and shows great promise for the future of autonomous flight. We’re betting on the bots before long.
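
To make the sim-to-real idea concrete, here is a heavily simplified sketch of that training recipe: run lots of cheap laps in a simulator, then fine-tune on a small amount of real flight data. The function names and structure are our own stand-ins, not the group's training code.

```python
# A heavily simplified sketch of sim-to-real training: learn mostly in a
# simulator, then correct for the sim-to-real gap with real flight logs.
# Everything here is a hypothetical stand-in for illustration.

def train_in_simulation(policy, track, episodes=10_000):
    """Placeholder: fly many simulated laps of the track and update the policy."""
    for _ in range(episodes):
        pass  # simulate a lap, score it, nudge the policy weights
    return policy

def fine_tune_on_real_flights(policy, flight_logs):
    """Placeholder: refine the policy with data from real laps on the course."""
    for log in flight_logs:
        pass  # update the policy toward what actually worked on the hardware
    return policy

policy = {}  # stand-in for the network weights
policy = train_in_simulation(policy, track="new_course")
policy = fine_tune_on_real_flights(policy, flight_logs=[])
```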

12 thoughts on “Human Vs. AI Drone Racing At The University Of Zurich”

  1. The “search and rescue” rationale is obviously total BS. They obviously wanted to beat the world’s best humans at yet another task but fell short of their goal. I’m not especially rooting for their success either as I’d rather not live in a world with Slaughterbots flying around.

    1. Yeah, racing quads trying to fly as quickly as possible seems like a strange approach for searching in an unknown environment, especially since losing a drone is a bigger problem in that case. Probably something larger with more loiter time and less kinetic energy would be a better fit.
      Of course, xkcd hits the nail on the head here:
      https://xkcd.com/2128/

    2. “search and rescue” is a euphemism to make your project sound altruistic and helpful, but still attract the attention/funding of big defence firms, who know that they can repurpose the tech to “search and destroy.” Not saying that’s the case here, but it’s a pretty common theme in engineering tech demos.

  2. Does anyone know what camera and connected HW is used for such applications? They must have very low latency for things like this.
    Is there any open-source version of something like this?

    1. One mode did onboard processing, so no transmission latency. The other was many cameras around the space, 3rd-person style, tracking the drones via markers.

      So neither involves what human drone pilots think of when they say “latency”, which is basically all transmission and (these days) encoding/decoding latency.
