Dogfighting is one of the most challenging aspects of air-to-air combat, and advances in AI may revolutionize it. The proliferation of increasingly stealthy fighter aircraft means opposing sides are unlikely to detect each other at beyond-visual-range (BVR) distances, increasing the chances that a close-range dogfight will happen.
In 2020, an AI developed by US-based Heron Systems bested a human pilot with more than 2,000 hours in the F-16, winning 5-0 in a simulated dogfight using only its onboard cannon.
The human pilot and Heron Systems' AI fought in five basic maneuver scenarios, with the AI operating within the F-16's maneuvering limits.
The AI's superhuman accuracy allowed it to score cannon kills against the human pilot, aiming from seemingly impossible angles. This ability means an AI-piloted fighter could decimate a manned fighter fleet using only a few rounds of ammunition, and thus at marginal cost.
AI-piloted aircraft are not subject to human limitations and can fly faster, maneuver quicker and shoot better with constantly improving sensors, processors and software.
The apparent dogfighting superiority of AI pilots has raised questions about whether human pilots will still be needed for future aerial combat. While AI performs specific tasks well, it lacks a human pilot's general intelligence and judgment.
Combining AI precision with human decision-making may thus be the best approach to integrating AI in future aerial combat.
In a January 2022 article for The New Yorker, Sue Halpern argues that AI will change human pilots' roles and only partially replace them.
Halpern predicts that AI-piloted fighters will fly alongside manned fighters, with human pilots directing squads of unmanned aircraft. She also notes that the ACE (Air Combat Evolution) program is part of a larger effort to “decompose” fighter units into smaller, cheaper ones, as the US may be unable to produce the number of manned fighters and train the pilots needed for a great power conflict with China.
However, Halpern notes that trust in AI is a significant issue: the main challenge is getting human pilots to trust their AI counterparts. A lack of trust may lead pilots to constantly watch over their AI wingmen, breaking the logic of having AI pilots in the first place.
Tim McFarland notes in a 2022 article in the peer-reviewed International Journal of Law and Information Technology that, in a military context, trust in AI can be considered the confidence that AI will act as expected without constant supervision.
McFarland explains that people tend to rely on AI in situations involving risk and uncertainty, such as navigating a vehicle or identifying military targets, because past experience has shown the AI to be trustworthy. He notes that establishing trust in AI requires setting clear expectations, similar to a contract.
For example, McFarland says, an AI system may be required to perform specific functions under certain conditions, such as identifying targets in a military operation, and its reliability in meeting these expectations is a critical factor in determining its trustworthiness.
McFarland emphasizes that in high-risk scenarios where operators may not have direct control or communication with AI systems, especially in electronic warfare (EW)-heavy environments, developing reliable AI systems that can be trusted based on their performance is crucial.
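McFarland's notion of trust as confidence, earned from past performance, that a system will meet contracted expectations can be illustrated with a minimal sketch. This is not from his article; the class, threshold and smoothing choice are all hypothetical, chosen only to show the idea of trust accumulating from a track record rather than being granted up front.

```python
class TrustTracker:
    """Illustrative only: tracks an AI system's observed reliability on a
    defined task and compares it to a contracted expectation level."""

    def __init__(self, required_reliability=0.95):
        # The "contract": the reliability level the operator expects.
        self.required_reliability = required_reliability
        self.successes = 0
        self.trials = 0

    def record(self, met_expectation: bool):
        """Log one task outcome (e.g. a correct target identification)."""
        self.trials += 1
        if met_expectation:
            self.successes += 1

    def observed_reliability(self) -> float:
        # Laplace smoothing: with no track record this is 0.5, so the
        # system is not trusted until it has actually performed well.
        return (self.successes + 1) / (self.trials + 2)

    def trusted(self) -> bool:
        """True once observed reliability meets the contracted level."""
        return self.observed_reliability() >= self.required_reliability


tracker = TrustTracker(required_reliability=0.9)
for _ in range(50):              # 50 successful identifications
    tracker.record(True)
print(tracker.trusted())         # reliability is 51/52, about 0.98 -> True
```

The design choice mirrors the article's point: trust is not assumed but accrues from demonstrated performance against explicit expectations, and a system with no track record starts untrusted.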
Caitlin Lee and others note in a May 2023 Aerospace America article that training an AI pilot requires vast amounts of data, and that simulated training environments may not capture real-world combat conditions and the complexity of dogfighting.
China has conducted its own simulation dogfights pitting AI versus human pilots to avoid being left behind in the AI fighter pilot race.
In March 2023, the South China Morning Post (SCMP) reported that Chinese military researchers conducted a dogfight between two small unmanned fixed-wing aircraft: one with an AI pilot on board and the other controlled remotely by a human pilot on the ground. SCMP notes that the AI-piloted plane was superior in close-range dogfights, with its human opponent a constant underdog.
At the start of the dogfight, the human pilot made the first move to gain the upper hand, but the AI predicted his intentions, countered his maneuvers and stuck close behind.
The SCMP report mentions that the human pilot attempted to lure the AI into crashing into the ground, but the AI moved into an ambush position and waited for him to pull up.
The human pilot then performed a “rolling scissors” maneuver, hoping the AI would overshoot, but he could not evade his AI opponent, forcing the research team to call off the simulation after 90 seconds.
SCMP mentions that while the US pioneered AI pilot research 60 years ago, China has caught up quickly, with its technology using just a fraction of the computing resources used by US projects. It also says China's AI pilot is designed to operate on almost any People's Liberation Army Air Force fighter.