The United States Air Force's Focus on AI Research and Development

  • By J.M. Eddins Jr.
  • Airman Magazine

 

  In the vast expanse of the digital frontier, where cutting-edge technology meets the demands of modern warfare, the United States Air Force stands at the forefront of innovation. Among its arsenal of advancements, artificial intelligence emerges as a pivotal force, shaping the future of aerial dominance and national security.

  Nestled within the corridors of research institutions and laboratories, the USAF's commitment to AI research and development resonates with an unwavering dedication to staying ahead of the curve. Embarking on a journey fueled by ingenuity and technological prowess, the Air Force charts a course toward enhanced autonomy, operational efficiency and strategic advantage.

  At the heart of this endeavor lies a multifaceted approach, encompassing diverse focus areas that converge to redefine the landscape of aerial warfare. From autonomous systems to predictive analytics, the USAF's pursuit of AI transcends traditional boundaries, ushering in a new era of capabilities and possibilities. 

The X-62 Variable In-Flight Simulator Test Aircraft (VISTA) flies in the skies over Edwards Air Force Base, California, March 23, 2023. (U.S. Air Force photo by Ethan Wagner)


 

  Lt. Col. Joe Chapa, the Chief Responsible AI Officer at the DAF's Chief Data and AI Office, explains the DOD's goals for artificial intelligence, as well as the path it has laid out for its responsible development. "Behaving ethically is what the United States military does; that's who we are, and the people developing these tools - both the ethics side and the innovation side - are working arm-in-arm trying to solve the same problems," said Chapa. (U.S. Air Force Video by Tyler Prince)


Strategic Intelligence and Decision Support



  Within the realm of strategic intelligence, AI emerges as a force multiplier, augmenting human expertise with unparalleled analytical capabilities. 

  “Artificial intelligence is an evolution of software code that allows us to do things with technology that we haven't been able to do before, just like the computer was able to allow us to make groundbreaking leaps,” said Col. Tucker Hamilton, 96th Operations Group commander at Eglin Air Force Base and Air Force AI test and operations chief.

  Leveraging advanced algorithms and machine learning models, the USAF harnesses the power of data to discern patterns, anticipate threats and inform decision-making processes with unprecedented precision. 

  “In general, when we're talking about artificial intelligence, we're really talking about machine learning. And when we're talking about machine learning, we're talking about the next evolution of software code,” Hamilton said. 

  “We give the software an initial data set with guardrails, but then the software is actually rewriting aspects of it in order to optimize a human-defined objective. It is learning in a mathematical loop process. It's not magic, it's math.”
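Hamilton's description of a "mathematical loop" that optimizes a human-defined objective can be sketched in a few lines of Python. This is an illustrative toy, not Air Force code; the objective, learning rate and bounds are invented for the example, with the bounds standing in loosely for the "guardrails" he mentions.

```python
# Toy sketch of a "mathematical loop": repeatedly adjust a parameter
# to optimize a human-defined objective (here, minimize (w - 3)^2).
def train(objective_grad, w, lr=0.1, steps=100, bounds=(-10.0, 10.0)):
    lo, hi = bounds
    for _ in range(steps):
        w = w - lr * objective_grad(w)  # rewrite the parameter each pass
        w = max(lo, min(hi, w))         # "guardrails": keep w within bounds
    return w

# Gradient of the human-defined objective (w - 3)^2 is 2 * (w - 3).
w_final = train(lambda w: 2 * (w - 3.0), w=0.0)
print(round(w_final, 3))  # converges toward 3.0
```

Each pass through the loop nudges the parameter closer to the objective; nothing mystical happens, which is the "it's not magic, it's math" point.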

  Whether assessing geopolitical landscapes or analyzing battlefield dynamics, AI-driven intelligence platforms empower commanders with actionable insights in real time.
 

Col. Tucker Hamilton, Air Force AI Test and Operations chief and 96th Operations Group commander, poses in a conference room at Eglin Air Force Base, Fla., Jan. 8, 2024. Hamilton is a flight test pilot who, prior to his tenure at Eglin AFB, was the director of the Department of the Air Force's AI Accelerator at MIT. (U.S. Air Force photo by J.M. Eddins Jr.)


The X-62 VISTA flies in the skies above Edwards Air Force Base, Calif., March 7, 2024. VISTA stands for Variable Stability In-flight Simulator Test Aircraft and is operated by the Air Force Test Pilot School with the support of Calspan and Lockheed Martin. (U.S. Air Force photo by Tech. Sgt. Janiqua P. Robinson)


Autonomous Systems and Unmanned Aerial Vehicles



  As the demand for autonomous capabilities surges, the USAF spearheads the development of next-generation unmanned aerial vehicles infused with artificial intelligence to further the goals of the Collaborative Combat Aircraft program. 

  Enter the X-62 Variable In-flight Simulator Test Aircraft (VISTA), a bespoke F-16 fighter jet originally used to test what would become the precursor to the F-22 Raptor’s thrust-vectoring capability. The aircraft provides the test bed necessary to make significant leaps toward integrating AI into kinetic systems.

  The goal is to meld the expertise and unmatched skill of U.S. Air Force pilots with the computational reasoning and speed offered by AI, merging a human pilot’s intuition with algorithmic precision to change the face of air combat as we know it.

  “Machine learning is different from more traditional, rules-based coding because rather than using ‘if-then’ statements to make decisions, machine learning algorithms use robust statistical methods to discern patterns within massive data sets,” said Col. James Valpiani, commandant of the United States Air Force Test Pilot School at Edwards Air Force Base.

  “The resulting patterns are not easy for humans to read, understand or predict how they'll perform once they're implemented in a real-world environment, and that leads to really hard questions about trust and responsibility, especially in the realm of combat autonomy. But these aren’t just issues that are specific to the Air Force, they apply in everyday life. Autonomous vehicles are using these same machine learning algorithms.”
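The contrast Valpiani draws between explicit "if-then" coding and statistically discerned patterns can be made concrete with a small, hedged Python sketch; the data, labels and threshold below are all invented for illustration and have no connection to any real system.

```python
# Hand-written rule vs. a boundary learned from labeled data.

def rule_based(speed):
    # Traditional coding: a human writes the decision boundary explicitly.
    if speed > 250:
        return "threat"
    return "benign"

def learn_boundary(samples):
    # "Machine learning" in miniature: the boundary is inferred from data.
    # Here, a trivially simple statistic: midpoint of the two class means.
    threat = [s for s, label in samples if label == "threat"]
    benign = [s for s, label in samples if label == "benign"]
    return (sum(threat) / len(threat) + sum(benign) / len(benign)) / 2

data = [(300, "threat"), (320, "threat"), (100, "benign"), (140, "benign")]
boundary = learn_boundary(data)  # inferred rather than hand-coded
print(rule_based(300), boundary)
```

In a real system the learned decision surface has millions of parameters rather than one midpoint, which is exactly why its behavior is hard for humans to read or predict.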

Col. James “FANGS” Valpiani, commandant of the United States Air Force Test Pilot School, poses for a photo at Edwards Air Force Base, Calif., March 6, 2024. The Test Pilot School is where the Air Force's top pilots, navigators and engineers learn how to conduct flight tests and generate the data needed to carry out test missions. In addition to pilots, it trains Weapons Systems Officers, Combat Systems Officers and Guardians. (U.S. Air Force photo by Tech. Sgt. Janiqua P. Robinson)



Roger Tanner and Bill Gray pilot the X-62A Variable Stability In-Flight Simulator Test Aircraft, or VISTA, from Hill Air Force Base, Utah, to Edwards Air Force Base, California, Jan. 30, 2019. (U.S. Air Force photo by Christian Turner)


  To address these challenges, Valpiani explained how TPS is using the X-62 to train these algorithms to the same standard every human pilot must meet.

   “A key aspect of our autonomy testing is ensuring that the algorithms we’re working with conform to the expectations of responsible use. There are training rules and expectations of conduct that all human fighter pilots are expected to adhere to, and we work with all of our partners, whether that’s the Air Force Research Laboratory, DARPA or some other entity, to ensure that these concepts are being integrated at every stage of autonomy testing,” Valpiani said. “Then, when we bring them over into real-world application in the X-62 from the simulation environment, we are assessing the same responsibility characteristics to ensure that they behave according to all the same rules that we hold ourselves to as human pilots.” 

  That’s one of the critical advantages of the X-62 Vanguard: its capacity to facilitate safer and more reliable testing environments by including a pilot and engineer in the cockpit, not as operators but as overseers for the AI systems in action. 

  "Having a human on board provides a crucial layer of oversight, allowing us to push the limits of AI with confidence in our ability to intervene if necessary," Valpiani explained. “It allows us to build confidence in these systems not only from the perspective of ‘Are they efficacious? Do they do the mission we set them out to do?’ but also, are they responsible, are they reliable, do they adhere to all the same norms and expectations of responsible use we expect of all humans in these environments?”

  The U.S. Air Force Test Pilot School and the Defense Advanced Research Projects Agency were finalists for the 2023 Robert J. Collier Trophy, a formal acknowledgement of recent breakthroughs in the aerospace industry that have launched the machine-learning era.

  The teams worked together to test breakthrough executions in artificial intelligence algorithms using the X-62A VISTA aircraft as part of DARPA’s Air Combat Evolution (ACE) program.


The XQ-58A Valkyrie is a high-speed, long-range, low-cost unmanned platform designed to offer maximum utility at minimum cost. This aircraft falls under the AFRL Low Cost Attritable Aircraft Technology portfolio. The XQ-58A was successfully designed, built, and demonstrated after a period of only two and a half years from contract award to first flight. (U.S. Air Force video by Keith C Lewis)


  “The potential for autonomous air-to-air combat has been imaginable for decades, but the reality has remained a distant dream up until now. In 2023, the X-62A broke one of the most significant barriers in combat aviation. This is a transformational moment, all made possible by breakthrough accomplishments of the X-62A ACE team,” said Secretary of the Air Force Frank Kendall. Secretary Kendall will soon take flight in the X-62A VISTA to personally witness AI in a simulated combat environment during a forthcoming test flight at Edwards.

  Testing in the X-62 program led directly to the Air Force Research Laboratory successfully completing a three-hour sortie July 25, 2023, demonstrating the first-ever flight of artificial intelligence agents (algorithms) controlling an XQ-58A Valkyrie un-crewed aircraft.

  The flight at the Eglin AFB Test and Training Complex was the culmination of the previous two years of partnership that began with the Skyborg Vanguard program.

  “The mission proved out a multi-layer safety framework on an Artificial Intelligence/Machine Learning-flown un-crewed aircraft and demonstrated an AI/ML agent solving a tactically relevant ‘challenge problem’ during airborne operations,” Hamilton said. “This and future sorties will officially enable the ability to develop AI/ML agents that will execute modern air-to-air and air-to-surface skills that are immediately transferable to the Collaborative Combat Aircraft program.”

  The algorithms were developed by AFRL’s Autonomous Air Combat Operations team.

  “AACO has taken a multi-pronged approach to un-crewed flight testing of machine learning artificial intelligence and has met operational experimentation objectives by using a combination of high-performance computing, modeling and simulation, and hardware-in-the-loop testing to train an AI agent to safely fly the XQ-58 un-crewed aircraft,” said Dr. Terry Wilson, AACO program manager.


Dr. Terry Wilson, Autonomous Air Combat Operations program manager at the Air Force Research Laboratory, Wright-Patterson Air Force Base, Ohio, poses in front of a Kratos XQ-58A display at the National Museum of the United States Air Force in Dayton, Ohio, March 1, 2024. (U.S. Air Force photo by Keith Lewis)


VENOM-AFT




  Also supplying autonomy test data for the development of Collaborative Combat Aircraft will be Project VENOM-AFT at Eglin Air Force Base, Florida. The Viper Experimentation and Next-gen Operations Model – Autonomy Flying Testbed program will be conducted by the 40th Flight Test Squadron and the 85th Test and Evaluation Squadron.

  VENOM is designed and funded to accelerate testing of autonomy software on crewed and un-crewed aircraft. It will complement the ADAx (autonomy data and artificial intelligence experimentation proving ground) at Eglin AFB and inform the CCA program and other autonomy developers. 

  As with the ACE program conducted by the X-62A VISTA, VENOM-AFT will integrate artificial intelligence, machine learning and autonomous systems into modified F-16 platforms. This will allow an onboard pilot to monitor and govern the autonomous systems during testing. The first modified F-16s arrived at Eglin AFB in February 2024.

  “It’s important to understand the ‘human-on-the-loop’ aspect of this type of testing, meaning that a pilot will be involved in the autonomy in real-time and maintain the ability to start and stop specific algorithms,” said Lt. Col. Joe Gagnon, 85th TES commander. “There will never be a time where the VENOM aircraft will solely ‘fly by itself’ without a human component.” 

  Operators will provide feedback during modeling, simulation, and post-flight to the autonomy developers to improve performance over time and ensure the autonomy is making the appropriate decisions prior to and during flight. 
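A minimal sketch of the "human-on-the-loop" arrangement Gagnon describes, assuming a supervisory layer in which a human can start and stop specific algorithms in real time; the class and method names are hypothetical illustrations, not the VENOM implementation.

```python
# Hypothetical sketch: autonomy acts only while a human keeps it enabled.
class HumanOnTheLoop:
    def __init__(self):
        self.enabled = set()

    def start(self, algorithm):
        # Human authorizes a specific algorithm to run.
        self.enabled.add(algorithm)

    def stop(self, algorithm):
        # Human revokes that authorization at any time.
        self.enabled.discard(algorithm)

    def run(self, algorithm, action):
        # The autonomy's action executes only if currently enabled.
        return action() if algorithm in self.enabled else "inhibited"

pilot = HumanOnTheLoop()
pilot.start("auto-maneuver")
print(pilot.run("auto-maneuver", lambda: "maneuvering"))  # maneuvering
pilot.stop("auto-maneuver")
print(pilot.run("auto-maneuver", lambda: "maneuvering"))  # inhibited
```

The design point is that authority stays with the human: the autonomy never runs an algorithm the supervisor has not explicitly enabled.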

 
The 40th Flight Test Squadron, under the 96th Test Wing, located at Eglin Air Force Base, Fla., is responsible for conducting developmental tests and evaluation on various airframes and weapon systems. In this video, Capt. Blake “BANDIT” Morgan, 40th FTS flight test engineer, explains the ways artificial intelligence and machine learning can be used as autonomous force multipliers. (U.S. Air Force video by Tech. Sgt. Janiqua P. Robinson)


 

Flight Test Engineer Capt. Blake "Bandit" Morgan of the 40th Flight Test Squadron poses in front of one of three F-16C aircraft to be used in VENOM flight tests at Eglin Air Force Base, Fla., Jan. 11, 2024. Morgan is the capabilities division chief test engineer for the Viper Experimentation and Next-gen Ops Models program under which Eglin AFB F-16s will be modified into airborne flying test beds to evaluate artificial intelligence performance in increasingly autonomous strike package capabilities. (U.S. Air Force photo by J.M. Eddins Jr.)


  “Having both developmental test and operational test pilots working and flying from the same location allows for daily collaboration and reduces the stovepiping of knowledge and lessons learned,” said Lt. Col. Jeremy Castor, VENOM operational test lead. 

  The goals of VENOM are multifaceted, ranging from bolstering reconnaissance and surveillance capabilities to enabling precision strike missions in contested environments. With an emphasis on autonomy and rapid decision-making, these tests will enable the creation of advanced UAVs designed to operate seamlessly alongside manned aircraft, enhancing operational effectiveness and combat lethality across diverse mission scenarios while minimizing risk to human personnel. VENOM will also enable the Air Force to rapidly iterate and expand the body of knowledge for potential autonomy and payload solutions.  

  Furthermore, VENOM represents a pivotal step towards the realization of next-generation air superiority, building upon ACE’s advances in one-versus-one autonomous air combat in the X-62A VISTA, by testing and developing autonomous capabilities in more complex tactical situations. 

​  “The VENOM program marks a pivotal chapter in the advancement of aerial combat capabilities. This transformative program holds the potential to redefine air combat paradigms by fostering novel autonomous functions for current and future crewed and un-crewed platforms,” said Maj. Ross Elder, VENOM developmental test lead. “We look forward to the culmination of years of engineering and collaboration, as VENOM leads a measured step towards a new age of aviation.” 

  According to Capt. Blake “Bandit” Morgan, flight test engineer at the 40th FTS and chief test engineer for the Advanced Capabilities Division at Eglin AFB, VENOM will be central to future programs such as AIR, the Artificial Intelligence and Reinforcements program, which will attempt to train autonomy for scenarios more complex than a one-versus-one dogfighting engagement. 

  “We will introduce scenarios where you have more complex objectives than you're trying to accomplish in a one-v-one dogfighting scenario like ACE,” Morgan said. 



The 96th Test Wing and 53rd Wing welcomed the first three F-16 Fighting Falcons ready to take part in the Viper Experimentation and Next-gen Operations Model – Autonomy Flying Testbed program also known as VENOM.  (U.S. Air Force photo by David Shelikoff)


  “With ACE, the autonomy always knows where the other aircraft is located. With AIR, we will be training the autonomy to figure out where the adversary is located using onboard mission systems - radar and other sensors. Eventually, when we get to more complex autonomy development, the autonomy is going to have access to all mission systems on board the aircraft. It's going to be able to access the radar, targeting pods, potential electronic attack and surveillance systems and be able to feed all of that information into the autonomy to allow it to make decisions.”
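The sensor-fusion idea Morgan describes, estimating an adversary's position from several onboard systems rather than being handed it, can be sketched as a confidence-weighted average of sensor reports. The sensor names, positions and confidence weights below are invented for illustration.

```python
# Hedged sketch of fusing multiple onboard sensor estimates into one.
def fuse(estimates):
    """Confidence-weighted average of (position, confidence) reports."""
    total = sum(conf for _, conf in estimates)
    return sum(pos * conf for pos, conf in estimates) / total

reports = [
    (10.0, 0.9),  # radar: high confidence
    (12.0, 0.5),  # targeting pod
    (11.0, 0.6),  # electronic surveillance
]
print(round(fuse(reports), 2))  # prints 10.8
```

Real fusion pipelines track full state estimates over time (e.g., with Kalman-style filters), but the principle is the same: more sensors feeding the autonomy yield a better combined estimate than any single source.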

​  From reconnaissance missions to combat operations, these autonomous systems operate with unparalleled agility and adaptability, expanding the Air Force's reach across contested environments while minimizing risk to human personnel. Through continuous innovation and iterative refinement, the USAF envisions a future where swarms of AI-enabled UAVs seamlessly integrate into existing operational frameworks, reshaping the dynamics of aerial warfare.


Human-Machine Collaboration and Cognitive Enhancement



  In the pursuit of operational excellence, the USAF recognizes the pivotal role of human-machine collaboration in optimizing performance and maximizing mission effectiveness. 

  While much of the public conversation about AI focuses on the Airman’s role in the decision loop as a means to mitigate risk, Dr. Alexis Bonnell, AFRL chief information officer, focuses her teams on the benefits the human side of human-machine collaboration brings to the force.

  “Nothing gives us a relationship with knowledge at scale the way digital does. What I think is really important is that you'll notice I use the word knowledge. I don't talk about data, and I don't talk about information,” Bonnell said.

  “Knowledge is what happens when we blend data and information with our expertise — with our experience — and the way that we give it context and how our values play into that. I think one of the most interesting reasons that AI is such a robust topic is because ultimately we're having to navigate our identity. When you get a group of people to sit down and use a tool like ChatGPT and give them the same assignment, typically they will all query or create their question posed to the AI differently.”

  According to Bonnell, differing contexts, knowledge sets and experiences between individuals can profoundly differentiate the questions posed to the AI and, therefore, how the AI responds.

Alexis Bonnell, chief information officer and director of the Digital Capabilities Directorate of the Air Force Research Laboratory, poses for a portrait at Wright-Patterson Air Force Base, Ohio, Feb. 20, 2024. She is responsible for developing and executing the AFRL Information Technology strategy, leading the strategic development of highly advanced next-generation technologies and platforms for AFRL. Her focus includes catalyzing the discovery, development and integration of warfighting technologies for air, space and cyberspace forces via digital capabilities, IT infrastructure and technological innovation across the lab’s operations and culture. (U.S. Air Force photo by J.M. Eddins Jr.)


Col. Garry Floyd is the director of the Department of the Air Force-Massachusetts Institute of Technology Artificial Intelligence Accelerator in Cambridge, Mass. He was photographed at Ft. Meade, Md., on Thursday, March 28, 2024. As the director of the DAF-MIT AIA, Floyd ensures the team of permanent party researchers, MIT partners and temporary phantom interns have everything they need to successfully accelerate fundamental AI research. The DAF-MIT AIA stood up in 2019, the same year the President of the United States signed Executive Order 13859 announcing the American AI Initiative. (U.S. Air Force photo by J.M. Eddins Jr.)


Dr. Steven K. "CAP" Rogers, a U.S. Air Force senior scientist at the Air Force Research Laboratory, Wright-Patterson Air Force Base, Ohio, poses for a portrait at the Wright Brothers Institute, Dayton, Ohio, Feb. 22, 2024. Rogers is focused on artificial intelligence-enabled autonomy at AFRL, and he initiates, technically plans, coordinates, evaluates and conducts research and development to advance artificial intelligence-enabled autonomy. Dr. Rogers also provides technical leadership to the AFRL Autonomy Capability Team, aiding in the rapid advancement of autonomy research and development to operationalize AI at scale for the U.S. Air Force and Space Force. Rogers’ personal research focuses on the Qualia Exploitation of Sensing Technology and how to build autonomous systems by replicating the engineering characteristics of consciousness. (U.S. Air Force photo by J.M. Eddins Jr.)

​​  “For example, in my experience, I've written a lot of policy. People like facts and figures in their policy, so I'm going to query and be very specific about asking for facts or figures someone else might not have had. The inspiration, or the way that something is communicated in a compelling way, is actually really important. Those differences between perspectives in employing the AI can often lead to unexpected and innovative results; it can really amplify the problem-solving power of a diverse force,” Bonnell said.​

  By fostering symbiotic relationships between a diverse force of humans and AI systems, the Air Force endeavors to leverage the complementary strengths of both entities, capitalizing on human intuition and creativity alongside AI's computational prowess. 

  “What has happened is we've democratized AI,” said Dr. Steve Rogers, Autonomy Capability Team chief scientist at AFRL.

​  “I've been doing artificial intelligence for 50 years. I used to be sequestered in a little room that was dark, and I would just code and code and code. These days, I can walk into the places where we do intelligence analysis, and they trot out 18-year-old Airman Snuffy and he's innovated with some little piece of AI code to analyze some data in a way nobody's ever done before. Innovative Airmen are our enduring asymmetric advantage. The initials for Airmen innovate are AI. The initials for artificial intelligence are AI. That's AI plus AI. That combination is a huge advantage.”

  Through initiatives focused on cognitive enhancement and human-centered design, the USAF is cultivating a workforce adept at navigating the complexities of AI-driven environments, fostering a culture of innovation and adaptability.

  This AI plus AI idea has led AFRL's Autonomy Capability Team (ACT3) to develop the Air and Space Force Cognitive Engine, or ASCE.


The United States Department of the Air Force – Massachusetts Institute of Technology Artificial Intelligence Accelerator hosted two iterations of the Learning Machines Training course in conjunction with the MIT Media Lab, Nov. 28-30 and Dec. 12-14. U.S. Air Force Captain Rebekah Magness is an AI Accelerator Phantom who took part in November's three-day, hands-on immersive learning experience consisting of highly structured coding projects, discussions on AI policy and ethics, and a capstone project that challenges Airmen to build and interact with their own machine learning models. (U.S. Air Force video by Tech. Sgt. Brycen Guerrero)


​​  ASCE is a one-stop, all-inclusive, license-free suite of closely integrated software and services that enable users to develop complex AI solutions at any scale. This cooperative suite of tools supports users as they research, prototype, continuously test, improve and deploy their AI products.
 
  Designed in close collaboration with a large cohort of researchers within ACT3 and beyond, ASCE helps overcome the common challenges of working with data and steep computational requirements. 

​  Specifically, ASCE provides an easy-to-use interactive computational environment with full Graphics Processing Unit support. It also provides data packaging, discovery and delivery. The vision for ASCE is to enable AI to be a tool in the hands of every Airman, with the ability to interact with AI in a do-it-yourself fashion that will result in an exponentially innovative landscape.

  This approach uses existing AI to create new, world-class AI. Requirements that may benefit from ASCE include business processes such as civilian hiring and contract monitoring, predictive maintenance, automated air combat operations, aircraft damage inspection, humanitarian assistance, disaster relief and additive manufacturing. Intelligence, surveillance and reconnaissance missions can also benefit from ASCE. More generally, ASCE significantly lowers the barrier to entry to using a large-scale cluster for computational science.

​  This drive within the Air Force to begin developing and utilizing AI has led to increased cooperation with commercial and academic partners.

​  One such partnership is with the Massachusetts Institute of Technology’s AI Accelerator program.


In this AFRL video, members of the Autonomous Aircraft Experiment Team explain how they are taking autonomous research concepts that exist in a lab and turning them into operational realities for our nation’s warfighters. (U.S. Air Force video by Air Force Research Laboratory)

  “The mission is fundamental artificial intelligence research with the idea that we're going to bring capability back to the Department of the Air Force,” said Col. Garry Floyd, director of the Department of the Air Force's AI Accelerator at MIT.

  “We're going to bring new operationally relevant capability back to the Department of the Air Force. That's the mission. As director, my job is to make sure that we're working on the right kinds of things, that we are driving our research investments into the right areas, and that as we bring these projects along, I'm telling the rest of the Air Force about what we're doing.

​  “When I see an opportunity, we lift our projects literally out of research, out of research papers, out of published papers and turn them into prototypes that we can test, for potential deployment in our use cases. That's the goal. At the end of the day, make sure we're working on the right projects.” 
  In the end, AI is all about building speed, efficiency, and mission success in every Air Force operation, from finance to personnel management to effects on target.

  “I think the most important currency we bring to the relationship is our operational experience and our ability to connect our work to stakeholders. I talk all the time about how AI has this tremendous potential when we take the OODA loop concept from Col. John Boyd: observe, orient, decide and act,” Floyd said.

  “AI has this tremendous ability to help us tighten up and increase the speed of our decision loops. If I go to the fight and I can observe, orient, decide and take an action faster than my adversary, I've got a really good chance to win that fight. AI has tremendous potential here.”

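Boyd's OODA loop, which Floyd invokes, can be sketched as a software control cycle; the value AI adds is iterating this cycle faster than an adversary can. Everything below (the function names and the toy orient and decide steps) is an assumption for illustration, not doctrine or Air Force code.

```python
# Hedged sketch of one pass through an observe-orient-decide-act cycle.
def ooda_step(sensors, act):
    observation = sensors()                  # Observe: gather raw inputs
    orientation = {"contacts": observation}  # Orient: place data in context
    decision = "engage" if orientation["contacts"] else "hold"  # Decide
    act(decision)                            # Act: carry out the decision
    return decision

log = []
decision = ooda_step(lambda: ["bogey at 2 o'clock"], log.append)
print(decision, log)
```

Shaving time off any of the four stages, as AI aims to do for observe and orient in particular, tightens the whole loop.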
 
In the pursuit of developing a strategic advantage against our adversaries, it’s imperative for the U.S. Air Force to harness all available national assets. The MIT AI Accelerator emphasizes the fusion of advancing science with practical problem-solving through programs like the Phantoms. By fostering partnerships with industry and academia, the AI Accelerator aims to cultivate the integration of technology into military operations while remaining agile in the face of evolving threats. (U.S. Air Force video by Master Sgt. Christopher Griffin)

 

 

Ethical Considerations and Responsible AI Development



  Amidst the rapid advancement of AI technologies, the U.S. Air Force remains steadfast in its commitment to ethical principles and responsible development practices. Recognizing the potential implications of AI on societal norms and international relations, the Air Force prioritizes transparency, accountability and ethical stewardship in all facets of AI research and deployment. 

​  “It's important for us to understand what we're dealing with and to not overreact to the development of the technology,” Hamilton said. 

  “But in the same breath, we must really address the responsible and ethical development of artificial intelligence, just like we ethically address the development of all of our technology, from our guns to our planes to our bombs, to our communication systems. For me, our focus is understanding the technology enough that we are able to ensure that whatever we're developing is going to keep our military members and the civilian population safe.”

​​  Through interdisciplinary collaboration and engagement with external stakeholders, the U.S. Air Force seeks to address complex ethical dilemmas and ensure that AI systems align with the values of democracy, freedom and human dignity.

  A DOD working group that included Air Force members, the Responsible AI Working Council, was convened in 2020. In the summer of 2022, it published the Responsible AI Strategy and Implementation Pathway. 

  “They outlined five principles for AI ethics within the DOD: our AI systems need to be responsible, equitable, traceable, reliable and governable,” said Lt. Col. Joseph Chapa, chief responsible AI ethics officer, Department of the Air Force Chief Digital and Artificial Intelligence Office. 


Lt. Col. Joseph Chapa, the Department of the Air Force's chief responsible artificial intelligence ethics officer, poses for a photo at the Pentagon in Arlington, Va., Feb. 5, 2024. In this role, Chapa helps ensure that as the Air Force makes strides to adopt and implement AI into many processes and career fields, it does so responsibly, ethically and with a nuanced approach. (U.S. Air Force photo by Tech. Sgt. Janiqua P. Robinson)


​​​​​  “Responsible just means that an appropriate person at every phase of the lifecycle needs to maintain responsibility for the system; users need to know enough about what's happening under the hood to be able to make wise decisions about how to employ it. They need to be reliable, in the sense that they've been tested and evaluated for the specific use case for which they're intended. They need to be governable, meaning that we can impose the right bounds, the right guardrails and that we are always able to turn these systems off.”

  However, the last couple of years have seen an explosion of AI capability in large language models and generative AI.

​  “The traceability requirement under the DOD AI ethical principles in 2020 was based on the 10-year history of deep neural networks, which were getting pretty good at natural language processing and object classification,” Chapa said. “Many of those models are so large and so complex that we can't possibly call it traceable in the sense that is required by the DOD AI ethical principles.”

  According to Chapa, the open question going forward is whether the DOD should modify those ethical principles to reflect this new capability or restrict its use because it doesn't meet the existing DOD AI ethical principles. 

​​  “One possible solution to that is just to recognize the nuance here. Some use cases are very high risk, and some are very low risk. If we have a high-risk use case, something that's going to influence military operations or life and limb or even personnel actions that are going to affect people's careers, well then, we probably need to be more restrictive,” Chapa said. “But if we're using generative AI for low-risk use cases, say allowing people to leverage the efficiency in back-office applications, then maybe we don't need to hold them to the traceability standard.

  “It gets complicated, but one way to look at this is we should allow Airmen and Guardians to experiment to the max extent possible with low-risk use cases, so they can start to get the practice reps and sets and identify how these things might fail. 

  “Eventually, I think the entire workforce will be upskilled in these tools, but only if we give them the ability to experiment at scale. I'm not saying we can't use generative AI down the road for more complicated or higher-risk cases, but before we can do that, we have to build in the institutional guardrails so that we know how to review the output coming out of that black-box deep neural net or large language model system. If it's easy for the user to validate the answer, that is an ideal use case for generative AI.”

As the United States Air Force continues to push the boundaries of technological innovation, the integration of artificial intelligence is emerging as a cornerstone of its strategic vision. With unwavering determination and a steadfast commitment to excellence, the USAF is charting a course toward a future where AI-powered capabilities empower Airmen to soar to new heights, safeguarding the skies and securing the nation's interests with unparalleled precision and resolve.

 

Read more from the Airman Magazine issue exploring AI in the Air Force

 
AIRMAN MAGAZINE