The Defense Department’s path toward the adoption of artificial intelligence is guided by trust and responsibility, a senior Pentagon AI official said today.

Dr. William Streilein, chief technology officer for DOD’s Chief Digital and Artificial Intelligence Officer, said his office has launched a department-wide effort focused on understanding how the DOD can accelerate the adoption of generative AI to support the warfighter.

As part of that effort, known as Task Force Lima, Streilein said his office has identified nearly 200 use cases for how the department could leverage the breakthrough technology across a variety of functions.

“And we’re assessing them, we’re trying to understand which ones would be appropriate given the state of technology, which is important to acknowledge,” Streilein said during a discussion on the role of trusted AI in the DOD hosted by Government Executive, a government-focused publication based in Washington, D.C.

“There is still a lot to learn about it,” he said. “It definitely has commercial application, but within the DOD, the consequences are perhaps higher and we need to be responsible in how we leverage it.”

Streilein explained that it is critically important to establish trust in each application of the technology, meaning confidence that the AI algorithm produced the intended result.

"So that means we have to be good with our testing," he said. "We have to be able to specify what we want the algorithms to do, and then we can move forward with justified confidence."

He added that in addition to trust, the DOD places special emphasis on key tenets underpinning the ethical principles of AI: responsibility, reliability, equitability, governability and traceability. "Those are actually terms […] that apply to the human in their application of AI," he said. "Meaning that we should always be responsible in our use of AI. We should know how we're applying it, know that we have governance over it, know that we understand how it provided its answer."

Last month, the DOD released its strategy to accelerate the adoption of advanced artificial intelligence capabilities to ensure U.S. warfighters maintain decision superiority on the battlefield for years to come.

The 2023 Data, Analytics and Artificial Intelligence Adoption Strategy, developed by the Chief Digital and AI Office, builds upon and supersedes the 2018 DOD AI Strategy and the revised DOD Data Strategy, published in 2020, which laid the groundwork for the department's approach to fielding AI-enabled capabilities.

The strategy prescribes an agile approach to AI development and application, emphasizing speed of delivery and adoption at scale leading to five specific decision advantage outcomes:

  • Superior battlespace awareness and understanding
  • Adaptive force planning and application
  • Fast, precise and resilient kill chains
  • Resilient sustainment support
  • Efficient enterprise business operations

The blueprint also trains the department’s focus on several data, analytics and AI-related goals:

  • Invest in interoperable, federated infrastructure
  • Advance the data, analytics and AI ecosystem
  • Expand digital talent management
  • Improve foundational data management
  • Deliver capabilities for the enterprise business and joint warfighting impact
  • Strengthen governance and remove policy barriers

As the technology has evolved, the DOD and the broader U.S. government have been at the forefront of ensuring AI is developed and adopted responsibly.

In January, the Defense Department updated its 2012 directive governing the responsible development of autonomous weapon systems to align its standards with advances in artificial intelligence.

The U.S. has also introduced a political declaration on the responsible military use of artificial intelligence, which further seeks to codify norms for the responsible use of the technology.

Streilein said that trust and the ethical use of AI underpin the department's experimentation with the technology.

“A lot of what we’re doing to understand this technology is to figure out how we can be true to those five principles [of ethical use] in the context of what’s happening,” he said.
