By April, the Army plans to issue guidance on how its various organizations should and shouldn’t make use of artificial intelligence, based on lessons learned from a series of pilot efforts over the last several months. But at least one thing is already clear: cost considerations will be a significant factor going forward.
Under the auspices of Project Athena, which officials launched in November, the Army has been examining AI use cases that can deliver capability to broad swaths of the service, largely in back-office business functions. Technology and acquisition leaders have been looking at different deployment architectures, the potential utility of various AI tools, and cost models.
One goal is to find ways to deliver AI capabilities that can serve 80 percent of the Army, rather than letting “a thousand flowers bloom,” said Leonel Garciga, the Army’s chief information officer.
“These efforts are hyper-focused on back office, and on capabilities that are ready to go. There’s not a lot of [government R&D] work happening here,” he told reporters this week. “We’re seeing them in commercial space, a lot of us are using them personally, and it’s stuff that we can deploy on the network today. We started with an initial memo that said, ‘Hey, commanders, go do this. Run as fast as you can. Here are some guidelines, here’s some things you need to think about. Think about cybersecurity, think about resourcing.’ I think in this next phase, we’ll mature that and say, ‘Here’s what we learned over the last couple of months, and here’s where you’re best postured to use this capability against this resourcing profile and this cybersecurity profile.’”
Cost guardrails
One of those lessons: the Army will need to be deliberate about creating “guardrails” around commands’ use of commercial AI capabilities, mainly for cost reasons. Since most of the tools they’ll be employing use cloud consumption-based pricing models, bills can add up quickly.
In at least one recent case, the on-demand billing associated with an AI platform put an Army program at financial risk, Garciga said.
“One of our Army commands was asked to look at a national-level data problem, and they needed a response very quickly. And within 48 hours, we had the database, we had analytical work that was ready to go, and we unleashed the power of some of these tools in the cloud. It was a real fast spin-up, but all of a sudden I needed GPUs that weren’t in my budget,” he said. “That was done in a very unconstrained manner, so it really pushed the limits of their resourcing for their base cloud bill, because we didn’t have these guardrails in place at that time.”
One possible answer is to implement hard limits, enforced by commercial cloud service providers, so that commands and program offices don’t inadvertently run up huge bills. But Garciga said policing those cloud costs in the Army’s IT ecosystem is potentially more challenging than in purely commercial environments.
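As a rough illustration of what one of those provider-enforced guardrails can look like, the sketch below uses AWS Budgets to flag a workload before its spend blows past a monthly cap. The account ID, budget name, dollar amount and email address are hypothetical placeholders, not anything from the Army’s pilots, and other cloud providers offer comparable controls.

```python
# A minimal sketch of a cloud cost guardrail, assuming an AWS environment.
# AWS Budgets raises alerts as spend approaches a cap; harder enforcement
# (e.g., pausing compute) can be layered on top. All identifiers below are
# hypothetical placeholders.
import boto3

budgets = boto3.client("budgets")

budgets.create_budget(
    AccountId="123456789012",  # hypothetical account ID
    Budget={
        "BudgetName": "ai-pilot-monthly-cap",  # hypothetical budget name
        "BudgetLimit": {"Amount": "50000", "Unit": "USD"},
        "TimeUnit": "MONTHLY",
        "BudgetType": "COST",
    },
    NotificationsWithSubscribers=[
        {
            # Alert the program office at 80% of actual spend, well before
            # an unbudgeted GPU bill becomes a financial-risk problem.
            "Notification": {
                "NotificationType": "ACTUAL",
                "ComparisonOperator": "GREATER_THAN",
                "Threshold": 80.0,
                "ThresholdType": "PERCENTAGE",
            },
            "Subscribers": [
                {"SubscriptionType": "EMAIL", "Address": "program-office@example.mil"}
            ],
        }
    ],
)
```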
“Most CIOs in the commercial space would say that they’re building these guardrails to make sure that they don’t push on cloud costs so much that they price themselves out of the market through analytical work,” he said. “I think the more important question is how do we get to a business model that allows us to do guardrails in an environment where sometimes it’s cloud native, but sometimes we’re using cloud compute and storage with an organic large language model on top. How do we build guardrails into that? That’s the place where we’re spending a lot of time exploring. We need to understand whether or not we have enough fine-grained control to make sure that we don’t inadvertently price ourselves out when we’re running analytics.”
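One common way to get that kind of fine-grained visibility over a mixed environment is tag-based cost attribution: tag every resource a given model or analytics job consumes, then query spend per tag. The sketch below assumes an AWS-hosted workload and uses the Cost Explorer API; the tag key, tag value and dates are hypothetical.

```python
# A sketch of per-workload cost tracking via resource tags, assuming AWS
# Cost Explorer. The tag key/value and date range are hypothetical; other
# providers expose similar cost-attribution APIs.
import boto3

ce = boto3.client("ce")

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2025-02-01", "End": "2025-03-01"},
    Granularity="DAILY",
    Metrics=["UnblendedCost"],
    # Isolate the compute and storage consumed by one organic-LLM workload.
    Filter={"Tags": {"Key": "workload", "Values": ["organic-llm-analytics"]}},
)

for day in response["ResultsByTime"]:
    cost = float(day["Total"]["UnblendedCost"]["Amount"])
    print(day["TimePeriod"]["Start"], f"${cost:,.2f}")
    # An enforcement hook could pause jobs here once a daily ceiling is hit.
```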
On-prem AI hosting will be rare
But even in cases where the Army is using its own AI models and algorithms, in almost all instances, it’s likely to use commercial clouds for its computational and other infrastructure needs. Running large AI tasks in government-owned data centers will be fairly rare.
“[On-premises hosting] will be for those models that we need to have behind the fence line,” said Jennifer Swanson, the deputy assistant secretary of the Army for data, engineering and software. “I don’t think that financially, over time, it makes any sense to do this on-prem, because we are going to have to buy GPUs. We’re going to have to maintain that facility and that infrastructure. And three years from now, when those GPUs are no longer sufficient for the model we want to run, we’re going to have to buy new GPUs. So from the standpoint of back office, cloud is the way to go. They’re going to be able to keep up, and we’re not, so we should leverage what they’re providing.”
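Swanson’s argument is essentially a break-even calculation between buying and refreshing hardware versus paying for consumption. Every figure in the sketch below is a hypothetical placeholder rather than Army cost data, but it shows the structure of the comparison she describes.

```python
# A back-of-the-envelope break-even sketch for on-prem vs. cloud GPU hosting.
# Every number here is a hypothetical placeholder, not Army cost data.

gpu_capex = 2_000_000             # purchase cost of a GPU cluster, USD
facility_opex_per_year = 400_000  # power, cooling, staff, maintenance, USD
refresh_years = 3                 # hardware refresh cycle Swanson describes

cloud_rate_per_gpu_hour = 4.0     # hypothetical on-demand rate, USD
gpus = 64
utilization = 0.30                # fraction of hours the cluster is busy

hours_per_year = 8760
cloud_cost_per_year = cloud_rate_per_gpu_hour * gpus * hours_per_year * utilization
onprem_cost_per_year = gpu_capex / refresh_years + facility_opex_per_year

print(f"cloud:   ${cloud_cost_per_year:,.0f}/year")
print(f"on-prem: ${onprem_cost_per_year:,.0f}/year")
# At low or bursty utilization, cloud typically wins; the calculus shifts
# only for sustained, high-utilization workloads behind the fence line.
```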
As for the ultimate outcomes of Project Athena, Swanson said most of what the Army will issue in April will be new policies and guidance — not necessarily new ways of contracting for AI.
“There will be some contracts that we have that we can make available, but we’re definitely not going to say these are the only contracts you shall use,” she said. “There’s going to be other options for commands, if they so choose, and there probably will be some policy that comes out based on what we learn. It will constrain: if you are going to solicit for some other capability, you shall do this or shall not do this, and it’s really going to be based on what we learn as we proceed through the next couple of months.”