TL;DR

- The US Department of Defense is reportedly tapping Gemini for classified projects with minimal restrictions from Google.
- The deal allows the Pentagon to use Gemini for “any lawful government purpose.”
- Inside Google, backlash is building again, with hundreds of employees warning about real-world harm from AI misuse.
The US Department of Defense has decided to rely on Google Gemini for classified projects. A new agreement between the search giant and the Pentagon reportedly gives the latter full access to Google’s AI models, and Mountain View will have no say in how the technology is used.
According to The Information, a source familiar with the deal says the contract lets the DOD use Gemini for “any lawful government purpose.” Google can recommend restrictions, such as not using the AI for autonomous weapons or domestic mass surveillance without human oversight, but the government is not obligated to follow these suggestions.
This decision comes just two months after the Pentagon blacklisted Anthropic, officially over supply chain risks. In reality, Anthropic would not allow its Claude AI to be used for certain military purposes, such as autonomous weapons and mass surveillance. Neither the DOD nor President Trump accepted this stance; Trump indicated Anthropic could be reconsidered in the future, but for now the company is excluded.
Cameron Stanley, the Pentagon’s chief digital and AI officer, told CNBC that “overreliance on one vendor is never a good thing.” The DOD is now depending heavily on Google, OpenAI, and xAI for classified projects. Stanley says Gemini is already saving “thousands of man hours on a weekly basis” for US military personnel.