Anthropic’s relationship with the U.S. government is getting complicated

Anthropic has so far insisted it can still work with the Trump administration.

On October 21, the company’s CEO, Dario Amodei, published a new blog post in what he described as an effort to address inaccurate claims circulating about Anthropic’s policy positions.

His post follows comments from David Sacks, a prominent tech venture capitalist who currently serves as the Trump administration’s AI czar, accusing Anthropic of pushing state-level regulation and of working with Democratic mega-donors. Since then, the narrative has gained traction in right-wing spaces online. The comments also follow the release of an executive order earlier this year targeting “woke AI,” though officials have yet to say how it will be enforced.

In the post, Amodei defends Anthropic’s work on AI safety, which he argues should take priority over politics. He also doubles down on the company’s support for regulating AI at the state level in the absence of a national standard.

Referring directly to comments JD Vance has made about AI, Amodei argued that, as with medicine and disease prevention, the goal should be to maximize the applications that help people while minimizing the harmful ones.

The CEO also pushed back on the notion that the company’s flagship chatbot is more susceptible to political bias than comparable language models. Republicans, including President Donald Trump, have stepped up accusations that the country’s leading AI companies are building politically biased models, echoing charges leveled against social media companies in recent years.

In short, Amodei sought to reaffirm Anthropic’s commitment to AI safety: protecting the human species and society from artificial general intelligence powerful enough to cause serious harm. It all comes amid the company’s attempt to land more government work.

“Anthropic strives for constructive engagement on policy issues. When we agree, we say so,” he said. “When we don’t, we offer an alternative. Our mission is to ensure AI benefits everyone, and because we believe those goals are shared on both sides of the aisle, we try to be purposeful and fair. We will continue to stand up for the policies we believe in.”

Federal contracts

Amodei highlighted Anthropic’s existing work with the federal government, including contracts with the Pentagon and partnerships with the Department of Energy’s national laboratory system. Anthropic, along with competitors such as OpenAI, Google, and xAI, is also working with the General Services Administration to provide its enterprise service to federal agencies at a discounted price.

Anthropic’s work has gone smoothly so far, a government official familiar with the matter, who serves as an adviser on science and technology policy within GSA, told Fast Company. Last month, Democrats launched an ethics inquiry into an arrangement that allows private investors to serve in government while retaining some of their investments.

Anthropic has received good feedback from U.S. government users of its GSA offering, a company spokesperson says. The AI developer also points to its ongoing partnership with Palantir, which involves FedRAMP, a wonky but critical cloud security review program that vendors must clear to offer technology across federal agencies.

Palantir is a controversial technology contractor whose work on both the defense and civilian sides of government has grown in recent years. As part of that work, Palantir has already been cleared to provide its cloud technology to federal agencies.

While Anthropic is collecting government contracts, it appears to be behind OpenAI in pursuing its own independent FedRAMP authorization. That could be a game changer for OpenAI: if it wins the accreditation, it could pitch directly to the government rather than going through another company, such as Microsoft. At that point, OpenAI would be a freer government contractor, able to maintain more independence from the major cloud companies.

The same government official told Fast Company that Anthropic has not yet shared a plan to obtain accreditation for its systems through the program or to otherwise sponsor a review. A spokesperson for the GSA declined to comment.
