Takeaways

The Executive Order outlines a multi-agency strategy to investigate and challenge state AI statutes inconsistent with a “minimally burdensome” national AI policy, and to develop legislation preempting state authority to regulate AI.
It continues a broader trend, including the earlier “Big Beautiful Bill” proposal for a 10-year moratorium on state AI laws, toward constraining state AI authority in the name of regulatory uniformity and national competitiveness.
Companies subject to state AI regimes, including those in California, Colorado and other active jurisdictions, should expect continued evolution in this area.

Background: Federal AI Policy and State AI Laws
Earlier this year, President Trump revoked Executive Order 14110, “Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence,” issued under the prior administration, and replaced it with Executive Order 14179 (the “EO”), “Removing Barriers to American Leadership in Artificial Intelligence.” EO 14179 directs agencies to identify and roll back regulations that could act as barriers to artificial intelligence (AI) innovation and calls for the development of a national AI Action Plan focused on competitiveness and reduced regulatory burdens.

As Pillsbury previously explained in its AI Action Plan alert, this pivot signaled a shift toward minimal oversight and industry self-regulation, with the federal government de-emphasizing regulation and emphasizing AI leadership, reduced regulatory barriers and public–private collaboration.

In Congress, lawmakers have simultaneously wrestled with the question of federal preemption of state AI laws. The 2025 House budget reconciliation bill, widely known as the “One Big Beautiful Bill,” would have imposed a 10-year moratorium on enforcement of state and local AI laws, broadly suspending existing and future state AI frameworks across sectors such as employment, health care and financial services without immediately creating replacement federal standards. That moratorium language was ultimately stripped from the bill in the Senate, which voted 99–1 against penalizing states for enacting AI laws.

States, meanwhile, continue to legislate. California, Colorado, New York and Illinois have adopted or proposed comprehensive AI and algorithmic accountability laws addressing transparency, bias mitigation, documentation and sector-specific safeguards. Commentators estimate that more than 1,000 AI-related bills have been introduced across nearly every state in 2024–2025.

The AI framework in the United States differs greatly from that of the EU, which has taken a uniform, protective approach to AI regulation.

Overview of the Executive Order
The EO states that the Trump administration will advance its objective of removing barriers to United States AI leadership. It declares that “United States AI companies must be free to innovate without cumbersome regulation” and identifies (1) a “patchwork” of 50 different regulatory regimes, (2) the risk of “ideological bias” in models, and (3) impermissible state regulation of interstate commerce as the impetus for further action.

To address these issues, the EO directs the:

  • U.S. Department of Justice (DOJ) to establish and operate an AI Litigation Task Force to challenge state AI laws inconsistent with the federal AI policy;
  • U.S. Secretary of Commerce to evaluate state AI laws and classify them as either onerous or consistent with federal AI policy, and then to impose conditions on federal funding, or grant additional funds, based on the results of that evaluation;
  • U.S. Federal Communications Commission (FCC) and U.S. Federal Trade Commission (FTC) to, respectively, (a) determine whether to adopt a federal reporting and disclosure standard for AI models that would preempt conflicting state laws, and (b) issue a policy statement explaining how the FTC Act’s prohibition on unfair or deceptive acts or practices applies to AI models and when state laws requiring alterations of truthful outputs are preempted; and
  • U.S. Special Advisor for AI and Crypto and U.S. Assistant to the President for Science and Technology to jointly prepare a legislative recommendation establishing a uniform federal AI policy framework that preempts conflicting state AI laws.

Notably, the above actions will not preempt state AI laws related to (i) child safety protections; (ii) AI compute and data center infrastructure; and (iii) state government procurement and use of AI.

How the EO Impacts State AI Laws
The EO does not directly amend, repeal or suspend any state AI law. State statutes and regulations remain effective unless and until they are changed by the relevant state, preempted by federal law or invalidated by a court.

The EO:

  • Signals federal scrutiny of state AI laws. DOJ is directed to establish an AI Litigation Task Force whose responsibility is to challenge state AI laws considered inconsistent with the policy in the EO, and U.S. Department of Commerce (Commerce) is directed to identify state laws that may conflict with that policy.
  • Connects certain federal funding decisions to state AI laws. Depending on how agencies implement Section 5, a state’s eligibility for Broadband Equity, Access, and Deployment (BEAD) non-deployment funds and certain discretionary grants may be affected by whether it has enacted and enforces AI laws identified in Commerce’s evaluation or challenged by DOJ.
  • Provides a framework for future agency and legislative action. The EO anticipates FCC and FTC actions and a legislative recommendation on AI preemption, rather than establishing a comprehensive federal AI regulatory scheme on its own.

The EO, by itself, does not resolve the extent to which federal law will ultimately interact with specific state AI regimes. That will depend on how agencies implement the EO and how courts assess any resulting litigation or regulatory actions.

Key Considerations for Stakeholders
For companies that develop, deploy or rely on AI, particularly those operating in jurisdictions with active or emerging AI regimes, the EO raises several practical and planning considerations:

State AI Laws Continue to Apply, But May Be Challenged
State AI laws in California, Colorado and other jurisdictions continue to apply. At the same time, those laws may become the subject of DOJ litigation or other federal action prompted by the EO. Companies should be aware that the legal status of certain state AI requirements could change over time as litigation proceeds or as states and agencies respond to the EO.

Contracts related to the use of AI should account for potential changes in law, and should delineate responsibility for monitoring, compliance and adaptation in response to those changes.

Possible Implications for Funding-Related Projects
Entities that rely on state-administered federal funds, including broadband and infrastructure providers, should consider how the EO’s direction to agencies regarding funding conditions could affect projects that involve BEAD or other discretionary funds. Section 5 of the EO contemplates that states may be asked either not to enact certain AI laws or, for existing laws, to enter into binding agreements with federal agencies not to enforce those laws during the period in which they receive funding.

Interaction with Internal AI Governance
The EO focuses on the federal-state allocation of authority rather than on detailed technical standards for AI systems. Companies that have developed internal AI governance and compliance programs to meet state AI obligations will need to track how those obligations evolve but may find that many of the underlying risk-management practices remain relevant irrespective of changes in specific legal provisions.

Recommended Actions
Although many details will depend on how agencies implement the EO and how courts respond to any litigation, companies can consider the following steps now:

  • Identify relevant state AI laws. Catalog which state AI, automated decision-making and algorithmic accountability regimes apply to your operations, focusing on jurisdictions that have enacted comprehensive frameworks.

  • Assess funding dependencies. Determine whether your organization or key projects rely on federal or state-administered funding that could be affected by AI-related funding conditions.

  • Monitor federal and state developments. Track implementation of the EO, including:

    -  DOJ announcements regarding the AI Litigation Task Force;

    -  Commerce’s report on state AI laws and any funding-related notices; and

    -  FCC/FTC developments and any legislative proposals relating to AI and preemption.

    Continue to follow state legislative and regulatory activity, as states may respond to the EO in a variety of ways.

  • Coordinate with counsel. Work with legal counsel to review how the EO may intersect with existing contractual obligations, regulatory requirements and AI-related risk management frameworks, and to prepare for potential changes in the legal landscape for state AI laws.

  • Advocate for workable AI laws and regulations. Retain government affairs experts to influence federal agency, congressional and state-level legislative and regulatory activity related to AI. In particular, a government affairs team that can coordinate your organization’s response to the many AI-related legislative and regulatory actions across the states and the federal government is critical.

Pillsbury’s Technology Transactions team is available to help clients assess how the EO may affect their AI arrangements, update contracting and governance frameworks, and negotiate vendor and customer agreements in light of evolving state and federal AI regimes. Pillsbury’s Government Law & Strategies team is available to advise clients on the implications of this policy shift, assist in drafting public comments, and develop and execute an advocacy strategy to influence federal and state AI policy.

These and any accompanying materials are not legal advice, are not a complete summary of the subject matter, and are subject to the terms of use found at: https://www.pillsburylaw.com/en/terms-of-use.html. We recommend that you obtain separate legal advice.