On March 20, 2026, the White House unveiled its National Policy Framework for Artificial Intelligence, a blueprint of legislative recommendations urging Congress to act. It recommends that Congress create a unified federal standard to reduce the regulatory friction of competing state AI regimes, promote AI innovation, and develop an AI-ready workforce, while ensuring the protection of children, consumers, and intellectual property rights.

The Framework’s Seven Pillars

The recommendations cover seven core pillars:

  1. Protect children – Calls for age-assurance requirements, parental control tools, limits on data collection from minors, and features to reduce risks of exploitation and self-harm on AI platforms.
  2. Safeguard communities – Recommends “augmenting law enforcement efforts to combat” AI-related fraud, limiting energy cost impacts, streamlining federal permitting for AI infrastructure, providing AI resources to small businesses, and ensuring national security agencies have sufficient technical capacity to assess frontier AI model capabilities and associated risks.
  3. Respect intellectual property rights – Affirmatively states that AI training on copyrighted material does not violate copyright law but defers final resolution of that question to the courts. Encourages exploration of voluntary licensing frameworks for rights holders and protection against unauthorized AI-generated digital replicas.
  4. Encourage free speech – Urges safeguards against government coercion of AI providers to censor lawful expression and would give consumers the ability to seek redress against federal censorship efforts.
  5. Promote AI innovation and dominance – Proposes “regulatory sandboxes for AI applications,” accessible federal datasets for training AI models, and recommends that no new federal AI regulatory body should be created, relying instead on existing agencies and industry-led standards.
  6. Empower the workforce – Encourages AI educational training and support programs to develop an AI-ready workforce.  
  7. Preempt state laws – Seeks a uniform national standard that preempts potentially unduly burdensome state AI laws while preserving states’ traditional police powers, consumer protections, and zoning authority.

The Remaining Gaps

The framework understandably cannot cover every facet of AI policy, and it is largely silent on regulatory enforcement and a comprehensive data privacy regime (though it does address children’s data and privacy). It does not propose specific penalties, compliance mechanisms, or oversight structures for companies developing or deploying AI. Nor does it address potential AI-driven discrimination, algorithmic accountability, or how existing agencies should coordinate enforcement, if at all.

As previously published in Compliance & Enforcement, “the absence of a federal AI framework has left existing legal doctrines—privilege law, constitutional Commerce Clause analysis, decades-old fraud statutes—to absorb questions they were never designed to answer.” This remains an open issue, as illustrated, for example, in the recent Southern District of New York opinion applying attorney-client privilege and attorney work-product—traditional legal doctrines—to novel AI questions without legislative guidance. 

Preemption Needed to Prevent Inconsistency

The framework tracks principles from the White House Executive Order on Ensuring a National Policy Framework for Artificial Intelligence (December 11, 2025), which invoked existing executive authority and general Commerce Clause preemption principles to check state AI regulation. The framework’s call for preemption goes further, noting that AI development is “an inherently interstate phenomenon with key foreign policy and national security implications.” This push for Congressional action implicitly concedes that executive authority alone may be insufficient. Until Congress acts, states retain room to pursue their own AI regimes and the AI legal landscape will remain in flux.

Conclusion

The framework is a serious, if incomplete, attempt to bring coherence to an enforcement landscape that has so far been improvised. The seven pillars address various pressure points, including preemption, IP, child safety, and censorship, but the absence of any enforcement architecture means that even if Congress acts, implementation questions will land back in the agencies and courts. In releasing this framework, the executive branch may be conceding, implicitly, that it cannot implement its objectives alone. Congress has been handed a blueprint, but whether it is able to enact comprehensive federal legislation is another matter. Companies using AI should not wait for Congress to act before assessing their exposure.

Anthony Q. Le

Anthony has a broad array of experiences assisting with compliance issues, regulatory and enforcement matters, internal investigations, and individual and class litigation. His diverse practice helps him achieve the most efficient and practical results for his clients spanning the financial services, technology, automobile, and retail sectors.

Aaron R. Marienthal

Aaron is a dynamic litigator and regulatory counsel with extensive experience in high-stakes consumer financial services matters. He represents financial institutions, fintechs, and technology companies related to a variety of consumer lending issues, payments products, and legislative and government affairs.

Garen S. Marshall

Garen Marshall, a former federal prosecutor and United States Navy special forces combat veteran, is a member of the firm’s Government Investigations and White Collar Litigation Department. Garen’s practice leverages his years in the private sector and his tenure at the United States Attorney’s Office for the Eastern District of New York, where he served in the Office’s National Security and Cybercrime Unit and the Organized Crime and Gangs Unit. Garen focuses his practice on financial fraud and corruption investigations across a range of industries, including banking, private equity, healthcare, financial technology, and cryptocurrency, and advises clients on emerging enforcement, litigation, and compliance risks, including those arising from the use of artificial intelligence. He is a frequent author and speaker on AI governance and risk and is recognized as a thought leader on the evolving regulatory and enforcement landscape.

Janet P. Peyton

Janet currently serves as the firm’s office managing partner for the Richmond office. She practices in the areas of intellectual property and data privacy and security. Janet provides worldwide brand protection, enforcement, licensing and transactional IP services, and she assists clients with preventive data security as well as compliance issues in the aftermath of a data breach.

David Hirsch

Dave is a highly respected member of the securities enforcement and regulatory counseling practice group at McGuireWoods, where he plays a key role shaping the strategic direction of the firm’s securities enforcement initiatives. Before joining McGuireWoods, Dave was Chief of the Crypto Assets and Cyber Unit in the SEC Division of Enforcement, and prior to that served as enforcement counsel to SEC Commissioner Crenshaw. He is a recognized expert and frequent speaker with a robust practice that spans a wide array of complex regulatory and enforcement matters, particularly those involving crypto and cyber.