Innovation, Interrupted: How Congress Gave AI a Decade Without Rules


Buried in Section 43201(c) of H.R. 8281 - better known as President Trump’s “Big, Beautiful Bill” - was a clause with no clear connection to border policy and every indication it was meant to slip through unnoticed. Marketed as a sweeping package on immigration and federal spending, the bill became a vehicle for much more: a deregulatory wish list tucked into an omnibus frame.

One provision stood out, and not for what it created, but for what it erased.

The clause bars states from enforcing any law that specifically regulates artificial intelligence for the next ten years. Whatever protections happen to exist today, whether robust, vague, or nonexistent, are frozen in place. No state can raise the bar, and no new law can fill the gap.

To be clear, this is not yet law. Passed narrowly by the House but still awaiting Senate consideration, Section 43201(c) nonetheless signals how AI oversight might look for the next decade.

At the federal level? Silence. Congress hasn’t passed a comprehensive AI framework. Agencies like the FTC and EEOC can stretch their existing consumer protection and anti-discrimination authorities to cover AI, but they lack a coordinated strategy or clear direction. Just an empty space where policy should be.

And that empty space comes at a moment when AI is moving fast - fast enough to transform how we work, learn, govern, and live.

AI today isn’t just answering emails or recommending playlists. It’s writing code, drafting contracts, designing molecules, modeling supply chains, and generating media indistinguishable from human speech. It’s helping doctors detect cancer earlier and helping bad actors spin up synthetic disinformation instantly. AI is redefining the limits of productivity, as well as the boundaries of human agency.

The technology holds astonishing promise. Economic acceleration. Scientific breakthroughs. Education tailored to every learner. Medical insights previously impossible. But with that promise comes the need for responsibility, and right now, this clause pushes it off the table.

Most states haven’t passed meaningful AI laws. And under Section 43201(c), they couldn’t, unless their laws fit narrow exceptions like criminal statutes or rules that treat AI no differently from spreadsheets.

For a technology capable of reshaping modern life, that’s not governance. That’s a green light…and a blindfold.

The Clause, Unpacked

The actual statutory language:

“No State or political subdivision thereof may enforce any law or regulation limiting, restricting, or otherwise regulating artificial intelligence models, artificial intelligence systems, or automated decision systems during the 10-year period beginning on the date of enactment of this Act.”

This single sentence would trigger a sweeping preemption of state and local AI laws. If a state wanted to require transparency about how AI decides who gets hired, to prohibit AI from denying insurance claims based on zip codes, or to oversee predictive policing tools, those efforts would likely be preempted. Even consumer disclosures about chatbots or automated product reviews could become unenforceable.

Unless Congress acts, and there’s no indication it will, governance of AI in America would be dictated by a vacuum.

A Policy Loophole Disguised as Uniformity

This wasn’t an accident.

Tech companies have pushed hard for a federal “framework,” but not necessarily for accountability. What they feared most wasn’t overregulation from Washington; it was a fragmented patchwork of state rules.

California, in particular, spooked them. A 2024 bill by State Senator Scott Wiener would have imposed tiered safety obligations on large AI models. The industry response was swift and coordinated. White papers warned of a “patchwork problem.” Lobbyists flooded Capitol Hill. Think tanks predicted regulatory chaos.

And then, hidden in a budget and border bill, came the clause.

Sources described it as a “strategic prophylactic” - a preemptive strike ensuring AI regulation, if it ever happens, will be federal or nonexistent.

That’s not regulation. That’s preemption as policy.

States as First Responders

While Congress dithered, states acted.

For years, state governments have done the heavy lifting on AI oversight, responding to real-world cases with real policy tools:

  • New York City enforced bias audits of automated hiring tools, forcing companies to prove fairness;

  • Colorado adopted comprehensive AI oversight, focusing on transparency and risk mitigation;

  • Illinois enforced its biometric privacy law, targeting facial recognition without consent; and

  • California, Vermont, Utah, and Connecticut advanced AI-specific regulations.

States took AI seriously, balancing innovation with accountability. Federalism, in this case, was working: When Congress didn’t lead, states stepped in.

Now, those states are being told to sit down. Under H.R. 8281, their efforts could be rolled back or frozen, legally sidelined from meaningful action. Attorneys general would face years of litigation defending existing protections.

The result? Corporate certainty, consumer confusion, and regulatory paralysis.

A Tale of Two Americas - and Fifty Speed Limits


This move has echoes.

With cannabis, federal silence allowed states to experiment and innovate. Licensing systems, equity programs, public health campaigns - states led, despite federal illegality. Fragmentation spurred policy innovation, ultimately pushing federal reform forward.

In AI, we’re seeing the opposite. Federal power asserts itself not to lead but to preempt state action. No experimentation. No innovation. Just enforced inertia.

Cannabis is, by any rational measure, less disruptive than artificial intelligence. Cannabis might impair for hours - AI may silently determine life opportunities without recourse.

Yet cannabis remains federally illegal, while AI, the century’s most consequential technology, gets a decade-long regulatory holiday.

This isn’t federal oversight. This is regulatory abdication.

The Global Regulatory Clock Ticks

While the U.S. pauses oversight, the world moves forward:

  • The EU’s AI Act establishes clear risk-based obligations;

  • China mandates registration and security reviews for generative AI; and

  • The UK, though pro-innovation, assigned regulators clear oversight roles.

These countries recognize the urgency of governing. Meanwhile, H.R. 8281 blocks U.S. states from even trying. Rather than simplifying compliance, it complicates it, forcing global companies either to adopt multiple standards or default to foreign rules.

Ironically, this U.S. provision may mean our AI standards will be dictated by Brussels or Beijing.

An Exemption Wrapped in a Myth

In 2023, AI companies publicly welcomed regulation. But when California and Colorado attempted meaningful oversight, their tune changed. Suddenly, “Regulate us” became “Not like that.”

State laws were too fast, too enforceable, too risky. A federal vacuum, with no standards, no interference, was safer.

Thus, H.R. 8281 didn’t regulate AI. It regulated regulation. Not cowardice…choreography.

Final Thought

We’ve seen this strategy before:

  • A small telecom clause created legal immunity for internet platforms;

  • Financial loopholes built hidden risks leading to a global crisis; and

  • Tax provisions shielded online commerce from state oversight.

Small clauses, massive consequences.

But this time it’s bigger. It’s about algorithms determining jobs, healthcare, safety, and freedom.

And Congress is telling states: Your hands are tied - come back in 2035.



Shawn Collins

Shawn Collins is one of the country’s foremost experts in cannabis policy. He is sought after to opine and consult not just on policy creation and development, but on program implementation as well. He is widely recognized for his creative mind as well as his thoughtful and successful leadership of both startup and bureaucratic organizations. In addition to cannabis, he has well-documented expertise in health care and complex financial matters.

Shawn was unanimously appointed as the inaugural Executive Director of the Massachusetts Cannabis Control Commission in 2017. In that role, he helped establish Massachusetts as a model for the implementation of safe, effective, and equitable cannabis policy, while simultaneously building out and overseeing the operations of the East Coast’s first adult-use marijuana regulatory agency.

Under Shawn’s leadership, Massachusetts’ adult-use Marijuana Retailers successfully opened in 2018 with a fully regulated supply chain unparalleled by their peers, complete with quality control testing and seed-to-sale tracking. Since then, the legal marketplace has grown at a rapid pace and generated more than $5 billion in revenue across more than 300 retail stores, including $1.56 billion in 2023 alone. He also oversaw the successful migration and integration of the Medical Use of Marijuana Program from the stewardship of the Department of Public Health to the Cannabis Control Commission in 2018. The program has since more than doubled in size and continues to support nearly 100,000 patients due to thoughtful programmatic and regulatory enhancements.

Shawn is an original founder of the Cannabis Regulators Association and also helped formalize networks that provide policymakers with unbiased information from the front lines of cannabis legalization, even as federal prohibition persists. At the height of the COVID-19 pandemic, Collins was recognized by Boston Magazine as one of Boston’s 100 most influential people for his work to shape the emerging cannabis industry in Massachusetts.

Before joining the Commission, Shawn served as Assistant Treasurer and Director of Policy and Legislative Affairs to Treasurer Deborah B. Goldberg and Chief of Staff and General Counsel to former Sen. Richard T. Moore (D-Uxbridge). He currently lives in Webster, Massachusetts with his growing family. Shawn is a graduate of Suffolk University and Suffolk University Law School, and is admitted to practice law in Massachusetts.

Shawn has since founded THC Group in order to leverage his experience on behalf of clients, and to do so with a personalized approach.

https://homegrown-group.com