OpenAI sent a letter to Governor Gavin Newsom proposing that California align its AI rules with federal and global frameworks. Why now? Because, the company says, a patchwork of state laws could slow innovation without improving safety. (openai.com, cdn.openai.com)
What OpenAI proposes
OpenAI summarizes its ask in practical, concrete steps to avoid a regulatory "perfect storm" between states and with federal rules:
- Harmonize state regulation with federal standards and international frameworks like the EU AI Act Code of Practice (CoP).
- Recognize that frontier model developers can be treated as meeting state requirements when they have security agreements with relevant federal agencies, such as CAISI.
- Avoid duplicating obligations that penalize SMEs and startups; propose exemptions or differentiated compliance paths for small teams.
- Encourage developers to work with agencies that have the technical and national-security capacity for advanced evaluations.
These recommendations appear explicitly in the letter to the governor. (cdn.openai.com)
What problem is this letter trying to solve?
Can you imagine if every state had its own instruction manual for building a phone or an electrical grid? That’s what OpenAI warns could happen with AI: roughly 1,000 state initiatives in progress could create a legal patchwork that confuses more than it protects.
The letter argues that duplication and regulatory inconsistency can slow down the safe adoption of useful technologies. (openai.com, cdn.openai.com)
"In the Age of Intelligence, clarity about AI is not optional" — the phrase that guides the request. (openai.com)
OpenAI uses two helpful analogies: the Space Race (to illustrate the cost of fragmented regulations) and CEQA (California’s environmental law) as a lesson in how well-intentioned rules can have adverse effects if not properly calibrated. (cdn.openai.com)
What are the implications for startups, companies and citizens?
For startups: fewer duplicated regulations mean lower legal bills and more resources for product and hiring. Sound familiar? Paying lawyers instead of paying engineers. OpenAI asks for compliance paths that don’t strangle small teams. (cdn.openai.com)
For national security: the letter highlights that certain tests require access to sensitive information and expertise that only federal agencies can provide, so the state alone might not have the tools to properly evaluate frontier models. (cdn.openai.com)
For California’s economy: OpenAI reminds readers that the AI ecosystem brings billions in revenue and jobs to the state; regulation should balance safety with economic competitiveness. (openai.com, cdn.openai.com)
Risks and open questions
Does this mean industry gets to write the rules? Not necessarily, but it does open a debate about who sets standards and with what values: do we prioritize competitiveness, safety, rights protection, or all of the above? There are real tensions between state sovereignty and the need for evaluations that, by nature, require national and international coordination.
Another risk: a federal or international framework that favors large players could exclude emerging innovators if exemptions aren’t designed correctly. That’s exactly the concern OpenAI tries to address by asking for SME exemptions. (cdn.openai.com)
What’s next and how you can learn more
OpenAI also offered to engage in dialogue with the state government and says it is willing to support a "California approach" that reinforces global and federal standards. The letter is dated mid‑August 2025 and is publicly available. If you want to read it in full, it’s here: read OpenAI’s full letter to Gov. Newsom. (cdn.openai.com, openai.com)
Closing: why should you care?
Because how we regulate AI today shapes which companies survive tomorrow, what jobs get created, and how your digital rights are protected. Do you prefer a mosaic of rules that limits options, or a coherent framework that aims for both safety and competition?
OpenAI’s letter is a nudge for California to think big without choking the small players.
The good news: the conversation has already started. Now we’ll see if the response is practical coordination or more state-level patches. What do you think?