New York state lawmakers passed a bill on Thursday that aims to prevent frontier AI models from OpenAI, Google, and Anthropic from contributing to disaster scenarios, including the death or injury of more than 100 people, or more than $1 billion in damages.
The passage of the RAISE Act represents a win for the AI safety movement, which has lost ground in recent years as Silicon Valley and the Trump Administration have prioritized speed and innovation. Safety advocates including Nobel laureate Geoffrey Hinton and AI research pioneer Yoshua Bengio have championed the RAISE Act. Should it become law, the bill would establish America’s first set of legally mandated transparency standards for frontier AI labs.
The RAISE Act has many of the same provisions and goals as California’s controversial AI safety bill, SB 1047, which was ultimately vetoed. However, the bill’s co-sponsor, New York state Senator Andrew Gounardes, told gajed in an interview that he deliberately designed the RAISE Act so that it doesn’t chill innovation among startups or academic researchers — a common criticism of SB 1047.
“The window to put in place guardrails is rapidly shrinking given how fast this technology is evolving,” said Senator Gounardes. “The people that know [AI] the best say that these risks are incredibly likely […] That’s alarming.”
The RAISE Act is now headed to New York Governor Kathy Hochul’s desk, where she could either sign the bill into law, send it back for amendments, or veto it altogether.
If signed into law, New York’s AI safety bill would require the world’s largest AI labs to publish thorough safety and security reports on their frontier AI models. The bill also requires AI labs to report safety incidents, such as concerning AI model behavior or bad actors stealing an AI model, should they happen. If tech companies fail to live up to these standards, the RAISE Act empowers New York’s Attorney General to bring civil penalties of up to $30 million.
The RAISE Act aims to narrowly regulate the world’s largest companies — whether they’re based in California (like OpenAI and Google) or China (like DeepSeek and Alibaba). The bill’s transparency requirements apply to companies whose AI models were trained using more than $100 million in computing resources, and are being made available to New York residents.
Silicon Valley has pushed back significantly on New York’s AI safety bill, New York state Assemblymember and RAISE Act co-sponsor Alex Bores told gajed. Bores called the industry resistance unsurprising, but claimed that the RAISE Act would not limit the innovation of tech companies in any way.
Anthropic, the safety-focused AI lab that called for federal transparency standards for AI companies earlier this month, has not taken an official stance on the bill, co-founder Jack Clark said in a Friday post on X. However, Clark expressed reservations about how broad the RAISE Act is, noting that it could present a risk to “smaller companies.”
When asked about Anthropic’s criticism, state Senator Gounardes told gajed he thought it “misses the mark,” noting that he designed the bill not to apply to small companies.
OpenAI, Google, and Meta did not respond to gajed’s request for comment.
Another common criticism of the RAISE Act is that AI model developers would simply not offer their most advanced AI models in the state of New York. A similar criticism was leveled against SB 1047, and it’s largely what has played out in Europe thanks to the continent’s tough regulations on technology.
Assemblymember Bores told gajed that the regulatory burden of the RAISE Act is relatively light and therefore shouldn’t push tech companies to stop offering their products in New York. And given that New York has the third-largest GDP among U.S. states, pulling out of the state is not something most companies would take lightly.
“I don’t want to underestimate the political pettiness that might happen, but I am very confident that there is no economic reason for them to not make their models available in New York,” said Assemblymember Bores.