Anthropic Announces Cautious Support for New California AI Regulation Legislation
"It is our best understanding that this interplay will not end up causing unnecessary pre-harm enforcement, but the language has enough ambiguity to raise concerns," Amodei wrote. "If implemented well, this could lead to well-defined standards for auditors and a well-functioning audit ecosystem, but if implemented poorly this could cause the audits to not focus on the core safety aspects of the bill."
"The bill's treatment of injunctive relief." Another place pre-harm enforcement still exists is that the Attorney General retains broad authority to enforce the entire bill via injunctive relief, including before any harm has occurred. This is substantially narrower than previous pre-harm enforcement, but is still a vector for overreach.
"Miscellaneous other issues." The company's list of concerns also included know-your-customer requirements on cloud providers, overly short notice periods for incident reporting, and overly expansive whistleblower protections that are subject to abuse, were not addressed.
"The burdens created by these provisions are likely to be manageable, if the executive branch takes a judicious approach to implementation," Amodei wrote. "If SB 1047 were signed into law, we would urge the government to avoid overreach in these areas in particular, to maintain a laser focus on catastrophic risks, and to resist the temptation to commandeer SB 1047's provisions to accomplish unrelated goals."
Opponents of the bill, which include OpenAI, Meta, Y Combinator, and venture capital firm Andreessen Horowitz, argue that its thresholds and liability provisions could stifle innovation and unfairly burden smaller developers. They criticize the bill for regulating at the model level rather than targeting specific misuse, and warn that strict requirements could drive innovation overseas and harm the open source community.
Anjney Midha, General Partner at Andreessen Horowitz, has expressed concerns that startups, founders, and investors will feel blindsided by the bill and emphasized the need for lawmakers to consult with the tech community.
In an open letter, the AI Alliance, a group focused on safe AI and open innovation, voiced its concerns. The group noted that, although SB 1047 doesn't directly target open-source development, it would significantly affect it. The bill requires developers of AI models trained with 10^26 or more floating-point operations to implement a shutdown capability, but it doesn't address how this would work for open-source models. Although no such models exist yet, the group argues the bill could freeze open-source AI development at its 2024 level.
Several California representatives, including Ro Khanna, Anna Eshoo, and Zoe Lofgren, have opposed the bill, citing concerns about its impact on the state's economy and innovation ecosystem.
About the Author
John K. Waters is the editor in chief of a number of Converge360.com sites, with a focus on high-end development, AI and future tech. He's been writing about cutting-edge technologies and the culture of Silicon Valley for more than two decades, and he's written more than a dozen books. He also co-scripted the documentary film Silicon Valley: A 100 Year Renaissance, which aired on PBS. He can be reached at [email protected].