A bill with support and opposition from dozens of stakeholders cleared a key legislative hurdle Aug. 15 and will next proceed to the Assembly floor in the coming weeks.
Senate Bill 1047, authored by Sen. Scott Wiener with the intention of regulating the AI industry to protect Californians from what he says are potential safety risks, has been hotly contested by the tech industry, while other groups say they fear that, without such regulation, state and national security could be jeopardized.
The author said the bill is needed to safeguard California from malicious uses of AI technology.
“Large-scale artificial intelligence has the potential to produce an incredible range of benefits for Californians and our economy—from advances in medicine and climate science to improved wildfire forecasting and clean power development,” Wiener said in legislative analyses. “It also gives us an opportunity to apply hard lessons learned over the last decade, as we’ve seen the consequences of allowing the unchecked growth of new technology without evaluating, understanding, or mitigating the risks.”
He argues SB 1047 would develop “responsible, appropriate guardrails” regarding the development of the largest AI models in the industry “to ensure they are used to improve Californians’ lives without compromising safety or security.”
As written, the bill applies only to AI models that cost at least $100 million to develop and require high levels of computing power.
Consultants for the Assembly’s Judiciary Committee noted in analyses published in July that AI’s rapid advancement presents “unprecedented opportunities and significant challenges.”
Uncertainty abounds, according to consultants, as highly trained AI models can behave unpredictably and can attract malicious actors intent on using the technology for malevolent purposes.
“This unpredictability, coupled with the high stakes involved, underscores the necessity for stringent oversight and safety protocols,” committee staff wrote in the analyses.
Concerns around cyberattacks, espionage, and misinformation campaigns are just a few of the examples provided by proponents of the bill who suggest criminals and potentially terrorists could use the technology to coordinate attacks.
“The accessibility of powerful AI tools to malicious entities can amplify their capabilities, making it easier for them to carry out sophisticated attacks with potentially devastating consequences,” committee staff wrote. “This highlights the urgent need for comprehensive oversight and regulatory measures, such as those proposed in this bill, to mitigate these risks and ensure that AI technologies are developed and deployed responsibly.”
The bill passed the Assembly’s Appropriations Committee after being placed on the suspense file for its fiscal impact—which committee staff estimated at up to $10 million annually. If the bill ultimately passes, costs could exceed that estimate, depending on the number of staff required to oversee the new regulations and on costs to the state’s judicial system, which would vary with any litigation that arises.
With two weeks left in the legislative session, stakeholders on both sides of the issue are voicing their opinions.
Members of the health care and life science industries are requesting exemptions from requirements for an emergency shutdown switch in AI systems. The groups say health care-related models are essential for medical operations and that sudden shutdowns could negatively affect patient care.
Additionally, some academic and research institutions are recommending exemptions for study-based groups, citing concerns that excessive regulation could stifle innovation and limit the open exchange of information and collaboration.
Some opponents, including software developers and the Abundance Institute—a nonprofit advocating for emerging technologies—suggest the bill should rely on voluntary compliance for low-risk AI models to allow “responsible” development.
Additionally, a number of advocacy groups said in legislative analyses that California should recognize both the positive and negative potential of AI and lead the nation with regulations for the most powerful models.
“California must act now to ensure that it remains at the forefront of dynamic innovation in AI development,” the groups said in the analyses. “California must ensure that the small handful of companies developing extremely powerful AI models—including companies explicitly aiming to develop ‘artificial general intelligence’— take reasonable care to prevent their models from causing very serious harms as they continue to produce models of greater and greater power.”
Another group said in support that CalCompute, a publicly owned computing system the bill would create to “foster research and development for the betterment of society,” is a positive first step that will increase transparency and encourage competition.
One Latino-focused organization said the legislation would promote fairness and expand opportunity.
“California has the opportunity and responsibility to ensure that the economic prosperity stemming from the tech boom benefits everyone in the state,” the Latino Community Foundation said in legislative analyses—noting that only a handful of corporations dominate the industry. “Without equitable access to AI tools and resources, Latino-led nonprofit organizations, along with Latino small businesses and entrepreneurs, face the threat of being left behind in accessing vital public and private resources to expand their operations and effectively serve communities and engage consumers.”
However, some technology companies staunchly oppose the bill because of what they perceive as threats to the industry due to excessive regulation.
“The bill could inadvertently threaten the vibrancy of California’s technology economy and undermine competition,” Y Combinator, a technology startup accelerator and venture capital firm, said in legislative analyses.
Writing on behalf of dozens of AI businesses in the state, the company said it is most concerned that the bill places liability and regulatory burdens on developers rather than on those who use the technology for ill purposes.
The firm also questioned the computing power thresholds presented in the bill, saying that because the industry is so new, the metrics could fail to “capture the capabilities or risks” of future models.
Opponents also challenge what they say is the “extremely vague” language of the bill, suggesting the interpretation of certain verbiage could prove problematic.
Dozens of business groups—including the California Chamber of Commerce, California Manufacturers and Technology Association, and Civil Justice Association of California, among others—argue that the bill is “overbroad” and “impractical, if not infeasible” and could prove detrimental to California’s economy.
“The bill requires developers to comply with incredibly vague, broad, impractical, if not impossible, standards when developing ‘covered models’ and determining whether they can provide reasonable assurance that a covered model does not have a hazardous capability or come close to one, creating significant regulatory uncertainty,” the groups said in legislative analyses.