OpenAI, Adobe, Microsoft, and Elon Musk Support Different California AI Bills

  • Written by Andrea Miliani, Former Tech News Expert

Microsoft, OpenAI, Adobe, and businessman Elon Musk have recently backed new California bills regulating artificial intelligence, but not all the same ones.

According to TechCrunch, OpenAI, Microsoft, and Adobe have shared letters supporting bill AB 3211, which would require businesses to label AI-generated content. Elon Musk, on the other hand, has backed bill SB 1047, which focuses on AI safety and would require businesses to document their models and put safeguards in place.

Both bills will receive a final vote by the end of the month and would affect most AI companies. However, SB 1047, recently amended to address certain companies’ demands, has been the more controversial of the two.

Elon Musk shared a public post on X supporting the safety measures in SB 1047. “This is a tough call and will make some people upset, but, all things considered, I think California should probably pass the SB 1047 AI safety bill,” Musk wrote. “For over 20 years, I have been an advocate for AI regulation, just as we regulate any product/technology that is a potential risk to the public.”

Musk’s statements surprised many in the industry, as his companies could be subject to the new requirements and measures of SB 1047. Other tech giants like OpenAI, Google, and Meta have opposed this bill.

OpenAI and Microsoft may be backing AB 3211, which was also amended to align more closely with these companies’ requirements, in an effort to encourage lawmakers to approve it instead of SB 1047.

AB 3211 requires watermarks in the metadata of AI-generated images, audio clips, and videos (a practice many companies already follow, since metadata watermarks are invisible to users), and it also requires online platforms to display a visible label so users can recognize AI-generated content.

SB 1047, on the other hand, would impose sanctions on companies whose technologies cause major harm to society. Last week, OpenAI said the bill “makes no sense” and has stopped expanding its office in San Francisco.

Photo by Mohamed Hassan on pxhere

Oklahoma City Police Deploy AI to Draft Incident Reports, Raising Bias Concerns

  • Written by Kiara Fabbri, Former Tech News Writer

Oklahoma City police are now using an AI tool to write incident reports. The technology, developed by Axon, uses AI similar to ChatGPT to draft reports from body camera audio in just eight seconds.

While officers praise the time-saving technology, legal scholars and community activists raise concerns about potential bias and the accuracy of AI-generated reports.

The software, Draft One, converts body camera audio into written incident reports, aiming to improve report drafting efficiency.

The Associated Press reports that Oklahoma City Sergeant Matt Gilmore, who tested it, said the AI-written report was “better” than anything he could have written himself, was 100% accurate, and even included details he didn’t recall.

Axon founder and CEO Rick Smith told the AP that Draft One had the “most positive reaction” of any product the company has introduced.

“However, there are concerns,” Smith noted. He explained that district attorneys want to ensure that police officers, not just an AI chatbot, are responsible for writing their reports since they may need to testify in court about their observations.

The Independent reports that Oklahoma City community activist Aurelius Francisco has expressed deep concerns about the use of AI technology in police reporting, particularly due to potential racial biases.

Past incidents, such as Robert Williams’ wrongful arrest due to flawed facial recognition, exemplify the dangers of overreliance on AI in police investigative work, particularly regarding racial bias and inaccuracies.

“Given all the sensitivities around policing, around race and other identities of people involved, that’s an area where I think we’re going to have to do some real work before we would introduce it,” Smith told AP.

There’s a consensus that while AI can assist in report drafting, ultimate responsibility and accountability should remain with human officers, especially when dealing with serious crimes that may require court testimony.