Creators Demand Tech Giants To Pay For AI Training Data

Image by Cristofer Maximilian, from Unsplash

  • Written by Kiara Fabbri, Former Tech News Writer
  • Fact-Checked by Justyn Newman, Former Lead Cybersecurity Editor

Governments are allowing AI developers to steal content – both creative and journalistic – for fear of upsetting the tech sector and damaging investment, a UK Parliamentary committee heard this week, as first reported by The Register.

In a Rush? Here are the Quick Facts!

  • UK MPs heard concerns over AI exploiting copyrighted content without compensation.
  • Composer Max Richter warned AI threatens musicians’ livelihoods and originality.
  • Publishers found 1,000 bots scraping data from 3,000 news websites for AI models.

A joint committee of MPs heard from publishers and a composer angered by the tech industry’s unchecked exploitation of copyrighted material, despite a tech industry figure insisting that the “original sin” of text and data mining had already occurred and that content creators and legislators should move on.

The Culture, Media and Sport Committee and Science, Innovation and Technology Committee asked composer Max Richter how he would know if “bad-faith actors” were using his material to train AI models.

“There’s really nothing I can do,” he told MPs. “There are a couple of music AI models, and it’s perfectly easy to make them generate a piece of music that sounds uncannily like me,” he said, as reported by The Register.

“That wouldn’t be possible unless it had hoovered up my stuff without asking me and without paying for it. That’s happening on a huge scale. It’s obviously happened to basically every artist whose work is on the internet,” Richter added.

Richter, whose work has been used in major film and television scores, warned that automated material would edge out human creators, impoverishing musicians. “You’re going to get a vanilla-ization of music culture,” he said, as reported by The Register.

“If we allow the erosion of copyright, which is really how value is created in the music sector, then we’re going to be in a position where there won’t be artists in the future,” he added.

Former Google staffer James Smith echoed this sentiment, saying, “The original sin, if you like, has happened.” He suggested governments should focus on supporting licensing as an alternative monetization model, reported The Register.

Matt Rogerson, director of global public policy at the Financial Times, disagreed, emphasizing that AI companies were actively scraping content without permission. “We can only deal with what we see in front of us,” he said, as reported by The Register.

A study found that 1,000 unique bots were scraping data from 3,000 publisher websites, likely for AI model training, according to The Register.

Sajeeda Merali, chief executive of the Professional Publishers Association, criticized the AI sector’s argument that transparency over data scraping was commercially sensitive. “Its real concern is that publishers would then ask for a fair value in exchange for that data,” she said, as reported by The Register.

The controversy over AI training data escalated in October when over 13,500 artists signed a petition to stop AI companies from scraping creative works without consent. Organized by composer and former AI executive Ed Newton-Rex, the petition was signed by public figures like Julianne Moore, Thom Yorke, and Kazuo Ishiguro.

“There are three key resources that generative AI companies need to build AI models: people, compute, and data. They spend vast sums on the first two – sometimes a million dollars per engineer, and up to a billion dollars per model. But they expect to take the third – training data – for free,” Newton-Rex said.

Tensions heightened further when a group of artists leaked access to OpenAI’s text-to-video tool, Sora, in protest. Calling themselves “Sora PR Puppets,” they provided free access to Sora’s API via Hugging Face, allowing users to generate video clips for three hours before OpenAI shut it down.

The protesters claimed OpenAI treated artists as “PR puppets,” exploiting unpaid labor for a $157 billion company. They released an open letter demanding fair compensation and invited artists to develop their own AI models.

With artists and publishers pushing back against AI’s unchecked use of their content, the debate over ethical AI training practices continues to intensify. The UK government faces mounting pressure to implement policies that protect creative industries without stifling technological advancement.

U.K. Demands Apple Build Backdoor To Access Users’ Encrypted Data Worldwide

Image by Shahadat Rahman, from Unsplash

  • Written by Kiara Fabbri, Former Tech News Writer
  • Fact-Checked by Justyn Newman, Former Lead Cybersecurity Editor

The British government has secretly ordered Apple to provide unrestricted access to its encrypted cloud backups, a move that could compromise user privacy worldwide.

In a Rush? Here are the Quick Facts!

  • UK demands Apple create a backdoor for encrypted cloud data globally.
  • Apple may withdraw encrypted storage in the UK rather than comply.
  • The order comes under the UK’s Investigatory Powers Act of 2016.

According to sources familiar with the matter, the demand—issued under the UK’s Investigatory Powers Act—requires Apple to create a backdoor for law enforcement, not just for targeted accounts but for all users globally, as first reported by The Washington Post.

Apple, known for its strong stance on privacy, is likely to withdraw encrypted storage services in the UK rather than comply, sources said.

However, this decision would not address the UK’s demand for access in other regions, including the United States. The company has not publicly commented on the matter, said The Post.

The order, known as a Technical Capability Notice, forces companies to provide access to encrypted data upon request. Under UK law, it is a criminal offense to disclose such demands. Apple does have an option to appeal the order, but the process does not allow delays in compliance, as reported by The Post.

In anticipation of this move, Apple warned UK lawmakers last March that forcing backdoor access would set a dangerous precedent.

“There is no reason why the U.K. [government] should have the authority to decide for citizens of the world whether they can avail themselves of the proven security benefits that flow from end-to-end encryption,” the company stated, as reported by The Post.

Privacy advocates and cybersecurity experts have condemned the UK’s actions. Sen. Ron Wyden (D-Oregon) warned that allowing Britain to force Apple’s compliance “would be unconscionable and an unmitigated disaster for Americans’ privacy and our national security,” as reported by The Post.

Law enforcement agencies, including the FBI and UK authorities, argue that encryption enables criminals and terrorists to evade detection, said The Post.

Tech companies, however, have long resisted such demands, citing concerns that security weaknesses introduced for governments could also be exploited by cybercriminals and authoritarian regimes, as reported by The Post.

Apple’s Advanced Data Protection, launched in 2022, provides users with end-to-end encryption for cloud storage, meaning even Apple cannot access the data. While most users do not enable this feature, it prevents governments from secretly accessing stored messages, photos, and other sensitive data.
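The core idea behind end-to-end encryption is simple: data is encrypted on the user’s device before it is uploaded, so the server only ever stores opaque ciphertext it cannot read. The toy Python sketch below illustrates that principle only; it uses a simplistic hash-based keystream for demonstration and is in no way Apple’s actual protocol or production-grade cryptography.

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from key + nonce (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt on the client: XOR plaintext with a fresh keystream."""
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct  # nonce travels with the ciphertext; the key never does

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Decrypt on the client: regenerate the same keystream and XOR it back."""
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# The key exists only on the user's device; the "server" stores just the blob.
device_key = os.urandom(32)
backup = encrypt(device_key, b"private photo metadata")
# Server-side view: without device_key, `backup` is unreadable ciphertext.
assert decrypt(device_key, backup) == b"private photo metadata"
```

This is why a mandated backdoor is structurally different from handing over data: with end-to-end encryption the provider holds only ciphertext, and granting access requires weakening the scheme itself for everyone.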

If the UK succeeds in forcing Apple to comply, it may embolden other nations, including China, to demand similar access. This could lead Apple to withdraw encrypted cloud services globally rather than risk compromising user privacy.