Google’s AI Model Under European Investigation



  • Written by Kiara Fabbri, Former Tech News Writer
  • Fact-Checked by Justyn Newman, Former Lead Cybersecurity Editor

The Data Protection Commission (DPC) announced today an investigation into Google to determine whether the company complied with EU data protection law when developing its AI model, Pathways Language Model 2 (PaLM 2).

PaLM 2 is a large language model used in various AI services, including email summarization. Google has stated it will cooperate with the inquiry, as noted by the AP.

The investigation will assess whether Google should have performed a Data Protection Impact Assessment (DPIA) to evaluate the potential risks to individuals’ rights and freedoms from its AI technologies.

This investigation is part of the DPC’s broader efforts to ensure compliance with data protection rules in the AI sector across Europe. Cross-border processing, which involves handling data across multiple EU countries or affecting individuals in several nations, is under particular scrutiny.

If violations are found, the DPC can impose fines of up to 4% of the global annual revenue of Google’s parent company, Alphabet, as reported by TechCrunch.

Google has developed a range of generative AI tools, including its Gemini series of large language models (formerly Bard), used for various applications such as enhancing web search through AI chatbots, notes TechCrunch.

Central to these tools is PaLM 2, a foundational LLM that Google launched at last year’s I/O developer conference, according to TechCrunch.

Last month, Elon Musk’s X also faced scrutiny from European regulators over its use of user data for AI training. The DPC launched an investigation after receiving complaints that X was feeding user data into its Grok AI technology without obtaining proper consent. While X has agreed to limit its data processing, it has not faced any sanctions.

The inquiry also reflects the DPC’s wider push to regulate the use of personal data in AI development across the European Union. The EU’s recent adoption of the Artificial Intelligence Act marks a significant step towards establishing a regulatory framework for AI technologies within the bloc.

European Report Urges Stricter Oversight Of In-Game Purchases



  • Written by Kiara Fabbri, Former Tech News Writer
  • Fact-Checked by Justyn Newman, Former Lead Cybersecurity Editor

A new report raises concerns about the increasing use of premium currencies in video games and mobile apps, warning of potential risks, especially for children, and calling for stricter regulation.

Today, the European Consumer Organisation (BEUC) filed a complaint on behalf of consumer groups from 17 countries, accusing several popular video games of “unfair practices” and violating consumer protection laws in relation to in-app and in-game purchases.

The complaint targets widely played games such as Epic Games’ Fortnite, Supercell’s Clash of Clans, Microsoft’s Minecraft, and EA Sports FC 24, as noted by TechCrunch.

The report examines the growing trend of in-game and in-app premium currencies used in video games and mobile applications. These virtual currencies allow players to purchase additional content, features, or advantages within the game or app.

While these currencies offer convenience for players and revenue for developers, the report raises concerns about the consumer risks associated with them.

The report criticizes the lack of transparency in pricing models for premium currencies, which often obscure their real-world cost. This can lead to overspending, especially among younger players.

Additionally, the report expresses concerns about the use of manipulative techniques to encourage in-app purchases.

Game developers may employ tactics like limited-time offers, social pressure within the game environment, and loot boxes to entice players to spend real money. These practices can exploit impulsive behavior and lead to excessive spending.

The report also raises concerns about age targeting and the potential for gambling-like behavior associated with in-app purchases. Games featuring these purchases are often aimed at younger audiences, who may be less aware of the financial implications of their actions.

The use of loot boxes with randomized rewards is likened to gambling, increasing the risk of compulsive spending and addiction.

Finally, the report questions whether in-app purchases create an unfair advantage for those who spend real money. This could potentially distort the gameplay experience and create a “pay-to-win” environment where success is determined by financial investment rather than skill.

BEUC recommends regulating in-game premium currencies by enforcing clear pricing models that translate virtual currency into real-world costs. It also calls for banning manipulative tactics like loot boxes and limited-time offers, along with stronger parental controls and educational campaigns to raise awareness of the risks for children.

Adding to these concerns, a recent report from Kaspersky experts details a significant increase in online threats targeting young gamers. Kaspersky’s findings reveal that cybercriminals are exploiting these games by offering fraudulent free in-game items or using phishing tactics to steal personal information.

This is particularly troubling given BEUC’s concerns about manipulative practices and potential gambling-like behavior associated with in-app purchases.