
Image generated with ChatGPT
OPINION: Massive AI Data Centers Everywhere—The Compute Power Race and Its Impact
- Written by Andrea Miliani, Former Tech News Expert
- Fact-Checked by Sarah Frazier, Former Content Manager
This year, tech giants like Nvidia, OpenAI, Microsoft, Google, and Alibaba have rolled out multibillion-dollar plans to build the infrastructure needed for today’s and tomorrow’s AI tech. But why is this infrastructure necessary, and how are we managing its consequences?
We are entering an era defined by the construction of massive AI data centers—everywhere. This year alone, several multibillion-dollar deals have been announced to build the infrastructure that will power the advanced AI technology being developed by the world’s tech giants.
All the major players—from Nvidia to Alibaba, OpenAI to Google, and Microsoft—are moving quickly: signing new agreements, making enormous investments, and securing land in unlikely locations to begin construction as soon as possible.
Progress at our datacenter in Abilene. Fun to visit yesterday! pic.twitter.com/W22ssjWstW — Sam Altman (@sama) September 24, 2025
In January, Digital Edge raised $1.6 billion to expand its data centers in Singapore to meet growing AI demands across Asia. In the United States, OpenAI, SoftBank, Oracle, and the White House announced a $500 billion joint venture: the Stargate Project.
Only days later, France and the United Arab Emirates announced a new partnership, worth €30 billion to €50 billion, to build a 1-gigawatt AI-dedicated data center in France, while Alibaba announced a $52 billion investment in AI infrastructure.
France’s brief advantage in AI infrastructure lasted only a few months. In July, Aker and Nscale announced a partnership with OpenAI to build Europe’s largest AI data center in Norway. The project, Stargate Norway, stood out for being powered entirely by renewable energy, a crucial point given that energy consumption is one of the industry’s most pressing concerns.
The Stargate “franchise” continues to expand with bold new moves. Just weeks ago, it was reported that OpenAI is planning to build a 1-gigawatt data center in India. Shortly after, the Stargate UK program was announced, aiming to accelerate AI development in the region, with Nvidia pledging up to $15 billion in chips and Microsoft committing $30 billion for supercomputer development.
And, as if these deals weren’t enough, Nvidia announced last week that it will invest up to $100 billion in OpenAI to build additional data centers.
Much like a game of Catan, the locations and strategies are now visible on the board. The scramble for resources to build AI empires has begun.
But… What Is an AI Data Center?
AI data centers are large-scale facilities built to house the IT infrastructure needed to support advanced technologies. As IBM explains, beyond the traditional servers, networking equipment, and storage units, AI data centers require specialized systems capable of handling far more demanding workloads.
“Typical data centers contain infrastructure that would quickly be overwhelmed by AI workloads,” states IBM on its website. “AI-ready infrastructure is specially designed for the cloud, AI, and machine learning tasks.”
One of the biggest differences lies in the hardware. Instead of relying primarily on central processing units (CPUs), AI data centers are built around graphics processing units (GPUs), the chips behind Nvidia’s soaring valuation in recent years.
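The difference comes down to parallelism: training and running AI models boils down to enormous matrix multiplications, which map naturally onto a GPU’s thousands of cores rather than a CPU’s handful. Here is a minimal sketch of the idea in Python, using PyTorch purely for illustration (the article names no specific framework):

```python
import torch

# AI workloads are dominated by large matrix multiplications, which a GPU
# can spread across thousands of cores; a CPU has far fewer to offer.
a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

# Run on a GPU if one is present, otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"
product = torch.matmul(a.to(device), b.to(device))
print(f"Computed a {tuple(product.shape)} matrix multiply on {device}")
```

On data-center GPUs, operations like this run orders of magnitude faster than on general-purpose CPUs, which is why AI facilities are built around them.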
And the build-out is only expected to accelerate. According to McKinsey, demand for AI data centers in the United States will triple by 2030, requiring roughly $7 trillion in new investment. But are these predictions realistic? Do we truly need all the data centers now being planned?
The Side Effects
While these powerful facilities can fuel the AI magic we’ve all come to admire, they also demand enormous amounts of energy. That translates into more pollution, higher energy costs—including for local communities—and deeper concerns about the future of the environment.
AI data centers require vast amounts of electricity, complex cooling systems, and cutting-edge hardware running around the clock under optimal conditions. Renewable energy, however, is not yet trusted to deliver that kind of uninterrupted supply.
The U.S. government has already taken steps to permit more fossil fuel use to power these centers. And while projects like Stargate Norway have prioritized renewables, building truly “green” ecosystems at this scale is far from simple.
“To continuously produce just a single gigawatt, a renewable-energy plant would need around 12.5 million solar panels — enough to cover nearly 5,000 football fields,” states a recent report from The New York Times. In their latest partnership, Nvidia and OpenAI announced they expect “at least 10 gigawatts of AI data centers.”
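That 12.5-million figure is easy to sanity-check with back-of-envelope arithmetic. A quick Python sketch, assuming a typical 400-watt panel and a roughly 20 percent capacity factor (both our assumptions, not figures from the Times):

```python
# Back-of-envelope: how many solar panels to supply 1 GW continuously?
# Assumptions (ours, not the NYT report's): 400 W nameplate per panel,
# ~20% capacity factor for utility-scale solar.
target_watts = 1e9            # 1 gigawatt of continuous output
panel_nameplate_watts = 400   # rated output of one modern panel
capacity_factor = 0.20        # average output as a share of nameplate

avg_watts_per_panel = panel_nameplate_watts * capacity_factor  # ~80 W
panels_needed = target_watts / avg_watts_per_panel

print(f"{panels_needed:,.0f} panels")  # ~12,500,000, matching the estimate
```

Scale that by ten for the Nvidia-OpenAI plan, and the physical footprint of “at least 10 gigawatts” becomes clear.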
The Neighbors Are Already Complaining
These massive facilities also consume significant amounts of water for their cooling systems. While Sam Altman, OpenAI’s CEO, insists that a single ChatGPT query consumes only “roughly one fifteenth of a teaspoon” of water, many people living next to these data centers complain not only about rising energy bills but also about water scarcity.
“Large data centers can consume up to 5 million gallons per day, equivalent to the water use of a town populated by 10,000 to 50,000 people,” states a report on water consumption shared by the Environmental and Energy Study Institute in June. And people are already worried.
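The report’s equivalence is simple arithmetic. A quick Python sketch, assuming per-capita municipal water use of roughly 100 to 500 gallons per day (an assumed range chosen to bracket the report’s town sizes):

```python
# Back-of-envelope: a data center using 5 million gallons/day matches
# the water use of a town of how many people?
# Assumption: per-capita municipal use of 100-500 gallons/day.
data_center_gallons_per_day = 5_000_000

for gallons_per_person in (500, 100):
    people = data_center_gallons_per_day / gallons_per_person
    print(f"At {gallons_per_person} gal/person/day: a town of {people:,.0f}")
# -> 10,000 people at 500 gal/day; 50,000 people at 100 gal/day
```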
In Georgia, residents have already complained about groundwater conditions. A woman living near Meta’s data center reported that her water had become hazy and said she fears it could run out within a few years. Others living near similar facilities say their electricity bills have risen significantly.
A Game Of Titans
The most important pieces are already on the board. The largest AI data centers today are concentrated in the United States, China, and Europe, with more regions joining in. Competition over who will secure the energy to power these new technologies is running high.
The consequences of building these data centers remain to be seen, and many factors are at play, from environmental and human costs (they are already reshaping the lives of people who live nearby) to financial ones. Some experts also question whether newer AI models even need that much energy, as DeepSeek’s developers have shown.
The stakes are very high, and the ambition to win is even greater. While some benefit from the jobs these data centers are generating (AI is not yet performing the manual labor needed for construction), others worry about basic needs such as drinking water. Ambition seems to be the most powerful piece on the board.

Image by Wayne Sutton, from Flickr
Hackers Claim Massive Red Hat Breach
- Written by Kiara Fabbri, Former Tech News Writer
- Fact-Checked by Sarah Frazier, Former Content Manager
Red Hat, an open-source software company, has confirmed a security breach after the hacking group Crimson Collective announced it had obtained 570GB of compressed data from the company’s private GitLab repositories.
In a rush? Here are the quick facts:
- The Crimson Collective says it breached 28,000 internal projects.
- Data allegedly includes 800 Customer Engagement Reports (CERs).
- CERs contain sensitive infrastructure, tokens, and client system details.
The group claims to have obtained data from 28,000 internal projects, along with hundreds of Customer Engagement Reports (CERs) containing sensitive client information, including network maps, authentication tokens, and configuration details.
“Red Hat is aware of reports regarding a security incident related to our consulting business and we have initiated necessary remediation steps,” the company told BleepingComputer.
Stephanie Wonderlick, Red Hat’s VP of communications, echoed this to 404 Media, adding: “The security and integrity of our systems and the data entrusted to us are our highest priority. At this time, we have no reason to believe the security issue impacts any of our other Red Hat services or products and are highly confident in the integrity of our software supply chain.”
The Crimson Collective, however, claims to have accessed authentication tokens and database connection strings, using them to “gain access to some of their client’s infrastructure as well,” as reported by The Register.
The group also published file listings on Telegram and claimed to hold CERs covering 2020 through 2025, allegedly involving major institutions including the U.S. Navy’s Naval Surface Warfare Center, the Federal Aviation Administration, Bank of America, Walmart, AT&T, T-Mobile, and the U.S. House of Representatives.
The hackers say they tried to contact Red Hat with an extortion demand but received only a generic response instructing them to submit a vulnerability report. “We have given them too much time already to answer lol instead of just starting a discussion they kept ignoring the emails,” they wrote on Telegram, as noted by 404 Media.
Red Hat has not confirmed the hackers’ claims about stolen data or customer exposure, and The Register reports that the full extent of the breach therefore remains unknown.