Image by Jonathan Ardila, from Unsplash

Trump Plans Executive Orders To Boost AI Growth and Power Supply

  • Written by Kiara Fabbri, Former Tech News Writer
  • Fact-Checked by Sarah Frazier, Former Content Manager

Trump plans three major executive actions to expand American AI systems and compete with Chinese dominance, as electricity demand surges nationwide.

In a rush? Here are the quick facts:

  • Federal land may be offered for AI data center development.
  • AI-related power demand could grow thirtyfold by 2035.
  • July 23 may be declared “AI Action Day” by the White House.

Four unnamed sources revealed to Reuters that the Trump administration will use executive orders to accelerate the growth of American AI, while boosting energy capacity and challenging Chinese technological supremacy.

The planned actions include speeding up power grid connections for energy projects and offering federal land for AI data center construction. “Training large-scale AI models requires a huge amount of electricity,” one source said to Reuters, noting that energy demand is growing at a rate not seen in decades.

Reuters reports that the administration also intends to streamline environmental permitting and introduce a nationwide Clean Water Act permit that would simplify zoning rules and reduce construction delays. These steps are part of the administration’s initiative to remove barriers to AI development.

President Trump is expected to highlight these initiatives on July 15 at an AI and energy event in Pennsylvania, and the White House is set to release a comprehensive AI Action Plan on July 23. That date may be declared “AI Action Day” to publicly reinforce the administration’s commitment, as reported by Reuters.

These steps come amid growing concern over the energy needs of AI. Reuters reports that power demand from AI data centers could grow more than thirtyfold by 2035, with outdated infrastructure and delayed grid connections already posing serious challenges.
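For scale, a back-of-the-envelope calculation shows what a thirtyfold increase by 2035 implies as an annual growth rate. The sketch below assumes a 2024 baseline year, which is our assumption; the Reuters report does not state a start year.

```python
# Back-of-the-envelope: what a 30x increase by 2035 implies per year.
# Assumption: growth measured from a 2024 baseline (the report gives no start year).
growth_factor = 30
years = 2035 - 2024  # 11 years

annual_rate = growth_factor ** (1 / years) - 1
print(f"Implied compound annual growth: {annual_rate:.1%}")  # ~36.2% per year
```

In other words, the projection implies demand from AI data centers compounding at well over a third per year, every year, for a decade.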

AI models such as Anthropic’s Claude, DeepSeek’s R1, and OpenAI’s o3 contribute significantly to these environmental costs. Research indicates that reasoning-enabled models can produce up to 50 times more CO₂ emissions than basic systems when performing identical tasks.

The “chain-of-thought” method, which models use to mimic step-by-step human reasoning, substantially increases their power consumption.
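A minimal sketch of why chain-of-thought inflates consumption: inference energy scales roughly with the number of tokens generated, and reasoning models emit many intermediate “thinking” tokens before answering. The per-token energy figure and token counts below are illustrative assumptions for this sketch, not measurements from the research cited above.

```python
# Illustrative sketch: inference energy scales roughly with tokens generated.
# All numbers below are hypothetical assumptions, chosen only for illustration.
ENERGY_PER_TOKEN_WH = 0.002  # assumed watt-hours per generated token

def query_energy_wh(answer_tokens: int, reasoning_tokens: int = 0) -> float:
    """Rough energy estimate for one query, in watt-hours."""
    return (answer_tokens + reasoning_tokens) * ENERGY_PER_TOKEN_WH

direct = query_energy_wh(answer_tokens=40)  # concise, non-reasoning answer
chain_of_thought = query_energy_wh(answer_tokens=40, reasoning_tokens=1960)

print(f"Direct answer:    {direct:.2f} Wh")
print(f"Chain-of-thought: {chain_of_thought:.2f} Wh "
      f"({chain_of_thought / direct:.0f}x)")  # 50x the tokens -> ~50x the energy
```

Under this simple linear model, a fiftyfold increase in generated tokens maps directly onto a fiftyfold increase in energy, and hence in emissions for a fixed grid mix.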

This concern is far from abstract. Recent research reports that AI has begun to transform energy consumption patterns across the globe. Data centers in Virginia’s Culpeper County consume as much electricity as 10,000 to 20,000 residential homes.
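To translate the homes comparison into grid terms: assuming a typical US household uses about 10,700 kWh per year (an average we assume here, not a figure from the article), the county’s data centers would draw on the order of tens of megawatts continuously.

```python
# Convert "10,000 to 20,000 homes" into continuous grid load.
# Assumption: ~10,700 kWh/year per US household (typical average; not from the article).
KWH_PER_HOME_PER_YEAR = 10_700
HOURS_PER_YEAR = 8_760

avg_home_kw = KWH_PER_HOME_PER_YEAR / HOURS_PER_YEAR  # ~1.2 kW per home

for homes in (10_000, 20_000):
    load_mw = homes * avg_home_kw / 1_000
    print(f"{homes:,} homes ≈ {load_mw:.0f} MW of continuous load")
# -> roughly 12 to 24 MW of round-the-clock demand
```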

Data centers in Ireland now account for more than 20% of the country’s total electricity consumption. The International Energy Agency reported that data centers worldwide consumed approximately 340 terawatt-hours of electricity in 2022, about 1.3% of global electricity usage.
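The two IEA figures are mutually consistent, as a quick check shows; the global total below is derived from the article’s own numbers rather than taken from an additional source.

```python
# Consistency check on the IEA figures quoted above.
data_center_twh = 340      # data center consumption, 2022
share_of_global = 0.013    # 1.3% of global electricity use

implied_global_twh = data_center_twh / share_of_global
print(f"Implied global electricity use: {implied_global_twh:,.0f} TWh")
# ~26,000 TWh, in line with published estimates of 2022 global consumption
```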

As more energy-intensive models enter the market, AI’s growing carbon footprint will challenge governments to develop effective management strategies, since they currently lack adequate transparency and proactive policies.

Image by Inspa Makers, from Unsplash

Inside Online Support Groups For ‘AI Addiction’

  • Written by Kiara Fabbri, Former Tech News Writer
  • Fact-Checked by Sarah Frazier, Former Content Manager

People struggling with dependency on AI chatbots are turning to online recovery groups, as experts sound the alarm on emotional risks and apps engineered for compulsive use.

In a rush? Here are the quick facts:

  • Teens and adults report emotional dependence on chatbots like Character.AI and ChatGPT.
  • Reddit forums offer support for those trying to quit chatbot use.
  • Experts compare chatbot addiction to gambling and dopamine-triggering habits.

404 Media reports that Reddit forums such as r/Character_AI_Recovery, r/ChatbotAddiction, and r/AI_Addiction serve as informal support groups for people who say they have developed unhealthy emotional attachments to AI companions.

Users describe both dependency and psychological changes that go beyond basic addiction. Some report developing spiritual delusions, believing that chatbot responses contain divine guidance. More commonly, users come to believe their bot companion is in some way conscious.

Experts say the design of chatbot platforms actively encourages users to spend more time on them. A recent MIT study found that users develop compulsive, addictive behavior after engaging with these platforms.

One such user is Nathan, now 18, who began spending nights chatting with bots on Character.AI. “The more I chatted with the bot, it felt as if I was talking to an actual friend of mine,” he told 404 Media. He realized his obsession was interfering with his life and deleted the app. But like many, he relapsed before finding support online. “Most people will probably just look at you and say, ‘How could you get addicted to a literal chatbot?’” he said.

Aspen Deguzman, also 18, created r/Character_AI_Recovery after struggling to quit. “Using Character.AI is constantly on your mind,” they said. The forum offers a space to vent and connect anonymously. Posts range from “I keep relapsing” to “I am recovered.”

The issue isn’t limited to teens. David, a 40-year-old developer, compares chatbot use to gambling. “There were days I should’ve been working, and I would spend eight hours on AI,” he said. His personal life and job have suffered.

Part of the danger lies in how humans perceive AI. Philosopher Luciano Floridi calls this semantic pareidolia: the human tendency to find meaning and emotional content in things that actually lack both. As the technology becomes more realistic, users increasingly mistake simulated empathy from AI for genuine sentience.

Some chatbots demonstrate emotional intelligence beyond human capabilities, which strengthens the false impression of their sentience.

The growing number of recovery forums and the rising demand for help suggest that a major mental health issue related to generative AI may be emerging.