
Image by Justin Dickey on Unsplash
Surya AI Offers Early Warnings Of Solar Storms
- Written by Kiara Fabbri, Former Tech News Writer
- Fact-Checked by Sarah Frazier, Former Content Manager
NASA collaborated with IBM to produce Surya, a new open-source AI system for predicting solar activity.
In a rush? Here are the quick facts:
- Surya is trained on 250 TB of NASA solar observatory images.
- The model predicts solar flares up to two hours in advance.
- Solar storms can damage satellites, power grids, and radio communications.
MIT Technology Review explains that Surya is trained on more than a decade of NASA solar data and aims to give scientists early warnings of solar flares that may disrupt life on Earth.
Solar storms occur when the sun releases bursts of energy and particles into space, producing solar flares and coronal mass ejections.
Solar storms can cause major problems: they disrupt radio signals, damage satellites, expose astronauts to radiation, and potentially trigger power grid failures on Earth.
Predicting when a flare will strike has always been a challenge. "When it erupts is always the sticking point," explains Louise Harra, an astrophysicist at ETH Zurich, as reported by MIT Technology Review.
Harra says scientists can often tell from images whether a flare is likely soon, but predicting its timing and strength is much harder. A flare's magnitude determines the extent of its impact: small flares may disrupt radios every few weeks, while a massive solar superstorm could destroy satellites and shut down electricity worldwide.
Surya was trained on over 250 terabytes of images from NASA’s Solar Dynamics Observatory. In early tests, it predicted some solar flares up to two hours in advance. “It can predict the solar flare’s shape, the position in the sun, the intensity,” says Juan Bernabe-Moreno, the IBM AI researcher who led the project. That’s about twice the warning time current methods provide.
Harra notes, “It’s just those tiny destabilizations that we know happen, but we don’t know when.” The hope is that Surya can spot these patterns faster than humans can.
Bernabe-Moreno adds that Surya could also help uncover links between solar weather and Earth weather. “Understanding the sun is a proxy for understanding many other stars,” he says. “We look at the sun as a laboratory.”

Photo by Collabstr on Unsplash
Meta Rolls Out English and Spanish AI-Powered Translations for Reels
- Written by Andrea Miliani, Former Tech News Expert
- Fact-Checked by Sarah Frazier, Former Content Manager
Meta announced on Tuesday that it is rolling out Meta AI translations to more users, helping creators reach a wider audience in English and Spanish. The AI-powered tool allows users with more than 1,000 followers to “speak” in another language by dubbing their content.
In a rush? Here are the quick facts:
- Meta is rolling out Meta AI translations for Instagram Reels globally.
- Users with public accounts and more than 1,000 followers will be able to share reels in both English and Spanish.
- The company expects to expand the tool to more languages soon.
According to Meta’s announcement, the AI tool analyzes a user’s voice and tone, replicates them in a different language, and syncs the dubbed audio with the speaker’s mouth movements to make it appear more natural. The company plans to expand the feature to additional languages in the future.
“Today we’re expanding access to Meta AI translations, a free tool that, once enabled, automatically dubs and lip syncs your reels into another language,” wrote Meta. “With Meta AI translations, you can speak to viewers in their own language, opening up your content to new audiences that may not have found it accessible before.”
Meta first announced it was testing the AI translation tool in 2024 and is now deploying it globally. The company’s Head of Instagram, Adam Mosseri, shared a video on Threads explaining how the new feature works in both languages.
“As of this week, all public accounts on Instagram are gonna have access to our AI translation feature,” said Mosseri in the video displayed in both languages. “For now, it translates from English to Spanish and Spanish to English, but over time we’re gonna add more and more languages.”
To use the tool, users must have a public account with at least 1,000 followers. Before posting a reel, they need to turn on the option “Translate your voice with Meta AI” and then select “Share now” to publish the video in both languages.
Creators can also review the translation through the Professional Dashboard before publishing. Once shared, the video will display in each viewer’s preferred language, though audiences can choose to watch either the translated or original version.
Meta also provided recommendations for best results: favor face-to-camera videos, limit content to no more than two speakers, and ensure high-quality video and audio.
A few months ago, Meta also announced a language program in collaboration with UNESCO, the Language Technology Partner Program, to develop AI translation models.