
Image by Chris Montgomery, from Unsplash
Zoom Announces AI Companion 3.0 With Avatars and Voice Translation
- Written by Kiara Fabbri, Former Tech News Writer
- Fact-Checked by Sarah Frazier, Former Content Manager
Zoom has announced AI Companion 3.0 as the new AI assistant for Zoom Workplace.
In a rush? Here are the quick facts:
- The update introduces photorealistic avatars that mimic user movements.
- Live camera authentication will prevent avatar misuse.
- Real-time voice translation expands to 18 languages.
Zoom says the updated AI Companion brings enhanced “agentic” capabilities that let users convert discussions into work assignments and schedule meetings automatically.
It can also take notes automatically across Zoom meetings, in-person meetings, and third-party platforms such as Microsoft Teams and Google Meet.
“Around the world, millions of people are using Zoom to connect with their colleagues and customers, but they could be getting so much more out of those conversations with the help of AI,” said Smita Hashim, chief product officer at Zoom.
“AI Companion 3.0 will provide deeper insights from their conversations to help them get more done at work and drive better business outcomes, regardless of whether they’re working in Zoom Workplace, in person, or across compatible apps and integrations,” Hashim added.
The upcoming December 2025 release will introduce photorealistic AI avatars as an eye-catching meeting feature. The system creates a digital avatar from a user photo, which users can then personalize with business attire, and the avatar tracks the user’s movements in real time.
The system includes two security features, “live camera authentication” and visible “in-meeting tile notices,” aimed at preventing unauthorized use.
Hashim explained: “This feature remains in development, and specific enrollment and authentication processes may change before general availability,” as reported by The Verge.
Zoom will also launch real-time voice translation this December, allowing participants to hear speakers in their chosen language, with support for languages including English, German, Chinese, French, Spanish, Arabic, Japanese, Portuguese, and Italian.
Finally, other updates include a custom AI agent builder, group assistant features, proactive meeting recommendations, and improved video quality.

Photo by Dima Solomin on Unsplash
Meta Launches Smart Glasses With AI Assistant And Neural Band
- Written by Andrea Miliani, Former Tech News Expert
- Fact-Checked by Sarah Frazier, Former Content Manager
Meta unveiled its latest and most advanced smart glasses, the Meta Ray-Ban Display, on Wednesday at its annual Connect event in California. The “AI glasses” come with an electromyography (EMG) wristband, called the Meta Neural Band, whose sensor allows users to control the device through gestures.
In a rush? Here are the quick facts:
- Meta launched its latest and most advanced smart glasses, the Meta Ray-Ban Display, for $799.
- The new device comes with a wristband, Meta Neural Band, which allows users to control the AI glasses with gestures.
- The company now offers three categories of AI glasses: Camera AI glasses, Display AI glasses, and augmented reality (AR) glasses.
According to Meta’s official announcement, the new Meta Ray-Ban Display integrates microphones, cameras, speakers, and a full-color, high-resolution display backed by AI technology. The smart glasses will be sold together with the Meta Neural Band, starting at $799 on September 30 in the United States.
“Meta Ray-Ban Display + Meta Neural Band = our most advanced pair of AI glasses. Ever.” pic.twitter.com/PlrVcwbprN — Meta (@Meta) September 18, 2025
Meta emphasized that the display is designed for easy removal and optimized for short interactions controlled through its intuitive wristband.
“Meta Neural Band is so effortless, it makes interacting with your glasses feel like magic,” the announcement states. “It replaces the touchscreens, buttons, and dials of today’s technology with a sensor on your wrist, so you can silently scroll, click, and, in the near future, even write out messages using subtle finger movements.”
The Meta Neural Band’s battery is expected to last up to 18 hours, and the band itself is made of Vectran, a strong, flexible material.
The tech giant also introduced new AI-powered features for the device. The company explained that the glasses can display images and text, with context-aware adjustments based on the wearer’s surroundings. Interactions can be performed by swiping with the thumb or through voice commands.
The AI glasses will also support messaging, including WhatsApp messages and videos from social media, and can take video calls that show the wearer’s point of view. A new pedestrian navigation system provides a visual map on the display, letting users find their way without checking their smartphones.
Meta also clarified that its smart glasses now fall into three categories: Camera AI glasses, developed together with Ray-Ban and Oakley; AR glasses, introduced last year with the Orion prototype; and the new Display AI glasses category, introduced with the release of the Meta Ray-Ban Display.