
Uber And Waymo Partner To Launch Robotaxis In Austin And Atlanta In 2025
- Written by Kiara Fabbri, Former Tech News Writer
- Fact-Checked by Justyn Newman, Former Lead Cybersecurity Editor
Uber announced today that, starting in early 2025, users in Austin and Atlanta will be able to request Waymo robotaxis through the Uber app, as part of an expanded partnership between the two companies.
Uber will manage fleet operations, such as dispatch, cleaning, repairs, and general maintenance. Meanwhile, Waymo will focus on autonomous technology, overseeing testing, roadside assistance, and essential support functions.
This initiative builds on their previous efforts in Phoenix, where Waymo’s autonomous vehicles have already provided rides to tens of thousands of passengers. In April, Uber Eats also began using Waymo’s self-driving cars for food deliveries within Waymo’s 225+ square mile service area in Phoenix.
Waymo co-CEO Tekedra Mawakana emphasized the company’s commitment to expanding its service, stating, “Waymo’s mission is to be the world’s most trusted driver, and we’re excited to launch this expanded network and operations partnership with Uber.”
While this partnership represents a significant step forward for autonomous vehicles, it also comes amid growing concerns about the technology’s safety and impact on jobs.
Axios Austin (AA) reports that six complaints have been filed against Waymo, according to a city site tracking autonomous vehicle incidents. These include one classified as a “safety concern,” four as “nuisance,” and one as a “near miss.”
Additionally, in May, investigators identified 22 incidents where Waymo’s self-driving cars were either “the sole vehicle involved in a collision” or exhibited driving behavior that might have violated traffic safety laws, as reported by AA.
AA also notes that a 2017 state law, supported by car companies, prevents Texas cities from regulating self-driving cars.
Self-driving vehicles have both impressed and unsettled people worldwide, raising concerns about safety and job impacts. As robotaxis become more common, it will be important to address these concerns and ensure that the technology is developed and deployed responsibly.

Google DeepMind’s New AI Systems Teach Robots To Tie Shoelaces And Hang Clothes
- Written by Andrea Miliani, Former Tech News Expert
- Fact-Checked by Justyn Newman, Former Lead Cybersecurity Editor
Google DeepMind’s robotics team published two papers on its research into robot dexterity, featuring its new AI systems, DemoStart and ALOHA Unleashed. With these new developments, researchers managed to make two robotic arms tie a shoelace, hang clothes, and repair another robot autonomously.
In the update published yesterday, the robotics team explains that performing simple tasks like tightening a screw or tying shoelaces can be extremely difficult for robots, as they require high dexterity and coordination between two arms.
Until recently, Google DeepMind’s team had been working with a single arm: it created a robot that can play ping pong at a human-competitive level with just one arm.
Now, researchers have developed AI systems that train two-armed devices to perform more complex tasks that humans carry out daily.
“To make robots more useful in people’s lives, they need to get better at making contact with physical objects in dynamic environments,” wrote the team.
The AI system ALOHA Unleashed, built on ALOHA, the open-source, low-cost system developed at Stanford University, taught two-armed robots to manipulate objects and coordinate both arms to tie a shoelace, hang a shirt, clean a kitchen, and insert a gear.
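ALOHA itself is a teleoperation rig for collecting human demonstrations, and ALOHA Unleashed learns its skills from such recordings. As a rough illustration of learning from demonstrations, here is a minimal behavior-cloning sketch in Python with PyTorch; the network, dimensions, and synthetic data are placeholder assumptions, not DeepMind’s actual (diffusion-based) architecture.

```python
# Minimal behavior-cloning sketch: learn to map observations to actions
# from recorded demonstrations. Generic illustration only, not ALOHA
# Unleashed's actual method; all dimensions and data are placeholders.
import torch
import torch.nn as nn

OBS_DIM, ACT_DIM = 32, 14  # e.g. robot state in, two-arm joint targets out

# A small feedforward policy network (a stand-in for a learned policy).
policy = nn.Sequential(
    nn.Linear(OBS_DIM, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, ACT_DIM),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

# Stand-in for a dataset of teleoperated demonstrations:
# observation/action pairs recorded while a human drives the robot.
demo_obs = torch.randn(1024, OBS_DIM)
demo_act = torch.randn(1024, ACT_DIM)

for epoch in range(10):
    pred = policy(demo_obs)
    loss = nn.functional.mse_loss(pred, demo_act)  # imitate the demos
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In practice, the policy would consume camera images rather than a low-dimensional state vector, and diffusion-style methods model the full distribution of demonstrated actions instead of regressing to their average.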
DemoStart, on the other hand, is a reinforcement learning algorithm that trains robots in simulation using the open-source physics engine MuJoCo. This AI system targets more complex tasks involving more robot parts, like fingers, sensors, and joints.
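To make “training in simulation” concrete, below is a minimal stepping loop using MuJoCo’s official Python bindings. The toy one-joint model, random actions, and placeholder reward are illustrative assumptions; DemoStart’s actual hand models, tasks, and demonstration-seeded rewards are far more involved.

```python
# Minimal MuJoCo simulation loop: an RL agent would observe the state,
# pick an action, step the physics, and receive a reward. The model,
# policy, and reward below are toy placeholders for illustration.
import mujoco
import numpy as np

# A toy model: one hinge joint driven by a motor (a stand-in for one of
# the many joints in a multi-fingered robotic hand).
XML = """
<mujoco>
  <worldbody>
    <body pos="0 0 0.1">
      <joint name="hinge" type="hinge" axis="0 1 0"/>
      <geom type="capsule" size="0.02" fromto="0 0 0 0 0 0.2"/>
    </body>
  </worldbody>
  <actuator>
    <motor joint="hinge" ctrlrange="-1 1"/>
  </actuator>
</mujoco>
"""

model = mujoco.MjModel.from_xml_string(XML)
data = mujoco.MjData(model)

rng = np.random.default_rng(0)
for step in range(1000):
    # A trained policy would choose actions here; we sample random
    # torques within the actuator's control range instead.
    data.ctrl[:] = rng.uniform(-1.0, 1.0, size=model.nu)
    mujoco.mj_step(model, data)  # advance the physics one timestep

    # Placeholder reward; DemoStart derives its learning signal from
    # task-success states seeded by a handful of demonstrations.
    reward = -abs(data.qpos[0])
```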
“The robot achieved a success rate of over 98% on a number of different tasks in simulation, including reorienting cubes with a certain color showing, tightening a nut and bolt, and tidying up tools,” explained the researchers. Transferred to a real robot, it achieved a 97% success rate on the lifting and cube-reorientation tasks and 64% on a more complex task requiring plug-socket insertion.
Introducing 2️⃣ new AI systems for robotics: 🤖 ALOHA Unleashed to perform two-armed manipulation tasks 🦾 DemoStart to control a multi-fingered robotic hand. They learned to tackle a range of actions requiring dexterity. Here’s how. 🧵 https://t.co/SV3TsXIhhh — Google DeepMind (@GoogleDeepMind) September 12, 2024
The company provided videos and images of the experiments and the robots to demonstrate the capabilities of the new AI systems.
“One day, AI robots will help people with all kinds of tasks at home, in the workplace and more,” wrote the team regarding the future of this area in robotics. “Dexterity research, including the efficient and general learning approaches we’ve described today, will help make that future possible.”