Uber and Lyft drivers are using Teslas as unofficial robotaxis, raising safety concerns along with questions about the legal and regulatory framework for semi-autonomous vehicles in ride-hailing. The trend highlights ongoing challenges in the autonomous-driving space and their implications for public safety.
In April, a Tesla operating in the company's Full Self-Driving (FSD) mode collided with an SUV while carrying an Uber passenger in suburban Las Vegas. The crash has fueled concerns that self-styled "robotaxis" are exploiting a regulatory gap in U.S. cities and putting public safety at risk.
Elon Musk, CEO of Tesla, plans to reveal his vision for a robotaxi service on October 10. Musk has long envisioned a network of autonomous Teslas owned by individuals.
Tesla has yet to launch such a service, but many drivers are already using its FSD software to offer ride-hailing trips. Despite the software's limitations, drivers say it reduces the stress of driving, allowing them to work longer hours and earn more.
Reuters was the first to report on the Las Vegas incident and the subsequent federal investigation, as well as the widespread use of Tesla's driver-assistance software among ride-hail drivers.
While fully autonomous robotaxis from companies like Waymo and Cruise are subject to strict regulations, Tesla drivers are solely responsible for their vehicles, even when using driver-assistance technology like FSD.
In the April crash, the other driver was found at fault for failing to yield, but Tesla driver Justin Yoon said the FSD software failed to slow his car in time. Yoon, who runs the YouTube channel "Project Robotaxi," shared footage showing the Tesla traveling at 46 mph without detecting the SUV; he took control just before impact, reducing the severity of the collision.
Yoon and his passenger sustained minor injuries in the crash, and the car was totaled. Tesla and Lyft did not respond to requests for comment, while Uber pointed to community guidelines that make drivers responsible for their passengers' safety.
Even as Musk promotes his vision of a ride-sharing network built from customer-owned autonomous Teslas, drivers have expressed concerns over the software's limitations, particularly in complex situations like navigating airports and parking lots.
One driver, Sergio Avedian, avoids using FSD with passengers and estimates that 30% to 40% of Tesla drivers in the U.S. regularly use the software for ride-hailing services.
The U.S. government classifies FSD as partial automation, requiring active driver oversight, yet regulators have not prohibited its use in ride-hailing. Experts such as Jake Foose of Guidehouse Insights argue that the commercial use of these systems calls for stricter regulation.
Federal authorities are aware of the Las Vegas crash, but state regulators do not currently oversee the use of FSD in ride-hailing services, as it falls outside robotaxi regulations.
Tesla recently introduced a feature allowing Uber passenger destinations to be sent directly to the car’s navigation system, further encouraging the use of FSD for ride-hailing.
While other automakers limit their partial-automation systems to highways, Tesla says FSD can handle most driving tasks with minimal intervention. Safety experts such as David Kidd of the Insurance Institute for Highway Safety warn that this use of the technology raises serious concerns.
Some experts, like Missy Cummings from George Mason University, suggest that companies like Uber and Lyft should preemptively restrict the use of FSD to avoid potential safety risks.
Drivers like Kaz Barnes, who has completed over 2,000 rides using FSD, hope that one day they’ll be able to rely on the technology entirely without needing to oversee the vehicle’s operations.
For questions or comments write to writers@bostonbrandmedia.com
Source: Reuters