Autonomous driving is apparently the wave of the future, even if U.S. drivers do not really trust the technology.
Assisted driving tech has been around for at least two decades, and Americans seem fine with that. But autonomous driving is in a different lane, and Americans are skeptical.
“Consumers are skeptical of the full self-driving (FSD) technology that undergirds the robotaxi proposition, with 60% considering Tesla’s full self-driving ‘unsafe,’ 77% unwilling to utilize full self-driving technology, and a substantial share (48%) believing full self-driving should be illegal,” said the May 2025 edition of the Electric Vehicle Intelligence Report (EVIR).
California, frequently at the forefront of many technological innovations, has become a hub for AV testing, but citizens there have demanded heavy guardrails.
Nearly 80% of California voters support requiring a human safety operator in self-driving trucks and delivery vehicles, and just 33% of voters express a favorable general impression of autonomous vehicles.
But autonomy comes in levels, ranging from 0 to 5, as defined by the Society of Automotive Engineers (SAE).
Level 0 represents no automation, while Level 5 represents full automation with no human intervention at all.
The assisted driving systems Americans have been using for 20 years represent Level 1, where the vehicle can assist with steering or acceleration/deceleration but not both at the same time.
Level 2 vehicles can control both steering and speed at the same time, and Americans are fairly familiar with this level, too. Despite its name, Tesla's Full Self-Driving is a Level 2 system.
But Level 3 is where things get tricky, especially for legal reasons, and one Chinese carmaker seems willing to shoulder the liability that comes with it.
BYD is the only carmaker willing to take financial responsibility for AV driving
Level 3 is where the true autonomous driving magic occurs.
“The transition from SAE level 2+ to level 3 is a significant one. While many level 2+ systems have proven popular and, for the most part, effective, level 3 vehicles mean that, in some situations, eyes can be taken off the road,” a new research report from IDTechEx says.
The “eyes taken off the road” part is crucial because at that point, the driver is officially no longer in control of the vehicle; the vehicle's software is.
So if an accident happens while the “driver” of an L3 or above vehicle is operating, who really is at fault?
“Generally, this would result in the accountability of any accident occurring while level 3 is operational falling onto the manufacturer, not the driver. As a result, the overall reliability, defined by both the hardware and software, has to be much greater,” the report states.
Tesla has been sued multiple times over fatal mistakes that drivers say FSD has made. Each time, Tesla has argued it was the driver's fault.
If Tesla ever wants to reach L3 autonomous driving, that excuse won't fly anymore.
Chinese rival BYD seems more than ready to take on the responsibility.
Earlier this month, BYD debuted a smart parking feature that allows the vehicle to achieve Level 4 autonomy.
Level 4 autonomy, as defined by the Society of Automotive Engineers, is the second highest available level of autonomy. In layman's terms, BYD vehicles equipped with the highest assisted driving packages will be able to park themselves.
But most interestingly, regarding the latest upgrade, BYD promises to pay for any accidents caused by autonomous parking.
Rather than going through their insurance companies, BYD drivers using the tech can file a claim with BYD's after-sales team if something goes wrong.
Tesla faces FSD court battles as company looks to expand
Earlier in July, the U.S. District Court for the Southern District of Florida heard opening arguments in a lawsuit filed against Tesla by the family of Naibel Benavides, who was killed in 2019 by a runaway Tesla with its Autopilot driver-assistance system engaged.
The vehicle, driven by George Brian McGee, sped through a T intersection at 62 miles per hour and T-boned an empty parked car.
The parked car's owners were standing beside it when they were struck. Benavides, 22, was killed in the crash; her body was thrown about 75 feet from the crash site. Her boyfriend, Dillon Angulo, survived but was left with a severe concussion and multiple broken bones.
Like other cases involving FSD in the past, Tesla blames the crash on driver error.
“The evidence clearly shows that this crash had nothing to do with Tesla’s Autopilot technology,’’ Tesla said in a statement to Bloomberg.
Under Level 3 or higher automation, the driver, who reportedly had dropped his cellphone and was searching for it on the floor when the crash occurred, could shift the blame to Tesla.
But Tesla has not reached the level of automation that would make it responsible for a driver who took his eyes off the road.