Self-Driving Cars Scare 68% of American Drivers

According to a 2023 AAA survey, American drivers are more wary of self-driving cars than ever. The survey found that 68% of drivers fear these vehicles, up from just 55% in 2022.

AAA defined driverless vehicles as those “capable of operating without human involvement. A human driver is not required to control the vehicle at any time, nor required to be present in the vehicle while moving.” These are considered level 5 autonomous vehicles by the Society of Automotive Engineers (SAE).

Technically, no vehicle currently available for consumer purchase meets this definition. However, many manufacturers are striving toward that goal, and some even imply in their advertising that their cars already qualify, which may explain why fear rose so sharply.

Currently, the highest level of autonomy available in a consumer vehicle is level 3: a vehicle that can perceive its surroundings and make decisions for itself but still requires an alert human driver who is ready to take control in an emergency. However, ads and claims by figures like Elon Musk have led many people to assume these cars offer full autonomy that no manufacturer has yet achieved.

This is problematic for several reasons. First, owners of level 3 vehicles may believe their cars are more capable than they actually are, which can cause accidents if drivers ignore the road and fail to react when a human override is necessary. Second, accidents genuinely caused by a vehicle’s failings can be brushed off as driver error.

If you’re considering purchasing a supposedly self-driving car or already own one, you must understand how the system actually works. Here’s what sets self-driving vehicles apart from those with “driver-assist” features, what makes level 3 autonomous cars dangerous, and how to tell if your car’s system is actually broken.

Differences Between Self-Driving and Driver-Assist Features

On the SAE’s scale, there is a significant difference between level 3 “autonomous” vehicles and level 2 cars with driver-assist systems. The AAA survey found that most drivers actually favor level 2 features. The driver remains in control of these cars at all times, while the vehicle provides limited assistance, such as:

  • Lane centering
  • Adaptive cruise control
  • Steering support
  • Brake and acceleration support

For example, a level 2 car could slow down if it detected traffic braking ahead or re-center itself in its lane if it noticed it was drifting. However, the driver remains responsible for monitoring the road and making the driving decisions.

In contrast, a level 3 vehicle makes the bulk of the driving decisions. However, the driver still has to stay alert and ready to take over because the system cannot handle every road condition and circumstance.

Are Self-Driving Cars Actually Dangerous?

Road users may be rightly worried about self-driving technology. Cars with any level of autonomy introduce two major problems: driver complacency and software failures.

The complacency problem should be obvious. If you believe your vehicle can handle anything the road can present, you’re less likely to pay attention to what’s happening in front of you. Since all currently available vehicles still require driver engagement at some point, this could lead to dangerous accidents if drivers aren’t prepared to take control.

A bigger problem is software failure. Self-driving vehicles rely on complex software that constantly receives updates and patches, which should concern anyone who has ever had an update break a phone, computer, or other device. It’s one thing if an update makes your computer slower; it’s another if it causes your car to forget when to stop.

Even the best and most attentive driver can’t fully account for software problems. For example, Teslas, Subarus, and Hondas have all faced “phantom braking” issues caused by faulty driver-assist systems. If your car brakes out of nowhere at highway speed, at best you may suffer whiplash; at worst, you could end up in a serious accident.

Could Your Self-Driving Car Be a Lemon?

Any vehicle can be a lemon if it has a serious manufacturing defect. The question that autonomous vehicles raise is when driving system flaws become grounds for lemon law claims. 

There’s no doubt that cars with driver-assist or autonomous driving systems have more potential failure points. Beyond the software, these vehicles carry radar units, cameras, other sensors, and electrical systems designed to collect and act on information. Each component is another opportunity for something to go wrong during manufacturing.

Mechanical flaws in these systems can easily meet the criteria for a California lemon law claim. If your vehicle is still under warranty and has not been fixed after a reasonable number of repair attempts, you can likely file a claim against the manufacturer to have it replaced or refunded. Software errors are not as clear-cut, but they may still give you grounds to file a claim.

Talk to Johnson & Buxton – The Lemon Law Guys

If you’re having problems with your supposedly autonomous vehicle, you may be able to hold the carmaker accountable. However, lemon law claims involving self-driving technology are particularly complex, so don’t try to handle one on your own. Instead, talk to the lemon law experts at Johnson & Buxton – The Lemon Law Guys. We have years of experience on both sides of these claims and stay on the cutting edge of California lemon case law. Whether you’re worried about faulty self-driving software or a dangerous driverless vehicle, we can help. Schedule your case review today to discuss your situation and learn how we can help you get your car replaced or refunded.
