False sense of security? Autopilot not as fail-proof as you think
Imagine traveling from California to New York without ever having to touch the steering wheel or accelerator pedal. Self-driving vehicles could one day make that possible, but currently, they aren’t safe enough to operate freely on the roads.
Advocates of self-driving vehicles argue they have the potential to be safer than human drivers, but we’re not there yet. The technology still needs improvement, as demonstrated by recent fatal accidents involving autonomous cars.
What are the dangers of autonomous vehicles? We will discuss some of them in this post.
What is an automated vehicle?
The National Highway Traffic Safety Administration classifies automation on a scale of zero to five. Level 0 indicates no automation, while Level 5 indicates a fully autonomous vehicle. A vehicle with autopilot features falls somewhere in the middle, depending on its capabilities and how much attention the driver must maintain.
What seems to be the biggest danger thus far?
Though crashes often occur even with a human behind the wheel, autonomous vehicles pose a different challenge. Because a human won’t be watching the road, the software in the vehicle must be programmed to recognize painted lines, cars, pedestrians, road signs and other pertinent information.
This could be a problem if the vehicle can’t see the lines during a winter storm or heavy downpour. Even when weather doesn’t interfere, unforeseen circumstances, such as a car cutting across traffic or a person crossing the road, could present a problem for self-driving vehicles.
Fatal accidents involving autonomous cars
Earlier this year, the ride-hailing company Uber pulled its self-driving vehicles out of Arizona after one of its cars hit and killed a pedestrian crossing the street. The car’s driver wasn’t paying attention, and the vehicle’s log confirmed it didn’t brake before the collision.
Tesla’s Autopilot system has also been involved in fatal accidents. In 2016, a man behind the wheel of a Tesla on Autopilot was killed after the vehicle failed to recognize a tractor-trailer turning left across its path. That driver also wasn’t paying attention at the time of the collision.
In March, a Tesla on Autopilot drove into a concrete divider on a highway and burst into flames, killing its driver.
Hope for the future?
Tesla has repeatedly said drivers are supposed to remain alert while using the Autopilot system. Despite these warnings, drivers seem to place too much trust in the technology.
Autonomous vehicle technology is still being tested and improved. Although self-driving cars may be ready for the road one day, the current technology is not adequately developed, especially for unexpected situations. For now, drivers of these vehicles should stay constantly attentive to their surroundings in case they need to take over.
If you or a loved one has been injured in an accident involving an autonomous vehicle, or any other kind of vehicle, the vehicle manufacturer and the software company may be liable in addition to the vehicle’s owner, its driver, and possibly others. You may benefit from speaking with an attorney to evaluate and fully understand your rights.