How Safe is Tesla’s Autopilot Driver?

Category: Blog  |  Tuesday, March 10th, 2020

We’re ready for the world of self-driving cars. You can see it in those who text and drive (not something we condone), those distracted by fixing their hair, or those inhaling their lunch on the way to the next meeting. Not exactly safe things to do while operating a vehicle, right?

What if your car has autopilot? Can you trust it?

Two years ago, Apple software engineer Walter Huang crashed into a highway barrier while playing a gaming app on his phone. His Tesla was running the Autopilot feature, and “at 71mph the vehicle’s safety systems failed to detect the barrier and accelerated into it.”

Distracted driving meant he probably didn’t realize what was happening until it was too late, and complete trust in the Autopilot feature meant he was comfortable being distracted behind the wheel.

So how safe is Tesla’s Autopilot feature, really?

The board says Tesla’s crash-avoidance system was “not designed to, and did not, detect the crash attenuator.” Because of this, Autopilot accelerated the vehicle, and the vehicle failed to provide a crash alert and didn’t activate emergency braking.

We can confidently say that without human supervision, an autopilot feature is not 100% trustworthy.

The National Transportation Safety Board (NTSB) chastised “Tesla, Apple, and road-safety regulators in the report from its nearly two-year investigation into the fatal crash.” The NTSB (obviously) “recommends that partial driving-automation systems must be able to effectively detect potential hazards and warn drivers of them to be safely deployed in high-speed environments.”

Shouldn’t they be designed to do this already? We agree. It’s this “nonregulatory approach” to features like Autopilot that leads to sometimes-fatal mistakes on the road, not to mention the Apple employee using his company-provided mobile device while driving…

The NTSB writes that a “strong company policy, with strict consequences for using portable electronic devices while driving, is an effective strategy in helping to prevent the deadly consequences of distracted driving.”

Back to Tesla. The “harshest” criticism in this entire event is that Tesla failed to include safeguards limiting the use of Autopilot in unsuitable situations, and that Tesla ignored the NTSB, never responding to the recommendations it issued in 2017.

The general consensus: ensure your own safety not by trusting a programmed autopilot feature, but by using all your senses (common sense included).

And if you need help with Managed IT Services, give EnhancedTECH a call at 714-970-9330.

-Emmy Seigler

Source: https://www.zdnet.com/article/apple-and-tesla-under-fire-over-software-engineers-fatal-autopilot-crash/?ftag=TRE-03-10aaa6b&bhid=28837826891618282212917048574090

Image Source: https://www.pexels.com/photo/person-sitting-inside-car-2526127/

Samantha Keller