NTSB: Tesla's Autopilot steered Model X into highway median, causing fatal crash

Two-year federal investigation also lays blame on lax regulations, driver distraction, damaged highway safety device

A Tesla Model X is surrounded by firefighting foam after crashing and catching fire on Highway 101 in Mountain View in March 2018. An NTSB investigation blamed the car's Autopilot system for steering into the median divide, and said the driver likely failed to react because he was playing a video game. Courtesy of Mountain View Fire Department

A 38-year-old Apple engineer died from a high-speed crash because his Tesla Model X's Autopilot driving system steered the car into a median on Highway 101 in Mountain View in 2018, federal officials concluded Tuesday, following a two-year inquiry.

The probable cause of the crash, approved by the National Transportation Safety Board (NTSB) at the Feb. 25 meeting, lays significant blame on Tesla for shortcomings in the electric car company's partially autonomous driving system. But it also points to driver complacency as a significant factor in the crash — he was likely playing a video game at the time — along with larger concerns that car manufacturers are marketing and selling autonomous features without adequate testing and clear disclosure of the limitations.

The NTSB also blasted Caltrans for failing to repair safety equipment along Highway 101 that contributed to the severity of the crash, and found that the driver would likely have survived the collision if a safety buffer called an "attenuator" had been in place at the median.

"When all of these combine, we end up with a tragic accident and one less father that's coming home to the child he had just dropped off to school that morning," said NTSB board chair Robert Sumwalt.

In March 2018, San Mateo resident and Apple engineer Walter Huang was commuting south on Highway 101 in his Tesla Model X when his vehicle veered left toward the "gore area" between the southbound lanes and the Highway 85 carpool flyover lane in Mountain View. The Model X struck the concrete barrier at over 70 mph, destroying the front of the vehicle and causing it to careen into two other vehicles before coming to a stop. Huang was pulled out of the vehicle shortly before the damaged battery caught fire. He was taken to a local hospital, where he died.

NTSB investigators launched a probe into the accident almost immediately, and spent two years collecting data and conducting interviews to determine what caused the Tesla to veer from the roadway and crash into a barrier at full speed. The subsequent findings show that the vehicle's autosteering function was enabled at the time of the collision and, about 6 seconds before the crash, steered the SUV to the left toward the median.

The errant path may have been caused by faded paint markings along the left-hand lane of Highway 101, but NTSB staff couldn't say for sure. NTSB accident investigator Don Karol told board members that video imagery that could have shed light on the crash wasn't available because the vehicle's computer system was heavily damaged due to the "catastrophic nature" of the crash.

Tesla representatives did not immediately respond to requests for comment.

Tesla's Autopilot is a partially autonomous system that can control the vehicle's steering, braking and lane changing without driver input, using radar, cameras and ultrasonic sensors to detect objects and lane markings. Though it goes beyond typical driver assistance, Tesla warns consumers that drivers must maintain awareness, understand the limitations of the autonomous features and have their hands on the steering wheel at all times.

Information extracted from the Model X involved in the crash shows Huang's hands were off the steering wheel for roughly one-third of the half-hour trip, and he did not attempt to correct the vehicle's path when it veered into the Highway 85 barrier.

Perhaps the biggest reveal at the Tuesday meeting was the NTSB investigators' finding that Huang's lack of response before the crash was likely because he was distracted by "a cell phone game application" and was over-reliant on Autopilot.

"If you own a car with partial automation, you do not own a self-driving car, so don't pretend that you do. This means that when driving in the supposed self-driving mode, you can't sleep. You can't read a book. You can't watch a movie or TV show. You can't text. And you can't play video games," Sumwalt said. "Yet that's precisely what we found that this driver was doing. He was playing a video game on his smartphone when his car veered over into the median and struck the barrier in the median."

Sumwalt, in his opening comments, said that the fatal collision could have been avoided, but the car manufacturing industry and federal regulators have failed to implement the NTSB's safeguards proposed back in 2017. There needs to be a way to limit autonomous functions in road conditions that Autopilot was never designed to handle, and there needs to be effective ways to flag the drivers who are complacent on the road, he said.

"What struck me the most about the circumstances of this crash was the lack of system safeguards to prevent foreseeable misuses of technology. Instead, the industry keeps implementing technology in such a way that people get injured or killed," Sumwalt said. "And the industry, in some cases, is ignoring the NTSB's recommendations intending to help prevent such tragedies."

Drivers need to be aware that Tesla's Autopilot, and comparable systems in commercial vehicles, are not autonomous, and shouldn't be branded as such. "The car in this crash was not a self-driving car, as I've said many times before," he said. "You cannot buy a self-driving car today. We're not there yet."

Beta testing at highway speeds

One of the major sticking points among NTSB members was the idea that Tesla's autosteering feature, despite its widespread, daily use on roadways, is actually still a work in progress. The company itself labels the feature a "beta," a designation it believes encourages drivers to approach Autopilot with a clear-eyed understanding that they need to remain vigilant and attentive.

The label didn't sit well with board member Bruce Landsberg, who said that enabling autosteering amounts to using a faulty safety feature that still has bugs. Federal regulators only test semi-autonomous collision avoidance systems under a limited set of conditions, such as rear-end crashes at speeds up to about 45 miles per hour, rather than in high-speed scenarios or in cross traffic, which have been the center of multiple Tesla-related investigations.

Two of the Autopilot functions — the Forward Collision Warning system and Automatic Emergency Braking — did not activate during the crash.

Landsberg slammed the current Autopilot system and testing scheme, calling it "completely inadequate" and saying it's not enough for Tesla to simply give consumers the caveat that Autopilot is prone to issues.

"It seems to me when you put a system out there that is a safety-critical system, and it has known bugs in it, that it's probably pretty foreseeable that somebody is going to have a problem," Landsberg said. "Then they come back and say, 'Oh, but we warned you' — that doesn't seem like a very robust safety mechanism."

Since the fatal Mountain View crash, Tesla has pushed out a firmware update for the Autopilot system. Among the changes, the vehicle now alerts drivers more quickly when their hands are not detected on the steering wheel and when the system loses its ability to keep the vehicle centered in a lane.

The features are valuable improvements to Autopilot, said Robert Molloy, NTSB's director of highway safety, but he underscored that it's important to be proactive about making upgrades to autonomous vehicle functions.

"Fixing problems after people die is not really a good highway (safety) approach," he said.

Huang had complained of problems with his Autopilot and navigation systems in the weeks leading up to the crash.

Board member Jennifer Homendy said she is concerned that the National Highway Traffic Safety Administration (NHTSA) is shirking its responsibility to regulate the emerging market of partially autonomous vehicles, described as "Level 2" automation on a scale of 0 to 5 (fully automated). She said a contributing factor in a fatal Tesla crash in Delray Beach, Florida, was the "failure of NHTSA" to compel vehicle manufacturers to incorporate acceptable safeguards for vehicles with Level 2 automation.

Homendy said she was also troubled by a Twitter post by the agency stating that the regulations should take into account the cost of buying a car. The Jan. 7 tweet states "For some, affording any car — let alone a new one — can be a challenge. That's why NHTSA is working to keep regulations reasonable so cars, trucks and SUVs — with the latest safety features — are more affordable and families can be safer on the roads."

The NHTSA's mission is not to sell cars, and lowering the bar on safety to lower costs shouldn't even be considered a factor, she said.

Not a new problem

Throughout the Feb. 25 meeting, NTSB board members referred to Tesla's poor track record of responding to the agency's past recommendations, which could have led to safety improvements and prevented further collisions.

In 2017, the NTSB wrapped up a yearlong investigation into a fatal Tesla crash in which a Model S struck the side of a truck, killing the driver. Among the safety recommendations, the NTSB suggested that six car manufacturers with Level 2 automation systems incorporate safeguards limiting the use of autonomous features to the roads and conditions they were designed to handle. The agency also suggested that vehicles need to better detect when the driver is complacent and not paying attention to the road, and alert the driver when "engagement is lacking."

Sumwalt said the request was pretty simple — respond to the NTSB's recommendations within 90 days. Five of the manufacturers responded in time and stated they would comply with the recommendations. Only Tesla ignored the request and it remains the only non-compliant company.

"It has been 881 days since the recommendations were sent to Tesla and we've heard nothing," Sumwalt said. "We're still waiting."

Tesla has butted heads with NTSB officials since the investigation into the Mountain View crash first launched in March 2018. The federal agency originally invited Tesla to actively participate in the investigation and provide technical assistance, but took the rare step of dropping the company from the investigation one month later, after Tesla released multiple public statements speculating that the driver, and not its technology, was at fault.

Tesla was releasing incomplete investigative information that was bound to lead to speculation and incorrect assumptions about the probable cause of the crash, doing a "disservice to the investigative process and the traveling public," NTSB said in a statement at the time.

Sumwalt took time during the meeting to criticize Caltrans for its role in the severity of the Mountain View Tesla crash, noting that it was one of multiple occasions in which damaged safety equipment was not adequately repaired or replaced. The concrete barrier at the Highway 85 split is normally fronted by a crash attenuator, but the device had been damaged to the point of being "nonoperational" in a solo-vehicle crash 11 days before the March 23 fatality.

The attenuator could have significantly reduced the damage to the Model X, and NTSB investigators made clear at the Tuesday meeting that Huang likely would have survived if it had been there to cushion the impact.

NTSB officials also credited the Mountain View Fire Department, determining that the emergency response to the accident was adequate and well-executed, given the circumstances.

The full report on NTSB's investigation will be published in the coming weeks. An abstract of the report, released Feb. 25, lists 23 findings that enumerate the factors contributing to the fatal collision. Limitations of Tesla's Autopilot lane-keeping assistance caused the vehicle to veer into the median without alerting the driver in the seconds leading up to the crash. According to the report, the Model X's collision avoidance system was not designed to detect a crash attenuator, a capability the NHTSA does not require, and the automatic emergency braking and forward collision warning systems consequently failed to activate, resulting in a severe crash.

"In order for driving automation systems to be safely deployed in a high-speed operating environment, collision avoidance systems must be able to effectively detect and respond to potential hazards, including roadside traffic safety hardware, and be able to execute forward collision avoidance at high speeds," according to one of the findings approved by the NTSB board Tuesday.

The NTSB doubled down on the recommendations it made to Tesla in 2017, adding that if Tesla does not create safeguards preventing the use of Autopilot on roads and in conditions it was not designed to handle, it risks future crashes. It also took a jab at NHTSA for "failing to ensure" that manufacturers of partially autonomous vehicles provide these safeguards.

The report's findings also state the driver did not attempt to correct the route of his Model X as it steered into the concrete barrier, most likely because he was distracted by a game on his cell phone. Distracted driving could be curbed by new technology and company policies that prohibit the use of portable electronic devices in a moving vehicle.

Among the nine recommendations, the NTSB is asking nine smartphone manufacturers, including Apple, Google and Samsung, to develop a "lock-out" mechanism that would automatically disable any functions that would distract a driver while the vehicle is in motion, enabled by default so that it would need to be actively disabled by the user. In the case of Apple, the board took it a step further and asked the company to create a company policy banning the use of cell phones by "all employees and contractors driving company vehicles, operating company-issued portable electronic devices, or using a portable electronic device to engage in work-related communications."

---

Follow the Palo Alto Weekly/Palo Alto Online on Twitter @PaloAltoWeekly and Facebook for breaking news, local events, photos, videos and more.

Kevin Forestieri writes for the Mountain View Voice, the sister publication of PaloAltoOnline.com.


Comments

19 people like this
Posted by resident
a resident of Mountain View
on Feb 25, 2020 at 10:11 pm

Tesla Autopilot is a danger to Tesla drivers and to all other road users. Tesla should be forced to disable it until they can prove that they are properly training their drivers to use it safely.


28 people like this
Posted by Leslie
a resident of Midtown
on Feb 26, 2020 at 12:37 am

Playing a video game on your smart phone while tooling down the freeway at 70 mph is an appointment with Darwin, I don't care how automated your vehicle is.


28 people like this
Posted by Resident
a resident of Another Palo Alto neighborhood
on Feb 26, 2020 at 7:24 am

This driver was everything all Tesla drivers are being criticized for.

He knew the car had problems at this area and yet he still ignored all the warnings and arrogantly played a game on his phone. It is very lucky that nobody else was killed due to his irresponsibility.

Darwin Award indeed.


21 people like this
Posted by I Despise Teslas
a resident of Crescent Park
on Feb 26, 2020 at 7:58 am

TESLA is now due for a MAJOR lawsuit! *yay*

I detest Tesla's & the arrogant people who tend to drive/own them...arrogant as in detached from the reality of other cars, pedestrians & bcyclists while going about their clueless, oblivious ways.

I also detest the braggarts who rave about making a 'killing' on Tesla stock...may their dividends & stock values plummet!

If you want autopilot features, drive a WAYMO...those things are so slow & overcautious that walking or taking the bus is often a faster alternative.

In closing, may all Tesla's someday disappear off the face of the Earth.


19 people like this
Posted by I LOVE Tesla’s
a resident of Palo Alto Hills
on Feb 26, 2020 at 9:19 am

As a owner of a Tesla, I love the car for all the technology it has and it’s just a fun car to drive. That being said, I would be a Village Idiot to play a video while on assisted driver.


5 people like this
Posted by Harry Merkin
a resident of Ventura
on Feb 26, 2020 at 11:39 am

>> He knew the car had problems at this area and yet he still ignored all the warnings and arrogantly played a game on his phone.

That is strange. Have other Tesla drivers had problems at this area? If not, there is more to this story than has been discussed.


7 people like this
Posted by mrlatham
a resident of Mountain View
on Feb 26, 2020 at 11:51 am

Once again the media plays the stupid game: the headline here says the car was at fault, as others have said maybe playing a video game at 70 mph especially when you have children caused most of the problem.
I hope sad as it is that they dont get a penny from their lawsuit. Stupid is as stupid does.....


4 people like this
Posted by Tesla's Are Not Race Proven & BORING
a resident of Barron Park
on Feb 26, 2020 at 1:06 pm

Unlike MB, Porsche, BMW, Honda etc. Tesla's have NEVER proven their prowess on a race track...they are designed for people who merely want to OPERATE a car while going from point A to point B.

How boring...no wonder playing video games is an outlet while on Tesla's 'autopilot'.


9 people like this
Posted by Teslas
a resident of Adobe-Meadow
on Feb 26, 2020 at 1:15 pm

Fasted on the street though. Mind numbing acceleration is never boring.
Do you decide on the car to buy based on race wins? That sounds...so smart! LOL.


5 people like this
Posted by I Despise Teslas
a resident of Crescent Park
on Feb 26, 2020 at 3:10 pm

quotes:

"Unlike MB, Porsche, BMW, Honda etc. Tesla's have NEVER proven their prowess on a race track...they are designed for people who merely want to OPERATE a car while going from point A to point B."

VS

"Fasted on the street though. Mind numbing acceleration is never boring."

VS

"maybe playing a video game at 70 mph especially when you have children caused most of the problem."

VS

"This driver was everything all Tesla drivers are being criticized for."

Lastly, the voice of reason...

"Tesla Autopilot is a danger to Tesla drivers and to all other road users. Tesla should be forced to disable it until they can prove that they are properly training their drivers to use it safely."

WHY? Because most Tesla drivers are techie types with no sports or race car acumen.

Being an electric car & given the typical Tesla driver who seemingly enjoys going fast without actually driving while on 'autopilot'...Teslas should be relegated to slot cars with tracks in the road guiding them to their eventual destinations!


4 people like this
Posted by Anon
a resident of Another Palo Alto neighborhood
on Feb 26, 2020 at 3:32 pm

Elon Musk/Tesla made electric cars cool and fun. My hat is off to Elon Musk about that. But, I wish Musk would shut up about "autopilot". There is no such thing for cars yet, and, no telling if there ever will be, or, when it will arrive if it does. People can crash using a cruise control also, but, nobody claims it is an "autopilot". But, just by naming it "Autopilot", Musk is misleading people about what it actually can do.

Web Link

Tesla website says things like: "Features - Autopilot enables your car to steer, accelerate and brake automatically within its lane. Full Self-Driving Capability introduces additional features and improves existing functionality to make your car more capable over time including"

Needless to say, "Full Self-Driving Capability" doesn't actually mean what you would expect, as the family of Walter Huang discovered. The car automates the normal stuff, lulling the driver into complacency. Suddenly, the driver has to detect a problem and perform well in a demanding situation. The aviation industry discovered the hazard in this quite some time ago. The feature needs to be renamed and drivers need to understand that they still need to drive the car.


6 people like this
Posted by Never Tesla
a resident of Mayfield
on Feb 26, 2020 at 3:53 pm

Tesla has taken the Facebook creed "Move Fast and Break Things," and implemented it in hardware


13 people like this
Posted by Tesla Owner
a resident of Midtown
on Feb 26, 2020 at 5:06 pm

It's sad how people who clearly just hate Tesla for god-knows-what reason but know nothing about the technology are willing to make ridiculously extreme comments. Same to those people who say cruel things about the person who died.

I had the same issue happen to me at the same place 3 weeks before the fatal crash. However, I was paying attention and was able to steer away from the barrier but it was scary. After the crash, I tried to recreate the problem (at a slow speed) and was able to do so. I believe it happened due to the left stripe of the lane being very dirty and the right stripe of the lane that peels off to 85 S. being freshly painted. The car followed the clean lane marking, believing it to be the left stripe of its lane, thus heading into the barrier. Tesla has since fixed the problem.

I have mixed feelings on the issue. Tesla should not have called the feature "autopilot". However, the gentleman was not following instructions either. I have found the autopilot (after the fix to the issue) to be very relaxing to use on longer drives on the freeway although its not at all trustworthy on local roads (yet). There's a Rand Corp. study that shows that early deployment of such technologies and the learnings that derive from it, results in a much lower rate of accidents and deaths over a period of time. However, I'm not sure I want to be the Guinea pig in that experiment.

Overall, I love driving the car with or without autopilot. On long drives and in stop and go traffic, the autopilot is a god-send..


3 people like this
Posted by Leslie
a resident of Midtown
on Feb 26, 2020 at 6:24 pm

Clearly with self-driving cars, no matter the make, you can't just sit back in your seat and play a video game as if you were a passenger on a train, bus or airplane where someone else controls the vehicle. In a self-driving car, YOU are in control. Hands on the wheel. Eyes on the road.


10 people like this
Posted by I Despise Teslas
a resident of Crescent Park
on Feb 27, 2020 at 8:15 am

My elder mother-in-law in Hillsborough has a very safe & efficient 'auto-pilot' feature on her early 1960s vintage-era Lincoln...it's called a chauffeur!


9 people like this
Posted by mauricio
a resident of Embarcadero Oaks/Leland
on Feb 27, 2020 at 3:36 pm

It's amazing that Tesla haters lump all Tesla owners together because of one knucklehead driver who played video games on his phone while the auto driving system was engaged caused a fatal accident. I drive a Tesla and would never dream of playing games on my phone, looking at the phone screen, taking my eyes off the road or even letting go of the steering wheel while the autopilot was engaged. I know many Tesla drivers and none of them would behave like that driver, who should really be in prison.

Tesla is a fantastic car, and reckless and clueless drivers drive all type of cars, high end, low end and anywhere in between, including Teslas.


9 people like this
Posted by I Despise Teslas
a resident of Crescent Park
on Feb 27, 2020 at 4:03 pm

>>> It's amazing that Tesla haters lump all Tesla owners together because of one knucklehead driver...

^ It's not any different than the bad rap BMW drivers got beginning in the 1980s & most it was true. During the late 1960s to mid 1970s BMWs were cool esoteric sport sedans & then the clueless yuppies started buying them & weaving in & out, cutting other drivers off.

The contemporary Tesla driver is just another incarnation of moneyed drivers who in most cases, don't know how to drive or respect the rights of other drivers.

As aforementioned, my mother-in-law is 'pre-Tesla' when it comes to auto-pilot. Her old Lincoln is the same model Kennedy was riding in Dalla & it gets far more attention & curiosity than a mundane-looking Tesla.


3 people like this
Posted by I Despise Teslas
a resident of Crescent Park
on Feb 27, 2020 at 4:04 pm

Dalla > as in Dallas - November 22, 1963


7 people like this
Posted by Never Tesla
a resident of Mayfield
on Feb 27, 2020 at 5:28 pm

>> I believe it happened due to the left stripe of the lane being very dirty and the right stripe of the lane that peels off to 85 S. being freshly painted. The car followed the clean lane marking, believing it to be the left stripe of its lane, thus heading into the barrier. Tesla has since fixed the problem.

Tesla should have caught that deadly feature during development testing, or during acceptance testing at the latest. Using its paying customers as beta (or gamma) testers is inexcusable.



