Tesla Autopilot shares blame in fatal crash, says NTSB

The US National Transportation Safety Board (NTSB) has found that Tesla's Autopilot system was partly to blame for a fatal accident in 2016.

Forty-year-old Joshua Brown lost his life in May 2016 when his Tesla collided with a truck; the truck driver was unhurt in the incident. Mr Brown was using the Autopilot system available in Tesla vehicles at the time of the collision, prompting the NTSB to probe its potential role in the incident.



Following an investigation, the NTSB concluded that the system was partly to blame because it allowed drivers to use Autopilot in an environment it wasn't designed to perform in.

"In this crash, Tesla's system worked as designed, but it was designed to perform limited tasks in a limited range of environments," Christopher Hart, NTSB member, said.

"Tesla allowed the driver to use the system outside of the environment for which it was designed, and the system gave far more leeway to the driver to divert his attention to something other than driving."

Previous reports that Mr Brown was watching a Harry Potter film at the time of the collision were dismissed, and the NTSB revealed that neither Mr Brown nor the truck driver was under the influence of drugs or alcohol, or fatigued, at the time of the crash.

It also found that Mr Brown's inattention and the truck driver's failure to give way were primary factors in the incident. A previous NTSB report stated that during 37 minutes of driving, Mr Brown had his hands on the wheel for just 25 seconds.

Still, Tesla released a statement welcoming the NTSB analysis and pledged to evaluate the recommendations, including steps to limit the use of the system in situations it is not designed to operate in.

Read More: BBC News

Written by: James Delahunty @ 13 Sep 2017 10:49
Tags
Tesla
  • 5 comments
  • hearme0

    Tesla's Autopilot-related accidents always seem to result in fatalities, as I recall, though not in great numbers. Granted, the number of Teslas on the road is small by comparison to others, so this is a major factor.

    But seriously, individual autopilot systems without a grid system are going to fail. Expecting a simple, one-car autonomous system to account for all human activity, and for rash, illogical or just terrible decisions on the road, is just stupid.

    The only way we'll have an autonomous driving world is if each car knows precisely where the others are.

    I wouldn't buy a Tesla to save my life. Autopilot is a joke and works only in a vacuum-sealed environment. Add to that, the batteries will expire, and what do they do with them? Doesn't seem environmentally sound, but I could be wrong. I would not buy a used Tesla with batteries that have lost the main part of their life. I would NEVER use Autopilot.

    13.9.2017 11:30 #1

  • Ripped1968

    It's just another case of one law for the poor and another for the rich. Why does Elon Musk get off scot-free? If it was anyone else, they would have been found responsible for this terrible loss of life. Why was he allowing retail purchasers to activate this beta system? Even more concerning is that the driver was able to set a speed for the car that far exceeded the speed limit. I thought the point of self-driving cars was to make them safer. If humans can set the speed of the car, then the system is seriously flawed, and I hope that the government's greed for booking speeding drivers does not outweigh its sanity, and that it does not allow this to continue just so it can keep collecting speeding revenue.

    Computer-driven cars should have only one input from the passenger, and that is the destination. This is the only sane and safe way these cars should ever be used.

    As it turned out, the human did the wrong thing because he could. Elon Musk should have known that retail purchasers would do whatever the system allowed them to do. These were not professional test drivers; they were ordinary people who really did not understand what they were doing, and Elon Musk should have known this. The system should never have been used by anyone other than professional test drivers until the production version was released.

    Not so smart, Elon Musk. Take yourself off to rehab. You are cognitively impaired due to reduced reward neurotransmitter syndrome.

    15.9.2017 10:37 #2

  • Scaldari

    Originally posted by Ripped1968: It's just another case of one law for the poor and another for the rich. Why does Elon Musk get off scot-free? If it was anyone else, they would have been found responsible for this terrible loss of life. [...] Not so smart, Elon Musk.
    So, a human breaks the rules and the car maker is responsible? Maybe we should make Ford responsible when people speed and wreck too!

    Account Created Saturday 12 January 2008. After 9 years I consider myself a Sr. Member no matter WHAT my post count says.

    18.9.2017 10:42 #3

  • defgod

    Originally posted by Scaldari: Originally posted by Ripped1968: It's just another case of one law for the poor and another for the rich. Why does Elon Musk get off scot-free? [...]
    So, a human breaks the rules and the car maker is responsible? Maybe we should make Ford responsible when people speed and wreck too!
    It's not really the fact that the human broke the rules; it's the fact that the human was allowed to break the rules that makes the manufacturer at fault.
    When a system put in a vehicle allows a human to break the rules, it leaves the manufacturer at fault.
    In any system, no matter what the system, if we allow another human to break the rules, they will.

    27.9.2017 07:46 #4

  • Scaldari

    Wow, read out of context, that is a very 1984-ish picture you painted there, friend!


    17.10.2017 23:15 #5

© 2024 AfterDawn Oy
