A Tesla Driver Has Died In The First Ever Fatal Crash For A Self-Driven Car
Tesla has revealed that a Model S driver died in an accident on 7 May while Autopilot was activated, in what’s thought to be the first fatal crash involving an autonomous vehicle. The driver, 40-year-old Joshua D. Brown, was travelling on a divided highway in Williston, Florida, when a tractor-trailer pulled out across the road; neither Brown nor Autopilot reacted.
The Model S passed under the trailer, the bottom of which struck the windscreen. The car then continued down the road before leaving the highway and hitting a fence. Brown died at the scene.
In a statement released on Thursday, Tesla said that the National Highway Traffic Safety Administration (NHTSA) has started a “preliminary evaluation” into the performance of Autopilot during the crash. “This is the first known fatality in just over 130 million miles where Autopilot was activated,” Tesla said, adding, “Among all vehicles in the US, there is a fatality every 94 million miles.”
Brown was well known in the Tesla community, and just a month before the fatal crash had posted a video on YouTube of Autopilot successfully averting an accident. The video quickly clocked a million views.
Tesla’s Autopilot is currently intended as a driver assist, a ‘semi-autonomous’ mode that requires the driver to keep their hands on the steering wheel at all times. In the statement, Tesla notes that “Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert.” That hasn’t stopped some well-documented abuses of the system, and it has been heavily criticised in some corners for lulling its users into a false sense of security. Earlier this year, a senior Volvo engineer slammed Autopilot, labelling it an “unsupervised wannabe” that “gives you the impression that it’s doing more than it is.”
At this early stage of the investigation, it’s not known exactly why Brown didn’t brake himself. Tesla’s statement speculates that he simply did not see “the white side of the tractor trailer against a brightly lit sky.” However, in a report by the Associated Press, the 62-year-old driver of the tractor-trailer claimed to have heard one of the Harry Potter films playing from the car at the crash scene. Tesla responded to the claims, stating that it isn’t possible to watch videos on the Model S’s main screen.
Comments
Sadly, I knew this was going to happen. It sucks that he trusted the computer and the computer couldn’t detect the danger.
It was driver error, not the computer’s. You’re not supposed to trust the car, Tesla themselves say so. While it prevents a lot of collisions (it prevented three for me and lessened the impact of one), it isn’t perfect. The driver is responsible for the vehicle and all persons in it, not the vehicle.
I feel sorry for him. Car manufacturers should see the roads in London, or even worse, places like India and Indonesia. You can’t rely on computer instincts for driving or riding there.
We already have lane assist on highways, blind-spot warning, reversing cameras, brake warning, TCS, ABS, cruise control, etc., which is enough. Don’t change what’s already right.
I get sketched out using cruise control; I couldn’t imagine fully autonomous. RIP 🙏
So we now know the system isn’t perfect. They say this is one crash in 130 million miles of use, so technically another crash could happen tomorrow, and that would make one crash for every 65 million miles. I’m starting to think about how much a life is worth. Is the convenience of millions of people worth more than the death of one? Hmm.
By that logic, nobody should have cars; it’s not like manually driven cars don’t crash.
Something similar happened to me last week. At the end of the day there’s just nothing you can do sometimes.
THEY SHOULD STOP BUILDING THESE CARS!
Tesla doesn’t provide the Autopilot feature for humans to fall asleep behind the wheel.
Do you need to be 18 to sit alone in a Tesla while it’s in Autopilot?
Condolences to the family and friends… and, as always, f×ck autonomous vehicles. They tell you to trust them, but where does the trust end? I believe it ends when a human is killed by technology that was made to make life simpler, not endanger it further.
I think the self-driving car thing will be delayed a lot now.