A Tesla Driver Has Died In The First Ever Fatal Crash For A Self-Driving Car
Tesla has revealed that a Model S driver died in a crash on 7 May while Autopilot was activated, in what’s thought to be the first fatal accident involving an autonomous vehicle. The driver, 40-year-old Joshua D. Brown, was on a divided highway in Williston, Florida, when a tractor-trailer pulled out across the road, at which point neither Brown nor Autopilot reacted.
The Model S passed under the trailer, with the bottom of the trailer striking the windscreen. The car then continued down the road before leaving the highway and hitting a fence. Brown died at the scene.
In a statement released on Thursday, Tesla said that the National Highway Traffic Safety Administration (NHTSA) has started a “preliminary evaluation” into the performance of Autopilot during the crash. “This is the first known fatality in just over 130 million miles where Autopilot was activated,” Tesla said, adding, “Among all vehicles in the US, there is a fatality every 94 million miles.”
Brown was well known in the Tesla community, and just a month before the fatal crash had posted a video on YouTube of Autopilot successfully averting an accident. The video quickly clocked a million views.
Tesla’s Autopilot is, for now, intended as a driver aid: a ‘semi-autonomous’ mode that requires the driver to keep their hands on the steering wheel at all times. In the statement Tesla notes that “Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” but that hasn’t stopped some well-documented abuses of the system. It has been heavily criticised in some quarters for lulling its users into a false sense of security. Earlier this year, a senior Volvo engineer slammed Autopilot, labelling it an “unsupervised wannabe” that “gives you the impression that it’s doing more than it is.”
At this early stage of the investigation, it’s not known exactly why Brown didn’t brake himself. Tesla’s statement speculates that he simply did not see “the white side of the tractor trailer against a brightly lit sky.” However, according to an Associated Press report, the 62-year-old driver of the tractor-trailer claimed to have heard one of the Harry Potter films playing from the car at the crash scene. Tesla responded to the claims, stating that it isn’t possible to watch videos on the Model S’s main screen.
Comments
Autopilot can be a good thing because there are a lot of stupid people who can’t drive worth a damn, or people who choose to drive while impaired in some way. Will I ever buy a self-driving car? Maybe for commuting in heavy traffic, but I will always choose the manual option first.
I hav a model s and find the robot thing a bit too creepy
You have a Model S but you can’t spell ‘have’. Right…
We should drive ourselves rather than trust a robot
HEY, stop with the bloody hate. As a Tesla owner, I know firsthand that Autopilot works perfectly 99% of the time. When you enable Autopilot and take your hands off the wheel, Tesla tells you that the system isn’t perfect and that the car is, and remains, the driver’s responsibility. Of course the system didn’t see the trailer; the camera alone cannot detect an object. It’s the radar and sensors combined with the camera that are able to trigger AEB.
I agree, it is hard to put blame on anyone, or anything. Tesla is amazing, and car accidents are inevitable no matter what.
Man, why is everyone so surprised?
Something like this was bound to happen. Self-driving cars seemed absolutely ridiculous to me when I first heard of them.
Well, I think this was going to happen at some point. Hands on the wheel is better than self-driving. I do accept all the safety features on a car to prevent accidents, but a car that drives itself is a computer that can fail too; these are machines, not some sort of “god”. Sorry for the English!
I think people rely too much on Tesla’s Autopilot system. Sorry to hear the dude died, though
I believe that Tesla should not attach an Autopilot to their models until it is fully capable of avoiding these tough obstacles. I know they say it’s still a work in progress, but there are some big idiots in the world who might put all their lazy trust in the system, and maybe they shouldn’t (not saying the man who passed away was an idiot for trusting it, but there might be others who use it in a city). Tesla is a fantastic company which is bringing us some amazing technology, but Autopilot is a risky business, and if you want it in your car models, it had better work 100 percent.
It is like Tesla giving out a prescription drug that says in the fine print it can cause death. Tesla should be responsible. My condolences to the family, though only one death every 130 million miles… Pretty great!
If you don’t want to drive a car, how about just not owning a car?
If you don’t want to pay for gas, how about just not owning a car?
Prepare for a huge moms against cars post on Facebook.
If that happens, prepare for the huge moms against cars post to be removed from the face of the earth…