Uber's Autonomous Car Detected The Pedestrian It Hit, Decided Not To Act

A new report claims that Uber's autonomous car prototype actually detected the pedestrian it hit and killed, but ignored the information
A month and a half ago, a pedestrian was tragically killed by one of Uber’s Volvo XC90-based autonomous test cars. Now, we’re getting a better idea of what went wrong on 18 March.

According to a new report in The Information, the prototype vehicle’s software did actually spot 49-year-old Elaine Herzberg as she crossed the road in Tempe, Arizona, but decided not to act.

Citing “two people briefed about the matter,” the publication reports that Herzberg was identified as a ‘false positive’. The system is deliberately designed to ignore objects such as litter in the road, so that it does not brake or take evasive action unnecessarily. So while Herzberg was indeed ‘spotted’, she was not correctly classified by the system, and as a result the car carried on regardless. Subsequently released footage also revealed that the human ‘operator’ of the car was looking down at the time of the accident.
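Neither the report nor Uber has published any implementation detail, but the behaviour described, an object that is detected yet filtered out as a false positive, matches a standard pattern in perception pipelines: a class-and-confidence filter between the sensors and the planner. As a purely illustrative Python sketch (every name, label and threshold below is a hypothetical assumption, not Uber’s actual code):

# Illustrative sketch only; all names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # classifier output, e.g. "pedestrian" or "litter"
    confidence: float  # classifier confidence, 0.0 to 1.0

# Object classes the planner is allowed to treat as false positives,
# so the car does not brake for every plastic bag in the road.
IGNORABLE = {"litter", "debris", "unknown"}
MIN_CONFIDENCE = 0.7  # below this, a detection is discarded as noise

def should_react(d: Detection) -> bool:
    """Return True if the car should brake or take evasive action."""
    if d.label in IGNORABLE:
        return False   # detected, but deliberately ignored
    if d.confidence < MIN_CONFIDENCE:
        return False   # detected, but not trusted
    return True

# The failure mode described above: the object is seen by the sensors
# but misclassified, so the filter tells the planner not to act.
print(should_react(Detection("unknown", 0.9)))     # False
print(should_react(Detection("pedestrian", 0.9)))  # True

The danger of this design is exactly what the report alleges: a single misclassification means a genuine hazard gets filtered out along with the litter.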

Uber has so far refused to comment, owing to the current National Transportation Safety Board (NTSB) investigation into the accident. The company suspended its testing programme immediately after the fatal incident in March.

Comments

Anonymous
05/08/2018 - 13:09 | 141 | 1
Anonymous

In reply to Anonymous

Yeah, those 6502 processors man.

05/08/2018 - 17:13 | 4 | 0
TheMindGarage

This is just plain unacceptable. If this technology is to be tested on public roads, Uber (and other companies) have to be more transparent about what the car saw and tried to do. The fact that this took so long to come out suggests they tried to cover it up.

05/08/2018 - 13:16 | 58 | 4

Are you maing those spellinn mistake on purposs?

05/08/2018 - 15:44 | 4 | 3

She would have died whether the car braked or not, so you can understand why the car wasn’t programmed to brake. It was an unlit road at night and she walked straight into the path of a vehicle travelling at 60mph. The result wouldn’t have changed no matter how it was programmed.

05/19/2018 - 21:30 | 0 | 0
Litwiller

“Litter in the road”? I heard the woman was homeless, but jeeez, is that what we’re calling the homeless these days?

05/08/2018 - 13:20 | 75 | 1
Griffin Mackenzie

Yikes

05/08/2018 - 13:27 | 2 | 1
Anonymous

Wow, even Mazdas stop when a person is detected. They should collaborate for more help. This technology should be shared for the better instead of monopolized.

05/08/2018 - 13:31 | 3 | 1
Dante Verna

I had a feeling that this was what happened…

05/08/2018 - 13:31 | 1 | 0
Kenan

The chip inside it told it to execute Order 66.

05/08/2018 - 13:49 | 12 | 1
Anonymous

I hope this technology dies.

05/08/2018 - 13:54 | 20 | 2
TheMindGarage

In reply to Anonymous

I’m not against Level 5 autonomy as long as it means people can still drive if they want to. The real problem is the halfway house that is Level 3: people (other than driving instructors) cannot be trusted to stay 100% alert when using it.

05/08/2018 - 15:48 | 11 | 2
Burnout🔰(Rotary Fighter)(SaveCT)

Never trust these self-driving cars…

05/08/2018 - 14:16 | 7 | 1