Tesla knew its customers were likely to misuse its semi-autonomous Autopilot system, and tested the ways that could happen, according to a government report released today on last year's fatal accident in Florida.
“It appears that over the course of researching and developing Autopilot, Tesla considered the possibility that drivers could misuse the system in a variety of ways, including those identified above — i.e., through mode confusion, distracted driving, and use of the system outside preferred environments and conditions,” the report from the National Highway Traffic Safety Administration reads.
Tesla engineers considered that drivers might fail to pay attention, fall asleep, or become incapacitated while using Autopilot. So they ran their vehicles through different scenarios to evaluate how Autopilot would handle each.
“The potential for driver misuse was evaluated as part of Tesla’s design process and solutions were tested, validated, and incorporated into the wide release of the product,” the report states. “It appears that Tesla’s evaluation of driver misuse and its resulting actions addressed the unreasonable risk to safety that may be presented by such misuse.”
The government exonerated Tesla for the May 2016 accident that killed Florida resident Joshua Brown, and won't be ordering a recall of any of the carmaker's vehicles. The driver of the tractor-trailer that Brown's Tesla Model S crashed into claimed that Brown may have been watching a Harry Potter movie at the time of the accident, and the Florida Highway Patrol told Reuters that a portable DVD player was found in the vehicle.
Source: http://www.theverge.com/2017/1/19/14326024/tesla-autopilot-misuse-nhtsa-report-fatal-accident