Tesla claims that its auto-pilot feature is saving lives, but it has yet to release detailed data that definitively supports this claim.
Tesla auto-pilot vs. Tesla driver-controlled comparisons needed
The one fatality in 130 million auto-pilot miles, set against a US average of one fatality per 94 million miles, does not provide a meaningful metric. First, a single fatality does not yield a reliable average. This lack of depth is not Tesla’s fault; it will simply take many more auto-pilot miles driven to produce a normalized measurement. Additionally, Teslas are known for industry-leading safety, especially when it comes to drivers walking away from serious accidents. A more accurate comparison would be the number of fatal accidents in driver-operated Teslas against the total driver-operated miles logged, versus the same figures for the auto-pilot system. This article from December 22, 2015 claims 4 deaths for Tesla under driver control. If we were given the total miles driven under driver control, we would get a better sense of the rarity of a fatal event in a Tesla. For the record, in two of the driver-controlled accidents, the drivers drove off cliffs, which Tesla claims its auto-pilot feature would have prevented.
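To make the unreliability of a single data point concrete, here is a minimal sketch, assuming Python with scipy (1.7 or later) and using only the mileage figures quoted above, that computes an exact 95% Poisson interval around one observed fatality:

```python
# A minimal sketch of why one fatality is not a reliable average:
# an exact 95% Poisson confidence interval around a single observed event.
# The mileage figures come from the article; everything else is standard
# statistics, not Tesla data.
from scipy.stats import chi2

autopilot_miles = 130e6           # auto-pilot miles (from the article)
observed_fatalities = 1
us_avg_miles_per_fatality = 94e6  # US average (from the article)

# Exact Poisson 95% interval for the expected count given k observed events.
k = observed_fatalities
lower = 0.5 * chi2.ppf(0.025, 2 * k)
upper = 0.5 * chi2.ppf(0.975, 2 * k + 2)

per_100m = lambda count: count / autopilot_miles * 100e6
print(f"observed rate: {per_100m(k):.2f} fatalities per 100M miles")
print(f"95% interval:  {per_100m(lower):.2f} to {per_100m(upper):.2f}")
print(f"US average:    {1 / us_avg_miles_per_fatality * 100e6:.2f}")
```

The interval spans roughly 0.02 to 4.3 fatalities per 100 million miles, comfortably containing the US average of about 1.06. In other words, the single auto-pilot fatality is statistically consistent with auto-pilot being either far safer or far more dangerous than the average driver.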
Non-fatal event metrics, please
Tesla should be able to produce a report showing the number of serious and minor accidents under driver control and under auto-pilot. Because fatal accidents are so rare, the non-fatal accident data would support a far more reliable comparison, along the lines sketched below.
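As an illustration, a standard exact test for comparing two Poisson rates conditions on the total accident count, under which the driver-controlled share of accidents is binomial. The counts and mileages below are hypothetical placeholders, not Tesla data; only the approach is being suggested:

```python
# A hedged sketch of how Tesla could compare non-fatal accident rates
# under driver control vs. auto-pilot. The counts and mileages below are
# hypothetical placeholders, not real Tesla data. Conditional on the total
# count, the split follows a binomial, giving an exact two-rate test.
from scipy.stats import binomtest

driver_accidents, driver_miles = 260, 1.0e9       # hypothetical
autopilot_accidents, autopilot_miles = 22, 130e6  # hypothetical

total = driver_accidents + autopilot_accidents
expected_driver_share = driver_miles / (driver_miles + autopilot_miles)

result = binomtest(driver_accidents, total, expected_driver_share)
print(f"driver rate:     {driver_accidents / driver_miles * 1e8:.1f} per 100M miles")
print(f"auto-pilot rate: {autopilot_accidents / autopilot_miles * 1e8:.1f} per 100M miles")
print(f"p-value for 'rates are equal': {result.pvalue:.3f}")
```

Conditioning on the total count sidesteps estimating either rate in isolation, which matters when one exposure (auto-pilot miles) is much smaller than the other.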
To further improve the auto-pilot analysis, it would be helpful to see the number of sudden decelerations and abrupt course corrections with and without auto-pilot. The auto-pilot, if implemented and working properly, should outperform the human in these areas.
Sudden driver take-over events would also be valuable, especially when one is quickly followed by a sudden deceleration or abrupt course change. Such events would indicate that the auto-pilot may have created less safe conditions than the driver would have on their own. One major problem with Tesla’s auto-pilot is its disconnection from the GPS/destination route. Example: if a driver is traveling through Los Angeles and has to get over two or three lanes to make it to the I-405 from the I-5, the auto-pilot and the GPS are not talking to each other. Once the driver notices the issue, there will be a last-minute take-over to traverse the lanes and avoid missing the turn-off. This is not good. GPS and auto-pilot need to work in concert to protect against this unsafe use-case.
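To show what mining these take-over events from vehicle logs might look like, here is a hedged sketch; the telemetry schema, field names, and thresholds are all hypothetical assumptions, not Tesla’s actual data format:

```python
# A sketch of the take-over metric described above: flag moments where the
# driver disengages auto-pilot and, within a few seconds, brakes hard or
# swerves. The telemetry schema (seconds, m/s^2, degrees/second) and the
# thresholds are hypothetical; real logs would differ.
from dataclasses import dataclass

@dataclass
class Sample:
    t: float           # seconds since trip start
    autopilot: bool    # auto-pilot engaged at this sample
    accel: float       # longitudinal acceleration, m/s^2 (negative = braking)
    steer_rate: float  # steering-wheel rate, degrees/second

HARD_BRAKE = -3.0    # m/s^2, assumed threshold
HARD_SWERVE = 90.0   # deg/s, assumed threshold
WINDOW = 3.0         # seconds after take-over to inspect

def risky_takeovers(samples):
    """Return times of auto-pilot disengagements followed by an abrupt maneuver."""
    events = []
    for prev, cur in zip(samples, samples[1:]):
        if prev.autopilot and not cur.autopilot:  # driver took over here
            after = [s for s in samples if cur.t <= s.t <= cur.t + WINDOW]
            if any(s.accel <= HARD_BRAKE or abs(s.steer_rate) >= HARD_SWERVE
                   for s in after):
                events.append(cur.t)
    return events

# Example: a take-over at t=10.0 followed by hard braking at t=11.0
trip = [Sample(9.0, True, 0.0, 5.0),
        Sample(10.0, False, -1.0, 10.0),
        Sample(11.0, False, -4.5, 20.0)]
print(risky_takeovers(trip))  # -> [10.0]
```

Counting such events per million miles, with and without a route-change pending, would directly quantify the GPS/auto-pilot disconnection problem described above.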
Consider a name change
Until GPS and “auto-pilot” are talking to each other, Tesla should think hard about renaming the technology. I have heard a number of alternative name suggestions since the fatal accident was reported. The one that seems most viable is “driver-assistance”. By rebranding, Tesla would be signaling to its customers that they do indeed need to stay alert even in “driver-assistance” mode, similar to how drivers treat cruise control.
We knew this day would come
Car accidents are dangerous, and auto-pilot technology is still being perfected. Is the Tesla auto-pilot saving lives overall? Perhaps. Will this put a fire under Tesla engineers to push technology enhancements even faster? Most assuredly. Would Joshua Brown be alive today if he had been driving alertly when the truck crossed his path? Sadly, most likely.
I am a huge proponent of the benefits and merits of self-driving technology. This accident puts a spotlight on the technology, and measures will be needed to assure the public that auto-pilot technology is beneficial, and not just slightly better but substantially superior. Until Tesla can push driver take-over events down to a vanishingly low number, Tesla drivers need to attentively supervise their vehicles. Although Tesla may not want to loudly expose the limitations of its auto-pilot features, it may be necessary. By making these admissions public, Tesla drivers would have a better understanding of how the technology works and in which scenarios the auto-pilot is weakest. Tesla may also want to present a road-map of the milestones it is working toward to solve these issues. The technology is hard. The scrutiny is high. The margin for error is low. Communication and education are challenging. Self-driving is the future, but no one said it was going to be easy.