Although the details are still emerging about the Highway 101 Tesla Model X fatality, it seems highly likely that the accident occurred while the vehicle was in self-driving (driver-assist) mode.
The entire front end of the vehicle was sheared off. From all early indications, the Tesla accident appears to have happened at highway speeds.
If you have watched any Tesla self-driving videos, you will understand that Tesla’s self-driving technology puts a high emphasis on understanding lane lines. The technology also likes to follow lead cars. To simplify the concept, it is:
- follow the lead car at a recommended car length distance, and
- stay in the lane lines.
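The two rules above can be sketched as a toy control loop. This is purely illustrative; the function, gains, and thresholds are invented here and are not Tesla's actual control logic.

```python
# Toy sketch of the two rules above: follow the lead car at a safe gap,
# and steer toward the lane center. All names and constants are
# hypothetical illustrations.

def control_step(gap_m, lead_speed_mps, own_speed_mps, lane_offset_m):
    """Return (acceleration, steering) commands for one time step."""
    DESIRED_GAP_M = 2.0 * own_speed_mps   # roughly a 2-second following distance
    GAP_GAIN = 0.3
    SPEED_GAIN = 0.5
    STEER_GAIN = 0.5

    # Rule 1: follow the lead car at the recommended distance.
    accel = (GAP_GAIN * (gap_m - DESIRED_GAP_M)
             + SPEED_GAIN * (lead_speed_mps - own_speed_mps))

    # Rule 2: stay in the lane lines (steer back toward lane center).
    steering = -STEER_GAIN * lane_offset_m

    return accel, steering

# Example: too close to the lead car and drifting right, so the commands
# are to decelerate and steer left.
accel, steering = control_step(gap_m=30.0, lead_speed_mps=25.0,
                               own_speed_mps=25.0, lane_offset_m=0.4)
```

Note that neither rule involves a stationary obstacle directly ahead, which is exactly the gap this article is about.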
What the Uber self-driving pedestrian fatality in Arizona and the Tesla 101 accident have in common is that the vehicles failed to register an object in their direct path. In the Uber case, a pedestrian was slowly walking across the road. In the Tesla accident, the Model X slammed into a dividing barrier.
In the above Google Street View, you see a long run-up to the barrier. The white lines lead straight to the barrier.
Here are the questions:
- Did the Tesla auto-pilot technology register this non-lane as a legitimate lane?
- Why did the vehicle not register the barrier and slow down or change lanes?
- Did the vehicle register the barrier but panic and not know whether to brake or change lanes?
- Was the Model X following another vehicle into this non-lane?
Let’s consider this hypothetical situation.
1. The Model X is following a lead car. The lead car inadvertently enters this dead lane.
2. The Model X follows the lead vehicle into the dead lane. It reestablishes the white lines as a legitimate lane (incorrectly). The lead car exits the dead lane.
3. The Model X stays in the lane which it believes is a legitimate lane. The Model X does not recognize or know what to do with the barrier now that it has lost the lead vehicle.
If a Model X were bat-powered
Self-driving cars that broadcast concentrated pings far enough ahead of the vehicle to stop when a ping comes back indicating a barrier could prevent this type of accident.
The red ping response signals to the Model X that there is a barrier ahead. Given the unexpected response ping, the Model X should increase its ping rate to get a better fix on the cause. When subsequent pings confirm that a barrier or object is immediately in its path, the Model X should begin braking.
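The escalation logic described above can be sketched as a small state machine: an unexpected ping return raises the ping rate, and repeated confirmations trigger braking. The rates and confirmation count here are invented for illustration.

```python
# Hedged sketch of the ping-escalation idea: a positive return bumps the
# ping rate; enough consecutive confirmations trigger braking. All
# thresholds and rates are hypothetical.

BASE_RATE_HZ = 10
ALERT_RATE_HZ = 50
CONFIRMATIONS_TO_BRAKE = 3

def process_pings(returns):
    """returns: iterable of booleans, True meaning a ping hit an object
    in the direct path. Yields (action, ping_rate_hz) after each ping."""
    rate = BASE_RATE_HZ
    hits = 0
    for hit in returns:
        if hit:
            hits += 1
            rate = ALERT_RATE_HZ      # get a better fix on the object
            if hits >= CONFIRMATIONS_TO_BRAKE:
                yield ("brake", rate)
            else:
                yield ("escalate", rate)
        else:
            hits = 0
            rate = BASE_RATE_HZ       # path clear, back to cruising
            yield ("cruise", rate)

actions = list(process_pings([False, True, True, True]))
```

Requiring a few consecutive confirmations before braking is one way to avoid slamming the brakes on a single spurious return; the tradeoff is a slightly longer reaction time.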
Cameras have a depth-perception issue. LiDAR, or better yet a dedicated laser, may serve the ping echolocation requirement.
When a self-driving car lacks or loses a lead car, it needs to place more weight on laser pings and direct-path LiDAR data. The laser pings and LiDAR data can be used to recognize stationary or slow-moving objects that are in, or moving toward, the vehicle's immediate path. When a ping return contradicts what the cameras believe (for example, an object the cameras judge to be farther away or above the clearance height of the vehicle), the vehicle should accelerate its ping rate and focus its attention on its direct path. The vehicle should be prepared to slow rapidly if the pings continue to come back positive.
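The sensor-disagreement rule above can be sketched as a simple decision function: trust the ping when it reports an object closer than the cameras do, or when the clearance-height judgment is in doubt. The distances and thresholds here are assumptions for illustration only.

```python
# Sketch of the direct-path check described above. When a ping return
# says an object is closer than the cameras believe, or clearance is
# uncertain, escalate; when the ping range is short, brake. All ranges
# and thresholds are invented for this example.

def assess_path(camera_range_m, ping_range_m, clearance_ok):
    """Decide an action for the vehicle's direct path.

    camera_range_m: distance to the nearest object per the cameras
    ping_range_m:   distance per the laser/LiDAR ping, or None if no return
    clearance_ok:   True if the object is judged to be above the
                    vehicle's clearance height
    """
    DISAGREEMENT_M = 10.0   # camera/ping mismatch worth escalating over
    STOP_RANGE_M = 40.0     # range at which braking should begin

    if ping_range_m is None:              # no return: path looks clear
        return "cruise"
    if ping_range_m < STOP_RANGE_M:       # confirmed object, close ahead
        return "brake"
    if camera_range_m - ping_range_m > DISAGREEMENT_M or not clearance_ok:
        return "escalate_pings"           # focus attention on the direct path
    return "cruise"

# Example: cameras think the barrier is far away, but the ping says it
# is only 35 m out, so the decision is to brake.
action = assess_path(camera_range_m=120.0, ping_range_m=35.0, clearance_ok=True)
```

In the Highway 101 scenario this article hypothesizes, the camera-based lane logic had high (misplaced) confidence, which is why a rule that lets a short ping return override the cameras matters.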