Quote:
Originally Posted by strat61caster
I can't come up with an analogy that accurately describes how astronomically implausible that scenario is.
And I can't come up with another scenario that explains how the system failed so miserably that the car was carrying enough speed to not only go under the trailer, which would have sheared off the roof, but still have enough momentum to go through two fences and hit a pole.
I'm not saying the programmers specifically failed to foresee a tractor/trailer crossing in front of the car, but the bottom line here is that software, in general, does not "learn"; it can only handle the scenarios it has been programmed to handle.
For example, we don't know if the system is programmed to judge clearance. The example earlier in this thread, where a different Tesla rolled itself under a truck with an overhanging load, suggests that may be an issue. That could be a contributing factor (just a theory with no basis in fact except the one photo).
A human who had learned to drive only on a desert road with no clearance concerns would still know, from other experience, that he/she could not drive underneath a tractor/trailer parked across the road just because there is a gap below it. A computer that was never taught to look 'up' above the top of the hood would not know that and would think all was well.
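To make the "never taught to look up" point concrete, here is a toy sketch in Python. It has nothing to do with Tesla's actual code; every name and number is made up for illustration. The idea is simply that a check which only considers obstacles low enough to block the bumper reports a clear path under a high-riding trailer, while a clearance-aware check does not.

from dataclasses import dataclass

CAR_ROOF_HEIGHT_M = 1.44   # roughly a Model S roof height, for illustration only
SENSOR_BAND_TOP_M = 1.0    # assumed upper edge of what the forward sensor "sees"

@dataclass
class Detection:
    distance_m: float       # distance ahead of the car
    bottom_edge_m: float    # height of the object's lowest edge above the road

def naive_path_clear(detections: list[Detection]) -> bool:
    # Flags a hazard only if the object's lower edge falls inside the sensor band.
    return all(d.bottom_edge_m > SENSOR_BAND_TOP_M for d in detections)

def clearance_aware_path_clear(detections: list[Detection]) -> bool:
    # Also requires enough vertical gap for the roof to pass underneath.
    return all(d.bottom_edge_m > CAR_ROOF_HEIGHT_M for d in detections)

# A trailer body crossing the road, lowest edge about 1.2 m above the pavement:
trailer = Detection(distance_m=40.0, bottom_edge_m=1.2)
print(naive_path_clear([trailer]))            # True  -> keeps driving
print(clearance_aware_path_clear([trailer]))  # False -> would brake

The naive version is "happy" because the gap under the trailer looks like open road; only the version that was explicitly told how tall the car is catches the problem.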
I am not underestimating Musk, but he didn't personally write the software in the Tesla. He may be the brains behind it, and he is certainly the money behind it, but he did not personally verify every system. There is pretty much no system in the world that doesn't have some level of bugs, or what seems like an obvious miss in hindsight.
Admittedly, I'm just guessing, as we all are. We will know one day what happened here, and let's face it, it will happen again, either in this way or in another.
Heck, a person willing to put in the time could take an Autopilot car and turn it into a suicide drone. That is what scares me about it the most.