Tesla's Autopilot sensors also have some notable blind spots, specifically objects hanging from above or sitting below the front grille. This first came to light after someone turned on the Summon feature and the car casually (and slowly) ran into a parked trailer.
http://jalopnik.com/man-claims-his-t...all-1776021675
There are other sources as well, but this was the first that came to mind. The lead image in the article shows exactly how something like the accident can happen.

That was a parked-trailer crash, not the fatal one, obviously (hopefully).
A lot of people are using this as an excuse to make blanket statements like "all cars have to be autonomous for them to work, outlaw all drivers" or "autonomous cars will never work." I can dig up examples just as easily as Google can, but it's all nonsense. While Tesla arguably makes a good product, the Autopilot isn't fully baked yet, and a "public beta" is a bad idea. Turn it off. Let the engineers work out the blind spots in the sensor net, and then take it back to the public. This accident was the result of rushing something to market that had no business being there, and it will likely have a negative impact on the advancement and adoption of the technology in the future.
Keep in mind I say that while still not being willing to give up my third pedal, let alone my steering wheel. Still, I understand the advantages for those who view driving as a necessity rather than a hobby.
ETA: To be clear, the accident is almost certainly the truck driver's fault for taking a left across oncoming traffic. My point is about how the Autopilot didn't stop the car when a truck had pulled far enough in front of it that the trailer was what the car struck.