Old 04-20-2021, 04:41 PM   #284
Irace86.2.0
Senior Member
 
Join Date: Mar 2017
Drives: Q5 + BRZ + M796
Location: Santa Rosa, CA
Posts: 7,884
Thanks: 5,668
Thanked 5,810 Times in 3,300 Posts
Mentioned: 70 Post(s)
Tagged: 0 Thread(s)
Quote:
Originally Posted by Dadhawk
I am saying that about the consumers using this, but I also feel Tesla is more to blame. Your own example of them releasing "beta" code to be used in a public setting, even to a limited number of users, puts them at least partially at fault here.

I go back to my example about Summon mode. Do you think Tesla is willing to accept blame if that inadvertently runs over an infant, or even a stray black cat lying on the black tarmac that somehow doesn't get picked up by the sensors, or is it the fault of the car owner? At that point, Tesla is driving the car.

As far as the tweet, yes, I agree it's Tesla's responsibility to protect their users' data. I also think it is a perfect example of how Musk is using the situation for his own purposes rather than addressing the concern. So, if the car wasn't on Autopilot, did the phantom driver just leave the car? Did Autopilot turn itself off right before the accident, so technically, yeah, it wasn't on when it hit the tree, but it was on 5 seconds before?

Show me another manufacturer that is having these same issues with a similar type of system and I'll agree, yeah, they are being irresponsible. Like it or not, Tesla, and Musk, are being irresponsible at this point, IMO.
I think Tesla is to blame if the car and its systems were used within their intended capabilities and the system is still at fault. For example, if the car suddenly accelerated out of nowhere or jerked the steering wheel in one direction, then that is on Tesla. But say someone summoned the car in Summon mode, the driver did not pay attention like they are supposed to, and the car hit someone or something while it was in the wrong (I don't know how this happens; do you have an example?), then I suppose that implies only partial fault on Tesla. In the Summon failures I have seen, the problem looked like driver negligence or like another vehicle caused it. Do you have an example of the car's systems failing to react during Summon, or was that more of a hypothetical? If it failed to pick up a black cat, then that is on Tesla. If it failed to pick up a person lying on the ground in all black, then that is partly on the person in black and partly on Tesla, the same way it would be on the person in black and on a human driver who doesn't see them, although that driver would likely not be blamed; it would depend on the degree of negligence.

Once the warrant is served, the police will get the specifics, but in my opinion Tesla shouldn't volunteer anything until then, to protect our Fourth Amendment rights. I'm curious as well. If I read the article carefully enough, it sounds like they were test driving the car. I don't know if it came from a private seller or a used car dealership, but it sounds like they were told the car could drive itself, which is why they were in the front passenger and rear seats. Was there ever a driver who then moved to the passenger seat, or did someone leave the scene? I don't know. Were tricks used and warnings dismissed to allow the car to drive without a driver? Most likely. I think Musk is saying just enough to handle the negative PR while not giving more, so he doesn't run into a Fourth Amendment issue, and both of those seem reasonable.

Well, following my earlier point, it would be good to be logically consistent across all manufacturers and all systems, and not special-plead for Tesla or for driver assistance packages alone. But to answer your question, Tesla's system has been on the market much longer and, unless I am mistaken, it is available for use in many more locations. Is Super Cruise, for instance, only available on certain roads or highways that have been lidar mapped? I don't know if Tesla's systems are directly comparable. How many Super Cruise vehicles are on the road, and how limited are their capabilities? Maybe that is a point in your favor, but it seems Tesla's approach is about less babysitting, fewer regulations, and fewer nannies, kind of like my gun ownership example or my other driver/pedestrian protection and limitation examples.

https://gmauthority.com/blog/gm/gene...-super-cruise/

I'm curious whether Tesla has settled cases or has just been lucky not to lose lawsuits, but I am unaware of a case where the systems were found at fault. Is that telling, or is it just a case of Tesla paying off plaintiffs? I don't know. All I am aware of is the lawsuit in Germany that found the language in Musk's tweets misleading, partly because his timelines are off, like saying the system would be fully autonomous by a given date when that didn't happen, and partly because the tweets aren't clear enough about what level of autonomy the systems are actually capable of.
__________________
My Build | K24 Turbo Swap | *K24T BRZ SOLD*
The Following 2 Users Say Thank You to Irace86.2.0 For This Useful Post:
Dadhawk (04-20-2021), Spuds (04-20-2021)