Quote:
Originally Posted by Irace86.2.0
It is literally a FSD beta. It is FSD. IT IS FSD!!!
|
So then, you are agreeing Tesla is misrepresenting their product, except you are saying they are underselling it? Didn't this start because Musk was quoted as saying FSD wasn't really FSD?
Quote:
Originally Posted by Irace86.2.0
Would it make you feel better if Tesla hired thousands of people to perfect FSD beta on the road instead of allowing users who are being monitored?
|
Yes, because Tesla would be fully responsible for any mishaps and couldn't slough them off as "oh, those wacky users, you know, they didn't act like we said they should."
Quote:
Originally Posted by Irace86.2.0
So far this thread has been full of theory and conjecture. Does anybody have any data to show Autopilot or FSD beta is responsible for an avoidable crash that involved injury to another vehicle’s occupants or to a pedestrian, but more specifically, that the incidence of these events is greater than these events without Autopilot or FSD beta?
|
Pretty much any accident where Autopilot/FSD is engaged is potentially one that could have been avoided. That's pretty much true of any accident, whether human-driven, self-driven, or on a bus.
By the same token, we do not have statistics on how many potential accidents were avoided because the human driver took action to prevent them. I personally experience that kind of situation once or twice a week.
I don't really trust the statistics on self-driving yet, but I do believe it will ultimately be safer. Right now, though, the small pool of testers, even if statistically significant, doesn't represent all the scenarios where accidents occur. The "human" statistics include all weather and road conditions; the FSD stats do not.
I suppose it's possible they're peeling the onion back a bit and limiting the human stats to comparable conditions, but I have not seen evidence of that.