Quote:
Originally Posted by RToyo86
Calling something "full self driving" and then sticking a tiny disclaimer underneath that says "no, it actually isn't self driving" is clickbait advertising.
Matt Farah and Jason Fenske had an interesting chat about it recently. One of them summed it up like buying a box that says cereal on it, and at the bottom of the box in small writing it says "not cereal".
There is also a moral question: this stuff is all "beta", which means not finished, and there are implications to using it on the road around other drivers who haven't signed up for the service. I'm paraphrasing their conversation, but it was interesting nonetheless.
I'm more or less neutral on the subject, apart from hating the term Tesla used for their fancy cruise control.
It is literally an FSD beta. It is FSD. IT IS FSD!!! I don't know how many times I have to repeat myself. The analogy is off. It isn't a box of...rocks...labeled a box of cereal, or something. It is a box of cereal with a recipe that is still being perfected, but once it is available on the shelf, it will be a normal box of cereal.
Would it make you feel better if Tesla hired thousands of people to perfect the FSD beta on the road, instead of allowing owners who are being monitored to do it?
So far this thread has been full of theory and conjecture. Does anybody have data showing that Autopilot or the FSD beta was responsible for an avoidable crash that injured another vehicle's occupants or a pedestrian, and more specifically, that the incidence of such events is higher with Autopilot or FSD beta than without?