Toyota GR86, 86, FR-S and Subaru BRZ Forum & Owners Community - FT86CLUB (https://www.ft86club.com/forums/index.php)
-   Other Vehicles & General Automotive Discussions (https://www.ft86club.com/forums/forumdisplay.php?f=6)
-   -   Tesla Autopilot failed to notice a tractor trailer (https://www.ft86club.com/forums/showthread.php?t=107810)

mdm 07-01-2016 12:31 AM

Tesla Autopilot failed to notice a tractor trailer
 
https://www.washingtonpost.com/news/...-on-autopilot/

JD001 07-01-2016 08:51 AM

Yes, saw this story break in the UK. Seriously, if you are driving on Autopilot, would you completely switch off from the task of driving? Not that I'm saying this poor driver wasn't paying attention, but it reads like the driver was in auto mode in the mental sense.

krayzie 07-01-2016 08:53 AM

[ame="https://www.youtube.com/watch?v=sXls4cdEv7c"]Tesla Model S driver caught sleeping at the wheel while on Autopilot - Electrek - YouTube[/ame]

JD001 07-01-2016 08:56 AM

Just can't design for idiots.

Dadhawk 07-01-2016 09:07 AM

Not surprised; it was inevitable that this would happen regardless of the technology. Even if it worked perfectly, there are circumstances that would have led to this, just as human-driven cars cannot avoid all fatal situations.

What I did find interesting is how far the car traveled after going under the truck, going through two fences, and impacting a power pole. It seems like there was more than one failure of the autopilot system.

https://img.washingtonpost.com/wp-ap...tion.jpg&w=480

ScoobsMcGee 07-01-2016 09:49 AM

Tesla's autopilot sensors also have some notable blind spots, specifically items hanging from the ceiling or sitting below the front grille. This first came to light after someone turned on the Summon feature and the car casually (and slowly) ran into a parked trailer.

http://jalopnik.com/man-claims-his-t...all-1776021675

There are other sources as well, but this was the first that came to mind. The lead image in the article shows exactly how something like the accident can happen.

http://roa.h-cdn.co/assets/16/19/980...38345559-o.jpg
That's the parked-trailer crash, not the fatal one (obviously, I hope).

There are a lot of people using this as an excuse to make various blanket statements such as "all cars have to be autonomous for them to work, outlaw all drivers" or "autonomous cars will never work." I can dig up examples just as easily as Google can, but it is all nonsense. While Tesla arguably makes a good product, the autopilot isn't fully baked yet, and a "public beta" is a bad idea. Turn it off, let the engineers work out the blind spots in the sensor net, and then take it back to the public. This accident was the result of rushing something to market that had no business being there, and it will likely have a negative impact on the advancement and adoption of the technology in the future.

Keep in mind I say that while still not willing to give up my third pedal, let alone my steering wheel. Still, I understand the advantages for those who view driving as a necessity rather than a hobby.

ETA: To be clear, the accident is almost certainly the truck driver's fault for taking a left across oncoming traffic. My point is about how the autopilot didn't stop the car when a truck had pulled far enough in front of it that the trailer was what it struck.

FX86 07-01-2016 11:02 AM

I work in the IT field, and I learned a long time ago that computers suck and don't work properly.

I would NEVER EVER trust a computer to drive a car for me.

JD001 07-01-2016 11:39 AM

Quote:

Originally Posted by FX86 (Post 2694332)
I work in the IT field, and I learned a long time ago that computers suck and don't work properly.

I would NEVER EVER trust a computer to drive a car for me.

I suppose the old adage 'turn it off and turn it back on again' won't work driving down the highway at high speed.

mdm 07-01-2016 11:49 AM

Quote:

Originally Posted by ScoobsMcGee (Post 2694280)
Let the engineers work out the blind spots in the sensor net


I think it's going to be much harder than most people think. I suspect it's not about a literal blind spot, but about the ability of the computer to interpret complex visual scenes. Possibly the computer was still able to see the road ahead underneath the trailer and interpreted the scene as if there were no obstacle.


We are still far from understanding how the brain does it, and even further from being able to replicate it in a machine that operates on very different computational principles.
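
To make that concrete, here is a toy sketch (purely hypothetical, not Tesla's actual logic; every name and number is an assumption for illustration) of how a perception rule that discards high-mounted returns as overhead structures could wave a high-riding trailer through:

Code:

# Hypothetical toy model of a naive obstacle filter. NOT Tesla's code:
# the assumption here is that the stack rejects returns mounted well
# above the road as "overhead structures" (bridges, signs) so it does
# not brake for every gantry it passes under.

from dataclasses import dataclass

@dataclass
class Return:
    distance_m: float  # range to the detected surface
    bottom_m: float    # height of the object's lower edge above the road
    top_m: float       # height of the object's upper edge

CAR_ROOF_M = 1.4       # rough roofline of a low sedan (assumed number)

def is_braking_obstacle(r: Return) -> bool:
    # Naive rule: brake only for objects occupying the space the car's
    # body sweeps through. A trailer bed whose underside sits near roof
    # height looks just like an overhead sign to this rule.
    return r.bottom_m < CAR_ROOF_M

overhead_sign = Return(distance_m=80.0, bottom_m=5.5, top_m=6.5)
trailer_side  = Return(distance_m=80.0, bottom_m=1.2, top_m=4.0)

print(is_braking_obstacle(overhead_sign))  # False (correctly ignored)
print(is_braking_obstacle(trailer_side))   # True, but only because 1.2 < 1.4;
# raise bottom_m a few tenths (a taller trailer, a crest in the road, a
# noisy height estimate) and the same rule calls the lane "clear".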

gravitylover 07-01-2016 11:50 AM

I also read that the driver was watching a movie when the crash occurred. Maybe he shouldn't have been :iono:

JD001 07-01-2016 11:55 AM

Quote:

Originally Posted by gravitylover (Post 2694384)
I also read that the driver was watching a movie when the crash occurred. Maybe he shouldn't have been :iono:

Let's hope it wasn't Vanishing Point.

Stang70Fastback 07-01-2016 12:01 PM

I am jumping to conclusions here, but this guy comes across as a bit of an idiot. This is the same guy whose video went viral a while back when his car swerved out of the way of a truck merging into his lane. He apparently decided that Autopilot meant he didn't have to drive at all. I feel bad that he's dead, but the way he was acting, it sounds like he had it coming.

Tcoat 07-01-2016 12:16 PM

Quote:

Originally Posted by FX86 (Post 2694332)
I work in the IT field, and I learned a long time ago that computers suck and don't work properly.

I would NEVER EVER trust a computer to drive a car for me.

http://i.imgur.com/IXrKE8b.jpg


https://media.giphy.com/media/xTiTnL...LhCw/giphy.gif

FX86 07-01-2016 12:18 PM

https://ircimg.net/bsod2.jpg

strat61caster 07-01-2016 12:26 PM

Quote:

Originally Posted by JD001 (Post 2694363)
I suppose the old adage 'turn it off and turn it back on again' won't work driving down the highway at high speed.

lol, it absolutely worked in the F-150 I learned to drive in when the coil packs were going out: the truck would shut off on the freeway, I'd coast to the side of the road, give it five minutes, start it up, and keep going.

It's also the solution to overheating if you're brave and poor: turn it off, let it cool, turn it on, go a few more miles, turn it off...

Stang70Fastback 07-01-2016 12:34 PM

Quote:

Originally Posted by strat61caster (Post 2694443)
It's also the solution to overheating if you're brave and poor: turn it off, let it cool, turn it on, go a few more miles, turn it off...

Ahh, that brings back so many memories of my old Outback. Like the time I kept turning the ignition on at midnight to see if the temp had dropped enough, and a cop saw a suspicious wagon with its lights turning on and off, parked in the dark lot of a closed bank at 1 AM. He pulled up behind me with his lights off right as I decided my engine had cooled enough, so I started the car and went to drive away just as he got behind me. Scary as fuck when a bunch of flashing lights and a siren suddenly come on RIGHT behind you in the middle of fucking nowhere.

darthpnoy1984 07-01-2016 12:40 PM

No technology is perfect. Or should I say, nothing in life is perfect.


Sent from my iPhone using Tapatalk

Dadhawk 07-01-2016 01:00 PM

Quote:

Originally Posted by JD001 (Post 2694363)
I suppose the old adage 'turn it off and turn it back on again' won't work driving down the highway at high speed.

Actually, I have done this, in the FR-S.

The original 2013 Pioneer base stock radio had a habit of dropping its pairing with my phone. The only way to resolve it was to power down the radio completely (turning it off with the switch did not suffice): put the car in neutral, turn the key off but not to the locked position, and restart the car. Engaging the starter temporarily cuts power to all the accessories, essentially rebooting the radio.

I didn't say it was a wise thing, just that I did it.

NemesisPrime909 07-01-2016 01:14 PM

I never trust driverless cars; there are just too many variables on the road for a computer to detect. I don't care how advanced you claim the cars are, a computer can't drive in free space without a physical guidance system.

I bring up the notion of the automated warehouse. There is a guidance system built into the floor which keeps all the moving units in place, and the units have sensors that make them "aware" of each other.

Currently, driverless cars use a combination of sensors and satellites to guide the car where it needs to go. In theory, this should work fine. However, because driverless cars share the road with human-operated cars, the opportunities for error multiply.
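
For what it's worth, "sensors and satellites" in practice means fusing a noisy GPS fix with local sensor estimates rather than trusting either alone. A minimal sketch of that idea, an inverse-variance weighted average with made-up numbers (not any manufacturer's implementation):

Code:

# Minimal illustration of sensor fusion: combine two noisy position
# estimates (GPS and wheel odometry) weighted by how much we trust each.
# Toy numbers throughout; not any real vehicle's parameters.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Inverse-variance weighted average of two 1-D estimates."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# GPS says 103.0 m down the road but is only good to ~5 m; odometry
# says 100.2 m and is good to ~0.5 m over short distances.
pos, var = fuse(103.0, 5.0**2, 100.2, 0.5**2)
print(f"fused position: {pos:.2f} m (sigma ~{var ** 0.5:.2f} m)")
# Prints ~100.23 m: the fused estimate hugs the odometry because it is
# far more certain; the GPS mostly corrects slow drift over time.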

NemesisPrime909 07-01-2016 01:38 PM


I feel like this video sums up Tesla Motors perfectly.

Tesla drivers take the douchebaggery of Prius drivers to a whole other level.

Teslas are hideous and overpriced. Most of the time, the drivers who buy them seem to have nothing but contempt for the other non-electric cars on the road.

We're all taught to keep our eyes on the road and to stay awake and alert. If we get tired while driving, we pull over and rest. This guy threw all of that out the window.

strat61caster 07-01-2016 01:52 PM

Having shared the road with hundreds of driverless cars at this point, I trust them more than the average driver.

Although Google's little pods are annoying as shit because they cap out at 25 mph and some of my primary routes home that they love to test on are 35 mph.

Stang70Fastback 07-01-2016 01:59 PM

Quote:

Originally Posted by NemesisPrime909 (Post 2694549)
We're all taught to keep our eyes on the road and to stay awake and alert. If we get tired while driving, we pull over and rest. This guy threw all of that out the window.

Ha! HA! HAHAHAHAHAHA!!!

Do you even drive on public roads? 90% of people are looking 10 feet in front of them, and have no clue what is going on beyond that distance, or around/behind them, and practically NOBODY pulls over when they are tired. Defensive driving is a lost art practiced only by a select few, and the situational awareness of the average driver is about on par with that of a blind, dead dog with a paper bag over its head.

Hate on Autopilot all you want (I'm not being a fanboy here) but quite frankly I would feel more comfortable driving alongside a car on Autopilot than a car driven by some clueless moron (which is most people, seemingly, these days.) At least I can be pretty sure that the car on Autopilot knows I'm next to it.

EDIT: And to add, I work in transit. You would be amazed at how cars will slam into our buses at speed. A giant bus. When asked what happened, it's not uncommon for drivers to say, "I just didn't see it." You just didn't see a 60-foot bus in front of you?! And yet people discussing this accident will inevitably take the position of, "A human-driven car would definitely have seen the truck."

As I've said before, I'm not necessarily saying Autopilot is ready for use by the general public. I'm just saying that self-driving cars don't need to be perfect. They just need to be less likely to crash than a human-driven car to be a net benefit for everyone on the road. We shouldn't expect perfection.
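
To put rough numbers on "less likely": Tesla's statement at the time claimed roughly one fatality per 130 million Autopilot miles against a US average of about one per 94 million miles. Treating those strictly as claims rather than verified rates, the break-even arithmetic is simple:

Code:

# Back-of-the-envelope comparison using the rates claimed in Tesla's
# 2016 statement (manufacturer claims, not independently verified).

human_miles_per_fatality = 94e6        # claimed US average
autopilot_miles_per_fatality = 130e6   # claimed Autopilot record at the time

miles = 1e9  # one billion miles, an arbitrary scale for comparison

print(f"human-driven: {miles / human_miles_per_fatality:.1f} expected fatalities")     # ~10.6
print(f"autopilot:    {miles / autopilot_miles_per_fatality:.1f} expected fatalities") # ~7.7
# If the claimed rates held, that is the whole "net benefit" argument:
# fewer expected deaths per mile driven, not zero deaths.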

ScoobsMcGee 07-01-2016 02:03 PM

Quote:

Originally Posted by mdm (Post 2694381)
I think it's going to be much harder than most people think. I suspect it's not about a literal blind spot, but about the ability of the computer to interpret complex visual scenes. Possibly the computer was still able to see the road ahead underneath the trailer and interpreted the scene as if there were no obstacle.


We are still far from understanding how the brain does it, and even further from being able to replicate it in a machine that operates on very different computational principles.

That's part of my point, though. Tesla rushed the product to a public beta while other companies are spending years to make sure it is fully developed, and now someone is dead. They trivialized the complexity.

strat61caster 07-01-2016 02:43 PM

Quote:

Originally Posted by ScoobsMcGee (Post 2694573)
That's part of my point, though. Tesla rushed the product to a public beta while other companies are spending years to make sure it is fully developed, and now someone is dead. They trivialized the complexity.

How did they rush? There was no preliminary date hyped, no crazy buildup. Even in '14, Musk was downplaying the capabilities, claiming just intelligent cruise control and automatic lane changes, two features already available from other manufacturers at the time.

If they thought it was unsafe, they would have sat on it longer with no ill effects. The numbers are still in their favor, with a better batting average than human-piloted cars according to the press release claims.

Someone was always going to die in a self-driving car, and many more will in the future, whether it's a sensor failure, an oversight in the system, or an act of God like a tree falling on top of someone. Sitting still out of fear of an inevitability is a death sentence in itself; if mankind didn't take risks, we'd still be in caves.

krayzie 07-01-2016 03:16 PM

[ame="https://www.youtube.com/watch?v=bZuQ6v-05GM"]Tesla Car 'Autopilot' Feature Linked to Driver Death - YouTube[/ame]

[ame="https://www.youtube.com/watch?v=9I5rraWJq6E"]Autopilot Saves Model S - YouTube[/ame]


Wow, WTF, it can actually drive itself up the highway ramp? I always thought this Autopilot thing was just for cruising.

I think the flaw is that the computer doesn't actually slow down to let other motorists merge. This is why the car narrowly missed that truck the first time around.

ScoobsMcGee 07-01-2016 03:30 PM

Quote:

Originally Posted by strat61caster (Post 2694618)
How did they rush? There was no preliminary date hyped, no crazy buildup. Even in '14, Musk was downplaying the capabilities, claiming just intelligent cruise control and automatic lane changes, two features already available from other manufacturers at the time.

If they thought it was unsafe, they would have sat on it longer with no ill effects. The numbers are still in their favor, with a better batting average than human-piloted cars according to the press release claims.

Someone was always going to die in a self-driving car, and many more will in the future, whether it's a sensor failure, an oversight in the system, or an act of God like a tree falling on top of someone. Sitting still out of fear of an inevitability is a death sentence in itself; if mankind didn't take risks, we'd still be in caves.

It was rushed in the sense that it was released as a software update on existing cars as a beta option with (what appears to be) inadequate testing to ensure the existing sensors could handle it. If they thought it was finished, there wouldn't be beta disclaimers that need to be agreed to before engaging it. Mercedes and BMW don't have that, as far as I can find. I don't suspect it was a deliberate oversight, and maybe they even felt as though they tested enough; but when cars started crashing into things with the Summon feature, and Autopilot uses the same sensors and similar programming at higher speed, the latter should have been disabled.

To be clear, I do not place blame only on Tesla. While some media might mislabel it as such, it isn't a fully autonomous car, and I am pretty sure Tesla doesn't say that it is. Additionally, the driver did agree to the beta disclaimers and should have been prepared to take over. Rigs do not accelerate quickly. This wasn't a car already at speed going through an intersection; it was a rig turning from a stop, and it had time to get far enough across that the trailer, not the cab, was what the car struck. I am curious to see more details on whether the car slowed at all, how long the rig was obstructing the road, and whether there were any actual failures of the system.

ETA: http://jalopnik.com/does-teslas-semi...rom-1782935594

Stang70Fastback 07-01-2016 03:44 PM

Quote:

Originally Posted by krayzie (Post 2694684)
I think the flaw is that the computer doesn't actually slow down to let other motorists merge. This is why the car narrowly missed that truck the first time around.

No, the car did nothing wrong in this case. The truck cut across two lanes at once.

mav1178 07-01-2016 03:46 PM

Quote:

Originally Posted by krayzie (Post 2694684)
I think the flaw is that the computer doesn't actually slow down to let other motorists merge. This is why the car narrowly missed that truck the first time around.

Merge? Or lane change like an asshole? Because all I see is a truck driver making a lane change without a signal.

-alex

krayzie 07-01-2016 03:56 PM

Quote:

Originally Posted by Stang70Fastback (Post 2694717)
No, the car did nothing wrong in this case. The truck cut across two lanes at once.

Yeah, I'm not saying the Autopilot was at fault, obviously, but a normal human driver would have slowed down significantly, either anticipating the cut-off or after the truck tried it. It doesn't seem to me that the Tesla wanted to do that; it's hard for me to gauge from that video.

Or maybe it did, but then I wonder whether it was the computer that slowed down or the driver (probably the computer).

Stang70Fastback 07-01-2016 04:09 PM

Quote:

Originally Posted by krayzie (Post 2694737)
Yeah, I'm not saying the Autopilot was at fault, obviously, but a normal human driver would have slowed down significantly, either anticipating the cut-off or after the truck tried it. It doesn't seem to me that the Tesla wanted to do that; it's hard for me to gauge from that video.

Or maybe it did, but then I wonder whether it was the computer that slowed down or the driver (probably the computer).

I would have driven the same as the Tesla. I would have been watching that truck, half-assuming he would move over on me. I would not have yielded, however, because I would want to be able to honk and make a huge deal if he tried cutting across two lanes and running me off the road, lol.

It should be pointed out that Google recently updated their cars to drive a bit more aggressively/assertively. It was found that they were being TOO cautious, and that alone was causing problems, because they were proactively slowing down when other drivers weren't expecting them to.

There is no reason you should slow down because someone two lanes away is moving into the lane next to you. You SHOULD be watching carefully to make sure they don't cross two lanes at once, as this truck did, and if they start to come into your lane, then you honk, or slow down if you want to be [overly] nice to them. Again, I don't mean to sound like a Tesla fanboy here, but that autopilot actually reacted quite brilliantly in this situation.

strat61caster 07-01-2016 04:27 PM

Quote:

Originally Posted by ScoobsMcGee (Post 2694703)
It was rushed in the sense that it was released as a software update on existing cars as a beta option with (what appears to be) inadequate testing to ensure the existing sensors could handle it.

http://s2.quickmeme.com/img/1e/1e483...d44ccd468e.jpg

ScoobsMcGee 07-01-2016 04:35 PM

Quote:

Originally Posted by strat61caster (Post 2694791)
.

http://i3.kym-cdn.com/entries/icons/...IDE_Poster.jpg

Dadhawk 07-01-2016 04:53 PM

Quote:

Originally Posted by Stang70Fastback (Post 2694567)
..90% of people are looking 10 feet in front of them, and have no clue what is going on beyond that distance, or around/behind them...

...At least I can be pretty sure that the car on Autopilot knows I'm next to it....

EDIT: And to add, I work in transit. You would be amazed at how cars will slam into our buses at speed. A giant bus. When asked what happened, it's not uncommon for drivers to say, "I just didn't see it." You just didn't see a 60-foot bus in front of you?! And yet people discussing this accident will inevitably take the position of, "A human-driven car would definitely have seen the truck."

Yet these comments exactly describe how the Tesla autopilot failed. It failed to see something even bigger than your bus in front of it because (I suspect) the programmers didn't take into account a big rig crossing in front of it in such a way that the road looked clear.

I'm not buying the whole "bright sky/white truck" description as the full explanation of this. I really think the sensors saw clear road ahead underneath the trailer, or that was at least a contributing factor.

If you look at the diagram of the accident, it's pretty clear there was no attempt by the autopilot to slow down before, and maybe even after, the accident.

I have no idea what the driver of the car was doing, but he was definitely not paying attention.

Now, to be fair, this exact accident could have happened with a regular driver using cruise control only and texting his girlfriend on the phone. The automation in either case is a contributing factor, but the root cause is the nut behind the wheel.

mdm 07-01-2016 04:57 PM

Quote:

Originally Posted by Stang70Fastback (Post 2694717)
No, the car did nothing wrong in this case. The truck cut across two lanes at once.


Nonsense. The truck obviously wasn't taking that left turn at high speed, yet it managed to get across two lanes and well into the side road before the impact, so that the car in the right lane hit somewhere in the middle of the trailer, and the car still didn't even attempt to brake or swerve.


The autopilot failed completely.


By the way, I am wondering: if the autopilot had noticed the truck, would it have aimed for the front or back of the trailer, where there is wheel/axle structure and the car's crumple zones can be engaged, instead of going for the middle, where driver decapitation is almost guaranteed?

Dadhawk 07-01-2016 05:06 PM

Quote:

Originally Posted by ScoobsMcGee (Post 2694703)
It was rushed in the sense that it was released as a software update on existing cars as a beta option with (what appears to be) inadequate testing to ensure the existing sensors could handle it. If they thought it was finished, there wouldn't be beta disclaimers that need to be agreed to before engaging it. Mercedes and BMW don't have that, as far as I can find...

I didn't realize the software was considered "beta". Just more proof that Tesla is getting unfair leeway on their car development.

Can you imagine what would happen if Ford or Chevy downloaded "beta" software to all their cars that eventually caused a finger to get caught in a door handle, never mind a death?

mdm 07-01-2016 05:32 PM

Quote:

Originally Posted by Dadhawk (Post 2694825)
I didn't realize the software was considered "beta". Just more proof that Tesla is getting unfair leeway on their car development.

Can you imagine what would happen if Ford or Chevy downloaded "beta" software to all their cars that eventually caused a finger to get caught in a door handle, never mind a death?


It's not just Tesla, it's also Toyota: http://www.safetyresearch.net/blog/a...%E2%80%9D-code


and probably everyone else; they were just lucky.

Now, I don't know whether Tesla's problem was due to dismal programming practices (as Toyota's probably was) or due to delivering Autopilot before reliable algorithms for obstacle detection were designed, as I suspect, but in the end it makes little difference.

Correction: it does make a difference, because good programming practices could have prevented Toyota's problems, while Tesla's may be currently unsolvable.

strat61caster 07-01-2016 06:36 PM

Quote:

Originally Posted by Dadhawk (Post 2694811)
(I suspect) the programmers didn't take into account a big rig crossing in front of it in such a way that the road looked clear.

I can't come up with an analogy that accurately describes how astronomically implausible that scenario is.

Hundreds of people spent hundreds of thousands of man-hours on this system, with countless meetings and brainstorming sessions coming up with every scenario. The disclaimers are there for a reason: the system has limitations. If those limitations had been deemed an inappropriate risk (combined with the clear disclaimers that the driver must still be engaged in operating the two-ton high-speed projectile we call a car), it never would have been put in the hands of users.

Get the right 10 people in a room with access to their data and they can tell you every detail of that system, from the solder used on wire J234 to the bit rate of processor V1 to the field of view of each sensor to the torque used on the fasteners. I'd bet my bottom dollar Tesla knew where the system fell short within a week of hearing about the accident and had a report on Musk's desk as they prepared to inform the NHTSA.

I've seen people figure out what's wrong with spacecraft more than 20,000 miles away with only basic telemetry. I'm not a fan of Musk, but I've met people who work for him; don't sell them short.

Toyarzee 07-01-2016 06:54 PM

Autopilot or not, you shouldn't be sleeping or watching Netflix while in the pilot seat. Period.

I do hope this push for automation doesn't further encourage careless behaviour...

Sonic Motor 07-01-2016 09:20 PM

Quote:

Originally Posted by Toyarzee (Post 2694950)
I do hope this push for automation doesn't further encourage careless behaviour...


With how the US is, I have a feeling this is the ONLY way it will go. Making things "easier" always makes people lazier, especially with how many people actually let themselves rely on technology.

Toyarzee 07-01-2016 09:23 PM

Quote:

Originally Posted by Sonic Motor (Post 2695055)
With how the US is, I have a feeling this is the ONLY way it will go. Making things "easier" always makes people lazier, especially with how many people actually let themselves rely on technology.

Makes me so sad. Manual transmissions are a dying breed today. Conscious and active drivers are the next to go... :cry:

