Tesla driver dies in first fatal crash while using autopilot mode

  • Thread starter polysmut
  • 89 comments
  • 7,259 views

polysmut

Member
Premium
4,722
United Kingdom
Article
The first known death caused by a self-driving car was disclosed by Tesla Motors on Thursday evening, a development that is sure to cause consumers to second-guess the trust they put in the booming autonomous vehicle industry.

The 7 May accident occurred in Williston, Florida, after the driver, Joshua Brown, 40, of Ohio put his Model S into Tesla’s autopilot mode, which is able to control the car during highway driving.

https://www.theguardian.com/technology/2016/jun/30/tesla-autopilot-death-self-driving-car-elon-musk
 
I also read that in another instance he wasn't looking at the road and the Autopilot avoided a sideswipe from a work truck.

It is amazing, however unfortunate, how these things happen.
 
I also read that in another instance he wasn't looking at the road and the Autopilot avoided a sideswipe from a work truck.

It is amazing, however unfortunate, how these things happen.
Tesla have pointed out that more miles were driven on Autopilot before this first fatal accident than are driven, on average, per fatal accident on the roads generally. They also state that the driver should maintain concentration at all times.

Sad news though, however one looks at it.
 
Jalopnik (I know, I know...) reported that the tractor-trailer driver heard a Harry Potter movie playing in the background after the crash.
 
A system still in development and he thought he could be inattentive? It's a tragedy it happened but he should have known better.

It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.

We do this to ensure that every time the feature is used, it is used as safely as possible. As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert.
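To put the hands-on escalation they describe into rough code terms, here's a purely illustrative sketch – all the names, timings and the car interface are made up, not Tesla's actual implementation:

```python
# Purely illustrative sketch of a hands-on-wheel escalation loop.
# Names, timings and the `car` interface are invented; this is not Tesla's code.
import time

HANDS_OFF_WARNING_S = 15   # assumed: seconds of no wheel torque before a visual alert
HANDS_OFF_CHIME_S = 30     # assumed: seconds before audible alerts and slowing begin
DECEL_STEP_MPH = 1.0       # assumed: speed reduction per check while hands stay off

def monitor_hands_on(car):
    """Warn the driver when no hands-on torque is detected, then gradually
    slow the car until hands-on is detected again."""
    hands_off_since = None
    while car.autopilot_engaged():
        if car.steering_torque_detected():
            hands_off_since = None                       # driver is holding the wheel
        else:
            if hands_off_since is None:
                hands_off_since = time.monotonic()
            elapsed = time.monotonic() - hands_off_since
            if elapsed >= HANDS_OFF_CHIME_S:
                car.audible_alert()
                car.reduce_target_speed(DECEL_STEP_MPH)  # keep slowing until hands return
            elif elapsed >= HANDS_OFF_WARNING_S:
                car.visual_alert("Hold steering wheel")
        time.sleep(1.0)                                  # check roughly once a second
```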

and a video from the driver on a different drive:



I think Darwin did us a favor. Luckily no one else was hurt.
 
A system still in development and he thought he could be inattentive? It's a tragedy it happened but he should have known better.

This. It remains to be seen, of course, whether the accident would have been avoidable even with a fully attentive human, or whether the driver really was being inattentive.

I think Darwin did us a favor.

How so? Is that true of every vehicle accident, and wouldn't you say that the truck driver was the one initially at fault?
 
This. It remains to be seen, of course, whether the accident would have been avoidable even with a fully attentive human, or whether the driver really was being inattentive.



How so? Is that true of every vehicle accident, and wouldn't you say that the truck driver was the one initially at fault?


I wasn't referring to the video I posted; however, I think it's safe to say the driver was being fully inattentive, considering the car impacted the trailer at full speed, which shows there was no attempt by the driver to slow down.

What I meant by "favor" was that in this circumstance the driver was realistically the only person who was likely to be harmed.
 
I think it's safe to say the driver was being fully inattentive, considering the car impacted the trailer at full speed, which shows there was no attempt by the driver to slow down.

Both vehicles were at highway speed, the truck came across the car's lane as the car was alongside... no? What was particularly unfortunate was that in this case the car went under the trailer. The way you describe the accident makes it sound as though the truck was stationary across the road and the car simply barreled along into it.
 
Both vehicles were at highway speed, the truck came across the car's lane as the car was alongside... no? What was particularly unfortunate was that in this case the car went under the trailer. The way you describe the accident makes it sound as though the truck was stationary across the road and the car simply barreled along into it.

The truck was crossing the highway, most likely entering or exiting some type of business park in a rural area. So the truck was either stationary or moving very slowly in which case anybody paying even a little bit of attention to their surroundings should have noticed.
 
The truck was crossing the highway, most likely entering or exiting some type of business park in a rural area. So the truck was either stationary or moving very slowly in which case anybody paying even a little bit of attention to their surroundings should have noticed.

Fair enough, I'd misunderstood the truck's position.
 
Quite a bit about this in the UK press.

Allegedly, the truck turned across the lane the Tesla was in (at 90° to the Tesla) and the Autopilot couldn't distinguish the light coloured trailer against a light coloured sky, so made no effort to avoid it. Also allegedly, the driver was watching Harry Potter at the time (FFS!!!).
 
Quite a bit about this in the UK press.

Allegedly, the truck turned across the lane the Tesla was in (at 90° to the Tesla) and the Autopilot couldn't distinguish the light coloured trailer against a light coloured sky, so made no effort to avoid it. Also allegedly, the driver was watching Harry Potter at the time (FFS!!!).

Reminds me of the various old legends of people setting cruise control on and then going into the back of their motorhome to make a cup of tea/take a nap/etc. Those were never proven and likely just tales though, obviously. If this guy really was watching a movie, well, what can you say.
 
I think Darwin did us a favor. Luckily no one else was hurt.

Maybe not a favor, but Darwin material. He was operating the car irresponsibly.

According to his obituary, the accident happened on May 7, 2016, and he didn't have a wife or kids.
 
The intermixing of human pilots and autopiloted vehicles is massively complex, much more difficult than if everyone was on autopilot. It won't be long until any incident of this type will result in the human driver being found at fault. 'The Future' will not be fun for people who like to do things themselves and don't exist to be entertained.

Aside from safety, these autopilot features are always touted for giving people more time to be productive. Yeah, right.
 
Clearly he should have been more attentive, but I have to wonder if "autopilot" isn't a terrible name for something not designed to remove all human control.

Maybe it's technically applicable, but it does seem to be the kind of name that encourages this kind of distracted, inattentive driving.
 
Clearly he should have been more attentive, but I have to wonder if "autopilot" isn't a terrible name for something not designed to remove all human control.
I'd have thought quite the opposite and I expect Autopilot is a very deliberate name - in aircraft, an autopilot system doesn't absolve the pilot and co-pilot of their control, it just allows them to devote less time physically controlling the aircraft and more time monitoring other important aspects of flight.

I'd say it's more an education issue - a lot of people think planes fly themselves and presumably a lot of people think Teslas are fully autonomous - but it can't be said that Tesla hasn't done plenty to ensure its drivers are aware of this: there's even a warning on the screen to that effect when you select the Autopilot mode.

Unfortunately, this incident was only a matter of time. There have already been plenty of videos on Youtube of people abusing the system. I'd not be surprised if most of Tesla's statement was pre-written months ago, in the knowledge that someone would eventually come a cropper.
 
I'd have thought quite the opposite and I expect Autopilot is a very deliberate name - in aircraft, an autopilot system doesn't absolve the pilot and co-pilot of their control, it just allows them to devote less time physically controlling the aircraft and more time monitoring other important aspects of flight.

I'd say it's more an education issue - a lot of people think planes fly themselves and presumably a lot of people think Teslas are fully autonomous - but it can't be said that Tesla hasn't done plenty to ensure its drivers are aware of this: there's even a warning on the screen to that effect when you select the Autopilot mode.

Unfortunately, this incident was only a matter of time. There have already been plenty of videos on Youtube of people abusing the system. I'd not be surprised if most of Tesla's statement was pre-written months ago, in the knowledge that someone would eventually come a cropper.
That's sort of what I meant. Though autopilot is a perfectly apt name, the public perception of it as autonomous and independent of the driver means it may not have been the best choice.

Something that implies an aid or assist rather than all out control probably would help people understand it better.
 
I think part of the problem is that eventually the system will be fully autonomous. Tesla's stuck between a rock and a hard place - it has developed a system with autonomy and can only properly develop it with the use of its customers (since the system works via machine learning), but on the other hand some of those customers can't be trusted not to treat the system as a fully working setup.
 
I'm confused as to how this happened if Tesla use radar as well as cameras; radar is not going to confuse a white trailer with the sky... only camera-based tracking will, so why did it fail to pick out the object?

I do worry that these systems encourage laziness and eventually will probably lead to the general degradation in driving skills of the public. I'm also concerned about the recent 'choice' issues where people were questioning whether you would be willing to put your life in the hands of an autopilot which might decide to kill you to avoid a bigger incident.
 
I'm confused as to how this happened if Tesla use radar as well as cameras; radar is not going to confuse a white trailer with the sky... only camera-based tracking will, so why did it fail to pick out the object?

I do worry that these systems encourage laziness and eventually will probably lead to the general degradation in driving skills of the public. I'm also concerned about the recent 'choice' issues where people were questioning whether you would be willing to put your life in the hands of an autopilot which might decide to kill you to avoid a bigger incident.
Mobileye, the company that helps develop the autopilot systems for Tesla and other automakers, said the auto-braking functions are not designed for the scenario that played out in this wreck, so even if the camera and radar did happen to pick up the trailer in its path, I don't know if the system would have recognized what was going on and reacted properly. Cross-traffic incidents are such a small percentage that I wouldn't think much could be done about them either, without side-by-side traffic affecting the system's ability to maintain itself. Still, this tech is in early usage, so as the years go by and more implementation and development arise from it, there should be advances in the tech.
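As a rough illustration of that cross-traffic point – and purely hypothetical, not Mobileye's or Tesla's actual logic – a conservative braking rule that only fires when camera and radar agree on a vehicle ahead could easily ignore a high, white trailer crossing the road:

```python
# Hypothetical sketch of a conservative camera+radar auto-braking check.
# All names and thresholds are invented; this is not Mobileye's or Tesla's logic.

def should_auto_brake(radar_return, camera_detection):
    """Brake only when both sensors agree on a vehicle in the car's path.
    Radar returns sitting high above the road (overhead signs, or the underside
    of a crossing trailer) are filtered to avoid false braking, and a white
    trailer against a bright sky may never be picked out by the camera, so a
    crossing tractor-trailer can fall through both checks."""
    radar_confirms = (
        radar_return is not None
        and radar_return.height_above_road_m < 1.0        # invented cutoff for "overhead" objects
    )
    camera_confirms = (
        camera_detection is not None
        and camera_detection.label in ("car_rear", "truck_rear")  # rear-end scenarios only
    )
    return radar_confirms and camera_confirms
```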

But for now what needs to be done is to stop calling these systems "autopilot" and call them what they really are: semi-autopilot.
 
I'm also concerned about the recent 'choice' issues where people were questioning whether you would be willing to put your life in the hands of an autopilot which might decide to kill you to avoid a bigger incident.
Theoretically, it won't even have to make that decision, as it won't have got into a scenario in which such a thing is likely in the first place: in the hypothetical kid-stepping-into-road and truck-coming-the-other-way thing, for instance, the car will already have spotted both the kid and the truck and stopped rather than having to avoid one for the other.

The other thing about those decisions is that, to my knowledge, these cars aren't currently making moral decisions, only logical ones. They're not programmed to save your life, they're programmed to avoid an accident, so it's not making the same choices a human would. It's not "deciding" to do anything, just reacting to situations.

As for laziness, it's possible. But I think the theory here is that eventually these things will be sophisticated enough that they won't even need manual controls and those who aren't interested in driving won't have to do anything that driving traditionally requires.
 
The waves this is making on the interwebz are making me sick.

Today I was checking the newsfeed of Jalopnik and found a cross-post from Gizmodo, their technology blog, in which an editor, the urbanism editor no less, says she "insists on banning humans from driving cars". The expected onslaught of comments calling her an idiot arrived, mine included, most of them making perfectly valid points... but, mark my words, she'll play the "it's because I'm a woman, isn't it?" card that Gawker loves to play :rolleyes:.

All in all, yet another inept driver off the roads. He has nobody to blame but himself. The available technology is far from sufficient to replace an actual human and, even when it gets to that point, it'd still be prone to flaws since it was made by flawed humans.
 
Even the best autopilot systems will still need human intervention at some point. Airplanes don't even have the ability to take off and land on their own using autopilot. I highly doubt they ever will.

Hell, even if every car on the road was autonomous, you still have to factor in system failures being a possibility. Driverless cars will probably never be 100% driverless.
 
Even the best autopilot systems will still need human intervention at some point. Airplanes don't even have the ability to take off and land on their own using autopilot. I highly doubt they ever will.

Hell, even if every car on the road was autonomous, you still have to factor in system failures being a possibility. Driverless cars will probably never be 100% driverless.

Simply put: no human creation will ever be humanless... unless we are willing to disappear for its sake, which would be pretty :censored:ing stupid from an evolutionary point of view.
 
Slightly off topic but I must correct a few of you in regards to aircraft autopilot systems...
Many modern commercial aircraft have autoland systems, along with auto spoilers and brakes. This is used for Cat III landings, which allow an aircraft to land in near zero visibility conditions. Of course the flight crew must monitor the system to ensure correct and safe operation of such systems. Taxi and takeoff are still manual though.
 
The only way humans would be completely out of the picture in operating the cars they are in would be on rails or maglev.


People blame the parking sensors and reversing cameras when they hit objects while parking, even when those systems are operating normally.

People are texting while driving and still driving drunk. Some people just shouldn't have a licence, and some who don't have a licence yet still operate motor vehicles. This, sadly, becomes a new statistic.

Flying cars? Imagine when that happens.
 
Unfortunate that sometimes there is a price to pay for progress. Hopefully Tesla can learn from this and make changes to avoid more incidents with tall vehicles.
 
