When Casualties are Inevitable, Who Should Self-Driving Cars Save?

  • Thread starter Eh Team
  • 145 comments
  • 6,578 views
The age of the self-driving car is slowly coming upon us, and with it comes a slightly different variety of crash. The machine itself must decide what to do, but in a situation where someone is going to die no matter what, who should the machine save, and who should it kill?

MIT currently has a survey running with 13 randomly generated scenarios, to gain insight into how people think a self-driving car should operate in this very specific situation. It has its flaws, but it does make you think about where you stand on certain types of people. You can take the survey here: http://moralmachine.mit.edu. After you have finished, you'll see a comparison of your results with everyone else who has answered.

I saw the survey via Facebook, and a friend had possibly the best overall reasoning for answering it:
"I killed all the passengers, for being lazy sobs."

Most Saved: Little Girls
Most Killed: Old Men

Scales: (Does not Matter) -1 to +1 (Matters a Lot)
Saving More People: +1/3
Protecting Passengers: 0
Upholding the Law: +4/10
Avoiding Intervention: -1/4
Gender Preference: +1 (Male <-> Female)
Species Preference: -1 (Humans <-> Pets)
Age Preference: -1 (Young <-> Old)
Fitness Preference: +1 (Fit <-> Large)
Social Value Preference: -1 (Doctor <-> Criminal)

In the event of inevitable death, who should self-driving cars save? Discuss.
 
I chose the deaths of the passengers each time it came up, as they were the ones choosing to go for a ride with no one driving the damn car.

However, in the event of passengers vs. a criminal, I'd sacrifice the criminal from an omniscient point of view. Unfortunately, since it's impossible for any computer I know of to tell the difference between a good and a bad person, into the barrier they go, I'd say.

Fixing the damn brake failures in advance is probably the most important of the issues, though.
 
...Interesting. I'd prefer AI-driven cars to be slow on most urban streets, so the inevitable crash would be as non-lethal as possible, and to have a routine built in to perform various emergency maneuvers to bring the car to a halt - but that's still a bit further away, I guess.

I went through the test, and I don't really understand the results - apparently I hate pets. Since I don't own one, it must be correct.

Also,

Saving more lives matters to me the most, as does protecting the passengers and upholding the law. I'm not far off from the norm when it comes to "Avoiding Intervention", whatever that means. As for gender, it says I discriminate, since I chose the ladies over the dudes. Most chose to save the younger, while I didn't, and it says I like fit and successful people over those who aren't.

...Gotta say, it makes me feel rather weird. Not in a good way, mind you.
 
Self-driving cars are one of the worst ideas since smartphones.

They'll slowly turn the human race into complete morons.

Edit: to answer the question, it should save the people who aren't riding in the self-driving cars.
 
Self-driving cars are one of the worst ideas since smartphones.

They'll slowly turn the human race into complete morons.
They have the potential to take lost commute time and turn it into productivity while also making transport safer, so I find it hard to agree.

The test seems a bit silly; my results really had nothing to do with my thoughts. I mostly sent the car away from people. If hitting people was unavoidable, I had the car go straight. Basically, I had no preference for anyone except humans over animals, but the results said otherwise due to statistics alone.
 
They have the potential to take lost commute time and turn it into productivity while also making transport safer, so I find it hard to agree.

Lost commute time? I don't even know what that means. Are you so important right now that you need to stare at your phone? Your drive to work or home must stink. Driving is a freedom and a privilege. Try walking, biking, or taking a bus.

Can I assume you stare at your phone while you drive, like maybe 75% of the people I see?

Those people need to get a life in my opinion.
 
I used a simple strategy: save humans over animals; otherwise, save pedestrians over passengers; otherwise, go straight. The logic behind that last one is that a major component has already failed, so who's to say another hasn't failed as well, unknown to the car's AI, potentially putting everybody in the scenario at risk; for instance, a rollover because a wheel or the steering fails too.

I didn't even look at the ages/genders/etc. of anybody; they did not enter into my decisions at all. I doubt the car could recognize these factors either, at least not for a long time yet.
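
For what it's worth, that priority order fits in a few lines. Here's a rough Python sketch, purely illustrative; the Path fields and the example numbers are mine, not anything from the survey or a real car:

from dataclasses import dataclass

@dataclass
class Path:
    humans_hit: int       # total people this path is predicted to hit
    pedestrians_hit: int  # how many of those are outside the car

def choose_path(straight: Path, swerve: Path) -> Path:
    # 1. Humans over animals: prefer the path that hits fewer humans.
    if straight.humans_hit != swerve.humans_hit:
        return straight if straight.humans_hit < swerve.humans_hit else swerve
    # 2. Pedestrians over passengers: prefer the path that hits fewer
    #    pedestrians, i.e. let the passengers take the hit.
    if straight.pedestrians_hit != swerve.pedestrians_hit:
        return straight if straight.pedestrians_hit < swerve.pedestrians_hit else swerve
    # 3. Otherwise stay on course: one major component has already failed,
    #    so don't assume the car can still execute a swerve safely.
    return straight

# Example: going straight hits two pedestrians, swerving kills the two passengers;
# the pedestrian rule kicks in and the car takes the swerve into the barrier.
print(choose_path(Path(humans_hit=2, pedestrians_hit=2),
                  Path(humans_hit=2, pedestrians_hit=0)))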
 
I mean, either way, relatives of the people that end up dying could probably sue either the company that made the self-driving vehicle, or the company that allowed the car to use the self-driving technology, so...
 
Self-driving cars are one of the worst ideas since smartphones.

They'll slowly turn the human race into complete morons.

Edit: to answer the question, it should save the people who aren't riding in the self-driving cars.

This.

If casualties are inevitable though... I would kill everyone in the self-driving car.

And basically, the point is this:

DRIVE YOUR DAMN CARS. IT'S WHAT THEY'RE MADE FOR!
 
This.

If casualties are inevitable though... I would kill everyone in the self-driving car.

And basically, the point is this:

DRIVE YOUR DAMN CARS. IT'S WHAT THEY'RE MADE FOR!
I disagree. Like subways, taxi services, and every other method of travel, cars exist to make getting from point A to point B easier. While I dislike the idea of driverless cars, cars are not one-dimensional. There's a natural evolution from the question "how can we make this easier, safer, more accessible?" Once again, I don't like the idea of autonomous cars, but I understand why they exist.
 
...DRIVE YOUR DAMN CARS. IT'S WHAT THEY'RE MADE FOR!
Unfortunately, not everyone is allowed to drive a car, often for medical reasons. Are you proposing that some people should only rely on a dwindling public transport system or never get the chance to go anywhere because they can't drive?
 
I used a simple strategy: save humans over animals; otherwise, save pedestrians over passengers; otherwise, go straight. The logic behind that last one is that a major component has already failed, so who's to say another hasn't failed as well, unknown to the car's AI, potentially putting everybody in the scenario at risk; for instance, a rollover because a wheel or the steering fails too.

I didn't even look at the ages/genders/etc. of anybody; they did not enter into my decisions at all. I doubt the car could recognize these factors either, at least not for a long time yet.
Very much on the same page. That said, despite killing a lot of people in cars with my answers, I find the "Coz self-drive cars is da stupid" attitude that I'm seeing a lot of in here... well... stupid.

Maybe you didn't get the same scenario... One that it spat out for me was a split pedestrian crossing, with separate lights for each side. The driving-straight option killed five people walking on a green light, while the swerve option killed five people walking on a red light. Would you still choose straight if it meant that the swerve option would kill the same number of people, but in a place where the car could reasonably expect there to be no one, and where the people were not following the road rules?
 
Did the test:

Most saved character: Female athlete, which is just a coincidence; I do not value the lives of women more than those of men.

Saving more lives: didn't matter much to me. When people cross the street while the light is red, it's their fault if they get run over. I'd rather save the lives of those who weren't idiots, even if this results in a higher total death toll.

Species preference: yes, I'd rather sacrifice animals than humans. I love animals dearly and have owned half a dozen pets, but I'm staying true to my species.

Age preference: totally in favor of the younger people. Young lives are more valuable than those of old people. That's just the way it is.

Fitness preference: completely in favor of the fitter people. Again, fit people are more valuable than those who don't care about their health.

Social value: doctors, athletes, etc. lead more valuable lives than criminals who cause only harm and destruction in our society.
 
Not surprised, but sad, at the choices some people make here.

http://moralmachine.mit.edu/results/-789990576

Ignore the demographics, except for hoomans versus pets. I do not choose based on the perceived value of the living or the dead. What matters is that the self-driving car is consistent in its application of its internal laws, and is predictable.

-

For those who say they'd sacrifice passengers every time, here's a (not so) hypothetical scenario.

We can make it so that trains will never have to hit another crossing car or pedestrian or random passenger who's fallen onto the tracks from the platform.

Simply, we equip the train with instant brakes... pistons that drive down into the ground and stop the train dead in a matter of feet.

Everyone standing inside will die. But the person who crossed the tracks at the wrong time lives.

Is this moral?

Your distaste for self-driving cars is rooted in the idea that their riders are abdicating responsibility, instead relying on a machine to dictate their safety. Which is laughable hypocrisy, because we all do that. Every day.

You rely on hydraulic assist to clamp the brake pads for you. You rely on ABS to cadence brake for you in slippery conditions.

You rely on computers to decide whether or not this is the proper thing to do.

You rely on computers to decide when it's safe to go or when you need to stop at an intersection.

You rely on other drivers not to hit you.

You rely on the computers or humans manning trains not to overspeed and kill you as they go around a corner, and not to come crashing through a station and into the stops at full speed.

As a pedestrian, you rely on the logical and rational behaviour of drivers not to hit you.

-

We all give up some responsibility on the road.

We all take risks.

What's most important is that the factors are predictable. These situations should never occur, because a self-driving car will not speed through an urban or suburban neighborhood, and it will slow down before pedestrian crossings whether or not there are pedestrians.

A self-driving car will be predictable.

The only correct choice in determining the actions of a self-driving car is to err on the side of predictability. As for those who choose to take needless risks, that's their problem. We cannot kill the passengers on a train due to the recklessness of people who cannot follow simple traffic rules. This is harsh, but fair.

Those who do follow the rules need the reassurance, at least, that if they're crossing on a green light, the self-driving car would destroy itself before killing them.

-

Also, I don't see the issue with self-driving cars. Or rather, while I can see a lot of technical issues, I accept that a machine can manage choke, timing advance, threshold braking, and constant speed better than me. I accept that a machine will never get tired, cranky or emotional. It will not overtake rashly after being cut off by another motorist. It will not brake check a tail-gater. It will not speed through a stop sign just because no one is watching. I know machines are not perfect yet, but they're getting better. I know that I'm not perfect. The question is, do you?
 
This.

If casualties are inevitable though... I would kill everyone in the self-driving car.

And basically, the point is this:

DRIVE YOUR DAMN CARS. IT'S WHAT THEY'RE MADE FOR!

No consideration for the literally hundreds of deaths human-controlled vehicles cause every day?

Maybe you didn't get the same scenario... One that it spat out for me was a split pedestrian crossing, with separate lights for each side. The driving-straight option killed five people walking on a green light, while the swerve option killed five people walking on a red light. Would you still choose straight if it meant that the swerve option would kill the same number of people, but in a place where the car could reasonably expect there to be no one, and where the people were not following the road rules?

I did not worry too much about the cases where hitting people was the only option, just because the cases presented tended to be very simple. That said, I like the idea of the car not doing anything when it can't find a solution to a situation. Perhaps what should be added to that is a visual/audible indication of failure to alert anyone nearby of what is happening. If this were done in a consistent manner across all makes and models, it could allow potential victims to easily recognize and avoid danger.
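
To illustrate what "consistent across all makes and models" could look like, here's a tiny, made-up Python sketch; the state and action names are invented, and the only point is that every manufacturer would map the same internal state to the same outward behaviour:

from enum import Enum, auto

class DriveState(Enum):
    NORMAL = auto()
    UNRESOLVABLE_HAZARD = auto()  # the car has detected a crash it cannot avoid

def standard_fallback(state: DriveState) -> dict:
    # Hypothetical industry-wide mapping from state to outward behaviour,
    # so bystanders learn to recognize a failing car at a glance.
    if state is DriveState.UNRESOLVABLE_HAZARD:
        return {
            "trajectory": "hold current path",  # no last-moment swerving
            "braking": "maximum available",
            "hazard_lights": "rapid flash",     # standard visual warning
            "horn": "continuous",               # standard audible warning
        }
    return {"trajectory": "normal", "braking": "as commanded",
            "hazard_lights": "off", "horn": "off"}

print(standard_fallback(DriveState.UNRESOLVABLE_HAZARD))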
 
Very much on the same page. That said, despite killing a lot of people in cars with my answers, I find the "Coz self-drive cars is da stupid" attitude that I'm seeing a lot of in here... well... stupid.

Maybe you didn't get the same scenario... One that it spat out for me was a split pedestrian crossing, with separate lights for each side. The driving-straight option killed five people walking on a green light, while the swerve option killed five people walking on a red light. Would you still choose straight if it meant that the swerve option would kill the same number of people, but in a place where the car could reasonably expect there to be no one, and where the people were not following the road rules?
I also don't subscribe to the "Coz self-drive cars is da stupid" mindset. My reasoning is that the car passengers are there of their own volition (but make one or more passengers kidnap victims and it gets really sticky).

I believe I saw the same scenario you mentioned. I still went with "go straight" regardless of the lights, primarily because of the reason I already gave; in fact, I didn't take note of the crossing lights at all.

I also don't believe the machine should become judge, jury, and executioner of those crossing against the light, but that's incidental to the question, IMO.
 
This is the Trolley problem. Here's the right answer.

To rephrase the question slightly: should the self-driving car continue on its path, killing lots of people; divert to kill fewer people; or divert to kill the passengers? These are the only scenarios. The car has no moral choice here; it's a box of metal, plastic, and bolts. The passengers are similarly amoral, since they don't control the car. The person who controls the car in this scenario is the person who programmed it in the first place. So what is the moral programming?

The car is on a trajectory headed for an intersection. This is not the programmer's choice; it's just the scenario the program is faced with. The programmer can insert logic that would have the car intentionally divert from its given trajectory onto a trajectory that it knows will kill people, or the programmer can choose not to do that. Choosing not to divert its course if the diversion would kill people is not a choice to kill the people in its path; it's a choice NOT to kill people through your own actions. The path it's on is an accident (a failure of the brakes).

The only scenario in which the programmer chooses not to kill people is the one where the car does not intentionally put itself on a course to kill people. As such, the only moral choice for the programmer is choice A in both cases. This is true regardless of whether you're at the wheel or programming the car.

In scenario 2, if you're the one driving (not in a self-driving car), then you can send your car off the cliff and take your own life (and only your life) to save the others.
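
As a minimal sketch of that programming rule, in illustrative Python (the Trajectory fields are my own invention, not anyone's actual code): the program only ever diverts onto a course it knows is harmless, and otherwise holds the course the failure left it on.

from dataclasses import dataclass

@dataclass
class Trajectory:
    is_current: bool       # the course the car is already on after the failure
    known_fatalities: int  # people the car predicts this course will kill

def choose(options: list[Trajectory]) -> Trajectory:
    current = next(t for t in options if t.is_current)
    # A diversion is only acceptable when it kills no one (e.g. an empty lane).
    harmless_diversions = [t for t in options
                           if not t.is_current and t.known_fatalities == 0]
    if harmless_diversions:
        return harmless_diversions[0]
    # Otherwise stay on course, however bad it is: those deaths trace back to
    # the mechanical failure, not to a decision to aim the car at anyone.
    return current

# Example: brakes fail with five people ahead; the only swerve kills one person.
# The rule above keeps the car on its existing course.
print(choose([Trajectory(is_current=True, known_fatalities=5),
              Trajectory(is_current=False, known_fatalities=1)]))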
 
@niky and @Exorcet have already made the point I largely agree with better than I ever could, but I'll have a go anyway.

I think if the car identifies a situation where it cannot avoid a crash, it should always go straight.

The (potential) intelligence of a self-driving car lies in being able to map out the situation before it, quicker and in more detail than humans likely could. This is very advantageous in situations where crashes are avoidable - the goal in each situation is common, and the car has the intelligence to achieve it - but when a crash is unavoidable, the goal changes depending on the moral rules, which the car won't make; those would be set by humans.

If it's necessary for the car to have those moral rules, then at some point someone will be deciding whose life is valued more than whose - based on age, gender, race, criminal or non-criminal, whatever. Who will decide which form of discrimination is the correct one?

The engineers designing the car? They could be free to do so, but as this thread has shown, people's morals vary considerably, so presumably different cars made by different companies would end up reacting differently to identical situations. This inconsistent behaviour wouldn't be helpful to potential victims, as @Exorcet alluded to, which isn't ideal.

So maybe there could be a regulated set of morals common to all self-driving cars? Perhaps, but the idea of government getting to decide a "hierarchy" of who should be saved or killed first is not an appealing prospect.

You could leave it for society as a whole to decide (somehow), but presumably most people would wish the car to never hit someone like them, in which case you get tyranny of the majority...

It seems to me like a huge minefield that is difficult to navigate without creating as many problems as you solve, or without granting powers to governments or groups that are open to significant abuse. Which leads me to think the best thing to do is to not have morals at all: ensure the car never discriminates between different people, and does something common and predictable no matter who is involved. Hence it should always go straight.

Yes, that would mean a car could kill a group of schoolchildren instead of swerving to kill the convicted sex offender, but our agreeing that this outcome is unjust doesn't help us settle the countless situations where the grey areas are far greater.
 
There is another aspect to this, which is which choice minimizes legal liability. The answer happens to be the same as the moral one (and for the same reason). If your program decides to kill someone, that person's family can sue for wrongful death. Here was someone walking down a sidewalk perfectly safely, and Ford's programming sent a 4,000 lb death machine at his head. Lawsuit. However, if the program, while trying to avoid all accidents, is willing to let an accident take place rather than cause a different one, then the blame falls on the mechanical failure of the vehicle, or the tire failure, or whatever caused the accident.

Interestingly, if there's a software bug that results in, say 100 deaths in a given year, Google may face a class action for wrongful death due to the software glitch.
 
I like how one of the analysed areas is "Gender Preferences" as though somehow one would prefer to kill one gender over another, or save one gender over another.

Gender should have zero influence in these scenarios, just as much as someone's eye colour or forearm hair length.
 
I like how one of the analysed areas is "Gender Preferences" as though somehow one would prefer to kill one gender over another, or save one gender over another.

Gender should have zero influence in these scenarios, just as much as someone's eye colour or forearm hair length.

If you're a strict utilitarian you might beg to differ. Men make more money, right (gender pay gap)? That means a larger tax base if you run over the women. OTOH, societies are generally more stable when men have an easier time finding a mate, so maybe it's better to aim at men after all. Statistically, black men are more likely to end up in prison, so that should put them high on the list of people to run over - again, if we're strictly utilitarian about this. Do the most good and all...

Old people are causing social security to go upside down, so it would help balance the budget if we could put them high on the list.
 
After reading the discussion, particularly the posts by @niky, @TRGTspecialist and @Danoff, I've come around to the "take no action" school of thought. Accordingly, I retook the quiz, selecting the "go straight" option in every case.

The results were interesting, and clearly demonstrate the pitfalls that can come from polling and (over)analyzing the results thereof.

I was rated on the importance of several things: Saving More Lives, Protecting Passengers, Upholding the Law, etc. (anyone who's taken this has seen them). The only one they got right was Avoiding Intervention, which they say matters a lot to me. Most of their results were middle-of-the-roadish (no pun intended), when in fact those factors didn't matter one whit to me. So they very much got the Wrong Answer from their analysis of my choices.

Keep that in mind next time you read an analysis of a poll's results.
 
OK, I've got a two-part scenario:

A self-driving car with five passengers is driving along, and its brakes fail just before it needs to begin braking to avoid crashing into the barrier ahead. The SDcar looks ahead to the other side of the road and sees that the "walk" light is on, but there is no one using the crosswalk.

Q1: should the SDcar veer over to the wrong side of the road?

Let's say that the SDcar decides to veer over to the other side of the road because there is no one in the crosswalk. Just after the SDcar veers, a kid chasing a ball begins running across the crosswalk.

Q2: should the SDcar take no additional action and run over the kid, or should the SDcar veer back to the original side of the road and crash into the barrier?
 