Self Driving Cars
Every so often, I’ve seen the “ethical dilemma” of self-driving cars come up for debate. Specifically, the scenario goes something like this:
“A self-driving car is approaching a crowd of children. It can veer off a cliff, killing the occupants but saving the children. What choice does it make? Who is responsible for the deaths?”
It’s a dilemma, to be sure, but it’s also completely absurd and effectively a non-issue, which is an angle no one seems to really look at. This specific scenario is absurd because, well, why is a crowd of children blocking a cliffside road to begin with? It can be toned down to be more realistic, of course: maybe it’s a blind corner, maybe the people are on an ordinary street, maybe it’s a crowd of adults instead of children. The children are only there to trigger that emotional “Think of the children!” response anyway. Maybe the alternative is to smash into a building at 60 mph after taking that blind corner into the crowd.
No wait, why was the car taking any corner where people might be at 60 mph? That’s highway speed. There’s a reason we have different speed limits: open, high-visibility roads like highways allow faster travel because we can see farther ahead and have more room to swerve into another lane or the shoulder instead of into buildings or random crowds of people.
Exceeding the speed limit like that is a human problem, not a robot problem.
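For a rough sense of why sight distance caps safe speed, here’s a quick back-of-the-envelope sketch. The braking formula is basic physics; the deceleration, reaction time, and sight-distance figures are illustrative guesses, not specs from any real vehicle.

```python
# Stopping distance = reaction distance + braking distance (v^2 / 2a).
# The 7 m/s^2 deceleration and 0.2 s robot reaction time are assumptions.

def stopping_distance_m(speed_mph: float, decel_ms2: float = 7.0,
                        reaction_s: float = 0.2) -> float:
    """Total distance covered before the car comes to rest, in meters."""
    speed_ms = speed_mph * 0.44704  # mph -> m/s
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * decel_ms2)

for mph in (25, 40, 60):
    print(f"{mph} mph -> about {stopping_distance_m(mph):.0f} m to stop")

# 25 mph needs ~11 m, 60 mph needs ~57 m. If a blind corner only shows
# you ~20 m of road, no reaction time is fast enough at 60 mph; the only
# safe move is to slow down until stopping distance fits the view.
```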
So, maybe the car is obeying the speed limit, maybe the brakes have suddenly, inexplicably, failed, and the car simply can’t stop…
No wait, that doesn’t work either. Brakes generally don’t just “fail.” A robot car will be loaded with sensors; it will know the instant the brakes show even a hint of a problem and will probably drive off to have itself serviced. At the very least it will alert the driver, and when the problem reaches a critical stage, it will simply refuse to start or operate until fixed. Should have taken it into the shop; that on-demand, last-minute service call will probably cost you three times as much while you’re late for work.
Looks like ignoring warning signs of trouble is also a human problem, not a robot problem.
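If you want to picture that escalation logic, here’s a minimal sketch. The sensor readings, thresholds, and states are all hypothetical; real cars surface faults through standardized diagnostics, but the warn-then-refuse idea is the same.

```python
# Hypothetical brake-health escalation: OK -> service soon -> refuse to run.
from enum import Enum

class BrakeStatus(Enum):
    OK = "ok"
    SERVICE_SOON = "service_soon"   # warn the owner, suggest a service stop
    CRITICAL = "critical"           # refuse to operate until repaired

def assess_brakes(pad_wear_pct: float, fluid_pressure_kpa: float) -> BrakeStatus:
    """Map raw sensor readings to an escalating maintenance status.
    All thresholds here are made up for illustration."""
    if pad_wear_pct > 95 or fluid_pressure_kpa < 400:
        return BrakeStatus.CRITICAL
    if pad_wear_pct > 80 or fluid_pressure_kpa < 600:
        return BrakeStatus.SERVICE_SOON
    return BrakeStatus.OK

status = assess_brakes(pad_wear_pct=83.0, fluid_pressure_kpa=750.0)
if status is BrakeStatus.CRITICAL:
    print("Vehicle disabled: brakes must be serviced before driving.")
elif status is BrakeStatus.SERVICE_SOON:
    print("Warning: brake wear detected, scheduling service is recommended.")
```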
So what if there simply isn’t time to react because it’s a “blind corner”? Maybe some idiot is hiding behind a mailbox or a tree, waiting to jump out in front of your self-driving car. Except this is still more of a human problem than a robot problem.
These self-driving robot cars are all going to talk to each other. Your car will know about every crowd of people in a twenty-mile radius because the other cars will be telling it things like “Yo dawg, Main Street’s closed, there’s a parade of nuns and children there,” and it will simply plan a different route.
They will even tell each other about that suicidal fool hiding behind the tree.
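Here’s a toy sketch of what that chatter could look like. The message format and road names are invented for illustration; real vehicle-to-vehicle work uses standards like DSRC and C-V2X, which are far more involved than this.

```python
# Toy vehicle-to-vehicle hazard sharing: cars publish hazard reports and
# each receiver drops the affected road segments from its route graph.
from dataclasses import dataclass

@dataclass
class HazardReport:
    segment_id: str     # road segment the hazard blocks (hypothetical IDs)
    kind: str           # e.g. "parade", "crowd", "person_near_road"
    expires_at: float   # unix time after which the report goes stale

class RoutePlanner:
    def __init__(self, graph: dict[str, list[str]]):
        self.graph = graph              # segment -> reachable segments
        self.blocked: set[str] = set()

    def on_hazard(self, report: HazardReport) -> None:
        """Called when a nearby car broadcasts a hazard."""
        self.blocked.add(report.segment_id)

    def neighbors(self, segment: str) -> list[str]:
        """Segments a route search may use, skipping reported hazards."""
        return [s for s in self.graph[segment] if s not in self.blocked]

planner = RoutePlanner({"elm_st": ["main_st", "oak_ave"], "main_st": [], "oak_ave": []})
planner.on_hazard(HazardReport("main_st", "parade", expires_at=1.7e9))
print(planner.neighbors("elm_st"))  # ['oak_ave'] -- Main Street is avoided
```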
Maybe your car is alone, in the dark, in a deserted area. First, it’s a robot; it doesn’t care about the darkness. Even without an infrared scanner telling it someone is hiding back there, it will still see the obstruction itself, and it can reason: “How fast could a dog or a person jump out from behind that thing? How wide should I swing around it? How slowly should I pass by it?”
It knows, because this is all it does.
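Here’s a deliberately crude model of that “how wide, how slow” trade-off. The rule (be able to stop in the time an emerging person needs to cross your clearance gap) and every number in it are illustrative assumptions, far simpler than a real planner’s occlusion handling.

```python
# Toy occlusion rule: someone emerging at speed v_p must cover the
# lateral clearance gap c before reaching our path, which gives us
# t = c / v_p seconds; stopping from speed v takes v / a seconds,
# so we require v <= a * c / v_p. All numbers are assumptions.

def max_passing_speed_ms(clearance_m: float, emerge_speed_ms: float,
                         decel_ms2: float = 7.0) -> float:
    """Fastest speed that still lets the car stop before an emerging
    person could cross the clearance gap into its path."""
    return decel_ms2 * clearance_m / emerge_speed_ms

for clearance in (1.0, 2.0, 3.5):
    v = max_passing_speed_ms(clearance, emerge_speed_ms=3.0)  # jogging pace
    print(f"clearance {clearance:.1f} m -> pass at up to {v * 2.237:.0f} mph")

# More clearance buys more time, which buys more speed: exactly the
# "how wide should I swing, how slow should I pass" question above.
```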
Speaking of dogs, or possums, or deer, they also become a non-issue. The car will be able to see everything around it, in the dark, because it can “see” better than any human, and it constantly watches in a full 360-degree view. The self-driving robot car will never get distracted rubbernecking at an accident, it will never be distracted by that “hot chick” walking along the side of the street, and it will never road-rage because some other robot car cut it off (which won’t happen anyway).
It just drives.
And it will do it exceptionally well.
And even if our crazy scenario comes true, even if a self-driving car has a freak accident and kills a bus full of children every year, or even every month, it will still kill far fewer people than human drivers do. For scale, human drivers kill roughly 35,000 to 40,000 people per year in the US alone; a bus of thirty children every month would be under 400.
So feel free to waste time debating who deserves to die, the driver or the crowd, or who is responsible. You may as well ask who will be responsible for cleaning up all the poop cars make once they replace the horse and buggy.
Josh Miller aka “Ramen Junkie”. I write about my various hobbies here. Mostly coding, photography, and music. Sometimes I just write about life in general. I also post sometimes about toy collecting and video games at Lameazoid.com.