The car did what it was programmed to do; unfortunately, that was not what was best in the moment. I think some kind of human override is needed for this type of situation.
But this feels more like a general car problem than anything. Car infrastructure is typically not pedestrian-friendly :(
No, the car did the right thing.
Removing the pressure from her leg could have meant her bleeding to death before paramedics arrived. As horrifying as it is to have a car parked on your leg, she was stable and as safe as she could possibly be. Taking the car off her leg wouldn't have reduced the pain; chances are it would have gotten a lot worse. I'd bet the emergency services didn't lift the car off her leg the moment they arrived either; they would have done a bunch of prep work first (not least giving her drugs for the pain).
When there’s a serious accident, you stop what you’re doing and wait for help. Only act if you’re trained or if it’s very clear that something needs to be done right now (e.g. if a car is on fire and someone is inside it).
If the weight of the car had stopped her from breathing, it would have been a very different thing.
You are adapting your arguments to the situation.
It should be clear that no self-driving car will ever know what “the right thing” is in cases like this, and that resolving them would require human interaction/intervention.* This is simply because the car would be unable to gather the necessary information about the situation.
That should not deter us from adopting self-driving, as self-driving vehicles will be the biggest boon to pedestrian safety seen since the advent of urbanization.
* One could obviously imagine a future where other vehicles contribute information about the situation, so that the vehicle in question can act and react based on what happens around it, drawing on perspectives other than its own. Robots or drones could potentially also contribute information or actively help at the scene.
If the vehicle were intelligent enough to converse with bystanders, or even with the person in question, or at least to use voice input to gather information for its decision-making, this could also be different. But the vehicle itself will always struggle with the lack of information about what is actually going on in a situation like this.
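To make that footnote concrete, here is a minimal sketch of the kind of decision fusion being imagined. It is purely hypothetical; every name in it is made up and it does not reflect how any real self-driving stack works.

```python
# Hypothetical illustration only: pooling outside reports (other vehicles,
# drones, bystander voice input) with the car's own sensing before deciding
# whether moving is safe. Not based on any real self-driving system.
from dataclasses import dataclass

@dataclass
class Report:
    source: str           # e.g. "own_sensors", "nearby_vehicle", "bystander_voice"
    person_trapped: bool  # does this source think someone is pinned under the car?
    confidence: float     # 0.0 .. 1.0

def should_stay_put(reports: list[Report]) -> bool:
    """Hold position unless every credible source agrees nobody is trapped."""
    credible = [r for r in reports if r.confidence >= 0.5]
    if not credible:
        return True  # no usable information: default to not moving
    return any(r.person_trapped for r in credible)

# Example inputs: the car senses an obstruction and a bystander confirms it.
reports = [
    Report("own_sensors", person_trapped=True, confidence=0.7),
    Report("bystander_voice", person_trapped=True, confidence=0.6),
]
print(should_stay_put(reports))  # True -> keep the hazards on and wait for responders
```

The point of the toy rule is only that outside perspectives would let the car default to the safe choice (not moving) with more justification than its own sensors can provide.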
If the weight of the car had stopped her from breathing, she'd be dead.
The facts of the matter are:
- The car is programmed to stop and turn on its hazard lights when it detects an obstruction underneath it (a toy sketch of that policy follows this list).
- That is good policy, overall, for when a person is trapped under a vehicle.
- As exemplified by this situation, where moving off her leg could have endangered her life.
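For what it's worth, here is a minimal sketch of the policy the first point describes, assuming the behaviour reported; this is obviously not the vendor's actual code.

```python
# Toy model of the stated policy: an obstruction detected underneath means
# "stop and signal", never "drive clear". Purely illustrative.
class Car:
    def __init__(self):
        self.moving = True
        self.hazards_on = False

    def on_undercarriage_obstruction(self):
        # Hold still: driving off could make an injury far worse (the bleeding
        # point made above), and responders can decide when to lift the car.
        self.moving = False
        self.hazards_on = True  # warn others that the car will not move

car = Car()
car.on_undercarriage_obstruction()
print(car.moving, car.hazards_on)  # False True
```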
There was an attempt here to extrapolate a larger narrative from the smaller one of a car endangering someone's life.
However, as has been described already, this car did not endanger anyone's life any more than a human-driven car would have. In fact, given the scenario of a pedestrian literally flying into its path, it behaved optimally for that scenario, something a human driver may not have done.
I like how you just keep on talking about what we all agree on.
Would you like to imagine how you would argue if the first sentence you wrote were true?
That's when the interesting scenarios start showing up, including how people are ready to grab the pitchforks when an automated system kills someone, but when humans do it ten times as often it's treated as perfectly fine.
NO, a human driving a car and hitting another person is NOT perfectly fine.
People just didn't make a big fuss about them because our society already has the institutions to deal with this kind of situation.
The driver would be punished through the legal system. If it is deliberate murder, they would go to court and be tried.
Local media will also make sure the culprit is punished socially: everyone in town will know who hit our neighbor.
On the other hand, responsibility for driverless vehicles is not yet well defined. Is the engineer responsible? Is the programmer responsible? Is the CEO of the manufacturer responsible?
This is why these incidents receive so much outcry.
Just needed to be sure, but thanks for confirming.