Tesla knew Autopilot caused death, but didn’t fix it
Software’s alleged inability to handle cross traffic central to court battle after two road deaths
I remember reading about the ethical question posed by the hypothetical self-driving car that loses control and can choose to either turn left and kill a child, turn right and kill a crowd of old people, or do nothing and hit a wall, killing the driver. It’s a question with no right answer, but anybody implementing a self-driving car has to answer it.
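However it gets answered, the answer ultimately has to be encoded as something like a cost function the planner minimizes. A toy sketch of that framing (all maneuvers and harm weights here are invented purely for illustration; no real system publishes such numbers):

```python
# Hypothetical illustration of the trolley-style choice an autonomous
# vehicle planner would have to encode. The harm weights are invented.

def choose_maneuver(options):
    """Pick the option with the lowest estimated harm score."""
    return min(options, key=lambda o: o["estimated_harm"])

options = [
    {"maneuver": "swerve_left",    "estimated_harm": 1.0},  # hits one pedestrian
    {"maneuver": "swerve_right",   "estimated_harm": 5.0},  # hits a crowd
    {"maneuver": "brake_straight", "estimated_harm": 1.0},  # hits wall, risks driver
]

print(choose_maneuver(options)["maneuver"])  # → swerve_left (ties broken by list order)
```

The ethically loaded part is not the `min()` call but choosing the weights: whoever writes that table is answering the question, whether they admit it or not.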
I non-sarcastically feel like Tesla would implement this system by seeing which option kills the fewest paying Xitter subscribers.
At the very least, they would prioritize the driver, because a driver who survives is likely to buy another Tesla in the future.
Meanwhile, hundreds of people are killed in auto accidents every single day in the US. Even if a self-driving car is 1000x safer than a human driver, there will still be accidents as long as humans are also sharing the same road.
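The back-of-envelope arithmetic behind that claim (the ~110 deaths/day figure is an approximation of the roughly 40,000 annual US road deaths):

```python
# Rough arithmetic for the "1000x safer still isn't zero" point.
# Figures are approximations, not official statistics.

deaths_per_day = 110    # approximate current US average
safety_factor = 1000    # hypothetical "1000x safer" fleet

remaining_per_day = deaths_per_day / safety_factor
print(remaining_per_day)               # 0.11
print(round(remaining_per_day * 365))  # 40 -- still dozens of deaths per year
```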
When a human is found to be at fault, you can punish them.
With automated driving, whom do you punish? The company? Great. They pay a small fine and keep making millions while your loved one is gone and you get no justice.
People generally aren’t punished for an accident unless they did it intentionally or negligently. The better and more prevalent these systems get, the fewer the families with lost loved ones. Are you really arguing that this is a bad thing because it isn’t absolutely perfect and you can’t take vengeance on it?
Generally, people are punished for causing an accident, intentional or not: their insurer will either raise their rates or drop them, leaving them unable to drive. That is a form of punishment you don’t get with automated driving.
Increased rates aren’t a punishment; they’re a risk calculation. And insurance (outside of maybe property insurance, in case a tree falls on the car, for example) may not even be needed someday if everything is handled automatically without driver input. Why are you so stuck on the punishment aspect when these systems are already preventing needless deaths?
Punish and justice are synonymous… edit: WOW, bad typo. Should have read “punish and justice are NOT synonymous.”
I think the whole premise is flawed, because the car would have had to suffer numerous failures before ever reaching a point where it needs to make this decision. The dilemma applies to humans because we have free will; a computer does not.