I'm certainly no scientist, but I can see the opportunity to declare my love for Jesus in the science of driverless cars.
When going down a highway, surprises happen. When they happen, choices must be made. Do you swerve to avoid a child chasing a ball and endanger yourself, or do you kill the child and keep yourself safe?
Animals run across roads, and similar choices must be made. All these choices will be made by your car's computer, not by you. Your car's computer must reflect your True Christian™ values.
The Bible teaches us that children are not as important as adults. Animals are to be used and abused as we see fit. We know Jesus intended for us to favor large cars over small cars and, especially, bicycles.
In general, then, driverless cars must be programmed so we True Christians™ who drive big cars have dominion over all lesser creatures and objects on the road. God's will must be carried out.
The Murky Ethics of Driverless Cars
A new study explores a moral dilemma facing the creators of self-driving vehicles: In an accident, whose lives should they prioritize?
By Tom Jacobs
So you’re driving down a dark road late at night when suddenly a child comes darting out onto the pavement. Instinctively, you swerve, putting your own safety in jeopardy to spare her life.
Very noble of you. But would you want your driverless vehicle to do the same?
That question, which can be found idling at the intersection of technology and ethics, is posed in the latest issue of Science. A variation on the famous trolley dilemma, it won’t be theoretical for long: Self-driving vehicles are coming soon, and they will need to be programmed to respond to emergencies.
A research team led by Iyad Rahwan of the Massachusetts Institute of Technology argues that this poses a huge challenge to the vehicles’ creators. In a series of studies, the team finds that people generally agree with the “utilitarian” argument — the notion that cars should be programmed to spare as many lives as possible.
However, when asked what they would personally purchase, they tended to prefer a vehicle that prioritized the safety of its riders. And a theoretical government regulation that would mandate a spare-the-greatest-number approach significantly dampens their enthusiasm for buying a driverless car.
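The tension the study describes can be made concrete with a toy sketch. The code below is purely illustrative — the function, the data fields, and the numbers are all hypothetical inventions for this example, not anything from the study or from real vehicle software — but it shows how the two programming policies the article contrasts would pick different maneuvers in the same emergency.

```python
# Illustrative sketch only: a toy comparison of the two policies the
# study contrasts. All names and numbers here are hypothetical; real
# autonomous-vehicle planners are vastly more complex.

def choose_action(actions, policy):
    """Pick the maneuver a car's software would take under a policy.

    actions: list of dicts with estimated harm to people inside
             ("riders_harmed") and outside ("others_harmed") the car.
    policy:  "utilitarian" minimizes total expected harm;
             "self-protective" minimizes harm to the car's own riders
             first, breaking ties by harm to others.
    """
    if policy == "utilitarian":
        key = lambda a: a["riders_harmed"] + a["others_harmed"]
    elif policy == "self-protective":
        key = lambda a: (a["riders_harmed"], a["others_harmed"])
    else:
        raise ValueError(f"unknown policy: {policy}")
    return min(actions, key=key)

# Two hypothetical maneuvers in the article's late-night scenario:
swerve = {"name": "swerve", "riders_harmed": 1, "others_harmed": 0}
stay   = {"name": "stay",   "riders_harmed": 0, "others_harmed": 2}

print(choose_action([swerve, stay], "utilitarian")["name"])      # prints "swerve"
print(choose_action([swerve, stay], "self-protective")["name"])  # prints "stay"
```

Under the utilitarian rule the car swerves (one person harmed rather than two); under the rider-protective rule it stays its course — which is exactly the gap between what survey respondents endorsed in the abstract and what they said they would buy.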
“Figuring out how to build ethical autonomous machines is one of the thorniest challenges in artificial intelligence,” the researchers write.