Why driverless cars may have to be programmed to kill

Imagine you are deciding whether to buy a self-driving car, knowing that the car is programmed to sacrifice your life if doing so would save ten others.

While that would represent the lesser of two evils for society as a whole (the utilitarian approach), those from the deontological school of thought would argue that the car should not be programmed to "actively" kill its occupant, and should instead let events run their course and plough into the larger group. The sketch below makes the contrast concrete.
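To illustrate, here is a minimal, purely hypothetical sketch of how the two rules diverge when reduced to code. The function names, manoeuvre labels, and casualty counts are illustrative assumptions for this post, not anything a real vehicle's software exposes.

```python
# Hypothetical sketch of the two decision rules discussed above; the names
# and numbers are illustrative only, not a real autonomous-driving API.

def utilitarian_choice(occupant_deaths: int, pedestrian_deaths: int) -> str:
    """Pick whichever manoeuvre minimises total expected deaths."""
    if occupant_deaths < pedestrian_deaths:
        return "swerve"          # sacrifice the occupant to save the larger group
    return "stay_on_course"

def deontological_choice(occupant_deaths: int, pedestrian_deaths: int) -> str:
    """Never actively redirect harm onto the occupant, whatever the totals."""
    return "stay_on_course"

# With the article's numbers (1 occupant vs 10 pedestrians) the rules disagree:
print(utilitarian_choice(1, 10))    # -> "swerve"
print(deontological_choice(1, 10))  # -> "stay_on_course"
```

The point of the sketch is only that the disagreement is not about engineering difficulty: both rules are trivial to program, and the hard question is which one society is willing to buy.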

Sourced through Scoop.it from: i100.independent.co.uk

See on Scoop.it – levin's linkblog: Modern Society Channel

