Automotive News & Other Vehicle Discussions: a forum for discussions of vehicles other than Hyundais, press releases, new model launches, and general automotive news.

Step in front of a car, get mowed down

Old 06-26-2016, 05:39 PM
  #1  
Member
Thread Starter
 
 
Join Date: Apr 2002
Location: Arizona
Posts: 58
Likes: 0
Received 0 Likes on 0 Posts
Vehicle: 2003 Accent
Step in front of a car, get mowed down

The Self-Driving Dilemma: Should Your Car Kill You To Save Others?

Scientists investigate a tricky moral dilemma that machines will have to grapple with when cars drive themselves.

In a split-second, the car has to make a choice with moral—and mortal—consequences. Three pedestrians have just blindly stumbled into the crosswalk ahead. With no time to slow down, your autonomous car will either hit the pedestrians or swerve off the road, probably crashing and endangering your life. Who should be saved?



A team of three psychologists and computer scientists, led by Jean-François Bonnefon at the University of Toulouse Capitole in France, just completed an extensive study on this ethical quandary. They ran half a dozen online surveys posing various forms of this question to U.S. residents, and found an ever-present dilemma in people's responses. "Most people want to live in a world in which everybody owns driverless cars that minimize casualties," says Iyad Rahwan, a computer scientist with the team at MIT, "but they want their own car to protect them at all costs."



This isn't just a trivial riddle or a new take on the trolley problem thought exercise. Now that computers are driving large metal machines that can kill, they'll have to be programmed to make these kinds of decisions. "It's a rather contrived and abstract scenario, but we realize that those are the sorts of decisions that autonomous vehicles are going to have to be programmed to make," says Azim Shariff, a psychological researcher with the team at the University of Oregon.



"It's also a big challenge to the widescale adoption of autonomous vehicles, especially when there's already a basic fear about entrusting a computer program to zip us around at 60 miles an hour or more," he says. "So we conducted a series of online experiments to gauge how people were thinking about these ethical scenarios and how comfortable they would be to buy autonomous vehicles that were programmed in various ways." The survey results are outlined today in the journal Science.



The Self-Driving Dilemma



The scientists used the Amazon Mechanical Turk platform to conduct their surveys between June and November 2015, paying respondents 25 cents per survey. Only U.S. residents were polled.



Whether the choice was between their own car fatally crashing to save two, three, or ten pedestrians, "what we found was that the large majority of people strongly feel that the car should sacrifice its passenger for the greater good," says Bonnefon. "Even when people imagined themselves in the car, they still say that the car should sacrifice them for the greater good. And even when people imagine being in a car with a family member or even with their own child, they still said the car should kill them for the greater good."



The numbers here were stark. In one survey, where the choice was between saving the car's passenger or saving a crowd of ten pedestrians, more than 75 percent of respondents agreed that sacrificing the autonomous vehicle's passenger was the more moral choice. In short, "most people agree that from a moral standpoint, cars should save the [maximum number of people] even if they must kill their passengers to do so," Bonnefon says.



There is a big "but" coming. When given the option of hypothetically buying a self-driving car that's utilitarian (it saves the greatest number of people) or one that's selfish (programmed to save its passenger at all costs), people are quick to buy the selfish option. When it comes to utilitarian cars, "they tell us that it's great if other people get these cars, but I prefer not to have one myself," says Bonnefon.
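To make the contrast concrete, here is a deliberately oversimplified sketch of how the two programming philosophies from the surveys differ. The function, its names, and its numbers are hypothetical illustrations, not anything from the actual paper:

```python
# Hypothetical sketch: a "utilitarian" car minimizes total casualties,
# while a "self-protective" car shields its own occupants at all costs.

def choose_action(pedestrians_at_risk: int, passengers: int, policy: str) -> str:
    """Pick 'stay' (hit the pedestrians) or 'swerve' (endanger the passengers)."""
    if policy == "utilitarian":
        # Sacrifice whichever group is smaller, even the car's own occupants.
        return "swerve" if passengers < pedestrians_at_risk else "stay"
    if policy == "self-protective":
        # Never endanger the occupants, regardless of how many are outside.
        return "stay"
    raise ValueError(f"unknown policy: {policy}")

# One passenger versus ten pedestrians: the two policies disagree.
print(choose_action(pedestrians_at_risk=10, passengers=1, policy="utilitarian"))      # swerve
print(choose_action(pedestrians_at_risk=10, passengers=1, policy="self-protective"))  # stay
```

The survey respondents, in effect, endorsed the first branch for everyone else's car and the second branch for their own.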



Economists call this feeling a social dilemma. It's a bit like how most people view paying taxes. Yeah, everyone should do it. But nobody is too keen on doing it themselves.



What if Selfish Is Better?



When considering these thorny questions about whom self-driving cars should and should not kill, it's easy to lose sight of the bigger picture, which is that autonomous vehicles have the potential to drastically reduce the number of car accidents and traffic-related deaths by eliminating human error, be it drunk drivers, distracted drivers, or good drivers who just make a mistake. "Just in the United States last year, there were nearly 40,000 traffic fatalities and about 4.5 million serious injuries," says Shariff at the University of Oregon. "Depending on how you calculate it, the dollar cost of those accidents approaches $1 trillion a year."



Just because the numbers say that self-driving cars will be safer, though, doesn't mean people are ready to trust computers to take the wheel. Here, Bonnefon and his colleagues suggest their findings could be useful to policymakers hoping to ensure the safest possible implementation of self-driving cars while still encouraging people to accept them. "These cars have the potential to revolutionize transportation, eliminating the majority of deaths on the road (that's over a million global deaths annually), but as we work on making the technology safer we need to recognize the psychological and social challenges they pose too," says Rahwan at MIT.



Oddly enough, "the best strategy for utilitarian policy-makers may, ironically, be to give up on utilitarian cars," writes Joshua Greene, a psychologist at Harvard University (who wasn't involved in the study), in an essay accompanying the paper. "Autonomous vehicles are expected to greatly reduce road fatalities. If that proves true, and if utilitarian cars are unpopular, then pushing for utilitarian cars may backfire by delaying the adoption of generally safer autonomous vehicles."



Curious how you might approach these ethical self-driving car scenarios? The scientists published an interactive website today for you to explore them.





No one's going to buy a car that is going to kill the owner in an accident.
Old 06-26-2016, 06:00 PM
  #2  
Senior Member
 
 
Join Date: Apr 2011
Location: Churubusco, IN
Posts: 835
Likes: 0
Received 0 Likes on 0 Posts
Vehicle: 84 VW Rabbit, 01 Audi A8L, 08 VW GTI

http://www.youtube.com/watch?v=ZbLRIhRqMVk
Old 06-26-2016, 09:13 PM
  #3  
Super Moderator
 
 
Join Date: Sep 2001
Location: Pflugerville, TX
Posts: 10,795
Received 5 Likes on 5 Posts
Vehicle: 2000 Elantra

Self-driving cars are a bad idea. This is like asking if you want a vomit sandwich or a feces sandwich. I'll take an apple, thanks.



The car should brake and sound the horn, and let the dumbass pedestrians figure out the rest.
Old 06-27-2016, 05:10 PM
  #4  
Administrator
 
 
Join Date: Mar 2006
Location: Lacey, WA
Posts: 12,515
Likes: 0
Received 2 Likes on 2 Posts
Vehicle: Two Accents, Mini, Miata, Van, Outback, and a ZX-6

I would never buy a car that has any way of intentionally killing me. I don't care if an unexpected parade crosses in front of me...do your best to save the occupants of the vehicle.



With further development, self-driving cars will be safer in nearly every way, but some accidents can't be prevented. Punishing owners of safer vehicles by programming them to sacrifice the occupants doesn't seem like a reasonable idea.



