In: Psychology
Please answer the question after reading the short paragraph.
The Department of Transport estimated that last year 35,000 people died from traffic crashes in the US alone. Worldwide, 1.2 million people die every year in traffic accidents. If there were a way we could eliminate 90 percent of those accidents, would you support it? Of course you would. This is what driverless car technology promises to achieve by eliminating the main source of accidents -- human error. Now picture yourself in a driverless car in the year 2030. All of a sudden, the car experiences mechanical failure and is unable to stop. If the car continues, it will crash into a group of pedestrians crossing the street, but the car may swerve, hitting one bystander and killing them to save the pedestrians. What should the car do, and who should decide? What if instead the car could swerve into a wall, crashing and killing you, the passenger, in order to save those pedestrians? This scenario is inspired by the trolley problem, which was invented by philosophers a few decades ago as a way to think about ethics.
Question: Should your driverless car kill you if it means saving five pedestrians? In this primer on the social dilemmas of driverless cars, Iyad Rahwan explores how the technology will challenge our morality and explains his work collecting data from real people on the ethical trade-offs we're willing (and not willing) to make. What are the possible outcomes for a society where these kinds of vehicles are mixed in with human-driven cars? Think about the ethical issues when responding. (At least two paragraphs; a paragraph is at least five sentences long.)
The technology of driverless vehicles promises a 90% reduction in traffic accidents. This would drastically lower the number of people killed on the road. These vehicles will be equipped with the latest technology and techniques to minimize accidents, yet there is still a chance of a mishap, however rare. When such a mishap occurs, the moral question is who should be saved: the passengers or the pedestrians? This is the moral dilemma the technology faces, and the problem lies not with the technology of driverless vehicles itself, but with human morality and the way governments regulate it.
We all agree that there should be minimum harm and maximum happiness. But if we ourselves are the ones who must suffer in order to minimize harm or maximize benefit, we cannot accept it. We will not allow ourselves to be sacrificed for the benefit of others. This is the dilemma the technology faces. We know the benefits of such technology and know that these vehicles are safer and far more efficient than human drivers. But if we know that these vehicles are programmed to sacrifice us for the sake of others when a mishap occurs, we would likely choose conventional, manually driven vehicles instead. Hence, researchers have to think about a solution to this moral dilemma, because driverless vehicles will soon be a reality.