Compose a post that outlines one of the many ethical dilemmas posed by autonomous vehicles. First, explore both sides of the issue fully, without bias. Then, once you’ve laid out the problem, explain which side YOU agree with, and why you feel that way.
As self-driving technology makes its way into cars on the road, the debate is sharpening over how these vehicles should behave in situations their designers did not anticipate. Studies suggest that autonomous cars could reduce road accidents by up to 90%, but no one believes accidents can be eliminated entirely. That raises the ethical dilemma: when a crash is unavoidable, whom should the car protect, and whom should it put at risk?
Germany was the first country to attempt an answer, issuing a set of ethical guidelines for automated driving. The rules state that self-driving cars should first seek to avoid or minimise death, and that they must not discriminate between people on the basis of gender, age, or any other personal factor. They also state that human life must take priority over animals and property. Critics, however, argue that autonomous cars cannot weigh such situations as reasonably as humans can; an abrupt emergency braking manoeuvre, for example, might cause injuries that a human driver would have avoided.
In my view, with appropriate guidelines like Germany's in place, this should no longer be a dilemma. When human drivers get into an accident, they make split-second, subconscious choices that can be selfish, favouring themselves and their passengers. Machines do not share that flaw. If reasonable instructions are built into their decision-making, they can act more rationally than a human brain in such situations.