The question of whether self-driving cars are ethically defensible, and of when they will actually arrive, has been debated for years. But are self-driving cars really ethically wrong, and are they as far away as we think?
Back in 2014, Elon Musk, the CEO of Tesla, stated that thoroughly tested self-driving cars would exist by 2018. That prediction has not proven entirely inaccurate.
A video by the YouTube channel "Philosophy Tube" presents a well-known ethical thought experiment called the "trolley problem", illustrated with examples of automated vehicles such as driverless trains and subways.
Suppose such a vehicle were out of control and about to kill five people, but you were standing next to a switch that could divert it onto another track, where it would kill only one person. You would have to choose what to do.
He then makes the same point as many others who discuss the ethics of self-driving cars: when a person takes action in a situation like a car crash, they act by choice. But if an automated vehicle were in the same position, could it make that ethical choice at all?