Short Bytes: MIT’s Moral Machine presents moral dilemmas that a self-driving car could face on the road. Decisions made by real humans acting as outside observers could help autonomous cars deal with these situations. The platform also compares your responses with those of other human observers.
Imagine you’re driving at 60 mph and a dog suddenly appears in the middle of the road. How a car should react in situations where an accident is unavoidable is an important question that must be answered before autonomous cars take over our streets.
MIT is working on a platform called Moral Machine to explore such moral dilemmas. By presenting different scenarios, Moral Machine aims to gather the “human perspective on moral decisions made by machine intelligence, such as self-driving cars”.
Whom would you save if a small kid and an old lady were both on the road? Or would you try to save both of them at the cost of your own life and the lives of your fellow passengers?
The scenarios vary: there may be a group of people on the road and a single person in the car. It’s up to the human observer to judge which lives are worth saving. Moral Machine records your input, as an outside observer, on different accident scenarios that could arise in front of a driverless car. This data could help such cars make the right decisions when the situation becomes real.
Try MIT’s Moral Machine and help self-driving cars decide whom to “kill” on the road in order to save others.
If you have something to add, tell us in the comments below.
Also Read: Programmer Makes Self-driving Toy Car Powered By Raspberry Pi, Arduino, Python