Since 2016, scientists have posed this scenario to folks around the world through the “Moral Machine,” an online platform hosted by the Massachusetts Institute of Technology that gauges how humans respond to ethical decisions made by artificial intelligence.

The team behind the Moral Machine released responses from more than two million people spanning 233 countries, dependencies and territories. They found a few near-universal preferences (for instance, respondents preferred to save people over animals, and young people over older people), but other responses varied with regional culture and economic status.

The study’s findings offer clues about how driverless vehicles might be programmed ethically to reflect regional preferences, but they also highlight an underlying diversity problem in the tech industry: it often leaves out voices from the developing world.

The Moral Machine presents participants with a quiz of 13 randomly generated questions. Each scenario offers two choices: save the car’s passengers or save the pedestrians. The characteristics of the passengers and pedestrians vary randomly, including gender, age, social status and physical fitness.

An example question posed to Moral Machine participants.
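To make the setup concrete, here is a minimal Python sketch of how such randomized dilemmas could be generated. The attribute pools, group sizes and function names here are illustrative assumptions, not the platform’s actual implementation.

```python
import random

# Illustrative attribute pools; the Moral Machine's real categories
# and sampling weights are assumptions here.
GENDERS = ["female", "male"]
AGES = ["child", "adult", "elderly"]
STATUSES = ["executive", "average", "homeless"]
FITNESS = ["athletic", "average", "large"]

def random_character():
    """Build one character with randomly drawn attributes."""
    return {
        "gender": random.choice(GENDERS),
        "age": random.choice(AGES),
        "status": random.choice(STATUSES),
        "fitness": random.choice(FITNESS),
    }

def random_scenario():
    """One dilemma: spare the car's passengers or the pedestrians."""
    return {
        "passengers": [random_character() for _ in range(random.randint(1, 5))],
        "pedestrians": [random_character() for _ in range(random.randint(1, 5))],
    }

# A session is a set of 13 such dilemmas, as in the Moral Machine quiz.
session = [random_scenario() for _ in range(13)]
```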

What they found: The researchers identified three relatively universal preferences. On average, people wanted to spare humans over animals, to save more lives rather than fewer, and to prioritize young people over old ones.

Collectivism vs. Individualism

The researchers note that participants from collectivist cultures, such as China and Japan, were less likely to spare the young over the old, perhaps, they hypothesized, because of a greater cultural emphasis on respecting the elderly.

The results also showed that participants from individualistic cultures, such as the UK and US, placed stronger emphasis on sparing more lives, all else being equal, perhaps, in the authors’ view, because those cultures put greater weight on the value of each individual.

[Source / Via]