The story at a glance
- As more people spend more time online, algorithms have become ubiquitous as platforms compete for users’ attention.
- But many of these algorithms are trained on biased data collected over years, and they can perpetuate stereotypes and even produce harmful results.
- New research from New York University shows that a country’s level of gender inequality is reflected in its search engine results, and that exposure to those results can bias users’ own judgments.
Algorithms have become a staple of the digital age, influencing everything from news feeds to healthcare delivery. But they have come under fire from some lawmakers after evidence showed that algorithms on social media spread misinformation and create echo chambers online.
Now, new research from New York University psychologists highlights persistent gender biases in algorithms that drive real-world effects and reinforce social inequalities.
“These results call for an integrative model of ethical AI [artificial intelligence] that includes human psychological processes to illuminate the formation, functioning, and mitigation of algorithmic biases,” the authors wrote in the journal Proceedings of the National Academy of Sciences.
Artificial intelligence (AI) algorithms are designed to detect patterns in large datasets. The problem is that many of the large datasets that currently exist reflect society’s inherent biases.
To gauge the gender inequalities already present in society, the researchers drew on data from the Global Gender Gap Index (GGGI), which measures the extent of gender inequality in 153 countries through the prism of economic participation, educational attainment and other measures.
The investigators then searched for the gender-neutral term “person” on Google Images in 37 countries, using the dominant language of each country. Three months later, they repeated the experiment in 52 countries, including 31 from the first round.
In both rounds, greater gender inequality at the national level – as measured by the GGGI – was associated with more male-dominated Google image search results, demonstrating a link between societal-level disparities and algorithmic output, the authors wrote.
To test the impact this bias might have on real-world decisions, the researchers conducted a series of experiments with 395 men and women in the United States, using sets of images drawn from the Internet search results of different countries.
Low-inequality countries (Iceland and Finland) tended to have nearly equal representation of men and women in the image results, while high-inequality countries (Hungary and Turkey) had predominantly male results (around 90 percent).
Participants were told they were looking at image search results for four lesser-known professions: material handler, draper, peruker, and lapidary. Before viewing the image sets, participants were asked which gender was most likely to be employed in each occupation. Responses to questions such as “Who is more likely to be a peruker, a man or a woman?” served as baseline perceptions.
In all four categories, participants, regardless of their own gender, judged that men were more likely than women to be material handlers, drapers, perukers and lapidaries.
The same questions were asked after participants viewed the image sets.
This time, those viewing images from low-inequality countries reversed the male-biased assumptions they had reported at baseline, the researchers found. Participants who viewed image sets from high-inequality countries, by contrast, retained their male-biased perceptions.
An additional experiment asked participants to judge the likelihood of a man or a woman being employed in each occupation. Participants were then shown images of men and women and asked to select one person to hire for each profession.
Images from countries with lower inequality led participants to make more gender-equal hiring selections, and vice versa.
Overall, the studies point to a “cycle of propagating bias between society, AI, and users,” the authors wrote.
“These findings demonstrate that levels of societal inequality are evident in Internet search algorithms, and that exposure to this algorithmic output can cause human users to think and potentially act in ways that reinforce this societal inequality.”
Posted on July 13, 2022