The impact of Olay’s commitment to correcting beauty biases in search engine algorithms


As the world becomes increasingly dependent on search results to inform decisions in our daily lives, it is of the utmost importance that those results are not only accurate but also carry as little bias as possible. Yet researchers have found that while women of color make up 40% of the total US population, they appear in only 20% of search results for keyword combinations like “women” + “beautiful skin.”
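To make that disparity concrete, here is a minimal, hypothetical sketch of the counting step at the heart of an audit like the one cited above. The `representation_share` function and the labeled results are invented for illustration; a real audit would hand-label (or carefully machine-label) thousands of returned images.

```python
# Hypothetical sketch: measuring demographic representation in search results.
# The labels below are invented for illustration only.

def representation_share(results, group):
    """Fraction of results tagged with the given demographic group."""
    if not results:
        return 0.0
    return sum(1 for r in results if group in r["groups"]) / len(results)

# Invented top-10 image results for a query like "women" + "beautiful skin"
results = [
    {"id": 1, "groups": {"white"}},
    {"id": 2, "groups": {"white"}},
    {"id": 3, "groups": {"woman_of_color"}},
    {"id": 4, "groups": {"white"}},
    {"id": 5, "groups": {"white"}},
    {"id": 6, "groups": {"woman_of_color"}},
    {"id": 7, "groups": {"white"}},
    {"id": 8, "groups": {"white"}},
    {"id": 9, "groups": {"white"}},
    {"id": 10, "groups": {"white"}},
]

share = representation_share(results, "woman_of_color")
population_share = 0.40  # share of the US population, per the article
print(f"search share: {share:.0%} vs. population share: {population_share:.0%}")
```

A gap like the 20% vs. 40% in this toy data is exactly the kind of disparity such audits surface.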

It can be easy to forget that the Internet does not exist in a vacuum, and in many ways it is a self-fulfilling prophecy. Adjectives like ‘beautiful’ have long been associated almost exclusively with Eurocentric ideals (read: slim, white, cisgender, able-bodied, and heterosexual), and images that reflect these ideals end up filling our search results.

It’s time to stop this problematic cycle.

Olay, which has long been a pioneer at the intersection of beauty and social impact, plays a central role in helping to diversify the people who write code. Through its #FacetheSTEMGap initiative, the brand has committed over the next decade to doubling the number of women and tripling the number of women of color in STEM (science, technology, engineering, and mathematics).

Olay’s most recent effort to close the STEM gap focuses on “Decode bias” in algorithmic coding. The brand called on Joy Buolamwini, founder of the Algorithmic Justice League (AJL), a digital advocacy organization that combines art and research to illuminate the social implications and harms of artificial intelligence, to serve as the face of the campaign. A true trailblazer in the data science space, Buolamwini has, through years of work, in many ways catalyzed today’s public discourse on the ways in which data can be inherently biased.

Like Buolamwini, Tashay Green, a senior applied data engineer in Chicago, wants more people not only to think critically about the data we interact with every day but also to create lasting change. For the past four years, she has spent her days “creating predictive learning models to predict some sort of future outcome.” These models help determine the targeted advertisements shown to you online or the suggested items that appear in your online shopping cart. Green is intimately familiar with the biases that can be baked into this data, as well as the implications those biases can have. “We always have an impact on the data we receive, none of this is objective and there is no one right way to do it,” she adds.

