The Impact of Olay’s Commitment to Correcting Beauty Biases in Search Engine Algorithms

As the world becomes increasingly dependent on search results to inform decisions in our daily lives, it is of the utmost importance that these results are not only accurate but also contain as little bias as possible. However, researchers have found that while women of color make up 40% of the total US population, they appear in only 20% of search results for keyword combinations such as “women” + “beautiful skin”.

It can be easy to forget that the internet doesn’t exist in a vacuum, and in many ways, that’s a self-fulfilling prophecy. Adjectives like “beautiful” have long been almost exclusively associated with Eurocentric ideals (read: thin, white, cisgender, able-bodied, and heterosexual), and images that reflect these ideals are what end up populating our search results.

It’s time to stop this problematic cycle.

Olay, which has long been a pioneer at the intersection of beauty and social impact, plays a pivotal role in helping to diversify who writes code. The brand has committed to #FacetheSTEMGap over the next decade, striving to double the number of women and triple the number of women of color in STEM (science, technology, engineering, and mathematics).

Olay’s most recent effort to close the STEM gap focuses on “Decoding the bias” in algorithmic coding. The brand called on Joy Buolamwini, founder of the Algorithmic Justice League (AJL), a digital advocacy organization that combines art and research to illuminate the social implications and harms of artificial intelligence, to serve as the face of the campaign. A true pioneer in the field of data science, Buolamwini has, through her decades of work, in many ways been the catalyst for today’s public discourse on how data is inherently biased.

Like Buolamwini, Tashay Green, a senior applied data engineer in Chicago, wants more people not only to think critically about the data we interact with every day, but also to create lasting change. For the past four years, she has spent her days “creating predictive learning models to forecast some kind of future outcome.” These models help populate the targeted advertisements presented to you online or the suggested items that appear in your online shopping cart. Green knows intimately the biases that can creep in when interpreting such data, as well as the implications those biases can have. “We always have an impact on the data we receive, none of it is objective and there is no one right way to do anything,” she adds.
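The feedback loop that Green and Buolamwini describe can be sketched with a toy example (the group names and numbers below are hypothetical illustrations, not figures from Olay, AJL, or any real system): a ranker that scores results purely by historical click share will simply reproduce whatever skew those past clicks already contain.

```python
from collections import Counter

# Hypothetical click log for a single search query. If past results were
# skewed toward one group, the log inherits that skew.
click_log = ["group_a"] * 80 + ["group_b"] * 20  # 80/20 historical imbalance

def popularity_shares(log):
    """Score each group by its share of historical clicks.

    A naive popularity-based ranker that uses these scores will surface
    group_a far more often, generating more group_a clicks in turn --
    the self-fulfilling prophecy described above.
    """
    counts = Counter(log)
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}

shares = popularity_shares(click_log)
print(shares)  # group_a's 80% share is fed straight back into future rankings
```

Nothing in the data pipeline is "objective" here: the model faithfully learns the bias baked into its inputs, which is why practitioners audit training data rather than trusting the algorithm to be neutral.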