
Beauty Contest Judged by Robots Shows Bias Against Darker Skin Tones, But the Humans Are to Blame


An international beauty contest is embroiled in controversy after its “judges” seemingly favored contestants with lighter complexions over those with darker complexions.

Beauty.AI, the first international beauty competition judged by artificial intelligence, held its inaugural contest this year, in which 6,000 people from across the globe submitted photos with hopes of being crowned the “most beautiful.”

According to The Guardian, the contest’s “robot juries” were supposed to use objective parameters like facial symmetry, wrinkles and gender to choose the most attractive contestant(s). However, creators of the unique beauty contest quickly realized there was a problem with their algorithm, as almost none of their chosen winners had dark skin.

Despite a large number of entries coming from contestants in Africa and India, nearly all of the contest’s 44 winners were white. A few were of Asian descent, but only one had dark skin, The Guardian reports.

There are likely a number of reasons why the “machines” favored lighter-skinned contestants over darker-skinned ones, but Beauty.AI’s chief science officer, Alex Zhavoronkov, attributes the clear racial bias to the lack of minorities represented in the data used to train the algorithm’s standards of attractiveness. As a result, the artificial intelligence never learned to view contestants with darker skin tones as beautiful.

“If you have not that many people of color within the data set, then you might actually have biased results,” said Zhavoronkov. “When you’re training an algorithm to recognize certain patterns … you might not have enough data, or the data might be biased.”
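Zhavoronkov’s point can be made concrete with a toy experiment. The sketch below is purely illustrative — the data, features and logistic-regression model are all hypothetical stand-ins, not Beauty.AI’s actual system. It trains a classifier on a 95/5 imbalanced dataset in which the signal for “attractive” differs between the two groups; the model fits the majority group’s pattern and performs close to chance on the underrepresented group.

```python
# Minimal sketch (hypothetical data and model, not Beauty.AI's system)
# of how underrepresentation in training data skews per-group accuracy.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_majority, n_minority = 950, 50   # 95/5 imbalance in the training set

# Two hypothetical face-derived features per sample.
X_maj = rng.normal(size=(n_majority, 2))
X_min = rng.normal(size=(n_minority, 2))

# In this toy world, "attractive" depends on feature 0 for the majority
# group but on feature 1 for the minority group.
y_maj = (X_maj[:, 0] > 0).astype(int)
y_min = (X_min[:, 1] > 0).astype(int)

X = np.vstack([X_maj, X_min])
y = np.concatenate([y_maj, y_min])

model = LogisticRegression().fit(X, y)

# Evaluate on fresh samples drawn from each group.
X_maj_test = rng.normal(size=(1000, 2))
X_min_test = rng.normal(size=(1000, 2))
acc_maj = model.score(X_maj_test, (X_maj_test[:, 0] > 0).astype(int))
acc_min = model.score(X_min_test, (X_min_test[:, 1] > 0).astype(int))

print(f"majority-group accuracy: {acc_maj:.2f}")  # typically high
print(f"minority-group accuracy: {acc_min:.2f}")  # typically near chance
```

The model isn’t “racist” in any intentional sense; it simply never saw enough minority examples to learn their pattern, which is exactly the failure mode Zhavoronkov describes.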

A simpler explanation for the disparity is that the humans who create these algorithms often harbor implicit racial biases themselves. Those biases get built into the software and amplified at scale, ultimately skewing results that are supposed to be neutral and objective.

The Beauty.AI contest results offer “the perfect illustration of the problem,” Bernard Harcourt, Columbia University professor of law and political science, told The Guardian. “The idea that you could come up with a culturally neutral, racially neutral conception of beauty is simply mind-boggling.”

The contest’s major faux pas has raised new concerns among computer science professionals, social justice advocates and experts from other arenas about the growing use of biased artificial intelligence systems. In some cases, this technology can be detrimental to the lives of African-Americans and other people of color.

For instance, a May 2016 investigation by the nonprofit newsroom ProPublica found that an algorithm used to predict defendants’ future criminality wrongly flagged Black defendants as likely to be arrested for a crime over the next two years nearly twice as often as white defendants. The faulty risk-assessment software was developed by the research and consulting firm Northpointe Inc., Atlanta Black Star reports.

The investigation revealed that Black defendants who didn’t go on to commit new crimes were mislabeled as high-risk 45 percent of the time, compared to just 23 percent of the time for whites. Black defendants were also more likely to be labeled as “high-risk” for committing violent crimes.
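The disparity ProPublica measured is a difference in false positive rates: among people who did not go on to reoffend, what share were flagged “high-risk”? Here is a minimal sketch of that calculation — the record counts are made up to reproduce the reported 45 percent and 23 percent figures, not drawn from ProPublica’s actual dataset.

```python
# Group-wise false positive rate: of defendants who did NOT reoffend,
# what fraction were flagged "high-risk"? (Illustrative counts only.)

def false_positive_rate(flagged_high_risk, reoffended):
    """FPR = wrongly flagged / all who did not reoffend."""
    # Collect the flag outcomes for everyone who did not reoffend.
    negatives = [f for f, r in zip(flagged_high_risk, reoffended) if not r]
    if not negatives:
        return float("nan")
    return sum(negatives) / len(negatives)

# Hypothetical records: (flagged high-risk?, reoffended within 2 years?)
black = [(True, False)] * 45 + [(False, False)] * 55   # 45% wrongly flagged
white = [(True, False)] * 23 + [(False, False)] * 77   # 23% wrongly flagged

for name, group in [("Black", black), ("white", white)]:
    flags, outcomes = zip(*group)
    print(f"{name}: FPR = {false_positive_rate(flags, outcomes):.0%}")
```

A tool can look “accurate” overall while making its mistakes unevenly, which is why fairness audits compare error rates group by group rather than reporting a single accuracy number.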

A less life-altering case of AI prejudice came when a trio of Brazilian researchers found that a Google search of the phrase “ugly woman” returned a disproportionate number of images of Black women.

According to their report, titled “Identifying Stereotypes in the Online Perception of Physical Attractiveness,” 85.7 percent of the countries examined in Google and 76.4 percent of those examined in the Bing search engine displayed negative stereotypes of Black women. These countries included Spain, Guatemala, Argentina, the United States, Peru, Mexico, Venezuela, Chile, Brazil and Paraguay, among others.

“…These are countries with a strong presence of the Hispanic and Latino cultures,” the analysis read. “The centroid of this cluster (Black: -3.28, Asian: 0.60, white: 2.02) indicates that for this group of countries, there is a very negative stereotype regarding Black women and a positive stereotype for white women.”
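The “centroid” the researchers cite is just the average of each country’s per-race stereotype scores within a cluster. The sketch below uses made-up scores (not the study’s data) to show the shape of that computation, with positive values indicating a positive stereotype.

```python
# Illustrative sketch (made-up scores, not the study's data) of a cluster
# centroid: the mean per-race stereotype score across a group of countries.
import numpy as np

# Hypothetical per-country scores; columns: [Black, Asian, white]
cluster = np.array([
    [-3.5, 0.4, 2.1],   # hypothetical country A
    [-3.1, 0.7, 1.9],   # hypothetical country B
    [-3.2, 0.7, 2.1],   # hypothetical country C
])

centroid = cluster.mean(axis=0)  # average each column across countries
for race, score in zip(["Black", "Asian", "white"], centroid):
    print(f"{race}: {score:.2f}")
# Output has the same shape as the reported centroid:
# Black: -3.27 / Asian: 0.60 / white: 2.03
```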
