Here's What's Really Going on With That Study Saying AI Can Detect Your Sexual Orientation

• http://www.sciencealert.com

Last week, scientists made headlines around the world when news broke of an artificial intelligence (AI) that had been trained to determine people's sexual orientation from facial images more accurately than humans.

According to the study, when looking at photos this neural network could correctly distinguish between gay and heterosexual men 81 percent of the time (and 74 percent for women), but it didn't take long before news of the findings provoked an uproar.

On Friday, sex and gender diversity groups GLAAD and the Human Rights Campaign (HRC) issued a joint statement decrying what they called "dangerous and flawed research that could cause harm to LGBTQ people around the world".

The AI, which was trained by researchers from Stanford University on more than 35,000 public images of men and women sourced from an American dating site, used a predictive model called logistic regression to classify their sexual orientation (also made public on the site) based on their facial features.

This included fixed features, such as the shape of a person's nose, as well as transient features, such as grooming style.
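The classification step the article describes can be illustrated with a minimal sketch. This is not the Stanford researchers' actual pipeline: it assumes facial features (fixed traits like nose shape, transient ones like grooming style) have already been extracted into fixed-length numeric vectors, and it fits a logistic regression on synthetic stand-in data purely to show how such a binary classifier works.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for extracted facial-feature vectors;
# the real study's features and labels are not reproduced here.
n_samples, n_features = 1000, 20
X = rng.normal(size=(n_samples, n_features))
true_weights = rng.normal(size=n_features)
# Labels follow a noisy linear rule so the toy problem is learnable.
y = (X @ true_weights + rng.normal(scale=2.0, size=n_samples) > 0).astype(int)

# Hold out a test set, mirroring the study's use of unseen images.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.2f}")
```

The model outputs a probability per class from a weighted sum of the input features; a reported figure like "81 percent" is simply this kind of held-out accuracy, which, as the critics below note, depends entirely on how representative the training sample is.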

In the researchers' testing using a separate set of images that the algorithm hadn't seen before, the neural network outperformed human judges attempting to determine the sexual orientation of the individuals shown, with the human judges scoring 61 percent for men and 54 percent for women.

When the algorithm was presented with five facial images of each person, it became even more accurate, the researchers claimed, getting it right 91 percent of the time with men and 83 percent with women.

It's worth pointing out that the sample of images the researchers used had some definite limitations. For starters, they're all profile shots taken from a dating site, so they're not exactly regular images of the individuals involved, and the study only compiled images of white people aged between 18 and 40.

Taking these kinds of factors into account, GLAAD's Chief Digital Officer, Jim Halloran, says any claim that this AI can determine people's sexual identity is grossly flawed.

"Technology cannot identify someone's sexual orientation. What their technology can recognise is a pattern that found a small subset of out white gay and lesbian people on dating sites who look similar," Halloran says.

"This research isn't science or news, but it's a description of beauty standards on dating sites that ignores huge segments of the LGBTQ community, including people of colour, transgender people, older individuals, and other LGBTQ people who don't want to post photos on dating sites."

GLAAD and the HRC further pointed out that the study, which has not yet been published, didn't verify people's information, assumed only two sexual orientations and didn't include data on bisexual individuals, and wasn't peer-reviewed.

But more damningly, the organisations said that this kind of "junk science" could even be threatening in the wrong hands.

"This is dangerously bad information that will likely be taken out of context," explains HRC's Director of Public Education and Research, Ashland Johnson.

"Imagine for a moment the potential consequences if this flawed research were used to support a brutal regime's efforts to identify and/or persecute people they believed to be gay."
