New AI can guess whether you're gay or straight from a photograph

Jimi Swagger
I say whatever I think should be said
Supporter · Joined: Jan 25, 2015 · Messages: 4,365 · Reputation: -1,340 · Daps: 6,058 · Reppin: Turtle Island to DXB
An algorithm deduced the sexuality of people on a dating site with up to 91% accuracy, raising tricky ethical questions

[Image: facialrec.jpg]

It’s easy to imagine spouses using the technology on partners they suspect are closeted, or teenagers using the algorithm on themselves or their peers. More frighteningly, governments that continue to prosecute LGBT people could hypothetically use the technology to out and target populations. That means building this kind of software and publicizing it is itself controversial given concerns that it could encourage harmful applications.

But the authors argued that the technology already exists, and its capabilities are important to expose so that governments and companies can proactively consider privacy risks and the need for safeguards and regulations.

“It’s certainly unsettling. Like any new tool, if it gets into the wrong hands, it can be used for ill purposes,” said Nick Rule, an associate professor of psychology at the University of Toronto, who has published research on the science of gaydar. “If you can start profiling people based on their appearance, then identifying them and doing horrible things to them, that’s really bad.”

Rule argued it was still important to develop and test this technology: “What the authors have done here is to make a very bold statement about how powerful this can be. … Now we know that we need protections.”

Michal Kosinski, a co-author of the study, was not available for an interview, according to a Stanford spokesperson. The professor is known for his work with Cambridge University on psychometric profiling, including using Facebook data to draw conclusions about personality. Donald Trump’s campaign and Brexit supporters deployed similar tools to target voters, raising concerns about the expanding use of personal data in elections.

In the Stanford study, the authors also noted that artificial intelligence could be used to explore links between facial features and a range of other phenomena, such as political views, psychological conditions or personality.
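
For anyone curious about the mechanics, the pipeline studies like this describe is simple in outline: embeddings from a pretrained face-recognition network are fed to a plain classifier and scored with a ranking metric such as AUC. Below is a minimal, purely hypothetical sketch of that shape; the embedding size, the random stand-in data, and the choice of scikit-learn's LogisticRegression are assumptions for illustration, not the Stanford authors' actual code, data, or model.

```python
# Purely hypothetical sketch: face embeddings -> simple classifier -> AUC.
# The embeddings and labels below are random placeholders, so the resulting
# AUC will hover around chance (~0.5). This is NOT the Stanford study's setup.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_features = 2000, 128                  # assumed embedding size
X = rng.normal(size=(n_samples, n_features))       # stand-in face embeddings
y = rng.integers(0, 2, size=n_samples)             # stand-in binary labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = clf.predict_proba(X_test)[:, 1]

# AUC: probability the model ranks a random positive above a random negative.
print("AUC:", roc_auc_score(y_test, scores))
```

With real embeddings and real labels, a few lines of this kind are all the machinery involved, which is largely why the authors argue the capability already exists and needs safeguards.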

This type of research further raises concerns about the potential for scenarios like the science-fiction movie Minority Report, in which people can be arrested based solely on the prediction that they will commit a crime.

“AI can tell you anything about anyone with enough data,” said Brian Brackeen, CEO of Kairos, a face recognition company. “The question is as a society, do we want to know?”

Brackeen, who called the Stanford data on sexual orientation “startlingly correct”, said there needs to be an increased focus on privacy and on tools to prevent the misuse of machine learning as it becomes more widespread and advanced.

Rule speculated about AI being used to actively discriminate against people based on a machine’s interpretation of their faces: “We should all be collectively concerned.”

:pachaha::dame:

 

Scoop
All Star · Joined: Jun 17, 2012 · Messages: 6,100 · Reputation: -2,690 · Daps: 9,695 · Reppin: Tampa, FL
Pretty easy to tell 90% of the time

A broader discussion on AI discrimination would be more interesting
 

southpawstyle
Superstar · Joined: Aug 31, 2013 · Messages: 4,363 · Reputation: 1,405 · Daps: 15,705 · Reppin: California
lmao. I'd be considered gay as hell because I like Gilmore Girls, Fiona Apple, and the musical Chicago. I'll still jab the teeth out of anyone who wants to be disrespectful.
 

mr.africa
Veteran · Joined: Aug 17, 2013 · Messages: 19,837 · Reputation: 3,955 · Daps: 68,290
:wrist: :hula: :rambo:

:wow: we got too much time on our hands as a human race
:gucci: is this really necessary??
:wow: some guys be like: 'need a handheld version of this, so I can use it before approaching a woman' :troll:
 