Advances In AI Are Used To Spot Signs Of Sexuality by nijabazaar: 3:05pm On Sep 08, 2017
Modern artificial intelligence is much feted. But its talents boil down to a superhuman ability to spot patterns in large volumes of data. Facebook has used this ability to produce maps of poor regions in unprecedented detail, with an AI system that has learned what human settlements look like from satellite pictures. Medical researchers have trained AI running on smartphones to detect cancerous lesions; a Google system can make precise guesses about the year a photograph was taken, simply because it has seen more photos than a human could ever inspect, and has spotted patterns that no human could.
AI’s power to pick out patterns is now turning to more intimate matters.

Research at Stanford University by Michal Kosinski and Yilun Wang has shown that machine vision can infer sexual orientation by analysing people’s faces.

The researchers suggest the software does this by picking up on subtle differences in facial structure. With the right data sets, Dr Kosinski says, similar AI systems might be trained to spot other intimate traits, such as IQ or political views. Just because humans are unable to see the signs in faces does not mean that machines cannot do so.

The researchers’ program, details of which are soon to be published in the Journal of Personality and Social Psychology, relied on 130,741 images of 36,630 men and 170,360 images of 38,593 women downloaded from a popular American dating website, which makes its profiles public. Basic facial-detection technology was used to select all images showing a single face large and clear enough to be analysed. This left 35,326 pictures of 14,776 people, with gay and straight, male and female, all represented evenly.
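
As an illustration of that filtering step, here is a minimal sketch, assuming OpenCV's stock Haar-cascade detector as a stand-in for the unspecified "basic facial-detection technology"; the size threshold and file names are hypothetical.

import cv2

MIN_FACE_SIDE = 100  # hypothetical threshold for "sufficient size", in pixels

# OpenCV ships a pre-trained frontal-face Haar cascade with the library.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def keep_image(path):
    """Keep a photo only if it shows exactly one, large-enough face."""
    img = cv2.imread(path)
    if img is None:
        return False                      # unreadable file
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) != 1:
        return False                      # no face, or more than one
    x, y, w, h = faces[0]
    return min(w, h) >= MIN_FACE_SIDE     # discard faces too small to analyse

photo_paths = ["profile_001.jpg", "profile_002.jpg"]   # hypothetical file names
kept = [p for p in photo_paths if keep_image(p)]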

Out of the numbers
The images were then fed into a different piece of software called VGG-Face, which spits out a long string of numbers to represent each person: their “faceprint”. The next step was to use a simple predictive model, known as logistic regression, to find correlations between the features of those faceprints and their owners’ sexuality (as declared on the dating website). When the resulting model was run on data it had not seen before, it far outperformed humans at distinguishing between gay and straight faces.
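
The modelling step itself is not exotic. Below is a minimal sketch, assuming the VGG-Face descriptors have already been extracted into an array (one row of a few thousand numbers per photo) and using scikit-learn's off-the-shelf logistic regression; the random arrays merely stand in for real faceprints and labels.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 4096))    # placeholder "faceprints", one row per photo
y = rng.integers(0, 2, size=2000)    # placeholder labels: 1 = gay, 0 = straight

# Hold some data back so the model is tested on photos it has never seen.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

# For each unseen photo the model outputs a probability that the person is gay.
probs = clf.predict_proba(X_test)[:, 1]
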
When shown one photo each of a gay and straight man, both chosen at random, the model distinguished between them correctly 81% of the time. When shown five photos of each man, it attributed sexuality correctly 91% of the time. The model performed worse with women, telling gay and straight apart with 71% accuracy after looking at one photo, and 83% accuracy after five. In both cases the level of performance far outstrips human ability to make this distinction. Using the same images, people could tell gay from straight 61% of the time for men, and 54% of the time for women. This aligns with research which suggests humans can determine sexuality from faces at only just better than chance.
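
Those percentages come from a pairwise test: show the model one gay and one straight face and count a success when it assigns the higher probability to the gay one. A minimal sketch of that evaluation, reusing the placeholder probs and y_test from the sketch above; for the five-photo figures, scores would first be averaged per person.

import numpy as np

def pairwise_accuracy(probs, labels, n_pairs=10_000, seed=0):
    """Draw random gay/straight pairs and report how often the gay face scores higher."""
    rng = np.random.default_rng(seed)
    probs, labels = np.asarray(probs), np.asarray(labels)
    gay = rng.choice(probs[labels == 1], size=n_pairs)
    straight = rng.choice(probs[labels == 0], size=n_pairs)
    wins = (gay > straight).sum() + 0.5 * (gay == straight).sum()  # ties count as a coin flip
    return wins / n_pairs

print(pairwise_accuracy(probs, y_test))   # ~0.5 on the random placeholder data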

Dr Kosinski and Mr Wang offer a possible explanation for their model’s performance. As fetuses develop in the womb, they are exposed to various levels of hormones, in particular testosterone. These are known to play a role in developing facial structures, and may similarly be involved in determining sexuality.

The researchers suggest their system can pick up subtle signals of sexuality from these features of facial structure. Using other techniques, the program was found to pay most attention to the nose, eyes, eyebrows, cheeks, hairline and chin when determining male sexuality; the nose, mouth corners, hair and neckline were more important for women.
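
The article does not say which techniques were used to locate those regions, but one common approach is occlusion sensitivity: grey out one patch of the face at a time and watch how the predicted probability moves. A generic sketch, not necessarily the researchers' method, where embed stands for a VGG-Face-style descriptor function and clf for a trained classifier such as the one above:

import numpy as np

def occlusion_map(image, embed, clf, patch=20):
    """Probability drop when each patch x patch block of the image is greyed out."""
    h, w = image.shape[:2]
    base = clf.predict_proba(embed(image).reshape(1, -1))[0, 1]
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            masked = image.copy()
            masked[i:i + patch, j:j + patch] = 127          # grey out one block
            p = clf.predict_proba(embed(masked).reshape(1, -1))[0, 1]
            heat[i // patch, j // patch] = base - p         # big drop = important region
    return heat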

The study has limitations. First, images from a dating site are likely to be particularly revealing of sexual orientation. Second, the 91% accuracy rate applies only when one of the two men whose images are shown is known to be gay. Outside the lab the accuracy rate would be much lower.

To demonstrate this weakness, the researchers selected 1,000 men at random, each with at least five photographs, but in a ratio of gay to straight that more accurately reflects the real world: approximately seven in every 100.

When asked to select the 100 males most likely to be gay, only 47 of those chosen by the system actually were gay, meaning that the system ranked some straight men as more likely to be gay than some men who actually are.

However, when asked to pick out the ten faces it was most confident about, nine of the chosen were in fact gay. If the goal is to pick a small number of people who are very likely to be gay out of a large group, the system appears able to do so.
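
The arithmetic behind that base-rate problem can be reproduced with a toy simulation (this is not the researchers' data): draw 70 "gay" and 930 "straight" scores from two overlapping bell curves whose separation corresponds to roughly 90% pairwise accuracy, then look at the 100 and 10 highest scorers.

import numpy as np

rng = np.random.default_rng(0)
gay = rng.normal(loc=1.9, scale=1.0, size=70)        # ~7 in every 100 of 1,000 men
straight = rng.normal(loc=0.0, scale=1.0, size=930)

scores = np.concatenate([gay, straight])
labels = np.concatenate([np.ones(70), np.zeros(930)])

order = np.argsort(scores)[::-1]                     # highest score first
print("gay among top 100:", int(labels[order[:100]].sum()))
print("gay among top 10: ", int(labels[order[:10]].sum()))

A run of this kind typically shows only about half of the top 100 belonging to the 7% minority, while nearly all of the top ten do, which is the pattern the researchers report.
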
The point is not that Dr Kosinski and Mr Wang have created software which can reliably determine gay from straight. That was not their goal. Rather, they have demonstrated that such software is possible.
