The gaydar: a threat of AI

27 September 2017


Nowadays, AI enables us to use fingerprints for almost everything, from giving us access to secure buildings to unlocking our phones. In the case of the latter, Apple takes it one step further: the iPhone X uses Face ID to recognize your face in order to unlock your phone. Until now, all these innovations seem quite useful, right?

Unfortunately, some people seem to spend their creative energy somewhat differently, for example on inventing a “gaydar”. According to the researchers, this software radar, which assesses whether someone is gay, is better at doing so than humans are. In their research, the software found patterns in the facial features of homosexual men and women across 35,326 pictures from a dating website (Kosinski & Wang, 2017).

You might be thinking: what is this good for? Luckily, we are not the only ones asking, as this research has caused quite a stir. One woman wrote to the scientists: “stop studying my face, it makes me feel I am not human”. Moreover, what if this kind of technology were used to discriminate or, even worse, to prosecute homosexuals? According to Kosinski, a deep neural network (used to find patterns in pictures) identified features that increase the likelihood of someone being homosexual: for men, a narrower jawline, a longer nose, and narrower eyebrows; for women, the other way around. However, the 91 per cent accuracy of the algorithm does not mean that the software can pick out 91 per cent of the homosexuals in society, as the chance of false positives remains high. Often, AI only works well under artificial conditions.
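
To see why false positives matter so much, here is a rough back-of-the-envelope sketch; the population size, base rate, sensitivity and specificity below are assumptions picked purely for illustration, not figures from the study:

```python
# Back-of-the-envelope sketch: why 91 per cent accuracy does not let you
# "pick out" 91 per cent of gay men in society. Every number below is an
# assumption chosen for illustration; none comes from the study itself.

population = 1_000_000   # assumed number of men screened
base_rate = 0.07         # assumed share of gay men in that population
sensitivity = 0.91       # assumed chance a gay man is flagged
specificity = 0.91       # assumed chance a straight man is NOT flagged

gay_men = population * base_rate
straight_men = population - gay_men

true_positives = gay_men * sensitivity               # gay men correctly flagged
false_positives = straight_men * (1 - specificity)   # straight men wrongly flagged

flagged = true_positives + false_positives
print(f"Men flagged as gay: {flagged:,.0f}")
print(f"Share of flagged men who are actually gay: {true_positives / flagged:.0%}")
# Under these assumptions only about 43% of the flagged men are actually gay,
# even though the classifier is "91 per cent accurate" on individuals.
```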

Artificial intelligence can make life easier, but, as in this case, it can also be used to make hasty generalizations. Sexual orientation is something human and is not to be found in one binary state. As the LGBTQ Nation website states: you can say you are gay, behave like you are gay, or feel attracted to people of the same sex. One terrifying thing we can learn from this “gaydar”: as all technologies improve over the years, so will these models, and together with our pictures and personal information being easily available on the Internet, privacy and civil rights could be damaged. This is especially dangerous in countries where people can be prosecuted for their sexual orientation.

Source: https://www.theguardian.com/technology/2017/sep/12/artificial-intelligence-face-recognition-michal-kosinski

Kosinski, M. and Wang, Y. (2017) Deep neural networks are more accurate than humans at detecting sexual orientation from facial images. [online] OSF. Available at: https://osf.io/zn79k/ [Accessed 27 Sep. 2017].


2 thoughts on “The gaydar: a threat of AI”

  1. Hi Puck,

    It is indeed quite concerning that governments that do not agree with “contemporary” views on sexual orientation might be able to apply this algorithm to single out non-heterosexual people, probably with ominous consequences…
    But I think that publicizing this research might be better than hiding it, since it allows people to take preventive measures. Governments that really want to subject their citizens to facial analytics will probably find this out anyway (or may already have). Some research also suggests that governments and companies are already doing this (Chin & Lin, 2017; Lubin, 2016).

    The authors of the paper also said something about this:

    “(…) this work does not offer any advantage to those who may be developing or deploying classification algorithms, apart from emphasizing the ethical implications of their work. We used widely available off-the-shelf tools, publicly available data, and methods well known to computer vision practitioners. We did not create a privacy-invading tool, but rather showed that basic and widely used methods pose serious privacy threats. We hope that our findings will inform the public and policymakers, and inspire them to design technologies and write policies that reduce the risks faced by homosexual communities across the world.”

    And I think it’s a bit unfair to the authors that The Guardian devoted only a single sentence (like most other news websites did) to the concerns they raised, while those concerns cover about two pages of their work.

    But that’s not the only thing my reaction is about 🙂 I noticed that most of the media coverage focuses only on the “accuracy” of the algorithm. While this is an important, and easy-to-grasp, scoring method, there is more to classifier performance than accuracy alone!

    As the authors also write in their paper, an accuracy of 91% does not imply that 91% of gay men in a given population can be identified, or that the classification results are correct 91% of the time. Other scoring methods provide more insight into the actual performance of the algorithm: precision and recall.

    A high precision means that a classifier returns more relevant results than irrelevant ones: precision is the fraction of people classified as gay who actually are gay. Recall is the fraction of gay people in the population who are correctly classified as gay.

    It’s a bit of a pity that the authors of the paper did not disclose their precision and recall scores; they only discussed them with a short example (see the toy sketch below). Perhaps we could better estimate the power of the classification algorithm if we had those scores…
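
    To make the difference concrete, here is a tiny, purely illustrative sketch with made-up labels (these numbers have nothing to do with the paper’s actual data), showing how accuracy, precision and recall can diverge for the same predictions:

    ```python
    # Toy illustration of accuracy vs. precision vs. recall.
    # The labels below are invented for this comment, not taken from the paper.
    from sklearn.metrics import accuracy_score, precision_score, recall_score

    # 1 = gay, 0 = straight (toy ground truth and toy predictions)
    y_true = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]
    y_pred = [1, 1, 0, 1, 0, 0, 0, 0, 0, 0]

    print("accuracy: ", accuracy_score(y_true, y_pred))   # 0.8  (8 of 10 correct)
    print("precision:", precision_score(y_true, y_pred))  # 0.67 (2 of 3 flagged are really gay)
    print("recall:   ", recall_score(y_true, y_pred))     # 0.67 (2 of 3 gay people are found)
    ```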

    Anyway, I’ll wrap this up before this comment gets too long 🙂
    —–
    References
    – Chin, J., & Lin, L. (2017, June 26). China’s all-seeing surveillance state is reading its citizens’ faces. The Wall Street Journal.
    – Lubin, G. (2016, October 12). Facial-profiling could be dangerously inaccurate and biased, experts warn. Business Insider.

  2. Interesting quote from the woman: “stop studying my face, it makes me feel I am not human.” With the amount of publicly available data, including personal images as well as group images on social media, this may be a difficult problem to address. Let’s hope similar studies are not used in countries with oppressive regimes, where facial features may put you at a disadvantage, regardless of your orientation.
