Artificial intelligence can accurately guess whether people are gay or straight based on photos of their faces, according to new research suggesting that machines can have significantly better "gaydar" than humans.
The study from Stanford University – which found that a computer algorithm could correctly distinguish between gay and straight men 81% of the time, and 74% for women – has raised questions about the biological origins of sexual orientation, the ethics of facial-detection technology and the potential for this kind of software to violate people's privacy or be abused for anti-LGBT purposes.
The machine intelligence tested in the research, which was published in the Journal of Personality and Social Psychology and first reported in the Economist, was based on a sample of more than 35,000 facial images that men and women publicly posted on a US dating website. The researchers, Michal Kosinski and Yilun Wang, extracted features from the images using "deep neural networks", meaning a sophisticated mathematical system that learns to analyze visuals based on a large dataset.
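The pipeline the article describes is a common two-stage design: a deep neural network converts each face photo into a numeric feature vector, and a simple classifier is then trained on those vectors. The sketch below is purely illustrative and is not the authors' code: it stands in for the network with synthetic 128-dimensional "embeddings" (a hypothetical dimensionality) and uses logistic regression for the second stage.

```python
# Illustrative sketch only, not the study's implementation.
# Stage 1 (the deep network) is simulated with random embeddings whose
# class means differ slightly; stage 2 is a logistic-regression classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n, dim = 2000, 128          # hypothetical sample size and embedding size
X = rng.normal(size=(n, dim))       # stand-in for per-photo feature vectors
y = rng.integers(0, 2, size=n)      # two classes
X[y == 1] += 0.15                   # small systematic shift for one class

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Train the classifier on the "embeddings" and measure held-out accuracy.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.2f}")
```

The point of the design is that the heavy lifting happens in the feature extractor; once faces are reduced to vectors, even a linear classifier can pick up systematic differences between groups.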
The research found that gay men and women tended to have "gender-atypical" features, expressions and "grooming styles", essentially meaning gay men appeared more feminine and vice versa. The data also identified certain trends, including that gay men had narrower jaws, longer noses and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads compared to straight women.
Human judges performed much worse than the algorithm, accurately identifying orientation only 61% of the time for men and 54% for women. When the software reviewed five images per person, it was even more successful – 91% of the time with men and 83% with women. Broadly, that means "faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain", the authors wrote.
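The jump in accuracy from one photo to five has a simple statistical explanation: averaging several noisy per-photo scores cancels out some of the noise. The toy simulation below (all numbers made up, not taken from the study) shows the effect.

```python
# Hypothetical illustration of why five photos beat one:
# averaging per-photo scores reduces noise before thresholding.
import numpy as np

rng = np.random.default_rng(1)

true_signal = 0.7    # underlying per-person score (made up)
noise = 0.5          # per-photo noise (made up)
n_people = 10000

# A person is "classified correctly" if their (averaged) score exceeds 0.5.
one_photo = true_signal + rng.normal(0, noise, size=n_people)
five_photos = true_signal + rng.normal(0, noise, size=(n_people, 5)).mean(axis=1)

acc_one = (one_photo > 0.5).mean()
acc_five = (five_photos > 0.5).mean()
print(f"one photo: {acc_one:.2f}, five photos: {acc_five:.2f}")
```

Averaging five independent scores shrinks the noise standard deviation by a factor of √5, so the averaged score crosses the threshold on the correct side more often.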
The paper suggested that the findings provide "strong support" for the theory that sexual orientation stems from exposure to certain hormones before birth, meaning people are born gay and being queer is not a choice. The machine's lower success rate for women also could support the notion that female sexual orientation is more fluid.
While the findings have clear limits when it comes to gender and sexuality – people of color were not included in the study, and there was no consideration of transgender or bisexual people – the implications for artificial intelligence (AI) are vast and alarming. With billions of facial images of people stored on social media sites and in government databases, the researchers suggested that public data could be used to detect people's sexual orientation without their consent.
It's easy to imagine spouses using the technology on partners they suspect are closeted, or teenagers using the algorithm on themselves or their peers. More frighteningly, governments that continue to prosecute LGBT people could hypothetically use the technology to out and target populations. That means building this kind of software and publicizing it is itself controversial given concerns that it could encourage harmful applications.
But the authors argued that the technology already exists, and its capabilities are important to expose so that governments and companies can proactively consider privacy risks and the need for safeguards and regulations.
"It's certainly unsettling. Like any new tool, if it gets into the wrong hands, it can be used for ill purposes," said Nick Rule, an associate professor of psychology at the University of Toronto, who has published research on the science of gaydar. "If you can start profiling people based on their appearance, then identifying them and doing horrible things to them, that's really bad."