Digital Discourse Database

Number of Posts: 4
Posts 1 - 4

Tech’s sexism doesn’t stay in Silicon Valley. It’s in the products you use.

Hyperlink

Newspaper | Washington Post
Date | 8.8.2017
Language | English
Country | U.S.
Topic Tags | (mental) health, artificial intelligence, diversity, gender, research/study
Summary | Silicon Valley has recently been entangled in scandals around sexism and racism. Many innovations incorporate artificial intelligence, which means the software learns from data that reflect our social reality but are biased. This leads to issues such as image recognition labeling black people as gorillas rather than as humans, because the data the program learned from included predominantly white people. A similar case is a health app that tracked various physical parameters but not the menstrual cycle, thereby disregarding a large proportion of the female population.
Image Description | N/A

Taking poetic license with AI personalities

Hyperlink

Newspaper | Washington Post
Date | 7.4.2016
Language | English
Country | U.S.
Topic Tags | artificial intelligence, emojis, gender, research/study
Summary | Artificial intelligence assistants are now being creatively enhanced by educated, professional writers and poets so as to make their conversation appear more human-like (e.g. by using emojis) and their personalities more authentic. Polls have shown that users prefer female voices for AI assistants, and most companies have acted accordingly. Microsoft, however, has pre-empted reinforcing stereotypes about female assistants by limiting the number of apologies and self-deprecating comments made by its AI assistant Cortana.
Image Description | Image of a meeting of professional writers working in AI at Microsoft.
Image Tags | computer/laptop, female(s), male(s)

Maschinen sind nicht die besseren Menschen

(Machines are not the better people)

Hyperlink

Newspaper | Sonntagszeitung
Date | 14.5.2017
Language | German
Country | Switzerland
Topic Tags | artificial intelligence, diversity, gender, translation
Summary | One might think that artificial intelligence robots are neither racist nor sexist, but because they learn from information circulating on the internet, they are subject to the same biases as most people. This is why a beauty contest judged by an AI robot favored white people as more beautiful. Online job listings can also be biased by gender, so that women do not see higher-paying job listings, or gender-inclusive language gets lost in translation.
Image Description | N/A

L'intelligence artificielle reproduit aussi le sexisme et le racisme des humains

(Artificial intelligence also reproduces human beings' sexism and racism)

Hyperlink

Newspaper | Le Monde
Date | 15.4.2017
Language | French
Country | France
Topic Tags | artificial intelligence, gender, research/study, threat
Summary | Gender stereotypes are reproduced in some artificial intelligence programs. Researchers at Stanford University show how machine learning can replicate people's biases. They based their research on a technology called GloVe, which is trained to look for common associations. The technology points to some problematic associations that illustrate sexism and racism. The fact that AI follows people's prejudices can have serious consequences, so people are trying to find solutions to AI's biases.
Image Description | N/A
