Number of Posts: 3
Tech’s sexism doesn’t stay in Silicon Valley. It’s in the products you use.
Newspaper | Washington Post
Date | 8.8.2017
Language | English
Country | U.S.
Topic Tags | (mental) health, artificial intelligence, diversity, gender, research/study
Summary | Silicon Valley has recently been entangled in scandals around sexism and racism. Many innovations incorporate artificial intelligence, which means the software learns from data that reflect our social reality but are also biased. This leads to failures such as image-recognition software labeling black people not as humans but as gorillas, because the data the program learned from included predominantly white people. A similar case is a health app that tracked various physical parameters but not the menstrual cycle, thereby disregarding a large proportion of the female population.
Image Description | N/A
Taking poetic license with AI personalities
Newspaper | Washington Post
Date | 7.4.2016
Language | English
Country | U.S.
Topic Tags | artificial intelligence, emojis, gender, research/study
Summary | Artificial intelligence assistants are now being creatively enhanced by educated, professional writers and poets to make their conversation appear more human-like (for instance, through the use of emojis) and their personalities more authentic. Polls have shown that users prefer female voices for AI assistants, and most companies have acted accordingly. Microsoft, however, has tried to pre-empt the reinforcement of stereotypes about female assistants by limiting the number of apologies and self-deprecating comments made by its AI assistant Cortana.
Image Description | Image of a meeting of professional writers working in AI at Microsoft.
Image Tags | computer/laptop, female(s), male(s)
L'intelligence artificielle reproduit aussi le sexisme et le racisme des humains
(Artificial intelligence also reproduces human beings' sexism and racism)
Newspaper | Le Monde
Date | 15.4.2017
Language | French
Country | France
Topic Tags | artificial intelligence, gender, research/study, threat
Summary | Gender stereotypes are reproduced in some artificial intelligence programs. Researchers at Stanford University showed how machine learning can replicate people's biases. They based their research on a technology called GloVe, which learns common word associations from large text corpora. The technology surfaces problematic associations that illustrate sexism and racism (a brief illustrative sketch follows this entry). Because AI can inherit people's prejudices, with potentially serious consequences, researchers are looking for ways to counter these biases.
Image Description | N/A
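
The kind of bias probing described in the Le Monde summary can be sketched in a few lines. This is a minimal illustration only, not the researchers' actual code: real studies measure associations in pretrained GloVe embeddings, whereas the toy vectors and word choices below are invented for demonstration.

```python
import numpy as np

# Toy word vectors standing in for pretrained GloVe embeddings.
# These values are invented for illustration; real studies load
# vectors trained on large text corpora.
vectors = {
    "woman":    np.array([0.9, 0.1, 0.3]),
    "man":      np.array([0.1, 0.9, 0.3]),
    "nurse":    np.array([0.8, 0.2, 0.4]),
    "engineer": np.array([0.2, 0.8, 0.4]),
}

def cosine(a, b):
    """Cosine similarity: how strongly two word vectors are associated."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Compare how each profession word associates with gendered words.
# A large gap signals a stereotyped association in the embeddings.
for job in ("nurse", "engineer"):
    gap = cosine(vectors[job], vectors["woman"]) - cosine(vectors[job], vectors["man"])
    print(f"{job}: association gap (woman - man) = {gap:+.3f}")
```

With the toy vectors above, "nurse" leans toward "woman" and "engineer" toward "man", mirroring the stereotyped associations the study reports finding in embeddings learned from human-written text.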