Number of Posts: 3
Posts 1 - 3
Die Maschine erziehen und trainieren
(Raising and training the machine)
Newspaper | Sonntagszeitung
Date | 20.11.2016
Language | German
Country | Switzerland
Topic Tags | artificial intelligence, computer programming, research/study, threat
Summary | Some researchers say that artificial intelligence may eliminate the need for human programmers. Modern programs are becoming more similar to human brains: it is no longer the programmer alone who specifies every step of the program; the program itself is capable of learning from experience (technically: from exposure to large amounts of data). Some find the idea that computers could become the intellectual equals of humans frightening.
Image Description | N/A
Diese Technik soll uns den Alltag erleichtern
(This technology is meant to make our everyday lives easier)
Newspaper | Tages-Anzeiger
Date | 5.4.2017
Language | German
Country | Switzerland
Topic Tags | artificial intelligence, privacy, research/study, smartphone, translation
Summary | Computers are becoming more and more intertwined with our daily lives. Thanks to advances in voice recognition, some smartphones can already translate conversations in real time while imitating the speaker's voice. Image recognition has also advanced substantially, to the point of being able to "read" the mood, age, and attractiveness of photographed individuals. Research is under way on smartphones and other devices that monitor body odor, sweat (to alert to dehydration), or tear fluid (for diabetics).
Image Description | Various simple visualizations of smartphones/devices interacting with people (depicted by emojis), body parts, et cetera.
Image Tags | chart, emojis, female(s), smartphone
L’intelligence artificielle, aussi raciste et sexiste que nous
(Artificial intelligence, as racist and sexist as us)
Newspaper | Le Temps
Date | 4.5.2017
Language | French
Country | Switzerland
Topic Tags | artificial intelligence, research/study, threat
Summary | A new study shows that artificial intelligence can also have biases and prejudices. The results are not really surprising, but such biases could be dangerous if AI is used, for instance, to screen job applicants. The study shows that some AI programs actually reproduce the racist and sexist stereotypes that exist in language. Researchers devised an "association test" and applied it to the GloVe word embeddings, demonstrating, for example, that the machine associated names of flowers with positive connotations and names of insects with negative ones, just as human beings do. The results are not surprising because learning machines are in fact a mirror of human behavior.
Image Description | N/A
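The association test summarized above can be illustrated with a small sketch: in word-embedding models such as GloVe, each word is a vector, and "association" is measured by comparing cosine similarities to attribute words. The toy vectors and word list below are hypothetical values chosen for illustration, not real GloVe embeddings.

```python
import math

# Hypothetical toy "embeddings" (NOT real GloVe vectors; values chosen for illustration)
vectors = {
    "flower":     [0.9, 0.1, 0.2],
    "insect":     [0.1, 0.9, 0.3],
    "pleasant":   [0.8, 0.2, 0.1],
    "unpleasant": [0.2, 0.8, 0.4],
}

def cosine(u, v):
    # Standard cosine similarity between two vectors
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def association(word):
    # Differential association: similarity to "pleasant" minus
    # similarity to "unpleasant"; positive means the word leans pleasant
    return (cosine(vectors[word], vectors["pleasant"])
            - cosine(vectors[word], vectors["unpleasant"]))

print(association("flower"))  # positive: "flower" leans toward pleasant
print(association("insect"))  # negative: "insect" leans toward unpleasant
```

With real embeddings trained on large text corpora, the same measurement surfaces the human-like biases the article describes, because the vectors inherit the statistical regularities of the language they were trained on.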