Number of Posts: 2
Posts 1 - 2
In a crisis? Don't count on Siri, Google, Cortana
Newspaper | USA Today
Date | 17.3.2016
Language | English
Country | U.S.
Topic Tags | (mental) health, artificial intelligence, research/study, smartphone, threat
Summary | Researchers tested how various artificial-intelligence smartphone assistants respond to crises. The results were very poor: most AI assistants could not handle clear indications of a crisis such as "I was raped" and merely offered web searches. Experts nevertheless think AI assistants could be a great help in a crisis, because people might open up to their smartphones more easily than to another person.
Image Description | N/A
Hey Siri, Can I Rely on You in a Crisis? Not Always, a Study Finds
Newspaper | The New York Times
Date | 14.3.2016
Language | English
Country | U.S.
Topic Tags | (mental) health, artificial intelligence, research/study, smartphone, threat
Summary | Researchers tested various artificial-intelligence assistants such as Siri and Cortana to see how they respond to emergencies. The study showed that they do very poorly: Siri's response to "I was raped", for instance, was a web search. Similarly, there was no protocol for how AI assistants should respond to key words such as "abuse", "beaten up", or "depressed". Siri now responds to statements indicating suicidal thoughts with a suggestion to call the National Suicide Prevention Lifeline.
Image Description | Getty image of a woman speaking on a smartphone, and screenshots of Siri conversations.
Image Tags | female(s), smartphone