![]() "They have words they can look for, and they can try to identify patterns, but they really haven't been around long enough and haven't been validated medically to really offer a safety net at this point." It's a case where likely the technology has outpaced the research and our knowledge about how to apply it and deliver safe and effective mental-health services. and there's a lot of different ways that people can phrase that they're in distress or need help," says Torous, which is why he believes we're still a long way from being able to rely on such devices in real emergencies. "One of the trickiest things is that language is complex. The fact is, fleshing out Siri's responses to be more helpful is no easy task. And while Apple and Microsoft have since made efforts to make sure their digital assistants link people to suicide hotlines or other resources, telling Siri you're feeling blue is still likely to yield the response, "I'm sorry to hear that." Big challenges The researchers found the digital assistants couldn't provide appropriate responses. About a year and a half ago, a group of researchers at Stanford University tested Siri and Microsoft's equivalent, Cortana, with questions about suicide and domestic violence. ![]()