Human-Technology Interactions in Health

Coincidentally, the topic of human-technology social interaction is in the news quite a bit today. I’m pleased that the human factors implications of our social interactions with technology are getting more attention.

First, Dr. Wendy Rogers of Georgia Tech gets interviewed in the New York Times about her work on older adults and in-home helper robots:

Dr. Rogers has been experimenting with a large robot called the PR2, made by Willow Garage, a robotics company in Palo Alto, Calif., which can fetch and administer medicine, a seemingly simple act that demands a great deal of trust between man and machine.

“We are social beings, and we do develop social types of relationships with lots of things,” she said. “Think about the GPS in your car, you talk to it and it talks to you.” Dr. Rogers noted that people developed connections with their Roomba, the vacuum robot, by giving the machines names and buying costumes for them. “This isn’t a bad thing, it’s just what we do,” she said.

In a more ambitious use of technology, NPR reports that researchers are using computer-generated avatars to interview soldiers and identify those at risk of suicide. During the interview, the interviewee’s facial movement patterns are recorded:

“For each indicator,” Morency explains, “we will display three things.” First, the report will show the physical behavior of the person Ellie just interviewed, tallying how many times he or she smiled, for instance, and for how long. Then the report will show how much depressed people typically smile, and finally how much healthy people typically smile. Essentially it’s a visualization of the person’s behavior compared to a population of depressed and non-depressed people.
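The comparison Morency describes could be sketched roughly as follows. This is a hypothetical illustration, not the researchers’ actual system: the indicator names, baseline numbers, and function are all invented for the example.

```python
# Hypothetical sketch of the report described above: for each behavioral
# indicator, show the interviewee's tally alongside typical values for
# depressed and non-depressed populations. All numbers are invented.

# Invented population means per interview, for illustration only
BASELINES = {
    "smile_count": {"depressed": 4.0, "healthy": 11.0},
    "smile_duration_s": {"depressed": 6.5, "healthy": 19.0},
}

def compare_to_population(observed):
    """For each indicator, report the observed value, both population
    means, and which mean the observed value lies closer to."""
    report = {}
    for indicator, value in observed.items():
        means = BASELINES[indicator]
        closer = min(means, key=lambda group: abs(value - means[group]))
        report[indicator] = {"observed": value, "closer_to": closer, **means}
    return report

# An interviewee who smiled 5 times for a total of 8 seconds
report = compare_to_population({"smile_count": 5, "smile_duration_s": 8.0})
print(report["smile_count"]["closer_to"])  # -> depressed
```

Note that this kind of nearest-mean comparison only visualizes where a person falls relative to two group averages; it says nothing about how much the groups overlap, which is exactly the weakness the critic below points to.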

While this sounds like an interesting application, I have to agree with one of its critics that:

“It strikes me as unlikely that face or voice will provide that information with such certainty,” he says.

At worst, it will flood the real therapist with a “big data”-type situation: there may be “signal,” but far too much noise (see this article).

About Richard Pak

Associate Professor at Clemson University/Department of Psychology