Robotic companions are being promoted as an antidote to the burden of longer, lonelier human lives. At stake is the future of what it means to be human.
I was briefly quoted about the ethical dilemma:
Many in the field see the tensions and dilemmas in robot care, yet believe the benefits can outweigh the risks. The technology is “intended to help older adults carry out their daily lives,” says Richard Pak, a Clemson University scientist who studies the intersection of human psychology and technology design, including robots. “If the cost is sort of tricking people in a sense, I think, without knowing what the future holds, that might be a worthy trade-off.” Still, he wonders, “Is this the right thing to do?”
The third edition of the definitive source for information for designing for older adults has been published:
This new edition provides easily accessible and usable guidelines for practitioners who design for older adults. It includes an updated overview of the demographic characteristics of older adult populations and the scientific knowledge base of the aging process relevant to design. New chapters include Existing and Emerging Technologies, Work and Volunteering, Social Engagement, and Leisure Activities. Also included is basic information on user-centered design and specific recommendations for conducting research with older adults.
A 20% discount is available by using code ‘A004’ at checkout from CRC Press.
The focus of this workshop is to bring together representatives from companies, organizations, and universities, large and small, who are involved in industry, product development, or research and have an interest in meeting the needs of older adults. Additionally, members of the CREATE team will present guidelines and best practices for designing for older adults. Topics include: Existing & Emerging Technologies, Usability Protocols, Interface & Instructional Design, Technology in Social Engagement, Living Environments, Healthcare, Transportation, Leisure, and Work. Each participant will receive a complimentary copy of our book Designing for Older Adults.
If you would like a registration form or any further information on the conference accommodations, please contact Adrienne Jaret at: email@example.com or by phone at (646) 962-7153.
The social science research that we cover in this blog is carried out by a multitude of talented scientists across the world, each studying a different facet of the problem. In our second post in a new series, we interview one of the leaders in the study of the human factors of autonomy, Dr. Mica Endsley.
The world’s population of 7.3 billion is predicted to grow to 9.7 billion by 2050, according to the Global Harvest Initiative. To feed all those people, global agricultural productivity must increase by 1.75 percent annually.
One person working to drive this increase is Margaux Ascherl, PhD, user experience leader at John Deere Intelligent Solutions Group in Urbandale, Iowa. John Deere recruited Ascherl in late 2012 while she was finishing her PhD in human factors psychology at Clemson University. Five years later, she now leads a team responsible for the design and testing of precision agriculture technology used in John Deere equipment.
Ascherl spoke to the Monitor about what it’s like to apply psychology in an agricultural context and how her team is helping farmers embrace new technology to feed the world.
As the first post in a series, we interview one of the pioneers in the study of human-AI relationships, Dr. Julie Carpenter. She has over 15 years of experience in human-centered design and human-AI interaction research, teaching, and writing. Her principal research is about how culture influences human perception of AI and robotic systems and the associated human factors, such as user trust and decision-making, in human-robot cooperative interactions in natural use-case environments.
This is the second post in our “throwback” series. In it, I will take you through an article written by some of the best in the human factors and ergonomics field: the late Raja Parasuraman, Tom Sheridan, and Chris Wickens. Though several authors have introduced the concept of automation being implemented at various levels, for me this article nailed it.
My third job (in addition to being a professor, and curating this blog) is working on another blog with Arathi Sethumadhavan focused on the social science of autonomy and automation. You can find us over here.
Occasionally, I will cross-post items that might be of interest to both readerships. Over there, we’re starting a new series of posts called Throwback Thursdays where we go back in time to review some seminal papers in the history of human-automation interaction (HAI), but for a lay audience.
The first post discusses Bainbridge’s 1983 paper, “Ironies of Automation”:
Don’t worry, our Throwback Thursday doesn’t involve embarrassing pictures of me or Arathi from 5 years ago. Instead, it is more cerebral. The social science behind automation and autonomy is long and rich, and despite being one of the earliest topics of study in engineering psychology, it has even more relevance today.
The American Psychological Association’s member magazine, the Monitor, recently highlighted 10 trends in 2018. One of those trends is that Applied Psychology is hot!
In this special APA Monitor report, “10 Trends to Watch in Psychology,” we explore how several far-reaching developments in psychology are transforming the field and society at large.
Our own Anne McLaughlin was quoted in the article, along with other prominent academic and industry applied psychologists:
As technology changes the way we work, play, travel and think, applied psychologists who understand technology are more sought after than ever, says Anne McLaughlin, PhD, a professor of human factors and applied cognition in the department of psychology at North Carolina State University and past president of APA’s Div. 21 (Applied Experimental and Engineering Psychology).
Also quoted was Arathi Sethumadhavan:
Human factors psychologist Arathi Sethumadhavan, PhD, has found almost limitless opportunities in the health-care field since finishing her graduate degree in 2009. Though her background was in aviation, she found her human factors skills transferred easily to the medical sector—and those skills have been in demand.
One more thing…
Arathi and I have recently started a new blog, Human-Autonomy Sciences, devoted to the psychology of human-autonomy interaction. We hope you visit it and contribute to the discussion!
The original Blade Runner is my favorite movie and can be credited with sparking my interest in human-technology/human-autonomy interactions. The sequel is fantastic if you have not seen it (I’ve seen it twice already and will soon see it a third time).
If you’ve seen the original or the sequel, the representations of incidental technologies may have seemed unusual. For example, the technologies feel like a strange hybrid of digital/analog systems, they are mostly voice controlled, and the hardware and software have a well-worn look. Machines also make satisfying noises as they work (also present in the sequel). This is a refreshing contrast to the super-clean, touch-based, transparent augmented reality displays shown in other movies.
The article suggests that the team thought deeply about how to portray technology and UI by starting from the fundamentals (I would love to have this job):
Blade Runner 2049 was challenging because it required Territory to think about complete systems. They were envisioning not only screens, but the machines and parts that would make them work.
With this in mind, the team considered a range of alternate display technologies. They included e-ink screens, which use tiny microcapsules filled with positively and negatively charged particles, and microfiche sheets, an old analog format used by libraries and other archival institutions to preserve old paper documents.
Anne’s research on attention and rock climbing was recently featured in an article in Outside Magazine:
To trad climb is to be faced with hundreds of such split-second micro decisions, the consequences of which can be fatal. That emphasis on human judgment and its fallibility intrigued Anne McLaughlin, a psychology professor at North Carolina State University. An attention and behavior researcher, she set out to model how and why rock climbers make decisions, and she’d recruited Weil and 31 other trad climbers to contribute data to the project.
The idea for the study first came about at the crag. In 2011, McLaughlin, Chris Wickens, a psychology professor at Colorado State University, and John Keller, an engineer at Alion Science and Technology, converged in Las Vegas for the Human Factors and Ergonomics Society conference, an annual event that brings together various professionals practicing user-focused product design. With Red Rocks just a few minutes away, the three avid climbers were eager to get some time on the rock before the day’s sessions, says Keller, even if it meant starting at 3 a.m.