Bauer Researchers Study Long-term Consequences of Living with Digital Creatures
Digital assistants and other computerized communicators are increasingly designed to appeal to us on a personal, emotional level.
“I am sorry that your credit card cannot be authorized.”
“I hope you enjoy your new purchase.”
“I regret that the item requested is not in stock.”
For years, C. T. Bauer College of Business Associate Professor Jaana Porra has been questioning the long-term implications: “What does it mean when we as human beings bond with machines over long time periods, instead of bonding with human beings?” she asks. “When we start introducing machines that have been designed for other purposes, what might be the consequences in terms of human evolution?”
Porra’s research paper, “‘Can Computer Based Human-Likeness Endanger Humanness?’ – A Philosophical and Ethical Perspective on Digital Assistants Expressing Feelings They Can’t Have,” was recently published in a special issue of Information Systems Frontiers, “On Being (More) Human in a Digitized World.”
Co-authors include Associate Professor Michael Parks, who, like Porra, is a faculty member of Bauer’s Department of Decision & Information Sciences; and former Bauer College doctoral student Mary Lacity, now a professor at the University of Arkansas.
Porra explains the premise for her paper: “In all this research on what we can do with artificial intelligence for humanity, we have really forgotten the human being and humanness. We’re studying human aspects: What entices someone to buy something? What entices someone to associate with computer-based information systems with digital human likeness? But we have seen little research on how healthy this is.”
There is reason for concern, Porra says. Digital assistants that make factual, informational statements, such as, “Your application has been approved,” likely pose little threat to how humans evolve. On the other hand, the authors write, “The more time we spend with human-like machines the more we will resemble them. It is conceivable that we will also incorporate into ourselves their lack of feelings, endangering the future of our genuine humanness.”
Porra has no illusions about turning back the clock on technological advances, especially as others are passionate about pushing boundaries to achieve ever more sophisticated digital assistants in pursuit of short-term goals. Digital assistants are expected to outnumber humans by 2021, she said.
“We cannot really stop this development,” she says. “What we’re saying is that we should start researching it with a multidisciplinary approach, and we should also give people options.”
Among the study’s recommendations: Software engineers could add features that make consumers more aware that they are speaking with a machine. A standard feature could allow consumers to eliminate feeling statements from their digital assistant dialogue, and more attention could be paid to stressing that the user is in charge. These options may be especially important for young children, she said.
“This study is just scratching the surface,” Porra said. “But I see glimpses of growing interest and have gotten some very positive responses to the idea that this is something we need to talk about in a multi-disciplinary way. I think it is only a matter of time before people mature to the fact that we really need to talk about the long-term consequences of living with digital creatures that express human-like feelings. We argue that feelings are the very substance of our humanness and therefore are best reserved for human interaction.”
By Julie Bonnin