In This Issue
Spring Bridge on Technologies for Aging
March 15, 2019 Volume 49 Issue 1
With the emergence of technologies that can facilitate both independence and quality of life, the subject of this issue is compelling and extremely relevant. The articles explore a variety of aspects of the topic: applications of the Internet of Things, evolving transportation needs, the benefits of social robots, the use of “small data” to enhance understanding and treatment of age-related conditions, a systems approach to assistive technologies, and a framework to help plan for the eventualities of aging.

EES Perspective: Aging, Technology, and Ethics

Monday, March 11, 2019

Authors: William M. Bulleit and Rosalyn W. Berne

In an ethical utopia, all people would have access to the diagnoses, processes, tools, and treatments that would ensure health and well-being throughout life, regardless of economic means, intellectual or physical abilities, religious beliefs, political stance, sexual orientation, gender identity, or age. But such a reality does not (yet) exist.

Engineers, however, can do their part by being attentive to ethics in all phases of their work, from problem scoping and initial design efforts to prototyping, manufacture, and downstream monitoring of socio-technological impacts and unintended consequences. That would make a significant contribution to such an ambition.

Technology and Aging Users

Engineering design is always affected by constraints, and technology development for the aging poses special challenges, in both practical and ethical dimensions. With age come changes in an individual’s physical and, sometimes, cognitive abilities. And there is great variety in the ways in which older people experience age-related changes in emotional, mental, and physical health, challenging the engineer to develop adaptable technologies that may be subject to potentially unpredictable uses. In the case of engineering for aging persons, constraints will include the social and psychological features of these users as well as various physical limitations, contingencies that point to pragmatism as a useful approach to ethics in design.

Today’s aging adults have been dubbed “technogenarians” who “[blend] machine and biology in both their personal identities and their relations to the external world” and creatively use technological devices “to make them more suitable for their needs even in the face of technological design and availability constraints,” suggesting that science and technology are central to the experiences and normative definitions of health and wellness for aging people and the way they “pursue, maintain, and negotiate life” (Joyce and Loe 2010, pp. 171–172).

Dimensions to Consider

One of the canons of the National Society of Professional Engineers’ code of ethics[1] is that, in the fulfillment of their professional duties, engineers shall “hold paramount the safety, health, and welfare of the public.” That seems reasonable when designing artifacts to be used by aging people, except that “welfare” denotes health, happiness, and well-being, which are not standard or objective criteria for engineering design and production. This complicates the choice of what to prioritize in design for the elderly when aiming to minimize harm and maximize good outcomes for the greatest number (a utilitarian approach to ethics). Furthermore, it requires a capacity for caring, for willingly and compassionately perceiving the specific needs of the aging individual. And as discussed below, it may also mean pacing the increased use of a technology to allow time to determine whether there could be important unintended consequences.

The effectiveness of a technology depends on its capacity to motivate users to continue using it: “Only then will it have a long-term impact on health. Thus, personalizing technology will be critical to long-term use” (Li 2013, p. 54). In this issue, for example, Mr. Dubois, the fictional character in the paper “Aging with the Internet of Things,” uses IoT devices, although the authors point out some concerns (e.g., personal frustration) about moving too fast (Consel and Kaye 2019). There is also a risk that “by getting a telehealthcare system or a social assistive robot, the older person feels like she is signaling to others that she is lonely, sick, and fragile,” supporting the hypothesis that “the perception of self is as important as you grow older as at any other age” (Frennert 2014, p. 66).

The challenge, ethically speaking, is to determine what technologies will work well for aged users beyond the mechanics of an apparatus or the ease of use of artifacts or software systems (Peine and Moors 2015, p. 2):

As technology users, current generations of older persons are characterized by a simultaneous need to create new patterns of meaning and sense of self for retirement and later life on the one hand, and to cope with emerging illness and frailty on the other. Failing to address this simultaneous identity as [both] agents and recipients of scientific and technological change constitutes a risk to produce a triple loss—older persons do not get the technologies they need, companies fail to tap into the opportunities of the emerging silver market, and the government subsidies for gerontechnological innovations result in prototypes and experiments that do not spread or scale.

Seeking to Anticipate Unintended Consequences

To reduce the risk that engineering innovation will produce unintended consequences, nontraditional guidance may be helpful and even necessary. Should social factors such as sense of self be incorporated in the design process of technologies that serve aging (and other) people? When a social problem is addressed through a technological solution, the risk is that the solution will create another, possibly more difficult problem. Social problems may need social solutions rather than technological fixes. But ethically designed technology can support solutions to social problems.

Aging people who are isolated and lonely need interaction with other living beings (such as pets or other humans) that have a reasonable capacity for empathy and care. Technological devices do not yet have those distinctively human/animal qualities, although they have the capacity to mimic them. It has been established that companion robots can provide enormous benefits to older adults in assisting them with tasks and monitoring their health and behavior (e.g., Breazeal et al. 2019 in this issue). And the potential for benefit increases when the design phase includes input from the prospective users of the technologies.

But the human tendency to anthropomorphize can pose serious risks if an individual gives a robot a higher degree of trust than is warranted (Lewis et al. 2018). As multiple studies have suggested (e.g., Frennert et al. 2017, p. 410), older people “attribute human traits to robots and expect them to behave intelligently and as humans, even though they know that robots are machines.” What happens when those expectations lead to increased emotional dependence, and an aged person turns to a robot companion for urgent help in a critical situation to which the robot is not equipped to respond? A measured and intentional introduction to the use of such devices would make sense from an ethical standpoint.

Engineering Ethically for an Aging Population

Issues related to aging users and technology arise out of a complex adaptive system that comprises basic social structures, a myriad of technological organizations, government regulations and agencies, and each individual aging person. Systems such as this are best altered by considering individuals and small groups as much as possible.

One way to think about this is to remember the environmental motto: Think globally, act locally. In this case, examine the behavior of the entire system and, when making engineering alterations, begin with those at or close to the level of the individual user.

In the context of technological systems for the aged, this means thinking about how technologies will affect individuals or small groups of people. For example, there have been, and likely will be more, unintended consequences from the IoT and other emerging technologies. Engineers should consider the consequences of the IoT at the user’s level and, to the extent possible, examine impacts on individuals and small groups at each step of increased technological capability.

Some people seem to believe that changes need to be made at the top (e.g., the federal government), so that all agents at the local level get the same benefits at the same time. But planned benefits do not necessarily all occur as expected, and unintended consequences and failures can be systemwide and even catastrophic. The only way to avoid or reduce this possibility is to make alterations to the system in such a way that failures are manageable and allow learning, variation, and selection so that future changes can be based on those lessons.

The alteration of complex adaptive systems can be viewed as a form of adaptation based on three principles (Harford 2011, p. 243): First, “try new things, expecting that some will fail”; second, “make failures survivable: create safe spaces for failure or move forward in small steps”; and third, “make sure that you know when you’ve failed, or you will never learn.” These principles are not only consistent with the general process of engineering design but also mirror ideas from the philosophies of Pragmatism, begun in the early 20th century, and Care, a more modern approach arising out of medical ethics (Nair and Bulleit 2019).

The world is highly uncertain, and the future is a function of both changes that are out of an individual’s control and changes made by individuals and society: the future is contingent on what happens today. A pragmatist recognizes the significant uncertainty of the future introduced by this contingency. Thus, a pragmatist is a fallibilist who understands that any intentional changes to any system may not work as planned and may well have unintended negative consequences.

Articles in this issue (Consel and Kaye 2019, Dodge and Estrin 2019) acknowledge that the ecosystem of technology development is complex and potential problems exist at all scales. Pragmatism looks for what works, i.e., what solutions are selected. The only people who can effectively determine that are the users, in this case the elderly. Options such as those described in the papers in this issue must be tried in order to determine what will happen when they are used, and then the ones that work the best can be selected.

Like evolution, the design and development effort is based on a form of variation and selection. This approach is somewhat different from that of traditional engineering, which looks for the most cost- and time-efficient solution. Introducing technological solutions for the elderly will likely require a longer-than-usual time horizon to reach cost-efficient, ethical solutions.

Technological solutions to the problems of aging also should be approached with the principles of Care in mind, recognizing that failures will occur and adjustments will be necessary. The individual or other entity that generates a possible solution must take care—act responsibly—to minimize unintended consequences of their efforts, and care about—be attentive to—the system behaviors that may indicate that the design solution could lead to possible failures or negative impacts. They need to make safe spaces for failures or take small steps, acknowledge when they have failed, and be responsive to indications of failure.

Concluding Thoughts

Successful development of technology to support the aged could prolong their effective engagement in and management of their lives, and end or at least significantly mitigate their pain and discomfort, impaired functioning, and other challenges.

In the search for technological solutions to the challenges of aging, decisions are necessarily made under the limits of both resources and knowledge. In addition, those decisions often involve judgments about the worth and value of individual lives and of specific collectives of people. The elderly are a particularly vulnerable group with respect to such judgments. As technologies are developed with the intention of helping aging people live well, it is important that care be taken to avoid design decisions that may reduce the fullness of their personhood.

Furthermore, disability, fragility, forgetfulness, physical weakness, and the like are not necessarily or only characteristics of aging. They affect many people regardless of age, so technological advances that are introduced in an evolutionary way to assist the elderly will enhance life beyond that group. It is therefore important, ethically speaking, to design technologies for the whole person rather than for a condition such as a faulty memory, painful hip, shaking hands, depression, or loss of sense of purpose. Otherwise, the use of these technologies may be more likely to produce the unintended consequence of ignoring the personhood of those who need the help.

As the aging population grows rapidly in this and many other countries, technological innovation and development can play a crucial role in supporting these users. Engineers of all ages will do well to think carefully about the ramifications of their efforts to support this user group, and to actively engage them in the design and development process.

References

Breazeal CL, Ostrowski A, Singh N, Park HW. 2019. Designing social robots for older adults. The Bridge 49(1):22–31.

Consel C, Kaye JA. 2019. Aging with the Internet of Things. The Bridge 49(1):6–12.

Dodge HH, Estrin D. 2019. Making sense of aging with data big and small. The Bridge 49(1):39–46.

Frennert S. 2014. Older people and the adoption of innovations: A study of the expectations on the use of social assistive robots and telehealthcare systems. Doctoral thesis, Lund University.

Frennert S, Eftring H, Östlund B. 2017. Case report: Implications of doing research on socially assistive robots in real homes. International Journal of Social Robotics 9:401–415.

Harford T. 2011. Adapt: Why Success Always Starts with Failure. New York: Picador.

IOM and NRC [Institute of Medicine and National Research Council]. 2013. Fostering Independence, Participation, and Healthy Aging Through Technology: Workshop Summary. Washington: National Academies Press.

Joyce K, Loe M. 2010. A sociological approach to aging, technology and health. Sociology of Health and Illness 32:171–180.

Lewis M, Sycara K, Walker P. 2018. The role of trust in human-robot interaction. In: Foundations of Trusted Autonomy: Studies in Systems, Decision and Control, vol 117, eds Abbass H, Scholz J, Reid D. Cham, Switzerland: Springer.

Li GP. 2013. Getting technology into the hands of consumers. In: Fostering Independence, Participation, and Healthy Aging Through Technology: Workshop Summary. Washington: National Academies Press.

Nair I, Bulleit WM. 2019. Pragmatism and care in engineering ethics. Science and Engineering Ethics, January.

Peine A, Moors EHM. 2015. Valuing health technology: Habilitating and prosthetic strategies in personal health systems. Technological Forecasting and Social Change 93:68–81.

United Nations. 2017. World Population Prospects, 2017 Revision: Key Findings and Advance Tables. Working Paper No. ESA/P/WP/248. Department of Economic and Social Affairs, Population Division. New York.

US Census Bureau. 2018. Older people projected to outnumber children for first time in US history (rev Sept 6). Release number CB18-41. Washington: US Department of Commerce.

 

This column is produced in collaboration with the NAE’s Center for Engineering Ethics and Society to bring attention to and prompt thinking about ethical and social dimensions of engineering practice.

 

[1]  https://www.nspe.org/resources/ethics/code-ethics

About the Authors: William Bulleit is a professor of civil and environmental engineering at Michigan Technological University. Rosalyn Berne is director of the NAE’s Center for Engineering Ethics and Society.