Risk perception and communication are important factors in decisions about managing risk events and their impacts.
One of the most perplexing problems in risk analysis is why some relatively minor risks or risk events (as assessed by technical experts) elicit strong public concerns and result in substantial impacts on society and the economy. Such concerns and impacts are typically the result of “social amplification”—changes in risk perception and response based on psychological, social, institutional, and cultural processes. Social amplification is most likely to flourish when the risks are serious and the situation is fraught with uncertainties.
In this article I describe a tool, the social amplification of risk framework, for understanding and accounting for public attitudes toward risk. The framework links the technical assessment of risk with psychological, sociological, and cultural perspectives of risk and risk-related behavior (Kasperson et al., 1988; Pidgeon et al., 2003). The main thesis of the framework is that hazards interact with these perspectives in ways that may amplify or attenuate public responses. In this article I focus on amplification.
Risk amplification typically occurs at two stages in a risk scenario: in the transfer of information about risk and in social response mechanisms. Signals about risk are both transmitted and processed by individuals and social entities, called “amplification stations.” The individual might be a scientist, for example, who communicates the risk assessment; a social entity might be the news media, a cultural group, or an interpersonal network. The perceived amplified risk leads to behavioral responses that result in secondary impacts or “ripples.”
Social amplification may qualitatively and quantitatively increase not only the perception of risk but also the risk itself and its consequences. For this reason, social amplification of risk must be included in analyses of public and regulatory reactions to risk events.
The key amplification stages are listed below:
- filtering signals (only a fraction of all incoming information is actually processed)
- decoding and reframing signals
- processing risk information (e.g., drawing inferences)
- attaching social values to information as a basis for drawing implications for management and policy
- interacting with cultural and peer groups to interpret and assess the validity of signals
- formulating behavioral intentions about whether to tolerate a risk or take action against the risk or risk manager1
- engaging in group or individual actions to accept, ignore, tolerate, or change the risk
The Framework: Terms and Definitions
The starting point in the social amplification framework is a risk event, which might be an actual or hypothesized incident (or even a news report about a known risk) and which may be minimal, largely irrelevant, or localized in its impact, unless it is observed and communicated to others and thus amplified. The characteristics of the risk are then portrayed through communication signals that interact with psychological, social, institutional, and/or cultural processes in ways that intensify perceptions of the risk and its manageability.
The experience of risk thus involves not only concern about potential physical harm but also interpretation of risk by groups and individuals. The social amplification of risk framework enables effective assessment of this multidimensional risk experience, secondary and tertiary consequences, and the actions of risk managers, stakeholders, and the public.
The term “amplification” (which comes from classical communication theory) refers to the process of various social agents generating, receiving, interpreting, and passing along risk signals, which are always changed in the process.2 In fact, risk signals are subject to predictable “transformations” as they filter through social and individual amplification stations. The social amplification stations generate and transmit information via communication channels such as mass media, social media, letters, telephones, and face-to-face conversations; Figure 1 illustrates the many sources, channels, and filters of information that combine to transform risk perception. The transformations may increase or decrease the volume of information about an event, heighten the salience of certain aspects of a message, or reinterpret and elaborate available symbols and images, leading to particular interpretations and responses by those who next receive the information.
Individual amplification stations are affected by “risk heuristics” (i.e., qualitative aspects of the risk and context such as attitudes, blame, or trust). Individuals are also members of cultural groups and other social units (social stations of amplification) that codetermine their risk perception (Dietz and Stern, 2008). Individuals in groups and institutions do not simply pursue their personal values and social interpretations; they perceive risks, those who manage them, and the risk problem through the lens of values of the organization or group and, perhaps, its cultural biases (Dietz and Stern, 1996).
Social amplification also accounts for the secondary and tertiary consequences, or ripples (illustrated on the right side of Figure 1), of some events. Like ripples in a pond, they may spread far beyond the initial point of impact and may even affect previously unrelated groups or institutions.
Imagining ripples in a pond is a good way to think about how impacts associated with the social amplification of risk spread outward (Figure 1), from those directly affected (or first notified) to the next, institutional level (a company or an agency), and, in some cases, to other parts of an industry. For example, in the wake of the Deepwater Horizon explosion in April 2010, the effects spread from the drilling rig to the rest of the Gulf of Mexico, including the wetlands and beaches in all of the adjacent states, and then to politicians and petroleum industry representatives who were compelled to reconsider their plans to expand offshore drilling.
The concept of rippling impacts suggests processes that can extend (in risk amplification) or constrain (in risk attenuation) temporal, sectoral, and geographical impacts. It also illustrates that each order of impact, or ripple, may not only have social and political effects but also trigger (in risk amplification) or hinder (in risk attenuation) managerial interventions to reduce risk.
Secondary effects include market impacts (e.g., consumer avoidance of a product or related products), demands for regulatory constraints, litigation, community opposition, loss of credibility and trust, stigmatization of a facility or community, and investor flight. They may also include some or all of the following effects:
- enduring changes in perceptions, images, and attitudes (e.g., antitechnology attitudes, alienation from the physical environment, social apathy, stigmatization of an environment or risk manager)
- losses in local business sales, lower residential property values, and lower levels of economic activity
- political and social pressure (e.g., political demands, changes in the political climate and culture)
- changes in the nature of the risk (e.g., feedback mechanisms that heighten or lower the risk)
- changes in training, education, or required qualifications for operations and emergency response personnel
- social disorder (e.g., protests, riots, sabotage, terrorism)
- changes in risk monitoring and regulation
- higher liability and insurance costs
- repercussions on other technologies (e.g., lower levels of public acceptance) and on social institutions (e.g., erosion of public trust), as when the 1984 methyl isocyanate leak at the chemical plant in Bhopal, India, raised concerns about the possible failure of “fail-safe” systems at nuclear power plants
Once secondary impacts are perceived by social groups and individuals, they may lead to another stage of amplification and tertiary effects that may affect other parties, more distant locations, or future generations.
Each order of impact may also trigger (in risk amplification) or hinder (in risk attenuation) positive changes for risk reduction. Examples of positive changes were apparent in the wake of the Fukushima nuclear accident in Japan when Germany restructured its energy system and the United States (among other countries) launched a major review of its nuclear plants.
Uncertainty is inescapable, even in familiar situations—such as crossing a street or driving a car—but such quotidian uncertainty usually remains within reasonable bounds. People rely on existing knowledge and experience to guide future expectations.
But contexts change, and new elements affecting risk unexpectedly appear. For highly complex systems with extensive connectivity and interactions, or novel problems or technology for which experience provides little guidance, decisions must often be made quickly and under conditions of high uncertainty, greatly complicating the assessment of risk.
Uncertainty may arise from gaps in data, insufficient models, or incomplete scientific understanding of a risk. Depending on the type and source of uncertainty, new information and more data may not reduce it. As was noted in Thinking Strategically, a 2005 National Research Council report, scientific progress may not only reduce some uncertainties but also uncover new ones (NRC, 2005).
It is not surprising that—in a world of complex systems involving rapid technological change, highly coupled human and natural systems, and a kaleidoscope of social, economic, and political institutions—high levels of uncertainty challenge existing assessment methods as well as public deliberation about, and communication of, risk decisions and management procedures. In Science and Decisions: Advancing Risk Assessment (NRC, 2009), a committee of experts identified six core principles for addressing uncertainty and vulnerability (Box 1).
Management strategies have evolved for determining the interrelationships between types of uncertainty and decision patterns (Funtowicz and Ravetz, 1990). For situations in which uncertainties and decision stakes are low, standard routines and procedures usually suffice for making decisions. As stakes and uncertainties increase, professional consultants and other experts may be called upon. Finally, risks characterized by high-stakes decisions and significant uncertainty require the involvement of “post-normal science,” which applies when “facts are uncertain, values in dispute, stakes high, and decisions urgent” (Funtowicz and Ravetz, 1991). Social amplification is especially likely to be a compounding factor in such cases, for both assessment and decision making.
While there is little question about the challenges of risk uncertainty and social amplification for the scientific community, they are not issues for scientists alone. They greatly affect people and environments far beyond science. Uncertainty and amplification reflect differences in patterns of vulnerability, in the natural environment, and in social and cultural communities, as well as ambiguities surrounding the choice of management approaches and interventions.
Risk assessment is based on an assumption that sufficient knowledge and quantification can be achieved to enable command-and-control strategies and regulation. But much depends on the extent of both residual uncertainties and amplification, whether they can be reduced significantly, and how they affect the acceptability of the risk.
Large uncertainties and social amplification may necessitate alternative management approaches. Adaptive management strategies (e.g., “going with the flow” and making midcourse corrections) are based on the presence of substantial uncertainty and the understanding that knowledge is coevolving with the risk. The effectiveness of such approaches depends on the nature of the risk, the extent to which midcourse corrections can be made in technology, siting, and project design, and how extensively amplification affects decision options.
A particular policy strength of the social amplification of risk framework is its capacity to mesh emerging findings from different venues of risk and impact research, to bring various insights and analytic leverage into conjunction, and (particularly) to analyze connections and interactions in specific social and cultural contexts. Because of its broad applicability and inclusion of a wide variety of factors and linkages, this framework is useful for teasing out patterns and broader interpretations that may yield new insights and hypotheses.
Yet, even with more than two decades of experience since this framework was set forth, a full-fledged theory that explains why some risks and risk events undergo more or less amplification or attenuation has yet to emerge. There is more to study and learn in this evolving field of research.
Dietz, T., and P. Stern, eds. 1996. Understanding Risk. Washington, D.C.: National Academy Press.
Dietz, T., and P. Stern, eds. 2008. Public Participation in Environmental Assessment and Decision Making. Washington, D.C.: National Academies Press.
Funtowicz, S.O., and J.R. Ravetz. 1990. Global Environmental Issues and the Emergence of Second Order Science. London, U.K.: Council for Science and Society.
Funtowicz, S.O., and J.R. Ravetz. 1991. A new scientific methodology for global environmental issues. Pp. 137–152 in Ecological Economics: The Science and Management of Sustainability, edited by R. Costanza. New York: Columbia University Press.
Kasperson, R.E., O. Renn, P. Slovic, H.S. Brown, J. Emel, R. Goble, J.X. Kasperson, and S.J. Ratick. 1988. The social amplification of risk: A conceptual framework. Risk Analysis 8(2): 177–187.
NRC (National Research Council). 2005. Thinking Strategically. Washington, D.C.: National Academies Press.
NRC. 2009. Science and Decisions: Advancing Risk Assessment. Washington, D.C.: National Academies Press.
Pidgeon, N., R.E. Kasperson, and P. Slovic, eds. 2003. The Social Amplification of Risk. Cambridge, U.K.: Cambridge University Press.
1 “Risk manager” refers here to a public- or private-sector agency rather than an individual.
2 In this context, amplification does not explicitly or exclusively mean intensification or elaboration. It also means the modification of information as it is transmitted from one source to another and received through the filters of personal, social, cultural, and other biases.