Often, it's the first inclination of technically-minded folks to simply dismiss these people as irrational, even stupid. And yes, this is certainly easy - even satisfying (particularly on days when I'm feeling especially curmudgeonly...). It's also terribly unproductive. In light of this, I wanted to dig deeper into how the public perceives risk, drawing on an established body of literature (again, perhaps most famously through projects like the Yale Law School's Cultural Cognition Project).
Understanding "perceived risk"
A particular school of thought in the social science of risk perception, known as the Cultural Theory of Risk, holds that the relative perception of risk - and ultimately, the determination of "acceptable" risk - is governed by cultural factors exogenous to strictly technical evaluations of risk alone. In other words, despite the fact that flying is safer than driving, people perceive the former to be less safe due to other, outside factors. Thus, understanding how risk is perceived by members of the public (i.e., the "non-technical" community) requires understanding the factors which tend to weigh upon evaluations of risk - in other words, factors which promote perceived risk. These include:
- Involuntary exposure
- Lack of personal control
- High/catastrophic consequences
- Inequitable distribution of risk
- Lack of familiarity
- Lack of perceived benefit
- "Dread" factors (e.g., cancer)
- Irreversible consequences
In this way, risks of higher probability but more moderate consequences (e.g., natural gas explosions, coal waste accidents, etc.) are viewed as more "acceptable" than sources such as nuclear accidents, despite the latter carrying a much lower probability of harm. By the same token, risks which are mundane and taken up voluntarily - think smoking, etc. - are viewed as acceptable despite well-known and demonstrably higher probabilities of harm.
Values and risk perception
An outgrowth of the Cultural Theory of Risk (or perhaps simply an alternative model altogether, although arguably not entirely incompatible) is the theory known as Cultural Cognition of Risk, which posits that deeply-held values influence how risks are perceived and processed by members of the public - and thus, which risks are seen as more prominent. Cultural Cognition (and the Cultural Cognition Project) seeks to explain gaps in public perception of risk by looking at the correlations of risk perception with values - in other words, looking at why different political and cultural groups show wide disparities in perceived risk on divisive social issues such as global climate change.
Cultural Cognition divides value systems along two main axes. Roughly speaking, the vertical axis corresponds to values about how social goods (wealth, power, duties, and entitlements) are distributed, with "hierarchical" orientations favoring their distribution according to relatively "fixed" social markers - age, sex, race, etc. - and thus seeking to maintain these orderings. Conversely, egalitarian values tend to reject the idea of ranked hierarchies in the distribution of social goods. The horizontal axis captures the relationship of the individual to society - the leftward direction placing greater emphasis upon the individual and competitiveness, the rightward direction emphasizing group solidarity over the individual. (An example of this can easily be observed in Eastern versus Western cultures, and in particular the expectations of individuals with respect to their societies.)
For those familiar with the Nolan Chart, or its variant, the Political Compass, there is a relatively intuitive mapping between the values proposed by Cultural Cognition and the Personal/Economic liberty axes in each one (i.e., the top left would be considered "conservatives," the bottom right "liberals," the bottom left "libertarians," and the top right "populists"). Thus, the familiar partisan splits in nuclear energy support become clearer as one draws associations between the commonly held values of self-identified liberals, conservatives (and of course, libertarians!).
The central thesis of Cultural Cognition is that risk perception tends to be oriented along lines that remain harmonious with one's social values - risks which appear to challenge one's social values are minimized, while risks which speak to the concerns of those values are heightened. The topics studied under these lines of thought tend to be divisive social issues such as gun ownership, abortion, nanotechnology, and indeed, nuclear power (in particular, nuclear waste management). Thus Cultural Cognition theory posits that differences in perceived risk over major social issues come from a reconciliation of information about risk with deeply-held personal values, which explains the gap in risk perception between different groups.
Education alone is not enough
Bringing this back to the subject of nuclear, it seems as though once we understand what drives the perception of risk, this should be enough to bring such perceptions more in line with actual facts. Yet one of the most discouraging findings in the literature on cultural cognition of risk is that simply educating people is insufficient on its own, despite the naive assumption that such efforts bring about familiarity and thus diminish outsized perceptions of risk. Why is this? Cognitive dissonance. For individuals already negatively predisposed toward a subject (i.e., nuclear energy), the presentation of new information produces an uncomfortable state of dissonance, which the mind naturally seeks to resolve. Typically this is done by dismissing the conflicting information and seeking reinforcing information from "trusted" sources - perhaps illustrating why, in spite of repeated debunking, some myths just won't die. And indeed, this is something we've seen before - again and again.
Going yet further, proponents of the Cultural Cognition hypothesis posit that educating participants on topics about which they were previously uninformed can actually produce a polarizing effect in attitudes. An example of this is a study of participant attitudes toward nanotechnology, a topic on which most individuals have little starting information. The presentation of educational materials on the risks and benefits of nanotechnology actually had the effect of polarizing these individuals, despite everyone receiving the same information - again implying that education on its own does not necessarily lead to broad accord.
Does this mean education is hopeless? Not at all - but it does mean that education must be carried out in a way which minimizes cognitive dissonance, namely by engaging with the value system of the listener. That is, by presenting information in a way which affirms rather than challenges the deeply-held values of the audience, said persons are more likely to be open to processing this new information and challenging previously-held beliefs.
In my last post, I alluded to the fact that individuals holding an "individualist" value persuasion are more likely to be open to evaluating the risks of global climate change if nuclear power, rather than regulation, is presented as the solution to climate change. (Joe Romm, are you listening?) In this case, it is a matter of a message speaking to the values of the listener - individualists tend to be more prone to considering technological solutions to social problems and disinclined toward solutions which encroach upon private, market-oriented mechanisms of social ordering.
As a personal aside, I will say that, as someone with a similar worldview, the connection between nuclear energy and climate change made a similar impression upon me - that is, in evaluating climate change as a problem to be solved through human ingenuity rather than imposed impoverishment, a discordance is removed - it becomes possible to reconcile a concern for climate change with previously-held values.
Obviously, this works with different value orientations as well - those with egalitarian value systems can arguably be brought around to support nuclear energy if it is seen as affirming egalitarian social values. Two examples which come to mind are energy poverty and the inherently unequal outcomes of climate change, which would disproportionately impact the world's poorest nations (i.e., those least able to adapt to climate change through economic and technical means).
Summing it up
To summarize - providing education and facts is good, useful even - but on its own insufficient without presenting those facts in a context which engages with the deeply-held values of the audience. To produce actual engagement - and even inducement to support - requires producing a context of facts compatible with the values of those one is trying to reach. In other words, for the case of nuclear, it means going beyond education and comparative evaluation of risk (again, to emphasize, both of which are valid in and of themselves) and placing these within a framework which speaks to the values of the audience.
For individualists (who the research shows already tend to attach a lower perceived risk to nuclear energy), this might mean presenting nuclear energy as a practical solution to climate change - something which has the spillover benefit of bringing about thoughtful consideration of the issue of climate change itself. For communitarians and egalitarians, this might mean demonstrating how nuclear energy can serve to mitigate much larger, more inequitable risks while also honestly engaging concerns over safety and inter-generational equity issues like waste management - in other words, validating these concerns while demonstrating that these are issues we take seriously and continue to devote considerable attention to.
None of this is a silver-bullet solution for engaging with the public, but it provides an illuminating context within which to facilitate a more productive discussion over energy.
A passage which struck me while researching this topic came from a proponent of cultural rationality (i.e., the position that emotional reactions to risk have validity as moral, "normative" evaluations, alongside strictly technical, "positive" evaluations of risk), who argued that members of the technological community do not have a privileged view of the normative factors associated with risk, particularly with respect to nuclear (the paper was on perceptions of nuclear risk in light of Fukushima). That is, while members of the technical community have a privileged view of technical facts, they do not have a privileged view of overall assessments of what constitutes acceptability in risk - a normative judgement.
All of this, of course, is true. As such, in my mind it is the job of nuclear professionals (as members of the "technical community") to do our best to provide an accurate technical framework for the public's evaluations of risk, such that they can make the soundest decisions possible. Meanwhile, it is the job of nuclear communicators and advocates to speak to values, so as to produce fairer evaluations of both the benefits and risks of nuclear, particularly in the context of available energy choices.