Friday, January 25, 2013

Where's the real bottleneck for natural gas? Distribution.

As a scientist and an unabashed nerd, I love data. Particularly, I love it when ready access to data reveals things that are surprising in the face of conventional wisdom.

Graph of wholesale electricity and natural gas prices from ISO-NE

An interesting case comes up with this year's colder winter in New England. (Thankfully, I'm located in damp, icy East Tennessee, where even the threat of ice and snow manages to bring civilization to a grinding halt.) In particular, Meredith Angwin noted an interesting press release by ISO-NE (the grid operator for New England), ostensibly pointing to record-low wholesale electricity prices but containing something more interesting buried beneath - a slow but quite noticeable creep of wholesale natural gas prices upward into the range of $6/MMBtu (1 MMBtu = 1 million British Thermal Units).
At first glance, this seemed a bit surprising to me: while spot prices for natural gas have nudged upward, they're still hovering well under the $4/MMBtu mark; in other words, they don't seem to be going anywhere fast. What really caught my eye, however, was the cyclic behavior of gas prices in the ISO-NE chart, something which doesn't show up in spot prices from Henry Hub (which generally sets the market spot price for natural gas in the U.S.).

Natural gas spot, retail, and electricity prices

Looking to verify the trend, I dug a little further around EIA's website. While unfortunately their data on "citygate" prices run a few months behind, the regular periodicity seen in the ISO-NE data was likewise absent from the citygate price - instead, taken as an average across the U.S., spot, wholesale, and electricity costs derived from natural gas tend to be strongly correlated. Yet in the ISO-NE data above, prices are clearly deviating substantially from spot prices - what gives?

Natural gas prices spike in the NE corridor

It turns out that the culprit is distribution. A look around EIA's website brought me to this interesting report, which notes that supply bottlenecks in the U.S. Northeast are expected to produce significant variances in energy prices from the rest of the U.S., and in particular from Henry Hub prices.

In essence, despite a relatively abundant supply of natural gas at the wellhead due to the proliferation of wells exploiting unconventional resources, one thing the laws of physics haven't changed is the capacity of distribution infrastructure - in other words, pipeline capacity. Natural gas doesn't really care what its end use is - be it electricity or home heating - which means a cold winter can easily drive up demand and stress pipeline capacity. That is precisely what is occurring: according to the EIA report, utilization rates at the Algonquin compressor station averaged around 86% during November and December 2012.

Capacity versus price comparison for the Algonquin compressor

Basic economics can predict what happens next. Because natural gas is shipped through pipelines as a compressed gas, frictional losses mean it must be repressurized at compressor stations across the pipeline network. The higher demand goes, the closer these stations run to maximum capacity. And, as EIA data helpfully shows, the closer utilization gets to 100%, the larger prices begin to "spread" from spot prices at Henry Hub.
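The relationship the EIA data shows - spreads that widen sharply as utilization nears capacity - can be sketched with a toy congestion model. To be clear, both the functional form and the scaling constant `k` below are my own illustrative assumptions, not an EIA formula:

```python
# Toy congestion model (illustrative only - the u/(1-u) form and the
# constant k are assumptions, not EIA's methodology). The idea: as a
# pipeline nears 100% utilization, the marginal cost of moving one more
# MMBtu rises sharply, so the local price "spreads" away from Henry Hub.

def basis_spread(utilization, k=0.5):
    """Illustrative basis ($/MMBtu) over Henry Hub at a given utilization.

    utilization: fraction of pipeline capacity in use, 0 <= u < 1
    k: hypothetical scaling constant ($/MMBtu)
    """
    if not 0.0 <= utilization < 1.0:
        raise ValueError("utilization must be in [0, 1)")
    return k * utilization / (1.0 - utilization)

# Spreads stay modest at mid-range utilization, then blow up near capacity:
for u in (0.50, 0.86, 0.95, 0.99):
    print(f"utilization {u:.0%}: spread ~ ${basis_spread(u):.2f}/MMBtu")
```

The exact numbers are beside the point; what matters is the shape - the spread is nearly flat at moderate utilization and diverges as utilization approaches 100%, which is qualitatively what the Algonquin data show.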



In other words, while the commodity price of gas may indeed be cheap, the wholesale cost to utilities can be an entirely regional phenomenon. This is especially true in the blustery cold of New England winter, where demand is especially cyclic.

Citygate prices in the Northeast versus U.S. average
I compiled citygate prices for natural gas across the Northeast and compared them to the U.S. average, and the effect is quite clear - states in the Northeast pay an appreciable premium over the U.S. average wholesale price, precisely due to these types of bottlenecks, particularly during times of peak demand (i.e., cold winters). Pipeline capacity has been steadily increasing in response to demand; however, the real issue will inevitably be cyclical "spikes" due to competing uses of gas as a heating source.
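The comparison itself is simple to reproduce once the EIA citygate series are in hand - it's just a percentage premium of a regional price over the U.S. average. The prices in the sketch below are hypothetical placeholders, not actual EIA figures:

```python
# Percent premium of a regional citygate price over the U.S. average.
# The example prices below are hypothetical placeholders, not EIA data.

def premium_pct(regional, us_average):
    """Premium of a regional price over a baseline, in percent."""
    return 100.0 * (regional - us_average) / us_average

# e.g. a hypothetical $8/MMBtu Northeast citygate vs. a $5/MMBtu U.S. average:
print(f"{premium_pct(8.0, 5.0):.0f}% premium")  # -> 60% premium
```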

Why go to all of this trouble to look at trends in gas consumption (especially when I'm not an energy economist by trade)? Because it upends some recent "conventional wisdom" about energy: gas prices are still a regional phenomenon. While there are some places where supply is not as constrained by distribution capacity (or driven by cyclic consumption), and where it thus makes perfect sense to look at natural gas as a short-term replacement electricity source (particularly for coal), the Northeast is a shining example of where this is definitely not the case. Which in turn makes efforts to shut down stable and relatively low-cost (not to mention carbon-free) baseload sources like Vermont Yankee (and Indian Point) all the more insane - despite the claims of proponents, the balance of electricity from these plants would not be made up from renewables but would almost certainly come from natural gas. For that, low spot prices for natural gas are a remarkably poor indicator - especially if one considers what adding another consumption driver would mean for utilization "bottlenecks" during times of peak demand (i.e., winter).

I still remain unconvinced by the argument that wholesale natural gas prices are set to explode (and so stand by my bet with Rod Adams) - namely because an abundance of known, recoverable supplies makes the economics of the raw commodity relatively self-correcting. (In other words, as gas prices go up, so too will the number of wells, pushing prices back down toward equilibrium.) However, revolutions in the recovery of gas from novel geology haven't changed the fundamental physics of pipelines - which may be the real constraint on natural gas growth as an electricity source, at least on a regional scale.

Update: Via Twitter, Rod Adams points to an EIA alert from yesterday about natural gas prices and capacity in the Northeast, given the recent cold snap. Current local spot prices at Algonquin and Transco Z6 NY (a New York-based distribution hub) are around $30/MMBtu - almost 10 times the Henry Hub spot price - with utilization factors reported at over 85% of capacity. As Rod puts it, "Winter happens."

Wednesday, January 16, 2013

DOE's spent fuel strategy: Not a bang but a whimper

There is a hallowed tradition in Washington known as the "Friday Document Dump," in which news and announcements the government wishes to bury are strategically timed for Friday afternoons, when such announcements tend to fall through the cracks of the typical news cycle (i.e., assuming reporters are even present to cover the event, the strategic timing tends to ensure it will miss the weekend papers, thus effectively "burying" the story by the time the new week rolls around).

DOE SNF strategy wordle
In this storied tradition, the Department of Energy released the Obama administration's response to the Blue Ribbon Commission report last Friday to relatively scarce media coverage. In fact, one would be hard-pressed to find any coverage in many of the major papers; what little coverage there was can be found in the Washington Times, Platts (an energy publication), and the Las Vegas Review-Journal. (Needless to say, the timing appears to have had its intended effect).

AREVA's NextEnergy blog and Nuclear Diner have already posted some of their thoughts on the release, but after reading the DOE's report I have to say I felt a bit underwhelmed. As a friend remarked, it's a document "laying out the next set of milestones for the nation's spent fuel management program to miss." I wish I could say he was joking.

Some of the major highlights:
  • An emphasis upon a flexible, staged, consent-based process for locating a permanent geologic repository for used nuclear fuel designed to be adaptive to potentially changing circumstances.
  • A new, independent waste disposal organization charged with overseeing used fuel management and disposal, along with legislative action to reform allocation of the Nuclear Waste Fee paid by operators to allow for greater operational flexibility and independence.
  • Short-term emphasis upon siting a pilot interim storage facility for used nuclear fuel, prioritizing the relocation of fuel from decommissioned reactor sites. Operations would begin in 2021.
  • Transitioning toward an operational interim storage site with sufficient capacity to meet the federal government's existing liabilities under the Nuclear Waste Policy Act of 1982; operations to begin in 2025.
  • Making "demonstrable progress" toward locating and characterizing a potential geologic repository with a target operations date of 2048.
Copy Pasta
Most of the above points are relatively familiar, essentially retreading what was already detailed in the original BRC report findings (thus raising the question of why a 14-page response would take so very long). And, for the most part, the BRC findings, translated into the DOE report, are not bad findings; however, it's hard to find where the DOE's report has added much at all to the discussion beyond a blanket endorsement.

Perhaps to the disappointment of AREVA (who emphasized reprocessing as a viable fuel cycle strategy in their blog response), the report seems to go out of its way to minimize the potential role of reprocessing in a future U.S. fuel cycle strategy. In fact, one point which stuck out to me was that the DOE report recommended the scope of the waste management organization (referred to as a "management and disposal organization," or "MDO" - because if there's one thing Washington loves, it's acronyms...) be constrained to explicitly exclude reprocessing. Here's the relevant quote:
In addition, the mission of the MDO will need to be carefully defined. For example, funding made available to the MDO should be used only for the management and disposal of radioactive waste. While this could include the management and disposal of waste resulting from the processing of defense materials, the MDO itself should not be authorized to perform research on, fund or conduct activities to reprocess or recycle used nuclear fuel. These limitations on the MDO mission are consistent with the recommendations of the BRC.
Thus, the report strongly indicates a commitment to a once-through fuel cycle for the time being. Among the factors cited to support this decision was ORNL research I highlighted in my previous post, which indicated that most of the current used nuclear fuel inventory (98%, in fact) could be consigned to direct disposal even assuming a future closed nuclear fuel cycle.

With respect to the emphasis on interim storage, I have to admit to a somewhat adverse reaction while reading the report - namely because of the jarring disparity between words and deeds. In particular, just such a ready-made pilot facility for interim storage based upon local consent has already been proposed - Private Fuel Storage. PFS existed as a consortium of nuclear utilities; it negotiated a contract with the Skull Valley Band of the Goshute Indian Tribe in Utah (about 70 miles SSW of Salt Lake City).

PFS had been attempting to open a privately owned and operated interim storage site for over ten years (it first filed a license application with the NRC in 1997); in the process, it became a political football for multiple administrations in the ongoing battle over Yucca Mountain. Ultimately, PFS received an operating license from the NRC in 2006, yet various shenanigans at both the state and federal level prevented it from ever opening. (The Bureau of Land Management refused to allow the expansion of a rail line to ship fuel to the reservation, and the state of Utah continued to block any shipments of spent fuel canisters to the site along Utah highways. Despite the fact that as a Native American tribe the Skull Valley Band is legally autonomous from the state of Utah, the state government found plenty of other ways to frustrate the intentions of the Goshute Tribe and PFS.)

Roughly two weeks before the DOE report was released, PFS finally announced its intention to withdraw its license from the NRC - namely because it was clear the process was going nowhere (and licenses aren't free). Thus the jarring chasm between word and deed: a pilot interim storage site already existed - one which had the consent of the local government (in this case, the Goshute Tribe) - yet the Obama administration has shown little inclination to intervene. One is left to wonder how any other future site could hope to get off the ground when a ready-made solution such as this is abandoned to state-level sabotage; based on the Utah example, one can easily see such a scenario playing out again, with states blocking shipments to interim sites located outside their borders.

Particularly depressing about the overall strategy is its relative lack of ambition: a planned operating date for an interim storage site a mere 27 years after the original timeline obligated by the Nuclear Waste Policy Act (and 43 years after the act was first passed), with no repository in sight until I (a relatively young and spry individual at present) am poised to retire - a full 50 years past the original deadline. (Only in the federal government is one allowed to miss a deadline by a full half-century with a straight face.)

I will be the first to say that the 1987 amendments to the Nuclear Waste Policy Act, which decreed Yucca Mountain the nation's sole geologic repository by legislative fiat, were a mess. But the warmed-over copy-'n'-paste job, combined with completely lackluster goals for siting a repository, looks like a rather unseemly indicator that the Obama administration's approach to the BRC process was essentially a stalling tactic, following its contentious decision (both politically and legally) to cancel the Yucca Mountain project. If one is to unilaterally dismantle nearly three decades of standing nuclear waste disposal policy, a little more should be expected in terms of an alternative. The DOE report is not it.

Tuesday, January 15, 2013

To reprocess or dispose? A look at fuel cycle triage

A recent study by former colleagues of mine at Oak Ridge National Laboratory raises some interesting questions about the future direction of the U.S. nuclear fuel cycle. My colleagues are presently engaged in a scientific triage study of used nuclear fuel disposition options. One of the largest parts of their work has simply been collecting the massive amount of data on the 67,600 metric tons (1 MT = 1000 kg) of commercial used nuclear fuel in the U.S. - including how long each assembly was burned in the reactor, the fuel type, and the initial enrichment - with the objective of accurately characterizing the composition and location of every used nuclear fuel assembly presently in the U.S. (I am also tangentially involved in this work, funding an undergraduate for data collection, and am hoping to expand my role into modeling work in support of this effort.)

The overall goal of this work is to support a more informed decision framework to specifically look at how we deal with spent fuel inventories in the U.S. - in other words, performing a triage analysis on what fuel would be the best candidates for various fuel cycle options (including direct disposal versus recycling). Given that some fuel is inherently going to be less suitable (read: more expensive) for recovering actinides as future fuel material, the goal is to sort out what can be disposed of immediately and what might be preserved for future fuel cycles.

Their (surprising) finding was that 98% of the current used fuel inventory (by mass) could be disposed of, without leaving open the option of future retrieval, while still allowing for a future closed fuel cycle in the U.S. This conclusion was based upon the assumption that the U.S. would eventually open a fuel reprocessing facility; even under this assumption, most of the present inventory of used nuclear fuel is not needed to support such a cycle. Some of this is simply due to the size of the U.S. inventory - at nearly 68,000 metric tons of heavy metal, with the largest fuel reprocessing centers having a throughput on the order of 1,000-1,500 MTHM per year, there is simply more "legacy" fuel out there than a typical facility could ever usefully process.
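The arithmetic behind that last point is worth making explicit, using the figures above (a ~67,600 MTHM legacy inventory against a 1,000-1,500 MTHM/year throughput for the largest existing reprocessing plants):

```python
# Back-of-the-envelope: years required for a single large reprocessing
# facility to work through the legacy inventory alone (ignoring the
# steady influx of newly discharged fuel, which only lengthens the queue).
legacy_inventory_mthm = 67_600  # U.S. commercial used fuel, per the ORNL study

for throughput_mthm_per_yr in (1_500, 1_000):
    years = legacy_inventory_mthm / throughput_mthm_per_yr
    print(f"{throughput_mthm_per_yr} MTHM/yr -> ~{years:.0f} years")
```

Even at the high-end throughput, that's roughly 45 years just to clear the backlog; at the low end, nearly 68 - before counting a single assembly discharged from a future reactor fleet.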

Their decision analysis was based on several factors: the value of the material which would be recovered (older fuel tends to have less plutonium available for recovery, and that plutonium is of lower quality); complexity (older fuel has complicating factors such as different cladding materials - like stainless steel - which can hamper recovery and thus make it less preferable than newer fuel); and simply the amount of material needed to sustain a closed fuel cycle (given the time before such a facility would come online, it is anticipated that more than sufficient inventories would be available without drawing on older fuel). Likewise, they considered which fuel assemblies might be useful to future reprocessing research efforts by DOE (such as used, highly-enriched fuel from naval and research programs).

To many who advocate exploiting the resource potential of used nuclear fuel (myself included), this is a jarring conclusion. There has always been a tacit assumption that domestic reprocessing would not only handle future inventories of used nuclear fuel, but help alleviate the demand for geologic repository space by making use of the readily available inventories out there. Yet beyond looking at what is economically practical (i.e., prioritizing the most valuable fuel for recovery), the report brings in an eye-opening reality: given that the U.S. has spent the last thirty years committed to a once-through fuel cycle, there is simply more used fuel than a single modern reprocessing facility would have the capacity to handle, especially given the steady influx of fuel from future reactors which would form the foundation of a future closed fuel cycle. As a result, much of this "legacy" fuel becomes unnecessary for supporting such future fuel cycles.

A more important implication relates to geologic disposal itself. The plans for the (now likely former) Yucca Mountain site called for a 50-75 year "retrievability" window; in other words, the repository was to be operated for an extended period during which used fuel could be retrieved from the repository for other uses. (After the retrieval period, it was generally assumed that if no use case had emerged by that point, permanently closing the repository was the most reasonable option.)

Designing a repository with future retrievability in mind doesn't come for free; it essentially adds another engineering constraint (read: cost) to the problem and ultimately requires further analysis of how the repository will contain waste during the retrievability window in addition to the "post-closure" period. (It also tends to bias one's choice of geology - a feature of salt-based repositories like WIPP is that they are explicitly not designed to be retrievable; the heat from nuclear waste packages causes the salt to plastically deform around them, effectively "sealing them in.")

Thus, figuring out which spent fuel has little prospect of future recovery represents a technical triage which can help simplify a future repository design (as well as open up options for where such repositories might be located). In essence, separating the "wheat" (fuel more useful for recovery) from the "chaff" (fuel with limited potential for recovery) allows for a more intelligent approach to used fuel disposition, one which can ultimately make constructing a future permanent geologic repository cheaper and easier.

Of course, the standard caveat applies: the hardest part of opening any geologic repository has never been technical so much as political. Nonetheless, the ORNL report offers a rather bracing conclusion as to what a future U.S. fuel cycle may look like, even if the decision is made to restart reprocessing in the U.S. Ultimately, the vast majority of the current inventory of used nuclear fuel may yet be destined for direct disposal, simply due to the realities of waiting over three decades to reconsider our rather ill-fated national decision to abandon a closed nuclear fuel cycle.

Thursday, January 3, 2013

Scientagonism: The problem of antagonistic science communication

A recent column by Daniel Sarewitz in Nature on bridging the "partisan divide" in public perception of science inspired some spirited debate on my Twitter feed yesterday. The short version goes something like this: scientists are often perceived as being in the thrall of Democrats, exposing the greater scientific enterprise to being undermined as simply another partisan front (or, alternatively, to factionalizing, wherein partisan camps each bring in their own "experts" and accuse the other side of "junk science"). None of this is helped by scientists who go out of their way to invite this perception - see, for example, the letter signed by 68 Nobel laureates endorsing President Obama over Mitt Romney in the last election (in which, Sarewitz notes, 43 of the 68 have a record of public donations to candidates; of these, only five have ever donated to Republican candidates, and none in the last election cycle). It goes without saying that, well-meaning as it may be, openly partisan activity like this doesn't help with the whole "not being perceived as a lockstep Democratic constituency" thing. (Note that I am explicitly not advocating a mass abdication of scientists from political discourse, which is a genuinely terrible idea - but rather cautioning that lending one's scientific credibility to openly partisan ventures may not be in the best strategic interests of science...)

Dueling PhD banjos
Sarewitz recommends bringing together scientists with less monolithic political views to demonstrate overall scientific consensus on key issues such as global climate change, along with ensuring greater ideological balance on high-profile scientific advisory panels. The overall goal of such an enterprise would be to restore the public perception of science as a bipartisan enterprise - and in particular, to inoculate policies based on scientific recommendations against being dismissed as "partisan science" - or, to use a favorite expression, against the "dueling PhD's." Unfortunately, while Sarewitz correctly diagnoses the problem, his solution falls far short.

The deeper problem here is antagonism - both perceived and real. Dan Kahan (of the Yale Cultural Cognition Project) has written prolifically about "cultural cognition" - in other words, how our individual values can (unconsciously) conspire to shape perceptions of risk to accommodate our pre-existing worldviews (something I've discussed before as it relates to public perception of risk and nuclear energy) - also known as motivated reasoning. In essence, the mind rebels against cognitive dissonance and will do what it takes to resolve it - namely, by shaping our perceptions to confirm previously-held beliefs. Ideology, as it turns out, is an extremely effective marker for predicting risk perception - and, more distressingly, these differences in perception grow more pronounced among "high-information" individuals, strongly pointing to the existence of motivated reasoning.

So what does all of this have to do with antagonism? Quite simply: everything. People will by nature rebel against information perceived as antagonistic to their worldviews, downplaying evidence of phenomena that threaten them. (Kahan notes how this cuts several ways - both in how the threat of global climate change threatens the market-oriented views of individualists and hierarchists, and in how the association of nuclear power with "big business" and highly concentrated capital raises the hackles of those of more egalitarian and communitarian mindsets.) These associations are particularly acute when a scientific issue comes charged with a single solution - in the case of climate, direct government intervention in the economy to regulate carbon dioxide emissions.

One of the more interesting outcomes of Kahan's experimental work has been in strategies for de-polarization - science communication strategies which seek to minimize these perception gaps, namely by presenting scientific information in a way that minimizes antagonism toward deeply-held values. An example of this depolarization with respect to climate change is of course nuclear energy (along with geo-engineering): when communication of climate risks is paired with policy prescriptions of increased use of nuclear energy or new technologies such as geoengineering, individuals inclined toward skepticism of climate risks become more receptive - in other words, framing has a demonstrable de-polarizing effect. Why? Because the science is now presented in a context where it no longer threatens the worldview of the listener.

And yet too often in science communication (and at times among nuclear advocates as well) the very opposite is at work - science is presented as antagonistically as possible to the audience, as if dismissing climate skeptics and religious fundamentalists as stupid and venal will somehow cow them into belief. (Once again, to my horror, I have seen the same phenomenon at work in certain discussions of nuclear energy - where those representing the house will shout down any who dare trespass in their domain rather than make any attempt at reasonable engagement.)

The same goes for the policies that flow from the science - absolutist arguments that inherently tie the science to one favored set of policies rather than a panoply of potential solutions. Such strategies are practically an open invitation to partisanship and motivated reasoning, and yet all too often they are the standard for how high-profile science communication on controversial issues gets done. (Similarly, attempts to present science as not fundamentally incompatible with various political and religious values are frequently dismissed as at best naive and at worst "selling out" science.) It is in these cases that members of the scientific community become their own worst enemies - namely, by hardening an opposition predicated on the idea that certain scientific findings are fundamentally antagonistic to their values (and thus we return to the realm of "dueling PhD's"...)

To put it on a meta level for a moment: getting the public to accept the scientific process as a means of understanding the natural world is, in essence, getting them to agree upon a common source for facts. But science communication is not and should not be a platform for antagonizing whatever misguided metaphysical or theological beliefs the speaker believes the audience holds. In other words, science can and should speak to facts and leave issues of metaphysics to others. (Or, to put it yet another way, as I did on Twitter - is your goal to change beliefs about scientific facts, or about religious theology?)

This problem of "dueling PhD's" - or, to put it another way, of competing certifications of science, and in turn which experts we trust - inherently comes back to these kinds of issues. Kahan recently posted an interesting four-part essay (drawing heavily on the ideas of Karl Popper) on the notion of a "Liberal Republic of Science" (Parts I, II, III, and IV), discussing how a key issue which arises even in societies that broadly accept science as a foundation of knowledge is the inevitable conflict over how we certify these sources of facts - in other words, the dueling PhDs. (Kahan stresses that in his view, much of the current wrangling over hot-button issues like climate, nuclear power, and vaccines is not even a question of who accepts science as a source of knowledge so much as of how our values shape whose information we certify as credible - which again comes back to how this information validates existing value systems.) Ultimately this once again returns to the issue of antagonism: science presented in a way that is directly antagonistic to the values of the listener will be stripped of credibility in favor of information from sources which do not antagonize those values. (Thus we get to Kahan's argument for a science of science communication - determining the best means of ensuring that the best and most accurate scientific information is received and accepted by the overall public.)

Growing a consensus on science as a source of knowledge (or further, developing a common understanding of the same core set of scientific facts) does not imply unanimity in policy ends (nor should it!), namely because policy is inherently a normative process. More importantly, dropping an explicitly antagonistic communication strategy in favor of one more accommodating to diverse values doesn't in any way imply "giving in" or "selling out" science (as my position has been rather uncharitably characterized). Above all else, the goal here is to get people to recognize a common starting point for facts, and to let the implications - both policy and metaphysical - flow from that common starting point. Getting people to agree on the reality of climate change does not imply unanimity about what to do about it, because this inherently involves value judgments over the required trade-offs - and the same is of course true for nuclear energy as well. What it does do, however, is ensure a more honest, reasoned, and productive discussion of the available options.

Again, however, this requires a strategy for science communication that puts aside antagonism and focuses upon compatibility with existing values. Two recent posts - one by Suzanne Hobbs Baker at the ANS Nuclear Cafe and one by Rod Adams at Atomic Insights - fit well into what I'm proposing. Both discuss communicating the value of nuclear energy as a strategy for combating climate change - Suzy within the context of framing nuclear as an ally of environmentalism in the face of climate change, and Rod in regard to how discussions of climate are often so charged, even within pro-nuclear communities, that such debates become toxic (and thus are often placed strictly off-limits), depriving the nuclear community of a key message in communicating with the public. Both focus on how presenting nuclear as explicitly compatible with environmental concerns can perhaps help forge partnerships between communities skeptical of (and at times even adversarial to) one another. (And again, to emphasize - a deep concern over how to do something about climate change while maintaining our present standard of living is one of the fundamental reasons I decided to change careers...)

This is something I myself have tried to embrace when dealing with audiences hostile to nuclear (such as the NNSA hearing on disposing of surplus weapons plutonium in MOX fuel in Chattanooga back in September). The very first thing I acknowledged to the audience was that we clearly have disparate opinions about nuclear energy (ones unlikely to be resolved in the span of a single evening), but that everyone in the room shared common concerns over peace and security - our preferred means of achieving this ("...to MOX or not to MOX, that is the question...") simply differed. I'm not so fantastically egotistical as to believe this changed the entire tone of the meeting (there were still certainly rancorous and loud comments from the opposition), but I do sincerely believe that starting from a position of common values and eschewing antagonism as much as possible helped to provoke the thoughtful discussions which occurred afterwards (and at least some civility during).

None of this implies that stepping down antagonism in science communication is a magic bullet, nor will it necessarily work in all cases (such as dealing with the most hardened zealots - be they of the anti-nuclear or fundamentalist variety...). But what it can do (in fact, what folks like Kahan have explicitly demonstrated when it comes to "compatibilist" communication strategies) is help to detoxify these kinds of discussions, namely by pulling people back from the brink by not threatening their deeper values. That in itself would be progress.