
Thursday, April 21, 2016

Eduardo Porter Overstates His Case

Eduardo Porter in “Liberal Biases, Too, May Block Climate Change” blasts those on “the left” who question the use of nuclear power to mitigate the discharge of carbon that contributes to global warming. His allegation and its premises require careful consideration.

Porter aims his case against those on the left who oppose or doubt the need to adopt nuclear power as “the only technology with an established track record of generating electricity at scale while emitting virtually no greenhouse gasses.” Porter quotes Netscape founder Marc Andreessen that “the left is turning anti-science” and has become “reactionary,” with Andreessen citing resistance to the use of genetically modified foods and doubts about the displacement of workers by technology as two examples of a “reactionary” trend. By citing the quote (and in the remainder of the article), Porter apparently shares this view. Porter cites survey results showing that 65% of members of the American Association for the Advancement of Science support nuclear power. He fails to reveal or discuss the grounds upon which the 65% who support the use of nuclear energy, or the 35% who oppose it, base their positions. Porter even notes—suggesting, I think, that we (liberals) should be shocked—that more Republicans support the use of nuclear power than Democrats.

While Porter remarks that climate change denial as espoused by Senator Cruz is “absurd,” he counterbalances Cruz’s absurdity by stating: “But Bernie Sanders’s argument that “toxic waste byproducts of nuclear plants are not worth the risks of the technology’s benefit” might also be damaging.” Porter fails to follow up on this quote by explaining how this concern isn’t legitimate. Instead, Porter moves into a discussion of “our scientific and technological taboos,” suggesting that Sanders’s statement is an example of yielding to such a taboo.
Porter’s argument turns toward issues like evolution and general relativity (Einstein’s theory) as examples of how beliefs and interests affect a person’s willingness to adopt a scientific proposition. No doubt this is true, but it’s not equally true or consequential for every scientific theory or for every decision based on a theory. For instance, Christian fundamentalists question or deny the theory of evolution because it conflicts with a literal reading of the Bible. Porter compares this with those on the left “who said scientists either disagreed or were divided on the safety of storing nuclear waste,” suggesting that this, too, is a belief motivated by bias against science rooted in other beliefs. Porter ignores the 35% of members surveyed by the American Association for the Advancement of Science, referenced earlier, who didn’t support building more nuclear plants—or are the scientists in that group “the left,” subject to reactionary taboos?
In another balancing point, Porter notes that the right favors smaller government and free enterprise and is therefore motivated to deny climate change, because acknowledging it would require a modification of those ideological beliefs. Fair enough, but then look at the other side:
On the left, by contrast, people tend to mistrust corporations — especially big ones — as corrupt and destructive. These are the institutions bringing us both nuclear power and genetically modified agriculture.
Porter seems to suggest that we should be asking ourselves “why on earth would someone adopt such a foolish attitude about big corporations? What they told us about the safety of cigarette smoking and their concealment of global warming evidence was so honest, forthright, and helpful to all of us!”. (I leave other examples to your sound recollection.)

Porter concludes with this peroration:

Fixing it [what exactly?] won’t require just better science. Eliminating the roadblocks against taking substantive action against climate change may require somehow dissociating the scientific facts from deeply rooted preferences about the world we want to live in, on both sides of the ideological divide.

For Porter it’s simple: just follow the scientific facts.

Now I should put my cards on the table.

I’m skeptical about, but not actively opposed to, the expanded use of nuclear power. As someone who’s been around for almost all of the nuclear age, including fallout from atomic testing, Three Mile Island, Chernobyl, and Fukushima, I have a profound concern about the harm that nuclear energy can unleash, as well as an appreciation of its potential benefits.

I understand that if we drastically reduce the use of carbon-based fuels, we could suffer a reduction in the amount of energy available to us in our daily lives. Generally speaking, the greater the energy supply, the more complex the society, and a more complex society allows a better quality of life. Thus, I take a reduction in the energy available to society as a threat to our collective well-being if taken to an extreme. Of course, conservation, walking, and taking public transportation, for instance, are ways of reducing demands on the energy grid that won’t hurt—and may actually improve—the quality of life. But if taken too far, we all will suffer. As for alternative energy sources, they remain a hope, not a reality. Thus, we spurn any source of energy at our peril.

But the critique of Porter’s argument must go to a deeper level, addressing the conceptions of science, engineering, and risk assessment upon which Porter bases his argument.

Porter notes that science changes its opinions continually and sometimes drastically. Science and its practical application in engineering are human projects subject to human strengths and weaknesses. While science has continued to push back the barriers of ignorance and engineering has allowed the creation of increasingly sophisticated structures and systems, both still carry the curse of human fallibility. Science knows a lot, and really smart scientists know that we are ignorant of a great deal more, both because of the inherent limitations of human thinking and as a result of the biases that we all struggle with. As for the things we build, we create amazing structures, and we create catastrophic failures. Having seen what failings nuclear plants can suffer as a result of Nature’s unpredictability (Fukushima) or as a result of human flaws of design and operation (Three Mile Island and Chernobyl), we should let caution be our guide. While we can push back against the tide of ignorance and failure, we can never fully defeat it. Science and engineering are not monolithic gods before whom we should bow in a new idolatry. Instead, they are human enterprises that must remain subject to an awareness of our limitations. Hubris is the greatest enemy of science and engineering; it always has been, and I suspect it will remain so.

But to warn of hubris on the part of science and technology in believing that these enterprises can move beyond failure and ignorance is a meta-perspective. We need to address the particular problem of nuclear energy. As to the advisability of using nuclear power, we should ask about the specific risks and benefits. Having spent more than three decades as a practicing lawyer—and because I believe that certain legal principles provide a wide-ranging and sensible set of guidelines—I suggest we ask about issues of liability if problems with a nuclear plant develop (and not limit ourselves to fuel waste). What liability insurance coverage does a nuclear plant in the U.S. carry? What is required? How does an insurance carrier measure the risk? How is the likelihood of harm measured? How is the magnitude of damage measured? What standard of liability applies: strict liability (liability regardless of fault, say because of an earthquake or tsunami) or ordinary negligence (a foreseeable risk that a reasonable person in like or similar circumstances could have avoided)? Are there any caps on damages? In other words, what is the largest loss that any carrier or guarantor (i.e., taxpayers) would be expected to pay out? What are the other available options, and how do they compare on these and other relevant criteria, such as technical feasibility? And last but not least, what Black Swans lurk in the field? Or, in Rumsfeld language, what are the unknown unknowns? If the Deepwater Horizon catastrophe did x amount of harm, what can we expect from the next nuclear disaster?
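To make the insurance questions concrete, here is a minimal sketch (not an actuarial model) of the expected-loss arithmetic that sits behind them, particularly the question of damage caps. Every probability, damage figure, and cap below is a hypothetical placeholder, not industry data:

```python
# A minimal sketch of probability-weighted loss with a liability cap.
# All scenario probabilities and dollar figures are hypothetical.

def expected_annual_loss(scenarios, liability_cap=None):
    """Sum probability-weighted losses, optionally truncated at a cap.

    scenarios: list of (annual_probability, damage_in_dollars) pairs.
    liability_cap: maximum payout per event, or None for uncapped.
    """
    total = 0.0
    for prob, damage in scenarios:
        payout = damage if liability_cap is None else min(damage, liability_cap)
        total += prob * payout
    return total

# Hypothetical scenarios for one plant-year.
scenarios = [
    (1e-2, 5e7),    # minor incident: 1-in-100 years, $50M
    (1e-4, 1e10),   # major accident: 1-in-10,000 years, $10B
    (1e-6, 5e11),   # catastrophic failure: 1-in-1,000,000 years, $500B
]

uncapped = expected_annual_loss(scenarios)
capped = expected_annual_loss(scenarios, liability_cap=1e9)  # $1B cap
print(f"Uncapped expected loss: ${uncapped:,.0f} per year")
print(f"Capped expected loss:   ${capped:,.0f} per year")
print(f"Shortfall borne by the guarantor: ${uncapped - capped:,.0f} per year")
```

The gap between the uncapped and capped figures is, in effect, the exposure that falls on the guarantor of last resort, which is the point of asking who backstops the carrier.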

There are some answers to these questions. The U.S. nuclear industry does have insurance coverage (under the Price-Anderson Act), although it’s backed up by the government. But I question (and don’t presume to have a final answer) whether the appreciation of the risks has been appropriately addressed. Before I’d say yea or nay to further plants—and without any concern for Porter’s imagined taboos—I’d have to review and consider these risks and the available alternatives.


The point of this exercise is not that Porter is wrong to argue in favor of using nuclear power to ameliorate the carbon loading that increases the likelihood of catastrophic climate change. This argument can and should be made. However, to argue that those who question the wisdom of using nuclear power are Luddites who refuse to worship the god of Science and Technology is a calumny. It presents a naïve view of science, and it fails to consider the demanding issues of weighing risks and benefits. We can and must do better. Mr. Porter should do better by his readers. 

Wednesday, June 3, 2015

The Future Ain’t What It Used to Be

The Master: John Maynard Keynes

Niall Ferguson, a Scot @ Harvard
Robert "Baron" Skidelsky

A short while ago I wrote about Jeff Sachs’s criticisms of Paul Krugman on fiscal stimulus. Since then, an intellectual volley worthy of a Wimbledon final (all Brit, no less) has pitted Niall Ferguson, on behalf of Cameron-Osborne austerity, against Robert Skidelsky, representing proponents of Keynesian stimulus. The two have squared off, in order: Ferguson 1, Skidelsky May 19, Ferguson May 19, Skidelsky May 28, Ferguson June 1. As I set forth in my Sachs-Krugman post, I’m more persuaded by the Keynesian position. Following Ferguson, I agree that Keynes indeed was an austerian when austerity was appropriate (around the wars). But a Keynes quote cited by Ferguson—hoping to hoist Keynes’s biographer Skidelsky on his own petard—captured my attention. Ferguson writes, quoting Keynes:

Responding to some early critics of his General Theory, Keynes showed that he recognized the importance of uncertainty in economic life, and consequently the difficulty of making predictions. “The whole object of the accumulation of wealth,” he wrote, “is to produce results, or potential results, at a comparatively distant, and sometimes at an indefinitely distant, date.” 


But, Keynes continued, “our knowledge of the future is fluctuating, vague, and uncertain.” There are simply too many things – from the “prospect of a European war” to the “price of copper and the interest rate 20 years hence” – about which “there is no scientific basis on which to form any calculable probability whatever.” 




Ferguson doesn’t mention it, but Keynes wrote about probability and uncertainty in his A Treatise on Probability. Keynes was a complex and deep thinker, but was he perhaps muddled? Did he contradict himself by writing the above about uncertainty while (at times) recommending active government intervention in the economy by way of fiscal stimulus? I think not. And this brings us to the crux of my concern.

While Keynes at times recommended stimulus and at other times austerity, there is no reason to believe that his appreciation of the uncertainty about the future, and of the consequences of any current course of action upon the future, changed with his policy recommendations. Yet even in the face of acknowledged uncertainty, he acted. He chose (or recommended) courses of action that he believed would most likely bring about a desired result. Was he certain of either cause or effect? No, but like any decision-maker, he was confronted with choices: to spend or not to spend, to provide only one example. He or any policymaker could, of course, choose to do nothing, but barring ignorance or negligent indifference, that too is a choice. So it is with any economic decision. We don’t very often have the luxury of knowing for certain that our choices will bring about the results that we intend. That’s been the downfall of many an economic prediction. Thus, when we make economic decisions, we face (but do not necessarily know) a range of probabilities that our choice will bring about a desired effect, much like a weather forecast (“a 50% chance of rain today”). If we cut the price of our widgets, we’ll probably sell more. Probably. If we invest in Acme Corporation, we may make money, or we may find out that the market for widgets has collapsed, and with it our investment. Only rarely can we act with a sense of certainty, especially in a complex system like the economy.
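The widget example can be made concrete. Here is a minimal sketch of the pricing choice as an expected-value comparison under uncertain demand; all of the prices, demand figures, and variances are hypothetical, and the demand model is deliberately crude:

```python
import random

# Hypothetical demand model for the widget decision: cutting the price
# probably raises sales, but "probably" is doing real work. Every number
# here is made up for illustration.
random.seed(42)

def simulate_profit(price, base_demand, demand_sd, unit_cost=4.0):
    """One random draw of profit given normally distributed noisy demand."""
    demand = max(0.0, random.gauss(base_demand, demand_sd))
    return (price - unit_cost) * demand

def expected_profit(price, base_demand, demand_sd, trials=100_000):
    """Monte Carlo estimate of average profit across many demand draws."""
    return sum(simulate_profit(price, base_demand, demand_sd)
               for _ in range(trials)) / trials

# Keep the price: steady demand. Cut the price: higher but noisier demand.
keep = expected_profit(price=10.0, base_demand=1000, demand_sd=100)
cut = expected_profit(price=8.0, base_demand=1400, demand_sd=400)
print(f"Expected profit, keep price: ${keep:,.0f}")
print(f"Expected profit, cut price:  ${cut:,.0f}")
```

With these made-up numbers the price cut loses on expectation even though it “probably” sells more widgets, and the wider variance means any single realized outcome may stray far from either average, which is the forecaster’s predicament in miniature.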

I think that wise decision-makers put faith—and money—in the soundness of the decision-making process, not in theories about results. Repeated experience is the best guide, but it’s often only available by analogy. History can provide many lessons, but it’s easy to apply the wrong lesson to a problem. And history never—exactly—repeats itself. (“History does not repeat itself, but it does rhyme”—attributed to Mark Twain.) We judge by analogies. Humans, societies, governments, and economies: all are complex organisms that defy consistent mechanical interpretation and manipulation. Human culture mutates too quickly for us to pin much certainty on a mechanical prediction of action. We can gain some insight into behavior demonstrated by large numbers, but even that method is subject to change.

So when Ferguson says to Skidelsky and Krugman that you can’t prove that the British economy would have performed better with a fiscal stimulus instead of austerity, he’s correct in a limited but inconsequential sense. Ferguson is talking about an alternate course of history, a counterfactual, a concept he himself has used and practiced. (See his Virtual History, a book of counterfactual essays that he edited and contributed to.) The only path we know with (some) certainty is the path taken—and even then the contours of that path are hard to discern, as the Ferguson-Skidelsky war of stats shows.

The NYT recently published a feature on Paul Ehrlich’s The Population Bomb, which predicted social and economic collapse from rising population within a decade or so of its publication. It didn’t happen. Ehrlich was wrong—but how wrong? Is Ehrlich wrong because there are no limits to the carrying capacity of Earth to support a human population? I don’t suspect many knowledgeable people would support that conclusion. And if there are limits, then at some point over-population could trigger a catastrophic decline in human well-being. (If you don’t believe that an over-stressed environment could lead to civilizational collapse, then you should brush up on your Diamond (here under Social Science and here via Stephen Walt), Tainter, Homer-Dixon (here and here), Ophuls, Mark Buchanan, and Ferguson*, for starters.) We can conclude that Ehrlich’s initial predictions were inaccurate, but we can’t conclude that his basic premise (limits to human population) is unfounded.

This leads us to territory explored by Nassim N. Taleb, who has argued along lines that I believe are appropriate: because we don’t know the magnitude or probability of some risks, we should not take those risks. That is to say, we are in territory marked by uncertainty, not probability. We should not run experiments in which a bad outcome ends the series at N=1 because we are no longer able to repeat the experiment. For instance, how much do we want to experiment with nuclear war? How many times could humanity run a nuclear war experiment? Taleb argues that genetically modified foods and human-caused global warming should be addressed with the acknowledgment that we don’t know with any real certainty the potential consequences or the likelihood of a catastrophic (or slow-motion) event. Without referring to it directly, Taleb seems to follow a negligence theory that judges behavior by the magnitude of the risk of harm (perhaps measurable, probably not) times the likelihood of the occurrence of the risk (perhaps measurable, probably not) to decide whether a particular course of action should be undertaken. (Taleb would add “skin in the game” as well: consequences if the wrong choice is made. But in some of our examples, we’d all suffer the consequences.) One difference from a court of law (among many) is that this formula can be applied prospectively, to judge what action to take (or not) in order to avoid loss, rather than retroactively, to apportion loss. (I’ve written more on this general topic in an earlier blog post, “Thinking Like a Lawyer and Antifragility.”)
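The negligence calculus described above resembles Judge Learned Hand’s formula from United States v. Carroll Towing: take the precaution whenever its burden B is less than the probability P of the harm times the magnitude L of the loss. A minimal sketch, with hypothetical figures throughout, shows why ruin-type risks break the ordinary version of the calculation:

```python
# A minimal sketch of the Hand-style calculus the paragraph describes:
# take the precaution (or forgo the activity) whenever the burden B of
# doing so is less than the probability P of the harm times the
# magnitude L of the loss. All figures are hypothetical; for fat-tailed
# risks, P and L may not be measurable at all, which is Taleb's point.

def precaution_warranted(burden, probability, loss):
    """Hand formula: precaution is warranted when B < P * L."""
    return burden < probability * loss

# Ordinary, insurable risk: both P and L are roughly estimable.
print(precaution_warranted(burden=1e6, probability=1e-3, loss=5e9))  # True

# Ruin-type risk: L is effectively unbounded, so any nonzero P makes
# P * L dominate every finite burden.
print(precaution_warranted(burden=1e12, probability=1e-9, loss=float("inf")))  # True
```

When L is effectively unbounded, any nonzero P makes the right-hand side exceed every finite burden, which is the formal core of Taleb’s precautionary argument.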

In both economics and population predictions, the other wild-card variable is human action, which is responsive and strategic. Did Paul Ehrlich’s cry of wolf affect the wolf and not just the villagers? (And this question assumes that population is not a problem; based on personal observation from living almost three years in India and China, I’m not willing to concede that population density doesn’t remain a crucial challenge.) Does the possibility of stimulus or austerity change the calculations of innumerable economic decisions? Certainly, and you can observe that most easily in markets. But what you see may surprise you and upset your expectations: some—like Ferguson—have not seen the high interest rates and inflation that they predicted would follow loose monetary policy from the Fed and a bit of fiscal stimulus. Even the common benchmarks can fail. Krugman, on the other hand, has called it right on the inflation issue.

So what are we left with? Educated guesses, uncertainty, caution, and a need for resilience when things don’t turn out as we hoped. The public dialogue can ignore risks and probabilities, but we do so to our detriment. We should specify our judgments about the likelihood of benefits and harms and our standards of proof. Voters, unlike jurors, get a second bite at the apple, and we should keep a track record of those who seek to guide and lead the public. And decision-makers and those who advise them would do well to become more sophisticated in this perspective. Our future depends upon it.

*This article by Ferguson displays a lot about Ferguson. The first part draws upon a deep well of historical knowledge and sheds light on the phenomenon of collapse by applying current thinking about complexity. Thus, as Ferguson argues, “declines” are less dangerous than precipitous “falls.” In a complex system, including a financial system, collapse can come very quickly, or as Ferguson’s felicitous prose describes it, there can be a “sudden shift from a good equilibrium to a bad mess.” Financial collapses and regime changes, such as the French Revolution, Ming China, the fall of the Hapsburg and Ottoman Empires, and the disintegration of the Soviet Union, happened quickly and to widespread surprise. Although Ferguson doesn’t make the point, such examples should instill some measure of humility about making predictions. However, he goes on to suggest that the Obama Administration and the Fed, by spending and printing money, might trigger a financial collapse (this was published in 2010). His forecast is vague but ominous. Once again, Ferguson the historian gets sidetracked by Ferguson the political hack. But I must say that overall the article is well argued and perceptive when it sticks to history and avoids forecasting.