Wednesday, June 3, 2015

The Future Ain’t What It Used to Be

The Master: John Maynard Keynes

Niall Ferguson, a Scot @ Harvard
Robert "Baron" Skidelsky
A short while ago I wrote about Jeff Sachs’s criticisms of Paul Krugman on fiscal stimulus. Since then, an intellectual volley worthy of a Wimbledon final (all Brit, no less) has pitted Niall Ferguson, on behalf of Cameron-Osborne austerity, against Robert Skidelsky, representing proponents of Keynesian stimulus. The two have squared off, in order: Ferguson, Skidelsky (May 19), Ferguson (May 19), Skidelsky (May 28), Ferguson (June 1). As I set forth in my Sachs-Krugman post, I’m more persuaded by the Keynesian position. Following Ferguson, I agree that Keynes was indeed an austerian when austerity was appropriate (around the wars). But a Keynes quote cited by Ferguson, who hoped to hoist Keynes biographer Skidelsky on his own petard, captured my attention. Ferguson writes, quoting Keynes:

Responding to some early critics of his General Theory, Keynes showed that he recognized the importance of uncertainty in economic life, and consequently the difficulty of making predictions. “The whole object of the accumulation of wealth,” he wrote, “is to produce results, or potential results, at a comparatively distant, and sometimes at an indefinitely distant, date.” 


But, Keynes continued, “our knowledge of the future is fluctuating, vague, and uncertain.” There are simply too many things – from the “prospect of a European war” to the “price of copper and the interest rate 20 years hence” – about which “there is no scientific basis on which to form any calculable probability whatever.” 

Ferguson doesn’t mention it, but Keynes wrote about probability and uncertainty at length in his A Treatise on Probability. Keynes was a complex and deep thinker; was he perhaps muddled? Did he contradict himself by writing the above about uncertainty while (at times) recommending active government intervention in the economy by way of fiscal stimulus? I think not. And this brings us to the crux of my concern.

While Keynes at times recommended stimulus and at other times austerity, there is no reason to believe that his appreciation of the uncertainty about the future, and about the consequences of any current course of action, changed with his policy recommendations. Yet even in the face of acknowledged uncertainty, he acted. He chose (or recommended) courses of action that he believed would most likely bring about a desired result. Was he certain of either cause or effect? No, but like any decision-maker, he was confronted with choices: to spend or not to spend, to provide only one example. He, or any policymaker, could of course choose to do nothing, but barring ignorance or negligent indifference, that too is a choice. So it is with any economic decision. We rarely have the luxury of knowing for certain that our choices will bring about the results we intend; that has been the downfall of many an economic prediction. Thus, when we make economic decisions, we face (but do not necessarily know) a range of probabilities that our choice will bring about the desired effect, much like a weather forecast (“a 50% chance of rain today”). If we cut the price of our widgets, we’ll probably sell more. Probably. If we invest in Acme Corporation, we may make money, or we may find that the market for widgets has collapsed, and with it our investment. Only rarely can we act with a sense of certainty, especially in a complex system like the economy.
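To make that concrete, here is a minimal sketch of the widget decision as a probability-weighted choice. Nothing in it comes from the Ferguson-Skidelsky debate; the scenario, the probabilities, and every number are invented for illustration:

```python
# Hypothetical widget-pricing decision under uncertainty. We cut the
# price expecting to sell more -- "probably" -- but the demand response
# is only a guessed-at distribution, not a known fact.
PRICE_NOW, PRICE_CUT = 10.0, 9.0
UNITS_NOW = 1_000

# Invented scenarios: (probability, demand multiplier after the cut).
demand_scenarios = [
    (0.50, 1.25),  # 50% chance: sales rise 25%
    (0.30, 1.10),  # 30% chance: sales rise 10%
    (0.20, 0.95),  # 20% chance: the widget market softens anyway
]

def expected_revenue(price: float, scenarios) -> float:
    """Probability-weighted revenue across the guessed scenarios."""
    return sum(p * price * UNITS_NOW * mult for p, mult in scenarios)

hold = PRICE_NOW * UNITS_NOW  # "certain" only if nothing else changes
cut = expected_revenue(PRICE_CUT, demand_scenarios)
print(f"hold price: {hold:,.0f}   cut price (expected): {cut:,.0f}")
```

The arithmetic is trivial; the point is the structure. The decision turns entirely on guessed probabilities, and a modest change in the guesses flips the answer, which is exactly the fragility Keynes was describing.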

I think that wise decision-makers put their faith (and money) in the soundness of the decision-making process, not in theories about results. Repeated experience is the best guide, but it is often available only by analogy. History can provide many lessons, but it is easy to apply the wrong lesson to a problem, and history never exactly repeats itself. (“History does not repeat itself, but it does rhyme,” as the saying attributed to Mark Twain goes.) We judge by analogies. Humans, societies, governments, and economies are all complex organisms that defy consistent mechanical interpretation and manipulation. Human culture mutates too quickly to pin much certainty on a mechanical prediction of action. We can gain some insight from behavior demonstrated in large numbers, but even that method is subject to change.

So when Ferguson tells Skidelsky and Krugman that they can’t prove the British economy would have performed better with fiscal stimulus instead of austerity, he is correct in a limited but inconsequential sense. Ferguson is talking about an alternate course of history, a counterfactual, a concept he himself has used and practiced. (See his Virtual History, a book of counterfactual essays that he edited and contributed to.) The only path we know with (some) certainty is the path taken, and even then the contours of that path are hard to discern, as the Ferguson-Skidelsky war of statistics shows.

The NYT recently published a feature on Paul Ehrlich’s The Population Bomb, which predicted social and economic collapse from rising population within a decade or so of its publication. It didn’t happen. Ehrlich was wrong, but how wrong? Is Ehrlich wrong because there are no limits to the carrying capacity of Earth to support any human population? I don’t suspect many knowledgeable people would support that conclusion. If limits exist, then at some point over-population could trigger a catastrophic decline in human well-being. (If you don’t believe that an over-stressed environment could lead to civilizational collapse, then you should brush up on your Diamond (here under Social Science and here via Stephen Walt), Tainter, Homer-Dixon (here and here), Ophuls, Mark Buchanan, and Ferguson*, for starters.) We can conclude that Ehrlich’s initial predictions were inaccurate, but we can’t conclude that his basic premise (limits to human population) is unfounded.
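A toy growth model (mine, not Ehrlich’s, with every parameter invented) helps separate the two claims: when a limit binds is a prediction, but whether a limit exists is a premise, and getting the first wrong says nothing about the second:

```python
# Illustrative only: a textbook logistic-growth model with an assumed
# carrying capacity K. All parameters are arbitrary.
def logistic_step(pop: float, r: float, K: float) -> float:
    """One period of logistic growth toward carrying capacity K."""
    return pop + r * pop * (1 - pop / K)

K = 10.0  # carrying capacity (arbitrary units)
for r in (0.1, 0.5):  # the growth rate sets *when* the limit binds...
    pop, years = 1.0, 0
    while pop < 0.95 * K:
        pop, years = logistic_step(pop, r, K), years + 1
    print(f"r={r}: ~{years} periods to approach the limit")
# ...but the ceiling K is there either way. Misjudging the growth rate
# moves the date of trouble without removing the limit.
```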

This leads us to territory explored by Nassim N. Taleb, who has argued along lines I believe appropriate: because we don’t know the magnitude or probability of some risks, we should not take those risks. That is to say, we are in territory marked by uncertainty, not probability. We should not run experiments where, if the experiment turns out badly, N = 1 because we are no longer able to repeat it. For instance, how much do we want to experiment with nuclear war? How many times could humanity run a nuclear-war experiment? Taleb argues that genetically modified foods and human-caused global warming should be addressed with the acknowledgement that we don’t know with any real certainty the potential consequences or the likelihood of a catastrophic (or slow-motion) event. Without referring to it directly, Taleb seems to follow a negligence theory that judges behavior by the magnitude of the risk of harm (perhaps measurable, probably not) times the likelihood of the occurrence of the risk (perhaps measurable, probably not) to decide whether a particular course of action should be undertaken. (Taleb would add “skin in the game” as well: consequences for the decision-maker if the wrong choice is made. But in some of our examples, we would all suffer the consequences.) One difference from a court of law (among many) is that this formula can be applied prospectively, to judge what action to take (or not) in order to avoid loss, instead of retroactively, to apportion loss. (I’ve written more on this general topic in an earlier blog post, “Thinking Like a Lawyer and Antifragility”.)
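The calculus sketched above resembles the negligence test lawyers know as the Learned Hand formula: take a precaution when its burden is less than the probability of harm times its magnitude. The code below is my own hedged illustration, with invented numbers, of why that formula works for measurable risks and breaks down for ruin-type ones:

```python
# A sketch of the negligence-style calculus described above: act to
# prevent a harm when burden < probability * magnitude (Hand formula).
# All numbers are invented for illustration.

def should_take_precaution(p_harm: float, magnitude: float,
                           burden: float) -> bool:
    """True if the precaution is worth its cost: burden < p * L."""
    return burden < p_harm * magnitude

# A measurable, repeatable risk: the formula yields a usable answer.
print(should_take_precaution(p_harm=0.01, magnitude=1_000_000,
                             burden=5_000))  # True: spend the 5,000

# Taleb's point: for ruin-type risks (nuclear war, runaway warming),
# neither p_harm nor magnitude is knowable, and a bad outcome ends the
# experiment (N = 1). Model ruin as unbounded harm:
RUIN = float("inf")
print(should_take_precaution(p_harm=1e-9, magnitude=RUIN,
                             burden=1e12))  # True: avoid at any cost
```

With a measurable risk the formula balances costs; with an unmeasurable, unbounded one, any nonzero chance of ruin dominates every finite burden, which is Taleb’s case for not running the experiment at all.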

In both economics and population predictions, the other wild-card variable is human action, which is responsive and strategic. Did Paul Ehrlich’s cry of wolf affect the wolf and not just the villagers? (And that framing assumes population is no longer a problem; based on personal observation from living almost three years in India and China, I’m not willing to concede that population density isn’t still a crucial challenge.) Does the possibility of stimulus or austerity change the calculations of innumerable economic decisions? Certainly, and you can observe that most easily in markets. But what you see may surprise you and upset your expectations. For instance, some, like Ferguson, have not seen the high interest rates and inflation that they predicted would follow loose monetary policy from the Fed and a bit of fiscal stimulus. Even the common benchmarks can fail. Krugman, on the other hand, has called it right on the inflation issue.

So what are we left with? Educated guesses, uncertainty, caution, and a need for resilience when things don’t turn out as we hoped. The public dialogue can ignore risks and probabilities, but we do so at our peril. We should specify our judgments about the likelihoods of benefits and harms, and our standards of proof. Voters, unlike jurors, get a second bite at the apple, and we should keep a track record of those who seek to guide and lead the public. Decision-makers and those who advise them would do well to become more sophisticated in this perspective. Our future depends upon it.

*This article by Ferguson reveals a lot about Ferguson. The first part draws upon a deep ground of historical knowledge and sheds light on the phenomenon of collapse by applying current thinking about complexity. Thus, as Ferguson argues, “declines” are less dangerous than precipitous “falls”. In a complex system, including a financial system, collapse can come very quickly, or as Ferguson’s felicitous prose describes it, there can be a “sudden shift from a good equilibrium to a bad mess”. Financial collapses and regime changes, such as the French Revolution, the fall of Ming China, the fall of the Hapsburg and Ottoman Empires, and the disintegration of the Soviet Union, happened quickly and to widespread surprise. Although Ferguson doesn’t make the point, such examples should instill some measure of humility about making predictions. However, he goes on to suggest that the Obama Administration and the Fed, by spending and printing money, might trigger a financial collapse (this was published in 2010). His forecast is vague but ominous. Once again, Ferguson the historian gets sidetracked by Ferguson the political hack. But I must say that overall the article is well argued and perceptive when it sticks to history and avoids forecasting.