
Wednesday, August 13, 2014

A Bone to Pick with Professor McCloskey re Inequality


Professor McCloskey discussing some of her ideas

Professor Deirdre McCloskey gets a lot right. Her “Bourgeois trilogy” (the third volume will soon be published) is an important contribution to current thinking about economics. Indeed, her academic career, including a stint at the high temple of market economics at the University of Chicago, now centers on re-thinking economic discourse and perspective.* Her project of developing “humanomics” (as she has dubbed it) adds another voice to those attempting to bring a new reality to the subject of economics. In this, she joins the likes of Diane Coyle, Eric Beinhocker, and Nick Hanauer and Eric Liu among those I’ve read recently who want to change our understanding of the fundamentals of economics. McCloskey’s project in this area merits the highest praise and encouragement.

But I have a bone to pick. In a recent Financial Times article, McCloskey dismisses current concerns over inequality. Diane Coyle (Enlightenment Economics blog) and Evan Davis in The Spectator describe her as taking an anti-Piketty position. McCloskey argues that “the Great Enrichment”, the tide of economic development that has lifted all boats, counts more than any resulting inequality. This Great Enrichment, which comes as a result of changing ideas and practices and not so much as a result of the accumulation of capital, has created the greatest era of human dignity and liberty that the world has ever known. I don’t disagree. I have enough of a sense of history to know that as a Baby-Boomer American I have lived in a unique, golden era of general prosperity and even relative peace. (I was just young enough—and lucky enough—not to have to go to Vietnam or even get drafted.) McCloskey provides a compelling explanation and defense of the engine of material and human well-being that we call “capitalism”. (She and I both have reservations about the usefulness of the loaded word.) People around the world have benefited immensely since thinking about economics and the bourgeois life began to change in the 1600s. (As I write this now living in China, it boggles the mind to consider the speed of change this country has undergone since I first visited in 2004, not to mention since 1979.) She argues that reducing the wealth of “the rich” (however you define them) doesn’t increase the wealth of the poor. It’s not a zero-sum game. Innovation in ideas and institutions creates wealth; exploitation does not. Thousands of years of exploitation merely shifted wealth among warrior classes in a zero-sum game involving lands and peoples, with per capita incomes that hovered at about $3 a day for about as long as humans have lived in civilizations. Nor does economic growth come from mere investment; even so, we have no reason, and in fact a disincentive, to choke off investment by the rich.

McCloskey argues that the current concern with inequality stems from envy, and to the extent her perception is correct (as it is in some measure), she has a valid concern. Envy is a mental poison. It’s ugly. It’s despicable. Perhaps René Girard is correct in identifying it (under his designation of “mimetic desire”) as the root of evil. But perhaps most obvious to me, entertaining envy of the rich is stupid. Why should we envy the rich, the 1%? All the money in the world doesn’t buy you love, respect, admiration, or other things that we value (although it can buy expensive imitations). Do I admire Jay Gatsby or The Wolf of Wall Street? Who admired the lead character in Wolf? I found the movie—centered on this narcissistic, money-intoxicated jerk—too boring to finish. Having a great deal of money and earning respect (for something other than having a great deal of money) are unrelated. Perhaps there’s even a negative correlation. Don’t get me wrong: all things being equal, I’d rather have more money than less. But all things are never equal, are they? For me to have big money—I wasn’t born rich—I would have needed to be very lucky and willing to trade more of my time, my energy, and my values (family, friends, leisure, community service) for more dollars. At some point, the marginal cost exceeds the marginal benefit. We all draw that line at different points, as individuals and as cultures, but a healthy individual does draw it. I’m somewhere between Henry David Thoreau and Donald Trump, but I sure as hell hope that I’m closer to Thoreau!^

So what’s my beef with inequality if I don’t think envy worth the coin and the rich don’t create poverty? It’s political. First, in our American political system with its legalized corruption, money buys influence. And lots of money buys lots of influence. Money isn’t speech. The U.S. Supreme Court made a monumental and disastrous blunder in equating speech with money. Money can coerce and seduce, but it does not reason. And with a lot of money, such as Koch brothers-size money or Sheldon Adelson-size money, you can buy a whole lot of scare time (sometimes referred to by the euphemism “air time”). The same is, of course, true of more liberal millionaires, say a George Soros. But either way, allowing any one person or group of individuals to dominate our political discourse (if we can give such low talk a hifalutin name) is anathema to reasoned, democratic discourse.

I realize that in referring to “reasoned, democratic discourse” I have invoked (what can most kindly be labeled) a Platonic ideal. But some conversations are more reasoned than others. It’s an ideal to which we can aspire. I understand what most strongly motivates humans: fear, greed, power, prestige, and sex. These are our base motives. But we want to hide our motives when attempting to convince others. Perhaps we have an innate shame of such animal instincts. So instead, we use reason; that is, public arguments based on shared criteria relevant to the subject. In other words, what should convince a disinterested audience of decision-makers? I realize this is an ideal and not a reality. After over 30 years of practicing law, I know the motivations that lead people into litigation: greed and an aggrieved sense of dignity—although most persons don’t appreciate the prominence of the latter. I understand the challenge of trying to convince (what you hope is) a disinterested judge or jury. Of course, they too have their built-in biases and motives, but like democracy as a political system, it’s the worst way of deciding things except when compared with all the others. (Thanks, Churchill.)

When we concede the bulk of public discourse to the super-rich, they set the agenda in their favor. They dominate the conversation (to the extent we can still call our democracy a conversation), and they win the support of the populace by repeating their message over and over again with no practical way to limit or challenge their assertions. (A lie told often enough and sincerely enough becomes true.) Make no mistake: Republicans and Democrats are both beholden to money. Individuals and organizations with big dollars to contribute to campaigns for or against a candidate control the agenda. Fear and resentment (and to a lesser extent seduction) are the current hallmarks of mass-media political advertising. To allow this type of communication to go unchecked, subject only to the checkbooks (and whims) of the wealthiest, is an error of judgment that will long haunt us. Politics is about the future, and a poisoned polity creates a poisoned future.

My other concern with inequality stems from living almost two years in India. Corruption in India isn’t legalized and winked at as it is in the U.S. In fact, I believe it’s comparable to the corruption experienced in the U.S. during the Gilded Age and the era of Tammany Hall. But an even greater problem in India stems from the pervasive and crippling social inequality that has such deep roots there. The caste system is legally abolished, but the social, economic, and political inequality remains. Upon exiting the Rambagh Palace in Jaipur, once a home to the local maharaja and now a five-star hotel adjoining the Jaipur Polo Club grounds, you can walk across the road and find a shantytown slum of the deepest poverty. Between venues in India, one finds an impoverished public space with poor infrastructure and little concern for the shared environment. The rich, as they tend to do around the world, think that they can retreat into their gated homes, their luxury cars, and their guarded, air-conditioned buildings. But each day they have to experience “the real India” of poor infrastructure, poor politics, and poor public spaces. I come from a small town in Iowa where even the kid reputed to have been born a millionaire had to attend the same schools and hang out at the same places as the rest of us, the kids of the encompassing middle class. Based on that experience growing up, I found the staggering inequality of India one of the most shocking aspects of that culture. To the extent that the U.S. moves to increasing levels of inequality—as I believe we are—we move in a way that will degrade the well-being of all, even the richest among us. If we in the U.S. mimic the inequality found in India, we invite not only an impoverished public space, but also social upheaval. While the poor can suffer from a relative lack of money, no one willingly or happily suffers from denigration of their dignity. This is the repeated lesson of civil rights and human rights movements.
Do we want to create a class of second-class citizens? Or rather, perpetuate and enlarge it? I hope not.

Professor McCloskey, who has gotten so much right in her analysis and defense of the innovation economy, the importance of ideas, and the relative usefulness of markets (i.e., capitalism), understates the importance of the growing inequality in America. This growing inequality, as seen in stagnated wages and the decline of much of the white middle class, has given rise to an increasing politics of resentment and desperation. The Tea Party and kindred groups, in the weird world of American politics, ally with the plutocrats and free-market ideologues who denigrate government and any intervention that limits business profits and prerogatives. It’s not rational, but it’s real. We don’t have to soak the rich—let them keep their money. I’m not happy when I pay taxes, but I’ve seen what happens without an effective tax and public finance system. I don’t believe that I can buy a private world that will insulate me from the poor, nor would I want that even if I could afford it. I only argue that we not allow the super-rich to skew our system in a way that warps our polity. We need to ponder the inequality that affects liberal, capitalist democracies and not just celebrate the wealth that this system has created.

*McCloskey also held appointments in economics and history at the University of Iowa from 1980 to 1999. I don't believe that we ever met.

^Having said this, I must report that I’m not retired or independently wealthy and that anyone knowing of remunerative work in Suzhou, China that doesn’t involve me teaching English should contact me immediately! Ditto if you know a lawyer who could benefit from my freelance lawyering service (www.GreenleafAdvocacy.com).

Thursday, May 23, 2013

A Review of the Film "The Reluctant Fundamentalist" and a Consideration of the Ideas of Liah Greenfeld





The Reluctant Fundamentalist, a film by Mira Nair, attempts to accomplish a great deal. Fundamentalism, which trails behind it the threat of terror and violence, is a crucial issue in our age. Nair’s film deals with issues surrounding fundamentalism by following the life of a young Pakistani, and in doing so it packs a great deal into a two-hour story. Essentially, a young, bright Pakistani comes to the US to attend Princeton University. He graduates with honors and is hired by a McKinsey-like company in New York City. He meets and beds a young artist, and then, when it seems like he’s on top of the world, 9/11 happens.

After 9/11, of course, things change drastically for him. He becomes the subject of airport searches and an unjustified arrest and interrogation. (The facts of the arrest and interrogation scenes struck me as over the top—or am I naïve?—given the utter lack of probable cause and the sophistication of the protagonist depicted in the film, but this is endemic to the film itself.) Returning to Pakistan, the young man begins to teach at Lahore University and becomes identified with radical views in the eyes of the local CIA contingent. In the end (spoiler alert), our young man rejects two types of fundamentalism: both the religious fundamentalism that leads to terrorism and violence, and the fundamentalism that the CIA and others deploy in their own Manichean worldview.
The film is well acted and well directed, and, as I mentioned, despite perhaps trying to pack too much into the script, it offers a decent storyline that explores some of the difficulties and challenges that such a young man would face.

Immediately before seeing this movie, I happened to read a blog entry by Liah Greenfeld. She’s a professor of Sociology, Political Science, and Anthropology at Boston University. (I found her piece on the website of Project Syndicate, which carries blogs by a wide variety of academics and intellectuals across a broad spectrum of issues.) In her entry, Prof. Greenfeld discusses the Boston bombers and their motivations in terms of her recently published book Mind, Modernity, Madness: The Impact of Culture on Human Experience (Harvard University Press, 2013), which advances an interesting hypothesis about not only the roots of terrorism but also how the entire Modern Age affects us all. After reading the blog entry, I discovered and reviewed her personal website, her blog, and the blog she writes for Psychology Today. Her Psychology Today blog sets out the premises of her book in a continuing series of entries.
Greenfeld’s initial hypothesis starts with the growth of nationalism beginning in Tudor England in the 1500s, when the death and disruption caused by the Wars of the Roses created opportunities for unprecedented social mobility. This social mobility, along with a new understanding of sovereignty, led to the rise of nationalism. From England, the idea passed to France, Germany, Russia, Japan, and around the world. Her second book, The Spirit of Capitalism: Nationalism and Economic Growth (2003), argues that capitalism arose not because of the Protestant Ethic (Weber) or some unique event or asset in the West, but because the idea of economic growth, as opposed to the mere accumulation of wealth, became a touchstone of nationalist thinking in the rivalry between the nation-states of Europe.

Her most recently published book (Mind, Modernity, Madness) argues that the views of Emile Durkheim and Max Weber, two of the pillars of sociology and political economy, give us insight into what has happened in our modern world. From Durkheim, she pulls the concept of anomie, the idea that modern society cuts us loose from many of the traditional ties that moor our identity and self-concept, leaving us adrift in the modern world. From Weber, although she dismisses, as do most scholars now, his thesis that the Protestant Ethic was the guiding idea of capitalism, she nevertheless takes the premise that an idea or ideas were in fact the motive force behind the world-changing events that led to capitalism and modernity. (In this regard, she seems very close to what Deirdre McCloskey is arguing in her books, including Bourgeois Dignity, although McCloskey apparently differs to some extent, based upon the brief references to Greenfeld's work in Bourgeois Dignity.)

In Mind, Modernity, Madness, Greenfeld argues that the most difficult and intractable of modern mental illnesses (schizophrenia, bipolar disorder, and major depression) are primarily diseases of the mind and not of the brain, contrary to the current trend of contemporary thinking. She believes that individuals thrown into the modern condition often cannot cope with the challenges foisted upon them, and as a result, some suffer these illnesses. This thesis attracts me because I have believed for a long time that we can’t account for all mental illnesses through biology alone. Some mental illness is without question solely of biological origin, but other illnesses have too much of a social-cultural component among their antecedents and symptoms to allow us to look at biology alone.

Let me share an example. Suppose someone is around the corner and you’re not aware that the other person is there. You turn the corner and confront the person. If you are not expecting anyone, you are startled; if you thought you were alone in the house, you are likely quite frightened. The moment you see the person and react, a number of biological changes occur almost instantaneously: your face reveals a startled expression, your shoulders hunch, your hands raise, and internally your body is flooded with adrenaline. Now, for our little thought experiment, let’s assume that a medical team is present to draw a blood sample and do a quick check of your body. They report extraordinarily high adrenaline in your blood and strongly flexed muscles. “Your problem,” says the good doctor standing by, “is excess adrenaline. We can prescribe a medication to counteract it, and while we’re at it, a muscle relaxant. Get that under control and you’ll be just fine.” Of course, there’s an alternative way to look at the diagnosis and an alternative way to look at the cause. I say, “Your mind has deceived you; the figure that you feared was your husband, who had been working quietly in the other room unbeknownst to you.” It’s much like the story of the rope mistaken for a snake in a dark room: our mind deludes us and causes us anxiety. Both explanations are true from their unique perspectives, but one, unless you’re into Big Pharma, presents a lower-cost, more effective way to deal with the problem: let Mother Nature run her course and put a bell around your husband’s neck.

All of the above is a long-winded way of saying that I find Greenfeld’s perspective very persuasive, with many practical applications. Whether schizophrenia, bipolar disorder, and major depression are in fact essentially one disease, a disease of the mind unique to the Modern Age, I can’t posit with authority since this type of conclusion remains far above my pay grade. However, if forced to take a side, I’d take hers. People who do and say things while hallucinating that we consider “weird” or “crazy” do so within their language and culture. Someone may believe himself to be Jesus or Napoleon if he lives in Europe, but probably not in India, where perhaps one would imagine oneself as Rama or some other god. Per Greenfeld and me (for what the latter is worth), even “crazy” behavior and its causes are at least in part products of culture and society.

I’m eager to explore Greenfeld’s trilogy in more detail when I can get the books (too big to enjoy reading on the Kindle). I’m eager to continue exploring fundamentalism and the Modern Age. I think more than one scholar (Karen Armstrong pops to mind) has argued persuasively that fundamentalism is a disorder of modernity. I suspect that religion is a carrier of the real disorder and not the cause: like rats and fleas spreading the Black Death, the root cause is anomie and a feeling of social uncertainty and belittlement that latches on to religion (Judaism, Christianity, Islam, Hinduism, etc.) to serve as a carrier. A particular religion provides a ready-made story that can be bent to justify the felt need to change the world and to right the injustices and wrongs experienced, even if the change requires wanton violence. (See this review of Reza Aslan’s How to Win a Cosmic War about how this virus ranges across the three great monotheistic religions.)

I think that Greenfeld’s hypothesis, as it applies to terrorism as we just witnessed it in Boston, fits with the ideas of Scott Atran as well (see my review here). Before learning of Greenfeld’s work, I found Atran the most persuasive writer that I’d read on this topic, but I think that Greenfeld provides a more comprehensive perspective, which, by the way, also applies to acts of domestic terror (if we don’t limit terrorism to random violence that purports to have some political motive). Atran is good at identifying how small social groups reinforce norms that lead to acts of terror, but he doesn’t as effectively address what I think are the underlying issues of social or status dislocation behind these acts. I think Greenfeld’s work gives us the more comprehensive viewpoint. Thus, as I currently understand Greenfeld, she would not see a significant underlying difference between the Boston bombers and the Newtown shooter, or between jihadist acts of terror and those of the Oklahoma City bombers or the Branch Davidians.

My question for Greenfeld is this: violent apocalyptic movements have been around for a very long time, certainly going back to biblical times, and the Middle Ages are full of millennial movements. How do we distinguish such actions in the Modern Age from their pre-modern precursors?

Understanding how social and cultural factors influence and create acts of violence or dislocation is crucial to our ability to counteract these trends. The pressures portrayed in a film like The Reluctant Fundamentalist and the analysis in Greenfeld’s book are both sources that we must use to come to a deeper understanding. In my own opinion, in a society so concerned with economics and economic growth, we tend to focus far too much on material incentives as the primary driver of human action. While material incentives are certainly significant, a deeper understanding of human motivations is called for. The passions are probably more important than the interests. People like René Girard are among those who try to take a deeper look at what is going on. Even going back to Thucydides, we see a more complex understanding of human motivation than what many of us have come to believe of late. Simple pain and pleasure are insufficient to explain the complexity of human motives. The human individual, by herself or himself, makes no sense unless we take into account the mirrors of society and culture. (This concluding metaphor comes from the serendipity of currently re-reading Robert Pirsig’s Lila.)
 

Wednesday, April 17, 2013

Thinking Like a Lawyer & Antifragility

Thinking like a lawyer is what law school is supposed to be all about, and to some extent, it is. Now many people, like Daniel Pink or Gerry Spence, agree that this is true--and woe to us all. And to some extent, that also is true. Word nitpicking, “word gravel” (Spence), and endlessly argumentative perspectives (for the purpose of delay or battle by attrition) are all too common in the legal profession. But I want to emphasize a more positive way of thinking about this topic. In fact, I want to consider legal thinking in the context of how lawyers tell laypeople to think about the legal issues put to them, because in this case—especially with juries—we have to put our perspectives in terms that everyone should be able to grasp.

The Legal System, Torts, and Negligence

I’ve thought about this because I’ve recently finished Nassim N. Taleb’s Antifragile, his intriguing (further) consideration of how risks can affect us. Because I will review the work in a separate post, I won’t go on at length here about the many aspects of the book, but I do want to point to some salient features. One, following a pattern that NNT set forth in his two previous books, Fooled by Randomness and The Black Swan, he favors practitioners over academics. He preaches empirical skepticism from a lineage starting with Menodotus of Nicomedia through Sextus Empiricus down to the present. He criticizes much of academic medicine because of its rationalist (as opposed to empirical, skeptical) mindset. The mindset that he criticizes says, “If we can do it, we should”, which leads us to think that we can fix a problem by doing this or that without consideration of system effects. He offers a number of examples that make sense.

However, one thing that he doesn’t address much is lawyers and the common law legal system.* Perhaps it’s because he’s only been exposed to contract-drafters and securities regulators, the sorts of attorneys who attempt to extinguish risk and minimize every manner of Black Swan. Unfortunately, he doesn’t address the common law world of tort law, for instance. Tort law developed in the U.S. in part as a response to increasing industrialization and commercialization. In short, we had more interactions not governed by contract or status: a number of casual but often consequential interactions that required the law to decide who lost and who gained. For instance, who paid what to whom if farmer Brown’s cow was on the railway track and was struck by the train? (Such collisions could damage the train, but the cow usually came up the worst; locomotives have cowcatchers, but cows don't have train catchers.) With the advent of automobiles, all of this became much worse. Now autos could run into trains and into each other. So how did the law respond?

Following a pattern laid down in the British Isles before the American Revolution, the courts laid down principles on a case-by-case basis, using prior decisions as precedents to establish the norms for later courts and parties to follow. This system dealt with the practicalities of the law at the level of actual litigants and in light of practical necessities. This should make NNT happy: practitioner and practice-based guidelines established according to need. Of course, sometimes the system seemed too overgrown and complex and needed some rational pruning (my Garden metaphor again!), but generally, the rationalization process was a distillation and not a complete overhaul. Some standardization occurred, but not a huge amount. One state could look at what another was doing and could adopt or reject a practice as it saw fit. What works in Iowa might not work in California or New York, but it might prove quite useful (and enlightened) in Nebraska. 

Consider the legal concept of negligence, from the realm of torts described above, which attempts to weigh whether a loss suffered by an injured party (plaintiff) should shift to another party when the plaintiff believes the defendant should have acted differently to avoid the plaintiff’s loss. The plaintiff can shift the loss (obtain a judgment for damages) to the defendant, if the plaintiff can prove by a preponderance of the evidence (more about burdens and evidence later in this essay) that defendant should have acted to prevent the harm from occurring.

We say that the defendant had a duty to act to prevent the harm from occurring by acting as a reasonable person would act under like or similar circumstances. The logic of negligence most widely known in legal circles comes from the decision of Judge Learned Hand in U.S. v. Carroll Towing, a case about a barge that broke loose and sank after a tow ship accidentally severed a line anchoring the barge to a pier. The U.S. government, owner of the barge, sued the tow ship (Carroll) for the damages caused, alleging negligence. Judge Hand established a “calculus” of negligence that has remained quite influential throughout the years:
         
[T]he owner's duty, as in other similar situations, to provide against resulting injuries is a function of three variables: (1) The probability that she will break away; (2) the gravity of the resulting injury, if she does; (3) the burden of adequate precautions. Possibly it serves to bring this notion into relief to state it in algebraic terms: if the probability be called P; the injury, L; and the burden, B; liability depends upon whether B is less than L multiplied by P: i.e., whether B < PL.

To expand on this, one can state: if Burden < Cost of injury × Probability of occurrence, then the defendant will not have met the required standard of care. If Burden ≥ Cost of injury × Probability of occurrence, then the defendant may have met the standard of care.
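Hand’s calculus is mechanical enough to express directly in code. The sketch below is a minimal illustration using hypothetical numbers, not figures drawn from any actual case:

```python
def fails_hand_test(burden: float, probability: float, loss: float) -> bool:
    """Apply Learned Hand's negligence calculus: if the burden B of an
    untaken precaution is less than the expected loss P * L, the
    standard of care was not met."""
    return burden < probability * loss

# Hypothetical: a $500 precaution against a 10% chance of a $100,000 loss.
# B (500) is less than P * L (10,000), so the standard of care was not met.
print(fails_hand_test(burden=500, probability=0.10, loss=100_000))     # True

# A $15,000 precaution against the same risk exceeds the expected loss.
print(fails_hand_test(burden=15_000, probability=0.10, loss=100_000))  # False
```

Real negligence determinations are far messier than this arithmetic, of course; the formula is an analytic touchstone, not a jury instruction.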

This classic formula has served as the touchstone of thinking about negligence since it was written in the Carroll decision back in 1947. When you think of the issue in this manner, it makes a lot of sense to formulate things this way. Analytically, it’s very clear and concise. Of course, in practice, it’s much messier. Indeed, as any experienced tort lawyer will tell you (or the Iowa Rules of Appellate Procedure, for that matter), questions of negligence are normally decided by applying the principles to the facts of the case, and what does or does not constitute negligence normally lies with the fact-finder, which is usually the jury. Thus, predicting what is or is not negligence becomes difficult because the issue is by definition fact- and context-specific; indeed, juries bring in attitudes from their communities that vary over time and place. For instance, what juries accepted as negligence 30 years ago might be rejected today because of changing community attitudes (shaped in part by business interests' campaigns against tort claims brought against them), and how a jury might dispose of a claim can vary from place to place (and even jury to jury). All of this remains decentralized and ad hoc, thus allowing change over time without recourse to the political process (for the most part; much of so-called “tort reform” is an instance of trying to freeze the system by limiting the damages that jurors can set for injury awards). To the extent juries aren’t limited by new reforms, this most democratic (in both its good and bad manifestations) process allows some predictability overall while allowing for change and individual variation over time and place. This, to me, seems a workable system, for all of its faults, which I’ve come to know well in the course of over 30 years of practice.

Burden of Proof

Another key area where I think that the law has some useful insights to offer all of us concerns the burden of proof. Specifically, who has the burden of proving (or disproving) a proposition (e.g., “Jerry was negligent and this injured me”), and how much proof is necessary for an actor (the court system) to accept the proposition as proven sufficiently to act upon it (e.g., to order Jerry to pay Sandy money for the harm caused by his negligence).

Generally speaking, a party seeking to establish a proposition has the burden of proving it. In other words, if you want the court to order something, such as payment of money damages or to order a person to refrain from acting via an injunction, you must prove the necessary elements. The standard of proof is the amount of proof necessary to establish a proposition sufficiently for a court to act upon it. In most civil cases in most jurisdictions, this standard is a preponderance of the evidence; in criminal cases, beyond reasonable doubt (and with various other permutations available for less common cases). For instance, for me to obtain a judgment against you for ten million dollars, I need only prove the legally required facts by a preponderance of evidence; however, if the State wants to send me to jail for a day, it must prove all of the legally relevant facts beyond a reasonable doubt. Criminal punishment, including fines, requires a higher standard of proof than a recovery of any sum of money in a civil action.

Think about this in light of the daily decisions that you make. Do you need to know something “beyond a reasonable doubt” or by a “preponderance of the evidence”, or perhaps “maybe” works? It depends mostly on the magnitude of the consequences of the decision. Will I like a new ice cream flavor? “Maybe” might provide a sufficient standard of proof for you to try the new flavor. Others may hold a higher standard; for instance, another person might believe that she probably will not enjoy the new flavor more than her established favorite (which is chocolate, of course). 
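One way to picture a standard of proof is as a confidence threshold that rises with the stakes. The numeric values below are rough placeholders of my own; courts never quantify these standards this precisely:

```python
# Illustrative thresholds only: the labels are real legal standards,
# but the numbers are made up for the sake of the sketch.
STANDARDS = {
    "maybe": 0.3,
    "preponderance of the evidence": 0.5,
    "beyond a reasonable doubt": 0.95,
}

def should_act(confidence, standard):
    """Act on a proposition only if our confidence in it clears the
    standard we have chosen for decisions of this magnitude."""
    return confidence > STANDARDS[standard]

# Trying a new ice cream flavor: low stakes, low standard.
print(should_act(0.4, "maybe"))                      # True
# Jailing someone on the same evidence: not even close.
print(should_act(0.4, "beyond a reasonable doubt"))  # False
```

The same middling evidence that justifies a low-stakes experiment falls far short of justifying a high-stakes one, which is the point of the ice cream example above.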

So how skeptically and conservatively (as a matter of judgment, not politics) we choose to act becomes (to some extent) a function of the magnitude of the risk involved. Let’s take global climate change as an example. On one hand, NNT argues that we are poking Mother Nature with a stick and that we shouldn’t. (“Burden of evidence: The burden of evidence falls on those who disrupt the natural, or those who propose via positiva policies.” Antifragile, 429.) Because of the huge consequences if human conduct is upsetting the global climate system, the burden must fall upon those who want to maintain the status quo (a via positiva if there ever was one!); i.e., those who don’t want to take active steps to counter any human-caused climate change. (Technically, we get into burden-shifting here. In other words, if the plaintiff provides some evidence of proposition A, the defendant can offer evidence to rebut the proposition, and then the plaintiff must have enough evidence to respond to the defendant’s contrary contention. The plaintiff carries the ultimate burden, but it switches back and forth during the course of the trial.) 

Blogger, self-experimenter, and UC Berkeley-Tsinghua psychology professor Seth Roberts, on the other hand, is a climate change denier. (He would more likely call himself a skeptic and attempt some burden-shifting of his own, but given what I’ve read him say about this, I think he belongs in the denier camp because of the confirmation bias he displays when considering the issue.) Like others I’ve read recently whom I otherwise find quite persuasive and intelligent, such as Roger Scruton and Deirdre McCloskey, he suggests that we’re overreacting when we say that we need to change. They argue, I think, that there is no Black Swan (or White Swan, for that matter) of risk, that human-caused global climate change is all a mirage. Roberts and the others argue, in effect: 

I know I might be the turkey, and they say that there’s this Thanksgiving Day that might terminate my well-being with extreme prejudice, but how do I really know that this will come to pass? I know that I hear the farmer sharpening long, metal blades, talking about this upcoming holiday, and looking longingly at me, but really, this might all be something else. We shouldn’t take any action unless we know for sure that this Thanksgiving thing that you think is coming is proven by . . . .
What, an actual occurrence? This strikes me as a bit late, but that seems to be their preferred position. 

I mention this because I’ve made lawyerly arguments on Roberts’s blog when this comes up, but my points about burdens and standards of proof don’t elicit a response. I admit that I’m not certain about human-caused global climate change, since I don’t have the credentials to independently verify and interpret the data (as I don’t for most of life’s biggest questions). But what I do know is that if those who argue that we humans are causing global climate change are right, and there are things we should be doing to alleviate its impact (it’s beyond stopping now, according to most), then we darned well ought to do all we reasonably can given the magnitude of the risks. (Refer back to the negligence calculus for a refresher on how to think about this.) The climate change deniers don’t seem to want to respond to this. The changes will cause big dislocations (to put it mildly) whether we choose to make changes ourselves or we allow Mother Nature to make them for us. In other words, either our economy changes or Earth’s climate changes. 

There are those who might argue that while we’re causing global climate change, we’ll innovate our way out of the pickle. “We’ve done it before and we can do it again.” Maybe. (See Thomas Homer-Dixon’s The Ingenuity Gap on the limits of our ability to innovate our way out of such problems.) However, this strikes me as a very big gamble, and frankly, one I’d not like to bet upon (I’ve too much to lose).

All of this may seem quite discursive, but I hope that I’ve made the point that, at least in some ways, “thinking like a lawyer” actually can prove useful, albeit demanding. (We don’t like to expend the energy that thinking requires of us, so we dodge it whenever we can.) Thinking like a lawyer may not allow us to see Black Swans, but the turbulence that lots of everyday decisions in law and business inevitably create, so long as those decisions are dispersed and have some degree of independence, should allow a measure of antifragility to arise.
 
* While reviewing Antifragile to make some notes I did find the following, which qualifies the statement made above. I will be looking for any other references to the law as I continue my review: 



The same bottom-up effect applies to law. The Italian political and legal philosopher Bruno Leoni has argued in favor of the robustness of judge-based law (owing to its diversity) as compared to explicit and rigid codifications. True, the choice of a court could be a lottery—but it helps prevent large-scale mistakes. (90)

I’d argue that my effort here has been to further elucidate this passing remark.