Friday, May 05, 2006

Fog of Life

In the military they refer to the "fog of war". As the Wikipedia entry states:

Prussian military analyst Carl von Clausewitz wrote: "The great uncertainty of all data in war is a peculiar difficulty, because all action must, to a certain extent, be planned in a mere twilight, which in addition not infrequently - like the effect of a fog or moonshine - gives to things exaggerated dimensions and unnatural appearance."
While the generals have to deal with the fog of war, I think we all have to deal with the fog of life. Just like the generals, we are inundated with lots of information and don't know which of it is correct.

The fog of life is being aware that not all we believe is true, but not knowing which particular beliefs are false.

Or, more eloquently put as it applies to advertising:
"Half the money I spend on advertising is wasted," John Wanamaker, the owner of America's first big department store, allegedly said in the 1870s. "The trouble is, I don't know which half."
The question then is: how much of what I believe is actually true? If I were to ballpark it, I would guess around 90%. That means 1 out of 10 things that I think I know is actually wrong. I can only ballpark it because many things that I thought I knew later turned out to be incorrect. For example, I thought I knew that AIDS was ravaging Africa. Then I read the following in the Washington Post:
KIGALI, Rwanda -- Researchers said nearly two decades ago that this tiny country was part of an AIDS Belt stretching across the midsection of Africa, a place so infected with a new, incurable disease that, in the hardest-hit places, one in three working-age adults were already doomed to die of it.

But AIDS deaths on the predicted scale never arrived here, government health officials say. A new national study illustrates why: The rate of HIV infection among Rwandans ages 15 to 49 is 3 percent, according to the study, enough to qualify as a major health problem but not nearly the national catastrophe once predicted.

The new data suggest the rate never reached the 30 percent estimated by some early researchers, nor the nearly 13 percent given by the United Nations in 1998.
I thought I knew that scientists were able to clone human embryos and make stem cells from skin cells. Then I read this in BBC News:
Research by South Korea's top human cloning scientist - hailed as a breakthrough earlier this year - was fabricated, colleagues have concluded. A Seoul National University panel said the research by world-renowned Hwang Woo-suk was "intentionally fabricated", and he would be disciplined.
Even rigorously peer-reviewed medical research is not as certain as we would hope. As I blogged previously, John Ioannidis, a Greek epidemiologist, believes 50% is a fair estimate of the proportion of scientific papers that eventually turn out to be wrong.

To deal with the fog of life, one key is realizing that there is no such thing as a fact. Rather, each piece of knowledge has an accompanying probability of certainty. Sources range from rumors, which have a low probability; to political information, which is a little better; to company press releases, which are a little better yet; to news on local TV, which is good; to well-researched documentaries and books, which are very good; to peer-reviewed science, which is some of the best. But even the best information can be wrong, as the examples above illustrate.
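To make that idea concrete, here is a minimal sketch in Python. Every confidence number and every belief below is made up purely for illustration; the point is only to treat each belief as a probability rather than a certainty.

```python
# A toy model of "no facts, only probabilities of certainty".
# All confidence values and beliefs are invented for illustration.

source_confidence = {
    "rumor": 0.50,
    "political information": 0.60,
    "company press release": 0.70,
    "local TV news": 0.80,
    "documentary or book": 0.90,
    "peer-reviewed science": 0.95,  # even the best can be wrong
}

# Beliefs paired with the kind of source they came from.
beliefs = [
    ("AIDS is ravaging Rwanda", "local TV news"),
    ("Human embryos have been cloned", "peer-reviewed science"),
    ("Drug dealers make lots of money", "rumor"),
]

# Expected number of false beliefs = sum of (1 - confidence).
expected_false = sum(1 - source_confidence[src] for _, src in beliefs)
print(f"Expected false beliefs out of {len(beliefs)}: {expected_false:.2f}")

# Probability that at least one belief is wrong,
# assuming (unrealistically) that errors are independent.
p_all_true = 1.0
for _, src in beliefs:
    p_all_true *= source_confidence[src]
print(f"Chance at least one is wrong: {1 - p_all_true:.0%}")
```

Even with only three beliefs from decent sources, the chance that at least one is wrong comes out surprisingly high, which is the fog in a nutshell.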

The key to reducing your fog is to obtain the most certain information you can and to have an accurate understanding in your mind of its certainty. By that I mean: get the information from the best source you can, but realize that even the best sources are wrong 1% of the time. Do like Warren Buffett, who says, "I don't worry about what I don't know -- I worry about being sure about what I do know". To estimate the accuracy of information, try to determine any potential bias it has, how good the data collection was, how well thought out it is, how well researched it is, and how well it has been reviewed by others. These all make it more likely to be true.

Another key is to dig deeper into the information and look at it critically. People enjoy analyzing data much more than they enjoy checking and verifying it. I bet people spend 80% of their time analyzing data and only 20% verifying it. But great analysis on bad data is worthless. Or as Keynes put it, "It is better to be roughly right than precisely wrong". To fight the fog of life, spend 80% of your time checking that your data is correct and only 20% of your time analyzing it.

Another way to deal with the fog of life is to be wary of conventional and common wisdom. That is why I liked the book Freakonomics, in which economist Steven Levitt shows many cases where conventional wisdom is wrong.
Which is more dangerous, a gun or a swimming pool? What do schoolteachers and sumo wrestlers have in common? Why do drug dealers still live with their moms? How much do parents really matter? What kind of impact did Roe v. Wade have on violent crime?
He takes a look at conventional wisdom, like the claims that a gun is more dangerous than a swimming pool or that drug dealers make lots of money, and shows why it is wrong through well-thought-out, peer-reviewed research. This made me wonder: what percentage of common wisdom is wrong? Hmm, maybe 10%? And what percentage of this book will turn out to be wrong in hindsight? Hmm, maybe 5%? And lo and behold, a couple of months after the book came out, I read this in The Economist:
It was a good test to attempt. But Messrs Foote and Goetz have inspected the authors' computer code and found the controls missing. In other words, Messrs Donohue and Levitt did not run the test they thought they had - an "inadvertent but serious computer programming error", according to Messrs Foote and Goetz.

Fixing that error reduces the effect of abortion on arrests by about half, using the original data, and two-thirds using updated numbers. But there is more. In their flawed test, Messrs Donohue and Levitt seek to explain arrest totals (eg, the 465 Alabamans of 18 years of age arrested for violent crime in 1989), not arrest rates per head (ie, 6.6 arrests per 100,000). This is unsatisfactory, because a smaller cohort will obviously commit fewer crimes in total. Messrs Foote and Goetz, by contrast, look at arrest rates, using passable population estimates based on data from the Census Bureau, and discover that the impact of abortion on arrest rates disappears entirely. "I am simply not convinced that there is a link between abortion and crime," Mr Foote says.
So it appears that he had a programming error in his analysis. Levitt still thinks his conclusion holds in spite of this, due to other evidence, but it certainly makes you wonder: if this is wrong, what else might be? But the test should not be whether the book is infallible, but rather whether it hits for a higher batting average than common wisdom. And I think it does.

A couple of other books that set out to debunk conventional wisdom using mathematical analysis are Moneyball and More Guns, Less Crime. Or so I thought.

In Moneyball, the author looks at how the low-budget Oakland A's used statistical analysis that differed from other teams' when evaluating players, in order to build a competitive team. Conventional wisdom in baseball says you should look at batting average, home runs, and RBIs when evaluating players. But by digging deeper into the numbers, the A's found that statistics like on-base percentage and batting average with zero outs contribute more to winning games. By finding players that others did not evaluate as highly, the A's could get players who contributed to winning but did not command high salaries.

This all sounded good, and I like the idea of hitting the statistics hard to find the hidden story. But then I was reading the Freakonomics blog, and I found out that Levitt thinks the Moneyball analysis is not accurate.
There has been much hype recently about baseball clubs finding statistics to identify good players. Levitt read Michael Lewis's book Moneyball about the supposed innovators, the Oakland As, and is unimpressed. "If you look at all the stats they say are so important, the As are totally average! There's very little evidence Billy Beane [the club's general manager] is doing something right."
More on the Freakonomics vs. Moneyball debate can be found here.

More Guns, Less Crime used economic regression and analysis to show how having more guns actually leads to less crime. Instead of making an argument on gun control based on simplistic reasoning and anecdotes, here was a book that used cold, hard data and analysis to come to its conclusion. Or so I thought. On pages 133-134 of Freakonomics, Levitt says that the author, John R. Lott, invented some of the survey data. If the data is phony, then who cares about the analysis? The whole book is worthless. How the heck am I as a reader supposed to know that? Hence the fog of life.

Here we had three books that were trying to debunk conventional wisdom: one had a computer programming error, another's analysis might not support its team's success, and the third made up data. So even when we are trying to shed the wrong conventional-wisdom beliefs we hold, new wrong beliefs can appear. And the only way to know that mistakes have been made is to have others go through the data or try to replicate the work. No one has the time to dig that far into every book they read. Better to just accept the fact that books like these have a, let's say, 5% chance of being wrong and live with it.

How do you make decisions in the fog of life when you know that not all of your beliefs are true? The first key is not to hide from it. Fundamentalist religions claim they have the answers to all of life's problems. All the answers are there for you in black and white in their holy book. But they aren't solving the fog of life; they are hiding from it. Instead of coming to terms with the uncertainty in their knowledge, they are ignoring it. Yes, this philosophy allows you to have full confidence in the decisions you make, but that does not make them correct. And the incorrect decisions will set you up for bigger problems down the road. Beware of this kind of certainty. As Michael Crichton says: "I am certain there is too much certainty in the world".

Instead, you must be willing to make decisions with incomplete information. You must first try to get the best information you can. But at a certain point it is not worth spending more time trying to get more accurate information. You must live with the fact that certain information may be only 95% certain, and that decisions based on it will be wrong 1 out of 20 times. You realize that it will take less time to fix that one problem than it would to get even more accurate information on all of your issues.
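Here is a back-of-the-envelope sketch of that trade-off in Python. Every number in it - the hours to fix a mistake, the hours of extra research, the 99% target - is an assumption made up for illustration, not a claim about any real decision.

```python
# Back-of-the-envelope check: "act now and fix mistakes" vs. "research more".
# All numbers below are invented for illustration.

decisions = 20                # decisions to make
p_now, p_more = 0.95, 0.99    # certainty acting now vs. after extra research
hours_per_fix = 5.0           # assumed cost of fixing one wrong decision
hours_extra_research = 1.0    # assumed extra research per decision

# Expected cost of acting now: a few mistakes, each needing a fix.
act_now = decisions * (1 - p_now) * hours_per_fix

# Expected cost of researching first: extra work on every decision,
# plus the (fewer) remaining mistakes.
research_more = (decisions * hours_extra_research
                 + decisions * (1 - p_more) * hours_per_fix)

print(f"Act at 95% and fix mistakes: {act_now:.1f} expected hours")
print(f"Push to 99% first:           {research_more:.1f} expected hours")
```

Under these made-up numbers, fixing the occasional mistake costs 5 expected hours while researching everything costs 21, which is the point of the paragraph above: past a certain accuracy, more research costs more than the errors it prevents.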

Poker players are a good example of how to deal with the fog of life. They have to make calls based on incomplete information. They don't know what the other player is holding. They know what he bet, but they have to evaluate whether or not he is bluffing. They have to know the odds of the various hands that can come up. And then they have to make a call based on this incomplete information. The better players are able to gather more information at the table by picking up other players' tells. This allows them to have more certainty in their calls. But they are not right every time. Good poker players don't have to be right every time; they just have to be right a higher percentage of the time. So too in the fog of life: it is not about being right all the time, it is about being right a higher percentage of the time.
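The arithmetic behind such a call can be as simple as an expected-value check. Here is a minimal sketch; the pot size, the bet, and the win probability are all made up for illustration, and the win probability stands in for everything the player has read from the table, tells included.

```python
# A minimal expected-value model of calling a bet under uncertainty.
# All numbers are invented for illustration.

def call_ev(pot: float, bet_to_call: float, p_win: float) -> float:
    """EV of calling: win the pot with probability p_win,
    lose the amount of the call otherwise."""
    return p_win * pot - (1 - p_win) * bet_to_call

pot = 100.0         # chips already in the pot
bet_to_call = 20.0  # opponent's bet we must match
p_win = 0.25        # our estimate that we hold the better hand

ev = call_ev(pot, bet_to_call, p_win)
print(f"EV of calling: {ev:+.1f} chips -> {'call' if ev > 0 else 'fold'}")
```

Even with only a 25% estimated chance of winning, the call is profitable here because the pot is large relative to the bet. The player who estimates p_win a little better than everyone else wins in the long run, even though he is still wrong most of the time.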

Robert Rubin, the Secretary of the Treasury under Bill Clinton, also understands how to make decisions in the fog of life. He wrote about it in his book In an Uncertain World. He also spoke about it in this excellent speech:
Everything I've experienced in life suggests to me that the key to the future is a decision-making approach that begins with the proposition that there are no provable certainties. That is the view of modern science and much of modern philosophy. And, this view that there are no absolute or certain answers quickly leads to recognizing that all significant issues are inherently complex and uncertain and, as a consequence, that all decisions are about probabilities and trade-offs. That, in turn, should lead to restlessly seeking to better understand whatever is before you in order to most effectively refine your judgments about those probabilities and trade-offs.
In the book Blink, Malcolm Gladwell talks about how Wall Street traders and Marine Corps brass are fundamentally in the same business. While the brass had never been to New York, and the traders had long hair and were unkempt and overweight, they were soul mates. Both had to make snap decisions with imperfect information. One was dealing with the fog of war, the other the fog of life.

The fog of life surrounds us. We are aware that not all we believe is true, but we don't know which particular beliefs are false. In order to deal with the fog of life, we must think more about the certainty of the information we hold and try to get the most accurate information we can. We must be skeptical of conventional wisdom, but realize that the debunking of conventional wisdom can also be wrong. The fog of life must not deter us from making decisions. Instead, we must get the best information we can, but at a certain point realize that it is better to fix the few mistakes that do occur than to spend more time getting more accurate data. The fog of life is about accepting that some of what we believe is incorrect, and moving on.

1 comment:

Anonymous said...

Great post! Embracing this fog of life is a common trait of leaders.

So how much of what we believe is true? 90%? I don't know if I'd put the number that high. A lot of our facts are based on gross hypotheses -- true in a certain frame of reference, but they ultimately break down under finer scrutiny. But if one says that 9 out of 10 decisions are accurate to a point of utility, I might agree; Newtonian physics, for example, is inaccurate but still useful.

The interesting question is -- does technology and science increase our collective ability to scrutinize (ie. depth), or does it broaden our scope (ie. breadth)? Breadth without depth yields a greater fog of life.
