Desperately Seeking Signal (credit to Nate Silver for this title)

As part of my ongoing tours as a Society of Petroleum Engineers (SPE) Distinguished Lecturer, I was in Bartlesville, Oklahoma a little while ago giving my presentation. My talk is about appropriate and inappropriate ways we can apply our risk tolerance when making strategic decisions on major projects. I spend some time talking about the statistical traps people sometimes fall into when analyzing uncertain opportunities (and let’s be honest – all business opportunities are uncertain).

Afterward, a gentleman approached me and asked my opinion of how many business executives understand the way statistics – probabilities, correlations, dependencies, etc. – work. I said, “Very few.” Most people have had a course or two in statistics somewhere in their education, but they soon forget whatever they learned (assuming they didn’t sleep through the class in the first place – statistics isn’t exactly the most riveting of subjects for most folks). This may not have been such a big issue historically, but in the era of Big Data and complex systems, businesspeople of all ranks – not just executives – need to be able to understand and use the outputs of statistical analyses appropriately.

Bill Gates had an interesting article in the Jan. 26-27 Wall Street Journal in which he makes the case that measuring things (i.e., collecting real data) around our most serious problems is the key to solving them. This may very well be so, but those measurements will be noisy, they will suffer from sampling bias, and they may or may not be representative of what we can expect in the future. Teasing out meaningful patterns and true cause-and-effect relationships – and then determining what action to take in order to maximize the probability of achieving our objectives – will require some degree of understanding of how real data behave, of what you can and cannot conclude from the data at hand.

I am currently about one-third of the way through a great book which addresses these issues: Nate Silver’s The Signal and the Noise. Silver gained fame this past November by correctly predicting the outcome of just about every race in the U.S. elections, including how all fifty states voted in the presidential election. The book is filled with fascinating insights into what can and cannot be inferred from statistical data, given that real data are always noisy to one degree or another. It has chapters with titles like “Are you smarter than a television pundit?” and “Desperately seeking signal.”

Silver talks about the horrendously bad predictions made by financial institutions and ratings agencies in the lead-up to the financial collapse in 2008, and he goes into some detail about the mistakes that led to those bad predictions. He discusses the terrible forecasting record of most economists, and the importance of communicating one’s uncertainty around any prediction in the form of a margin of error. In one of Silver’s examples, when the Red River was approaching flood stage in 1997, the town of Grand Forks, ND was told that the river would crest at 49 feet. No problem; the levees were 51 feet high. But, of course, there was a problem: nobody knew what the river’s maximum height would actually be when it crested. There was a range of possible heights for the crest of the river; 49 feet was just the average. Based on the accuracy of past predictions, there should have been a margin of error of about +/- 9 feet reported along with that estimate. But the average was the only value that was communicated, so the townspeople took it at face value. The river actually crested at 54 feet – well within the forecast range – but with disastrous results. Yet another example of “The Flaw of Averages,” as Sam Savage calls it.
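To see why reporting only the average was so dangerous, here is a quick back-of-the-envelope sketch (my illustration, not Silver’s calculation). It assumes the forecast error is roughly normally distributed and – purely as an assumption for the sake of the example – treats the +/- 9 feet as one standard deviation around the 49-foot point forecast:

```python
from math import erf, sqrt

def normal_cdf(x, mean, sd):
    """Cumulative normal probability, computed via the error function."""
    return 0.5 * (1.0 + erf((x - mean) / (sd * sqrt(2.0))))

forecast_crest = 49.0   # point forecast, in feet
levee_height = 51.0     # height of the Grand Forks levees, in feet
sd = 9.0                # assumption: treat the +/- 9 ft margin as one std. dev.

# Probability that the actual crest tops the levees despite the 49 ft forecast
p_overtop = 1.0 - normal_cdf(levee_height, forecast_crest, sd)
print(f"P(crest > {levee_height:.0f} ft) = {p_overtop:.0%}")
```

Under that assumption the levees had roughly a two-in-five chance of being overtopped – hardly the “no problem” the bare average suggested. The exact number depends on how you interpret the margin of error, but the point stands under any reasonable interpretation: the decision looks completely different once the spread is communicated along with the mean.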

As you might imagine from the title of Silver’s book, a number of the errors discussed come from the difficulty of distinguishing signal (trends in the data which come from real-world cause-and-effect relationships) from noise (spreads or apparent patterns in the data which result from random variability, chance, or irrelevant factors). I’ve used some of Silver’s wisdom in my ongoing discussion with Dave Charlesworth in this space regarding global warming and the merits of a carbon tax. In fact, Silver’s chapter on the subject of global warming, “A climate of healthy skepticism,” is a terrific summary of how people’s misinterpretation of climate data leads them to draw incorrect conclusions about the problem.
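How easily does noise masquerade as signal? Here is a toy demonstration of my own (not from Silver’s book): generate one random target series, then screen hundreds of equally random “predictor” series against it. With enough candidates, some purely chance correlation will look impressively strong – exactly the trap that catches people mining large datasets for patterns.

```python
import random

random.seed(42)  # fixed seed so the demonstration is repeatable

def corr(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

n_points = 20
target = [random.gauss(0, 1) for _ in range(n_points)]

# Screen 500 random series against the target; none has any real relationship.
best = max(
    abs(corr(target, [random.gauss(0, 1) for _ in range(n_points)]))
    for _ in range(500)
)
print(f"Strongest correlation among 500 random series: {best:.2f}")
```

The winning series correlates strongly with the target by chance alone. If you only ever see the winner, it looks like signal; seen against the 499 losers, it is obviously noise.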

Certainly, I do not expect every business executive to become as statistics-savvy as Nate Silver. C-level people are busy men and women, and more to the point, they have people working for them who can do the kinds of analyses that Silver does. But executives need to be able to interpret and use the outputs of those analyses intelligently and appropriately, and they need to know what questions to ask. Failing to acquire even that rudimentary level of skill in an area as important as data interpretation is an abdication of responsibility.

However, learning the rules of statistics is the easy part. The real goal is to think probabilistically. I like to say that at Decision Strategies, we change the way our clients see their world. It’s extremely satisfying and fun to bring into focus the “cone of uncertainty” that extends into the future for someone and their company, to help them to see in the multi-dimensional space that is uncertainty. It’s like taking someone to a 3-D movie for the first time.

This is why teaching training courses is my favorite part of my job. I really enjoy getting people to see things in new ways. (Shameless plug: Decision Strategies has a training course in Integrated Decision Management coming up on Feb. 21-22. I’m not teaching this one because of a schedule conflict, but my colleagues are very good instructors.)

One quick caveat: quantitative analysis and probabilistic thinking are necessary, but not sufficient. No mathematical model, no matter how sophisticated, replaces the need for experience, judgment, and framing the problem correctly in the first place (this is why our course emphasizes framing, and just introduces participants to probabilistic analysis). Models simply provide insights that you’re not going to get otherwise. Once you have these insights, it’s still up to you to decide what course of action to take.

But you’re not going to get those insights if you don’t know how to use probabilistic results appropriately. Whether you attend a training course, read a book on the subject, or spend some time learning from your local statistics geek, I encourage everyone to start thinking and seeing probabilistically. It makes for a very interesting view.
