Thursday, May 10, 2007

BP's Black Swan

I was happy to discover Nassim Nicholas Taleb's new book waiting for me when I arrived home yesterday evening. Although I've recently been experimenting with vignettes like this mini-blog entry (I'm standing on the train, scribing with my thumbs, so please forgive any errors or incoherence), I think I'd rather have something to read.

I heard an interview with the author on EconTalk this weekend and ordered the book on Sunday. Very nice service from Amazon, if a bit pricey.

Recently there has been some indignation, mostly in the yellow press, alleging that Lord Browne callously decided £10m to be the value of a human life. I suppose some would say that human life is priceless, but this conveniently ignores the fact that industrial accidents do occur, and that they have a financial impact which must be considered.

More to the point is that BP's analysis of the likely consequences of an industrial accident was an utter failure. The reason for this grossly uninformed statement takes us back to the subject of Taleb's book: the vast majority of industrial accidents result in little or no loss of life. A large firm collecting data on the impact of these events would reasonably expect a large explosion in a major refinery in a wealthy industrial centre to be quite unlikely. So unlikely, in fact, that we can be fairly sure it was unanticipated, in spite of widely available reports of dangers at the affected plant.

I am sure this sounds improbable to most people. Why would a large company rely purely on a statistical model to account for risk, and ignore the widely reported concerns of its own employees? From a purely technical perspective, it's all just data.

The crux of the matter is that most risk models based on statistical assumptions are (for lack of a better word) wrong. They assume that the probability of an event occurring in the future can easily be derived as a function of past occurrences. Even if the concerns of employees could be quantified in a meaningful way, the risk calculations would become intractable.
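The flaw is easy to see in a toy sketch (the numbers below are entirely hypothetical, not BP's actual data): a purely frequency-based estimate assigns probability zero to any event that is absent from the historical record, no matter how loud the qualitative warnings are.

```python
# Hypothetical annual fatality counts over 30 years at a plant:
# mostly uneventful years, a couple of minor incidents,
# and no catastrophe anywhere in the record.
history = [0] * 27 + [1, 0, 2]

def empirical_probability(history, threshold):
    """Fraction of past years in which losses met or exceeded threshold."""
    return sum(1 for x in history if x >= threshold) / len(history)

# A minor incident (at least 1 fatality) looks estimable: 2 of 30 years.
print(empirical_probability(history, 1))

# A catastrophe (say, 15 or more fatalities) gets probability exactly
# zero, purely because it has never happened before.
print(empirical_probability(history, 15))
```

The model isn't lying about the past; it simply has no way to assign weight to an event outside its sample, which is precisely Taleb's black swan.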

So maybe the indignant voices have a point after all. Putting too much faith in mathematical models is foolhardy: many improbable events are more likely than any forecast suggests, if only because they've never happened in the past. The black swan may be a tired example, but it's easy to understand.

More on Taleb's book once I've actually read it...

1 comment:

Graeme Burnett said...

It's the classic quandary - how do we enumerate the subjective - applied, in this case, to the unpalatable: the value of a human life.

Every day we build models to segment our world based on single-digit numbers - 7 apparently being the maximum number of simultaneous objects with which the mind can easily cope.

An example is hierarchy - in particular, ontologies - but I'll cover that in a forthcoming post on news analysis and how people store and retrieve information. I'll hypothesise that 4 is the optimal level of hierarchical decomposition.