25 Apr 2020

Analysing uncertainty – TMI

“These days everyone has the same data regarding the present and the same ignorance regarding the future.” Howard Marks


I’m fascinated by this talk on Three Mile Island (TMI) and the different stories around what caused the disaster. Please do watch the whole thing. You’ll thank me, I promise!

Who, or what, was to blame for TMI? Our minds are wired to think in terms of “cause and effect”, but the reality is that when an accident happens in a nuclear power station, a combination of improbable failures has to occur to cause a crisis. In hindsight these “causes” look both inevitable and preventable (the pilot-operated relief valve had been leaking almost since the first day the plant was operational).

Too Much Information

It’s hard to put ourselves in the place of the people making decisions at the time. How could the nuclear plant operators miss the sump alarm? Well, there were 600 alarm lights in total. On any given day, with the plant running exactly as designed, 40 to 50 of these alarms would be flashing a warning. One cause of the disaster at Three Mile Island (TMI) was Too Much Information (TMI).

The component parts of uncertainty

The word “analysis” comes from the Ancient Greek ἀνάλυσις, meaning to break (lusis) something up (ana) into smaller parts to gain a better understanding. Intriguingly, people have suggested that “uncertainty” can be broken up into at least two different types: aleatory uncertainty (the chance of an as-yet-undetermined event) and epistemic uncertainty (personal ignorance of something already determined). David Spiegelhalter gives the example of a lottery ticket or a dice throw (alea = dice in Latin) for the first type, and a scratch card (episteme = knowledge in Greek) for the second, where the outcome is already decided; we just don’t know what it is. Aleatory uncertainty you can’t know, but epistemic uncertainty you can try to figure out.
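If it helps to see the distinction in code, here is a toy Python sketch (the card class and names are purely my own illustration, not Spiegelhalter’s):

```python
import random

# Aleatory: the outcome does not exist yet. Nothing you inspect
# before the throw can tell you how the dice will land.
def dice_throw():
    return random.randint(1, 6)

# Epistemic: the outcome is already fixed; only our knowledge is
# missing, and gathering information (scratching) removes it.
class ScratchCard:
    def __init__(self):
        self._prize = random.choice(["win", "lose"])  # decided when the card is printed

    def scratch(self):
        return self._prize  # looking at it resolves the uncertainty completely

card = ScratchCard()   # the prize is already determined at this point
print(card.scratch())  # we only learn it when we look
print(dice_throw())    # genuinely undecided until the throw happens
```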

Dark Number

The Germans have a word for describing epistemic uncertainty: die Dunkelziffer. Literally “dark number”, but a better translation is “the estimated number of unknown or unreported cases”. Events have happened in the past, but they may not have been recorded consistently (differences in mark-to-market accounting, causes of death on a death certificate, and reported crime figures are all examples).

Equity investors have to live with aleatory uncertainty: unpredictable events. But I think that it is possible to develop some sort of advantage with a better understanding of epistemic uncertainty, where an event has already happened and investors have to figure out the correct interpretation of reported results. Epistemic uncertainty is what the nuclear engineers at Three Mile Island were dealing with. They knew something had already gone wrong; they were sitting in the control room at 5am trying to figure out what to do next, knowing that good decision making could prevent a disaster. But they failed because they couldn’t work out which improbable things had already happened. They were using the wrong models in their heads (specifically, they had been trained on nuclear submarines and were worried about the reactor “going solid”, which is less relevant for a 2,840 megawatt power plant than for a 12 megawatt submarine).

Dealing with epistemic uncertainty 

More information is only helpful if it reduces uncertainty. Models are only useful if they represent reality in an insightful way. When dealing with epistemic uncertainty, very often a simple model or heuristic is more insightful than a more complicated model. That is because a more complicated model is often overfitted to a past that is no longer relevant – there is a bias/variance tradeoff. Gerd Gigerenzer has written about this a lot.
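You can see the tradeoff with a toy Python sketch (the data and polynomial degrees here are purely illustrative): a wiggly high-degree polynomial fits the past better than a straight line, then does worse on fresh data from the same process.

```python
import numpy as np

rng = np.random.default_rng(0)

# A noisy linear "reality": the past we get to fit to.
x_train = np.linspace(0, 1, 15)
y_train = 2 * x_train + rng.normal(0, 0.3, x_train.size)

# Fresh draws from the same process: the future we care about.
x_test = np.linspace(0, 1, 100)
y_test = 2 * x_test + rng.normal(0, 0.3, x_test.size)

for degree in (1, 10):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```

On most runs the degree-10 fit shows the lower training error and the higher test error: it has memorised noise that will not repeat.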

Heuristics for reducing uncertainty

So, at the moment, here are some heuristics I am using:

Past crises have taken longer to resolve than was anticipated at the start (true of the GFC 07-08 and the TMT collapse 01-03, but also the 1970s oil shock)
Be very careful of throwing good money after bad and avoid the risk of ruin (if bank investors had taken up the rights issues in 2008, they would have lost more than their initial investment)
Things will go wrong that have not gone wrong before (before 2007 there hadn’t been a bank run on a modern-day UK bank, but just because something has not happened before doesn’t mean it is impossible; the oil price going negative is another example)
Be sceptical of complicated models and those who use them to justify poor decisions (remember David Viniar, CFO of Goldman Sachs, claiming in 2007 that the models showed 25-sigma events several days in a row – see the quick calculation after this list for just how absurd that is)
NN Taleb will get annoyed with people who call such disasters a “black swan”, because he will claim that a disaster was inevitable given a long enough time frame (OK, I cheated. This has already happened)
There will be opportunities as well as risks (I’m still kicking myself for not investing in Rightmove, which was below 20p in Dec 2008. The shares peaked at 684p earlier this year. A 35 bagger!)
Watch what Buffett does (he has pretty much the same information as the rest of us, but he is superb at interpreting that information)
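Back to the Viniar point above: a quick, back-of-the-envelope calculation (assuming the normal distribution his risk models did) shows why “25 sigma, several days in a row” tells you the model is broken, not that the world is:

```python
import math

# P(Z > 25) for a standard normal: the daily probability the
# models implied for a single 25-sigma move.
p = 0.5 * math.erfc(25 / math.sqrt(2))
print(p)  # ~3e-138

# With ~252 trading days a year, the expected wait for even one
# such day is roughly 1/(p * 252) years...
print(1 / (p * 252))  # ~1e135 years, vastly longer than the age of the universe
```

If your model says the last few days were each that improbable, the problem is the model.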

Stay safe

Photo by Dan Meyers on Unsplash