Wednesday, May 5, 2010

The trouble with RISK...

Unless you have been under a rock lately, you have heard of the two biggest events in American (well, technically BP is British) business since the recession began. First, Goldman Sachs and the lads there engaged in some legal but unethical practices. Second, BP managed to sink the Deepwater Horizon oil rig in the Gulf of Mexico and recreate the Exxon Valdez incident, only on a much grander scale (I swear, if this accident turns out to be the result of a drunken individual, all oil companies should require mandatory twice-daily breathalyzer tests). However, while there are countless ways these events are different, there is one major way in which they are similar. They are similar because of the risk.

Now we can all agree that drilling for oil two miles below the surface of the ocean, or gulf as it is in this case, is decidedly risky. And we should all know by now that investing in anything is also inherently risky. However, knowing something is risky and truly understanding the risk that goes along with it are two entirely different things. The difference between knowing and understanding risk should matter to anyone owning, managing, or operating any business, because assuming you have reduced all the risks, or assuming an event won't happen, leaves you no room to maneuver if it does.

As an example, I will use the NASA model of successful risk management... there is no absolute, there is only probability and possibility. They launch people into space knowing that 1) they may never make it, 2) they make it, but something goes wrong up there, 3) they make it, but then they can't be brought back (see the Columbia tragedy), 4) they make it, they make it back, but something goes wrong down here (see Fantastic Four), or 5) they make it and they make it back safely. Those are all the big-picture risks that exist in manned space exploration, which by my simple math (treating each outcome as equally likely) equates to an 80% chance of complete failure. Would your company take on any project that had only a 20% chance of success? How, then, does NASA or anyone operate successfully within such a risky environment? I argue they are successful because they understand risk to be inevitable and they have a reaction plan for everything.
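For the curious, the "simple math" above can be sketched in a few lines. To be clear, the equal-likelihood assumption is mine (and deliberately naive), and the outcome list is just the five scenarios from this post, not real NASA figures:

```python
# Naive tally of the five launch outcomes listed above.
# Treating each as equally likely is an illustrative assumption,
# not an actual NASA risk model.
outcomes = [
    ("never make it", False),
    ("make it, but something goes wrong up there", False),
    ("make it, but can't be brought back", False),
    ("make it back, but something goes wrong down here", False),
    ("make it there and back safely", True),
]

failures = sum(1 for _, success in outcomes if not success)
failure_rate = failures / len(outcomes)
print(f"naive failure rate: {failure_rate:.0%}")  # 4 of 5 outcomes -> 80%
```

The point, of course, is that real risk management replaces the naive equal weighting with honest probabilities and, more importantly, a reaction plan for every branch.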

Therefore, in order to move beyond knowing the risks and toward understanding them, there are three things to do, which apply to EVERY business decision:

1) Create a plan to mitigate the risk of everything imaginable. Really and truly: think Independence Day alien invasion, think hurricanes in Iowa, blizzards in Egypt, and meteor impacts. Once you are done, think smaller: think tornadoes, think power outages, think floods like Nashville. Once that is done, think even smaller: think CEO heart attack, think computer viruses, think office fire. And so on and so on, until you think you have covered it all. Then, once you are done with that, reduce the risks from the minutiae on up through alien invasion. The risks associated with a fire, a flood, and a meteor strike are likely to be similar (building location, for example) but not the same, so by starting small you build a base of mitigation for larger events. The most important thing to remember when doing this is that you will miss something. You might mitigate the risk of an office fire by installing a fire suppression system, but what if the system is down for maintenance? That should help you see the importance of step 2.
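The minutiae-up approach in step 1 can be sketched as a simple risk register. Every entry here is a hypothetical example of mine, including the "residual" column, which records exactly the gap (the thing you will miss) that step 2 exists to catch:

```python
# A minimal, hypothetical risk register: each risk gets a mitigation
# plus the residual exposure that mitigation still leaves open.
risk_register = [
    {"risk": "office fire", "scale": "small",
     "mitigation": "fire suppression system",
     "residual": "system down for maintenance"},
    {"risk": "regional flood", "scale": "medium",
     "mitigation": "offsite data backup",
     "residual": "backup site sits in the same flood plain"},
    {"risk": "meteor strike", "scale": "large",
     "mitigation": "geographically separate second office",
     "residual": "key people all at one site that day"},
]

# Work from the minutiae up: smaller mitigations (building location,
# backups) become the base for the larger, overlapping events.
order = ["small", "medium", "large"]
for entry in sorted(risk_register, key=lambda e: order.index(e["scale"])):
    print(f"{entry['risk']}: mitigate via {entry['mitigation']} "
          f"(still exposed if: {entry['residual']})")
```

The residual column is the whole trick: it forces you to admit, in writing, that every mitigation has a failure mode of its own.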

2) Create a plan for what happens after the aliens invade, the fire burns down the building, and your CEO chokes on a peanut M&M. By doing this you allow for the possibility of something you didn't think of happening, and you have a plan of action in response. This is what is known in the biz as a business continuity plan; however, that sounds like a one-and-done sort of thing, which isn't helpful. I like to consider it more like a base set of ideas for what to do should the CEO not die, but merely be incapacitated. Now, this is where semantics takes over and a lot of people give up on risk analysis: if the CEO is incapacitated but not dead, then we should do this; but if he is that, then we should do this; etc. When aspects of risk begin to delve into semantics, reconsider NASA: only life or death defines success and failure for them, and you should have a similarly big-picture definition of success and failure. This way you won't bog down in the infinite semantics of "what if."

**Update: in items 1 and 2, be sure to factor in the human elements of risk, since risk plays on your greatest corporate capital, the employee. As this post points out, how you manage employees in a post-calamity environment is extremely important. Nothing would be worse than a risk-related event coming full circle: losing the employees best equipped with the knowledge and skill to improve upon whatever brought about the risk event in the first place.

3) Review steps 1 and 2 often. As Goldman Sachs found, popular opinion changed. As BP found out, the technically impossible happened. If you review steps 1 and 2 at least twice a year, you won't eliminate the risk of the epic fail, but you will reduce its probability and its effect.


  1. NASA made the shuttle with the lowest tenders and cheapest equipment, and you wonder why things go wrong?

  2. Undoubtedly... to me, the bidding process should be included in the risk assessment. The same can be said for BP and the parts and equipment they use. And well... Goldman doesn't really have a dog in this fight, but the NY Stock Exchange does, as evidenced by the "mysterious" glitch that eliminated billions in wealth due to shadowy algorithmic trading.

    Great comment!