
Worst unsurprising case

It's good advice when planning something to try to think about and head off worst-case scenarios before they happen. If your event is outdoors, you should think about rain. If you're inviting an important keynote speaker, have a plan for if they bail at the last minute. If you're depending on something, you should be prepared to not have it. But how far do you take this? At a certain point you end up worrying about hash collisions and wolf attacks, and that's just pointless.

So, sure, your event could be cancelled because of a fluke meteor strike or your speaker could have a heart attack during the presentation, but most people don't make plans for those situations. And usually when people say "worst-case scenario" they either explicitly or implicitly indicate it's not the worst worst case, which would presumably involve brimstone in some way, but the worst reasonable case. The worst case that isn't just ridiculous. That's a bit wishy-washy for me, though, so I'd like to suggest using the worst unsurprising case.

Surprising means something that catches you completely off-guard. A meteor strike is surprising. But one of your team suddenly getting sick? Basically expected. However, things can be unexpected and still not surprising. If you drive a car you don't expect to get in an accident, but I also wouldn't call it surprising. Your first reaction to someone getting in a car accident is "how terrible", not "how could this possibly have happened?" Surprising isn't when your server loses power; it's when it loses power at the same time as your redundant backup in a different location, and the third-party monitoring system you set up to catch exactly that mysteriously fails too.
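To put rough numbers on why that last combination feels surprising, here's a back-of-envelope sketch. The per-day failure rates are invented purely for illustration, and real-world failures are rarely this independent, but the shape of the arithmetic is the point:

    # Back-of-envelope sketch: why three simultaneous, independent failures feel surprising.
    # These per-day failure probabilities are made up purely for illustration.

    p_primary = 1 / 1000      # primary server loses power on a given day
    p_backup = 1 / 1000       # redundant backup fails on the same day
    p_monitoring = 1 / 1000   # monitoring silently fails on the same day

    # If the failures really are independent, the chances multiply.
    p_all_at_once = p_primary * p_backup * p_monitoring

    print(f"Any single failure: about 1 in {1 / p_primary:,.0f} days")
    print(f"All three at once:  about 1 in {1 / p_all_at_once:,.0f} days")

The exact figures don't matter; what matters is that independent failures compound, which is why any one of them is merely unexpected while all of them at once lands squarely in "surprising".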

You could reasonably point out that what constitutes a surprise is subjective and ill-defined, but I think that is actually a feature of this way of thinking about risk. Perhaps for NASA, nearly any failure is surprising. Their culture has them thinking a lot about really weird risks, up to and including the impact of aliens on society. Conversely, there are times when severe failures are totally unsurprising, to the point where many people involved in a project know it's doomed to fail.

Which is to say that if the way people respond to risk is cultural, perhaps it's not so strange to assess risk culturally too. Nobody's going to take risk management seriously if they're doing it to prevent a mass wolf attack or something equally ridiculous. The worst unsurprising case lets you address the level of risk that people actually give credence to, either so you know what to plan for, or so you know when they're not being imaginative enough.