How to think about preparation for a crisis

published Mar 25, 2020, last modified Feb 01, 2022

A thread

(original source; this version contains additional edits)

Epistemology gets more important in times like this. A lot of people fall back on normal-times habits:

  • trusting authoritative sources
  • assuming that anything in a scientific paper is true, that anything from common sense is false, and that action must wait on good evidence

But, actually:

  • authoritative sources have bad incentives around the information they release: "whoops, we didn't refill the mask and PPE reserve, better tell people that masks don't work" (this happened)
  • most authoritative people's emotions make them refuse to consider bad outcomes: "get a grippe, America, the flu is a much bigger threat" (yes, this happened too)
  • many authoritative sources have a vested interest in preserving their wealth, and admission that there's a crisis underway would endanger that: "oh no, not my stockerinoooos"
  • scientific papers are subject to a number of very serious biases, and the standards for publication are very low nowadays

Thus, your decision rule must be Bayesian: do not wait until you are 90% certain that something is true before acting on it, because by then it will be too late.  Work out the expected utility of each action; under that framing, even a 20% probability can be enough to act on.

For example:

  • If you assigned a 20% probability to this SARS-CoV-2 crisis getting bad in January, that was enough to go do prepping.
  • If you assigned a 20% chance to the authorities lying about masks, and a mask cost just $5 back in January, you probably should have bought one, on expected-utility grounds alone (see the sketch after this list).
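
To make that expected-utility arithmetic concrete, here is a minimal sketch in Python. The $5 mask price comes from the example above; the probability and the $100 "value if the crisis hits" figure are illustrative assumptions, not numbers from the thread.

```python
# Minimal expected-utility sketch for the January mask decision.
# The mask cost ($5) is from the example above; the other numbers
# are illustrative assumptions.

def expected_utility_of_buying(p_crisis: float,
                               mask_cost: float,
                               value_if_needed: float) -> float:
    """Expected net benefit of buying a mask now.

    p_crisis:        probability the crisis gets bad enough that you need it
    mask_cost:       what the mask costs today (e.g. $5 in January)
    value_if_needed: what having the mask is worth to you in a bad scenario
                     (protection, or simply not paying panic prices later)
    """
    return p_crisis * value_if_needed - mask_cost

# Even a 20% probability justifies the purchase if the mask would be
# worth, say, $100 to you in a bad scenario: 0.2 * 100 - 5 = +15.
print(expected_utility_of_buying(p_crisis=0.20, mask_cost=5.0, value_if_needed=100.0))
```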

Another pitfall is throwing away disposable PPE (like masks) rather than doing your best to sterilize it and reuse it.  Just because the rules say it's disposable doesn't mean you have to dispose of it.  30 minutes at 50 °C is enough to kill any SARS-CoV-2 that might have landed on your mask.

Yet another rule-following pitfall is authorities reassuring you that the food/water supply won't fail.  Their incentives are not aligned with yours.  If they had any information indicating that the water or food supply might fail, they would categorically not tell you.  So, when (in a crisis) the authorities say that everything's going to be OK and we'll pull through, that statement is almost worthless.  To be clear, food and water are a priori unlikely to fail in this crisis, and in my model of a disease outbreak they will probably not fail.  But the thing to watch is the model, not the announcements.

Bad times favor non-sheeplike thought, which means you need to be careful about what you believe and why.