Missing Flights
There’s this maxim that sometimes comes up in rationalist circles:
The optimal number of missed flights is non-zero.
It's a fun one: a little contrarian and unintuitive, but well within the Overton window, so you can say it at a dinner party.
You can say it about other things and it turns into a bit of a meme:
The optimal number of foods that spoil is non-zero
The optimal number of bad glasses of wine is non-zero
The optimal number of bad hair days is non-zero
The optimal number of times worrying about what others think of you is non-zero
The optimal number of people that you’ve annoyed by referencing this style of thinking is non-zero
It sort of catches people out, because it sounds like you're saying that you like spoiled food and bad wine and missing flights, which is obviously absurd.
Rather, what you're saying is that living in a way that ensures these things never happen comes at a cost, and this cost might outweigh the cost of food occasionally spoiling or wine tasting bad or even missing a flight.
I was thinking about this maxim last week as I was forking over $400 for a new domestic flight and canceling evening plans with my friends while waiting around the airport for five hours.
Did I finally earn my rationalist card? If I miss one more flight will I have to hand it back?
What would Chigurh tell me?
A rationalist would amiably remind Chigurh of outcome bias: missing this particular flight isn't necessarily proof of a bad rule. If you take 1,000 flights, you should expect to miss some. This might be that some. Arriving early enough to ensure you never miss a flight might be more costly overall.
(I wouldn’t mention the bad hair day example to make the point.)
Shall we stick some made-up numbers on it? Let's say missing a flight costs you $1,000 of disutility, and getting to the airport an hour early costs you $10 of disutility. In an expected value framework, you should arrive an additional hour earlier if it reduces the chance of missing the flight by more than 1% (i.e., 0.01 × 1,000 = 10). At some point there's no margin to be gained and you should take the risk.
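To make the tradeoff concrete, here's a minimal sketch of that expected value calculation. The costs are the made-up numbers above; the miss probabilities per buffer hour are invented purely for illustration.

```python
# Made-up numbers from the text: missing a flight costs 1,000 units of
# disutility, and each extra hour spent at the airport costs 10.
MISS_COST = 1000
HOUR_COST = 10

# Hypothetical probability of missing the flight given N buffer hours.
p_miss = {0: 0.20, 1: 0.04, 2: 0.005, 3: 0.001, 4: 0.0005}

def total_cost(hours):
    """Expected disutility: waiting cost plus expected cost of a miss."""
    return hours * HOUR_COST + p_miss[hours] * MISS_COST

# The best buffer is where an extra hour no longer buys enough risk
# reduction to pay for itself (less than a 1% drop in miss probability).
best = min(p_miss, key=total_cost)
print(best)  # 2: going from 2 to 3 hours only cuts the risk by 0.4%
```

With these invented probabilities, two hours minimizes expected disutility; beyond that there's no margin to be gained, exactly as the 1% threshold predicts.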
(I’m aware that Deutschians are screaming at me for using probabilities and utility when reasoning about real life. I hear you. Let’s just roll with it for now.)
So if you can figure out what those made up numbers should actually be, then you can figure out when to get to the airport and how many flights you should be missing over your life.
People's utilities differ, of course. If you're a neurotic mess, you may want to arrive neurotically early. If you don't mind hanging around airports, because you'd be reading or scrolling Twitter for a few hours at home anyway, you may want to arrive calmly early.
The calculation still holds: At some point there’s no margin to be gained.
Payoffs will be different for morning flights and afternoon flights. Getting up at 4am to eliminate risk sucks a lot worse than arriving earlier than usual in the afternoon.
As noted, we can apply this style of thinking to other areas. Missing flights is just a provocative and memorable example, as it's so obviously, incredibly annoying to miss your flight. Scott Aaronson lists a few[1]:
If you never cut yourself while shaving, you’re not shaving close enough.
If you’ve never been rejected, you’re not asking enough.
If you’ve never regretted a blog entry, your blog is boring.
You get the idea. In many areas the default is to be too risk averse. Not only that, but hindsight bias may lead us to justify this risk aversion when something bad does happen.
Social problems
It can also be applied in the social realm. My favorite formulation of this was the comedian David Mitchell playing a politician complaining that there were zero drownings in his region that year.
Zero drownings implies that too much was spent on fencing, warning signs, swimming lessons, and waterside paths. Proper priorities would mean at least a few drownings each year.
It makes explicit something that is usually taboo: we often make tradeoffs that require a non-zero amount of a bad thing. Most people's contender for the worst kind of bad thing is people dying. Sometimes we require even that; we just don't like thinking about it. Unfortunately for us, the optimal amount of thinking about bad things is also non-zero.
The comic skit feels more concrete for many of us post-2020: the optimal number of COVID deaths is non-zero. This was controversial to say at the start of the pandemic: people's lives are a quintessential example of a Tetlockian taboo tradeoff.
Risk vs expected outcomes
Let’s stop thinking about people dying and jump back to those pesky individual decisions.
I think there's an additional source of confusion on top of the whole "it sounds like you prefer it when bad things happen" issue. Namely, it's easy to conflate the optimal risk of something happening with the optimal expected outcome.
For instance, suppose the optimal rate of missed flights is one in every 200 flights. Someone who takes only 10 flights should not expect to miss any of them; someone who takes 500 flights should expect to miss a couple. Basic math. Obvious enough.
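That arithmetic can be sketched in a couple of lines, using the illustrative one-in-200 rate:

```python
# Toy illustration: with an optimal miss rate of 1 in 200 flights,
# the expected number of misses scales with how many flights you take.
OPTIMAL_MISS_RATE = 1 / 200

def expected_misses(n_flights):
    """Expected number of missed flights over n_flights trips."""
    return n_flights * OPTIMAL_MISS_RATE

print(expected_misses(10))   # 0.05: rounds down to zero missed flights
print(expected_misses(500))  # 2.5: you should expect to miss a couple
```

Same optimal risk, very different expected outcomes depending on how often you fly.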
In most of the “optimal number of [bad thing] is non-zero” scenarios our expected number of [bad thing] is non-zero. For instance, we should expect to be rejected by someone at some point. We should expect our food to spoil. These things not happening is a bad sign: A life lived too cautiously.
However, there are some examples where the optimal amount of risk should be non-zero but the optimal expected number of times it happens for a particular person, even over a lifetime, actually is zero.
Scott Aaronson had another example in his list:
If you’ve never been robbed, you’re spending too much time locking doors.
I’m not so sure about this. Sure, if you’ve eliminated all risk of being robbed - if that’s even possible - then you’ve put too much effort into this problem. It’s sort of like an inverse paperclip maximizer: You’d have to kill everyone to eliminate any risk of being robbed.
We should obviously take precautions to reduce the risk of being robbed. But it's not clear to me that the optimal risk isn't low enough that we should expect never to be robbed. To reiterate: the risk of being robbed is non-zero, and some unlucky people will get robbed despite taking these precautions. But the risk can be so low that any particular person should expect zero robberies to happen to them.
Again, it'd be hard to calculate these things in real life (made-up numbers and all that), but the point I'm making is that a non-zero optimal risk can be consistent with an expected outcome that rounds down to zero. And that the optimal number of [bad thing]s happening is sometimes actually just plain-old-not-going-to-get-any-funny-looks-at-the-dinner-party-when-you-bring-it-up zero.
I don’t know what the true optimum of missed flights is. My hunch is it’s non-zero for the average person. Unlike robberies, a life well-lived has some missed flights. I say this despite the recency bias of being slouched in my chair in the departure lounge staring blankly out the window at the runway for several hours.
If I’m wrong, at least I know that the optimal number of regrettable essays is non-zero.
[1] I didn't realise how old the post was: Aaronson's been consciously missing flights for 20 years.