Why We Suck at Risk Analysis
I’m a numbers guy. My master’s degree is in math, and even
though it didn’t specifically have a concentration in statistics, about half my
courses were related to it. I also don’t enjoy flying (ok, that’s underselling
it; I’m afraid of flying), so whenever I sit down in that cramped seat that has
somehow gotten even smaller, I remind myself that the number of fatalities per
mile driven is about twice the number per mile flown.
If you look at injuries, it’s even worse for the drivers. Nevertheless, my
hands are sweaty, gripping the armrest as the plane comes in for landing,
something I rarely feel while driving. Why is this? It basically comes down to
the fact that we (humans) suck at evaluating relative risks: figuring
out if one action is riskier than another.
This has come up many times in considering vaccinations and other
medical treatments, and there are a number of psychological reasons this is the
case. Knowing this does not always make us feel better (I still get
nervous on flights), but we can at least be aware of it when we make decisions.
The first
thing that messes us up is that we often look at risks in only one direction:
“What is the risk of X compared to doing nothing?” I know that flying to a different
city is more dangerous than staying home, but often I want to be in that
different city, so I compare flying to driving.
Likewise, we should also consider the alternatives whenever we choose a
medical intervention: “What is the risk of doing X vs. the risk of doing
the alternative?” I’ll be honest:
there are sometimes adverse reactions to vaccinations, though they are rarely serious,
so yes, the danger is not 0. However, even
with many diseases close to eradication, you are usually much more likely to
get that disease, and all the serious effects from it, than have a serious
adverse reaction to the vaccine. To give
another example, when I was in high school, I had a pre-cancerous mole removed
from my back. There was a non-zero
probability I would have a serious reaction to the anesthesia or other
complications, but this was dwarfed by the (still rather low) probability the
mole would cause a problem later in life.
So, I decided to cut that sucker off.
Related to
this is when we look at the number of good or bad things that happen to a group
without comparing it to the number that happen to those not in the
group. One scary report I saw said that over 1,000 patients died within 48
hours of getting the Covid-19 vaccine between December 2020 and April 15th,
2021. Holy crap, that is scary isn’t it? I researched it expecting it to be
false, but it turns out it was true. Holy crap, again! But then I remembered
how many people are getting vaccinated, literally millions, 80 million fully
vaccinated to be precise, with another 70 million who are partially
vaccinated. Running through the numbers
(converting deaths per year to expected deaths in any 2-day period for that
many people; I could show you the math, but I expect I would lose the
three readers I have left), I came up with an expected number of over 10,000.
Hey, that’s a completely different story: I could just as well argue that the
vaccine lowers your risk of dying over the next two days to 1/10 of what it would otherwise be.
This is, again, too simplistic, because we are specifically looking at people
healthy enough to be vaccinated, but it goes to show how the raw numbers are
just part of the story.
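If you want to see the kind of back-of-the-envelope math involved, here is a sketch with round numbers (my own assumed figures, not the exact ones from the report): estimate the all-cause deaths you’d expect in any 48-hour window among 150 million people. This crude version already lands in the thousands; accounting for the fact that early vaccine recipients skewed older (with higher death rates) pushes it higher still.

```python
# Rough base-rate check with assumed round numbers: how many deaths
# would we *expect* in any 48-hour window among ~150 million people,
# from all causes, vaccine or no vaccine?

US_DEATHS_PER_YEAR = 2_850_000   # approximate all-cause US deaths per year
US_POPULATION = 331_000_000      # approximate US population
VACCINATED = 150_000_000         # ~80M fully + ~70M partially vaccinated
WINDOW_DAYS = 2

# Crude all-cause death rate per person per day
daily_death_rate = US_DEATHS_PER_YEAR / US_POPULATION / 365

expected_deaths = VACCINATED * daily_death_rate * WINDOW_DAYS
print(f"Expected deaths in any {WINDOW_DAYS}-day window: {expected_deaths:,.0f}")
```

Even with these crude assumptions, the expected count dwarfs the 1,000 deaths the scary report counted.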
Another
issue I have noticed is that we seem to downplay somewhat-low risks while inflating really
low ones. (Humans are nothing if not consistently inconsistent.) For instance, 5%,
or 5 in 100, seems extremely low, but becomes “larger” when you look at a
group. People have been asking, “Why do
we need to continue to wear facemasks after being vaccinated? The vaccine is 95% effective!” Well, imagine
500 people vaccinated who become exposed, we would expect around 5% of them, or
25, to be infected. So until we have herd immunity (hmmm…maybe that can be a
topic for a later blog) we should still be careful. In addition, we shouldn’t
be surprised when people who are fully vaccinated become infected. We should expect it, even though it is
sometimes disheartening. After all, as the XKCD to the right shows, if you do
something enough, something unlikely is bound to occur.
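The arithmetic in that 500-person example is simple enough to sketch (assuming the simplified reading that “95% effective” leaves each exposed, vaccinated person a 5% chance of infection):

```python
exposed = 500
efficacy = 0.95

# Expected number of breakthrough infections among the exposed group
expected_infections = exposed * (1 - efficacy)
print(f"Expected breakthrough infections: {expected_infections:.0f}")

# And the XKCD point: do something enough times and the unlikely
# outcome is all but guaranteed to show up at least once.
p_at_least_one = 1 - efficacy ** exposed   # effectively 1
print(f"Chance of at least one breakthrough: {p_at_least_one:.2%}")
```

So breakthrough cases in a large vaccinated group aren’t evidence the vaccine failed; they’re exactly what the numbers predict.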
Now, tiny risks: how do we quantify them intuitively? What is less likely, two events with 1%
probability happening back-to-back, or a 1-in-a-million event happening? A lot of people are surprised when I tell
them that two 1% events happening back-to-back are 100 times more likely
than the 1-in-a-million event. So oftentimes
we’ll be concerned about that 1-in-a-million event while ignoring the double 1%
events. If you want to get a feel for
the numbers, my suggestion is always to write them out so it becomes obvious: 1%
of 1% is one out of ten thousand, or 1/10,000.
One in a million is 1/1,000,000, so this form makes it more obvious which
one is bigger.
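Writing them out in code makes the comparison just as obvious; a tiny sketch:

```python
p_double = 0.01 * 0.01       # two 1% events back-to-back: 1/10,000
p_million = 1 / 1_000_000    # a one-in-a-million event

print(f"Two 1% events:    {p_double:.6f}")
print(f"One in a million: {p_million:.6f}")
print(f"Ratio: {p_double / p_million:.0f}x")
```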
Looking back at my example of driving vs. flying, I want to discuss how we view “giving control to others.” There is a well-known issue (at least, well known among the nerds who study such things) in how we evaluate ourselves called the Dunning-Kruger effect. This is where we tend to believe we are better at something than we actually are, especially if we are bad at it. My favorite example is the actual statistic that over 80% of people believe they are above-average drivers. I mean, think about it: have you ever met someone who thinks they are a bad driver? And yet, they are everywhere. This discrepancy is larger the worse you are at something and has led to various graphics like this:
Dunning and Kruger actually found something closer to a straight line, which leads to “Dunning-Kruger effects of understanding the Dunning-Kruger effect,” but the point stands.
Chances are, you aren’t as good as you think you are. You don’t exercise as much as you think you
do. You drive worse than you think you do. You are not as good at picking
stocks as you think you are. And you understand topics you aren’t an expert in
worse than you think you do. So oftentimes we’ll think, “Ok, it might be like
that for other people, but I’m different.” So we underestimate
the risks associated with things we think we can control: driving, gambling, shooting
free throws, even having our immune system fight off a disease. At the same time, we’re leery of handing over
control to someone else: flying, taking our pet to the vet, the weather, or even
just going along with expert advice because they know more than we do. Giving
up that sense of control can be hard, but sometimes people really do know better
than you, especially in areas where they are experts and you are not.
One final thing that mixes us up in evaluating risks is the combination of the clumping of data and our small personal experience. What do I mean by “clumping of data”? Well, I’m going to share with you one of the most surprising things I learned in my math studies: random data tends to clump a hell of a lot more than we think it will. Below are examples of data spread out to be more uniform (left) and data that is random (right).
You can see what I mean by clumping: there are areas with lots of black dots and areas with very few in the random data, while the data that was purposefully spread out doesn’t have that. So often we look at a clump (“Man, D.J. LeMahieu only has 1 hit in the last 17 at-bats”) and attribute some cause to it (“he needs to fix his swing”). Likewise, whenever a few things happen in our lives, we start seeking an underlying reason for them. Humans are great at pattern recognition; we are so amazing at it that we often invent patterns that aren’t even there. This is important to keep in mind whenever you make a decision based on what you yourself have experienced: we often see only a small part of all the information, and that makes it hard to determine what is a real issue and what is just a clump of random data.
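You don’t have to take my word on the clumping; a quick simulation (a minimal sketch, with point and cell counts I picked arbitrarily) shows it:

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

N_POINTS, N_CELLS = 200, 25

# Drop each point into a grid cell uniformly at random
counts = [0] * N_CELLS
for _ in range(N_POINTS):
    counts[random.randrange(N_CELLS)] += 1

mean = N_POINTS / N_CELLS
print(f"Average per cell: {mean:.0f}")
print(f"Fullest cell: {max(counts)}, emptiest cell: {min(counts)}")
# A perfectly even spread would put exactly 8 points in every cell; random
# placement almost always gives some cells far more and some far fewer.
```

Those fuller and emptier cells are the “clumps” — no underlying cause required, just randomness doing what randomness does.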
So, now
that we know all of this, we’re set to only be influenced by logic and data,
right? Yeah, no. I still get nervous on planes.
I have a lucky shirt that I wear for marathons. After my son and I both
threw up during 3rd-grade read-alongs of Ramona Quimby, Age 8,
I burned a copy to break the curse it had over my family. We’re humans, and we’ll
always be crappy at risk analysis and let our gut instincts influence us. What I hope you do get out of this is
to remember that when you make an important decision, you need to take a
step back and look at everything. We
need to be sure to remember that the risk we feel isn’t always indicative of
the risk that is really there. And then we need to take a deep breath and get
on the damn plane.