Review: The Science of Fear

Today, I read The Science of Fear (2008) by Daniel Gardner. It’s a remarkably well-done book for what it is – namely, a journalist’s (informed) overview of some of the psychological components of fear, and a large number of examples of how people exploit that tendency to fear.

It’s a nice book because he relies on well-accepted psychological research while going into great depth on examples, which helps readers understand that these principles actually mean something.

On the other hand, readers looking for an explanation of how fear works inside the brain – such as myself – will find it a bit lacking. And the reliance on solid, established psychological principles means I didn’t learn any new psychology. Regardless, I enjoyed the read because it was well-written and interesting.

In lieu of a full review, allow me to run through some of his points. In the course of the book, Mr. Gardner outlines three ways the brain screws up, leading people to irrational fears.

1. The Availability Heuristic

The availability heuristic is a pretty good rule. It’s a general cognitive bias – pretty robust across all humans – and works out to people estimating the probability of an event in proportion to how many instances of it they can recall (that is, how available it is to memory).

If you do something a lot – work on computers, go hunting, etc. – then over time you establish a battery of experiences. If someone asked you how probable something was, you could reach back into your experience, get a feel for how many times you’ve seen it, and give a pretty good estimate.

The big advantage is that it’s computationally very fast. If you need to make a split-second decision, you want it to be fast.

It’s also extensible: that is, people don’t differentiate between their own experiences (memories) and other people’s (stories). This works out really well if you talk to people who do the same things you do – say, a bunch of hunters sitting around a fire swapping stories. That way, you can tap into the knowledge of your entire community (if you haven’t yet experienced something, you don’t know how common it is – hearing stories of others’ experiences can both remedy your ignorance and give you ideas for how to deal with it).

But therein lies the rub. The media specialize in providing stories – really compelling anecdotes – about things that happen. The brain doesn’t differentiate based on source, so the availability heuristic can be skewed in the wrong direction. People vastly overestimate the risk of terrorism, kidnappings, and murder, but vastly underestimate the risk of car accidents, drowning, diabetes, etc.
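
To make the mechanism concrete, here is a toy sketch in Python – my own illustration, not anything from the book – of availability-style estimation: probability is judged by the share of recalled instances, and media stories land in the same recall pool as direct experience. All the names and counts are made up.

```python
from collections import Counter

def availability_estimate(recalled, event):
    """Judge P(event) by the share of recalled instances that match it."""
    return Counter(recalled)[event] / len(recalled)

# Hypothetical direct experience: mundane risks dominate.
experience = ["car_accident"] * 50 + ["drowning"] * 10 + ["nothing"] * 940

# The brain doesn't tag sources, so vivid news stories join the same pool.
media_stories = ["terrorism"] * 40 + ["kidnapping"] * 30

print(availability_estimate(experience, "terrorism"))                  # 0.0
print(availability_estimate(experience + media_stories, "terrorism"))  # ~0.037, inflated by coverage
```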

2. Confirmation Bias

Confirmation bias is an old favorite of psychologists, simply because it explains so much.

It’s pretty simple, actually. Once you believe something, your brain tends to look for other instances of it – confirming instances. It does not, however, look for falsifying instances of your belief. Sometimes, your brain will even change its recollection of the facts to conform to your current belief (one example is the “rose-colored glasses” effect: you believe the past was better, so you unconsciously modify your memories of the past to match that belief).

But it also means that once you believe something about, say, terrorism, you’ll focus on the positive (that is, supporting) instances – and ignore the others. A year with no terrorist attacks does not affect your belief about the danger of terrorism, even though a terrorist attack does – which is illogical. It’s a binary outcome, so one outcome value should be just as good a predictor as the other. The brain doesn’t think so.
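
For the curious, here is a hedged Bayesian sketch – my own numbers, not the book’s – of why a quiet year should count as evidence too. If “danger is high” makes an attack more likely in a given year, then observing no attack must lower the probability that danger is high, just as observing an attack raises it.

```python
def update(prior_high, attack_happened,
           p_attack_if_high=0.5, p_attack_if_low=0.05):
    """One step of Bayes' rule on the hypothesis 'danger is high' (made-up likelihoods)."""
    like_high = p_attack_if_high if attack_happened else 1 - p_attack_if_high
    like_low = p_attack_if_low if attack_happened else 1 - p_attack_if_low
    evidence = like_high * prior_high + like_low * (1 - prior_high)
    return like_high * prior_high / evidence

belief = 0.5
print(update(belief, attack_happened=True))   # ~0.91: an attack raises the belief...
print(update(belief, attack_happened=False))  # ~0.34: ...and a quiet year lowers it
```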

A rather insidious effect of confirmation bias concerns the use of statistics. If you believe something, and you come across a statistic (or a story) you disagree with, you’re going to scrutinize it very closely. If, however, you come across a statistic which supports your belief – then, hey, no need to question the source or the methodology; it’s obviously correct. People apply a different level of scrutiny to information that conforms with their existing beliefs than to information that violates them.

3. The Urge to Conform

Conformity has been studied a great deal, and the results are pretty consistent. When people are in a group and a task is difficult, you see more conformity. That is, the lower any one individual’s confidence in their own conclusion, the more willing people are to accept a group consensus. Funnily enough, though, each individual’s belief in the accuracy and reliability of the group consensus goes way up – even though everyone’s individual confidence is low.

Mr. Gardner makes the important point that conformity actually serves a good purpose. If you’re on the African plains and everyone around you begins to get worried about a lion in the grass – well, even if you can’t see the lion yourself, there’s a pretty good reason to take precautions. More formally, conformity lets each member of a group take advantage of the knowledge of the whole group, rather than relying solely on their own knowledge all the time.

The problem is that once a belief has taken hold in the general population, it’s bloody hard to get rid of. The combination of conformity – people fall into line – and confirmation bias means that, as a group, people don’t deal with falsifying evidence well at all. Mr. Gardner goes through a hilarious number of examples showing that (i) people say they believe something because of the evidence, (ii) you prove the evidence is wrong, and (iii) people still believe it despite accepting that the evidence is wrong.

A Passing Note

In addition to those three psychological features, Mr. Gardner notes a few other issues. Here’s one I found striking.

It has to do with how badly people deal with numbers. People have no innate ability to handle raw numerical data, though they do have a pretty good feel for proportions. Unfortunately, substituting proportions for numbers isn’t a good thing, as the next example shows.

Mr. Gardner gives a great example. Take two groups of people; in both, tell them they are reviewing how much money to devote to improving airport safety. Tell the first group that implementing the precautions will save 150 lives; tell the second that it will save 98% of 150 lives. Consistently, people rate saving 98% of 150 lives higher than saving 150 lives – that is, the second group would devote more money to the project than the first, even though 98% of 150 is only 147 lives.
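
The arithmetic, spelled out (a trivial sketch of my own, using the numbers above):

```python
lives_absolute = 150             # frame 1: "save 150 lives"
lives_proportional = 0.98 * 150  # frame 2: "save 98% of 150 lives"

print(lives_proportional)                   # 147.0
print(lives_proportional < lives_absolute)  # True: the preferred frame saves fewer people
```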

And don’t get started on how bad people are with probability – it doesn’t bear thinking about.

A Brief Conclusion

The Science of Fear rests on some good psychology, and goes into a large number of examples of how human reason fails us when it comes to knowing what to fear.

The real effect of the book is to persuade people to be less afraid. Mr. Gardner systematically goes through most hot-button political issues and shows how the data doesn’t back up the fear-mongering. Not only is he persuasive, but he writes in such a fashion that you’ll pick up a healthy skepticism of the media (if you didn’t already have it) and a deeper skepticism of anecdotes (if you have no statistical background).

It’s certainly worth the time just for that.
