The psychological tricks we use to fool ourselves into believing fake news

One of the things I hope my statistics students learn is how irrational, subjective and suggestible human beings are. It is very easy for us to believe things that are obviously not true because random factors can have a huge effect on how we see the world.

Renowned psychologist Daniel Kahneman pioneered the idea of behavioral economics. His research reveals so much about how chance elements influence our thoughts that he won the 2002 Nobel Prize in Economics. He shared his life’s work in Thinking, Fast and Slow, a hefty book explaining many of the things that make us poor interpreters of the world around us.

For example, Kahneman and his research partner, Amos Tversky, performed an experiment using a modified Wheel of Fortune game. Students simply played the game the same way contestants play it on TV.

There was a twist, of course.

There were actually two wheels. Unknown to the participants, both displayed a series of seemingly random numbers, but the numbers on one wheel were larger than those on the other. Participants were randomly assigned to a wheel, played the game, won modest prizes, and completed a short survey before leaving.

Here is where it gets interesting.

The survey consisted of arcane questions most people are unlikely to know the answers to, forcing them to guess. How old was Gandhi when he died? How deep is the deepest point in the ocean? How high above the Earth does space begin?

Amazingly, the people exposed to the lower set of numbers gave lower answers, and the group exposed to the higher set of numbers gave higher ones.

Simple exposure had an effect on how subjects interpreted their world.

In another experiment, subjects were shown to a small office and asked to complete an assessment of altruism — attitudes about sharing and caring for others. Again, there were two experimental conditions. In one, the computer monitor on a nearby desk had a screen saver of a dancing dollar sign. In the other was a screen saver of a dancing heart.

Again, this seemingly inconsequential random variable had a measurable effect. People exposed to the dollar sign scored lower on measures of sharing and caring than did the people exposed to the dancing heart.

These and similar experiments show that the way we experience the world can be influenced by events to which we do not realize we have been exposed.

This is why it is so important to know something about science and statistics. These subjects teach us a disciplined method of interpreting the world.

Even then, our brains are hard-wired to trip us up.

Kahneman tells us we interpret the world using one of two methods.

System 1 operates very quickly with little conscious awareness. Generally, this method consists of heuristics — mental shortcuts that help us make decisions or gather information quickly. Soliciting an opinion from an expert is an example of a heuristic.

System 2 is the one with which we are most familiar. This is the cognitive process of recalling previous information, thinking about it, reflecting, weighing the strength of variables and finally coming to a decision. An example might be working out a math problem, or inferring probabilities from tables of numbers.

These two systems are in the title of Kahneman’s book, Thinking, Fast and Slow.

One of the subtle System 1 methods of decision-making involves substituting an emotional/heuristic question for an objective/rational one. Kahneman gives the example:

Objective/rational question:

“How much would you contribute to save endangered species?”

Heuristic/emotional question:

“How much emotion do I feel about dying dolphins?”

We do this without being aware that we are doing it.

This might explain a behavior I find puzzling.

In one breath, staunch supporters of universal health care passionately criticize the pharmaceutical industry as greedy opportunists preying on vulnerable ill people. In the next, they argue just as passionately for universal health care, which would open the gates of the US Treasury to pillage by the pharmaceutical villains they demonize.

Here is another example from the recent past:

Providing universal health insurance for all Americans is a monumental task with a host of interlocking challenges.

System 1 replaces the hard objective/rational question of “how do we pay for universal health care?” with the emotional/heuristic question “how passionate am I about providing health care to everyone?”

System 1 provides a shortcut to decisions, but it is not objective or rational. Subjecting the question of universal health care to System 2 thinking — the coldly logical cognitive process — reveals that without substantial rationing universal health care is simply unaffordable:

The California Legislature spent an entire session trying to hammer out such a system, but could not come to an annual price less than $400 billion — more than the entire budget of the state of California.

None of the proposed solutions — taxing the rich being the most common — is workable. If any of them were viable, the problem would be solved and California (and maybe the rest of us, too) would have top-notch health care.

There are other ways System 1 deceives us.

The representativeness heuristic

The representativeness heuristic says that the more closely a person matches what we think are the characteristics of a group, the more likely we are to assume they belong to that group.

A woman moves into the apartment next door. She is prim and proper, wears conservative clothing, reading glasses hang from her neck, and you notice the movers carrying in crates of books.

The System 1 representativeness heuristic would compare what you know about your new neighbor to people you know in various occupations and decide that she is a librarian, and not a business manager, doctor or truck driver.

However, if you had used System 2 and gone to the Bureau of Labor Statistics website to compare the numbers of women in those occupations, you would find that each of the other occupations employs far more women than librarianship does. Therefore, your new neighbor is not likely to be a librarian.
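To see why the base rate wins, it helps to run the numbers. Below is a minimal Bayes sketch in Python; every count and likelihood in it is a made-up assumption for illustration, not an actual BLS figure:

```python
# Base rates vs. stereotypes: a toy Bayes calculation.
# Every number below is invented for illustration; the real
# counts live on the BLS website.
occupations = {
    # occupation: (assumed women in occupation, assumed P(traits | occupation))
    "librarian":        (150_000, 0.50),   # grant the stereotype a big edge
    "business manager": (5_000_000, 0.05),
    "doctor":           (400_000, 0.10),
    "truck driver":     (250_000, 0.05),
}

# Unnormalized posterior: P(occupation | traits) is proportional to
# P(traits | occupation) times the base rate (count of women in the occupation).
scores = {occ: count * p for occ, (count, p) in occupations.items()}
total = sum(scores.values())
for occ, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{occ:16s} {score / total:6.1%}")
# business manager comes out around 66%, librarian around 20%:
# a tenfold stereotype edge still loses to a thirtyfold base-rate edge.
```

Even after granting the librarian stereotype a tenfold likelihood advantage, the sheer number of women in management swamps it.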

The availability heuristic

The easier it is to bring information to mind — the more salient it is — the greater its influence on decision-making.

We tend to give greater weight to negative and emotionally powerful information. There is good reason for this. Our species wandered savannas and jungles for hundreds of thousands of years. In that environment, information about sightings of snakes and lions is more important than news of wildflower blooms.

This is why we overestimate the probability of rare but terrorizing events. Airline crashes are in the news for days or weeks, while fatal auto crashes are so common they barely make it into a news feed. Consequently, many people are afraid of air travel but have no qualms about driving to the airport, even though they are far more likely to die driving there than flying somewhere else.

The current media- and special-interest-driven hysteria about school shootings is another good example of the availability heuristic. Children are far more likely to die at the hands of parents, stepparents and foster parents than by a crazed gunman at school.

Another example of a media-related availability heuristic is the common fear of being murdered. The truth is that we are far more likely to commit suicide than to be murdered. The fear persists because the media publicizes murder but rarely mentions suicide, in the belief that coverage would encourage “copycats”. (The same argument can apply to publicizing murders, by the way.)

Priming

Similar to the availability heuristic, priming refers to the availability of conscious information. Medical students often enter a stage of hypochondria because they are exposed to information about so many ailments that they interpret physical sensations they would otherwise ignore as symptoms of disease.

Priming also explains why we startle more easily after watching a horror film.

Here is a real-life example.

In 1974, I was walking along a sidewalk with a knot of people after watching The Exorcist. If you have seen the movie, you know it contains graphic scenes of projectile vomiting. I noticed a young woman across the street vomiting into a sewer grate as her boyfriend held her steady. I have no idea why she was sick — maybe she had stomach flu, food poisoning or simply drank too much beer.

It didn’t matter to a woman well primed by the movie, who yelled frantically, loud enough for everyone to hear, “My God, she’s possessed! Get away from her!”

Everyone was silent for a moment, and then broke into laughter as the embarrassed woman ran across the street to apologize and comfort the young woman.

The anchoring and adjustment heuristic

With the anchoring and adjustment heuristic, we use a number or event as a starting point and make adjustments from there.

You see a car on Craigslist that you think you might want to buy. The price the owner lists is the anchor. From that point, you and the owner negotiate adjustments.

That sounds straightforward enough, but what if you go to a used car lot?

In that case, there are all sorts of anchoring and adjustment: how much the salesperson offers for your trade-in, the interest rate, the stated price of the car. This is why used car salespeople have such a poor reputation — they are very skilled at anchoring and adjusting variables to their advantage.

Here is a suggestion:

Let System 2 take over. Learn how to use Excel’s PMT (payment) function and Solver, then negotiate with the used car salesperson.
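If you prefer a script to a spreadsheet, here is a minimal Python sketch of the standard annuity formula, which is what Excel’s PMT computes. The price, rate, and term below are made-up numbers for illustration:

```python
def pmt(rate, nper, pv):
    """Periodic loan payment: the annuity formula behind Excel's PMT.

    rate: periodic interest rate (annual rate / 12 for monthly payments)
    nper: total number of payments
    pv:   present value, i.e., the amount financed
    Note: Excel reports the result as a negative cash flow; same magnitude.
    """
    if rate == 0:
        return pv / nper  # interest-free edge case
    return pv * rate / (1 - (1 + rate) ** -nper)

# Hypothetical deal: $18,000 financed at 6% APR over 60 months.
monthly = pmt(0.06 / 12, 60, 18_000)
print(f"Monthly payment: ${monthly:,.2f}")  # roughly $347.99
```

Knowing that payment before you sit down turns the salesperson’s anchors back into plain arithmetic.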

Anchoring and adjustment does not apply only to decisions involving numbers. Imagine you try a new restaurant and have a bad experience. The server is surly, the food is cold and the plates look dirty. You vow never to go again, but a few weeks later a friend asks you to give the restaurant another chance. You do, and your dining experience is completely different. Great service, fabulous food and the place looks clean and inviting. Your opinion of the restaurant improves, but not to the extent that it might have if you had not had the original bad experience. You will not likely choose the restaurant again, nor would you be likely to recommend it to your friends.

These are just a few of the ways that events we may not even be aware of influence our judgement and decision-making.

System 1 is fast and handy. Hundreds of thousands of years of cognitive evolution shaped System 1, and its value is unquestioned.

But System 1 is more suited to immediate interactions with the physical world.

When we really want to know what causes things to happen — to identify cause and effect relationships — we need the discipline of System 2 thinking. It is slow and deliberate, but returns more accurate information than System 1, especially when we make judgements about other people.

Just yesterday, a gentleman claiming to be an employment recruiter challenged a statement I made about a research finding that about one third of the men laid off during the Great Recession were still unemployed three years later and had likely exited the labor force completely.

I posted a quote from the study supporting my assertion along with the citation.

His response?

“So 67% did (have a job). In my math, that’s a majority.”

He seems to accept the fact of the numbers, but completely ignores their meaning. Two thirds sounds so good!

For the person laid off, what is relevant is the 33% chance of not getting another job for three years and possibly exiting the labor force, not the fact that a majority — more than 50% — might avoid that experience.

A similar incident at the community college where I teach made me cringe for my colleagues running a vocational program.

A recurring advertisement for this vocational program, shown on the big screen monitors all over campus, quoted the total price of books and tuition as $12,500, and the tag line bragged of an 85% job placement rate after graduation.

Eighty-five percent makes me feel good, but…

Wait a minute! Two years and twelve thousand dollars for a vocational program that fails to place 15% of its graduates in a job? That’s awful! The unemployment rate hasn’t been that high since the depths of the Great Recession!

If we can turn objective questions into easy-to-answer heuristic/emotional questions to get a quick answer, we can do the opposite and turn heuristic/emotional questions into objective/rational ones.

Would I gamble my life savings of $12,500, plus two years of work without pay (the two years of education) and the labor it would take to replace that $12,500, if there were a 15% chance of losing it all?

Only if I were desperate enough.
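To make the System 2 version concrete, here is a rough expected-value sketch. The tuition figure and placement rate come from the ad; the foregone-wages number is a made-up assumption:

```python
# Rough expected-value sketch of the vocational-program gamble.
tuition = 12_500          # books and tuition, quoted in the ad
foregone_wages = 50_000   # assumed: two years of entry-level pay given up
p_no_job = 0.15           # implied by the ad's 85% placement rate

stake = tuition + foregone_wages
expected_loss = p_no_job * stake
print(f"Total stake:   ${stake:,}")
print(f"Expected loss: ${expected_loss:,.0f}")  # $9,375 riding on a $62,500 bet
```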

That’s why we need to approach many of life’s questions in a disciplined and methodical way.

So take that stats class!
