Three reasons you jumped to the wrong conclusion

And how you can jump to the right one

Carol Moynham
9 min read · Oct 20, 2017

Like Canada, Australia is blessed with mostly-free healthcare. This system has saved my life more than once.

Years ago, when I heard a rumor it was about to be privatized, I created a petition and gathered over 500 signatures.

This was before the internet. Every signature was gathered in person, by hand, with crude writing instruments known as ‘pen and paper’.

Take that, fascists

Everyone I approached signed with a ‘Damn the Man’ flourish of the pen. Everyone except one person, who didn’t quite believe my warnings of billion-dollar doctor visits and hospitals restricted to the ultra-rich.

Thrown by their response, I researched, spoke to a few politicians, and discovered the actual proposal was far less dramatic.

I also found out my beloved lefties (the Labor Party) had been the only party ever to suggest privatizing the Australian healthcare system, in a move that failed some years prior.

To this day, Australia still enjoys mostly free healthcare. It was never going to be privatized. Not then, anyway.

This isn’t about politics. It’s about how we form opinions.

Remember that time you disliked someone and later realized they were awesome?

Maybe you were passionately religious (or not) and you ‘saw the light’.

Perhaps you didn’t think burnt marshmallow was a great ice cream flavor, and now you’re eating an entire tub of it while writing a post for Medium. Hypothetically.

How did I get it so wrong?

And how did that one person jump to the right conclusion while everyone panicked?

Why was your first reaction so different to reality?

In his book Thinking, Fast and Slow, Daniel Kahneman identifies two ways your brain processes information.

Most of your time is spent in System 1, where you quickly interpret body language, emotional cues, and unusual patterns. System 1 is at play when you ‘spot a face in the crowd’ or ‘go with your gut’. It makes faster decisions with less energy, but is also more susceptible to cognitive bias.

System 2 is more analytical. It takes the time to figure things out. System 2 is more accurate, less extreme, and less prone to cognitive bias, especially in complex situations.

The brain relies on System 1 as much as possible because of its speed and energy efficiency. Quick judgments are crucial for spotting a predator on the horizon or a hostile face in a crowd, and for processing the hundreds of thousands of images you see every day.

By nature, System 1 is more judgmental, identifying and reacting to situations that could immediately affect your survival.

However, many times a day, the techniques System 1 uses will cause you to miss vital information, and passionately jump on the wrong bandwagon.

Decisions, decisions

Overcoming System 1 biases won’t just reduce political animosity. You’ll also improve your judgement in areas like evaluating business opportunities, social interactions and shopping choices.

The Halo Effect

The Halo Effect is where your judgement about one thing is extrapolated to assess a broader situation.

In marketing, the Halo Effect is why successful companies promote their brand more than the merits of a specific product.

Because Canadians trust brands like MEC and Costco, System 1 assumes products from these stores are high quality and excellent value, without taking the time to examine each product on its own merit.

Interpersonally, this bias is at play when your System 1 sees someone physically attractive and assumes they are also more intelligent and trustworthy. The Halo Effect is one reason appearance matters. This may have been helpful when mating and survival were primarily physical exercises.

The Halo Effect reduces time and stress by removing the need to evaluate every single decision. However, it can also miss important, game-changing details.

Remember that petition I humbly rescinded? I assumed privatizing healthcare was a right-wing policy because my overall impression of right-wing politics was that it sells off government services. This is the Halo Effect in action.

The Halo Effect, explained by bunnies

Helping the Halo Effect along is another bias known as WYSIATI, or ‘What You See Is All There Is’.

WYSIATI

System 1 excels at constructing the best possible story that incorporates ideas currently activated, but it does not (cannot) allow for information it does not have.

Simply put, when presented with incomplete information, System 1 jumps to conclusions based on face-value information.

After an alleged race crime in Australia, former Prime Minister John Howard was condemned for saying “I’m not pointing fingers”. The media, and many System 1 brains, immediately branded him racist and unwilling to act.

If System 2 had started asking questions, the analytical mind would have discovered there was more to it.

Howard’s statement was actually more like “I’m not pointing fingers at this stage. A full investigation is underway and those responsible will be punished. What happened here is disgusting.” Shortly after, the criminals were arrested.

WYSIATI Bias meant that if someone didn’t hear the full statement (and most people didn’t), then for them it didn’t exist. Most media outlets only released the first few words, successfully branding the politician an apathetic racist.

The Halo Effect and WYSIATI work together. People leaning to the political left were more likely to believe the story of Howard being indifferent to the race crime, because their overall impression of right-wing politics was already skewed towards racism.

System 1 used the Halo Effect and Confirmation Bias to form a face-value conclusion based on preexisting beliefs. Due to WYSIATI, no further information was gathered. There were riots in the street. But that was years ago. This is a new time.

Yet even in this new time, why do so many people vote based on law enforcement policies like the war on immigration, the war on terrorism, and whatever we can declare war on next, while ignoring a politician’s stance on health?

Terrorists and murderers a bigger issue than healthcare? I’ll just leave this here:

[Chart: preventable heart disease vs homicide vs terrorism, with sources linked in the original]

How are we so disproportionately focused? Consider WYSIATI Bias and the news.

You’re probably familiar with Saddam Hussein, the Taliban, Osama bin Laden, ISIS, 9/11, and now Trump going nuclear.

Thanks to WYSIATI, your System 1 brain assumes what’s in front of you is the biggest issue, especially as partisanship in the population grows increasingly extreme.

Partisanship

Partisanship is prejudice or bias in favor of a particular cause, based on emotional allegiance more than actual data. Now more than ever, political partisanship also influences who people choose to spend time with, and even where they live.

You’re surrounded by people who agree with you, and rarely see a human version of the opposing side.

A recent Stanford University study found politics to be increasingly like “group competition” where supporters of a political party loved their ‘teammates’. They also hated opposing parties with a passion as automatic and irrational as racism.

Your System 1 enjoys this environment — one simple argument is far easier than a complicated debate.

Many of these emotional responses tie back to Reactance Theory: the way people think and behave when they feel driven to protect their freedom from a perceived threat.

Reactance Theory is why friends (or strangers) feel uncomfortable accepting a favor, part of why we act against our own best interests to make a bigger point, and the reason you become so passionate when you feel someone, especially someone you know, is being treated unfairly.

In caveman days, freedom included an ability to hunt, forage and create shelter.

These freedoms have since extended to a universal idea of human rights. The movement began before the Magna Carta of 1215 and became a global concern with the Second World War and the subsequent Universal Declaration of Human Rights in 1948.

System 1 combines partisanship, the concept of a universal right and wrong, and an environment of competitive political ‘teams’, to produce something called Partisan Animosity.

Partisan Animosity is where the opposing view — all of it — is treated as a universal wrong, a political evil.


When I suggested some government programs had merit, despite their obvious benefits, “the road to hell is paved with gold” was my right-wing colleague’s ominous response.

Her System 1 used Partisan Animosity Bias to view all leftist policies as wrong. Even though government programs to make education affordable had proven merit, the idea was rejected because it was associated with the wrong team.

Why are you fed biased information?

“Why can’t they just explain things in simple language?” asked one of my close friends years ago, while we were watching the news.

Because politics is complicated and our brains are lazy.

System 1 loves heuristics. A heuristic is a method of problem solving that meets immediate goals, accepting that the answer won’t always be 100% correct.

It relies on a series of mental shortcuts, like the Halo Effect, to draw conclusions that make sense most of the time. System 1 excels at back-of-the-envelope analysis.
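To make the trade-off concrete, here is a minimal sketch in Python, a loose analogy only, with made-up prices and a hypothetical budget (none of it from Kahneman): the heuristic check rounds each price and eyeballs the total the way System 1 would, while the exact check adds everything up the way System 2 would.

```python
# A loose analogy only: a "System 1" style heuristic vs a "System 2" style exact check.
# The prices and budget below are hypothetical, invented for illustration.

prices = [4.95, 12.49, 7.20, 3.99, 8.75]
budget = 37.00

def heuristic_within_budget(prices, budget):
    """System 1 style: round each price to the nearest dollar and eyeball the total.
    Fast and usually right, but it can misjudge a close call."""
    rough_total = sum(round(p) for p in prices)
    return rough_total <= budget

def exact_within_budget(prices, budget):
    """System 2 style: add everything up properly. Slower, but accurate."""
    return sum(prices) <= budget

print(heuristic_within_budget(prices, budget))  # True  (rounded total is 37)
print(exact_within_budget(prices, budget))      # False (exact total is 37.38)
```

On this cart the shortcut says you’re fine while the careful sum says you’re over budget, which is exactly the ‘won’t always be 100% correct’ part of the definition above.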

On a side note, that friend is now studying law, after bringing up her daughter for 16 years as a single mum. Kelly, if you’re reading this, you are a wild, wonderful human. But I digress…

The brain likes easy judgements. It tends to avoid the slower, analytical ways of System 2, because System 2 requires more effort.

Remember that time you were halfway through watching a documentary, then flipped over to Game of Thrones?

Or you considered reading a detailed analysis of health reform, but instead looked at pictures of cats.

Way cuter than healthcare

Your brain just flipped you over from the mental work of System 2 to the Rancho Relaxo of System 1.

All around you, the world has also become increasingly complex, forcing System 1 to make snap judgements in situations that really call for System 2 thinking.

Special interest groups know that’s how you think, and exploit heuristics like WYSIATI, the Halo Effect and Partisan Animosity because it’s in their interest to do so.

So what can you do about it?

How to be less biased, and see more friends than enemies

Mental exercise is like physical exercise — the more you do it the stronger you get. Try to engage your System 2 thinking more often.

You can focus on politics, though you’ll probably notice these biases in all kinds of conflict.

As a start, remember the world is full of very different views held by equally intelligent, well-intentioned people. It’s not as simple as good vs evil, or enlightened vs ignorant.

Kevin Rudd does an amazing job explaining this while comparing Chinese and American culture here.

Be mindful of the Halo Effect. Are you accepting an idea because it’s a good idea, or because you like the messenger?

To overcome WYSIATI, adopt a curious, first-principles approach to life. Learn more about economic theory and how those moving parts fit together. Reading about what some greedy banker did only gives you a smidge of the story.

True nutjobs exist, but they are few and far between. If something sounds crazy or easy to judge, switch on your System 2 and start investigating.

Be wary of Partisan Animosity, even if it makes you feel good. The left aren’t a bunch of fiscally irresponsible welfare spreading hippies any more than the right are a bunch of racist, misogynistic fascists who hate poor people.

Once you start flexing your System 2 mental muscles, you’ll recognize fake news more easily.

You’ll be more collaborative, more reflective, and when you jump on a bandwagon you’ll be better equipped to peacefully ride it all the way into the horizon.

You’ll know the water’s warm
