The Best Australian Science Writing 2012

* * * * *

If the answers to those questions are yes, then we need to better understand the factors that make otherwise rational people subscribe to irrational beliefs – and, importantly, what might be done to prevent a growth in anti-science thinking.

Fortunately, there is enough research in this area to provide a fairly clear overview of why this happens.

At the heart of the problem, as outlined extremely well by Goldacre in Bad Science, is the way we are wired psychologically: we fall into common errors of thinking, which in turn lead to distortions of perception, inaccurate judgments or illogical interpretations.

Social scientists call these ways of thinking ‘heuristics': mental shortcuts we take as a way of responding to rapid and complex information being fired at us. We need to quickly sort new information into categories – and an easy way to do this is to sort it according to our existing belief systems or values.

This holds true for beliefs about genetically modified (GM) foods, the safety of nanotechnology, climate change, your favourite football club and so on – and the more complex the issue, the more likely it is that people will make decisions based on beliefs or values.

* * * * *

In an ideal world, we would look at the different information on offer, analyse it carefully and make up our minds on a case-by-case basis. But that doesn't work when we lack the motivation or the ability to do so.

We are increasingly time-poor in an increasingly data-rich world, which forces us to take mental shortcuts more often, drawing on whatever existing knowledge we have (all too often from the media rather than from formal education) or falling back on our basic beliefs. It's not always one or the other, of course – we tend to use a mix of emotion and logic – but we need to look at which is dominating our thinking.

Nobel prize winner Daniel Kahneman coined the phrase ‘thinking fast and slow' to describe our two modes of thought, but he also points out two important things: first, that slow thinking does not always lead to better conclusions; and second, that while we can recognise errors of thinking in others, we can rarely recognise them in ourselves. So everybody else uses faulty thinking, except us!

And all this is made more complicated by the fact that in the age of the internet, information and communication flows are entirely different from what we were used to even a decade ago.

We all know that the internet's promise of a wealth of information that would make us smarter was akin to the early hopes that television would educate us, teach us many languages and so on.

Instead, it turns out that we are most likely to be watching people dance and sing and cook on TV, and watching talking babies and satires of the Hitler bunker scene in the film Downfall on the internet. And among the tsunami of irrelevant data on the web, we invariably end up hunting down data that supports our existing beliefs.

It's not the internet itself that is to blame – it's just a channel for information – but the sheer amount of data of dubious credibility, which doesn't readily distinguish between comment and research, or between blog and news, has changed the relationship between information and attitude formation.

Where, as kids, we might have started with the germ of a wacky idea and sought to check its validity with experts such as teachers, or even by reading an encyclopaedia, we can now almost instantly find a community of people somewhere in the world with similar wacky ideas, never tested by an expert.

And through the internet, we can also reinforce each other's wackiness to the point where it becomes a solid value that ain't shifting for nobody, no how. Just Google ‘sexually abused by aliens' or ‘sin causes cancer' to see what I mean.

* * * * *

Access to the enormous breadth of opinion on the internet has revealed that when swamped with information, people use mental shortcuts and then follow up with ‘motivated reasoning': acknowledging only information that accords with their beliefs, and dismissing information that does not.

So if you believe UFOs are evidence of alien visitations, you will acknowledge every bit of data you find that supports that view and dismiss everything that argues against it. As a result, you will tend to find only information that supports your beliefs, reinforcing them ever more strongly.

Likewise with climate change, as we see played out over and over again in public debates. Those who reject climate change as human-induced – or deny it is happening at all – are not swayed by any scientific evidence, and cling tenaciously to the sporadic data that might seem to support their views. Again, the reasons for this appear to lie in the way we are wired.

It has been well documented in surveys that those who are politically conservative tend to reject human-induced climate change, while those who are more politically left-leaning tend to support it.

* * * * *

But it is not a person's politics that is the key driver of their attitudes; it is our underlying beliefs and values that shape our political alignment.

If your underlying belief system is that humans should dominate or tame nature (anthropocentrism), that economic growth is inherently good for society and should be maintained at all costs, and that an individual's rights are more important than the public good – then the idea that individual actions are actually causing damage will conflict so strongly with that belief system that you will instinctively reject it.

Likewise, if you believe that humanity must live in equilibrium with the planet (geocentrism), that we need to put the brakes on material progress to be more sustainable, and that the public good is more important than individual rights – then the concept of human-induced climate change aligns well with your belief system and you will accept it very easily.

People then shop around for the data that best supports their existing values. And if you can't find it in scientific studies, rest assured you will find it somewhere else, such as on the internet.

An interesting statistic comes from a 2003 PhD study by Cathy Fraser at the Australian National University of vaccinating and non-vaccinating parents, all of whom had access to the standard Health Department publications on vaccination: while only 1.6 per cent of vaccinating parents used the internet for more information, 36.2 per cent of non-vaccinating parents sought data from it.

So is getting more good facts out there the answer? Maybe not. Brendan Nyhan, at the University of Michigan, undertook a study which found that when people were shown information proving that their beliefs were wrong, they actually became more set in those beliefs. This is known in the business as ‘backfire'.

And what's more, highly intelligent people tend to suffer backfire more than less intelligent people do. The adage that attitudes not formed by logic and facts cannot be influenced by logic and facts holds true here.

So what about providing the public with more balanced and factual information?

Well, that can be a problem too. Research shows that when you present the public with both sides of a story, giving them the arguments for and against, people's existing attitudes again tend to become more entrenched. This research, conducted by Andrew Binder at North Carolina State University, found that when faced with an issue related to science and technology, most people fairly quickly adopted an initial position of support or opposition, based on their own personal combination of mental shortcuts and previously held beliefs.

And the more people with opposing points of view talked together about divisive science and technology issues – such as GM food, nanotechnology, stem cells, take your pick – the less likely the different camps were to agree on any issue or even see it the same way.

Binder stated, ‘This is problematic because it suggests that individuals are very selective in choosing their discussion partners, and hearing only what they want to hear during discussions of controversial issues.'

This means that the media's tendency to aim for balance in their stories, particularly on contentious topics – giving first one side of the argument and then the other – can actually exacerbate this polarisation of opinion.

* * * * *

The next thing we need to know is that the dismissal of facts and figures increases when somebody is highly emotional about a topic. So look around and see who is playing the ‘scare card' and whipping up emotional concern. The more agitated, scared, upset or angry we are, the more receptive we become to emotive messages – and the less receptive to facts.

Which brings us to the fear factor. US President Franklin D. Roosevelt once said, ‘The only thing we have to fear is fear itself.' If only.

According to Frank Furedi, professor of sociology at the University of Kent and author of The Precautionary Principle and the Crisis of Causality, we are losing our capacity to deal with the unknown because we increasingly believe that we are powerless to deal with the perils confronting us.

He says that one of the many consequences of this is a growth in policies for dealing with threats or risks that are based on feelings and intuitions rather than on evidence and facts.

Jenny McCarthy, celebrity leader of the anti-vaccination movement in the US, says she bases her rejection of vaccines on her intuition. Likewise, many alternative therapists advocate that people put their trust in their own intuition to justify their choices.

* * * * *

So when people choose not to vaccinate, it's not because they are stupid – it's because their fear of the harm from vaccination has become stronger than their fear of the harm from not vaccinating, even though the evidence shows that they are wrong.

The diseases that we vaccinate against are these days unknown and unseen – we no longer see children dying from whooping cough or suffering from polio. However, what we do see are stories of children suffering autism and other conditions supposedly as a result of vaccinations.

No matter how small a risk this might be, it is one that is visible and known – and therefore given a higher priority.

A serious outbreak of whooping cough or measles might change all this, of course, and at the moment this is a dangerous possibility in some parts of Australia and the US: in California alone there were over 7800 cases of whooping cough in 2010, with ten deaths, attributed to a lack of vaccination.

US health officials, when facing a large public rejection of smallpox vaccinations at the turn of the 20th century, talked about the need for a ‘fool killer' – an outbreak of smallpox devastating enough to convince people of the need for vaccinations and overturn their intuitive mistrust of them.

Furedi argues that this reliance on intuition – which served us well for tens of thousands of years by stopping us from doing things like stepping out of the safe cave into the dangerous dark of night – can also lead to superstition and belief in paranormal phenomena and pseudoscience.

And according to Bruce Hood from the University of Bristol, humans have evolved to be susceptible to supernatural beliefs. He has postulated that the human mind is adapted to reason intuitively and to understand unobservable properties, such as what makes something alive, or people's motivations. On the plus side, it is this intuitive thinking that has led to many scientific theories and revelations, such as gravity; but it also leaves us prone to irrational ideas.

Furedi has stated that misconceptions about the workings of the world around us, such as astrology and other supernatural beliefs, are due to naïve intuitive theories. Psychologists Marjaana Lindeman and Kia Aarnio, from the University of Helsinki, have gone one step further, describing these as ‘immature errors of reasoning', on a par with those of children still learning about the natural world.

They say there are three major sorts of knowledge that determine children's understanding of the world: intuitive physics, which is an understanding of the physical world; intuitive psychology, which is an understanding of how people think and behave; and intuitive biology, which is an understanding of the principles and forces of life.

When we mix these up, they argue – by investing physical objects such as crystals with healing powers, for example – we are suffering from ‘ontological confusion'. This confusion underpins many alternative health treatments that are based on the belief that thought can alter health outcomes, or that touch can convey healing power.

Similarly, they state that cognitive errors underlie homeopathy, reiki, healing by touch, distance healing and birth affirmations, which are often based on attributing some sort of ‘life force' to physical events.

* * * * *

Which brings us to the next thing we need to better understand – the impact of uncertainty and control. At the heart of a lot of our non-science beliefs is a need for more control.

We live in an always uncertain and ever more out-of-control world, but superstitious beliefs and pseudoscience can give people a sense of control and certainty, providing simple answers that reduce their levels of stress – and stress reduction too is a necessary adaptive mechanism and something we tend to be wired to seek out.

But here's the crunch: science is predominantly based on uncertainty, while fringe beliefs are often based on providing more certainty. We are actually wired to favour non-scientific beliefs and values in many cases.

So what are we to do? Here's the issue boiled down simply: we are living in a technology-driven world for which our innate instinctive reasoning equips us poorly.

There is some good news, though: evidence shows that adults with more science training are more likely to reject astrology and lucky numbers, and more likely to accept evolution.

Likewise, a 2002 PhD study by Alyssa Taylor from the University of Virginia found that a course on critical thinking led to a significant decline in belief in the paranormal.
