The New Puritanism

As a research professional with a scientific background and professional expertise in information technology, I keep coming across arguments and beliefs that make me cringe. Whether the topic is immunisation, climate change policy, the National Broadband Network, the health risks of genetically modified foods, or the effect of genetics on personal capability and life prospects, you can be sure that some commenters will challenge the “orthodoxy”. Now I don’t have a problem with challenging orthodoxies. The very basis of modern scientific practice revolves around the concept of putting up hypotheses and then trying to shoot them down. But when a challenge to the prevailing paradigm is made without a suitable basis in fact and understanding, it contributes to an ongoing sense that we live in a country that is actively hostile to the idea of expertise.

Expertise implies superior understanding, and “superior” is not acceptable in a country that can claim possibly the first recorded modern use of the term “tall poppy”.

In May this year, Alecia Simmonds wrote a piece in Daily Life bemoaning the way that “Australia hates thinkers”. That article spurred any number of posts disputing or qualifying its premise. Nobody, however, appears to fundamentally disagree that there is a strong disconnect between experts and academics on the one hand and the general public on the other.

A more relevant question is this: if intellectuals are disrespected in Australia, what effect does this have on public discourse about their pursuits? After all, the thinkers and intellectuals are the experts in their fields. They’re precisely the people we should be looking to for answers on the various contentious issues. The more important the issue, the more critical it is that we look to the expertise of professionals in the field. Why should we pay attention to them? Because they’re being paid to know. And why do we consistently ignore or dispute their opinions? Because they’re being paid to know! The understanding that the experts are being paid for their expertise seems to reduce our respect for what they might have to say.

Why do we instinctively doubt the experts or discount their statements? Why do people continue to dismiss the opinion of 95% of expert climate scientists in favour of a few detractors and their shock-jock amplifiers? Why does there persist a large minority of parents and celebrities who think that childhood immunisation programs are useless, cause autism, and are possibly an international conspiracy to biologically tag and spy on the population? Why do expert claims that there never will, and never can, be another internet technology that goes faster than the NBN and makes it obsolete get shouted down with a litany of “wireless! wireless!”? Why does the fiction of “wind turbine syndrome” persist in the face of multiple authoritative studies to the contrary?

Wilful denial of expert opinion is not restricted to hot-button political issues. The specific issues above are ones I personally have a position on and some technical knowledge of, but I’m sure you can come up with examples of your own. We sometimes respond with the same derision whenever we hear about new research into something we don’t understand and that isn’t a cure for cancer. It’s the “ivory tower” principle: the idea that whichever expert is currently discussing their topic might be a smarty-pants, hoity-toity authority on such-and-such, but could they survive a week in the desert? (To which I would respond: could you?) When it comes down to it, we’re all experts at something, and there is almost certainly something you, the reader, know that couldn’t be adequately communicated to another person in a ten-second sound bite.

A contributor to the problem may be that there’s no longer a concept of “pure” research. The days of scientists selflessly delving into the mysteries of nature for no purpose other than to understand what makes it all go are long gone. Science today is a commercial enterprise, eternally focussed on the marketability of findings. Research is always pitched in terms of the practical outcomes envisaged. The CSIRO has been building its research program around industrial sponsorship and commercial imperatives for decades. Even the universities, once a dependable source of “pure” research, now rely on business sponsorship to fund the majority of their research efforts. (Yes, this is probably a little unfair – there are still some pockets of pure research out there. But they’re not the norm, and they’re not what’s in people’s minds when they think of science.) Because all science today is driven by commercial imperatives, it’s not a very long bow to draw to conclude that there are financial pressures on the experts to come up with the “right” answers.

We have been so intent on putting a value on the science that we have ended up devaluing the science.

Tall poppy syndrome, the ivory tower principle and anti-expert contrariety probably come down, in many cases, to two related causes.

Confirmation bias

We tend to give more weight to arguments and statements that support our existing beliefs. In the cases listed above, the expert orthodoxy can be uncomfortable and challenging. Fortunately for some, there are dissenting views. Given a choice, people will often give credence to whatever evidence is available, however tenuous, that supports their existing views, in opposition to alternative viewpoints.

Cognitive dissonance

I suspect that this is the big one. When given two conflicting ideas, people automatically seek to add new ideas to make the existing ideas compatible, or devalue one of the ideas to allow the other to take priority. Take a model that includes the following three ideas:

  • We need to reduce our consumption
  • I want to succeed in life
  • Success in life is measured through consumption

It’s not unreasonable to be firmly committed to all three of these ideas. The dissonance between them can be resolved by devaluing any one of them.

In the case of climate change, it is often far too easy to devalue the expert opinion in order to resolve the problem – that the action required to mitigate the risk of catastrophic climate change will be expensive and will inflict damage on our ability to continue with our current standard of living. (Another way to resolve the dissonance is to change our own desires, i.e. to find a way to attach less value to our current comfortable lifestyle. But it is far easier to change our beliefs than our desires.) Anything that gives us license to devalue the expert opinion helps us resolve the dissonance in our own favour.

So what gives us license to devalue the expert?

  • “Shock jocks” and other voices of authority can have an inordinate influence. Such voices usually need to appeal to some sort of basis, whether it’s spurious science or “vested interests”, but for many of their listeners it’s not to their advantage to critically evaluate the statements being made.
  • “Groupthink” can be another source of authority. The shared beliefs of a group of people can themselves be sufficient license for us to challenge the alternative view – witness, for example, the self-reinforcing cadre of people who oppose immunisation. There may still be a claim of a scientific basis, but it’s often less important: the belief of the group can itself be a sufficient force to reinforce the preferred ideas.

Both the shock jocks and the peer groups, however, boil down to a distrust of the science or expert knowledge that underpins the viewpoint they wish to challenge. The opinion of the expert cannot be trusted. This distrust seems to fall into three main camps:

  • The science is wrong. The science isn’t settled / there are other valid opinions available. Those who espouse the majority view are guilty of Groupthink. (Put aside, for the moment, that if the science is not, in fact, settled, we should be proceeding on a risk management basis until it is settled. The more important the issue, the more critical this becomes.)
  • The science is correct, but inadequate. Those experts don’t know about real life. That technology may well be the best possible, but there are other technologies that are much cheaper / not much inferior.
  • The experts are mendacious. The scientists are motivated by profit / acclaim and their opinions are wilfully misleading or simply wrong.

The third of these has, in my opinion, far too much currency in our society today. The mere fact that somebody is earning money for their expertise seems to taint their otherwise unimpeachable impartiality and open them up to accusations of bias.

We would much rather believe the voices of authority that are not being paid for their expert opinion. They might not know as much as the academics, but at least we know that their beliefs are honestly come by. The sad fact of the matter is that this is often not true – from the scientists who argued against the adverse health effects of cigarette smoking to today’s experts funded by the Heartland Institute, scratch the surface and you’ll find that money is just as much at the root of the “no” camp.

Is Australia anti-intellectual? I would say, intellectually, no. We just really don’t trust our scientists to be impartial discoverers any longer.
