Once again, vaccine hesitancy has hit the news. As of this writing, 141 cases of measles have been reported in the United States, most stemming from the outbreak centered on Disneyland. Canada is not immune (as it were) to outbreaks of vaccine-preventable diseases; a 2011 survey of Canadian households showed that only 89% of parents said their children were up to date on all vaccines. Of those who didn't immunize their children, 28% said that vaccines were not necessary and 17% cited concerns about vaccine safety. Just last year Calgary and Edmonton experienced an outbreak of measles, including an exposure at the Walmart down the street from my house.

My interest in knowledge translation stemmed from the vaccine controversy. As a scientist, and (full disclosure) one who worked in vaccine design, I didn't understand how a 1998 study on the relationship between the Measles, Mumps, and Rubella (MMR) vaccine and autism, a study since debunked and now known to be one of the world's worst cases of scientific fraud, led to a widespread and lingering loss of confidence in vaccination. Vaccine hesitancy is not due to a lack of information; parents who refuse vaccinations tend to be very well educated, and despite considerable effort to provide more information on the safety and efficacy of vaccines, the anti-vaccine movement remains strong. Why?

Scientists like to think that medical decisions should be based on logic, but in fact decision making is rarely so straightforward. Humans are guided by biases and intuition, and are more likely to assemble evidence to support a chosen position than to choose a position based on the evidence presented. This is a well-known phenomenon called confirmation bias: I am partial to chocolate, so I am more likely to believe studies that say that chocolate eaters live longer and happier lives. Combined with a consumer approach to information, confirmation bias makes it very easy for all of us to seek out information that conforms to our preconceived notions.

A 2013 article by Eve Dube and colleagues took a good look at what factors contribute to parental decision making regarding vaccination. In addition to a larger social context, where a consumer approach to evidence means that everyone can cherry-pick the evidence that suits their needs, as scientists we need to confront the fact that scientific authority carries little currency anymore. We live in a world that questions all authority, questions science, and questions the honesty of medical providers (Dube et al 2013). Questioning authority and science is one thing, and one could argue it is a good thing. The problem is that the anti-vaccination movement, like climate change denial, capitalizes on this lack of trust with the skillful application of denialism.

Denialism can be defined as "the employment of rhetorical arguments to give the appearance of legitimate debate where there is none, an approach that has the ultimate goal of rejecting a proposition on which a scientific consensus exists." Whether it is to deny evolution, climate change, or the fact that vaccines do not cause autism, denialists "employ similar tactics such as relying on 'conspiracy theories,' using fake experts, purposively selecting only supportive evidence and discrediting all other, creating impossible expectations of what research can deliver or using logical fallacies" (Dube 2013 and references therein).

So what do you do in a world of science denialism, when so many important conversations and decisions rely on science? Bombarding people with more information does little to change minds, and research has shown that the only way to address vaccine hesitancy is for parents to discuss their concerns with a trusted health care provider. Trust is key. In this regard, medical science has a lot to answer for, as Hope Jahren argues in her post "How I learned to love the needle." A history of medical missteps such as thalidomide and outright medical abuses like Tuskegee has coloured our relationship with medicine. Though the research world has made great strides in improving safety and ethics, including developing systems for detecting medical mistakes and learning from them, more work has to be done to regain public trust.

I would argue that what we need is a resurgence in the culture of critical thinking. Our schools, our culture, and our media do little to foster science literacy. The worlds of science and medicine do little to engage the public, and when we do, we emphasize the results and big breakthroughs, without any narrative showing how the conclusions were reached, or context on the uncertainties associated with any one particular study. As a result, many publicly reported studies end up being discredited, further eroding public trust. However, the skills of scientific thinking, including considering the question, the controls, and the experimental design, are accessible to anyone. I offer workshops on the scientific method for non-scientists and consistently find that people from all walks of life are able, with a little training, to review published data and make judgments on its veracity. It is the logic of scientific thinking that is so powerful, and poor logic is also the usual reason that published studies fail. Embracing scientific thinking would make a powerful tool for decision making available to anyone who's interested. And perhaps when people can participate on a more equal footing with scientists and doctors, then we can begin to rebuild the trust in a shaky system.