As a regular Twitter user, I choose the people and organisations I follow online carefully. And therein lies my problem. On social media, we are more likely to engage with and trust content that aligns with our views, and thus become saturated by opinions we already agree with. Some of these views are based on political or religious ideologies, others on the flimsiest of evidence or the most superficial and unreliable of information. Against this backdrop of conflicting ideas and polarised worldviews, we’re now being asked to trust in science – and scientists – like never before.
During the coronavirus crisis, everyone online seems to have a “scientific” opinion. We are all discussing modelling, exponential curves, infection rates and antibody tests; suddenly, we’re all experts on epidemiology, immunology and virology. When the public hears that new scientific evidence has informed a sudden change in government policy, the tendency is to conclude that the scientists don’t know what they’re doing, and therefore can’t be trusted. It doesn’t help that politicians are remarkably bad at communicating scientific information clearly and transparently, while journalists are often more adept at asking questions of politicians than they are of scientists.
It has never been more important to communicate the way science works. In politics, admitting a mistake is seen as a form of weakness. It’s quite the opposite in science, where finding and admitting mistakes is a cornerstone of progress. Replacing old theories and hypotheses with newer, more accurate ones allows us to gain a deeper understanding of a subject. In the meantime, we develop mathematical models and make predictions based on the data and evidence available. With something as new as this coronavirus, we started from a low baseline of knowledge. As we accumulate new data, our models and predictions will continue to evolve and improve.
A second important feature of the scientific method is valuing doubt over certainty. The notion of doubt is one worth exploring. We can trace its scientific origins to a medieval intellectual movement, and to two individuals in particular, the Arab scholar Ibn al-Haytham (Alhazen) and the Persian scholar Razi (Rhazes). The movement was called al-shukuk in Arabic (meaning simply “the doubts”), and it challenged the wisdom inherited from Ancient Greek scholars more than 1,000 years earlier in subjects such as astronomy and medicine. Al-Haytham, an early advocate of the scientific method, cast doubt on the writings of the Greek astronomer Ptolemy, and suggested that one should question not only existing knowledge but also one’s own ideas – and be prepared to modify or overturn them in light of contradictory evidence. He overthrew the millennium-old idea that we see because our eyes shine light onto objects, and gave the first correct explanation of how vision works.
This approach still informs how we do science today. Indeed, this is how the scientific method differs from the stance of conspiracy theorists. Conspiracists will argue that, like scientists, they too are sceptics who question everything and value the importance of evidence. But in science, while we can be confident in our theories and descriptions of the world, we can never be completely certain they are correct. If an observation or new experimental result comes along that conflicts with an existing theory, we have to be prepared to abandon our old presuppositions. In a very real sense, conspiracy theorists are the polar opposite of scientists: when they encounter evidence that contradicts their core beliefs, they assimilate it and reinterpret it in a way that confirms, rather than repudiates, those beliefs.
Often, in the case of such ideological beliefs, we hear the term “cognitive dissonance”, whereby someone feels genuine mental discomfort when confronted with evidence that contradicts a view they hold. Rather than revising the view, they tend to resolve that discomfort by dismissing or reinterpreting the evidence, which works to reinforce the pre-existing belief. Ask a conspiracy theorist this: what would it take for them to change their mind? Their answer, because they are so utterly committed to their view, is likely to be that nothing would. In science, however, we learn to admit our mistakes and to change our minds to account for new evidence about the world.
This is crucial in the current pandemic. Clearly, the world cannot wait to learn everything about the virus before taking action; at the same time, stubborn adherence to a particular strategy despite new evidence to the contrary can be catastrophic. We must be prepared to shift our approach as more data is accumulated and our model predictions become more reliable. That is a strength, not a weakness, of the scientific method.
I have spent my career stressing the importance of having a scientifically literate society. I don’t mean that everyone should be well-versed in cosmology or quantum physics, or understand the difference between RNA and DNA. But we should certainly all know the difference between bacteria and viruses. Even more importantly, if we are to get through this crisis, we must all have a basic understanding of how science works – and an acknowledgement that during a crisis like this, admitting doubt, rather than pretending certainty, can be a source of strength.
• Jim Al-Khalili is the author of The World According to Physics