What is Science?
Science is about more than test tubes and lab coats and Einstein's bad hair day. It's a way of thinking that can be applied to anything we observe in the natural world.
And it all began with philosophers—the original scientists—laying down some guiding principles, known as the four canons of science.
Canon #1. Empiricism
As well as the bald one on Lost, John Locke was a 17th-century philosopher and the father of empiricism, which says that truth comes from repeatable, measurable experience.
We gain experience of the world through our senses, which technology extends to give us astonishing data-gathering capacity. Think microscopes, MRI machines, and malware detection software. From all this input, we can build a reliable framework of objective reality, and debunk claims which are evidently bogus.
Even empirical measurements must be scrutinised and replicated at scale to eliminate bias and anomalies. We can't draw definitive conclusions from a single experiment; any weakness in sampling, methodology, or statistical analysis can botch the whole conclusion. So we repeat experiments with different approaches to gradually home in on the truth.
Science is necessarily sceptical of claims we can't repeatedly experience and measure, such as the afterlife, mediumship, and telepathy. If they're not reliably detectable, they fall into the domain of personal belief.
Canon #2. Falsifiability
Our systematic data collection leads us to observations we can put into words, like "The planets and moons in the night sky appear spherical." We then make a prediction from this data, like "If other planets are spheres, then Earth is also a sphere, because Earth is a planet."
A hypothesis is a tentative idea that can guide our exploration of truth. As such, it should be falsifiable: if it's wrong, it's within our means to disprove and discount it.
The ancient Greek philosopher Eratosthenes performed a famous experiment investigating the shape and size of the Earth using shadows and maths. Not only did he falsify the flat-Earth hypothesis, but his results were accurate enough to estimate the circumference of our spherical planet. But we needed that falsifiable idea to begin with.
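Eratosthenes' arithmetic is simple enough to sketch in a few lines. The figures below are the commonly cited modern approximations (a 7.2° shadow angle at Alexandria and roughly 800 km between Alexandria and Syene), not values from this article:

```python
# A rough sketch of Eratosthenes' method, using commonly cited
# approximate figures (assumed here, not taken from the article).
shadow_angle_deg = 7.2  # noon shadow angle at Alexandria (degrees)
distance_km = 800       # approx. Alexandria-Syene distance (~5,000 stadia)

# The shadow angle is the fraction of a full 360° circle separating
# the two cities, so scaling the distance by that fraction gives the
# full circumference.
circumference_km = (360 / shadow_angle_deg) * distance_km
print(circumference_km)  # → 40000.0
```

Remarkably close to the modern value of about 40,075 km around the equator, given the crude tools available.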
In most cases, if an idea isn't falsifiable, it's a scientific cul-de-sac and should be regarded with a healthy dose of scepticism.
But there are exceptions to Karl Popper's falsification principle. Take physics, which sometimes has to deal with extreme scales (I'm thinking cosmic inflation or string theory). These ideas don't roll with the canon of testability: we can't disprove them if they breach the limits of direct or indirect observation.
It gets messier when we consider that inherited knowledge can later turn out to be incomplete. There's a risk of valid but incomplete ideas being falsified prematurely when new data exposes a chink in them. Such false negatives can have us chasing our own tails, reminding us that scientific discovery isn't always straightforward.
For the most part, however, falsifiability is a useful principle across most sciences because it helps us distinguish between workable hypotheses and straight-up ideologies.
Canon #3. Parsimony
Human beings are creative storytellers. We're capable of leaping from one fantasy to another to explain, entertain, and elucidate. But most of science is about making small, incremental discoveries based on pre-existing logic, which brings us to the principle of parsimony.
Parsimony means being economical with our creative assumptions. It tells us to keep our thinking within the bounds of logic and probability.
Otherwise, we run away with our imaginations. One particularly wild conspiracy theory on social media is that viruses don't exist. It's a whopping divergence from reality, denying decades of hard genetic, clinical, and epidemiological evidence of viruses. Not to mention the growing number of vaccine and gene therapy technologies that leverage them.
Parsimony is the foundation of Occam's Razor, the philosophical principle wielded relentlessly by the 14th-century friar William of Ockham to counter the logic of his contemporaries. Don't make more assumptions than absolutely necessary. Closely aligned with critical thinking, it's leveraged in philosophy, medical diagnosis, and the development of theoretical models.
Parsimony doesn't mean that the simplest explanation will do. Relativity isn't simple. Neither is evolution by natural selection. Parsimony means explaining complex observations with the most succinct, logical solution. The broader the scope and predictive power of your solution, the more likely you are to go down in history for it.
Canon #4. Determinism
The natural world as we observe it is a complex web of cause and effect, from the gravitational forces that shape galaxies to the molecular interactions between neurotransmitters in your brain.
The principle of determinism crushes ideas like fate, karma, and even free will. Summoning superfluous, unknown causes for what we can already explain breaches the predictable laws of cause and effect.
Yet, once again, as we test this principle at the quantum scale, it falls apart. While classical physics provides a pile of evidence for a clockwork universe, quantum physics offers conflicting evidence for a probabilistic universe, where true randomness occurs at the level of fundamental particles.
How can the two co-exist? This is the problem that haunted Einstein. And it's yet to be resolved.
The conflicting evidence means we're missing some critical clue. A clue that will lead us to a parsimonious, unified theory. Until then, we just don't seem to have enough pieces of the puzzle. Has anyone tried looking under the couch?
Aside from this whopper quantum caveat, determinism is a solid principle to guide our everyday science. It has served us well for an awfully long time, allowing accurate predictions ranging from solar eclipses to SpaceX launches to social dynamics.
Together, the four canons of science drive the discoveries and technologies that are critical to society today. Energy, water supply, agriculture, transport, medicine, communication, computing, and so much more are all driven by this style of thought.
Just imagine your day without the inventions of science. You wouldn't get very far—unless of course, you currently thrive semi-naked in the wilderness. For narrative purposes I'm assuming you don't.
But if science is so cool and powerful and awesome, why is it under attack today? Heads up. Here comes a hypothesis.
Science is Widely Misunderstood
Many people believe that science is rigid and unyielding. Can you blame them? I just laid out four highly restrictive rules on how to think like a scientist, rejecting tightly-held notions of faith and free will in the process.
That's a tough pill to swallow if you want to get on board with science. In fact, plenty of scientists don't apply the four canons to their personal beliefs, shielding them entirely from scientific scrutiny.
After all, human beings are tragically flawed creatures. Sometimes we need fantastical worldviews just to give us hope.
But there's a line between scientific knowledge and personal beliefs, and we can't seem to agree on where that is. The pandemic illustrates this in an explosive way. While science tells us that lockdowns and vaccinations are necessary to reduce the collective death toll, those measures cut into our personal beliefs about individual freedom.
We all have different beliefs, and while this threatens to eclipse our collective science where it matters most, there's a fundamental aspect many people don't appear to take into account.
As a truth-seeking methodology, the four canons give rise to a scientific culture that's eternally responsive to change. We see this in action when new evidence is used to expand or even overhaul established scientific theories.
This willingness to embrace new information is how science progresses, homing in on objective truth with increasing specificity. When science is done carefully, and reported truthfully, it lights our path ahead.
Contrast this with the innate inflexibility of belief. Once we take an idea to heart, we tend to support it with confirmation bias: accepting all data that support our worldview, while rejecting that which conflicts.
Of all our personal beliefs, it's probably fair to say that religion is the most rigid. There's a hell of a lot riding on it: backing away from its foundational claims would lead to its downfall. Take the well-known claim that the Earth is 6,000 years old. In holding firm to this belief, young-Earth creationists must reject thousands of lines of archaeological, geological, and genetic evidence to the contrary.
Scientists don't get to choose what data they believe in. But they can certainly argue about how it's collected and what it means. The most logical and parsimonious conclusion wins out.
This is why peer review is so important. When scientists publish, others can critique the methodology and conclusions. It's a fierce error-checking process, leveraging collective intelligence and experience so our knowledge can evolve.
The media minefield is an ideal platform to practise scientific thinking. We can scrutinise claims in terms of the four canons, and be on the lookout for bad arguments:
- Strawman arguments deliberately misrepresent competing claims so they're easy to knock down.
- Appeals to ignorance argue that claims are true simply because they haven't been proven wrong.
- False dichotomies polarise debates inaccurately into black-and-white terms, when the truth is more nuanced.
- Slippery slope arguments assume, without justification, that a benign first step will inevitably lead to disaster.
- Circular arguments use the same claim as both the premise and conclusion.
...and on it goes. Fortunately, good arguments provide a logical framework for everyday life, so we can avoid being misled and build up our own cache of truth. Wrangle with everything, pull it apart, try to prove it wrong. See? You're already sciencing.
"Science is different to all the other systems of thought. You don't need faith in it, you can check that it works." - Brian Cox
Why Fox Pokes Cats
My son, Fox, loves animals. When he was younger, he went through a phase of poking and probing every cat he met, just to see what it would do.
Psychologists call this bottom-up processing. Fox didn't make any assumptions about cat behaviour; he only bootstrapped his way to new conclusions.
One day, a particularly sassy cat clawed at Fox hard enough to teach him that cats don't like being poked. In fact, sometimes they'll punish you for it. After that, he didn't have to go through the whole experimental rigmarole of poking them senseless to see what would happen.
He started top-down processing cat behaviour, beginning every interaction with an already-established conclusion.
Children adopt this scientific framework because it's simple and intuitive. They learn about the world through direct observation and build up a catalogue of conclusions.
But as we grow, we get cocky. We start to think we know the answers already. We instinctively cut corners and top-down everything. We fail to gather sufficient data in new situations, and forget the open-minded schema of childhood. Why? Because we're all striving for efficiency in the massive time crunch of life.
Fortunately, science is course-correcting. The foundations of scientific training force us to go back to basics and build truth from the ground up.
This is probably the most important thing I know about science. We have to empty our cups, full of preconceptions and misinformation, and fill them up with clean logic and conclusions.
So cast off your top-down assumptions, adopt a questioning approach, and play in ways you haven't done since you were five years old and poking cats.
"Science is a way of thinking much more than it is a body of knowledge." - Carl Sagan