Welcome to Science Me
Oh, hello. I'm Becky and I'll be the voice inside your head for the next few minutes. To make things more interesting, I've put some squiggly shapes and colours amongst the text.
For humans, science is about more than test tubes and lab coats and Einstein's bad hair day. It's a way of thinking that we can apply to anything in the natural world. And it all began with philosophers—the original scientists—laying down some guiding principles.
The Four Canons of Science
Canon #1. Empiricism
As well as the bald one on Lost, John Locke was a 17th-century philosopher and the father of empiricism, which says that truth comes from repeatable experience.
We gain experience of the world through our senses, which technology extends to give us astonishing data-gathering capacity. Think microscopes, MRI machines, and malware detection software. From all this input, we can build a reliable framework of objective reality, and debunk claims which are evidently bogus.
Even our empirical measurements must be scrutinised and replicated at scale to eliminate errors and anomalies. What's more, repeated measurements give us a much more accurate picture of the world over time. A single temperature reading tells us nothing about the state of climate change, but many readings across different times and locations reveal the trend we must urgently address.
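To see why repeated measurements beat a single reading, here's a minimal sketch in Python. The "true" temperature and the size of the instrument error are made-up numbers for illustration; the point is that averaging many noisy readings converges on the underlying value, while any single reading can be way off.

```python
import random
import statistics

random.seed(42)  # fixed seed so the demo is repeatable

TRUE_TEMP = 15.0  # the hypothetical "true" value we're trying to measure

def noisy_reading():
    """One measurement, with random instrument error of ~2 degrees."""
    return TRUE_TEMP + random.gauss(0, 2.0)

single = noisy_reading()
many = [noisy_reading() for _ in range(10_000)]
average = statistics.mean(many)

print(f"single reading: {single:.2f}")   # could be off by several degrees
print(f"mean of 10,000: {average:.2f}")  # lands very close to 15.0
```

The same logic is why one thermometer reading says nothing about climate, but thousands of readings across times and places reveal a trend.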
Science is necessarily sceptical of claims we can't repeatedly experience and measure, such as the afterlife, mediumship, and telepathy. If they're not reliably detectable, they fall into the domain of personal belief.
Canon #2. Falsifiability
Our systematic data collection leads us to observations we can put into words. (The planets and moons in the night sky appear spherical.) We then make a hypothesis built on an if-then-because statement. (If other planets are spheres, then Earth is also a sphere, because Earth is a planet.)
A hypothesis is a tentative idea that can guide our exploration of truth. As such, it should be falsifiable: if it's wrong, we can disprove and discount it.
The ancient Greek philosopher Eratosthenes performed a famous experiment investigating the shape and size of the Earth using shadows and maths. Not only did he falsify the flat Earth hypothesis, but his results were accurate enough to estimate the circumference of our spherical planet. But he needed that falsifiable idea to begin with.
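Eratosthenes' reasoning fits in a few lines of arithmetic. The figures below are the commonly quoted approximations in modern units, not his original stadia: at noon on the solstice the Sun was directly overhead in Syene, while a vertical stick in Alexandria cast a shadow at about 7.2 degrees.

```python
shadow_angle_deg = 7.2        # shadow angle measured in Alexandria
alexandria_to_syene_km = 800  # rough modern estimate of the distance

# If Earth is a sphere, the shadow angle is the same fraction of a full
# circle (360 degrees) as the Syene-Alexandria distance is of the whole
# circumference. So scale the distance up by that fraction.
circumference_km = (360 / shadow_angle_deg) * alexandria_to_syene_km

print(f"{circumference_km:.0f} km")  # about 40,000 km
```

That comes out near the accepted modern value of roughly 40,075 km around the equator, which is remarkable for an experiment built on a stick, a shadow, and a falsifiable hypothesis.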
In most cases, if an idea isn't falsifiable, it's a scientific cul-de-sac, and needs to be regarded with a healthy dose of scepticism. But there are exceptions to Karl Popper's falsification principle.
Ideas dealing with extreme scales (like cosmic inflation or string theory) don't roll with the canon of testability. We can't disprove them if they breach the limits of direct or indirect perception. Yet they're still built on empirical knowledge.
It gets messier when we consider that inherited knowledge can later turn out to be incomplete. And there's a risk of valid but incomplete ideas being falsified; such false negatives can have us chasing our own tails.
For the most part, however, falsifiability is a useful principle because it helps distinguish between workable hypotheses and ideologies. It's especially good for blitzing misinformation.
Canon #3. Parsimony
Human beings are creative storytellers. We're capable of leaping from one fantasy to another to explain, entertain, and elucidate. But most of science is about making small, incremental discoveries based on pre-existing logic. This is why we need parsimony.
Parsimony means being economical with our creative assumptions. It tells us to keep our thinking within the bounds of validity and probability.
Otherwise we can let our imaginations run away with us—and away from the truth. One popular conspiracy theory doing the rounds on social media is that viruses don't exist, and the pandemic is an elaborate political ruse. While I've no doubt political corruption exists, the requirement to deny the genetic, clinical, and epidemiological evidence of viruses is non-parsimonious in the extreme.
Parsimony is the foundation of Occam's Razor, the philosophical principle named for William of Ockham, the 14th-century friar who wielded it relentlessly to counter the logic of his contemporaries. Don't make more assumptions than absolutely necessary. Closely aligned with critical thinking, it's leveraged in philosophy, medical diagnosis, and the development of theoretical models.
Parsimony doesn't mean that the simplest explanation will do. Relativity isn't simple. Neither is evolution by natural selection. Parsimony means explaining complex observations with the most succinct, logical solution. The broader the scope and predictive power of your solution, the more likely you'll go down in history for it.
Canon #4. Determinism
The natural world as we observe it is a complex web of cause and effect, from the gravitational forces that shape galaxies to the molecular interactions between neurotransmitters in your brain.
The principle of determinism crushes ideas like fate, karma, and even free will. Such ideas breach the predictable laws of cause and effect by summoning superfluous, unknown causes for what we can already explain.
Yet, once again, as we test this principle at the quantum scale, it falls apart. While classical physics provides a pile of evidence for a clockwork universe, quantum physics offers conflicting evidence for a probabilistic universe, where true randomness occurs at the level of fundamental particles.
How can the two co-exist? This is the problem that haunted Einstein. And it's yet to be resolved.
The conflicting evidence means we're missing some critical clue. A clue that will lead us to a parsimonious, unified theory. Until then, we just don't seem to have enough pieces of the puzzle. Has anyone tried looking under the couch?
Aside from this whopper quantum caveat, determinism is a solid principle to guide our everyday science. It has served us well for an awfully long time, allowing accurate predictions ranging from solar eclipses to SpaceX launches to social dynamics.
Together, the four canons of science drive the discoveries and technologies that are critical to society today. Energy, water supply, agriculture, transport, medicine, communication, computing, and so much more are all founded in this style of thought.
Just imagine your day without the inventions of science. You wouldn't get very far—unless of course, you currently thrive semi-naked in the wilderness. For narrative purposes I'm assuming you don't.
But if science is so cool and powerful and awesome, why is it under attack today? Heads up. Here comes a hypothesis.
Science is Widely Misunderstood
Plenty of people believe that science is rigid and unyielding. Can you blame them? I literally just laid out four highly restrictive rules on how to think like a scientist, rejecting tightly-held notions of faith and free will in the process.
That's a tough pill to swallow if you want to get on board with science. In fact, plenty of scientists don't apply the four canons to their personal beliefs, setting those beliefs entirely apart from scientific scrutiny.
After all, human beings are tragically flawed creatures. Sometimes we need fantastical worldviews just to give us hope.
But there's a line between scientific knowledge and personal beliefs, and we can't seem to agree on where that is. The COVID pandemic illustrates this in an explosive way. While science told us that lockdowns and vaccinations were necessary to reduce the collective death toll, that advice cut into many personal beliefs about individual freedom.
It's a messy scenario. And while personal beliefs threaten to eclipse our collective science where it matters most, there's a fundamental aspect many people don't appear to take into account.
As a truth-seeking methodology, the four guiding canons give rise to a scientific culture that's eternally responsive to change. We see this in action when new evidence is used to expand or even overhaul established scientific theories.
This willingness to embrace new information is how science progresses, homing in on objective truth with increasing specificity.
Contrast this with the inflexibility of personal beliefs. Once we take an idea to heart, we tend to support it with confirmation bias: accepting tenuous information that supports our worldview and rejecting the hard facts that conflict.
Religion is rigid in the extreme. Any withdrawal from its own incontrovertible claims would lead to its downfall. For instance, many Christians hold firm that the Earth is 6,000 years old. In doing so, they must reject mountains of archaeological, geological, and genetic evidence.
An important aspect of science is peer review. Any time a scientist performs a novel experiment or gathers useful new data, they send their results to scientific journals. Other scientists then have the opportunity to critique the methodology or conclusions. It's a powerful process, applying collective intelligence to shared problems so our knowledge can evolve.
We can, if we choose, apply the scientific method to our personal beliefs. When we do this, we find plenty of beliefs built on bad arguments. This logical framework allows us to reject misleading claims and build our own personal cache of truth.
Wrangle with everything, pull it apart, try to prove it wrong. See? You're already sciencing.
"Science is different to all the other systems of thought. You don't need faith in it, you can check that it works." - Brian Cox
What's more, once we take on this mode of thinking, we can spot the mistakes within science. Oh yes, scientists make mistakes. Even with respectable training, there are plenty of examples where scientists have dropped the ball, adopted misguided agendas, or simply followed the money instead of doing good science.
But let's not throw the baby out with the bath water. Bad scientists have been exposed. More will be exposed in future. And who's exposing them? Scientists and laymen alike who cut through their claims with logical sense-making: the heart of the scientific method.
Why Fox Pokes Cats
One more nugget before we part.
I have a son called Fox, and he's fascinated with animals. In particular, extinct reptiles. He's gone way beyond the compulsory love of dinosaurs in which every three-year-old boy must indulge. He's now nine years old, and he thinks, breathes, and dreams in reptiles. I just checked to see what Fox is doing right now. He's on his iPad, watching a Carnotaurus fight an Allosaurus. You get my point.
Being such a fan of living creatures, he went through a phase of poking and probing every single cat he met, just to see what it could do.
Psychologists call this bottom-up processing. Fox didn't make any assumptions about cat behaviour; he only bootstrapped his way to new conclusions.
One day, a particularly sassy cat clawed at Fox hard enough to teach him that cats don't like being poked. In fact, sometimes they'll punish you for it. After that, he didn't have to go through the whole experimental rigmarole of poking them senseless to see what would happen.
He started top-down processing cat behaviour, beginning every interaction with an already-established conclusion.
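Here's a loose programming analogy (my own, not a model from psychology): bottom-up processing re-derives a conclusion from raw experience every time, while top-down processing reuses a stored conclusion. In code, that's the difference between running the experiment and hitting a cache.

```python
cat_conclusions = {}  # Fox's growing catalogue of cat knowledge

def poke_cat(cat_name):
    """The expensive (and painful) bottom-up experiment."""
    print(f"poking {cat_name}... ouch!")
    return "cats dislike being poked"

def how_will_the_cat_react(cat_name):
    # Top-down: if we already hold a conclusion, skip the experiment.
    if "poking" in cat_conclusions:
        return cat_conclusions["poking"]
    # Bottom-up: no prior knowledge, so gather data the hard way.
    conclusion = poke_cat(cat_name)
    cat_conclusions["poking"] = conclusion
    return conclusion

how_will_the_cat_react("Sassy")    # runs the experiment
how_will_the_cat_react("Mittens")  # reuses the cached conclusion
```

The cache is efficient, which is exactly why we lean on it—and exactly why stale or wrong conclusions can sit in there unchallenged.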
This is a scientific framework that children adopt naturally because it's simple and intuitive. They learn about the world through direct observation and build up a catalogue of conclusions.
But as we grow, we get cocky. We start to think we know the answers already. We instinctively cut corners and top-down everything.
This is why we fail to gather sufficient data in new situations, despite the obvious benefits. We forget the open-minded schema of childhood that leads us to the most parsimonious conclusions. And we all top-down the world, because we're striving for efficiency in the massive time crunch of life.
Fortunately, science course-corrects for this. At its foundation, scientific training forces us to go back to basics and build truth from the ground up.
This is probably the most important thing I know about science. We have to empty our cups of the preconceptions, rumours, and misinformation stewing in them, and fill them up with clean data, logic, and conclusions.
So cast off your top-down assumptions, adopt a questioning approach, and play in ways you haven't done since you were five years old and poking cats.
"Science is a way of thinking much more than it is a body of knowledge." - Carl Sagan