Critical Thinking & the Scientific Method: How to Stop Letting the Algorithm Think for You


We live in an age where you can outsource almost everything – your laundry, your meals, even your emotional stability (sometimes to a man named “Steve_BigLove82” on Reddit) – but somehow the one thing we really shouldn’t outsource is critical thinking. Yet we hand over our ability to think for ourselves like free samples at a supermarket. You open a news app screaming “THE WORLD IS ENDING!”, scroll onto Instagram where every third post adds another slow-motion explosion, and by lunch your worldview has been shaped by platforms designed with the psychological subtlety of a slot machine – except with worse lighting and influencers trying to sell you collagen that looks like it was scooped out of a unicorn’s pancreas.

The strange part is this: most news isn’t inaccurate. The problem is far more subtle – and far more dangerous. It’s that the information we get is often true but violently out of context, which is like being slapped in the face with a raw chicken. Yes, something technically happened, but you have absolutely no idea why – or who allowed it. Context is the difference between information and understanding. Lose the context, and even the truth becomes misleading. A fact without context is just a panic attack wearing a lab coat.

Numbers without timelines, risks without probabilities, events without baselines – this is how a world that is improving in dozens of measurable ways still feels like it’s burning down. “Shark attacks up 50%!” means nothing when it’s a jump from two to three. Yet your brain reacts as if the ocean is now a murder broth. In reality, you’re still statistically more likely to die in open combat with a vending machine – usually after tilting it like a deranged mountain goat to rescue a stuck KitKat – than from any shark on Earth. And in that context vacuum, emotional instinct takes over before you ever get the chance to think for yourself.
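The shark arithmetic is worth making explicit, because it’s the same trick behind most scary headlines: a big relative change hiding a tiny absolute one. A minimal sketch – the two-to-three jump is the article’s hypothetical, and the function name is mine:

```python
def relative_change(old, new):
    """Relative change as a percentage of the baseline."""
    return (new - old) / old * 100

# Hypothetical shark-attack counts from the example above.
attacks_last_year = 2
attacks_this_year = 3

print(relative_change(attacks_last_year, attacks_this_year))  # 50.0 – the headline
print(attacks_this_year - attacks_last_year)                  # 1 – the reality
```

The smaller the baseline, the more terrifying the percentage looks – which is exactly why the baseline is the number headlines leave out.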

And if that weren’t enough, we’re swimming in both misinformation and disinformation – two terms often treated as twins even though one is a confused cousin and the other is a Bond villain. Misinformation is wrong by accident – your aunt forwarding a headline because the font looked stressed. Disinformation is wrong by design – engineered narratives aimed at your emotional underbelly. Misinformation wastes your time. Disinformation hijacks your mind. One’s a pothole. The other’s a sinkhole. Both thrive when we outsource critical thinking to the algorithm.


Why We Think Badly: Emotional Brains in a Fear Economy

Passive thinking feels incredible. It’s warm and soothing – like sinking into a marshmallow couch while someone feeds you grapes and whispers, “Don’t worry, your opinions are perfect.” But most of our beliefs were installed long before we ever had the cognitive ability to question them. And the less we practise critical thinking, the more easily those early scripts become our default.

I grew up in Zimbabwe – a place that was beautiful, complex, and enclosed. The worldview I absorbed there wasn’t chosen; it was inherited. It came from the culture, the community, the stories we were told, the stories we weren’t told, and the stories we assumed were normal. Like anyone raised in a tight ecosystem, I didn’t realise how much of my thinking wasn’t mine until much later. That’s the emotional trap of passive narratives: the longer they’ve lived inside you, the more they feel like truth.

Hans Rosling’s Factfulness – subtitled “Ten Reasons We’re Wrong About the World – and Why Things Are Better Than You Think” – was the first real slap I ever got about how badly our perceptions can betray us. The book opens with a 13-question quiz designed to force you to confront your own assumptions before you even turn a proper page. Every group I’ve tested scores worse than random guessing. Not because they’re foolish, but because the mental maps they’re using were drawn decades ago. We are emotional creatures first and analytical creatures second. And fear is a much louder storyteller than progress. Progress whispers. Fear howls.

And that matters – because critical thinking depends on the ability to challenge those inherited stories.

This is precisely what Nobel laureate Daniel Kahneman explains in Thinking, Fast and Slow. We have two thinking systems: one fast, emotional, impulsive, catastrophising; the other slow, analytical, reflective. Most misinformation spreads because System 1 is driving while System 2 is locked in the boot asking if anyone remembered its seatbelt. System 1 doesn’t want truth – it wants certainty, drama, and the illusion of control. And unless we’re honest about what we feel, we can never truly activate critical thinking.


From Passive Narratives to Active Thinking

Most people live inside narratives they didn’t consciously choose. Narratives are like IKEA furniture: they arrive flat-packed, pre-shaped, and we accept them because rebuilding them from scratch feels exhausting. But passive narratives aren’t built on truth – they’re built on familiarity, approval, and emotional safety.

Active critical thinking is uncomfortable because it asks us to dismantle those inherited stories. It forces us to question why we believe what we believe. It demands that we ask who benefits from our assumptions. It requires the courage to seek out perspectives that challenge our identity, not just the ones that soothe it.

And, importantly, it requires people in our lives willing to call us out instead of nodding like dashboard bobbleheads.

This became painfully real for me when I realised I had been living in an echo chamber for years. I spent a long stretch of my career climbing the corporate ladder, following the “correct” recipe like an obedient sous-chef: add this qualification, join that committee, attend these meetings, never stop moving. It felt like success had a universal script – until I realised the script wasn’t designed for me. It was the generic, one-size-fits-all path handed to people who haven’t yet figured out their own. I wasn’t thinking critically. I was executing someone else’s blueprint.

Sometimes, to go fast, you have to go painfully, deliberately slow first – slow enough to check whether the ladder you’re climbing is even leaning against the right wall.

This is what unlearning at work looks like in real life.


The Scientific Method: Structured Curiosity for Real Life

Despite the dramatic branding, the scientific method is really just structured curiosity. It’s the antidote to panic, assumptions, and the “I saw it online so it must be true” approach to thinking. It starts with noticing something that doesn’t sit right and moves through a deliberate series of steps:

The Scientific Method (how thinking should work):
  1. Observe: Something catches your attention.
  2. Ask: “What’s actually going on here?”
  3. Hypothesise: Form a tentative explanation.
  4. Test: Look for actual evidence – especially the kind you don’t want to see.
  5. Analyse: Compare expectations with reality.
  6. Refine: Update your understanding.
  7. Repeat: Because truth evolves.
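The seven steps above can be loosely sketched as a loop. This is an illustrative toy, not any real methodology library – all the names and numbers are mine:

```python
def investigate(observation, hypothesise, test, analyse, max_rounds=10):
    """Observe -> hypothesise -> test -> analyse -> refine, repeated."""
    hypothesis = hypothesise(observation)            # steps 1-3: observe, ask, hypothesise
    for _ in range(max_rounds):                      # step 7: repeat, because truth evolves
        evidence = test(hypothesis)                  # step 4: look for actual evidence
        refined = analyse(hypothesis, evidence)      # steps 5-6: compare and update
        if refined == hypothesis:                    # understanding has stabilised (for now)
            break
        hypothesis = refined
    return hypothesis

# Toy usage: an inflated first guess gets pulled toward the evidence each round.
result = investigate(
    observation=10,
    hypothesise=lambda obs: obs * 2,                 # tentative explanation: 20
    test=lambda h: 10,                               # the evidence keeps saying 10
    analyse=lambda h, e: (h + e) // 2,               # refine: move halfway toward evidence
)
print(result)  # converges to 10
```

The point of the sketch is the shape, not the arithmetic: the output is always provisional, and the loop only stops when new evidence stops moving it.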

Now compare that to how most people use social media:

The Social Media Method (how thinking usually works):
  1. Observe: A meme your cousin reposted at 2am – featuring a pixelated raccoon and Comic Sans.
  2. Ask: Absolutely nothing, because the raccoon seems confident.
  3. Hypothesise: “Seems legit.”
  4. Test: By test, we mean squint, tilt your head, and scroll anyway.
  5. Analyse: “Lol. This will now replace three years of formal education.”
  6. Refine: “This is now a foundational pillar of my personality. I will defend it with the ferocity of a caffeinated meerkat.”
  7. Repeat: Until your worldview is built entirely out of screenshots, half-read threads, and one infographic someone made in Canva at 3am.

This contrast is why the scientific method is one of the most powerful tools for thinking for yourself.

It’s also how high-performing organisations operate. McKinsey’s State of AI 2025 shows that companies gaining real value from AI aren’t inhaling buzzwords – they’re testing, measuring, refining, stress-testing assumptions, and iterating relentlessly. In other words, they’re practising critical thinking in the workplace at scale.


Becoming Impossible to Fool

Critical thinking isn’t about cynicism. It’s about refusing to let your upbringing, your feed, or a bored journalist chasing a quota decide what’s true for you. It’s about noticing when you feel something before you think something. It’s about checking multiple sources – and making sure at least one of them challenges you instead of patting you on the head. It’s about inviting people into your circle who’ll call you out instead of comforting you. It’s about practising critical thinking in the workplace by questioning briefs, assumptions, and unclear requests before you lose entire quarters to work that never needed to exist.

Because once you can separate emotion from evidence – once you can interrogate your own internal scripts – you become, quite simply, impossible to fool.


What’s one belief you realised wasn’t actually yours? Drop it in the comments – and let’s see how many of us have been carrying around inherited thinking without noticing. And if you’d like to take the Factfulness quiz, let me know and I’ll build it for you.

2 Responses

  1. Great read and well written, Murrels!
    Some years ago Merle and I attended a course called the Landmark Forum, and one of the fundamentals was – and this is their term – “already always listening”. This is the way each of us assesses and behaves in response to stimulus because of our “learnings / filters” in our formative years. As soon as one realises this, one can “sit up” and question before making assumptions / assessments / comments / judgements etc.
    This is a big “truth” that I learnt about what I believed was “true” in many instances, but it is so difficult not to instinctively revert to one’s almost natural responses.
