What evidence do you have that your brain isn't lying to you?
How is it working for you?
What are other possible outcomes?
Yesterday I blogged about cognitive biases, the ways our brain gets reality wrong.
The halo effect, for example.
First impressions.
How the way someone looks influences how smart, competent, or trustworthy we think they are.
Wikipedia lists around 200 cognitive biases, and more are still being added.
Then there are logical fallacies, errors in reasoning, and we have dozens if not hundreds of those as well.
Like: “If you’re a bad person, you must always be wrong.”
Or: “If you’re a good person, you must be right.”
That’s not how reality works.
The bottom line is simple and uncomfortable:
our brain is a terrible judge of the objective world.
So no, don’t listen to everything you hear in your head.
At the very least, take it with a grain of salt. 🧂
The same way you would with advice from someone else.
A good question is,
how do you do that consistently?
Here’s a simple framework I use with clients:
the three questions I shared above.
Best asked in writing, so your logic is recorded and can be evaluated and checked against reality later, by you or by someone else.
1. What evidence do you have?
Not vibes.
Not stories.
Not intuition.
Actual evidence you’re reasonably sure of.
If you don’t have much, ask:
How can I test this assumption before going all in?
How can I dip my toes before diving?
Small experiments beat big leaps based on our biased thinking.
2. How is it working for you?
Whenever clients propose a plan, even when I know it’s probably unrealistic, I say:
Try.
And after they do, I ask:
How is it working for you?
Because if it isn’t, continuing in the hope that you’ll magically change or the world will adjust is usually just wasted time.
Time that could’ve gone into something that actually works.
3. What are other possible outcomes?
Not just the best-case scenario your brain shares with you enthusiastically.
All the other explanations.
All the other paths this could take.
Annie Duke, poker champion and decision scientist, introduced me to decision trees: mapping all plausible outcomes before deciding, not just the desired one.
Even better: invite an outside view.
Ask yourself: What would my friend say here?
Or actually ask them. Or ask ChatGPT. Or Claude.
Then consider probabilities.
Downside costs.
Upside potential.
Is the potential upside worth the potential risk? How likely are both to occur? On what evidence?
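To make that concrete, here’s a tiny sketch in Python of what this weighing can look like. The options, probabilities, and payoffs are made up purely for illustration; the point is the shape of the exercise, not the numbers.

```python
# A toy decision-tree comparison with made-up numbers.
# Each option lists possible outcomes as (probability, payoff) pairs;
# the probabilities for an option should sum to 1.

options = {
    "quit job, launch product now": [
        (0.15, 120_000),   # best case: it takes off
        (0.55, -20_000),   # likely case: slow start, savings drained
        (0.30, -60_000),   # worst case: it fails outright
    ],
    "keep job, test with a side project": [
        (0.25, 15_000),    # small win that proves demand
        (0.60, 0),         # learn something, lose little
        (0.15, -5_000),    # modest cost of the experiment
    ],
}

for name, outcomes in options.items():
    expected_value = sum(p * payoff for p, payoff in outcomes)
    worst_case = min(payoff for _, payoff in outcomes)
    print(f"{name}: expected value {expected_value:+,.0f}, worst case {worst_case:+,.0f}")
```

No one actually knows these numbers. The value is in being forced to write down more than one outcome and to ask what evidence backs each probability.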
No, you don’t need to analyze every idea to death.
But before you invest serious time, other people’s time, money, or energy, or before you say no to things that might work better, it’s good decision hygiene to scrutinize your own thoughts the way you’d scrutinize someone else’s advice.
Over to you, dear reader:
Where do you need to doubt your own thinking more?
And what small thinking habit could help you do that consistently?