You Need a System Against Biases; Knowing About Them Doesn't Help
Introduction
Perhaps the biggest pitfall with cognitive biases is that knowing about them does nothing to prevent them. In fact, the reverse seems to be true: you are even more prone to being tricked by biases if you are aware of their existence [1]. Educating yourself about cognitive biases is not a good strategy for avoiding their negative consequences [2]. You need something else. That something else must be some sort of system, maybe a set of principles that you adhere to, because relying on common sense or reasoning might make the biases even worse.
One such system is to have people in your life whom you trust and who are eager to give you feedback. The problem is that they're not right about everything either [3]. So how do you know which feedback to trust? A system I think could work is to treat such feedback as a signal to investigate more, maybe discuss more. If I just take their advice at face value, I'm not going to get reliably closer to the truth. But if I take it as a signal that I might be off, I think it could serve me better.
This thinking was inspired by something I observed in a work context years ago. A couple of trusted people were talking about a junior programmer, saying that the person really didn't know programming yet. Now, my bias against blanket statements like this came into play, and I thought they were exaggerating. I assumed it was about some nuance that could be called a matter of preference or opinion.
Fast forward some time, and I was on another project with said person. They did something, I took only a cursory glance at it, and thought nothing of it. But at some later point I had to use a script they'd written, and it didn't work quite as I expected. Fine, I thought, maybe I'm just missing something. It turned out the script had some weird choices, like using a non-standard library for doing something very basic. OK, strange but easily fixable, I thought.
Then I took a real look at the code, and there were snippets that were just incomprehensible. In a test script, there was a loop over test cases, and inside it was this piece of logic:
    try:
        [normal code]
    except TypeError as e:
        pass
There's no reason to do something like this. None.
Someone will quibble that sometimes you do need to catch a TypeError in your code (and this is not just hypothetical; someone did quibble about that). Yes, I know. But this is not one of those cases. It is a script that is supposed to validate some functionality, and if a TypeError is thrown, it is a GOOD THING for the script to crash. If TypeErrors are thrown from the 'perfectly normal Python code', then you need to fix those errors; you should not add 'try ... except TypeError'.
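To make the contrast concrete, here's a minimal sketch of how such a validation script might look without the try/except. The function compute_report and the test data are made-up names for illustration, not the actual code from that project:

    def compute_report(records):
        # Stand-in for the functionality being validated.
        return sum(r["value"] for r in records)

    test_cases = [
        ([{"value": 1}, {"value": 2}], 3),
        ([], 0),
    ]

    for records, expected in test_cases:
        # No try/except around the call: if compute_report raises a
        # TypeError, the script crashes loudly and the bug gets noticed
        # and fixed, instead of being silently swallowed with 'pass'.
        result = compute_report(records)
        assert result == expected, f"expected {expected}, got {result}"

    print("all test cases passed")

If a case fails or throws, the traceback points straight at the problem instead of hiding it.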
If there's some very involved and complicated reason that you actually need to do something like this (and in this case there was not, because I know what the code did), then you should at least add a comment along the lines of: "WARNING: this code seems utterly insane, but there's a reason that it needs to be like this. Let me explain. ..."
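For illustration, such a comment might sit in code like this. The scenario and the name legacy_parser are entirely made up:

    def legacy_parser(line):
        # Hypothetical legacy function with a quirk: it raises
        # TypeError on the blank separator lines of the old format.
        if not line:
            raise TypeError("blank separator line")
        name, value = line.split(",")
        return name, int(value)

    data_lines = ["widget,3", "", "gadget,5"]

    for line in data_lines:
        try:
            record = legacy_parser(line)
        except TypeError:
            # WARNING: this code seems utterly insane, but there's a
            # reason it needs to be like this: legacy_parser raises
            # TypeError for the blank separator lines in the old data
            # format, and those lines are expected and safe to skip.
            continue
        print(record)

The point is that the comment tells the next reader this was a deliberate, understood choice, not a misconception.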
In the other context, similar misconceptions by the junior programmer had been observed and discussed, and the outcome was that the junior programmer simply didn't agree with what the senior programmers said. To me, seeing these examples felt like a situation where the junior programmer's conclusion should not be: "Well, we just disagree about those issues. Neither of us is objectively right about this."
In a situation like this, I think you need a system: when an experienced programmer, or several of them, says that you are just wrong about something, bells should start ringing in your head telling you that you need to understand what they are saying, even if you believe they are wrong and you are right.
Further reasoning must extend this to some fundamental principles, because there are things that experienced developers disagree on, and that is just fine. Some people believe that strong static typing prevents so many errors that it is immoral to program in a dynamically typed language. The passion with which they hold such opinions sometimes seems to exceed even their conviction that the code above is just wrong. So how can you tell that one kind of disagreement (a fundamental misunderstanding of a programming concept) is different from another (a disagreement over whether you can produce production-grade software in a dynamically typed language)?
However, I have been in situations where other people on a team, or in a larger organization, disagree with me about something and I still feel like I know better. First, I think I need to be confident that I could persuade the person who disagrees to at least consider my viewpoint. If that person is someone I trust and they are saying that what I'm considering is just dead wrong, full stop, then I think I need to go back to the drawing board and try to understand the whole thing much more deeply and broadly. We need to understand why we are disagreeing about something fundamental.
I'm not sure if I've ever been in the most extreme situation (someone I trust saying that I'm dead wrong, full stop, while I disagree, about something that is not just a matter of opinion). The disagreements have been over things that are difficult or impossible to verify: things like whether we should use a statically typed programming language, the extent to which you should test your system with automated tests, or how you should set up the deployment pipeline of your service.
All in all, I think finding people you trust with regard to their understanding of the field you're working in is one thing. The other is to treat their strong reactions against what you're doing as signals to understand things more deeply. They might be wrong (you have to remember that too), but it is also possible that you yourself have a huge blind spot in your thinking. That should be in the back of your mind, at least when you find yourself in a situation like this.
I admit that this isn't very tightly defined and encapsulated. But I think we need something like this in our intellectual pursuits. We might be wrong about something fundamental. But the person giving the feedback, even someone you trust, might be wrong too. If they are very adamant, you should go back to the fundamentals and think very hard. Read books and try to understand.
Thanks to Timo for reading a draft of this and giving feedback.
[1] There is research about this also. Just google it.
[2] There are of course other reasons to learn about cognitive biases.
[3] One of my pet peeves nowadays is people who are very eager to give feedback to others but are themselves wrong about most things, especially about the feedback they give. Infuriating!