Cognitive bias: tool for critical thinking or ego trap?
What I would like to know is how and why people think it can help with critical thinking.
I'll explain why I think it's an ego trap with an example of the survivorship bias:
If we ask a lottery winner to tell a group of people how amazing his life has become, that group will be more likely to buy lottery tickets than a second group who listened instead to the story of a homeless man who lost all his money on lottery tickets. My guess is that if we tell the buyers from the first group that they were biased, they might just say, “Oh, but even if the chance is low, it’s still there, maybe it’s my lucky day”. So in the end, even given that info, I still think the first group would have more buyers than the second. It could be even worse: they could fool themselves into thinking they’re being critical: “I’m aware of survivorship bias, I’m aware the chance of winning is low, but I’m rationally deciding to buy a ticket because I’m willing to risk losing a small amount of money to win big”. Is it really rational, though? They’re mostly driven by the emotions triggered by the winner’s story…
How can we ever be sure that the decision we’re making isn’t biased? Biases are unconscious…
I see a lot of people using cognitive bias as some kind of superiority: “I know about cognitive bias and I try to avoid it, and you don’t, so I’m closer to the truth than you are”… And this is exactly the kind of behaviour that kills critical thinking… Or people who use it to take down someone’s defense: “you’re saying that because you’re biased, therefore it doesn’t have any value”…
Comments (62)
I find the whole idea of cognitive bias unconvincing. Even if it is true, so what?
Cognitive bias is a philosophical theory to explain why supposedly rational people make errors in judgment. But, in practice, those with different opinions can accuse the other of bias, and without divine objectivity, no one can prove who's right and who's wrong. I suppose the frustration of a no-win "Mexican Standoff" of opposing opinions is what prompts some people to claim divine revelation, to break the logjam.
Yet, some people are more biased to accept the word of God, than others. In that case, only pseudo-objective "critical thinking", which examines your own motivations & tendencies, can occasionally discover a tipping-point of truth in a difficult dilemma. Unfortunately, if only one party in an ego contest is willing to back down, the win-lose result may not mean that Truth prevails. :smile:
No wonder!
Brains are survival machines, not truth machines!
Then why does the brain think about more than survival?
Malfunction or...
ok
Well, I'm not ok with it!
The double-edged truth.
More objective truth helps with survival though...
Quoting Gnomon
I didn't get that...
I agree with you here.
I am not an advocate of the cognitive bias framework. Each of us interprets the world via a value scheme that differs from person to person. In order to manipulate, shape, or influence another’s thinking, it is necessary to connect with their interpretive framework, from their own perspective. So what is called ‘bias’ is actually the necessary sense-making framework we bring to bear on experience. Eliminate this ‘bias’ and the world disappears. ‘Objective’ truth is just a certain kind of bias.
I think there is a mistake exactly in the text I quoted. In your hypothetical example, the speaker acknowledges the relativity they are immersed in (“I know about cognitive bias”), but then reasons in a way that ignores it (“I’m closer to the truth than you are”), because they make an absolute statement. This is inconsistent. The correct conclusion from the relativistic premise is “so I hope I am closer to the truth than you are” or “so I might be closer to the truth than you are”. This is the reasoning I adopt in my relativistic choices: maybe my choices are nothing, but maybe they are something, beyond my and your understanding: who knows?
That works for experiments, not personal opinions... You could have a lot of well-grounded reasons to believe in something even though there aren't direct experiments about it, or even though you've only observed it in some people, for example. It's not necessarily a bias to have an opinion based on a small number of cases. But to go back to survivorship bias: in the example I gave, realising there is a bias doesn't really help you make a "less emotional" decision; it leads to confusion. Regarding knowledge, it's going to be the same thing: every time there is room for a grey area, your feelings might make you see it completely white (or black), and I don't see how the cognitive bias theory could help you.
Wouldn't it be much more efficient to think in terms of feelings? At least you can consciously realise how you're feeling and know that it might influence your opinion. It gets even better: you can think about the same thing once you've calmed down and see if your opinion is the same. At least it's falsifiable, since you don't feel the same things all the time, unlike cognitive biases, which have no way of being detected consciously.
Quoting Pantagruel
I would very much like to see your sources for that info.
It is a bias based on the fallacy of small numbers, by definition. It is called a cognitive bias because it is a bias exhibited by a lot of people.
Many of these biases have been tested in experimental conditions, as you say. I have read lots of good cognitive science on them. It's common.
That doesn't mean that it's measurable quantitatively...
Experimentation requires quantifiable results. Statistics are quantitative.
You must not live in the Bible Belt. The particular prejudice I referred to is not innate, but cultural --- specifically religious indoctrination. :smile:
PS: I just read a tribute to sociobiologist E. O. Wilson. In his 2016 book, Half-Earth, he said: "What is man? Storyteller, mythmaker, and destroyer of the living world. Thinking with a gabble of reason, emotion, and religion . . ."
Cognitive bias is not one thing. See List of Common Cognitive Biases
The example in the OP is not an instance of cognitive bias.
Cognitive bias is a term from psychology, not philosophy, for a group of demonstrated systematic errors. See What Is Cognitive Bias?
We can adjust for Cognitive bias by being aware of them, giving consideration to what justifies our beliefs and by subjecting our beliefs to public critique.
Quoting Jackson,
Then you are doomed to indulge in cognitive bias. You are denying accepted psychology. Hence:
Quoting Pantagruel
On the other hand, confirmation bias distorts news all the time and is a threat to democracy.
I know. But the basic idea of Cognitive Bias goes back to Socrates & Plato. It seems to be the fundamental problem in Philosophy : the root of erroneous reasoning. :smile:
A case from Plato's Meno offers an intriguing example that cuts across some of the modern categories of cognitive biases.
https://aeon.co/essays/what-plato-knew-about-behavioural-economics-a-lot
Not every school of psychology considers the objectivizing approach implied by cognitive bias as “accepted”. There are approaches which are troubled by the assumption that discerning such things as bias is a matter of passing judgments on easily discernible facts. This fails to acknowledge the deeply normative character of supposedly neutral and ‘objective’ descriptions of cognitive bias. The vantage from which empirical psychology determines a behavior to be biased is itself an unacknowledged normative bias.
Bang on!
Indeed, there are some non-scientific psychological theories out there.
Dunning-Kruger effect: Sacrifices knowledge for confidence!
False positive error: Sacrifices truth for safety!
Aesthetic argument*: Sacrifices truth for beauty.
:chin:
Just because we use numbers for interpretations doesn't mean the phenomenon is quantitatively measurable...
Quoting Banno
Um... I'm going to ignore that, I don't want to start a war. I only meant to share my thoughts and understand a concept that's trendy nowadays.
Quoting Banno
Yeah okay, use the plural if you prefer that, it's a concept that has a lot of subcategories.
Quoting Banno
How is it not the survivorship bias?
Quoting Banno
How can you be aware of something that's unconscious?
Quoting Banno
Heresy, burn him! No one shall go against the opinion of the great masters of psychology.
Quoting jgill
. . .
This has gone terribly wrong. I wanted to argue about HOW and WHY people think it helps with critical thinking, and no defenders of that theory have actually explained it... Can anyone tell me how you can detect something that's unconscious? Doesn't this cognitive bias theory have the same problem as psychoanalysis, that it's not falsifiable?
Your question is a broad one. How can we identify with certainty whether someone's beliefs are influenced by an unconscious perspective that comes from (let's say) personal trauma or a family of origin value system? Don't think we can.
I'm not sure I understand how you are connecting cognitive bias theory with critical thinking. In what sense are you proposing they are connected?
Actually that is exactly what it means. It seems you are coming from some kind of radically anti-scientific bias. All in good fun I guess, but not a good use of my time.
We don’t observe the world directly but through a personal framework of constructs that form a functional unity. Each of us is thus ‘biased’ with respect to the perspectives of others. We each live in slightly different worlds. When we reach consensus on the facts of situations or the workings of the mind, this consensus doesn’t eliminate the perspectival nature of our outlooks. Consensus and normative agreement on scientific fact is an averaging of all of our personal biases, not their elimination. The ‘objective’ fact is a view that no one in particular actually holds; we all hold our own variation on that template.
When we accuse someone of cognitive bias, we are pointing out that their view deviates from the consensus of the larger group. This doesn’t tell us the view of the majority is more ‘correct’ than that of the deviant. They cannot be said to be in closer touch with ‘true’ reality. The fundamental arbiter of the validity of a viewpoint is the extent to which it is consistent with one’s own understanding, not whether it measures up to some third-person external criterion of truth.
Our negative emotions tell us when an aspect of the world no longer makes sense to us, when our personal anticipations of events fail to match up with what actually ensues (from our own personal perspective). We can block painful emotions, but this is generally a matter of not being able to articulate those feelings of chaos. We repress and avoid what we can’t make sense of, but this doesn’t eliminate the crisis; it only constricts our engagement with the world to what we can handle.
I can see things like this in myself in relation to 'things that happened' and how I viewed them then and notice that I didn't not look at things/hypocrisies/evidence that I would have found hard to face. I protected myself from guilt or shame.
I don't think this needs to be an accusation against someone. I think that's the wrong verb, though I am sure there are situations where someone is accused of cognitive bias, I see this as a fairly inevitable tendency, though one that can be struggled against in oneself and I suppose with others one is close to. In what I would call a healthy relationship, the people involved are aware this is a possibility, that they are filtering information to not admit/face something. So, there is some slack to have this pointed out. (and generally no one uses the phrase cognitive bias in these dialogues, but that is often what is being talked about)
You seem to be viewing cognitive bias as creating a subset of those who deviate. I am sure that kind of thing happens (and often without the need of the concept of cognitive bias: psychiatry has done this by pathologizing certain people, states or attitudes). Cognitive bias is considered something we all have. It's not like the category of, say, psychosis.
Quoting Joshs
I am not quite sure what in my post this is responding to. I think those scenarios could and often would lead to negative emotions. I do think that the so-called negative emotions (I don't think of them this way) also arise without confusion: say when someone violates a boundary, or we think that they have, with violence, say. I think they can also arise when things we expect but do not like happen. But you may not have been trying to present a complete picture of when these emotions arise. As I said, I am not quite sure how this section connects.
My point about the relation between negative emotions like guilt and shame and the breakdown of predictive sense-making is that guilt, shame and anger all have to do with situations that surprise, violate and thus invalidate schemes of understanding the world that we counted on to effectively predict events. Since these emotions are expressions or byproducts of a partial breakdown in the effectiveness of our schemes of understanding events and people, it is not the guilt, shame or anger that we need to protect ourselves from; it is the ideas and behaviors of others that we cannot make sense of. We withdraw from people who alarm, disturb or confuse us with ideas that don’t make sense to us, and that as a consequence we may feel are harmful or immoral.
It is not that we simply ignore evidence that contradicts our beliefs, as if a part of ourselves recognizes and fully understands the opposing belief, forms a negative emotion, and then decides to protect us from that emotion by ignoring the belief. We never get to that stage of recognition and comprehension. A belief is part of a larger system of mutually consistent ideas. It is impossible to incorporate, or even fully recognize as meaningful, someone else’s ideas that are incompatible with that system. Such ideas simply don’t make sense to us, seem incoherent or illogical, or may be mostly invisible. It’s not that we are pretending they don’t make sense; they really don’t make sense. This isn’t a matter of fooling ourselves or hiding something from ourselves that has already been absorbed. We have no peg, no proper structure to hang it on, and so it simply isn’t assimilated. This selectiveness of perception is a necessary feature of sense-making.
In today’s polarized political climate, we spend a lot of time psychoanalyzing our opponents. We say they refuse to accept reality, create fake news, are brainwashed, succumb to shady motives, ignore what they don’t want to hear. What we have a great deal of difficulty doing is recognizing that a fact only makes the sense it does within a particular account, and people from different backgrounds and histories use different accounts to interpret facts.
When I taught complex variables, a senior level mathematics course, I would resort to heuristics in order to encourage understanding of principles and theory. A theorem might require a complicated proof, but by drawing pictures on the board and describing the underlying concept students could see through the complications and comprehend a rational argument that implied the result. It was a shortcut, but one I have used for myself numerous times. If an idea is abstract and convoluted, find an example that illustrates the idea. Then study the formal approach.
I have no idea if this is the sort of thing you have in mind. Heuristics is in the broad category of CB. I've found that extracting an idea and making it personal in some way helps critical thinking. But I think this thread is more about political biases.
Which is another way of saying they have biases. Some people can have more than others. But we all have this.
Yes, there is a lot of bullshit 'analyses' of other people out there now. And rarely do the accusers (because that is part of accusing and labelling people and I now understand better where you are coming from) actually demonstrate their psychoanalysis nor do they realize that cognitive bias cuts both ways. I hold positions on current and past events that do not fit with mainstream media's version of reality. So I think I have great sympathy for what you dislike. I don't think the misuse of the idea of cognitive bias means that there is no cognitive bias. I see cognitive bias in all political groups and yes, political power generally determines what is objective and sweeping philosophically weak judgments of people do get thrown around. And, in fact, cognitive bias contributes to people's wholehearted certainty when they go along with BS getting shoved our way. And it has gotten worse. There is a centralization of media power and this has allowed
And they manage to use these in part because of cognitive bias and people's unwillingness to notice their team's cognitive bias, media bias, contradictions, counterevidence, control of media and more.
But cognitive bias exists, and this can be demonstrated in research that is not being used for these kinds of purposes. Many true ideas can be misused; many neutral things can do evil in the wrong hands. The people who hurl the term around, aimed at certain groups, are not citing research; they are aiming a psychological concept at people they disagree with. This used to happen with psychological ideas like 'projection', 'delusion', 'paranoia' and so on. Those are real phenomena, but once laypeople, or professionals acting as laypeople, start hurling them around, it has little to do with the research and is just intuitively applied. (And of course research can be and is more and more biased, given the concentration of money controlling research in fewer and fewer companies, but that's another story.)
Much of what you say, for example around negative emotions, seems like an explanation for why we have cognitive biases.
Again, Quoting Pantagruel
Well, having cognitive biases would lead to a more "subjective" vision of reality, so cognitive bias mitigation would naturally lead to a more "objective" one, and that would imply being more critical about yourself or others.
Quoting Pantagruel
With experiments, we can conclude that a lot of people have cognitive biases (or whatever you want to call them), but that doesn't mean we have tools to measure them quantitatively in someone at a given moment. You have no way of measuring how much someone's opinion is biased. What did you have in mind? That we have some kind of cognitive bias detector that tells you how biased you are?
Quoting jgill
Oh yeah I think that method could actually help a lot, it would be harder to ignore one element due to strong emotions if it's in front of your eyes and logically connected to all the others.
But I was talking about trying to figure out if you are experiencing a cognitive bias, like simply asking yourself "Am I making this decision because of survivorship bias?". I think that approach is not efficient at all.
Quoting jgill
Political? No not necessarily, it could be all kinds of bias really.
I never proposed that we should construct a scale. Essentially, a bias is a distortion, so whatever the degree of the distortion, remediating it (by whatever amount) is better than not, don't you think?
But how do you plan to do that if you can't even know for sure if it's there or not? At a given moment for a given opinion, we have no tools to detect it...
We have the catalog of known cognitive biases. That's a pretty good tool IMO.
Totally agree. But are there not also some dishonest people involved, who know better than what they profess?
We generally lie when we think our real motives and justifications will not be understood the way we mean them, in their full context, or when we believe the ideas we are operating from will not be properly understood. In these cases our dishonesty is not the root of the problem. It is only a symptom of, and our attempt to ameliorate the effects of, a prior breakdown in mutual understanding.
And after all that work they eventually find that what they end up with is a conscious bias.
What if you’re biased with another bias when you conclude you’re biased? What if you like the cognitive bias theory so much that it’s the confirmation bias to think you’re biased? How would you know which one is true? How can you be sure you consciously realised what was in your unconscious mind?
How does anything of what you sent me answer any of my questions?
If you're implying it's excessive to push the bias theory as far as having bias in the process of mitigating other biases, why do you think it is excessive?
You showed the theory, that's great, but I'm saying it's impossible to apply it and be aware of cognitive biases. I didn't make this thread to know more about the theory, I made it to get an actual debate about its application.
They tend to be much better at understanding patterns than at resolving disputes via labeling of others. You can use ideas like these to improve self-knowledge and your understanding of dynamics, perhaps even to make positive changes in yourself or extricate yourself from toxic patterns with others. None of that is diminished by the misuses people put these ideas to.
The thread is titled
Cognitive bias: tool for critical thinking or ego trap?
But it's not an either or situation as with many concepts from many fields. It is a useful concept, I think, AND people can use it terribly.
Like axes, guns or referendums.
Yeah, sure, but how do you prevent yourself from using it terribly? For guns you can require a license and a mental stability check, but what about cognitive bias? My point is you can never know for sure if you're biased, so I don't see the point in trying to figure out if you are; it's nonsense, not falsifiable. Maybe in some cases it would help someone get some distance from themselves, because they would question their opinions, but I really don't think the detection of cognitive biases is the best question to ask; I think it leads to a lot of confusion. And I also think it can quickly escalate into some kind of superiority.
So you shouldn't try to work with things that are more certain than others? You shouldn't try to maximise certainty? When I said "how can you be sure", I obviously didn't see certainty as something binary; it has shades. I was referring to Popper's principle of falsifiability.
Quoting ArielAssante
Um... This is a very extreme opinion. Some people who overthink actually have very low self-esteem and aren't narcissistic at all...
But don't get me wrong, I actually agree that knowing yourself is the best thing you could do to be more critical, I just don't think naming biases and trying to detect it in yourself or others is going to help.
Some people can, others can't. How does one learn to weed out pathological boyfriends or girlfriends? Some people can't, so no one should bother? You can reflect on what attracted you to the person. You can listen to other people's nightmares; you could ask certain questions earlier in the relationship. Here I am using an analogy where intuition and wisdom are involved. Can one guarantee one will not end up getting close to another crazy person? Well, probably not completely. But can one improve? Sure. There are all sorts of things we learn to do better that some people cannot or will not try to learn to do better.
And remember: you are talking about an individual on their own analyzing their own personal cognitive biases. The concept of cognitive bias exists and is well supported by research even if people can't use it as a tool on themselves alone.
And actually you can test to see if people's biases change and reduce and what can lead to this.
Then there's using other people to check on your biases: people you respect, perhaps best with a variety of political/social/philosophical opinions. They can point out when you read an article and manage not to notice the part that goes against your ideas. They can point out when you cite only those life experiences you've had which support, or seem to support, your opinion of women, Republicans, the safety of vaccines, your sense of cosmology, your sense of what a responsible worker does, and so on. They can point out when you contradict yourself. And this kind of dialogue (which hopefully is mutual) can and does reduce people's biases.
You want foolproof, then you need to not deal with a complicated lifeform like humans. It's easy to adjust the straightness of a bicycle tire.
Another example is learning how to learn. Over my lifetime I have learned how to improve my learning. Even if I am suddenly in the position of having to learn a new kind of subject. One little one is to reflect on what I have learned and how I went about it during a single lesson/study time/whatever.
Does this make me an immaculate learner? No. Does my reflection catch all my assumptions? No. Can my biases regarding my own work affect my self-reflection? Sure.
HOWEVER.....
My learning has gotten better and after I introduced reflective practices the difference was significant and noticed by others.
One can get into one's head and imagine that, yes, it is possible that some process of change will have no effect, because it is hard to track, is complicated, and I am involved, so I may be confused about myself. And one can then say it is impossible to learn, or to learn with tool X. Welcome to the human condition. But actually these things have been measured in labs, and while the individual is in their everyday fog, tools like working against one's own cognitive biases and using reflection in learning have been documented to make changes.
If you want perfect and 100% certainty then apply to be something simpler like a toaster.
But people are learning around you, using tools you are rejecting because there is some possibility they aren't working and you might just be fooling yourself if you think they are. And we might be brains in vats. And perhaps your job situation will never improve. And perhaps you will never meet someone you really get along with, just think you have for a while, so there's no need to look at who you are drawn to and how your dreamy romantic nature might be distorting the way you relate to romantic partners, and so on. Since it is hard to know for sure, there is no point in trying to improve.
Parents can't get better via reflecting on their parenting of their first child and talking to other parents because they may just think they got better.
Oh, and of course it is falsifiable. You can easily test to see if someone's political position affects what they notice in articles. If it doesn't, then there is no cognitive bias in those situations. And hundreds of different hypotheses about what certain beliefs, roles, statuses and more will lead to in relation to cognitive bias have all been well documented and could have been falsified.
You are not a psychological research center. And yeah, it's hard for us to research the way they can. That's our situation. I will bet you try to learn and change things in other areas of your life where the results may be misinterpreted by you, but you still go ahead and try to change: through being mentored, through sharing with friends, through reading and reflecting, through engaging in dialogues with people you disagree with, through confessing things, through trying new things.
I never disagreed with that.
Quoting Bylaw
It's really funny cause I created another thread to try and name this kind of behaviour. Wanting more certainty doesn't mean wanting 100% certainty. It's foolish to even think 100% certainty is possible to reach... Of course I never meant to reach 100% certainty in that topic...
Quoting Bylaw
It's falsifiable as a general concept; I don't have anything against the experiments. But the thing is, detecting it personally in someone at a specific time is much trickier. An experiment with one person being both the control and the test subject is kind of crappy... Say you read two articles, one contradicting your point of view and the other supporting it. What if you remember several things about the supporting one because it reminded you of something that shocked you in the past? Say you're scared of dolphins, and the supporting one has an analogy with dolphins... Your remembering it wouldn't be because of confirmation bias, although the result is the same. My point is, you can remember something in one specific article rather than another for many reasons, even unconscious ones; maybe you just liked the layout of one article more than the other, or something stupid like that...
Quoting Bylaw
You don't need the cognitive bias theory for that. If a person ignores some data because it contradicts their opinion, you could ask them how they take these data into account in their ways of thinking, and if they can't answer/don't want to, you know they have some emotional blockage with that. I find it much more efficient to lay all the data on the table, link it logically, and ask the person how they reached another conclusion, discussing the logical link. And if some data are ignored (because of the survivorship bias, because of the confirmation bias, or because unicorns are white), it's going to be visible and you can point it out to the person. Like this technique mentioned by @jgill. But who cares about the cause in the end? Who cares if you act like this because of some trauma from your past or whatever? Point is, if you force yourself to lay everything on the table, you're going to be more objective anyway, you're going to mitigate these "biases", even if you have no idea they even exist.
I honestly don't think psychoanalysis is a good tool for improving yourself and trying to detect cognitive biases in someone seems to have a lot of common grounds with it.
Hello Skalidris,
The way I understand your OP is as follows.
These are your claims:
Cognitive biases affect both the uncritical as well as the critical thinker.
Knowledge of biases doesn’t hinder the formation of ego traps in the critical thinker nor the uncritical one.
And finally, biases are unconscious, and can influence you regardless of your conscious knowledge of them.
Did I get that right?
Title question:
The answer you give in your OP is that you know a lot of people who do not use cognitive biases as a tool for critical thinking, but turn them into an ego trap. Which I see as concluding that critical thinking doesn't immunize against biases.
Did I get that right?
Now, I would have to assume a lot more if I were to work out how I could demonstrate the usefulness of critical thinking given your claims and your conclusion. So instead, I would prefer if you could tell me what, specifically, would count as valid examples, arguments or experiences to you. Moreover, it would also be great if you had some kind of baseline for what it would take to change your mind.
I hope that is useful to you, as I would like to answer if I can understand better what it is you want.
Caerulea-Lawrence
I would just say that I see cognitive biases as the result of a normal emotional response in humans, one that affects the "direction" of thoughts. I don't like putting people into two distinct groups, "critical" and "uncritical"; that's actually one of my critiques of cognitive bias theory. It seems to want to make people "critical", but most of the time it just tells people how to meet the standards of one discipline (often science) without taking personal experience into account. So it often ends up as "trust serious experts and don't try to understand things yourself, because you will be biased". And to them, the "critical" people are simply the ones who meet the standards of a discipline. So the term is meaningless to me.
In other words, to me it's not about being "critical" or "uncritical", it's about how much people fight emotional responses: the more they do, the more objective knowledge they will produce. And trying to be aware of cognitive biases is not going to help in fighting the influence of emotions on the rational mind, because you can never know whether you're biased. Working with emotions, on the contrary, is clearer. It's easier for me to judge whether I'm angry, defensive or happy than to ask myself "Was I biased by the success of the lottery winner?" How could you ever answer that? You can't trace back the unconscious reasoning. Even if you feel like buying a lottery ticket afterwards, how would you know whether that urge is biased? However, you could access your emotions, realise you're more excited than before you met the lottery winner, and decide to wait until that emotion fades to see if you still want to buy a ticket. Only then would you have a better estimate of how much it affected your thoughts at the time.
I used the term "ego trap" because a lot of people feel like they're more critical because they know these theories, while to me, it's the opposite. For example, they would judge the opponent's opinion as biased (which is very easy to do because they are so many biases, you can always find one) and reject it because of it. But they wouldn't be as rigorous when it comes to inspecting their own biases: we tend to notice the splinter in the other's eye, but don't notice the beam of wood in our own. And I don't see how any of the cognitive biases theory could help prevent this.
Quoting Caerulea-Lawrence
So that's not what I meant. To rephrase it, I would say: fighting emotional responses mitigates cognitive biases, because it attacks the cause. But fighting cognitive biases directly seems pointless, as they're unconscious and very difficult (impossible?) to perceive with certainty. So fighting cognitive biases directly doesn't help mitigate them!
And my main point is that fighting the emotional responses prevents cognitive biases, so naming them and finding "tools" to detect them seems useless and extremely complex compared to just assessing emotions.
Hello Skalidris,
I am grateful you clarified this for me.
In my own experience, what you say is very true. Biases in different forms mask themselves behind a pretty rationale or 'sound' logic. The clearest sign to me is not the logic, but the off-tune emotions it is expressed with.
There is also something else that works for me, and that is being asked open questions - directed towards exploring gaps in my conclusions or my motivations. It is important to me to follow my inner logic, a logic which also includes its own set of parameters. Good questions help me engage my inner troubleshooting tool, and let me more easily spot inconsistencies.
This kind of troubleshooting is often the first half of a process of getting closer to the jagged feelings underneath. It's like dusting or cleaning to get a better look at what the issue is, with the added bonus of softening the edges as well. Doing this is a complementary route for me, and it helps a lot in reducing direct reactivity around issues I find difficult, even when I am not ready to process the emotionality directly.
And getting care and attention from my partner is also something essential for me in delving into these things - or I might have just opted to ignore some of them. There is only so much self-compassion and self-understanding I am capable of. Getting compassion, space and understanding makes it a lot easier to take a closer look at the dangerous splinters, lest I keep projecting them onto other people in the form of beams of wood for the rest of my life.
Kindly,
Caerulea-Lawrence