EQ In Action – Reality Testing
One of my favourite quotes is by Mark Twain. It goes:
“It ain’t what you don’t know that will get you into trouble. It is what you know for sure that just ain’t so.”
I love this quote because it so perfectly encapsulates the essence of Reality Testing – one of the subscales in the Decision-Making composite of the EQ-i 2.0 Model of Emotional Intelligence (shown here).
Reality Testing is “the capacity to remain objective by seeing things as they really are. This capacity involves recognizing when emotions or personal bias can cause one to be less objective.”
As human beings, we are all prone to bias in the way we view things. These ‘cognitive biases’ influence how we see things, how we process information, and how we make decisions.
Definitions of cognitive bias, and lists of the biases themselves, vary by source, but it’s generally accepted that there are more than 175 identified biases covering a massive range of different areas.
We generally group cognitive biases together into the following categories:
- Too Much Information – For example, the Anchoring Effect, in which one piece of information learned early in a decision-making process becomes the basis of comparison for other, newer information. These ‘anchors’ prevent you from seeing new options or opportunities, some of which may be better than your original idea or thinking.
- Not Enough Meaning – For example the Confabulation Effect, which is a tendency to make up data to fill holes in memories of previous events, but without the intention to deceive. If you have ever been convinced about a previous experience you have had, and then later learned that some element of your memory was wrong, you have experienced the Confabulation Effect.
- Need to Act Fast – For example, the False Consensus Effect, through which people overestimate the extent to which others agree with or support their ideas or thinking. This is a common cognitive bias in organizations, where one department comes up with an approach to a challenge or opportunity and assumes that everyone else in the organization will see things the same way.
- What Should We Remember – For example, the Negativity Bias, through which people can have a stronger recall of negative events in their past than positive ones. This can lead people to negatively evaluate ideas or suggestions based on previous negative experiences that unfairly color the way they see the current situation.
One of the interesting things about cognitive biases is that, while we all have them, certain biases are stronger in some people than in others. I have come to refer to these as the dominant biases. For example, some people have a strong Confirmation Bias.
Confirmation Bias is the tendency to seek out, and attribute disproportionate value to, information that supports an existing perspective.
Put more practically, if you are a supporter of X political party, you are more likely to view information from that party as positive. You might even rationalize things that don’t really resonate with your own views, because you credit everything that comes from that political source as correct.
You can gain insight into what some of your dominant biases might be by looking back at past decision-making processes and asking yourself what you missed or got wrong.
Three Cognitive Biases That Hurt Leaders
Most of the work we do at The EQ Development Group is with leaders – helping them develop their emotional intelligence – and so we are particularly interested in how cognitive biases affect those in leadership roles. While any cognitive bias can play a significant part, some come up more often than others in our work, and a few can really hurt leaders. Here are just three examples:
- Dunning-Kruger Effect – This bias causes a person who is less effective or skilled at something to overestimate their ability, while, conversely, those who are very skilled or effective at something see themselves as less effective. If you have ever worked with leaders who are totally sure they have it all figured out, while those around them are having a different experience, that’s the Dunning-Kruger Effect at work. The converse is also true: many very effective leaders underestimate their abilities (potentially imposter syndrome), and that can hurt them.
- Reactive Devaluation – This bias makes a person see ideas, input, or suggestions as less valid if they come from someone they don’t like or respect. Even if that judgment of the person is valid, their suggestions are not met with openness or curiosity, and instead are dismissed out of hand.
- Curse of Knowledge – This bias prevents those with a higher level of understanding from being able to relate to others with less knowledge or experience. It can lead someone to think that another person ‘should just get it’, or that if someone can’t learn something as quickly as they can, that person is somehow slow or less skilled, when in reality neither is true – they just don’t have the context or experience.
If you can accept that all of us are susceptible to these biases, you should be interested in (or even fascinated by) how you can identify, and potentially overcome, these biases in your thinking. Doing so can help you become more effective at making decisions.
Here are 3 things I recommend:
- Ask other people what they think and LISTEN to their response – Other people may hold a slightly different, or wildly different, opinion on something than you do. They may have the same information but be viewing it from a different angle. Additionally, they are very likely to have a different set of dominant biases. Either way, it’s well worth trying to understand how and why they see things differently as a way of surfacing your (and their) biases.
- Subtract a bias – This is a practical exercise where you look at the biases you know you are particularly susceptible to, for example the Reactive Devaluation bias discussed earlier, and ask yourself, ‘What judgments am I making about the source or value of this information that might be affecting the way I look at the situation?’ Then, try ‘subtracting’ that bias from your own thinking (it’s not as easy as it sounds, but it’s surprisingly effective).
- Learn more about cognitive biases – The more you know about biases and what they do to your thought processes, the more likely you are to spot them when you are experiencing them, and to even out your thinking. There is a great list of cognitive biases on Wikipedia, and on that same page you can find a codex of known cognitive biases which (provided your eyesight is good) can make for an interesting poster for your wall – a reminder that you shouldn’t necessarily just trust your brain.
Can You Have Too Much Reality Testing?
One last thought – just as with any other aspect of emotional intelligence, you can have too much Reality Testing. You may have heard the term ‘paralysis by analysis’ – that’s a good way of understanding what can happen when you have too much Reality Testing.
Paralysis by analysis happens when someone with a very high level of Reality Testing constantly seeks new or confirming information during a decision process, or tries to discount or negate a decision because of an unknown bias.
By constantly seeking this new information, people with a very high level of Reality Testing can delay making a decision for too long. Ultimately, we are all susceptible to cognitive biases, and every decision or action we take is likely to be affected in some way, small or large, by these biases.
For anyone, understanding the role of Reality Testing in your day-to-day activities is critical to emotional effectiveness. If, after reading this far, you are still not convinced that cognitive biases play a large role in human decision-making, you might want to check out the Bias Blind Spot.
People who experience the Bias Blind Spot believe that while other people are susceptible to bias, it does not affect them. Put another way, there is a known cognitive bias about denying cognitive bias. I think I just lost some brain cells trying to solve that conundrum!