
Bias


Previously, I described how the human mind tends to take shortcuts in order to make sense of new information we receive from our senses, or new ideas we are trying to understand. For example, rather than start from scratch when presented with new experiences and arguments, we tend to try to fit what’s new into existing storylines (such as the competing storylines presented during last month’s impeachment hearings).


While poor judgement is often attributed to emotions overwhelming reason, the mental shortcuts we take all the time (referred to as heuristics) represent flaws within reasoning itself.


Beginning in the late 1960s, the pioneering researchers Amos Tversky and Daniel Kahneman developed a set of intriguing experiments that allowed them to identify how our use of heuristics can cause reason to go awry. For example, people can easily be manipulated when making numerical judgments simply by being given a value to consider before forming their own estimates.


Cribbing from their work, if you ask a group of students whether Mahatma Gandhi was 150 years old when he died, likely all of them would say no. Similarly, if you asked another group whether Gandhi died at age 12, they would all know that number was wrong. But if, after proposing those outlandishly large or small numbers, you asked each group to guess how old Gandhi actually was when he died, the group given 150 to start with would, on average, give a higher number than the group first asked if Gandhi died at 12.
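To make the pattern concrete, here is a minimal simulation sketch (my own illustration, not part of Tversky and Kahneman's experiments). It assumes each guess gets pulled partway toward whatever anchor a participant heard first; the pull strength, noise level, and group size are hypothetical values chosen only to show how the two group averages land on opposite sides of the true answer.

```python
# A minimal sketch of an anchoring experiment like the Gandhi example.
# The effect size, noise level, and group size are hypothetical values
# chosen only to illustrate the pattern, not to reproduce any real data.
import random

random.seed(0)

TRUE_AGE = 78               # Gandhi's actual age at death
PULL_TOWARD_ANCHOR = 0.25   # hypothetical strength of the anchoring effect
NOISE = 10                  # hypothetical spread of individual guesses
GROUP_SIZE = 100

def guess(anchor):
    """One participant's estimate: an unbiased guess pulled toward the anchor."""
    unbiased = random.gauss(TRUE_AGE, NOISE)
    return unbiased + PULL_TOWARD_ANCHOR * (anchor - unbiased)

high_anchor_group = [guess(150) for _ in range(GROUP_SIZE)]
low_anchor_group = [guess(12) for _ in range(GROUP_SIZE)]

print("Average guess after hearing 150:", round(sum(high_anchor_group) / GROUP_SIZE, 1))
print("Average guess after hearing 12: ", round(sum(low_anchor_group) / GROUP_SIZE, 1))
# The high-anchor group's average lands above the low-anchor group's,
# even though both groups rejected the anchors as obviously wrong.
```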


This is called the anchoring heuristic, or anchoring bias, because numbers you are exposed to before making your own numeric evaluations can anchor your perception, causing you to build your own quantitative evaluation around that anchor. This is why the asking price of a house anchors your perception of the house's actual value. It is also why President Trump is asking voters to consider certain numbers (such as growth of the stock market or low unemployment) while Democratic candidates ask you to consider different data (such as increases in wealth disparity) when evaluating the health of the economy.


Anchoring is just one of dozens of cognitive biases that can distort our reasoning, all of which can be viewed through a model Kahneman proposed of the human mind as divided into two “processes,” one fast and one slow. The fast process is the one we use most of the time to make sense of what we see and hear (such as the words making up an argument). One of the reasons we can process this information so quickly is that our fast process streamlines things, using heuristics to make novel data seem more familiar.


In contrast, our slow process is the one we use to perform more complex mental tasks, such as multiplying 24 x 59 in our heads (give it a try to see how different that experience is from your normal flow of thinking). This slow process is also the one we use when we try to change our minds about something, which often involves rejecting an existing storyline in our heads, or replacing it with a new one.


Of all the biases that can impact individuals (and through individuals, society), the most powerful and potentially destructive is confirmation bias: the tendency to accept facts and arguments that fit with what we already believe, and reject facts and arguments that do not.


This is what is at work when we do not simply disagree with what someone from a rival political party has to say, but react to whatever is coming out of their mouth with disgust. If you have ever found yourself agreeing with someone, only to reject what you just accepted once you discovered that person voted for the candidate you can’t stand in the last election, that’s an example of confirmation bias making your mind up for you.


Unfortunately, our current age allows us not just to contend with confirmation bias, but to wallow in it. Acceleration of this phenomenon began when cable news stations began to cater to particular prejudices, creating a nation divided between Fox News and MSNBC viewers with completely different perceptions of current events. The age of confirmation bias kicked into even higher gear with the advent of social media, which allows us to construct custom news feeds and interact with communities in which our biases are constantly reinforced and our prejudices never challenged.


The existence of confirmation bias should not be taken as a claim that holding strong beliefs, including strong partisan beliefs, is never justified. We all need ways to navigate the world, and partisan affiliation involves a shared embrace of values regarding social, economic and moral affairs that is often completely justified.


Problems arise, however, when those affiliations become our primary means for filtering good from bad information or determining if arguments for and against a policy are strong or weak.


The logic-checking processes you have been introduced to on this site provide powerful tools for controlling for confirmation bias and other cognitive flaws in our human makeup. But their usefulness can also be undermined by the very biases they have the power to correct.


For example, selecting and evaluating the premises in a logical argument requires giving those premises a fair hearing. Similarly, if we are to accurately evaluate the strength of the connection between those premises and a conclusion, we must avoid being predisposed to a particular outcome before that analysis begins.


Science is often held up as one of the most successful forms of human reasoning ever invented. But if you consider science not as a single method, but rather as a set of processes that are part of a culture designed to diminish, if only slightly, the confirmation bias all of us (including scientists) are susceptible to, you can begin to see the huge payoff that derives from thinking a bit more, and believing things simply because you want to believe them a bit less.
