Having logic-checked current events using words and diagrams, we can now turn to the role numbers play when reasoning about the news (or any other issue worth arguing over).
This year’s global pandemic has exposed many of the shortcomings and biases people bring to news and arguments based on quantitative information. And by bias, I’m not just talking about the tendency to notice or believe state-level COVID stats based on the party affiliation of that state’s governor. Rather, I’m talking about the biases and flawed reasoning that result from the human tendency to put potentially unwarranted faith in arguments built on numeric data.
For this next set of posts on numerical reasoning (including numerical fallacies), I’m tapping Charles Seife’s 2011 book Proofiness: How You’re Being Fooled by the Numbers (as well as a chapter on this subject from Critical Voter) for insights into the psychological and cultural factors behind our near-religious faith in quantitative information.
In Western culture, this faith goes back to the Ancient Greeks. While most students know the early mathematician Pythagoras only through his famous geometry formula a² + b² = c² (with a and b being the lengths of the legs of a right triangle and c the length of the hypotenuse), Pythagoras was also a philosopher (some would say a cult leader) whose followers joined him in believing in the near-mystical power of numbers, mathematics, and geometry.
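To see the formula at work, consider the classic 3-4-5 right triangle: legs of 3 and 4 give 3² + 4² = 9 + 16 = 25 = 5², so the hypotenuse comes out to exactly 5.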
Some stories regarding Pythagoras are obviously myths, such as those claiming he had a thigh made of gold and could dematerialize and perform other feats of magic. But the historical Pythagoras, or at least his philosophy, certainly seems to have influenced Plato, whose own philosophy can be seen as a quest to find equivalents in human life to the perfection of numbers.
For when you think about it, numbers are not only perfect but may be the only perfect things we encounter in life. Words change their meanings, people their natures, mountains can be leveled by an earthquake, and great lakes can dry up, but two plus two will always equal four. That is true wherever you live and whatever culture you belong to, and it will continue to be true even if you choose not to believe it.
I don’t think it was an accident that, in the novel 1984, George Orwell made the ultimate test of total submission to Big Brother getting someone to both swear and truly believe that 2 + 2 equals 5, since this represents the denial of one of the few absolute truths we can conceive of as human beings.
A problem arises, however, once you take a number out of its abstract realm of perfection and drag it down to the real world by attaching a unit to it.
As Seife points out, 2 + 2 may equal 4, but what happens when you add the lengths of two pieces of wood, each of which measures two feet? Well, if you’ve got a good enough ruler, it will turn out that neither piece is exactly two feet long. Both will be slightly longer or shorter, even if only by a fraction of a fraction of an inch. This means that if you add these alleged twos together, you won’t get four; you’ll get a number slightly more or less than four.
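To make that concrete, here is a minimal Python sketch (my own illustration, not code from Seife’s book) that simulates measuring two nominally two-foot boards with a small random measurement error and then adding the readings; the board lengths and error size are invented for the example:

```python
import random

def measure_board(true_length_ft, error_ft=0.01):
    """Simulate one measurement: the reading is the true length
    plus or minus a small random error (up to about 1/8 inch)."""
    return true_length_ft + random.uniform(-error_ft, error_ft)

# No real board is exactly two feet long, so give each one a
# "true" length that only approximates 2.0 (made-up values).
board_a = 2.0013
board_b = 1.9992

total = measure_board(board_a) + measure_board(board_b)
print(f'"2 ft" + "2 ft" = {total:.4f} ft')  # almost never exactly 4.0000
```

Run it a few times and the total hovers near four feet without ever landing exactly on it: the moment a number acquires a unit, it also acquires uncertainty.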
At first, this might seem like a ridiculous concern. After all, we have to accept a certain level of inexactitude as irrelevant if we want to measure a piece of wood to build a cabinet, or to accomplish most of the other things we do in our lives that involve measurement and practical arithmetic. But what happens when the level of uncertainty inherent in applying numbers to real things becomes not just significant but critical?
Next up – Infection Rates and Uncertainty