The Banal Fallacy
“It is more from carelessness about truth than from intentional lying that there is so much falsehood in the world.” ―Samuel Johnson
I read a lot, and a lot of what I read is not particularly memorable, edifying or inspiring. However, it sometimes comes to pass that I find the perfect soft-cover to read while I’m waiting for my damn game to load. Anyways, I was at the local McNally Robinson during final exam season, looking for some light reading, and I stumbled upon two books which seemed useful: Daniel Goleman’s Social Intelligence, and Daniel Gilbert’s Stumbling on Happiness. As it’s been my humble, humble intention to write about the information world, I figured “there’s no better place to start than the human brain”, and these two books fit the bill. I don’t really remember what I read in Social Intelligence, although I do remember it was quite memorable; but, as it happens, I picked up Gilbert’s book not long after I had writer’s block on this very subject.
Sage that he was, Samuel Johnson too had his occasional lapses from factual accuracy. In that respect, he is firmly in the camp of the multitudes: 99.99% of us make the occasional mistake when relating or remembering things we see, experience or hear. Although most of us see ourselves as paragons of truth-telling―I do myself―perhaps this essay will shed a useful light on some of the mechanisms our truth depends upon and, perhaps, erode an iota of that certainty―if only for as much time as it takes the brain to fabricate a rebuttal to my arguments.
I call this ‘carelessness about truth’, as Johnson would have it, the ‘banal fallacy’. Banal fallacies have a great deal of impact on societies by reducing the efficiency and accuracy of information transmission: bad information can lead to the election of the wrong politician, or to a misdiagnosis. It can also lead to broken friendships, stupid business deals, and fraud.
So, the question is, “why does he have to come down so hard on people, and accuse them of falsehood, blatant misrepresentation of facts, and generally being bad citizens?” Well, the answer is this: I’m not making a moral argument (per se), and I’m not going to come down hard on you at all. It’s something we all do, and it’s banal for that very reason. Typically, fallacies have three sources: properties of the brain, communication, and error. Very little has changed in that respect over thousands of years, other than the volume of information whose source we are not even casually acquainted with. In this regard, the ever-widening sphere of human knowledge, married to our finite capacities and imperfect tendencies, does seem to bring new scope to foolishness. Before I introduce you to the brain, and some things you may not have known (although you may have been aware of them), I should probably explain why the fallacy, banal or not, is not necessarily a good thing.
In human society, the question of factuality, the abstraction’s relationship to its object, is crucial. Without the ability to generate or warehouse accurate (factual) information, it becomes impossible to make meaningful comparisons between activities or objects. Bereft of this capability, it would be simply impossible for a group of people to develop tools, build a society, or even survive. The degree of factuality is instrumental in determining the reliability and relevance of feedback―a system’s output forming part of the input fed back in―and, therefore, in determining the performance of the system as a whole (feedback is a mechanism for controlling a system). A good example: a common calculator takes inputs―punched-in numbers and formulae―and passes them through its processing unit (containing the internal logic) to produce an output. If the calculator is not correctly programmed, it will process inputs such as 3 × 5 into a non-factual output, such as 16. For those of us who aren’t convinced that three times five equals sixteen, factuality would seem very important: arguably it makes a huge difference whether we have $100 or $1000, or whether the lion is dead or just sleeping.
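The calculator analogy can be made concrete with a toy program. Everything here is invented for illustration: a “miscalibrated” processing unit yields a non-factual output, and any system that feeds on that output inherits the error.

```python
# Toy illustration of the calculator analogy: the same inputs pass
# through a correct and a miscalibrated processing unit.

def correct_multiply(a, b):
    # A correctly programmed processing unit.
    return a * b

def broken_multiply(a, b):
    # A miscalibrated unit: an off-by-one error in its internal logic.
    return a * b + 1

print(correct_multiply(3, 5))  # factual output: 15
print(broken_multiply(3, 5))   # non-factual output: 16

# Feedback compounds the problem: downstream decisions built on the
# non-factual output are themselves non-factual.
balance = broken_multiply(3, 5) * 100
print(balance)  # we now believe we have $1600 rather than $1500
```

The point of the sketch is the last two lines: the error is not contained in the faulty unit but propagates through everything fed by it.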
Now that we know just how important accurate, factual information is, I can get to demonstrating how little our brains care for it. Let’s begin with the three vehicles by which much of our ‘carelessness about truth’ is accomplished. They are, in no particular order: filling in, leaving out, and selective sampling. The first, and least obvious, vehicle, filling in, is done without the knowledge of the conscious mind.
Filling in is a pretty universal mental phenomenon―although some autists may be less likely to do it some of the time―but it is also unconscious, and so most of us are blissfully unaware of its occurrence. It is actually a biological necessity: our brains, no matter how marvelous they may be, have limited space available to store the information they are constantly being bombarded with. So, in order to conserve space, the brain compresses experiences into salient facts, and when the time comes to remember them, fills in with ‘made up’ details. As Daniel Gilbert, author of Stumbling on Happiness, notes, “The elaborate tapestry of our existence is not stored in memory―at least not in its entirety. Rather, it is compressed for storage by first being reduced to a few critical threads, such as a summary phrase, or small set of key features. Later, when we want to remember our experience, our brains quickly reweave the tapestry by fabricating―not actually retrieving―the bulk of the information that we experience as a memory.” So, our brains seem to have a tendency to remember things that didn’t happen, particularly when we summon the same memory more than once. Unfortunately, the brain is also complicit in another mistake: forgetting things that didn’t happen.
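Gilbert’s compress-then-reweave account resembles lossy compression in software: store only a few key features, then reconstruct the rest from plausible defaults at recall time. The sketch below is my own loose analogy, not anything from the book; every name and value in it is made up.

```python
# A software analogy for "filling in": store only the salient threads,
# then reweave a full memory from generic defaults at recall time.

DEFAULTS = {"weather": "sunny", "mood": "good", "crowd": "small"}  # plausible filler

def compress(experience):
    # Keep only the "critical threads" (here, arbitrarily: event and place).
    return {k: v for k, v in experience.items() if k in ("event", "place")}

def recall(stored):
    # Reweave the tapestry: fabricated defaults fill the gaps.
    memory = dict(DEFAULTS)
    memory.update(stored)
    return memory

lived = {"event": "exam", "place": "library", "weather": "stormy", "mood": "anxious"}
remembered = recall(compress(lived))
print(remembered)  # the storm and the anxiety are gone, replaced by filler
```

Note that `recall` reports a complete, confident memory; nothing in its output marks which details were retrieved and which were fabricated. That, roughly, is the unsettling part of the real phenomenon.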
Donald Rumsfeld was harshly criticized for his “known unknowns” speech. But, what if he had a point?
From the perspective of the human mind, the ‘unknown unknown’ is a value that doesn’t reside in memory, either because the event never happened or because the brain never registered that it did. Gilbert states, “…studies show that when ordinary people want to know whether two things are causally related, they routinely search for, attend to, consider, and remember information about what did happen, and fail to search for, attend to, consider, and remember information about what did not…in other words, we fail to consider how much imagination fills in, but we also fail to consider how much it leaves out.” Our predictions for the future tend to be flawed for this reason: we base our expectations on what we know has happened, and so they tend to say more about us, our time and our place than they do about the next. When we aren’t dreaming about an idyllic future, we may be trying to make the best of our present; and that requires us to be a little selective in the information we give credence to.
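The causal-judgment error Gilbert describes maps neatly onto a 2×2 contingency table: we count the cell where both things happened and neglect the cells where they didn’t. The numbers below are invented for illustration.

```python
# Judging whether a remedy causes recovery. People attend to cell a
# ("took it and recovered") and neglect the did-not-happen cells.
#
#                  recovered   not recovered
#  took remedy        a=30          b=10
#  no remedy          c=30          d=10

a, b, c, d = 30, 10, 30, 10

naive_evidence = a            # "30 people took it and got better -- it works!"
rate_with = a / (a + b)       # recovery rate among takers: 0.75
rate_without = c / (c + d)    # recovery rate among non-takers: also 0.75

print(naive_evidence, rate_with, rate_without)
# The full table shows the remedy made no difference at all.
```

The fallacy lives entirely in ignoring rows `c` and `d`: the information about what did not happen.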
Selective sampling is a time-honoured human tradition designed, it would seem, to give us a more favourable impression of ourselves and our ideas than they truly merit. We should probably be thankful for it: if we didn’t have this capacity for self-deception, we might all be extremely depressed. After all, by definition, half of us are below average―and let’s not pretend that ‘average’ sets the bar high. In order to have positive views of ourselves, and so function effectively, we need to believe that we really stand out. This, of course, presents a quandary, because much of the information we receive and process should tell us otherwise. The conscious mind isn’t so undiscerning that it can simply ignore overwhelming negative evidence; and if we can’t disregard that evidence, we might be compelled to accept it. Either would be bad. “When we cook facts, we are similarly unaware of why we are doing it, and this turns out to be a good thing, because deliberate attempts to generate positive views contain the seeds of their own destruction,” says Gilbert, “…when volunteers in one study were told that they’d scored poorly on an intelligence test and were then given an opportunity to peruse newspaper articles about IQ tests, they spent more time reading articles questioning the validity of such tests than articles which sanctioned them…by controlling the sample of information to which they were exposed, these people indirectly controlled the conclusions they would draw.” It’s simple really: we can avoid tough realizations by subconsciously selecting for information which affirms us. Once we come to realize that the human mind isn’t exclusively geared to truth-telling, even to its conscious self, it shouldn’t be hard to accept that it will pass on or accept inaccurate information.
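The mechanism in Gilbert’s IQ-test study is just biased sampling: the conclusion is honest arithmetic over a dishonest sample. A few invented numbers make the point.

```python
# Selective sampling: the average is computed correctly -- over the
# reviews we chose to reread. All scores are invented.

feedback = [2, 3, 9, 8, 4, 3, 9, 2]  # scores out of 10

honest = sum(feedback) / len(feedback)

# The subconscious filter: we linger on the flattering items.
flattering = [s for s in feedback if s >= 8]
selected = sum(flattering) / len(flattering)

print(honest)    # 5.0 -- dead average
print(selected)  # ~8.67 -- clearly we stand out
```

Nothing downstream of the filter is dishonest, which is why the conclusion feels earned; the cooking happened earlier, in the choice of sample.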
Societies, organizations or complex organisms all exhibit degrees of specialization of function. Specialization of function, to be successful, requires the coordination of disparate functions, and coordination is achieved through the transfer of information―communication. If the wrong information is communicated, dysfunction results. Wrong information can be transmitted due to the failings of the medium, such as interference, or due to misinformation in the original message. The transmission of a fallacious notion from one person to another is not a particularly mind-bending process, and doesn’t require much in the way of investigation. It can be done consciously, to manipulate other people’s perceptions, or unconsciously…because the people involved simply don’t know any better. What is interesting is how often fallacies spread with greater ease than the truth, showing a great deal of resilience to boot. For example, mistaken ideas about areas where the intended audience doesn’t have a great deal of knowledge, and ideas which make people feel better about themselves, often gain acceptance.
Mistakes happen: they’re a fact of life. Without mistakes, we wouldn’t be able to make smarter mistakes, and, eventually, find solutions to problems. We should probably thank our lucky stars that we’re a fairly advanced-model organism, and that many of the mistakes that made us have already been made. So, with only a little bit of information stored in us to begin with, and a great deal of capacity for storing and gathering―never mind processing―information, it’s not surprising that there are a lot of things we don’t respond to in the best way possible the first time: we don’t understand them because we have no knowledge of them. Our ability to imagine a plausible scenario is a function of our memory of something similar; in fact, the ability to store information in our memory the way we do would be a rather pointless one if we always made the right decision to begin with. So, people make mistakes in interpreting the information they do have, and this is a fertile source of fallacious ideas. It’s a universal problem.
If people have brains which often seem to wish to misinform us, if we have minds which are only as good as what they already know (and rarely that good), and if communicating bad information is easy, why is it that our societies function fairly well, our lives last fairly long, and we’re still in that relationship? Why is it that banal fallacies have not completely hamstrung humankind? Truth be told, banal fallacies, common as they are, are, individually, short-lived phenomena (in the great scheme of things). As much as I seem to have painted a bleak picture of the world of information, I’ve only told a fraction of the story. The human mind and human societies have developed―some better than others―mechanisms to resist and to correct fallacious ideas. Our ability to capture experiences in written language, and now in audio and video, allows us a much clearer and more detailed view of prior events. Communications technologies are constantly increasing the effectiveness of the medium, both in preserving the integrity of the message and in increasing informational density. Most large organizations are aware of our tendency to sample selectively―and scientists certainly are―as demonstrated by practices in the laboratory or in the HR department of a large multinational. Inevitably, the information we receive and process will never fully reflect the state of the world at a given moment; but I have confidence that, over time, the lifespan and scope of the banal fallacy will fall.