Doc Hall returns to our newsletter with a thought-provoking three-part essay on the risk of thinking we know the truth of things. Part I challenges us to make the effort to be lifelong learners.

Online technology multiplies opportunities to practice deception. Every morning before breakfast I trash a half-dozen or more e-mail scams. Fifty years ago, no one began scamming you until you were awake.
Scams are deceptions. By Harry Frankfurt’s definition, deliberate deception is a lie. Bullshit, by contrast, merely makes up “facts and stories” to persuade – land a sale, get a vote, impress somebody at a bar. Veracity is irrelevant to successful persuasion.
A lie is not telling the truth. But can we ever know “truth,” and if so, can we describe it? It turns out that we can’t ever know the truth – reality itself. We’re incapable of it.
Deep thinkers thrashing through their philosophical weeds have questioned whether reality exists ever since Plato noted that if it does, we can never see it directly. He likened us to cave dwellers who cannot see outside and must infer (guess) what is happening out there from shadows cast on the cave wall. But shadows lend themselves to the arts of deception.
Today, physiology explains the same thing. Humans perceive only a smidge of the phenomena in which we are immersed. For example, we see only the visible spectrum, a sliver of the total electromagnetic spectrum; to guess at the rest, instruments must translate the invisible into something we can sense. Even within our direct senses, we focus on whatever draws our attention and ignore the rest. Our perceptual limitations force us to be selective.
It’s as if each of us were born into our own little mental prison, shaped by unique life experience. No two of us see exactly the same shadows on our walls. To learn something new – to guess closer to reality – we must fight our own mental and physical limitations.
When we sense a broad spectrum of information, overload befuddles our brains. It’s like sorting 500 e-mails a day for a few morsels hidden in the chaff. To cope, we pre-decide what we will pay attention to and what we won’t, and we tune a spam filter to help. However, filtering risks missing important messages that don’t fit our preconceptions. Inescapably, we’re preconception-biased; all of us; no exceptions. Only fools think they are the exception. Nobody escapes being fooled.
Interacting with nature and other people makes our perceptual feedback loops complex (does she think that I think…?). Our preconceptions clash. Fully grasping the workings of nature or the behavior of others is hopeless, and many of us barely try. We are perceptually lazy. We prefer people – and news – with biases similar to our own. Then we can relax.
Although we can never see reality, technology can get us closer to it. A farmer who monitors soil temperature, moisture, pH, and composition has more data from which to exercise intuition. When meeting other people, we sense intent and trustworthiness. Will they do what they say? Can they? Here persuasion by deception begins. For example, in budget negotiations in large organizations, a player must open with a big lie; otherwise she’ll probably lose – and fail her department. The best liar wins; that’s why we lie. Technical problems are “tame problems.” Clashes of human intent set up “wicked problems.”
But in any kind of problem solving, we struggle against the instinct to conserve brain energy. Blame somebody else. Invent a hokum story. Honest searches to get closer to truth (reality) must fight mental and emotional laziness. Science battles it all the time. The rest of society often seems disinclined even to consider letting “truth” be the only winner.