Robert D. Kirvel

Enjoy our WTP Spotlights, notable selections featuring artists and writers from our Woven Tale Press magazine. To read the issue in full, subscribe; you can also register on our site to enjoy our archive.

Robert D. Kirvel is a two-time Pushcart Prize nominee and a Best of the Net nominee for fiction. Awards include the Chautauqua 2017 Editor’s Prize, the 2016 Fulton Prize for the Short Story, and a 2015 ArtPrize for creative nonfiction. His novel, Shooting the Wire, was published in August 2019 by Eyewear Publishing Ltd., London.

The Confidence Paradox:
Why Popular Views Are So Often Wrong

From WTP Vol. VIII #2

“Ignorance more frequently begets confidence than does knowledge.” —Introduction to The Descent of Man, Charles Darwin

“In the modern world the stupid are cocksure while the intelligent are full of doubt.” —“The Triumph of Stupidity,” in Mortals and Others, Bertrand Russell

Angels and auras, glutens and GMOs, assault weapons and Amendment 2, healing crystals and pyramid power, transgenders and gene editing. We might think we know little or a lot about such topics, but what’s the objective reality behind all we believe we know about ourselves and the world?

One form of thinking about thinking, called metacognition, is the ability to step back and look with some objectivity at our own behavior and abilities, and to distinguish accurate judgment from error. Then again, is it really possible to be objective? A potential path is through critical thinking, the intellectual process of analyzing and evaluating, then applying information from observation, experience, or reasoning as a guide to thought and action. An antagonist to objective and critical thought is belief, which often invokes emotional reactions such as fear, rather than logic, to distort what we suppose is true or untrue.

Growing up in the Midwestern US, I remember hearing people claim with emotional conviction that the waters of the then heavily polluted Lake Erie caused polio whereas smoking carried health benefits, and America was deemed a unique land of peace and democracy. At the same time, more than a few politicians and newspaper headlines insisted communists had overrun Washington DC and Hollywood. Homosexuals were arrested as perverts who willfully chose their perversion, recruited and molested little boys, and flagrantly courted the wrath of The Almighty. On many emotional topics back then, little ambiguity tempered what folks believed and said. Oh, and by the way, let’s all have one more for the road, shall we?

People following the political news on TV and social media today might be forgiven for supposing self-doubt is in serious decline once again across the home of the brave, or at least at the United States Capitol, where certainty bores tunnels through credibility. A national refrain these days seems to honor binary thinking. “I know what you don’t know. End of discussion.” A problem with this way of thinking is that those who think they know what they do not know, don’t know they don’t know it.

One way to understand hollow certainty fueled by emotion and a decline in objectivity centers on a type of distorted reasoning called cognitive bias, which can take various forms. It’s not a question of whether we as individuals have any mental filters or biases—and if we do, how abnormal we are—but rather how many and how strong the biases are within every one of us as part of the normal condition of being human. Here’s an eye-opening proposition about one type of cognitive bias: the more we actually know about the real world, the less confident we tend to be in our beliefs and conclusions. Conversely, the less we know in a particular domain, the more confident we tend to be in our beliefs and conclusions. Two psychologists at Cornell University, David Dunning and Justin Kruger, published research describing this paradox, especially the difficulty incompetent people have in recognizing their own incompetence.

The Dunning–Kruger effect, as it has become known, is a form of cognitive bias in which people of low ability miscalibrate their own ability and performance. They are convinced of their illusory superiority over others who are demonstrably superior, and they mistakenly overestimate their cognitive ability relative to objective criteria. People of relatively high ability, however, tend to underestimate their ability. How is that possible? One suggestion is that novices or low-ability people do not possess the skills to recognize their incompetence or, to put it in blunt terms, fools are blind to their own foolishness, so incompetent people do not realize how inept they really are. With only a simple idea of how things stand, the tendency is to be overconfident and unaware, and here’s the shocker: ignorance, to the ignorant, can feel just like expertise. Furthermore, those with narrow vision seek and adore certainty but often don’t receive or incorporate much feedback. Thus, the tendency to believe, “I’m right and you’re wrong, so just be quiet.”

Perhaps someone in the public sphere comes to mind? Or maybe someone in your private life fits the bill?

A related bias mechanism operates in the opposite direction in above-average folks. The more a competent individual knows, the less confident the person is likely to be, within limits. Of course, there are exceptions, such as celebrated experts or geniuses who are highly verbal, but in the main, people who pursue a particular topic in depth appreciate how much they still do not know and how much remains to be discovered, so they tend to underestimate their knowledge and ability.

Take the example of someone who decides to pursue a science degree leading to research on vertebrate vision, that is, on how we are able to see the world. The individual might finish a college degree in chemistry, for example, then concentrate in graduate school on neuroanatomy and physiology. By focusing on the visual system of the human brain, the area of study narrows even more to specific pathways from the optic nerves through the thalamus and the midbrain’s superior colliculi, then to Area 17 of the human cortex. Specialization might proceed to narrower frames of reference, including synaptic electrical impulses and neurotransmitter interactions key to understanding visual nerve signals, perhaps down to the level of molecules. If you asked such a student back in tenth-grade biology to describe how we are able to see, you might get a reasonably confident answer about the eyeball and retina. However, ask the same person as a mid-career researcher, and you might encounter reluctance even to begin to address the question. I can vouch for the last statement because I was the student just described. Indeed, when pursuing any topic toward the limits of human knowledge, it’s tempting to conclude we as human beings know relatively little at present compared with what has yet to be explained. Genuine exploration in depth—learning more and more about less and less—often teaches modesty rather than certainty, along with the idea that a certain amount of confusion can be productive rather than undesirable.

Intelligence in human beings has been defined in various ways, sometimes controversially, but whatever the definition, the capacity spans a wide range across humanity and is often expressed as a score. With a value of 100 at dead center (the mean IQ), half the population by definition scores below 100 and half scores above it. Regardless of the merits or faults associated with tagging intelligence with a number, humans clearly vary widely in that capacity. If intelligence is correlated with an ability to recognize one’s own cognitive capacity or task performance—in a word, metacognition—the Dunning–Kruger effect might be extended to suggest that those with relatively lower intelligence are likely to be more convinced of their illusory superiority and the accuracy of their mistaken views than those with relatively greater intelligence, who are in general more reluctant to sound off with assurance. If that proposition is correct, then one can begin to understand how mistaken views often become more widely expressed and disseminated than expert knowledge.

Let’s take a few examples of things people think they know—or say they believe—to understand how the confidence paradox plays out in everyday life. The situation often starts with belief bias. Rather than considering the actual merits or complexities of a proposition, belief bias prompts individuals to rationalize almost any information to support a pre-existing belief, and that can be risky. It’s one thing to dabble in aromatherapy or to fantasize about imaginary auras for fun, but it’s downright foolish or dangerous to infer that herbal “detoxification” or magic crystals hold a cure for cancer or dementia because of a belief that true cures must be “natural.”

We’ve all heard one of the most famous phrases in advertising history about milk being good for the body (Got milk?) and, more recently, about glutens being bad for our health. To what extent do we believe the pitches? Sales-targeting by advertisers is so effective that it can lead not only to belief but also to emotional connections with, or reactions against, a given commodity, dietary or otherwise. Think about the implied connection, marketed to young men, between hot sex and racy sports cars, for instance. It’s more than happenstance that from a young age, most kids exposed to television have heard and seen ubiquitous ads claiming dairy products do a growing body good. In fact, the National Dairy Council has been touting the health benefits of dairy foods practically since its founding in 1915. Is it any wonder ordinary individuals who are nonexperts in the science of nutrition tend to believe drinking cow’s milk is beneficial, and that milk flies off the shelves of supermarkets across the United States? Ask a well-intentioned mom or pop to justify their beliefs about milk, and you’re likely to hear something about how milk is just plain good for you; everybody knows that.

But what about human allergies to dairy milk? What about the high levels of cholesterol and artery-clogging saturated fat in cow’s milk and cheese, which have been linked to coronary heart disease, stroke, and other cardiovascular disease? Or consider the estimated 65 to 75 percent of the world’s population that is lactose intolerant (a condition far less widespread in Northern European gene pools), let alone the fact that cow’s milk evolved as a nutrient for calves rather than for human babies, to say nothing of human adults. The reality is that dairy products—promoted by some health organizations as beneficial and disparaged by others as planet- and people-destroying bilge—are controversial and confusing. The bottom line is that the health effects associated with consuming dairy products vary widely among individuals, and the jury is still out on a host of potential plusses and minuses.

What about glutens then? Is the consumption of glutens as unhealthy as some people swear? Ask the most ardent detractors what, exactly, a gluten is and many will be unable to explain even basic facts. The word itself conjures an image of a chunky-bottomed overeater, so the stuff must be bad for you, right? In fact, gluten is a mixture of two proteins in grains such as wheat, barley, and rye, and in susceptible people it triggers the autoimmune disorder of celiac disease, which damages the wall of the small intestine. However, according to the Mayo Clinic, a source of health information more reliable than many others on the internet, scant research has been done on the health benefits (such as weight loss or improved health and athletic performance) of gluten freedom in the majority of people who do not suffer from a gluten-related medical condition. One of many possibilities is that those who identify as gluten-sensitive are reacting to incompletely absorbed carbs (called FODMAPs, for fermentable oligo-, di-, and monosaccharides and polyols) and not gluten at all.

Similar faults in reasoning apply to opinions about genetically modified organisms, or GMOs for short. Individuals adamantly opposed to their production and consumption as food are unlikely to appreciate the widespread reliance of humankind on genetic selection and modification through history. Genetic modification in plants today involves inserting a specific stretch of DNA into the genome of one plant species, giving it some new and presumably desirable characteristic, such as increased yield or resistance to disease. Genes are introduced into plant cells either by coating them onto gold or tungsten particles that are physically shot into recipient cells or by using a bacterium to carry them in. The modified cells are usually grown in culture so they can develop into mature plants and produce seeds inheriting the new DNA. Humans have been genetically selecting and selectively breeding crops and animals to modify them for various reasons for thousands of years. It’s true that in the past crossbreeding was limited to similar species, whereas now that constraint no longer applies, and the mechanism differs. In addition, perhaps some antagonists are reminded of the horrors of human eugenics during WWII when contemplating the idea, or “slippery slope,” of genetic modification in plants today. In any event, heated debate on such topics often invokes political, economic, moral, ethical, and religious arguments, and the paradox is that the most pertinent facts, the scientific ones, are often misunderstood or ignored. All too often, disputes arise among people full of intense feeling but little knowledge. That is certainly the case for modern-era anti-vaccination fictions reinforced by belief bias rather than scientific or medical facts.

In confirmation bias, we actively look for ways to justify or defend existing beliefs or preconceptions and ignore or deny conflicting information. Homosexuality is a choice, or it’s not. Either way, we’re quite sure about it. People who feel most certain that gay people choose to be gay often say they have gay friends, so they cannot be biased. Or they might have a colleague at work who’s gay, and they are sure gay people pick their sexuality because “normal” people are heterosexual; besides, a friend knows somebody who made the choice to be gay, or a parent is convinced it happens that way, or their pastor says it is so, and the Bible commands that a man who sleeps with a man shall be put to death. Such people often ignore what gay people—the real experts—say about the actual experience of growing up gay, namely, that their sexual orientation was not a decision. Certainly that is the case for me and every LGBT person I’ve ever known. Indeed, an absence of choice regarding sexual orientation makes more sense, given the persecution gays confront from childhood, along with other aversive realities.

Climate change is real, or it’s baloney. Which is it? Let’s say a friend or relative claims CO2 levels on Earth were much higher in the distant past (true, if one goes back 100+ million years), and plants and animals flourished back then, so the whole argument about global warming posing a problem must be a hoax. Genuine climatology, however, takes into consideration the mind-boggling complexity of climate interactions over time involving ocean temperatures, currents, and chemistry; the sun’s energy; Earth’s reflectivity (albedo) and orbit; volcanic eruptions; atmospheric circulation and the concentration of greenhouse gases; organic matter; and myriad other factors. Global climate is different from local weather, yet many confuse the two or do not appreciate the distinction.

It’s not unusual in social contexts for someone to raise the idea that cousin Patsy is artistic and Uncle George is logical, so she must be right-brained, and he left-brained. It’s true a fundamental principle of brain organization is that the left half of the brain controls the right side of the body, and vice versa. Indeed, every vertebrate brain is pretty much bilaterally symmetrical, with approximately equal left and right halves. It’s also the case that about 90 percent of people are right-handed, so what’s going on with the two sides of the brain?

In 1861, the French surgeon Paul Broca identified a frontal region of the left hemisphere (the third frontal convolution) vital to generating articulate speech. A little more than a decade later, in 1874, the German neurologist Carl Wernicke described an area of the superior temporal convolution of the left hemisphere key to comprehending human speech and language. We know Broca’s area is connected to Wernicke’s area, and damage to the former results in telegraphic speech accompanied by simplistic grammar even though an affected individual is otherwise clear about an intended message. Ideas about brain laterality therefore have some legitimate anatomical truth behind them.

So, are people basically left-brained or right-brained, as we often hear? According to the belief in brain dominance, or laterality, one side of the brain determines personality and behavior, so that fact-oriented, analytical people are left-brained, whereas creative and intuitive free-thinkers are right-brained. Furthermore, conventional wisdom, according to some, holds that getting in touch with our “feeling” (right) brain can promote more positive and creative aspects of being human. Although location in the brain matters greatly for functions such as muscle movement, and people really can be numbers-oriented or art-inclined, scientific research does not bear out the cultural exaggeration of a left-versus-right-brain dichotomy when it comes to personality. In fact, MRI scans of more than 1,000 people’s brains, parceled into some 7,000 brain regions, show that people mostly use both hemispheres, without dominance and regardless of personality.

What can we say with confidence about personality then? More than 4,000 words in the English language describe human personality traits, impressive testimony to the interest folks have had in the topic. Astrology is an example of what psychologists call a trait or type theory of personality, many of which have been criticized as inadequate, to say the least. The pseudoscience of astrology also illustrates the Barnum Effect, another bias characterizing the way people tend to see personal specifics in vague statements by filling in the gaps, a task at which the human brain is most skillful. Historic examples of other trait theories include the introvert-ambivert-extrovert scale of personality suggested by Carl Jung in the 1920s. It’s the old and familiar notion, with just a hint of truth, proposing that introverts are withdrawn, extroverts are outgoing and highly social, and ambiverts are somewhere in the middle. The problem is that such theories (and others from famous psychologists including Carl Rogers, Alfred Adler, Abraham Maslow, Raymond Cattell, and Floyd Henry Allport) are simplistic and inadequate to account for the spectrum of behaviors exhibited by real people in the real world. Often, trait or type theories fail to predict actual behavior in any consistent way from the dominant traits they propose.

Stunning advances in addressing behavioral and health issues have been made on scientific, technological, and medical fronts in recent decades. Accompanying such achievements are absurd claims, fads, magical thinking, and unscrupulous profiteering involving the stars, holistic medicine, placebos, and the like. Open-mindedness to new possibilities and experiences is admirable, but gullibility and ignorance are never praiseworthy. I have a friend who annually scrapes together a pile of money from modest savings to pay for a costly week at a retreat that prohibits phones, radios, and all forms of vocalization, talking included; feeds its clientele a gluten-free diet centered on sprouts and watercress; and designs its daily activity schedule around colonic cleansing and healing-crystal treatments. There is little new here. History rings with accounts of unwise, unpleasant, or downright dangerous “health” treatments involving flesh-eating fish, bat blood, bird feces, leeches, maggots, bee venom, bloodletting, snake-weighting massage, burning towels, human fat (Axungia hominis), arsenic hair removal, “miracle” herbal remedies, bogus medical supplements, magic crystals, electrical shock, trephination, and lobotomy, to name a few.

My friend gushes about the spa experiences, touting them as blissfully cleansing and testifying to their powerful health benefits, further reinforcing the well-documented potency of the placebo effect and the eagerness with which many humans willingly open their pocketbooks wide for bizarre forms of self-inflicted deprivation and punishment in hopes of some benefit.

A list of phony treatments, holistic cures, and alternative medicines touted through history by unscrupulous merchants and hucksters would fill a small volume, yet people continue to fall victim to rebranded skullduggery in hopes of miracle results. It’s understandable if individuals are tempted to suspend disbelief when confronted with a dreadful disease or life-threatening condition, but the cost of misdirection and fakery—in emotional currency, needless suffering, and disappointment—is enormous.

If know-it-alls often know the least and many average folks can be duped, are we destined to remain fools or, at best, undiscerning and gullible? After all, brains are wired to take the path of least resistance, which often means doing what we think we know or have done before. The Cornell investigation described above explored people’s perceptions of their own competence in the test domains of humor comprehension, logical reasoning, and grammar, but the researchers also studied the possibility of making the most incompetent people—the bottom quartile, or lowest-scoring 25 percent—realize their ineptitude by making test subjects more competent through training designed to improve logical reasoning skills. Such training was successful in several ways. It not only improved test performance dramatically over the course of the study but also sharpened the self-assessments of individuals participating in the training. The results point to the importance of feedback—social, critical, quantitative—in revising one’s own conclusions by comparing them with reality. Critical thinking, once again, can make a difference.

If the internet and other media abound with spin and fabrication as well as flat-out lies, where can valid and reliable information be found by those willing to take the time to look? Perhaps the most important criteria are the credibility and integrity of a source of information. Every area of human interest these days, from astrophysics to abstract art, features subject-matter experts who know the field well even if they do not know everything. The shelves of libraries hold more critically acclaimed and award-winning books on wide-ranging topics than most of us will ever take time to peruse. University resources, including educational websites and peer-reviewed journals, are among the most trustworthy sources, but on topics ranging from pyramid power to vaccination, it’s inevitable to encounter sham sources with legitimate-sounding names offering pre-scripted conclusions masquerading as reliable data. Warning signs include familiar-sounding organizational or professional titles imitating legitimate ones, claims that appear too good to be true, glowing personal testimonials, unsound or unbalanced analysis limited to one side of a complex topic, and issues unduly simplified. The absence of proof does not prove or disprove anything, much like the presence of opinion. Anyone can evaluate a source of information by asking whether someone, starting with the author and affiliation, stands to gain by making a particular claim, then remembering how those who claim to know the most often know the least, according to the Dunning–Kruger effect. In the end, it might be necessary to swap the seductive allure of absolute certainty in our thinking for the more honest ambiguity and unease of not being quite so sure about what we think we know.
