One of many tropes on the web is that Eskimos have twenty-nine (or thirty-six or ninety-nine) words for “snow.” Like so many online legends, this one is false. Inuit people have just two words for “snow,” though they do have a rather larger lexicon relating to icebergs, slurry, and other features of serious interest to people subsisting in a cold environment. The purpose of the “snow words” claim varies. It seems often to be used to bolster the phenomenological position that environment molds consciousness. Others use it to indicate the more nuanced point that a paucity of terminology may produce a corresponding lack of subtlety in our thinking about the world. Certainly, it is to illuminate this latter thesis that C.S. Lewis wrote The Four Loves, his analysis of the four Greek terms that distinguish our modes of affection for family, friends, lovers, and God. We may need even more terms to do justice to this most powerful of human feelings; maybe that work would help us better understand its nature (see “The Moral Bullseye”). Even before coining such helpful neologisms, I propose that we examine some of the common terms we already use to assert our understanding of love or snow or anything else. Our use of them indicates that this is a subject we all claim to know, but even a slight analysis suggests otherwise.
First, a very brief context. All of our declarative sentences are about something, so our knowledge is only true if it reflects some actual state of affairs. Now that notion of reflection is important because it requires a dualism, a comparison, a kind of mirror between what we think and what actually exists, and the capability to clarify that relationship can only be rational (see “The Tyranny of Rationality”). Whatever else we think necessary for truth must be built upon this rationality. So what do we mean by an “actual state of affairs”? It could be some fact about the world or some relation between our truth claim and the complex of other truths we embrace (see “What Counts as Justification?”). The key is that knowledge is always the mind’s reflection of something else. I am convinced that this relationship between claim and that other something is the whole key to clear understanding and communication. It is the claim’s justification or warrant, and my contention is that our lack of clarity about this subject produces no end of mischief in the world. Part of that murkiness derives from the very poor terminology we employ as we attempt to comprehend and communicate what we claim to know, and we do that thousands of times daily. I wish we had in actuality as many clear distinctions for justification as the internet claims the Eskimos have for snow.
Instead, we have only confusion. Tell me what an opinion is. The Supreme Court issues them citing estimable and subtle constitutional warrants. I issue them on the quality of my child’s Happy Meal. The use of this term references the most thoughtful expertise and the blindest prejudice. Everyone is entitled to one on any subject, thereby utterly debasing the term’s warrant. Sentences that begin with, “In my opinion,” can be safely ignored for content. We wait for the pause that allows us to throw our own into the ring to lie there listlessly until covered by piles of other equally empty preferences and distastes mixed with shards of fact and belief. The etymology of “opinion” reflects its emptiness, for it sources everything from judgment to flight of fancy. The word, like the declarative sentences in which it is used, all too often has no real meaning. As a signal for the kind of reflection of some actual state of affairs that gives heft to our truth claims, the word “opinion” is a featherweight. In the world of knowledge claims, it has all the gravitas and power of the word “thing” as used in general parlance.
Wouldn’t it be a good thing to replace opinion with words that establish a clearer reflection of our confidence in our knowledge, meaning words that imply the kinds of justifications we are prepared to offer? I don’t think this is a particularly difficult task, provided we are willing to look honestly at what we claim to know. But that may be the real issue. We prefer to be vague. It masks our uncertainty. When we hear something we disagree with, we can say, “That’s only your opinion,” tarring the remark with the brush of inconsequentiality, yet we like to think our own opinion gospel truth. For that matter, some of us think the gospel to be truth. Is that an opinion? Is the gospel? To have any real conversation about truth, goodness, or beauty, we must be honest about what we know, honest with ourselves and then with others. That honesty begins with clarity about the basic words we use to frame our knowledge. Wouldn’t it be a good thing for us to have a basket of terms with specific meanings to express gradations of knowledge rather than a Gordian knot of empty, ludicrous terms like opinion?
Let’s start at the beginning. As Locke said, all knowledge begins with perception, the senses, and reflection, reasoning on them. The flood of sense data pouring into our minds must be filtered through a reasoning faculty that seems unique to humans and universal among them. And the most elemental perceptual nugget is a fact. Let us call it “a datum of experience.” Facts are the most reliable knowledge claims we can have. The definition tells us why. They are simple, pure perceptual inputs, requiring a minimum of perceptual effort or reasoning and, once known, combining to form other facets of experience. Empirical science is built on a deep foundation of facts. Their simplicity is their elegance. We value them because of their unity and clarity, which is entirely a product of their direct perceptibility. If it is 96 degrees outside, it is a fact to say so. It is not a fact to say it is hot or that it is hotter today than yesterday, even if true. The essence of facticity is simplicity, and its simplicity allows us to call any of our percepts a fact so long as we attempt to avoid reasoning on what we perceive (see “Facts are Fluxy Things”). Manipulating that datum of experience must be called something else, no matter how obvious the manipulation seems or how natural the addition to fact that we advance. Now people claim all sorts of things to be facts that aren’t, undoubtedly because the term retains some of its power even if used incorrectly, so I think we rather exploit the muddy understanding most people have about the term. If you agree with me, better still if everyone agrees with me, I feel safe to call any declaration a fact. Claim there is a purgatory in a Roman Catholic church and you have stated a fact. Say it in a Lutheran church and it becomes sheer malarkey.
Put the earth at the center of the universe in the fifteenth century and you have stated a fact as obvious as the sunrise, one everyone “knows.” Somehow it sidles out into space in time to satisfy the current fact of its new location. Does it make sense to you that these simplest truths about reality are dependent on the assent of the listener? On the consensus of the public? In the case of purgatory, we have an additional problem, for no declaration of its existence could ever meet the definition of fact. Purgatory is imperceptible. I am not saying the claim is false, only that it could not be a fact. Neither could simple analytic statements like mathematical identities and dictionary definitions. And since facticity requires a near absence of reasoning on perceptions, we must eliminate from facticity not only manipulations of fact, identities, and the complex sensations of our inner life but also the beliefs and opinions we build upon them. Each of these non-factual declarations deserves its own distinctive name to separate it from fact.
So what do we call, say, the conclusion that today is hotter than yesterday? How does such a statement differ from the fact that it is 96 degrees? We need more terms!
The moment we do something with a fact, we change it into something else. We may join facts together in some causal or correlative sense, thereby introducing more opportunities for errors in perceiving and reflecting on our declaration. To say it is hotter today than it was yesterday is that kind of claim. It requires at least two facts and some means to relate them. This effort to associate or differentiate facts about reality requires us to manipulate them to some intention that reason must direct, and introducing even a little complexity to a fact also introduces a larger chance of error. I may err in claiming a datum of experience to be a fact. Maybe I misread the thermometer and it is really 95 degrees. But my chance for error is greater when I attempt to link today’s temperature to yesterday’s. That comparison, because it involves both perception and my reasoning about it, is more likely to be false. It is not a fact, though, even if true. We need a word to describe the increased complexity that reasoning about facts introduces. I think the appropriate word for this effort is judgment. To judge something to be true is to weigh it deliberately—the link to reasoning is implicit in the word. To judge something is to arrive at a more complex truth claim than simple facticity. Our judgments may be wrong, and given that their complexity may extend from a simple cause-effect sentence to a scientific paradigm, it is likely that they often are. The corrective for an error of fact is a fact. It is really 95 degrees, not 96. The corrective for an error of judgment is a better judgment, based on more accurate factual information or better reasoning in manipulating it.
To call this act a judgment is to recognize the deliberation required to weigh our truth claims against an actual state of affairs by appeal to some warrant or justification. The word evokes the courtroom with its presentation and evaluation of evidence, its careful articulation of opposing argument, and above all the rational structure of its process. To judge is to subject a truth claim to a dispassionate, ratiocinative examination, and to judge prematurely is the literal meaning of prejudice. If I begin a truth claim with the words, “In my judgment…” I am throwing open my argument to your analysis, willingly subjecting it to your evaluation, accepting and even welcoming your opposing arguments, and above all acknowledging that our entire conversation on this subject is guided by reason rather than some other and lesser kind of warrant. To acknowledge that our judgment guides our discussion is to offer it the possibility of profit for both of us; therefore, the term is as superior to opinion as ordinary language to a baby’s babbling.
Not all judgments start with facts or link them. Some are purely conceptual, composed of categorizations among perceptual experiences that we think true. Concepts begin with perception, but they differ from facts because their nature is reliant on the way the mind files perceptions so as to categorize them. Love is a concept. So is justice. Concepts are forged from the judgments we build upon the facts of our own and others’ experiences. Their very existence is dependent on accurate filing of similarity, difference, causality, unity, number, extension, and all the other ways we collate experience. To form a concept requires multiple exposures to related kinds of perceptions. The mind then files them under some categorical classification. Horses all have hooves. Bulls have them too, but also have horns. These conceptual categorizations are then manipulated in the same way we manipulate facts: to show similarity, difference, exception, and so on. It is possible, of course, to form a conception on some other foundation than perception, so agile is the human brain. We don’t perceive justice or love directly, but conceptualize the thing from multiple experiences that allow us to find its essential properties. That ability to sift many experiences for categorical use has powerful implications for the reliability of our knowledge claims (see “Expertise”) and powerful temptations to premature closure, the temptation to claim to know more than we do (see “Stereotypes and Categories”). We can even conceptualize and categorize imaginary kinds of perceptions. We know what unicorns and minotaurs are though no one has ever seen one. We can only imagine those things we have never experienced in terms of things we have. We judge them to be like horses, narwhals, and bulls. This is how the mind categorizes real experience to manufacture imaginary ones.
Some concepts bear no relation to any reliably known experience and so can only be imagined. Yet they reflect something we think more than imaginary. “Heaven” and “angels” are inexpressible in terms of experience, so the mind attempts categorizations that resemble familiar things as it does with unicorns, but if spiritual concepts are real, they are only metaphorically comparable to the perceptual things we know, and so our natural attempt to frame them in terms of experience proves dangerously misleading (see “The Problem of Metaphor in Religion”). Because the thing we imagine is only partially like the concept we know, the mind attaches to the part of the comparison it understands and provides a blank for the part it doesn’t. And that can lead to a misfiling, dropping a spiritual concept into a perceptual file and misleading us into thinking we know its nature. Like a prejudice based on premature closure, this mistake is an error of knowledge, a kind of careless filing problem.
It is a perfectly understandable one, though. Framing the unknown in terms of the known is how we grow in knowledge. Should I see an animal that resembles a bull without horns, I will have either to invent a new categorical file for “hornless bulls” or one for “that animal that looks just like a bull but has no horns.” We can call it “cow.” Such a pedestrian bit of growing in knowledge characterizes our entire education. But when that effort is indulged beyond what present knowledge can justify, we enter the territory of a very familiar term: belief. Its etymology opens it up to accurate understanding. To believe is to hold dear, to value. Think about the implications suggested by this etymology. What does it mean to profess a belief? What do we risk when we claim a belief to be true? How much of our own wishes do we stake on the truth of such a claim? We cling to our beliefs with the deepest attachment! Yet despite the stakes, how powerfully can we advance such a claim? The term has remarkably varied uses, which allow us again to advance a declaration with a minimum of fuss about how we plan to warrant it. We may think of a belief as a truth claim with a bit of suspicion clinging to its margins. If I should ask if you locked the car and you reply with “I believe so,” odds are that one of us is going back to check. We believe what we cannot know, what we cannot judge because of lack of evidence or rational relation. Or we use the word in the sense of opinion at its most crippled: a belief as unsupportable as the emptiest expression of taste. I may believe aliens live on Titan or in my garage. Our response is often understandably dismissive, for empty beliefs are like empty opinions: they lack all warrant.
But such is the confusion about the knowledge content of these terms that the use of belief may very well carry a heavier weight than I’ve indicated above. The word may just indicate a claimed degree of certainty equivalent to fact, and for some persons, its use indicates the very highest degree of confidence possible in a truth claim. In this sense, to believe is to know with absolute certainty. This seems to be the sense that religious believers mean in their credal professions of faith (see “Can Religious Belief Be Knowledge?”). So which is it? Does proclaiming a belief convey a lesser degree of knowledge than our rational judgment or a greater? To my mind, nothing better indicates the sorry sense of our present understanding of justification than this kind of question and nothing cries out more loudly for the clarification that can only come from a corrected usage. How confidently can I assert my beliefs? The answer to this question must be delivered from two sources, each producing a different answer. The first is as old as western civilization, the second as new as our present culture.
In his dialogue Theaetetus, Plato calls knowledge “justified, true belief.” But he was pretty obviously unsatisfied with that definition, as well he should be, for it is a truth claim’s justification that proves it true, and to call such a claim a belief is to tinge it with a blinding desire. He also warns us against being too enamored of our own opinions and seeks a strong wall between knowledge built from beliefs and mere opinion. But this seems ridiculous in light of the association of belief with attachment. Of course, we love our beliefs. We hold them dear to us. The word implies that attachment. In the original Greek, it had a modicum of that meaning. This confusion perhaps can be reconciled by remembering that Plato thought we should love and hold knowledge dear, preferring it to our own opinion. In the age when religious authority underwrote all claims to knowledge, a preference for one’s own opinion was an act of rebellion and appropriation from the omniscient source of all truth. We might arrive at “just an opinion” from our own preferences or biases, but our knowledge was a kind of revelation of indisputable truth from above, a kind of gift opening us to a true state of affairs in reality. This was delivered via religious dogma or God’s deputies and was given as absolute revelation. Granted, this might produce a problem, since such a gift was untraceable in terms of warrant and might very well be a delusion delivered by desire. That thought was suppressed, though, perhaps because the transition to authority facilitated public consensus. The casual acceptance of divine transmission was as consensually accepted as geocentrism, perhaps because this truth, like all truths, seemed divinely authored. That meant it had to be fixed by institutional authority, translated from private revelations to public truth. 
In a generation or two, the personal authority of the prophet was fixed by institutional authority, one result being a suppression of challenging beliefs, of heretical dissent (see “Authority, Trust, and Knowledge”). Since religious authority underwrote all claims to truth and goodness, revealed truth was maintained as indubitable (see “Premodern Authority”). Subscription to that authority was based on trust, but it was not until authority failed in the Reformation that the dissolution of trust allowed everyone’s private belief to dare to challenge religious authority and in time to consider itself a separate and equal kind of warrant (see “Knowledge, Trust, and Belief”) built upon the rational agency of each individual, often in opposition to the claims of institutional authority. The Reformation’s reformers saw their own beliefs as belonging in the same class as the beliefs asserted by the prophets and sages and saviors of ancient days that authority had come to endorse and perpetuate but which now caused only dissension and bloodshed. During the Reformation, everyone could be a prophet. When they received insight (an “inner vision”) or inspiration (a “breathing into”) from the divine, their beliefs could claim a certitude that fallible judgments could only aspire to. Our beliefs might be intimations of ideal truth, revelations of Gnostic knowledge or divine will, inspirations from Scripture or intuitions from a pantheistic Nature. Of course, believers clung to these with the deepest affection! They were the radiant beams of indisputable truth shining down in the murk of ordinary perception. To believers, the means of transmission made beliefs indubitable, and in that context they were viewed as the very best kind of knowledge.
But as the Protestant Reformation cycled through its eight generations of doctrinal dispute, fanaticism, and the slaughter of millions in the battle of belief against belief and authority against authority, the certainty of revelation itself came under an attack that centered on the reliability of its means of transmission. This is the foundation of modernism, and the modernist reliance upon the less imperious mode of justification, judgment, remains a thread in the fabric of our culture. As modernism began its era of dominance, individual experience interpreted by universal reason overshadowed other means of justification. Science was modernism’s greatest invention, and judgment its source of energy. Belief began its retreat into the shell of private desire and empty opinion. But the power of belief would rise again, though in a different form.
It acquired an entirely different kind of justification in the twentieth century. Romantic intuition, the latest incarnation of direct revelation, had fractured upon the iron cliffs of doubt during the Victorian age, an age that also highlighted the hypocrisies and inadequacies of modernist warrants, particularly those that attempted to find reasons for moral commitment, something the premodern form of belief did very, very well. So belief put on its new clothes in the shadow of World War I (see “The Victorian Rift”). Cut off its divine source, invent an entirely new mode of justification, the coherentist virtual circle, and call into question the modernist reliance on judgment, and you have a new sense of the power of belief (see “Postmodernism Is Its Discontents”). In this model, a dispassionate judgment is not available to the thinker whose picture of the world must always be colored by experience, whose reasoning must be formed by environment, and whose truths must always be personalized as a result (see “What Is the Virtual Circle?”). Given these uncertainties, the truth of any knowledge claim is only to be warranted by the linking of current beliefs to “truths” already accepted. Reflection turned inward, away from the uncertainties of fact and perception and reasoning and toward a felt internal consistency among very private responses to experience. The only truth test available thus becomes the principle of non-contradiction, a very weak test indeed made far weaker by the elasticity of the reasoning that applies the principle since the theory imagines reasoning to be as private as the experience that molds identity. In this twentieth century model, the truth of a belief becomes as reliable as fact, opinion, judgment, or imagining, so long as it coheres with the perspective of the believer. If you believe in it, it is true.
Every truth claim is a private belief and only their combined consistency can be claimed as knowledge — but a thoroughly personal knowledge. Truth is reduced to a kind of personal reality. What was once revealed as universal now structures a private universe for the believer. In such a model, my beliefs are as certain to me as they are unknowable to you. Every differing perspective is merely different. Persons say with straight faces that they don’t believe in evolution just as they don’t believe in Santa Claus. Every truth is negotiable. Every good is pragmatic (see “The Problem of Moral Pragmatism”). Every beauty is personal. I trade off my ability to convince you for your inability to dissuade me (see “Postmodernism’s Unsettling Disagreements”).
In the current atmosphere, we can have no assurance that a person’s use of belief to justify a knowledge claim signifies divine revelation, affliction of doubt, or private construct: in other words, whether it relies on the premodern, modern, or postmodern meaning of the term (see “The Axioms of Moral Systems”). I doubt that most speakers could clarify their own warrant for the term as they use it, which might change mid-sentence. It might be instructive for you to inquire about the speaker’s sense of why her belief is true. If she acknowledges it to be “true for me,” she is most certainly using the term in its coherentist sense, but if she claims her beliefs true for all, you might politely ask for some warrant. In practical terms, most people’s response is likely to be the same, for revelation and personal construction are equally irrelevant in our postmodern environment, and both are treated with a similar degree of disregard by those who equate beliefs with opinions and find them entirely unpersuasive unless they form some tangent to their own private experience. These kinds of discussion are merely social lubrication. No serious engagement with truth is generally required. Only personal relevance matters. What does not reflect one’s ego becomes white noise.
But that is a loss, for beliefs in their modernist sense are fruitful topics for discussion because they mark the frontiers of our present knowledge. As we reach that murky boundary, we seek to push knowledge to its justificatory limits, extending the conceptual warrants that allow us to defend our claims as judgments. We step into the open space beyond that point and find reasonable surmises that can’t quite be justified but seem entailed by those things we do know. We are now in the territory of belief, and our entailed beliefs seem too sketchy and ad hoc to warrant convincingly, so we fill in the patchwork as we desire. Proceed a bit further into the unknown and now the desire far outweighs the knowledge, and such tentative stabs in the dark make lots of guesses permissible, depending on what we wish for the truth to be. This is the territory of religious belief, wherein religious metaphor and guesswork predominate and the intense desire for premature closure tempts us to claim what we cannot know. Our disagreements entirely depend on how we fill in the great blank spaces in our knowledge, and if we believe, the result will carry only private force. Of course, the same is true for those whose beliefs differ, and should we compare our patchwork of knowledge and belief, we will find plenty of places where desire patched the weave. We may appeal to authority, which is at least a public source of knowledge, and then quote infallible authorities or sacred texts, but these carry no force for those who do not trust them, who perhaps trust other authorities citing other infallible authorities and conflicting sacred texts (see “The Fragility of Religious Authority”). Whether we use belief or authority, beyond the frontier of knowledge we will find other seekers who have put the clues together to form a different picture, each one as permissible to reason as our own but shaped by different desires (see “Religious Knowledge as Mobius Strip”).
We can exemplify that process by examining the actual game of Clue. One can hazard a guess on the first move. It is permissible by the rules of the game, but such a guess seems more a hope than a truth. The only thing that would motivate it would be a desire to win. Though such a wild surmise might be permissible because it is not impossible, to venture it aloud seems reckless, even foolish. A few moves later and we begin to know the case, and a surmise begins to take shape, entailed by what has been revealed so far and by the rules of the game. At that point players are crossing the frontier from belief to knowledge, but they cannot actually know the truth of the crime until all other suspects are rationally eliminated. The fun of Clue involves finding the moment when the frontier of belief is left behind and the first moment of knowledge begins, when one is motivated less by a desire to win than by the knowledge of whodunit, when one has a preponderance of evidence and accurate reasoning to make sense of it. Knowledge and belief are different realms with different kinds of validity. Judgment and justification. Belief and desire. Permissibility and entailment. These terms work as well for the game of life as for the game of Clue, and we would be well-served by employing them.
But that level of delicacy is not likely in the current climate. We have learned through experience that people hold their beliefs dear and their own agency dearer still, so we seek sufficient cause to change the subject as soon as convenient, for our interlocutors profess to have nothing to learn from a disagreement that would quickly assault their sense of self. After all, beliefs proclaim identity, and challenging them is a violation of etiquette, hardly polite in a culture in which the fixed meanings of essential terms have melted away like all those missing Eskimo words for “snow.”