Modernism’s Midwives

Major Contentions

  • We cannot imagine the desperation with which premodern thinkers faced the epistemological crisis of the Protestant Reformation, one in which all roots of trust were severed, and no truth or goodness claim could be reliably warranted.
  • Combatants in the Reformation faced three knowledge problems: the private, unverifiable nature of revelation rendered all conflicting claims equally suspect; only by sleight of hand could private belief be converted to institutional religious authority; and such authority still could appeal to no dispassionate arbiter when confronted by conflicting authority.
  • Only gradually did a few thinkers begin to suspect that authority per se might be the core problem.
  • The first of these was Descartes, who found in mathematical proofs the evidence of universal human reasoning that might ground consensual public truths to end Europe’s Reformation nightmare.
  • The Cartesian method of rigorous rational “proofs” left something to be desired once individual experience was considered; the result was a “modernist cannibalism” in which every effort to ground universal truths upon individual experience produced some level of doubt; this process occupied the seventeenth-century birth of the Enlightenment, definitionally a search to increase the reliability of modernist axioms of commitment.
  • The first modernist cannibal was Francis Bacon, who was forced to accept less certainty by examining the peculiarity of experience.
  • Only gradually did rigorously rational examination of experience produce empiricism as a partial solution to ground public truth claims; this effort was the beginning of modern natural science.
  • This modernist cannibalism did not only threaten religious authority; it also found natural explanations for what had formerly been considered divine prerogatives, leading to Deism.
  • Skeptical empiricist philosophers continued to apply their critique to experience, producing both rigor of thought and skepticism of conclusions through the work of Locke, Berkeley, Hume, and Kant.
  • The result of these philosophers’ work was to confine consciousness within the perceptual wall of the thinker’s own awareness, with no certain means to guarantee her judgments; Kant’s emphasis on “phenomena” provided only intersubjective reliability sufficient to ground universal rational thought.
  • The need for consensual public morality, including law, was satisfied by social contract theory, which was rapidly cannibalized by later thinkers.
  • Using the framework of the empiricist philosophers, experimental and observational science began its long period of increasing rigor, though many studies we consider to be pseudo-science were considered empirical until science was finally professionalized at the end of the nineteenth century.
  • By the eighteenth century, modernist axioms guided social reformers, though most persons still sought at least some guidance from institutional authority, including formal religious authority, despite the obvious clash of guiding axioms of commitment.
  • The implicit clash of these guiding axioms produced four effects: first, an ongoing nostalgia for a revival of religious authority as a more certain guide to moral ends; second, an attempt to pierce the perceptual wall through some means other than reason, which gave rise to Romanticism; third, an appeal to a range of pseudo-sciences to provide moral guidance, all of which became known as the human sciences; and fourth, a plethora of hypocrisies as institutional authority sought to base its appeal on some impossible composite of premodernist and modernist axioms of commitment.
  • It is these hypocrisies that postmodernism was all too happy to reveal over the course of the dismal twentieth century.

Imagine you are blindfolded and ear-plugged and told to move forward. You have no sense of direction or means of detecting dangers or judging your progress. And you have every reason to think this condition will not improve. The sense of helpless befuddlement you would suffer is nothing compared to the desperation of Western culture in the mid-sixteenth century. The ultimate guarantor of reliability for all truth and goodness judgments, authority, lay in smoldering ruins as civil and religious powers vied for influence, vied in brutality fueled by the darkest fears of God’s wrath in the next life and the unraveling of all civil society in this one. Millennialists prophesied the apocalypse while opportunists fed the generalized panic with imagined fears for their own advantage. And the worst of it was this: no one knew what to do.

Since the fall of Rome in the west in the fifth century, authority had woven itself into every thread of the fabric of life as a guarantor of at least a modicum of civilized life in the face of barbarian decline. Its survival maintained some small hope of resurrecting empires and restoring peace. So long as princeps and paterfamilias and pontifex survived, civilization might have some hope of renewal. We can hardly imagine the desperation with which the shredded remnants of Roman civilization turned to their local episcopates as the last vestige of a vast imperium, nor can anyone now living even comprehend how total was their trust in the authority that gave structure and meaning to the social order. To understand premodern authority requires that one disavow modernity and its presumption of personal autonomy, which no one now living can do. What modernity replaced was, in essence, a willing surrender of rational and moral control, a submission of individual agency in trust to another thought more capable of providing direction than oneself. The need for such a grant of trust was inversely proportional to social order and predictability. In seeking security, feudal society was forced also to seek continuity. Societal change stalled until the Carolingian Renaissance of the ninth century, a half-millennium after the fall of Rome in the West. By our standards, progress was glacial, and though the pulse of life quickened in some places in the fourteenth and fifteenth centuries, church authority absorbed the challenges of councils, monasteries, friars, and princes as it had absorbed the more fundamental threats of barbarians and feudal structures.

All of this came crashing down with Martin Luther’s revolt of 1517 (see “Premodern Authority”). It is strange to me that the history of that era is as foreign to most of us as the Yuan Dynasty of China. Is it a secular delicacy about religious discord that has even historians treating the Reformation as a political event? It assuredly became political, particularly during the last decades of the Thirty Years’ War (1618-1648) and after the Restoration of Charles II in England (1660). But even the Glorious Revolution, the closing event of the Reformation to historians, had its roots buried in the soil of religious extremism. How much more confusion attends running this disaster movie backward to its dreadful beginnings: to that All Hallows’ Eve posting of the Ninety-five Theses on the church door in 1517!

Something new was born in those miserable years of sectarian conflict. But modernism was not inevitable. On the contrary, its form could not even be imagined by the religious zealots who saw all truth and goodness as emanating from God’s word alone as interpreted by His clerical prophets and their anointed secular surrogates. The hundreds of sects and subdivisions produced by this warrant were bound by three knowledge problems. First, they derived from revelation and inspiration as interpreted by the beliefs of the founders, a recipe for schism, discord, and challenges to personal authority. As these new movements gained adherents and their founders died, often from persecution, personal authority was necessarily transmuted into institutional authority, with the consequent hardening of orthodoxy that such a transition requires. This second-generation development magnified the second knowledge problem these competing religious dogmas faced. The seeds of all their truth and goodness claims had been proclaimed by their founders as revealed or inspired truth, not subject to external verification, of course. But as their truth claims were necessarily universal, they could brook no challenge to their truth or goodness, and since the initial divine revelations resembled those now enshrined in Holy Writ, these new oracles’ disciples found it quite natural to reproduce the authoritarian patterns of their predecessors. But that proved problematic, for the initial claims, being personal and idiosyncratic, must necessarily challenge older claims previously considered authoritative and be challenged by younger zealots yet to be born, producing a profusion of discord. These challengers differed in their dogma, of course — these differences necessitated the reformations and purifications each proposed in the first place — but in terms of warrant, their claims were identical to those they challenged.
Thus, the third knowledge problem inevitably arose: no means of reconciling the claims of these sects could ever be produced from within their means of justification. All of the Luthers and Calvins and Müntzers and Williamses and Simonses and Lees preached God’s own truth, and each preached something different. Since salvation followed choosing the true and damnation the false, and since each could offer only institutional authority to second-generation adherents, none could claim the higher ground, for all appealed to the same means of justification. Electors broke with princes who broke with kings in their turn as bishops accused popes and laity charged clergy with satanic collusion. Premodern authority foundered on the rocks of self-interest while attempting to retain its divine aegis. This wreckage took four centuries to clear away, finally splintering in the ruins of World War I.

At least some thinkers slowly realized that authority itself was the problem. The justification issue involved in the uprisings, revolts, genocides, secessions, wars, and rebellions took a miserably long time to make itself clear, but what was clear all along was that nothing then available could replace authority’s tradition and sense of certainty.

In its purest form, the problem was first understood by René Descartes (1596-1650). Born into the French wars of religion and the legacy of the St. Bartholomew’s Day Massacre, he matured in the uneasy peace between Catholic and Huguenot of the early seventeenth century. This inventor of Cartesian geometry saw in mathematical proofs a certainty analogous to that lost by authority in the Reformation nightmare. This universal reasoning capacity could arrive at consensual warrants for truth and goodness claims, he argued, if we were only careful enough in framing our axioms, postulates, and chains of reasoning. His famous phrasing of “I think; therefore, I am” appeals to the potential of each human mind to formulate truth and goodness that every other rational agent would be compelled to agree to. Authority had promised certainty but had foundered on its inability to resolve conflict. This was the Reformation’s dismal legacy. But Descartes promised that here was the certainty of the geometric proof coupled with the power to detect and correct error. That, after all, is the charm of reason, one that places it far above authority as a means of correspondence justification (see “What Counts as Justification?”). You and I might disagree on some point of argument vital to truth or some hinge of morality, but we might then retrace our steps, check our work, so to speak, and find where either of us had erred. And look: no need for nailing doors shut and setting houses of worship ablaze. No need to insist that heretics “have merited death in body and soul.” We can speak reasonably, work through our differences, and mutually choose the true and the good.

The rationalist movement continued to proclaim its merits, but other voices were already beginning to practice what might be called “modernist cannibalism,” using the Cartesian method to reveal the deficiencies of its solutions. Their objections to a strictly rationalist approach culminated in David Hume’s critique of purely analytic concepts as “meaningless without regard to experience.” Some more complex form of consensual justification was clearly required.

This was the project of Descartes’ contemporary, Francis Bacon. Descartes had considered the promise of closely attended experience but concluded that undifferentiated experience was too variable and unreliable a foundation upon which to build the certain knowledge that any successor to authority must claim. He was correct, but as later thinkers revealed, his own method was doomed to sterility if it attempted to use mathematics as its model. Reasoning about reason could only yield definition and tautology. Bacon was willing to trade certainty for relevance, and he was convinced that the defects of the senses could in part be remedied by identifying the errors we make in reasoning about our experience. He lowered the bar for claiming truth, in part because he was so aware of the pitfalls that reasoning falls into as it processes experience. His efforts yielded two promising rewards. First, his distillation of the errors experience generates led to the gradual standardization of what was later called the scientific method. Second, he began a multigenerational conversation about the level of reliability necessary to warrant truth claims. In the shadow of authority with its reliance on revelation, that issue had not risen to awareness. But now, with authority in ruins, it had to be posed.

The problem was that the line kept moving. The seventeenth century saw a sustained conversation on the topic with both rationalists and empiricists, the analysts of experience, digging ever deeper into the issue. From Descartes to Berkeley in the early eighteenth century, these rigorous thinkers gradually removed God’s hand from efforts to warrant truth and goodness claims, finding the source in the human mind. The conversation continued through the eighteenth century, the full flowering of the Scientific Revolution and the Enlightenment, the former studying the science of nature and the latter the science of man. All that authority had warranted, particularly all questions of quality and morality, had to be renegotiated on modernist grounds even as the educated person’s understanding of modernism grew more sophisticated. Reason demanded that creation and its apparent order be respected, and a powerful nostalgic yearning for certainty guaranteed God’s continuing presence in the modernist view. But with laws of nature opening like springtime tulips, God seemed to have less and less to do, so Deism promised a benign lawmaking divinity whose universe tended toward perfection. Locke argued this perfection extended to human perceptions that mirrored reality without apparent distortion, guaranteeing both reason and experience a kind of objective quality that later thinkers would find hopelessly naïve. The great skeptical arguments concerning the perceptual wall that rendered perceptions and reflections unreliable were bruited by Berkeley and Hume, casting doubt on the modernist reliance on reason and experience as near certain substitutes for lost authority. The great compromise offered by Immanuel Kant increased perceptual reliability by means of categories of experience that make universal and consensual warrants for our truth and goodness claims possible, but like all compromises, it also gave up something vital.
Kant acknowledged the empirical cannibals’ objections by conceding that we cannot know reality as it is. Our knowledge is only of our own perceptions. The perceptual wall that Berkeley had erected with his famous query about the tree falling in the forest could not be pierced. We cannot know the world, but we can share common understandings of our impressions of it because our minds operate through human categories of experience. Thus, sense data perceptions come into consciousness prepackaged by our unconscious mental constructions into intelligible perceptions about the world. These categories of experience gave us a mutually intelligible picture of reality. Yet by Kant’s own admission, one forced by Hume’s skeptical observations, we can never be sure that we aren’t merely building order from what is really chaos. No wonder every event has a cause. Our minds construct reality to make it so. Phenomena are individual and unrepeatable, so our penchant for using cause/effect to predict future events will always prove unreliable, unless we construct reality so as to confirm our preconceptions. These arguments were as deeply disturbing as they were rational, and they raised lingering doubts about the reliability of the enterprise of experience. The lesson was taken: tighten the empirical process!

Meanwhile, new warrants had to be fashioned in this intellectually rich but challenging new world. What justifies political order if not God’s will and received authority, for instance? The modernists cobbled together their replacement, and so divine right monarchy evolved into a primitive sort of social contract theory thanks to Thomas Hobbes’s Leviathan (1651) (see “Why Invent the Social Contract?”). Like everything in the modernist project, it too was cannibalized by later thinkers who found logical flaws in its structures and applied the self-correcting critiques that Descartes had envisioned, though social contract theory was by its nature hopelessly flawed (see “Two Senses of the Common Good”). By the time Rousseau wrote his version a hundred years later, the social contract would have been unrecognizable to Hobbes.

The same self-criticism applied to the evolution of science. The astonishing chain of work by Copernicus, Galileo, Kepler, and Newton vindicated Bacon’s judgments on the self-correcting quality of closely attended experience. Nothing could be further from the endless hostilities of competing prophets than this slow accretion of knowledge, each building on the painstaking efforts of an earlier generation through the development of expertise in a limited field of study. But as spectacular as seventeenth- and eighteenth-century scientific progress seems in retrospect, we must be cautious in overpraising these early scientific efforts, for practitioners were only beginning the long and reductive process that produced what we know as natural science. All kinds of enterprises were regarded as “scientific”: phrenology, numerology, alchemy, and the kind of thinking that the philosophers practiced. Bacon’s methodology required trial and error to perfect. That perfection was not completed until near the end of the nineteenth century for what we know as the hard sciences, and in regard to the human sciences it has yet to be completed. But even its earliest achievements demonstrated that empiricism was to be less certain but far more incrementally productive than authority had been.

By the dawn of the eighteenth century, the banners of modernism were fully unfurled and its territories marked. The collapse of authority as warrant was accomplished, and the new competitors of universal reason and individual experience began their two centuries of dominance. They would undergo their own trials. Born in chaos and ever subject to self-doubt, the justifications that defined modernism would never exercise their power as authority had. And indeed, authority retained an echo of its former allure. Its lost sense of certainty remains culturally appealing, particularly in times of upheaval such as occurred after the crisis of World War I brought modernism’s deficiencies to public attention. Authority’s adherents today rightly question modernism’s emphasis on the rational and autonomous individual pursuing truth and goodness through the independent application of her own reasoning. They look nostalgically back to the more communitarian moral structures of former times when authority enmeshed individuals in a social matrix (see “Tao and the Myth of Religious Return”). What marked those premodern movements is almost entirely lacking in today’s postmodern world: a public means of moral warrant. Also, in times of great cultural complexity such as ours, institutional authority offers the temptation of simplicity that more appropriate justifications cannot match, for it only requires a power hierarchy to operate, and our culture is stuffed full of those. Yet we view authority with suspicion, don’t we? It will never be a match for the warrants modernism offers because it cannot escape the catastrophic failures of its past or the inherent limits of its mode of justification.

These challenges reached a crisis point in the great epistemological crisis at the dawn of the twentieth century (see “Modernism and Its Discontents”). But then, modernism had been born of crisis, had applied its own methodology to its conclusions and found them doubtful. Modernism’s severest critics were its own admirers, but their rational objections to its premises suffered from what might be called “the Clue problem.” Remember the board game that had you search for a murderer, a weapon, and a murder scene? Players had to base their hunches on incomplete knowledge; the structure of the game allowed multiple rationally warranted hypotheses to apply to the crime. Of course, only one could be correct. So it was with the empirical cannibals, who found rationalism offered multiple and convincing answers to questions about the true, the good, and the beautiful. But these theoretical creations were grounded in too weak a foundation, leaving persons starving for moral clarity to fill in the blanks with their own confirmation bias, a generalized problem until the twentieth century separated scientific thought from common sense, leaving it to the human sciences to play their own game of Clue with their own solutions. It is little wonder that the Romantics returned to gnostic revelation to offer greater certainty and that empirical science with its more rigorous procedure would dominate investigations of material reality, but the improvements in its methodology forced a vow of silence on judgments of quality or morality. The black hole of a missing moral framework, and particularly of a missing public moral framework, exerted such gravitational force that it radically distorted the entire history of the human sciences (see “The Calamity of the Human Sciences”). Modernism’s moral failures produced a second great knowledge crisis that consumed the entire twentieth century (see “The Axioms of Moral Systems”).
One failing solution with an even less satisfactory resolution produced the great warrant of postmodernism that continues to warp our public discourse (see “Postmodernism Is Its Discontents”).
