The Victorian Rift

 

Ah, love, let us be true
To one another! for the world, which seems
To lie before us like a land of dreams,
So various, so beautiful, so new,
Hath really neither joy, nor love, nor light,
Nor certitude, nor peace, nor help for pain;
And we are here as on a darkling plain
Swept with confused alarms of struggle and flight,
Where ignorant armies clash by night.

Matthew Arnold wrote “Dover Beach” in 1851, during the fourteenth year of Queen Victoria’s long reign. Contemporaries might have wondered what he was complaining about. Europe was pawing the world into submission, and Britain sat atop Europe as a rider on her steed. No far-flung, God-forsaken isle was too frigid, torrid, or barren to escape the Union Jack. No race, religion, or creed could resist its influence. As its mercantilism remade the world in its image, the landscape of the home country belched forth industrial centers as tirelessly as factories billowed smoke. Old forms of life vanished in the blink of an eye as new classes and values sprouted upon England’s green and pleasant land.

An exciting era, to be sure, and despite its breathless rush to the future, it was largely a peaceful one. Britain was the only major European nation to escape almost entirely the scars of war or revolution during the reign of Victoria. A continent inured to perpetual conflict saw its longest era of peace since the fall of Rome. The “Pax Britannica” was both impressive and real, a stable balance of power in the midst of a mad imperial scramble, despite the growling of that king of revolutions: the industrial one.

When Queen Victoria came to the throne in 1837, a journey from Rome to Britain still took fourteen days, the same length it had required of Julius Caesar’s legions in the final spasms of the Roman Republic. That was in 49 B.C.E. It is no exaggeration to claim that life changed more over the course of Queen Victoria’s reign (1837-1901) than it had in the preceding two thousand years. Scientific advances were more than matched by improvements in manufacturing, transportation, and quality of life. Britain became a true democracy. All that progress, though muted as it rumbled eastward across the Continent, still reverberated throughout the world.
The nineteenth century is called the Victorian Age for good reason: The sun never set on British values. In light of Britons’ pride in their own progress, it seems fair to ask what Arnold could possibly be referencing in these lines. The historian might say he was mourning the loss of an old way of life and fearing a new one. The layman living in our own blur of change might pose a natural question in response: what historical epoch has not produced its advancements and setbacks? Granted that the pace of life quickened during the Victorian era, hasn’t it quickened even more since? Has the last century revealed Arnold to be prophet or crank?

Consider another prediction, this one written by W.B. Yeats in 1919 in “The Second Coming.”

Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;
Mere anarchy is loosed upon the world,
The blood-dimmed tide is loosed, and everywhere
The ceremony of innocence is drowned;
The best lack all conviction, while the worst
Are full of passionate intensity.

If Arnold’s words are the voice of one crying in the wilderness, Yeats’s are the air raid sirens. The Irish poet issued his own prediction, not of private despair but of public collapse. Could anything have happened during the Victorian era not only to validate Arnold’s fears but to magnify them into that rough beast of cataclysm Yeats saw on the near horizon?

I wish to answer that question in this analysis. My effort must necessarily be limited by two considerations. First, a multiplicity of deep responses is already available in the usual categories: political, economic, sociological, military, and intellectual. Academic incentives being what they are, these broad divisions have been subdivided into their constituents down to a hair’s breadth and tilted to satisfy the current investigative climate. My interest here, as everywhere, is to assume my reader can find more than enough information on the trees, branches, and leaves but nevertheless lacks the one essential map to the entire forest. The key to a deeper understanding lies in the warrants the actors in the drama brought to the stage. And though any historical era may with profit be examined by these means, the Victorian era, particularly the last half of the nineteenth century, is unique in the employment of its warrants for moral goodness (see “What Do We Mean by ‘Morality’?”). It faced a clash of positions it could neither reconcile nor ignore, though its opinion makers tried to do both. Worse, it is doubtful they fully understood this discord, living as they did in the eye of a hurricane whose intensity they had not predicted and whose future path they could not know.

My second consideration is that this clash, poorly examined in itself, has profoundly influenced our own age, when “ignorant armies clash by night.” The situation has worsened because the Victorian Age has been mistaken by its postmodernist critics (see “Modernism and Its Discontents”) for a general epistemological crisis when it was in truth a moral one. Their critique has removed from consideration the very tools necessary to repair the breach (see “Postmodernism’s Unsettling Disagreements”). It is not that “the centre cannot hold.” There is no center, at least in the sense the Victorians thought of it: as a unifying moral imperative.
We read in the despair of Arnold and Yeats a message in a bottle floating up from the distant past to our own stormy present. The challenge is to seek the source of their despair and, having traced its effect on their world, examine its dangers to ours.

The Victorian era marked the mature phase of modernism, an intellectual movement begun in the seventeenth century as a desperate effort to find the means to warrant persons’ claims to truth, goodness, and beauty. By the time Arnold wrote “Dover Beach,” modernism had acted upon ten generations of Western culture. It began as a jury-rigged response to the Protestant Reformation of the sixteenth century. That response was necessitated by a total collapse of warrant over the 150 years of Reformation bloodshed. Declarations that had been defended by authority since time out of mind were disputed, challenged, and rejected in the name of other religious authorities believers found more convincing. But as Jesuits warred upon Lutheran clergy and Lutheran princes upon Catholic kings and Calvinists upon dissenting sects and prophets upon prophets (history offers few bloodier, more obdurate, or more miserable examples), authority itself came under suspicion for what seems in retrospect an obvious flaw. Of the five correspondence warrants for truth and goodness (see “What Counts as Justification?”), only authority fails to provide a means to reconcile dispute within its own mode of warrant (see “Authority, Trust, and Knowledge”). And that proved problematic, since religious authority had provided the base support for all goodness claims and most truth claims that molded medieval culture. Replacement warrants were eventually found. They had to be. These became the defining ingredients of modernism, so that after 1700 all truth and goodness claims had to find their justifications in reason and closely examined experience, both appealing to the individual for ultimate validation. Modernists assumed persons shared a universal reasoning faculty. Much of their early thinking reflected an attempt to define it, explain its workings in irrefutable terms, and examine how it adapted to experience (see “Modernism’s Midwives”).
The goal was to provide a substitute for the certainty of authority’s justifications, which now lay shattered along with the doctrines of the Hutterites, the Schwarzenau Brethren, the Knipperdollings, and a thousand other sects. That proved difficult, for the more rigorously these everyday processes of reasoning were examined, the less confidence thinkers could place in their reliability. Of course, this doubt was achieved by precisely the process it disputed, as each generation of thinkers challenged the rational assumptions buried in the explanatory models of its elders. This assault exacted its price, at least on serious epistemological thought. By Arnold’s day, that lack of confidence was still confined to a learned few who could master abstruse thinkers like Hume and Kant. Their skepticism entered the culture indirectly through academic circles but would be widely disseminated in the second half of the nineteenth century through a new popular media that spread doubts about universal reason.

Those doubts faced a clear counterargument, not in theory but in practice. No one affected by Western culture could dispute the astounding success of natural science, and those who admired or practiced it felt completely justified in pointing to the ever-tightening subject disciplines of the sciences to prove the skeptics wrong (see “The Limits of Empirical Science”). Here was modernist theory in full flower: universal reason applied to experience, unlocking the secrets of the universe, discovery building upon discovery, invention upon invention, touching daily life from Montevideo to Manchuria and probing the very mind of God. Empirical science confirmed modernist warrants, the proof lying in a cataract of discovery and an ocean of invention as each generation of scientists stood on the shoulders of the one before. The philosophers of mind had also been called empirical, yet their investigations of experience had produced only doubt and despair; the true empiricism of natural science tramped through the nineteenth century, a conquering army of ever-tightening disciplines. Throughout the world, who could resist this proudest envoy of modernist warrants?

Now, from the superior vantage point of our own interests, some of us might challenge such a rosy picture of natural science, seeing buried in its discoveries the seeds of exploitation of native peoples and the environment, and in its inventions the means to destroy social harmony. That condemnation has merit, yet framing it relies on the very modernist justifications it condemns, calling as it does for analysis and evidence to support or dispute. That argument highlights the growing gap between what might be called “common sense” and the increasing standardization of scientific endeavor. Victorians were slow to recognize that reason and science are not synonymous, for the latter requires a very narrow refinement of the former, and that narrowing precludes an empirical evaluation of empiricism itself. Neither Victorians’ admiration nor our own doubts about the goodness of science can be warranted by the same means science uses to understand phenomena. Science cannot find goodness beyond utility, cannot issue judgments of quality or morality. Yes, it can find a good material from which to construct an elevator cable or to prevent sunburn, but it cannot use those methods to decree that the cable should be made or the sunburn prevented. It can extend life, but it cannot demonstrate by its method that life should be extended. And that proved an insuperable problem. As its truth-finding methods were narrowed by the ever-tighter rules of empirical inquiry, culminating by the time of Victoria’s death in the mature academic scientific disciplines we all now know, that same constriction of methodology shucked off any hope of goodness claims discoverable by natural science. At the very moment when it mastered the most reliable means to know truth, or at least some kinds of truth, its means of mastery precluded examination of any goodness beyond simple utility (see “What Do We Mean by Good?”).

To the Victorians awash in random transformation and change, that limitation was insupportable. For them as for us, truth and goodness are intimately connected. What is the point of a correct knowledge of reality if it cannot direct our choices (see “Truth and Beauty Do a Dance”)? What good is science if it cannot make life better, cannot even address the meaning of “better”? In the welter of choices Victorians faced daily, what could differentiate good, better, best if not the triumphant methods of the sciences? If empiricism cannot quantify or assist our choosing of what is good, what can?

Now this crisis arrived at that odd intersection of philosophy’s assault on reason and empirical science’s appropriation of it. The way narrowed further as natural science refined its methodology, sending its rejects into the shadowy disgrace of pseudo-science. This constriction of access to truth only deepened an incipient hypocrisy built upon Victorians’ atavistic defense of authority and their baseless assertions of moral superiority. The Victorian mind endured a gnawing doubt beneath its arrogance. Even as explorers carried their gospel of wealth to guano miners in Chile, as missionaries carried their quite different gospels up the Congo River to bewildered peoples in the world’s hearts of darkness, as obedient daughters imbibed arcane rules of etiquette in Connecticut drawing rooms, the culture groped for the iron laws of nature that might order and warrant its values but found only a receding tide of justification as empiricism tightened its method. This simultaneous discrediting of common sense reasoning and glorification of empirical processes accentuated Victorians’ hollow moral life and left a space that three justificatory schemas tried to fill. These were authority, Romanticism, and the human sciences. Each professed to offer the path to reliable knowledge of goodness. Our own age testifies to their failures.

The oldest of these had been beaten yet had not surrendered. Modernism had developed during the seventeenth and eighteenth centuries as a response to the failure of authority, specifically religious authority, in the nightmare of the Protestant Reformation. We can enter that nightmare only retrospectively, so different is our own worldview from that dominant before 1517, when Martin Luther began his revolt against the Catholic Church (see “Premodern Authority”). Authority’s glory as a warrant was that it offered a sphere of absolute certainty in which truth and goodness claims are mutually supportive. Because its truth claims must be accepted on the basis of trust or belief, the goodness preferences they framed must seem to the believer an inseparable part of the package (see “The Fragility of Religious Authority”). Absolute authority suffered from none of the embarrassing bifurcation of truth and goodness claims that modernism has faced. During the 150 years of the Reformation, authorities countermanded each other’s declarations with nauseating predictability and deadly consequence, a seemingly endless quarrel that resolved itself only with modernism’s new warrants of reason and experience. Yet despite those repeated humiliations, the lure of certainty remained a very strong one. Many persons found tradition a great comfort amid the dislocations of the Victorian era, despite its embarrassments and longstanding failures. Rulers still found it expedient to add divine will to their own, and we cannot doubt that missionary motives account for a good part of the zeal with which Europeans sallied out to dominate the world, at least at first. Nearly everyone failed to see that premodern warrants were now vying with modern ones, at least until someone like Darwin forced them to. We should not see this longing for tradition as simple nostalgia.
It reflected modernism’s poverty in justifying moral claims, a poverty its ethical theories struggled to reconcile with the individual moral agency that lay at its core. Arnold saw the issue as a draining away of “the Sea of Faith,” an irreparable loss. Many modernist writers reflected that sense of loss well into the twentieth century (see “Tao and the Myth of Religious Return”). Our current cultural crisis echoes at least some of that longing to return to absolute religious authority despite its inability to resolve challenge in a multicultural environment (see “Belief in the Public Square”).

That longing remained unrequited for many in the Victorian era, as it does today, because authority had long since lost its aura of certainty. Persons might still attach themselves to it, but the zeitgeist required them to plumb their own sense of moral responsibility before forming that attachment, weighing the alternatives that now presented themselves as living options, hypothetically calculated and rationally rejected, so that an affiliation with authority might be continued (see “Toward a Public Morality”). But that prudential calculation was in itself an expression of doubt that only individuals’ moral agency could overrule, putting authority perpetually under assault by the axioms of modernism, which presented a buffet of alternatives to those sheltering from the storm of change in the bulwarks of tradition. This tension expressed itself in the repeated efforts of Victorian thinkers to present the components of tradition as separate appendages of a single body of authority, as though parents, church, culture, and God had spoken the same words with the same voice. But as Victorianism matured, this effort proved increasingly desperate and ineffectual. One cannot read the literature of that era without being struck by the strained attempt to unify the claims of its authorities into a single sweeping justification rooted in tradition, nor fail to observe the untidy outcomes of that effort as the winds of change battered the old fortifications. By the middle of Victoria’s reign, the stress of maintaining the forms of traditional authority while also acknowledging the source of moral power in individual reasoning began moving toward crisis.

One vibrant moral alternative presented itself in an endless loop in popular media, that new and stimulating rival to traditional sources of knowledge. What the media pitched after 1800 was a relentless Romanticism that found reliable declarations about truth and goodness within the receptive heart. In one sense, Romanticism was a conscious antithesis to modern preoccupations with careful reasoning about experience. Ironically, it was born from that long search. When Kant argued in 1781 that our minds cannot guarantee knowledge of reality but must instead operate upon phenomena floating into consciousness from preconscious mental constructions, he completed an intensely rational critique of reason and experience begun amidst the ruins of the Reformation by Descartes a century and a half earlier. The possibility that these mysterious intuitions were discovered rather than constructed became a new obsession for German thinkers. More important by far, the desperate hope that they might be divinely inspired revived what had been a lost quest for moral clarity, one that still respected modernism’s elevation of individual moral autonomy. By 1798, when Wordsworth and Coleridge rushed their poetry into print, the thought revolution shifted from the classroom to the garden, from the mind to the emotions, and from the educated few to the literate many. The charm of this new popular Romantic literature was obvious. It explicitly rejected the abstruse and dry theorizing of the philosophers, with its plodding and contentious analysis, and replaced it with something glorious. The empirical philosophers, most notably David Hume, had bound modernism’s moral options to cultural consensus or sentimentality with no sure rational guarantee. This noose tightened with the attack upon common sense waged by an increasingly confident and professional empirical science.
But Romantic glorification of intuition promised to bypass these rational limits and grant a level of moral certainty lost since the Reformation. And it seemed so easy, so accessible, and so universal! Nature, silence, exotic experience: all were portals to indubitability. Romanticism took modernism’s relocation of moral autonomy and elevation of experience quite seriously. It tapped a long-standing vein of Gnostic revelation in religion to imagine a pantheist god-in-nature whispering to that part of itself that is the private human soul, communicating to the moral agent not only certain truth but also what modernism could never supply and what the age so desperately needed: certain goodness. “What I think is right, is right,” Thoreau proclaimed. “The heart’s emphasis is always right.”

As Romanticism wrapped the new popular culture in its warm embrace, it found itself confronting and opposing the clear-cutting march of empirical progress. The two approaches proved irreconcilable. Science could be performed, but it could not be lived. Its reliance on analysis, the rational disassembly of carefully monitored experiences, gave empiricism its power to discern the truths those experiences conveyed, so long as they proved capable of isolation, measurement, and repetition. But what could one do with such truths if they did not also reveal what one should do? Romanticism’s attraction lay in its universal truths privately transmitted, but its power lay in its claim of an easy and intuitive discovery of moral and qualitative goodness. And this was a claim science could neither match nor refute. “Sweet is the lore which Nature brings;/Our meddling intellect/Mis-shapes the beauteous forms of things:/We murder to dissect.” Explicitly opposed to modernism’s reliance on reason, and even more opposed to reason’s scrupulous applications in science, Romanticism nevertheless appropriated modernism’s utter rejection of authority. But as the Victorian age advanced, it was empiricism that won the battlefield. As Thoreau drifted above Walden Pond in 1854, he wondered at the mysterious harmonies Mother Nature had wrought among fishes and frogs and frothy wave, at the organic wholesomeness of the thousand caresses of nature’s god. Five years later, Darwin explained it all quite differently: as a desperate and vastly wasteful war of each against all for survival, nature red in tooth and claw, determinism without purpose (see “The Determinism Problem”). And that presented every person exposed to Victorian culture with a quandary. How could these deep antinomies be reconciled on the factory floor or in the ladies’ drawing room, before the mast on rollicking tea clippers, or in earnest discussions in Chautauqua lecture halls?

None of this was as cut-and-dried as presented here, which only added to the confusion. The good Victorian was expected to somehow integrate these warring values: to be both heartfelt and deliberate, intuitive and rational, sensitive and sensible, respectful and daring, to treasure both heart and head, though how to arbitrate their conflicting instruction could never be resolved. Much of the disrepute we associate with Victorianism is traceable to this impossible task, and at least some of the confusion of our own age was born in the Victorian response to modernism’s poverty of ethical theory.

The zeitgeist required that any possible solution demonstrate a deep respect for the moral autonomy of the individual, but of what use is any moral theory if it cannot prescribe the means to reconcile conflict between friends and among strangers, and do it by means all parties can embrace? This requires that the mode of warrant be congenial to disputants and that it not reject whatever axioms persons bring to their warrants (see “The Axioms of Moral Systems”). But the contraries of Victorian life blocked such agreement in public disputes at the historical moment when absolute authority revealed itself hopelessly compromised. Kings and emperors might still wrap themselves in the purple and squat on their gilded thrones, but practicality required them to accede to the popular will. If only they could find it! And private morality faced a similar impasse as persons sought some thread of consensus in their preferences or at least some axiom congenial to their divergent private interests. In hindsight it seems unsurprising that the only moral stance that could respect such diversity while providing a consensual framework for morality was one that prescribed the means rather than the ends of morality. This moral theory was utilitarianism. It deeply respected individual desire, abstaining from taking a position in the moral melee that so confused the age. Its only rule was that desire be subjected to the dictates of universal reason. To avoid charges of total pragmatism (those would have to wait until the next century), utilitarianism clothed its methodology in the one area of knowledge where modernist axioms had clearly triumphed: science. Its aim was to be an empirical science of morality.

I have already written that effort off. We know that true science must fail in finding goodness for the exact reason it succeeds so well in finding some kinds of truth: its mode of inquiry severely limits the kinds of experiences it can examine. And these are only a very narrow slice of life, though natural science has performed wonders in that range. Science describes perceptual phenomena with precision and breaks them into their components with surgical care, forming hypotheses and testing them in settings wherein variables can be isolated and experiences precisely duplicated. Of the correspondence proofs of judgment, empirical science is the royal road to truth, at least for those few truths thus discoverable. We know that now. Over three centuries of traveling that road, we have gradually discovered just which kinds of truths science can seek and find and which it can’t. That process is ongoing, but it achieved maturity only at the end of the Victorian era with the widespread adoption of the German model of academic science. Before that, there had been both hope and confidence that science would one day crack the nut of morality as it had of biology and physics, that we would one day possess prescriptive knowledge as reliable as the descriptive knowledge science continued to unwrap about the world. To prescribe moral goods would require more than a new empirical discipline. It would require a new science.

From the very beginning of modernist thinking, the science of nature had been a model for an unfolding science of man. Early epistemologists expressed full confidence that human behavior would one day be as accessible to scientific inquiry as the larval stage of the fruit fly. Victorian-era thinkers like Auguste Comte and Herbert Spencer practiced a positivism that promised to decode free will and prescribe irresistible social goods. In order to claim empirical warrant, though, that promise had to eliminate the troubling intrusion of human freedom (see “Our Freedom Fetish”) so as to make human behavior predictable and testable. The positivists had every confidence that science would one day solve this problem, but even if it could resolve this basic analysis of human nature, it would still face the thornier issue of prescription. It would not only tell humanity necessary truths; it would use them to clarify necessary preferences, to replace religious certainty with scientific precision. That failed. As the hard sciences continued to refine their methodology, the soft ones found themselves unable to resist their own hubris, a smug superiority instilled by what seemed a scientific approach yielding irresistible moral conclusions. We see it in the positivists’ airy dismissal of challenging paradigms and differing methodologies. Better-known practitioners include Adam Smith, Karl Marx, and Sigmund Freud, whose analyses of human behavior blurred description and prescription to the point of intentional deception. Who could question Smith’s admiration for the wisdom of the invisible hand, Marx’s regret over the loss of feudal communitarianism, or Freud’s contempt for religious authority? Was this science? In a Victorian world reeling from a simultaneous banquet of scientific progress and famine of reliable moral theory, a conjunction seemed inevitable, particularly since the boundaries of what passed as science in the popular mind made it seem akin to magic.
All manner of studies were treated to a kind of pseudo-scientific seriousness as physiognomists built their catalogues of facial features, ethnologists of racial types, occultists of paranormal manifestations, and so on. Why shouldn’t empiricism in its deep study of human motivation reveal a science of moral health?

Utilitarianism tapped into that rich vein of promise with its seemingly precise quantifying of hedons and dolors and its rankings of higher and lower pleasures. Its premise was a reasonable one. If persons have both universal reasoning and moral autonomy, no moral goods should be externally prescribed nor any duties imposed. Private desire seems too varied for such directives, and the presumptions of modernism assured Jeremy Bentham and John Stuart Mill that individuals could apply reasoning to experience to resolve dispute (see “Three Moral Systems”). But that was not happening in the Victorian milieu. Romanticism and traditional authority denied the legitimacy of universal reasoning, and an increasingly rigorous hard science found no moral implications in its methodology, so the theorists of utility embraced a more wholehearted positivism that they hoped would provide clarity and direction in defining either how desires should be quantified or, in contradiction to the foundational principles they claimed to embrace, which goods persons should desire. Unfortunately, their faith in even the most liberal human science could not remedy the fundamental hedonism of their prescriptions. Which desires should persons value and which pains should they avoid? How could the intensity of desires or pains be quantified or publicly compared? Why should one person’s valuation of a desire or a pain trump another’s? As Victorians attempted to employ utilitarianism, it seemed that Hume and the Romantics had been right in thinking morality a deeply irrational, emotional, and personal endeavor. But on what grounds might persons meet to resolve their disputes? How might they justify those pronouncements of public morality that we call laws (see “The Foundations of the Law: An Appetizer”)? Such activities seem in retrospect the very antithesis of science. And so utilitarianism found itself forced to evolve in practice. It blurred science and philosophy for its truths and sought consensus by cloaking itself in empirical authority. Only Victorianism could have produced such a toxic emulsion.

When American thinkers at the turn of the twentieth century proclaimed Progressivism the new science of education just as compulsory schooling spread across the land, an eager public lapped up its seemingly scientific promise of “natural development” and “social adjustment.” John Dewey happily blended philosophy and psychology in developing his educational theories, which percolated through teachers’ colleges into every American classroom of the twentieth century. This new scientific education would prepare students for the world of tomorrow and put them on a firm, modern footing. Strangely, the education establishment ignored the source of Dewey’s thought even as its premises sapped content from curricula. But the psychology of this new education really looked to the past, not the future, for its verification. Despite his dreams of turning classrooms into laboratories and experiences into experiments, Dewey was actually inspired by the eighteenth-century Romantic philosopher Jean-Jacques Rousseau, whose novel Emile was equal parts invention and idealist speculation, the very antithesis of the empirical process Dewey supposedly espoused in 1894 as chair of the Department of Philosophy, Psychology, and Pedagogy at the University of Chicago (consider that title!). Even today, vestiges of his hopes persist in the deference paid psychology and economics as “human sciences” despite their manifold paradigms and miserable track record (see “The Calamity of the Human Sciences”). Can we fault the Victorians, in a far less defined scientific climate, for grasping at human science for moral clarity? We may excuse them their desperate quest for qualitative and moral goodness, but our kindness brings us no closer to resolving the putrid bequest they passed on to us.

The failure of authority, Romanticism, and the human sciences to warrant moral goodness was a continuing crisis in Victorian life. The attempt to combine them made matters far worse.

Victorians might try to blend the three moral axioms that percolated through their culture, bending authority to embrace Romanticism as Methodism and pietism did, or finding the rationality in utilitarianism when no empiricism could apply. But the axioms supporting external authority could never be reconciled with universal reason, which in its turn could never be made conformable to private desire, and so religion and utilitarianism both failed as means of arbitrating public goods. Still, we see everywhere in Victorianism an effort to somehow integrate moral authority into individual moral agency, most powerfully in university curricula, where Victorian students faced a constant tug of war between their own moral sense and a stridently authoritarian faculty. Given that hypocrisy, it is little wonder that universities became the locus of intellectual ferment, beginning with the Revolutions of 1848, continuing with all the flirtations with novelty in the first half of the twentieth century, and culminating with the campus protests in Paris and the U.S. in 1968.

Failing in an outright merger, Victorians might slice their moral lives in other ways: seeking moral truth in some deeply emotive aesthetic movement as Ruskin recommended while simultaneously embracing a superficial utilitarianism in daily life or seeing women as repositories of emotional truth and men as its rational actors. This effort required a stereotyping of gender roles from which we have yet to recover. None of it worked. None of it could. The axiom of moral autonomy could not be reconciled with externalized authority. The Gnostic inner certainty of Romantic intuition abraded against the plodding workings of reason. Universal reasoning could not be made to align with private experience so long as desire was regarded as inviolate. And the confident voice of empirical science must be rendered dumb by the prerequisites of moral goodness despite the ongoing efforts of the crusaders of the human sciences. The Victorian era dissolved into its own contradictions as the nineteenth century ended, falling like the faded petals of a multifoliate rose.

The pulse of the crisis quickened even further as the twentieth century dawned. Good Victorians were aghast at what the new century produced: a final rejection of tradition in favor of a desperate novelty in every sphere of cultural life. “Make it new,” said Ezra Pound. “I cannot make it cohere.” The new century, even more than the old one, could find no iron for its spine, no true north for its desperate cultural navigators, no consensus to quell the millions of plaintive voices each avowing its own moral universe. Efforts to amalgamate the assumptions of modernism with other moral frameworks had not only failed but had highlighted that failure. That same pop culture that had sentimentalized Victorian values now began to mock them, a process that consumed the entire new century, itself a backhanded tribute to the power of the Victorian model. This oscillation between sentimentality and derision proved to be a final realization of the old high versus low cultural battle that had raged through the nineteenth century.

Romantic pantheist certainty was the first to fall away. The rules of social order stiffened into immobility in the last years of the nineteenth century as authority found itself relentlessly challenged by new modes of life and responded with rigor mortis. Romanticism had offered a bracing opportunity for innovation in the first half of the century, but the appeal to emotional intensity grew stale and arthritic, formulaic and pompous. The wild emotionalism of Shelley collapsed into the conventionality of Tennyson. The private whispers of intuition were channeled into the broad streams of structured emotion. Inspiration was reduced to Duty. Late Victorians were instructed on the proper technique of summoning deep emotion. A new kind of furniture became popular: the fainting couch. Formalized emotion respected neither individual moral autonomy nor reason applied to individual experience. The pantheist voice of God was reduced to absurdly articulated etiquette-as-morality: as such, just one more failing authority.

Desperation for moral clarity combined with a deep awe for the achievements of natural science and a continuing popular confusion about empirical limits. The wild beasts of Fauvism might compress dimensionality. Naturalist writers might depict a Darwinian vision of a contingently determinist universe. Einstein might contend that reality violates every assumption human reason brings to any interpretation, and Freud that consciousness compromises itself with every reflection. The new century promised a larger and deeper comprehension of reality. Didn’t this novelty itself foretell some new synthesis built from the shards of apparent moral chaos? So went the narrative at the turn of the twentieth century, born in a paroxysm of technological promise, a wild enthusiasm for anything novel, and a developing disdain for Victorian values. Then came the war.

Had it not been so horrifying, it would have been a mercy killing. World War I was Victorianism’s death sentence and execution. It accomplished more than that. The new century had promised some new synthesis; the war buried that promise. Just as modernists found it impossible to put themselves in the place of those who lived before the Reformation, so too do we find it impossible to resurrect the Victorian world view except as historical artifact. Only its shards remain. Everything Victorianism hoped for, every remnant of confidence, was shattered by the war that began with heroic cavalry charges and ended in the yellow fog of mustard gas and the bark of the machine gun. We have had conflicts more destructive in human deaths since, but these were not only prophesied but made inevitable by the nature of that first and most futile of world wars.

The entire twentieth century was drawn by the lines of torpedo wakes and trenches in the agonies of the worst war in human history, worst because so many died so horribly without reason and in glaring contradiction to the pronouncements of the culture, a war that reduced mass killing to a science. This war to end all wars wiped the moral slate, allowing the unspeakable replacements of communism, fascism, and consumerism to be etched in persons’ minds not because these proved reasonable but because nothing else did, and these at least boasted a scientistic facade. The moral landscape at the end of World War I resembled the blasted no-man’s land between the trenches. All prior restraints were wiped away. No limits circumscribed the inventiveness of minds conditioned to seek truth in human science and moral commitment in Romantic emotionalism. The communism of Lenin owed as little debt to Marx as Marx did to true science, and the anthropological roots of fascism were an insult to actual science. Freud’s contorted visions of the mind were inventive to be sure, but they had very little relation to neurology. Jung’s had even less, later psychological paradigms less still. The frantic novelty of the new century demanded the simulacrum of science rather than the reality. It demanded science with a conscience. Pseudo-science rushed in to fill all vacuums, not only of moral goodness but also of judgments of quality. The science of man had promised to remake our understanding of human nature. The human sciences now offered to remake that nature itself. This exaggeration and falsification of empirical power, the deep error we know as scientism, was easily overlooked in the orgy of discovery and invention that natural science produced, produced in the shadow of unspeakable moral failure. 
If science could now be crowned the undisputed monarch of truth-finding, scientism became its sycophantic courtier, promising an epistemology of goodness as reliable as empiricism’s excavation of truth, indeed as reliable as religious authority had once seemed. Western civilization was shell-shocked by its glaring moral failures. World War I ensured that. Science seemed to offer a resolution of near religious certainty, a promise now fulfilled in technology and still utterly lacking in moral theory.

All three modes of warranting morality practiced by Victorians devolved throughout the twentieth century. Scientism in the human sciences pitched its false hopes through most of that dismal epoch and bears responsibility for much of its dysfunction. The postmodern solution, which only resolved into clarity in the 1970s, has challenged scientism even while appropriating its method and terminology, along with the universal reasoning it is based upon (see “One Postmodern Sentence”), but the weakness of the Victorian vision of moral structure, particularly of public morality, sustained the moral vacuum that still cries for resolution today (see “Postmodernism Is Its Discontents”). That same alienation deprived Romantic pantheism of its inspiration, reducing intuitions to private constructs rather than divine discoveries. It inspired generations of postmodern critics of popular culture to embrace a private existentialism both romantic and rebellious (see “Freedom, Antihero, and Zeitgeist”), and it led others to abandon all hope of clarity or consistency and settle for a self-serving emotivism that reduces moral agents to consumers of culture and the hard-won moral autonomy of our nature to a surrender to the formative power of environment. Nostalgic believers recognize the flaccidity of the culture, believing that a louder or sterner reminder of categorical morality will return secularists to their duty. Their listeners remain puzzled to this day about what that duty might be.
