- Queen Victoria’s accession in 1837 marked an era of industrial and imperial dominance that transformed Britain into a modern nation-state; this transformation was won at the expense of workers and women in the home country and non-European peoples in the rest of the world who were forced to modernize in the Victorian model.
- Daily life changed far more in the sixty-three years of Victoria’s reign than it had in the preceding two thousand; this dynamism was largely fueled by the successes of empirical science, but the era is shot through with deep fractures: hypocrisies and contradictions that climaxed in the horrors of World War I and the century following.
- Historians have teased out political, intellectual, economic, religious, and social strands that tangled together to motivate the major actors of this era, but this view either privileges the impersonal forces that determined outcomes or denies them in favor of a psycho-social analysis of motivations, both of which imply determinist forces dominating intentionality.
- My interest here is solely in examining the warrants that moved individuals’ and cultures’ moral preferences during the era and to examine the irreconcilable conflicts of intention that produced so much progress tainted by so much hypocrisy; this focus portrays World War I less as an avoidable catastrophe than a natural outcome, the climax of five centuries of axiomatic contradiction that cumulatively produced the second great moral crisis in human history.
- Its outcome shaped the era to follow, largely because analysts mistook a moral crisis for an epistemological one, an error that began in Victoria’s reign.
- The Victorian era marked the mature phase of modernism, a movement forced into being by the crisis of religious authority in the sixteenth century; its guiding axioms privileged closely examined individual experience and universal reasoning as the means to determine truth and goodness.
- As seventeenth century modernist thinkers investigated a means of verification to replace religious authority, they engaged in modernist cannibalism: a critical analysis of the nature of both experience and reasoning that revealed serious difficulties in the truth and goodness claims that these axioms of commitment produced.
- The eventual beneficiary of this cannibalism was empirical science, which gradually used this self-critique to perfect its methodology of limiting experiences and maximizing reasoning on them; mature science only emerged toward the end of the Victorian period, but nevertheless its methods and technological products dominated the era.
- As capable as emerging science proved to be, its restrictions also began to close off any utility in using it as a consensual moral guide capable of replacing institutional authority; by Victoria’s coronation, this limitation had not only begun to emerge into public view, it had spawned a popular cultural response, Romanticism.
- Romantics were definitionally anti-science; they saw pantheistic intuition as a certain guide to universal truth and moral goodness; such truth was most clearly revealed in nature and in powerful emotional responses that were renewed by exposure to varied exotic experiences.
- Romantics were as opposed to institutional authority as they were to scientistic thinking, so although they admired intense belief, they were unwilling to champion a revival of institutional religion.
- A cultural battle ensued, with religious authority always asserting its (now ironic) demands for trust in opposition to Romanticism’s admiration for private agency guided by intuition; these contradictions were also in conflict with empiricism’s restrictive methodology and stern devotion to immediate utility; the Victorian mind was somehow expected to arbitrate the conflicting axiomatic demands of competing methodologies to find truth, goodness, and beauty, an effort that could not succeed, even as it planned imperial conquest and revolutionized societies.
- By the last third of the nineteenth century, a reconciliation of sorts had been found: a “moral” philosophy ostensibly scientific, ostensibly individualistic, and ostensibly moral, a “system” of preference that privileged utility and majority will above all: utilitarianism.
- But utilitarianism could not resolve the inherent contradictions that had produced it, not to mention the ones it added; though it claimed scientific validity, it was pseudo-scientific, and though it pretended to value individual preference, it demanded persons sacrifice their own desires to majority interests; and though it pretended to reduce societal choice to mathematical precision, it could prescribe no means of preference other than majoritarian desire.
- As religious authority’s axiomatic contradictions tainted institutional structures for the larger society, Victorians began gravitating toward scientism at that historical moment when empiricism found the means to professionalize; this produced a proliferation of pseudo-sciences, particularly in the “human sciences” that sought to subject preferential freedom to empirical determinism and therefore perfect society.
- This toxic emulsion of moral vacuum and axiomatic contradiction culminated at the end of the century with admixtures of authority, Romanticism, and the human sciences that highlighted both the inherent contradictions of each and the impossibility of blending them into consensual moral direction.
- Victorians utterly failed to blend individual agency with trust in institutional authority, empirical determinism with human freedom, intuition with inductive reason, and private belief with public will.
- Whereas cultural voices had reinforced institutional directives early in Victoria’s reign, by the end of the century, popular culture began a sustained critique of existing mores, an exercise in modernist cannibalism that lasted for the entire twentieth century.
- The first to fail was Romantic intuition, felled by the mechanistic theories of empirical discovery and technological invention; scientific progress then stretched the distance between ordinary reasoning and empirical research and found the former painfully inadequate, leaving a broad space for the interjection of the human sciences as the moral arbiters of the new century.
- The twentieth century was shaped by this sense of failure so powerfully reinforced by the Great War; Western civilization had conquered the world, but the moral vacuum it created was inadequately filled by the human sciences after 1918.
- Throughout the twentieth century, scientism pitched its false hopes, postmodernism continued to exploit what modernist cannibalism had revealed, Romantic pantheism devolved into the virtual circle, and nostalgists bemoaned the loss of religious authority as moral arbiter.
Ah, love, let us be true
To one another! for the world, which seems
To lie before us like a land of dreams,
So various, so beautiful, so new,
Hath really neither joy, nor love, nor light,
Nor certitude, nor peace, nor help for pain;
And we are here as on a darkling plain
Swept with confused alarms of struggle and flight,
Where ignorant armies clash by night.
Matthew Arnold wrote “Dover Beach” in 1851 during the fourteenth year of Queen Victoria’s long reign. Contemporaries might have wondered what he was complaining about. Europe was pawing the world into submission and Britain sat atop Europe as a rider on her steed. No far-flung, God-forsaken isle was so frigid, torrid, or barren as to escape the Union Jack. No race, religion, or creed could resist its influence. As its mercantilism remade the world in its image, the landscape of the home country belched forth industrial centers as tirelessly as factories billowed smoke. Old forms of life vanished in the blink of an eye as new classes and values sprouted upon England’s green and pleasant land. An exciting era, to be sure, and despite its breathless rush to the future, it was in the main a peaceful one. Britain was the only major European nation largely to escape the scars of war or revolution during the reign of Victoria. A continent inured to perpetual conflict saw its longest era of peace since the fall of Rome. This impressive achievement was won at the expense of the rest of the world, though. The “Pax Britannica” was both impressive and real, a stable balance of power in Europe in the midst of an obscene imperial scramble, despite the growling of that king of revolutions: the industrial one. When Queen Victoria came to the throne in 1837, a traveler could still move no faster than a horse could gallop or a sailing ship could run; the journey from Rome to Britain took about fourteen days, scarcely quicker than the same route in the age of Julius Caesar, nearly nineteen centuries earlier. It is no exaggeration to claim that life changed more over the course of Queen Victoria’s reign (1837-1901) than it had in the preceding two thousand years. Scientific advances were more than matched by manufacturing, transportation, and quality of life improvements. Britain became a true democracy, though hardly a universal one.
All that progress, though muted as it rumbled eastward across the Continent, still reverberated throughout the world. The nineteenth century is called the Victorian Age for good reason: the sun never set on British values. In light of Britons’ pride in their own progress, it seems fair to ask what Arnold could possibly be referencing in these lines. The historian might say he was mourning the loss of an old way of life and fearing a new one. The layman living in our own blur of change might pose a natural question in response: what historical epoch has not produced its own disruptions? Granted that the pace of life quickened during the Victorian era, hasn’t it quickened even more since? Has the last century revealed Arnold to be prophet or crank?
Consider another view written by W.B. Yeats in 1919.
Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;
Mere anarchy is loosed upon the world,
The blood-dimmed tide is loosed, and everywhere
The ceremony of innocence is drowned;
The best lack all conviction, while the worst
Are full of passionate intensity.
If Arnold’s words are the voice of one crying in the wilderness, Yeats’s are the air raid sirens. The Irish poet issued his own prediction, not of private despair but of public collapse. Could anything have happened during the Victorian era not only to validate Arnold’s fears but to magnify them into that rough beast of cataclysm Yeats saw on the near horizon?
I wish to answer the nub of that question in this analysis. My effort must necessarily be limited by two considerations. First, a multiplicity of deep responses is already available in the usual categories: political, economic, sociological, military, and intellectual. Academic incentives being what they are, these broad divisions have been subdivided into their constituents down to the hair’s breadth and tilted to satisfy the current investigative climate. That usually means either a pseudo-scientific view of deterministic forces, in line with treating history as a social science whose explanatory axioms require that its actors be molded by events rather than direct them, or an analysis less postmodern and more sympathetic to the psycho-social motivations of the era’s prime movers, one that narrows the scope of influence without lessening its determinative power. These efforts are necessarily reductionist, either in scope or theory, for every person’s story flows from many headwaters and the birth of every social change has a thousand fathers. My interest here as everywhere is to assume my reader can find more than enough information on the trees, branches, and leaves but is nevertheless lacking the one essential map to the entire forest. The key to a deeper understanding lies in the warrants the actors in the drama brought to the stage. These involve the preferences that moved the intentionality of the persons who molded and responded to the events of the age. My presumption of moral agency runs counter to the fuzzy determinism of contemporary historians that seeks to find the pressure points on intention, to reduce it to motivations beyond the control of the individuals who made the history. Some truth attaches to that methodology, of course.
We are all moved by forces beyond our control, but our own experience informs us of how we respond to them with conscious purpose, moved by axioms of commitment and justifications that warrant our preferences to ourselves and, in the case of historical figures, to posterity, just as the historians who document their lives think themselves free to investigate them (see “What Counts as Justification?”). And though any historical era may with profit be examined by these means, the Victorian era, particularly the last half of the nineteenth century, is unique in the employment of its warrants for moral preference (see “What Do We Mean by ‘Morality’?”). It faced a clash of positions it could neither reconcile nor ignore, though its opinion makers tried to do both. Worse, it is doubtful they fully appreciated this discord, living as they did in the eye of the hurricane whose intensity they had not predicted and whose future path they could not know. They responded intentionally but without real consistency. My second consideration is that this clash, poorly examined in itself, has profoundly influenced our own age when “ignorant armies clash by night.” In truth, the situation has worsened because the Victorian Age has been mistaken by its postmodernist critics for a general epistemological crisis when it was in truth a moral one (see “Modernism and Its Discontents”). Their critique has removed from consideration the very tools necessary to repair the breach (see “Postmodernism’s Unsettling Disagreements”). It is not that “the centre cannot hold.” There is no center, at least in the sense the Victorians thought of it: as a unifying moral imperative appealing to trust or conscious sanction. We read in the despair of Arnold and Yeats a message in a bottle floating up from the distant past to our own stormy present. The challenge is to seek the source of their despair and, having traced its effect on their world, examine its dangers to ours.
The Victorian era marked the mature phase of modernism, an intellectual movement begun in the seventeenth century as a desperate effort to find the means to warrant persons’ claims to truth, goodness, and beauty. By the time Arnold wrote “Dover Beach,” modernism had acted upon ten generations of Western culture. It began as a jury-rigged response to the Protestant Reformation of the sixteenth century. That response was necessitated by a total collapse of warrant over the 150 years of Reformation bloodshed. Declarations that had been defended by authority since time out of mind were disputed, challenged, and rejected in the name of other religious authorities believers found more convincing. But as Jesuits warred upon Lutheran clergy and Lutheran princes upon Catholic kings and Calvinists upon dissenting sects and prophets upon prophets — history offers few bloodier, more obdurate, or more miserable examples — authority itself came under suspicion for what seems in retrospect an obvious flaw. Of the five correspondence warrants for truth and goodness, only authority fails to provide a means to reconcile dispute within its own mode of warrant (see “Authority, Trust, and Knowledge”). Trust requires a surrender of rational and moral agency and dispute requires an exercise of it, a re-acquisition of what has been willingly surrendered. And that proved problematic since religious authority had provided the base support for all goodness claims and most truth claims that molded medieval culture. Replacement warrants were eventually found. They had to be. These became the defining ingredients of modernism, so after 1700 truth and goodness claims, even efforts to return to trust, had to find their justifications in universal reason and closely examined experience, both appealing to the individual for ultimate validation. Modernists assumed persons shared a universal reasoning faculty.
Much of their early thinking reflected an attempt to define it, explain its workings in irrefutable terms, and examine how it adapted to experience (see “Modernism’s Midwives”). The goal was to provide a substitute for the certainty of authority’s justifications that now lay shattered along with the doctrines of the Hutterites, the Schwarzenau Brethren, the Knipperdollings, and a thousand other sects. That proved difficult, for the more rigorously these everyday processes were examined, the less confidence thinkers could find in their reliability. This doubt was achieved by precisely the process it disputed as each generation of thinkers challenged the rational assumptions buried in the explanatory models of its elders. This assault of modernist cannibalism exacted its price, at least on serious epistemological thought. By Arnold’s day, the lack of confidence that resulted from continual rational challenge was still confined to a learned few who could master abstruse thinkers like Hume and Kant. In the half century before the ruinous disaster that was World War I, skepticism about the power of universal reason filtered out of academic circles and into the popular culture, widely disseminated through a new popular media. Its vehicle was Romanticism.
Those doubts faced a clear counterargument, not in theory, but in practice. No one affected by Western culture could dispute the astounding success of natural science, and those who admired it or practiced it felt completely justified in pointing to the ever-tightening subject disciplines of the sciences to prove the skeptics wrong (see “The Limits of Empirical Science”). Here was the bloom of the modernist theory in full flower: universal reason applied to experience unlocking the secrets of the universe, discovery building upon discovery, invention upon invention, touching daily life from Montevideo to Manchuria and probing the very mind of God. Empirical science confirmed modernist warrants, the proof lying in a cataract of discovery and an ocean of invention as each generation of scientists built upon the work of the one before. The philosophers of the mind had also been called empiricists, but their investigations of experience had produced only doubt and despair; the true empiricism of natural science tramped through the nineteenth century, a conquering legion of dedicated naturalists constructing ever-tightening disciplines. Throughout the world, who could resist this proudest envoy of modernist warrants?
Now from the superior vantage point of our own interests, some of us might challenge such a rosy picture of natural science, seeing buried in its discoveries the seeds of exploitation of native peoples and the environment, in its inventions the means to destroy social harmony. That condemnation has merit, yet framing it relies on the very modernist justifications that it condemns, calling as it does for analysis and evidence to support or dispute. But as a judgment on science, it can never be either justified or refuted by the science it judges. That vital point highlights two distinctions that could only arise from the refinements of thinking produced by the standardization of scientific endeavor. First, Victorians were slow to recognize that reason and science are not synonymous, for the latter requires a very careful employment of the former. At the beginning of the nineteenth century, anyone could be a naturalist, a careful observer of nature, using the same methodology and categorizations of experience Aristotle had used to examine everything from termites to tyrants. By the end of the century, the kind of observing and thinking scientists were employing seemed unavailable to ordinary reasoning about experience. Second, a realization only gradually dawned, and perhaps has not yet fully dawned, that this same refinement of effort precludes an empirical evaluation of empiricism itself. Neither Victorians’ admiration nor our own doubts about the goodness of science can be warranted by the same means science uses to understand phenomena. Science cannot find goodness beyond utility, cannot issue judgments of quality or morality. Yes, it can find a good material with which to construct an elevator cable or prevent sunburn, but it cannot use those methods to decree that the cable should be made or the sunburn prevented. It can extend life, but it cannot demonstrate by its method that life should be extended. And that proved an insuperable problem.
As its truth-finding methods were narrowed by the ever tighter rules of empirical inquiry culminating by the time of Victoria’s death in the mature academic scientific disciplines we all now know, that same constriction of methodology shucked off any hope of goodness claims discoverable by natural science. At the very moment when it mastered the most reliable means to know truth, or at least some kinds of truth, its means of mastery precluded any examination of any goodness beyond simple utility (see “What Do We Mean by Good?”).
To Victorians awash in random transformation and change, that limitation was insupportable. For them as for us, truth and goodness are intimately connected. What is the point of a correct knowledge of reality if it cannot direct our intentions (see “Truth and Beauty Do a Dance”)? What good is science if it cannot make life better, cannot even address the meaning of “better”? In the welter of choices Victorians faced daily, what could differentiate good, better, best if not the triumphant methods of the sciences? If empiricism cannot quantify or assist our choosing of what is good, what can?
Now this crisis arrived at that odd intersection of philosophy’s assault on reason and empirical science’s appropriation of it. The way narrowed further as natural science refined its methodology in the second half of Victoria’s reign, producing three unpleasant consequences. First, the refinement of reasoning that characterizes true empiricism seemed to discredit what once was called “common sense,” which is reason applied to undistilled experience. By 1899, when Freud published The Interpretation of Dreams, certainly by 1905, when Einstein published his theory of special relativity, anyone interested in reasoning about experience must have found her confidence shaken. It seemed common sense was neither widely seeded nor ordinary. And as Victorian authorities faced doubts that could not be resolved without ceding to individuals the power to decide such matters for themselves, an agency that must erode trust, it seemed that same ordinary reasoning must prove inferior to the emerging truths of science.
Second, this elevation of empirical thinking demeaned more than common sense. The professionalization of science also tightened its disciplinary paradigms, sending its rejects into the shadowy disgrace of pseudo-science. Many casual nineteenth century studies sought empirical status, but what gradually emerged was a kind of gauzy continuum as formerly respectable studies found themselves failing the increasingly rigorous tests of academic science as the century neared its end. These often-ancient studies sought the experimental verification, mathematical language, and predictability of real science, but fell to amateur or cult status by the beginning of the twentieth century. Their persistence only underscored how arcane and exacting the disciplinary standards of academic science had become, and it obscured the loose status of the human sciences that fell between hard science and pseudo-science. This continuum is still sorting itself out in the Internet age.
What was a kind of terminological struggle for the truth between science and authority masked a deeper battle over the axioms of commitment. The modern reliance on individual experience and universal reasoning found itself challenged by its own products, for the greatest product of modernism had challenged reason’s capability to know and communicate the facts of reality, at least until those facts were highly distilled by the methods of science. Whether common sense or empirical process, this retention of rational agency had endured repeated assaults by institutional authority. There could be no compromise; individuals had to use their own reason or surrender it in trust. But it seemed neither possibility could succeed: reliable truth might be found only in the laboratory, not in the street, and individuals seemed entirely unwilling to trust traditional institutions in the midst of wholesale social change. This constriction of access to truth only deepened an incipient hypocrisy built upon Victorians’ atavistic defense of moral authority. It had been premodernism’s ace in the hole: alloyed truth and goodness claims that were advanced as certain and absolute and also mutually confirmatory. It was only this capacity to move public morality that had allowed the survival of premodernism’s axiom of commitment: trust in authority. But with its truth claims battered by the new sciences and its goodness claims subject to individuals’ doubt, traditional authority was sorely besieged. One can actually observe both the demands for and the rejections of trust grow more strident as the nineteenth century progressed.
Failures of trust abraded against the baseless assertions of moral superiority that had sent Europeans sallying out to conquer the world in the name of “civilization.” Their conquests were morally exploitative, for they were justified by a trust in the authority of flag and God that their champions could not compel with a science that made conquest possible yet must reject trust in favor of active rational agency. The victims of European imperialism had their own flags and their own gods. What they lacked was the technology that Western science and modernist axioms had made possible. But exporting technology carried its own explosive moral charge. For all its superiority as a means of finding truth, this new science’s reliance on the act of severance, cutting off determinations of truth from subsequent valuations of goodness, meant Victorian authority must fail to respond to assaults on trust with stronger warrants so long as science and its products proved valuable (see “The Act of Severance”). As trust in authority shrank and sanction of science’s methodology grew, a moral gap opened that neither could fill. As Queen Victoria grew old and Western conquests continued, their defenders at home and abroad endured a gnawing doubt beneath their arrogance and ostentatious displays of material wealth. Even as explorers carried this gospel of wealth to guano miners in Chile, as missionaries carried their quite different gospels up the Congo River to bewildered peoples in the world’s hearts of darkness, as obedient daughters imbibed arcane rules of etiquette in Connecticut drawing rooms, the culture groped for the iron laws of nature that might order and warrant its values but found only a receding tide of justification as empiricism tightened its method. This simultaneous discrediting of common sense reasoning and glorification of empirical processes elevated the axioms of modernist commitment while eroding trust in moral authority.
These developments accentuated Victorians’ hollow values and left a space that three justificatory schemas tried to fill: authority, Romanticism, and the human sciences. Each professed to offer the path to reliable knowledge of moral truth. Our own age testifies to their failures.
The oldest of these had been beaten yet had not surrendered. Modernism had developed during the seventeenth and eighteenth centuries as a response to the failures of authority, specifically religious authority, in the nightmare of the Protestant Reformation. We can enter that nightmare only retrospectively, so different is our own worldview from that dominant before 1517 when Martin Luther began his revolt against the Catholic Church (see “Premodern Authority”). Authority’s glory as a warrant was that it offered a sphere of absolute certainty in which truth and goodness claims are mutually supportive. Because its truth claims must be accepted on the basis of trust or belief, the goodness preferences that they frame must seem to the believer an inseparable part of the package (see “The Fragility of Religious Authority”). Absolute authority suffered from none of the embarrassing bifurcation of truth and goodness claims that modernism has faced. Despite its repeated humiliations during the 150 years of the Reformation when authorities countermanded each other’s declarations with nauseating predictability and deadly consequence, a seemingly endless quarrel that only resolved itself with modernism’s new axioms of universal reason and experience, the lure of certainty remained a very strong one. So many persons found tradition a great comfort in the dislocations of the Victorian era, despite its embarrassments and longstanding failures. Rulers still found it expedient to add divine will to their own, and we cannot doubt that missionary motives account for a good part of the zeal with which Europeans sallied out to dominate the world, at least at first. Nearly everyone failed to see that premodern warrants were now vying with modern ones, at least until someone like Darwin forced them to. We should not see this longing for tradition as simple nostalgia. 
It reflected modernism’s poverty in justifying moral claims, one its ethical theories found challenging to resolve with the individual moral agency that lay at its core. Arnold saw the issue as a draining away of “the sea of faith,” an irreparable loss. Many modernist writers reflected that sense of loss well into the twentieth century (see “Tao and the Myth of Religious Return”). Our current cultural crisis echoes at least some of that longing to return to absolute religious authority despite its inability to resolve challenge in a multicultural environment (see “Belief in the Public Square”).
That longing remained unrequited for many in the Victorian era as it does today because authority had long since lost its aura of certainty. Persons might still attach themselves to it, but the zeitgeist required them to plumb their own sense of moral responsibility before forming that attachment, always recognizing those alternatives that now presented themselves as living options to be hypothetically calculated and rationally rejected so that an affiliation with authority might be continued (see “Toward a Public Morality”). But that prudential calculation was in itself an expression of doubt that only individuals’ moral agency could overrule, putting authority perpetually under assault by the axioms of modernism that presented a buffet of alternatives to those sheltering from the storm of change in the bulwarks of tradition. This tension expressed itself in the repetitive efforts of Victorian thinkers to present the components of tradition as separate appendages of a single body of authority as though parents, church, culture, and God had spoken the same words with the same voice. But as Victorianism matured, this proved an effort increasingly desperate and ineffectual. One cannot read the literature of that era without being struck by the strained attempt to unify the claims of its authorities into a single sweeping justification rooted in tradition, or without observing the untidy outcomes of that effort as the winds of change battered the old fortifications. By the middle of Victoria’s reign, the stress of maintaining the forms of traditional authority while also acknowledging the source of moral power in individual reasoning began moving toward crisis.
One vibrant moral alternative presented itself in an endless loop in popular media, that new and stimulating alternative to traditional sources of knowledge. What media pitched after 1800 was a relentless Romanticism that found reliable declarations about truth and goodness within the receptive heart. In one sense, Romanticism was a conscious antithesis to modern preoccupations with careful reasoning about experience. Ironically, it was born from that long search. When Kant argued in 1781 that our minds cannot guarantee knowledge of reality but must instead operate upon phenomena floating into consciousness from some preconscious mental constructions, he completed an intensely rational critique of reason and experience begun amidst the ruins of the Reformation by Descartes a century and a half earlier. The possibility that these mysterious intuitions were discovered rather than constructed became a new obsession for German thinkers. More importantly by far, the desperate hope that they might be divinely inspired revived what had been a lost quest for moral clarity that still respected modernism’s elevation of individual moral autonomy and vehemently rejected external authority. By 1798, when Wordsworth and Coleridge published Lyrical Ballads, the thought revolution shifted from the classroom to the garden and from the mind to the emotions, and from the educated few to the literate many. The charm of this new popular Romantic literature was obvious. It explicitly rejected the abstruse and dry theorizing of the philosophers with its plodding and contentious analysis and replaced it with something glorious. The empirical philosophers, most notably David Hume, had bound modernism’s moral options to cultural consensus or sentimentality with no sure rational guarantee. This noose tightened with the attack upon common sense waged by an increasingly confident and professional empirical science. 
But Romantic glorification of intuition promised to bypass these rational limits and grant a level of moral certainty lost since the Reformation. And it seemed so easy, so accessible, and so universal! Nature, silence, exotic experience: all were portals to indubitability. Romanticism took modernism’s relocation of moral autonomy and elevation of experience quite seriously. It tapped a long-standing vein of gnostic revelation in religion to imagine a pantheist god-in-nature whispering to that part of itself that is the private human soul, communicating to the moral agent not only certain truth but also what modernism could never supply and what the age so desperately needed: certain goodness. “What I think is right, is right,” Thoreau proclaimed. “The heart’s emphasis is always right.”
As Romanticism wrapped the new popular culture in its warm embrace, it found itself confronting and opposing the clear-cutting march of empirical progress. The two approaches proved irreconcilable. Science could be performed. But it could not be lived because it could not prescribe goods of preference. Its reliance on analysis, the rational disassembly of carefully monitored experiences, gave empiricism its power to discern the truths those experiences conveyed, so long as they proved capable of isolation, measurement, and repetition. But what could one do with such truths if they did not also reveal what one should do? Romanticism’s attraction lay in its universal truths privately transmitted, but its power lay in its claim of an easy and intuitive discovery of moral and qualitative goodness. And this was a claim science could neither match nor refute. “Sweet is the lore which Nature brings;/Our meddling intellect/Mis-shapes the beauteous forms of things:/We murder to dissect.” Explicitly opposed to modernism’s reliance on reason and even more opposed to reason’s scrupulous applications in science, Romanticism nevertheless appropriated modernism’s utter rejection of authority. But as the Victorian age advanced, it was empiricism that won the field. As Thoreau drifted above Walden Pond in 1854, he wondered at the mysterious harmonies Mother Nature had wrought among fishes and frogs and frothy wave, at the organic wholesomeness of the thousand caresses of nature’s god. Five years later, Darwin explained it all quite differently: as a desperate and vastly wasteful war of each against all for survival, nature red in tooth and claw: determinism without purpose (see “The Determinism Problem”). And that presented every person exposed to Victorian culture with a quandary. How could these deep antinomies be reconciled on the factory floor or in the ladies’ drawing room, before the mast on rollicking tea clippers, or in earnest discussions in Chautauqua lecture halls?
None of this was as cut-and-dried as presented here, which only added to the confusion. The good Victorian was expected to somehow integrate these warring values: to be both heartfelt and deliberate, intuitive and rational, sensitive and sensible, respectful and daring, to treasure both heart and head, though how to arbitrate their conflicting instruction could never be resolved. Much of the disrepute we associate with Victorianism is traceable to this impossible task, and at least some of the confusion of our own age was born in the Victorian response to modernism’s poverty of ethical theory.
The zeitgeist required that any possible solution must demonstrate a deep respect for the moral autonomy of the individual, but of what use is any moral theory if it cannot prescribe the means to reconcile conflict between friends and among strangers and do it by means all parties can embrace? This requires that the mode of warrant be congenial to disputants and also that it not conflict with whatever axioms persons bring to their warrants (see “The Axioms of Moral Systems”). But the contraries of Victorian life blocked such agreement in public disputes at the historical moment when absolute authority revealed itself hopelessly compromised. Kings and emperors might still wrap themselves in the purple and squat on their gilded thrones, but practicality required them to accede to the popular will. If only they could find it! And private morality faced a similar impasse as persons sought some thread of consensus in their preferences or at least some axiom congenial to their divergent private interests. In hindsight it seems unsurprising that the only moral stance that could respect such diversity while providing a consensual framework for morality was one that prescribed the means rather than the ends of morality. This moral theory was utilitarianism. It deeply respected individual desire, abstaining from taking a position on the moral melee that so confused the age. Its only rule was that desire be subjected to the dictates of universal reason. So as to avoid charges of total pragmatism (those would have to wait until the next century), utilitarianism clothed its methodology in the one area of knowledge where modernist axioms had clearly triumphed. Its aim was to be an empirical science of morality.
I have already written that effort off. We know that true science must fail in finding goodness for the exact reason it succeeds so well in finding some kinds of truth: its mode of inquiry severely limits the kinds of experiences it can examine. And these are only a very narrow slice of life, though natural science has performed wonders in that range. Science describes perceptual phenomena with precision and breaks them into their components with surgical care, forming hypotheses and testing them in settings wherein variables can be isolated and experiences precisely duplicated. Of the correspondence proofs of judgment, empirical science is the royal road to truth, at least for those few truths thus discoverable. We know that now. Over three centuries of traveling that road, we have gradually discovered just which kinds of truths science can seek and find and which it can’t. That process is ongoing, but it achieved maturity only at the end of the Victorian era with the widespread adoption of the German model of academic science. Before that, there had been both hope and confidence that science would one day crack the nut of morality as it had of biology and physics, that we would one day possess prescriptive knowledge as reliable as the descriptive knowledge science continued to unwrap about the world. To prescribe moral goods would require more than a new empirical discipline. It would require a new science.
From the very beginning of modernist thinking, the science of nature had been a model for an unfolding science of man. Early epistemologists expressed full confidence that human behavior would one day be as accessible to scientific inquiry as the larval stage of the fruit fly. Victorian-era thinkers like Auguste Comte and Herbert Spencer practiced a positivism that promised to decode free will and prescribe irresistible social goods. In order to claim empirical warrant, though, that promise had to eliminate the troubling intrusion of felt human freedom so as to make human behavior predictable and testable (see “Our Freedom Fetish”). The positivists had every confidence that science would one day solve this problem, but even if it could resolve this basic analysis of human nature, it would still face the thornier issue of prescription. It would not only tell humanity necessary truths; it would use them to clarify necessary preferences, to replace religious certainty with scientific precision. That failed. As the hard sciences continued to refine their methodology, the soft ones found themselves unable to resist their own hubris, a smug superiority instilled by what seemed a scientific approach yielding irresistible moral conclusions. We see it in the positivists’ airy dismissal of challenging paradigms and differing methodology. Better known inheritors of the dreams of Enlightenment thinkers like Adam Smith and Voltaire include Karl Marx and Sigmund Freud, whose analyses of human behavior blurred description and prescription to the point of intentional deception. Who could question Smith’s admiration for the wisdom of the invisible hand, his sweeping approval anticipating Marx’s equally prescriptive regret over the loss of feudal communitarianism, or Freud’s contempt for religious authority echoing the moralism of Voltaire and Montaigne? Was this science?
In a Victorian world reeling from a simultaneous banquet of scientific progress and famine of reliable moral theory, a conjunction seemed inevitable, particularly since the boundaries of what passed as science in the popular mind made it seem akin to magic. All manner of studies were treated to a kind of pseudo-scientific seriousness as physiognomists built their catalogues of facial features, ethnologists of racial types, occultists of paranormal manifestations, and so on. Why shouldn’t empiricism in its deep study of human motivation reveal a science of moral health?
Utilitarianism tapped into that rich vein of promise with its seemingly precise quantifying of hedons and dolors and its rankings of higher and lower pleasures. Its premise was a reasonable one. If persons have both universal reasoning and moral autonomy, no moral goods should be externally prescribed nor any duties imposed. Private desire seems too varied for such directives, and the presumptions of modernism assured Jeremy Bentham and John Stuart Mill that individuals could apply reasoning to experience to resolve dispute (see “Three Moral Systems”). But that was not happening in the Victorian milieu. Romanticism and traditional authority denied the legitimacy of universal reasoning, and an increasingly rigorous hard science found no moral implications in its methodology, so the theorists of utility embraced a more wholehearted positivism that they hoped would provide clarity and direction in defining either how desires should be quantified or, in contradiction to the foundational principles they claimed to embrace, which goods they should desire. Unfortunately, their faith in even the most liberal human science could not remedy the fundamental hedonism of their prescriptions. Which desires should persons value and which pains should they avoid? How could the intensity of desires or pains be quantified or publicly compared? Why should one person’s valuation of a desire or a pain trump another’s? As Victorians attempted to employ utilitarianism, it seemed that Hume and the Romantics had been right in thinking morality a deeply irrational, emotional, and personal endeavor. But on what grounds might persons meet to resolve their disputes? How might they justify those pronouncements of public morality that we call laws (see “The Foundations of the Law: an Appetizer”)? Such activities seem in retrospect the very antithesis of science. And so utilitarianism found itself forced to evolve in practice.
It blurred science and philosophy for its truths, and sought consensus in cloaking itself in empirical appeals to reason alloyed with authority’s appeal to trust. Only Victorianism could have produced such a toxic emulsion.
When American thinkers at the turn of the twentieth century proclaimed Progressivism the new science of education just as compulsory schooling spread across the land, an eager public lapped up its seemingly scientific promise of “natural development” and “social adjustment.” John Dewey happily blended philosophy and psychology in developing his educational theories, which percolated through teachers’ colleges into every American classroom of the twentieth century. This new scientific education would prepare students for the world of tomorrow and put them on a firm, modern footing. Strangely, the education establishment ignored the source of Dewey’s thought even as its premises sapped content from curricula. But the psychology of this new education really looked to the past, not the future, for its verification. Despite his dreams of turning classrooms into laboratories and experiences into experiments, Dewey was actually inspired by the eighteenth-century proto-Romantic philosopher Jean-Jacques Rousseau, whose novel Emile was equal parts invention and idealist speculation, the very antithesis of the empirical process Dewey supposedly espoused in 1894 as the chair of the Department of Philosophy, Psychology, and Pedagogy at the University of Chicago (consider that title!). Even today, vestiges of his hopes exist in the deference paid psychology, sociology, and economics as “human sciences” despite their manifold paradigms and miserable track record (see “The Calamity of the Human Sciences”). Can we fault the Victorians in a far less defined scientific climate for grasping at human science for moral clarity? We may excuse them their desperate quest for qualitative and moral goodness, but our kindness brings us no closer to resolving the putrid bequest they passed on to us.
The failure of authority, Romanticism, and the human sciences to warrant moral goodness was a continuing crisis in Victorian life. The attempt to combine them made matters far worse.
Victorians might try to blend the presumptions that percolated through their culture, bending authority to embrace Romanticism as Methodism and Pietism did, or finding the rationality in utilitarianism when no empiricism could apply. But the axioms supporting external authority could never be reconciled with universal reason, which in its turn could never be made conformable to private desire, and so religion and utilitarianism continued their unsuccessful pursuit of arbitrating public goods. Still, we see everywhere in Victorianism an effort to somehow integrate moral authority into individual moral agency, most powerfully in university curricula, where Victorian students faced a constant tug of war between their own moral sense and a stridently authoritarian faculty. Given the hypocrisy implicit both in that demand for trust and in its conflict with the idea of a university as an appeal to reason, it is little wonder that the campus became the locus of intellectual ferment, beginning with the Revolutions of 1848, continuing through the flirtations with novelty in the first half of the twentieth century, and culminating with the campus protests in Paris and the U.S. in 1968.
Failing in an outright merger, Victorians might slice their moral lives in other ways: seeking moral truth in some deeply emotive aesthetic movement as Ruskin recommended while simultaneously embracing a superficial utilitarianism in daily life or seeing women as repositories of emotional truth and men as its rational actors. This effort required a stereotyping of gender roles from which we have yet to recover. None of it worked. None of it could. The axiom of moral autonomy could not be reconciled with a trust in externalized authority. The gnostic inner certainty of Romantic intuition abraded against the plodding workings of empirical science. Universal reasoning could not be made to align with private experience so long as desire was regarded as inviolate. And the confident voice of empirical science must be rendered dumb by the prerequisites of moral goodness despite the ongoing efforts of the crusaders of the human sciences. The Victorian era dissolved into its own contradictions as the nineteenth century ended, dying like the faded petals of a multifoliate rose.
The pulse of the crisis quickened even further as the twentieth century dawned. Good Victorians were aghast at what the new century produced: a final rejection of tradition in favor of a desperate novelty in every sphere of cultural life. “Make it new,” said Ezra Pound. “I cannot make it cohere.” The new century even more than the old one could find no true north for its desperate cultural navigators, no consensus to quell the millions of plaintive voices each avowing its own moral universe. Efforts to amalgamate the assumptions of modernism with other moral frameworks had not only failed but had highlighted that failure. That same pop culture that had sentimentalized Victorian values now began to mock them, a process that consumed the entire new century, itself a backhanded tribute to the power of the Victorian model. This oscillation between sentimentality and derision proved to be a final realization of the old high versus low cultural battle that had raged through the nineteenth century.
Romantic pantheist certainty was the first to fall away. The rules of social order stiffened into immobility in the last years of the nineteenth century as authority found itself relentlessly challenged by new modes of life and responded with rigor mortis. Romanticism had offered a bracing opportunity for innovation in the first half of the century, but the appeal to emotional intensity grew stale and arthritic, formulaic and pompous. The wild emotionalism of Shelley collapsed into the conventionality of Tennyson. The private whispers of intuition were channeled into the broad streams of structured emotion. Inspiration was reduced to Duty. Late Victorians were instructed on the proper technique of summoning deep emotion. A new kind of furniture became popular: the fainting couch. Formalized emotion respected neither individual moral autonomy nor reason applied to individual experience. The pantheist voice of God was reduced to absurdly articulated etiquette-as-morality: as such, just one more failing authority.
Desperation for moral clarity combined with a deep awe for the achievements of natural science and a continuing popular confusion about empirical limits. The “wild beasts” of Fauvism might compress dimensionality. Naturalist writers might depict a Darwinian vision of a contingently determinist universe. Einstein might contend that reality violates every assumption human reason brings to any interpretation, and Freud that consciousness compromises itself with every reflection. The new century promised a larger and deeper comprehension of reality. Didn’t this novelty and tumult foretell some new synthesis built from the shards of transparent moral chaos? So went the currents at the turn of the twentieth century, born in a paroxysm of technological promise, a wild enthusiasm for anything novel, and a developing disdain for Victorian values. Then came the war.
Had it not been so horrifying, it would have been a mercy killing. World War I was Victorianism’s death sentence and execution. Authority had survived by offering moral certainty when nothing else could. The War showed its hollow promise, its hypocrisies and its failures. Trust could not survive wholesale human butchery fought for no discernible purpose. Authority was one casualty, but hope was crippled as well. The new century had promised some new synthesis. The war shattered that promise. Just as modernists found it impossible to put themselves in the place of those who lived before the Reformation, so too do we find it impossible to resurrect the Victorian world view except as historical artifact. Only its shards remain. Everything Victorianism hoped for, every remnant of confidence, was shredded by the war that began with heroic cavalry charges and ended in the yellow fog of mustard gas and the bark of the machine gun. Since then, we have had more destructive conflicts in terms of human deaths, but these were not only prophesied but made inevitable by the nature of that first and most futile of world wars.
The entire twentieth century was drawn by the lines of torpedo wakes and trenches in the agonies of the worst war in human history, worst because so many died so horribly without apparent reason and in glaring contradiction to the pronouncements of the culture, a war that reduced mass killing to a science. This war to end all wars wiped the moral slate, allowing the unspeakable replacements of communism, fascism, and consumerism to be etched in persons’ minds not because these proved reasonable but because nothing else did, and these at least boasted a scientistic facade. The moral landscape at the end of World War I resembled the blasted no-man’s land between the trenches. All prior restraints were wiped away. No limits circumscribed the inventiveness of minds conditioned to seek truth in human science and moral commitment in Romantic emotionalism. The communism of Lenin owed as little debt to Marx as Marx did to true science, and the anthropological roots of fascism were an insult to actual science. Freud’s contorted visions of the mind were inventive to be sure, but they had very little relation to neurology. Jung’s had even less, later psychological paradigms less still. The frantic novelty of the new century demanded the simulacrum of science rather than the reality. It demanded science with a conscience. Pseudo-science rushed in to fill all vacuums, not only of moral goodness but also of judgments of quality. The science of man had promised to remake our understanding of human nature. The human sciences now offered to remake that nature itself. This exaggeration and falsification of empirical power, the deep error we know as scientism, was easily overlooked in the orgy of discovery and invention that natural science produced in the shadow of unspeakable moral failure. 
If science could now be crowned the undisputed monarch of truth-finding, scientism became its sycophantic courtier, promising an epistemology of goodness as reliable as empiricism’s excavation of truth, indeed as reliable as religious authority had once seemed. Western civilization was shell-shocked by its glaring moral failures. World War I ensured that. Science seemed to offer a resolution of near religious certainty, a promise now fulfilled in technology and still utterly lacking in moral theory.
Scientism, Romantic revivalism, materialist pragmatism, and authoritarian nostalgia all competed for moral gravity after World War I. Rooted in Victorian confusions, they further devolved throughout the twentieth century. Scientism pitched its false hopes through most of that dismal epoch and bears responsibility for much of its dysfunction, especially as promoted by the human sciences that glorified it. Though it appropriates science’s method and terminology, the postmodern solution, which only resolved into clarity in the 1970s, manages both to glorify and to demean the empirical process it is based upon (see “One Postmodern Sentence”). So the weakness of the Victorian vision of moral structure, particularly of public morality, expanded into the moral vacuum that still cries for resolution today (see “Postmodernism Is Its Discontents”). That same alienation deprived Romantic pantheism of its inspiration, reducing intuitions to private constructs rather than divine discoveries. It inspired generations of postmodern critics of popular culture to embrace a private existentialism both Romantic and rebellious (see “Freedom, Antihero, and Zeitgeist”) and led others to abandon all hope of clarity or consistency and settle for a self-serving emotivism that reduces moral agents to consumers of culture. The hard-won moral autonomy of modernism now verges on a surrender to the formative power of environment. None of these developments boded well for the modernist assumptions of contractarian democracy, itself a conspicuous effort at moral neutrality perverted by Victorian failures and their aftermath into a mindless materialism (see “Why Invent a Social Contract?”). Nostalgic believers recognize the flaccidity of the culture, believing that a louder or sterner reminder of categorical morality will return secularists to their duty. Their listeners to this day remain puzzled about what that duty might be.