The Calamity of the Human Sciences

Psychology Today once ran a story on the major influences on therapists’ professional approaches. Guess what most shaped their theoretical outlook. Their dissertation research? Learned papers in their field of specialization? Esteemed colleagues or lengthy seminars? Not even close. These behavioral scientists listed their personal experiences as the factor most determinative in their professional lives, regardless of the level of their education or their years of practice. As they offered their analyses and therapeutic advice to their patients from the lofty eminence of their professional standing, their wisdom sprang from no fount more substantial than the accidents of some moment from their own past, subject to all the particularities of context and personality, all tinted by the imperfections of memory. Such was the nature of their “expertise.”

Perhaps psychology is too easy a target. But as we survey the history of that peculiar branch of empirical imitation known as the human sciences, we find a rich mine of similar absurdity. Some brilliant observer termed the study of sociology “the excruciating elaboration of the obvious” after reading a study confirming that paraplegic hitchhikers were picked up less frequently than able-bodied ones. And who can forget the red faces of economists as they surveyed the wreckage of the 2008 financial collapse, a moment immortalized by Jon Stewart’s pitiless skewering of Mad Money’s Jim Cramer? Whatever plagues psychology also casts its pall on the other soft sciences. And therein hangs a twentieth-century tale.

First, as always, it is useful to clarify what we are talking about. The human or soft sciences are a set of relatively recent disciplines whose scope and methodology were established in the 1900s. Their subject matter differs from that of the natural or hard sciences that are their predecessors and their inspiration in this way: the human sciences study elements of human free will and the behavior related to it. What happens to a person’s body as she jumps from an airplane is physics. What happens to her biological functioning is any of a related group of medical sciences: neurology, physiology, cardiology, biochemistry, hematology, and so on. Forensics and chemistry examine what happens after she hits the ground. All of these studies exclude the question of why she jumped. That is the domain of the soft sciences. When we use empirical methodology to examine “the black box” of choice-making, we enter the human sciences. Besides those mentioned—psychology, sociology, economics—we might add criminology, anthropology, education, and many other subsets. In some ways these soft sciences resemble the hard sciences. They make use of a specialized vocabulary that excludes outsiders. They branch into sub-disciplines as the hard sciences do. They often set up experiments and publish papers in academic journals. They form departments on college campuses and award undergraduate and graduate degrees. To a casual observer, the human sciences may seem to be just another of the many subdivisions of the empirical quest to subject nature to the scientific method. But don’t be fooled. The emulation of the natural sciences by the human sciences was intentional and perhaps originally well-intentioned, but it proved quixotic and delusional in the long run and has caused incalculable historical harm.

These threads are only untangled through some historical review. Empirical science as we define it was introduced by the Renaissance. Its emphasis was on close examination of experience. To understand its success, we must see its maturation over the next three centuries as a series of subtractions in focus and procedure and a multiplication of safeguards against what the great prophet of modern science, Francis Bacon, called the four idols: the blinders that hinder reliable knowledge. These idols, in today’s terms, are the limitations imposed by our perceptions, by hypothesizing, by communicating, and by theorizing from our present range of knowledge. We forget with what wide-eyed wonder the Renaissance masters viewed the landscape of experience after a millennium confined to the still life of religious authority. Literally everything was open to critical investigation, and it was not until the nineteenth century that science began to put aside studies that failed to meet its increasingly stringent criteria. Newton, after all, spent his last years studying alchemy and Biblical prophecy with the same critical method he brought to the inverse square law. It was only gradually that subjects considered unfit for empirical study were weeded out of serious consideration. The process was dictated neither by importance nor by interest. The limits were imposed by the means of justification. The distinction only gradually emerged: the word “scientific” meant quantitative and predictable. If phenomena did not lend themselves to counting and measurement, or if they yielded results that could not be predicted, they failed to meet the bar of true empirical science. Instead, they were relegated to the netherworld of the pseudo-sciences, covering a huge range of disciplines from hypnosis to acupuncture to spirit detection to phrenology: all still subjects of serious study by amateurs, but all banished from the field of professional inquiry. This sorting process continues to this day, but the classifications have hardened sufficiently for scientists to draw fairly rigid exclusionary walls around their fields. Remember that this process took three centuries to mature and evolved entirely from the application of the scientific method in a sustained attempt to make experience more reliable.

But we face a chicken-and-egg problem on this score. The methods of science are valued not for their own sake but because they work. They have produced astonishing results, visible at three different scales, all equally impressive. First, the interlocking subject disciplines of the hard sciences have proceeded from observation to theory about reality, and the theories all confirm each other (the lone exception, the unreconciled accounts of quantum and gravitational forces in physics, is being whittled away at today). The guiding theoretical basis of each subject discipline is its paradigm. It channels work in the field into productive investigations and excludes outlier theories that challenge it. It unifies experimental and observational work. Now this might be seen as a construction rather than a discovery of the fundamental unity of nature, but that argument fails because the paradigms of one discipline must accord with those of related fields. So, for instance, biology is built on the fundamental chemistry of the periodic table, and each subdivision of the biological sciences investigates its own slice of reality, all building mutually supportive theories upon the dominant paradigms of the field. This interlocking relationship strongly supports the accuracy of each field of study. A second impressive indicator of science’s success is the increasing complexity of its studies, revealing layer after layer of often surprising truths about the material structures of reality. These deep observational and experimental findings often overturn entire paradigms and reorient the life work of experts in the field. At this frontier of investigation, it is not the connections among disciplines that impress us but the abstraction within each field, with experimental verification often resulting from rather than producing theoretical mathematical models. This networking and depth of study are impressive enough to confirm the natural sciences as the royal road to truth, yet what matters most to ordinary folks is the final verification of scientific success: its technology. The greatest material advances in human history have coincided with the maturation of the sciences. Who can doubt its truth-finding power?

The human sciences were at least initially a part of this mix. Natural philosophy, the eighteenth-century term for empirical science, included in its purview the science of man, and efforts to shine the spotlight of close inquiry on human behavior at first seemed to offer dazzling promise. Those same empirical philosophers who taught budding scientists how to reason about experience felt quite at home exploring the mind of man and theorizing on his nature and the means to perfect it. But from the beginning, this new stab at Enlightenment galloped off in the wrong direction. Unfortunately, the studies in the natural sciences that would dazzle the world were in their infancy in the eighteenth century, and so what we would now call the human sciences, the pseudo-sciences, and the natural sciences were indistinguishable at first. Locke could write on the predominant influence of environment on childhood development, on the origin of government in a mythic state of nature, and on a hundred other subjects, convinced he was engaged in the same kind of investigation that van Leeuwenhoek was undertaking with his microscope. It would take another two centuries for the natural sciences to sort out valid investigations as those in which conditions could be observed and measured, variables limited, experiments conducted, and results repeated. Over the course of those two hundred years, empiricism would shed its ancestry in philosophy and theology and embrace mathematics as the infinitely precise language of its practice. The human sciences, over the same span of time, would do none of these things. Even so, they would stand proudly in the reflected light of empirical science and dazzle the lay world with the reasonableness and scope of their theories, encourage social experimentation, and agitate for change while wearing the scientist’s scowl of disinterest.

It is perhaps forgivable that Jean-Jacques Rousseau’s vision of the social contract should have inspired the French Revolution’s Reign of Terror of 1793. The truth-telling power of natural philosophy was in its infancy, and Rousseau was a thoughtful and witty critic of the social order of the ancien régime; his take on the new Romanticism that was sweeping popular Western culture was original and powerful. Nor is it surprising that Rousseau pushed Immanuel Kant into articulating the revolution in philosophy that was The Critique of Pure Reason. That one philosopher should challenge another is part of the great conversation of ideas stretching back to the Hellenic Age. What is unfathomable to me is that Rousseau should be regarded as the father of American educational theory in the twentieth century. Now here is a puzzle. Rousseau’s fictional narrative advocating “natural education,” Emile, provoked lasting debate about the best means of schooling. That debate ignored the historical facts that Rousseau fathered five children, all of whom were sent to orphanages as infants, and that he had no expertise whatsoever as a tutor. The question of whether children should be allowed to follow their interests or be inculcated with habits and general knowledge is certainly worth serious discussion and debate among all interested parties. But that is not what happened.

In 1894 the Pragmatist philosopher John Dewey was invited to chair the new department of Philosophy and Psychology at the University of Chicago. Note the title of the department: the human sciences were on the cusp of eminence, facing the monumental decision of whether to align themselves with the sciences or with philosophy. Dewey was a key player in this drama, for his position in what is often called the only American philosophical movement was that the truth or goodness of any declaration depended on its usefulness to the thinker, that usefulness to be decided by a life of “practical experimentation.” Note both his preference and his error here, for they molded American education for the next hundred years. First, his inclination was clearly and permanently toward science, and he saw every reason that our social existence should be ordered on scientific principles. But note too the absurdity of considering undifferentiated experience a kind of experiment. The essence of empiricism and the source of the power of the natural sciences is the isolation of variables to assist close observation, measurement to clarify cause and effect, and the replication of results through repeated trials to filter out the faults of perception and preconception. It is this painstaking and excruciating process that has produced the glories of natural science mentioned above. And Dewey was suggesting what? That we can emulate this success in ordinary and undifferentiated experience, the kind of activity in which perceptions whiz by like oncoming traffic and reasoning faces a kaleidoscope of judgments every second? What neurological or experimental evidence could legitimize such a sweeping recommendation? And he was offered a university chair at a great American institution to do it? How can this kind of lunacy be explained?

I have argued previously (on July 22nd and 30th of last year) that the turn of the twentieth century was marked by fevered attempts to remedy the defects of modernism made manifest by the contradictions of Victorianism, culminating in the coming of World War I. John Dewey’s success was one of those attempts; it exploited the technological and theoretical revolutions that natural science was producing, harnessing their power to dreams of social perfection. Dewey’s notions were directly rooted in Rousseau’s musings on natural education, and so yet another grotesque Victorian hybrid of Romanticism and rationalism entered the mainstream, this time attempting to nourish children’s curiosity as the touchstone of a new social science.

It was called Progressivism, and it was launched at the nation’s first laboratory school, founded at Dewey’s insistence in 1896 at the University of Chicago. As the new century dawned, colleges of education sprang up around the country, embracing the primacy of “scientific” pedagogy on the Progressive model. Dewey himself denigrated obstructionists and traditionalists, and so the notion of natural education promoting socialization seeped into the new K-12 public school systems across the nation, backed by the full force of the new social sciences. The movement splintered in its application, but by the 1930s it was called Constructivism, structured by the Swiss psychologist Jean Piaget (who, incidentally, was the director of the Rousseau Institute in Geneva while publishing his theories). Its modified goals could be summarized in two precepts: that education should serve psychological and sociological goods and that all knowledge is the personal construction of the learner. Constructivism is still the default academic model in U.S. schools. But note the postmodern implications of its goals and the empirical tone of its means of achieving them. To say that knowledge is constructed rather than transmitted is to strike at the traditional heart of the goal of education. The continuity of Western culture and the critical abilities required to improve it are no longer central to our understanding of education. This abandonment tracked a larger cultural rejection of modernism and embrace of novelty in the first decades of the twentieth century in all cultural pursuits, from architecture to poetics. (The causes of this invisible revolution are the focus of my book, What Makes Anything True, Good, Beautiful: Challenges to Justification.) The teacher is no longer the expert but is now the facilitator of… what? Whatever the child values? And how is progress or promotion to be measured if success is to be determined by the satisfaction or social adaptation of the learner? The jargon this orthodoxy inspired in educational circles was to make the teacher “the guide on the side rather than the sage on the stage.” As students construct their own virtual circles of truth, goodness, and beauty (please see my posting of August 6, 2013, for a fuller explanation of that process), they become the controllers of curriculum and the arbiters of relevance in the classroom. The current obsession with standards-based education is a direct repudiation of Constructivism. It is surprising to me that this educational battle has not been more generally recognized and its roots uncovered. No one should be surprised that Piaget’s four cognitive stages of development were discarded in light of more sophisticated cognitive research. Yet strangely, the erosion of the “scientific” basis for Constructivism seemed to provoke little doubt in its social science adherents, possibly because those who major in education have little subject-discipline knowledge. I predict that the roots of the theory, its pseudo-scientific formulation, and its application in American classrooms for nearly a century through the influence of colleges of education and professional licensure will one day be exposed as the single greatest cause of the current crisis in American education. And so at the beginning of the twenty-first century, we face this strange question: what are the odds of an eighteenth-century French Romantic philosopher steering the course of American public education in the twentieth century?

Stranger things have happened. And far worse. The same bizarre combination of Romantic yearning and human science produced the two greatest scourges of the last century: communism and fascism. It was the Romantic philosopher Hegel who inspired Marx, during his days in Berlin, to posit as a fundamental law of historical science the dialectical progression from feudalism through capitalism to communism. And so a philosophical theory was again transmuted into a human science “law” that binds human will. And again, the original “scientific” theory was modified by successors, this time into the competing doctrines of Leninism, Trotskyism, and Maoism. And again, the theory proved fatally flawed. To offer another example, it was the philosopher Herder who molded the Romantic notion of German exceptionalism that was then “confirmed” by ethnology and the new sociology of thinkers like Max Weber into a theory of the uniqueness of the Aryan race and their supposed persecution by the descendants of Roman Europe. And so the “new men” of the new sciences marched confidently toward a future of social perfection first prophesied by the philosophers of the Enlightenment, inventing in their pursuit a word unknown until 1944: genocide.

A final example is to be seen in the work of Sigmund Freud, who published his landmark The Interpretation of Dreams in 1900. The medical doctor who had begun his career flirting with hypnosis and cocaine as cures for female hysteria trolled the shallows of pseudo-science from the beginning, and his reliance on myths as models of human behavior introduced powerful Romantic influences into his theories of the unconscious. His later works, in particular, effaced the line between science and philosophy, though it is hard to trace any period in which Freud drew it clearly. Again, his work was subjected to divisive reinterpretation by later generations of psychologists, partially as a result of his own sense of persecution, producing what seems to be a perennial problem for the human sciences: a profusion of competing and mutually contradictory paradigms, as indicated by the Psychology Today absurdity mentioned at the beginning of this essay.

There is a repetitive storyline here. Philosophy inspires Romantic yearnings that are then systematized into a human science, all occurring as natural science rises in public esteem. These “new sciences” are then transmuted into competing successor theories that revivify versions of the original and produce the very opposite of the social perfection they envisioned. So what are the lessons? We may draw several.

Most obviously, we should beware of the poisonous conflation of Romantic philosophy and the human sciences. It seems that combining the grand visions and yearnings of Romanticism with the trappings of the secular priesthood granted to science produces no end of mischief in the world. Every issue examined by human science is worthy of thoughtful and careful conversation by educated persons. But Romantic emotionalism and human science dogmatism are anathema to the kind of conversation required to find the proper place for student-centered learning, economic justice, national identity, and healthful psychology. Perhaps our error was in our approach more than in our ambitions, for our history is full of examples of social advancement made possible by the yearnings of social reformers. We have been blinded by the power of science into accepting truth and goodness claims that seem reasonable and comprehensive, yet are far beyond the power of science to confirm. Hats off to the philosopher of science Karl Popper, whose theory of falsifiability should be required reading for every educated person. Popper’s lessons, incidentally, retain their power in applications beyond science, for the temptations of systematizing, practiced by postmodernists and by each of us in the building of our unique virtual circles, lead to desires for premature closure and to the self-love that characterizes belief in our own explanatory models. Popper warns us that the test of these constructivist visions is not their reasonableness. The “Clue Problem” I referenced last week is to blame: given a limited number of facts about experience, a variety of rational explanations can be found to link them into a causal sequence, each seeming reasonable and even revelatory. Popper wisely advises that we ignore the logical attraction of these theories in favor of something less self-evident: their falsifiability. We should ask ourselves under what circumstances our rational construction might be proved false. What set of facts or judgments could conclusively prove the error of our theory? Note that the theory need not actually be proven false, for that would end it, but it must be subject to tests that could disprove it.

Consider two examples. On the one hand, we have Einstein predicting the apparent shift in the position of a distant star caused by the passage of its light near the gravitational well of our sun. Should the star be observed in an unshifted position, or should its position be shifted beyond 1.75 arc-seconds, a minuscule but precise prediction, his theory of general relativity would have to be revised or abandoned. When Arthur Eddington confirmed Einstein’s prediction in the solar eclipse of May 1919, Einstein became the twentieth century’s most important figure, for his theory had passed the test of falsifiability. The second most important, according to Time magazine, was Sigmund Freud, for his work on psychology. Popper discusses one of Freud’s followers, Alfred Adler, diagnosing a patient without personal contact and arguing for the accuracy of his diagnosis “on the basis of my great experience.” But as Popper notes, the diagnosis was so elastic that it could be modified to suit any of the patient’s personal circumstances (of which Adler was uninformed) and retroactively claimed to be correct. This self-deluding quality is characteristic of practitioners of the human sciences. Their theories are too vague to be falsifiable. We can charitably attribute their self-deceptions to the catharsis that accompanies their first exposure to the social science paradigm they embrace. It is nothing less than a conversion experience in which their virtual circles are reoriented by the breathtaking scale of the paradigm they adopt. (For more on this justification issue, please see my posting of September 18, 2013.) Of course, thinking of their attachment to their virtual circles in religious terms establishes a knowledge equivalency between the human sciences and religious belief that neither side would embrace.
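
Einstein’s 1.75 arc-seconds, incidentally, is no oracular pronouncement; it falls straight out of the theory. Light grazing the sun is deflected, to first order, by 4GM/(c²R) radians. Here is a minimal back-of-the-envelope sketch in Python, assuming standard textbook values for the gravitational constant and the sun’s mass and radius (the variable names are my own illustration, not anything from Popper or Eddington):

    # Light grazing the sun is deflected by roughly 4GM / (c^2 * R) radians,
    # the first-order prediction of general relativity that Eddington tested.
    import math

    G = 6.674e-11      # gravitational constant (m^3 kg^-1 s^-2)
    M_SUN = 1.989e30   # solar mass (kg)
    R_SUN = 6.957e8    # solar radius (m)
    C = 2.998e8        # speed of light (m/s)

    deflection_radians = 4 * G * M_SUN / (C ** 2 * R_SUN)
    deflection_arcsec = math.degrees(deflection_radians) * 3600

    print(f"Predicted deflection: {deflection_arcsec:.2f} arc-seconds")
    # Prints roughly 1.75 arc-seconds: the precise, falsifiable figure above.

The point is not the arithmetic but its vulnerability. Had Eddington measured something wildly short of or beyond that figure, the theory would have died on the spot. No Adlerian diagnosis exposes itself to any such risk.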

A more obscure question, but one of great importance in our current climate, concerns the fate of the human sciences themselves. If the defining quality of their nature is their concern with the “black box” of human motivation, then they face insurmountable odds in the current zeitgeist, and whatever reputation they can salvage must look backward rather than forward. The future without question belongs to the hard sciences. Neurology and genetics cannot supplant psychology too quickly in my estimation. But notice the response from practitioners and academics. Rather than try to open that black box, they resort to behaviorism, the study of behavior rather than motivation. In essence, they favor what can be observed. Now, traditional human scientists might regard that as a total surrender of the field. Freud certainly would have. But to my mind the mystery of human consciousness is far too little understood to be the subject of scientific study, whereas we can observe and draw fruitful conclusions about what people actually do. We can call that psychology and still ignore the core questions of motivation and consciousness. And, glory of glories, we can do real experiments in behaviorism that yield quantitative data and allow us to form real and useful theories. Look at a listing of the human sciences and think about how each field could do something similar. It might be best if you ignore the spectacular failure of economists to build predictive models or the efforts of anthropologists and sociologists to advocate for “healthy” social structures, as Margaret Mead so shamelessly did in her studies of the Samoans. From the very beginning, the human sciences have had a problem isolating the true from the good. From Freud’s characterization of religion as “the history of a delusion,” to Timothy Leary’s advocacy of LSD, to competing economists advocating rather than analyzing policy proposals, it seems human scientists are just too human to be as objective as science demands. Against Popper’s advice, they fall in love with their beliefs and claim them as knowledge.

We have lessons to learn about justification as well. The most powerful warrant for our correspondence truth claims is the empirical one. The weakest is that of undifferentiated experience. That is why I view with befuddlement Dewey’s attempt to conflate one with the other, and why I see the dawn of the twentieth century as an age of epic ignorance and wild experimentation by those who should have known better. Perhaps if the prophets of the human sciences had indulged in less jargonizing and abstracting, they might have encouraged the kind of rational discussion of issues that produces real human progress. Or maybe they were too infected with Romantic passion for that, an infection they thought they were curing with their pseudo-science.

The best news about this particular and dreadful chapter in Western history is that it seems to be ending. I don’t often credit postmodernists, but despite their sordid reliance on the terminology and methodology of the human sciences, their worldview forbids any admiration of empiricism’s supposed objectivity, and in the case of the soft sciences, their suspicions are well warranted. Only this morning, I heard an economist author advise that we distrust experts. As they say, straight from the horse’s mouth. Every day, the true empirical sciences continue their relentless advance into the unknown, appropriating the traditional objects of the human sciences with the same inexorable efficiency they once used on philosophy and religion.

 Until next week,

 S. Del
