A few weeks ago I took the time to reread the book that launched my current sequence of posts about enchantment, Jason Josephson-Storm’s intriguing study The Myth of Disenchantment. One test of a book’s value is whether it can handle being read more than once. Josephson-Storm’s book stands up well to that test. Each time I read it, it does the thing you’re not supposed to do while walking in the mountains in winter, and sets lumps of mental snow spilling downslope, setting off avalanches of ideas that sweep away whole villages of preconceptions in their path.
One of the reasons the book’s so fascinating is that it draws on a field of scholarship that has by and large been far more notable for its failures than its successes, and still does something useful with it. The field in question is critical theory. If you’ve been watching the latest pitched battles in the culture wars here in the United States, you’ve doubtless heard of one offshoot of this body of work, the much-ballyhooed and much-denounced field of critical race theory. One thing you won’t get from the media furore about that subject is any notion of what critical theory (with or without the word “race” stuck in the middle) actually is. We can start there.
Critical theory was born in Germany between the two world wars. Its founders were a clique of Marxist academics in Frankfurt who discovered to their horror that the grand march toward the communist utopia predicted by Marx wasn’t happening on schedule. On the one hand, communism in the Soviet Union, instead of becoming the workers’ paradise of intellectuals’ wet dreams, had devolved into a totalitarian nightmare with a reliable habit of mass murder on the grand scale. On the other hand, the working people of one of the most educated and cultured nations of Europe, who according to Marxist theory should have been flocking to the banners of proletarian revolution, were turning their backs on the entire spectrum of respectable political beliefs, to rally around a weird little man with a toothbrush mustache who called on them to abandon rational politics for an archaic, bloodthirsty mysticism of race and soil.
Obviously something had gone wrong, not just with Marxism but with the entire enterprise of Western rationality summed up in the phrase “the Enlightenment.” Pause and think about that phrase for a moment. One of the basic credos of the cultural mainstream in Western countries is the rather odd notion that, at a certain point not that many centuries ago, for the very first time in human history, intellectuals in western Europe saw the universe as it actually is. Before then, despite fumbling attempts in the right direction by ancient Greek philosophers, humanity was hopelessly mired in superstitious ignorance; afterward, Western intellectuals led a rapid ascent toward true knowledge of humanity and the universe. People still speak of that period using such far from neutral terms as “the Age of Reason” and “the Enlightenment;” in Germany the term is die Aufklärung, literally “the Clearing-Off.”
It’s to the credit of the founders of critical theory—Theodor Adorno, Walter Benjamin, Erich Fromm, Max Horkheimer, and Herbert Marcuse—that they didn’t just sweep all these problems under the rug and go on believing in the secular mythology of progress. They grasped that the Enlightenment had failed to accomplish what everyone expected it to accomplish. They set out to understand what had gone wrong. Since they were Marxists, of course, they still framed things in terms of the march toward a utopian society of the future, and critical theory from its early days thus set out not just to understand society but to change it. It sought, in Horkheimer’s words, “to liberate human beings from the circumstances that enslave them”—but it tried to do that by understanding the entire panoply of reasons why those circumstances happen to exist at a given place and time.
That’s what makes critical theory useful. Treat a belief as though it’s timeless and context-free and all you can do is accept or reject it. Recognize that every belief has a history and a cultural context and you can understand it instead, and this opens up a galaxy of new possibilities. Critical theory tries to do that with the core beliefs of Western society. The first major book to come out of the critical theory movement, Horkheimer and Adorno’s Dialectic of Enlightenment, tried to make sense of the way that Enlightenment rationalism led to the twin tyrannies of Stalin and Hitler. It’s still worth reading today, even though much of what passes for critical theory in the present time is very little more than empty propaganda.
In the opening lines of his Guide to Kulchur, Ezra Pound made a comment that’s relevant here. “In attacking a doctrine, a doxy, or a form of stupidity, it might be remembered that one isn’t of necessity attacking the man, or say ‘founder,’ to whom the doctrine is attributed or on whom it is blamed. One may quite well be fighting the same idiocy that he fought and whereinto his followers have reslumped from laziness, from idiocy, or simply because they (and/or he) may have been focussing their main attention on some other goal, some disease, for example, of the time needing immediate remedy.” It’s quite common, in circles unsympathetic to what critical theory has become, to assail Adorno, Benjamin, et al., because of the current antics of their followers. This is unfair. The founders of critical theory did in fact make a massive mistake, but it’s one that pretty much everyone made in those days and too many people still make today.
The mistake? The failure to recognize that the academic circles to which Adorno, Benjamin, et al. belonged, and to which their current followers by and large belong today, themselves form a privileged class with a straightforward interest in furthering its own influence and grabbing more than its share of wealth and privilege. Critical theory by and large avoids talking about that. A genuine critical race theory would interrogate the discourses concerning race used by left-wing activists in today’s society, and show how those discourses are used as instruments of hegemony by those activists and the people who pay them. A genuine critical theory would also interrogate the discourse of “liberating human beings from the circumstances that enslave them,” and talk about how that rhetoric of liberation is used, as of course it is, to replace one set of enslaving circumstances with another, and to disadvantage one group of people instead of another. You can read a whale of a lot of critical theory and never catch the least whisper of this sort of thinking.
That’s the thing that makes Jason Josephson-Storm’s work so fascinating to me. He tiptoed very close to the edge of that forbidden territory, by suggesting that one of the most fundamental assumptions of modern thought—the notion that we modern people are disenchanted, freed from the superstitious burdens of the past and venturing heroically forward into a new world free of myth and magic—is simply another myth, playing the same role in our culture that the things we like to call “myths” play in other cultures. He’s applied the tools of critical theory to one of the basic assumptions that underlies critical theory, and shown that belief in disenchantment is just another set of discourses that emerged out of a particular set of historical circumstances and is employed to advantage certain people over others. It’s an impressive project.
One of the writers who has influenced him, a scholar he cites at important points in the course of his book, has done the same thing on a bigger scale to an even more vulnerable set of narratives. This is Bruno Latour, one of the first scholars to study the social construction of scientific facts. That last half sentence is quite a mouthful, so let’s take it a step at a time.
It’s part of the mythology of science that scientists in their research are simply following where nature leads. In practice, it’s very nearly the other way around. This is best understood through a concrete example. Let’s imagine, then, that you intend to do some research into the atmospheric physics of near space—that fascinating realm where the atmosphere thins out to blackness, radio waves bounce off fluctuating layers of charged particles, and solar radiation, lunar tides, and a galaxy of other influences kick off intricate processes with poorly understood effects on the planet below. So you prepare for your research by reading the relevant literature, you craft a hypothesis you want to test, you consider the available equipment you can use to find out what’s happening miles above your head, you design your experiment, and you find funding for it. All these are standard elements in science as it’s actually practiced. It’s quite common for these steps to be dismissed as mere details, but they’re far more significant than that.
Their significance can be grasped by taking a closer look at the steps just outlined. The literature you’ve read is the product of the complex social process of peer review and the evolution of scientific opinion, which has at least as much to do with academic politics as with anything nature is doing. The hypothesis you devise is a product of your education, a social process, and also of current fashions in the field of near space studies—anyone who thinks that scientists are immune to the blandishments of intellectual fashion has never met a scientist. What equipment is available for you to test your hypothesis depends on who’s put how much money into developing which kinds of experimental gear, and also on what gear is popular and readily available in your field just then. Your experimental design is just as subject to fashion, and it also has to appeal to funding sources and to whoever controls access to the necessary equipment and other resources in your department. The decision to grant or withhold funding for your experiment, finally, depends entirely on the behavior of human beings involved in the funding process.
Now let’s take the story the rest of the way. You’ve navigated through the immensely complex social process necessary to run your experiment, the high-altitude balloon that carried your experiment eighteen miles into the sky comes down to earth again in a Pennsylvania pasture, and you have the results in hand. You then have to interpret the results, write a paper, get a prestigious coauthor or two to sign on, submit it to a journal, wait nervously while it goes through the peer review process, revise the paper at least once in response to comments by the anonymous peer reviewers, see the paper through publication, and wait to see how other researchers in near space studies respond to it and adapt their own research projects in the light of what you’ve found. All these, again, are social processes.
The end result of your research project—a half sentence and footnote, say, in some future textbook of near space studies—is thus almost entirely a product of social interactions among human beings. At the center of those interactions, the flake of grit at the heart of a big and strangely shaped pearl, is the fact that you asked nature a specific question and got an equally specific answer. That process of question and answer is the thing that makes science as effective a way of making sense of the world as it is, but it does not erase the effect of social processes on the result—it just means that the result has to have some contact somewhere with nature.
Now take that and multiply it by four centuries or so of scientific effort, and the result is a vast social process built atop a relatively narrow foundation of natural facts. Those facts are carefully selected, curated, and assembled by the social process into a model of the world. Ask different questions, use different equipment, give the results a different theoretical spin, and combine them in different ways, and you can quite easily end up with a completely different model of the world. That’s not something most people want to talk about in the scientific community, because it would weaken the claims of that community to its current share of influence, wealth, and privilege. That’s why so many scientists were shouting “Believe the science!” at the tops of their lungs not so long ago: maintaining the cultural prestige of science, and thus their own social status and its perks, took precedence over nearly everything else.
If you want another glimpse at just how far the social enterprise of science veers from its imagined ideal, look up the phrase “replication crisis” sometime. One of the essential principles of science is that any scientifically valid finding has to be replicable; it can’t be some kind of temporary fluke; anyone who repeats the same experiment should get the same results. These days, for an astounding number of studies in a very wide range of sciences, that’s no longer true. Very few people in the sciences want to talk about how much of this is caused by experimental and statistical fraud, both of which are pervasive in those branches of science where corporate profits are involved and far from rare even in less lucrative fields of research. Then there’s the question of how much of it is simply the result of only asking those questions that will support existing theory, or of spin doctoring of data to make it fit current theoretical commitments—and these, too, are extremely common all through the sciences.
If science were really a matter of following nature wherever it leads, the emergence of the replication crisis would have caused a sudden frantic search for the causes. We’re talking, after all, about something that challenges the act of faith at the center of the scientific enterprise, the trust that the results of scientific inquiry really do provide reliable glimpses into the behavior of nature. By and large, that search hasn’t happened. Instead, most scientists have shrugged and kept plugging away at their existing commitments, with at most an uneasy sidelong glance now and then at Retraction Watch or one of the other websites that talks about who got caught. Other scientists have gone on the attack and denounced anyone who talked about the problems with modern science. That’s the typical behavior of any elite group faced with a challenge to its legitimacy: that is, a social process.
This is the kind of thing Bruno Latour writes about. His field of study is the sociology of science as it is actually practiced, and how the social practices of science shape, and often define, the model of nature produced by science. He’s not alone in that field, of course. I’ve written more than once in previous essays and books about John Ellis’ 1975 study The Social History of the Machine Gun, which shredded the claim that technology is value-free by showing how specific values guided and required the creation of one particular technology. I’ve also written here and elsewhere about Misia Landau’s 1991 book Narratives of Human Evolution, which showed that 19th and 20th century scientific accounts of how human beings evolved were simply rehashes of the myth of the hero’s journey, with Man as the collective protagonist and every single incident discussed by Joseph Campbell in his writings on hero myths present and accounted for.
Latour’s strength is that he’s taken up this approach and generalized it. One of his books, the one sitting on my end table right now, takes it in a direction deeply relevant to the work of this blog. The title? We Have Never Been Modern.
The argument behind that teasing title is complex and draws on a great deal of social theory that’s not especially relevant to our discussion. The heart of the point he makes, though, is easy enough to express. Most people who live in the industrial world at present are convinced, or at least act as though they’re convinced, that the modern world is something new and unique in human history, because unlike all others, our sciences really do tell us the straightforward objective truth about nature, undiluted by social processes. They are also convinced, or act as though they’re convinced, that the same thing is true of everything else in our culture. When authorities in the modern world claim that something does not exist, in other words, that claim implies that it never existed and never will exist, and anyone who says otherwise is a liar or a fool; when those same authorities claim that something is morally right, that implies that it has always been right and will always be right, and anyone who disagrees is evil.
Other societies, other ages of history, had subjective opinions about what’s true or false, right or wrong. We and we alone supposedly know the truth about everything that matters, and if we don’t happen to know it yet, nobody else could have known it either. That’s the heart of modernity. That’s the conviction that keeps people nowadays from making use of any of the hard-won lessons of past civilizations, or even learning from our own civilization’s catastrophic mistakes. It’s a fond, false, foolish belief, it’s hardwired into the foundations of contemporary thought—and Latour and Josephson-Storm have shown, from two different angles, that it can’t be justified except by the most absurd sorts of special pleading and circular logic.
What does it mean if we give up the myth of modernity, the conviction that we—alone of all the human beings who have ever lived—see the world truly? Surprisingly enough, we don’t have to give up science. The plain realities that scientific facts are socially constructed, that scientific worldviews are not given by nature but assembled out of data points drawn from nature in much the same way that a mosaic is assembled from bits of colored stone, and that the pattern in which those data points are assembled is as much a human product as the pattern of the mosaic, do not make science meaningless or false. They simply make the work of the scientist a product of human society and culture rather than a revelation handed down from on high.
In a very real sense, science stripped of the myth of modernity takes on the same shape as the study of history. It’s absurd to think that history is simply an account of what happened; “what happened” in a month in any small town would fill entire libraries. The historian’s task is to craft a narrative which illuminates some part of the past, using actual incidents as building blocks. A scientist without modernist pretensions, similarly, crafts a narrative that illuminates some part of nature, using replicable experimental results as building blocks. Theories along these lines are useful rather than true; they start by accepting the reality that the human mind is not complex enough to understand the infinite sweep of the cosmos, and then go on to say, “but as far as we are capable of making sense of things, this story seems to reflect what happens.”
This sort of thinking is doubtless a bitter pill to swallow for those who have founded their own identities on the notion that humanity is or should be the conqueror of nature, the acme of evolution, the measure of all things, and all the rest of that self-important drivel. Here again, though, the failure of those notions as a means to create a world fit for human habitation is increasingly clear to many of us. I’d like to suggest that the sooner we accept that today’s industrial societies are just another set of human cultures, that the stories they tell to explain the world are just another set of mythologies, and that the technologies they’ve created to manipulate the world are just another set of clever gimmicks—why, the sooner we do these things, the sooner we can get to work discarding those aspects of modernity that have failed abjectly, picking up those older habits and stories and technologies that are better suited to the world we find ourselves facing, and doing something less inept and foredoomed with our time on earth.