There are times when the twilight of the American century takes on a quality of surreal absurdity I can only compare to French existentialist theater or the better productions of Monty Python’s Flying Circus, and this is one of them. Over the weekend, in response to a chemical-weapons incident in Syria that may or may not have happened—governments on all sides are making strident claims, but nobody’s offering evidence either way—US, British, and French military units launched more than a hundred state-of-the-art cruise missiles at three Syrian targets that may or may not have had anything to do with chemical weapons, damaging a few buildings and inflicting injuries on three people.
James Howard Kunstler, in a recent and appropriately blistering essay, termed this “kabuki warfare.” It’s an apt term, though I confess the situation makes me think rather more of John Cleese and the Ministry of Silly (Bombing) Runs, or perhaps a play by Camus in which Bashar al-Assad and Vladimir Putin sit around talking while they wait for the endlessly delayed arrival of an American cruise missile named Godot. What, exactly, was accomplished by Donald Trump’s red-faced bluster, the heavily rehearsed outrage and cringing subservience of our European lapdogs-cum-allies, and all those colorful photo ops of missiles blasting off?
To be sure, there’s nothing even remotely new about the latest skit from this transatlantic flying circus. For most of a decade now the US military has been carrying out a similar sort of warfare against jihadi militias in Syria and Iraq, pretending to fight Islamic State in much the same way a mime pretends to be trapped in a phone booth—a habit pointed up by the way that the Russian military, which has a less ineffectual notion of warfare, pushed Islamic State into prompt collapse by having their cruise missiles and bombs actually hit something.
There’s nothing uniquely Trumpian in this sort of silliness, to be sure; the mock war against jihadi terrorists was launched by the younger Bush and pursued with unflagging enthusiasm and utter fecklessness by Obama, and indeed such futile gestures have been standard bipartisan practice for American presidents for a good many decades now. For that matter, the maximal theater and minimal effect of our military gestures in the Middle East are arguably par for the course from a nation whose health care industry doesn’t care for anybody’s health, whose education system long ago stopped even trying to educate, whose Democratic Party has nothing but contempt for democracy, and whose Republican Party displays an equal contempt for the res publica, the public good from which the entire concept of a republic derives.
At the end of an age of abstraction, such absurdities are par for the course. At some point in the not too distant future, as I pointed out a few years ago in my novel Twilight’s Last Gleaming, it’s pretty much a given that the US is going to run face first into an opponent that takes war a good deal more seriously than we do, and by the time the dust settles it’s anyone’s guess whether the United States will have the same system of government—or for that matter, whether it will still exist as a single nation. (When the Russian Empire sent its huge, expensive, and ineffective military lumbering into a major war in 1914, the Tsarist imperial system was terminated with extreme prejudice and replaced by the Soviet Union; when the Austro-Hungarian Empire did the same thing with its own equally overpriced and underperforming military in the same year, it ended up partitioned into half a dozen new nations.)
What John Kenneth Galbraith called the twilight of illusion—the inflection point at which the end has arrived, but is not yet in sight—plays a massive role in today’s world. While we wait for the inevitable moment when realities finally get a look in, though, it’s useful to keep pursuing the project set in motion on this blog some weeks ago: the exploration of ways in which individuals can haul themselves up out of the swamp of abstractions in which our society is so deeply mired just now, and catch their balance again on the solid footing of things that actually matter.
Let’s summarize the historical dimension of our present predicament; it will make what follows a good deal clearer. In the course of their life cycle, societies pass through a series of predictable stages, and these stages shape—among many other things—the way people in those societies relate to knowledge. In very broad terms, we can speak of an age of faith, an age of reason, and an age of memory, or (to put the same insight in different terms) an age of narration, an age of abstraction, and an age of reflection.
As we look back from the present along the historical trajectory that leads to the modern Western world, that cycle can be traced three times—two complete, one not yet complete. The classical world went through all three stages along the historical arc that leads from the mythological consciousness of archaic Mediterranean societies, through the flowering of Greek philosophy and culture, to the long slow reflective twilight of the Roman world. The medieval world—centered in the Middle East, but influencing Europe as well—went through the same cycle thereafter, from the mythological consciousness of formative Christianity and Islam, through the flowering of scholastic philosophy and culture, to the long slow reflective twilight that wrapped the Byzantine-Muslim world after the Mongol invasions and kindled the Renaissance in Europe.
The modern world then launched itself on the same trajectory. Though it’s considered impolite to point this out in many circles, the high intellectual culture of medieval Europe was never much more than a pale reflection of cultural stimuli from points further east—the dependence of medieval European scholars and thinkers on translations from Arabic is very well documented, and so is the massive role played by the fall of Byzantium in 1453 and the westward flood of Byzantine refugees that followed it in launching the high Renaissance in Italy.
In the western and central European nations where our civilization had its origins, interest in the high culture of the Muslim world and its European echoes was an upper-class affectation, and never put down deep roots. The age of faith that followed the end of the western Roman empire thus continued effectively unbroken straight through the Middle Ages, and attempts to spread a different cultural mentality in the Renaissance generated the violent pushback of the Reformation era. So our age of faith ran its course in the usual way, and gave way to an age of reason, which (as ages of reason always do) started out with perfect faith in the supposedly limitless power of the human mind to make sense of everything that matters, and (as ages of reason always do) proceeded to bog down in a swamp of abstract generalizations increasingly disconnected from the actual circumstances in which people live their lives.
It’s that swamp of abstraction that’s on display as the US and its European client states lob petulant flurries of cruise missiles at arbitrarily chosen Syrian warehouses, and brandish an assortment of equally arbitrary verbal noises to justify their actions. That, in turn, is why—as we’ve discussed all through this series of posts—it’s crucial for those of us interested in a less abstract and arbitrary approach to the rising spiral of crisis of our time to get to work on the foundations of the next stage in the cycle. That, in turn, brings us back to where we left off at the end of the last post in this sequence.
Two weeks ago we talked about the ancient and Renaissance study of topics—that is, of the truths we have in common, the figurative places from which discussion can start when there’s no overarching generalization to provide a basis for reasoning together. One detail we didn’t get to is the way that topics were divided up back in the day. Every specialized field of study had its own topics, and we’ve still got some of those in fossilized form today: behind the modern classification of chemical elements, for one, and the Linnaean system of taxonomy for living things, for another, lies the ancient quest for topics, for a set of mutually accepted truths from which discussions within a given field of study can proceed.
Those are the special topics or, as they were called once upon a time, the special or particular places. Those stand over against another set of places which serve a more general function as starting points for conversations outside any one specialty. These are the common places—or, as we tend to write the word nowadays, the commonplaces.
That’s one of the Rodney Dangerfields of English vocabulary these days, a word that gets no respect. To call something “commonplace” is to dismiss it as obvious, boring, beneath mention. Yet the term didn’t start out as a dismissal. Before our civilization embarked on its age of abstraction, the commonplaces were literally the common places, the places that didn’t belong to any specialized field of knowledge, but were at least potentially shared by all.
Please note: potentially shared by all. At the end of an age of abstraction, communication grinds to a halt because no two sides in any controversy have the same abstract generalizations in common, and abstract generalizations have become so central to thinking and conversation that next to nobody knows what to do when they fail. It’s the failure of abstract generalizations to provide a shared basis for communication that drives the search for some other starting point, and counterproductive as most of the attempts are, sooner or later they find their way to the same enduring resource of the commonplace, the experiences most of us have in common.
In an age that’s confident of its command of vast truths, the little truths that belong to the realm of the commonplace are dismissed as unimportant. It’s when those vast truths start to fall to pieces that the little truths come into their own. That was what sent thinkers of the Renaissance back to the old Greek study of topics, the things that are true most of the time, the matters of common experience—and the same resource has just as much to offer now.
So how do we find these commonplaces, these matters of shared experience? That’s where things become interesting.
One of the technological achievements that set the Renaissance apart from previous ages, at least in the Western world, was the availability of paper: a cheap, convenient, readily produced writing surface that allowed huge amounts of information to be stored in a compact, portable form. The technological revolution of papermaking set off a giddy assortment of consequences across Europe—newspapers, lending libraries, a torrent of cheap broadsides and pamphlets that had the same wildly uninhibited, inaccurate, unfair, and frequently obscene nature as the internet does today—but one that had a much bigger impact than people generally realize today was the cheap paper notebook: a bound volume containing some hundreds of blank pages, on which individuals could write their own content.
Does that seem like a little thing, dear reader? Until then, across the western half of Eurasia, writing materials were either expensive or sharply limited in content. In ancient Greece, for example, your literate person who wanted to write something down had his or her choice between expensive sheets of parchment or papyrus, on the one hand, and less costly materials that wouldn’t hold much, such as broken scraps of pottery, on the other. (There were also wooden tablets with a layer of wax on them, for temporary notes.) The notebook made it possible, for the first time in the Western world’s history, for individuals—and not just rich individuals, either—to make an enduring record of their own thoughts and reflections, and the things they encountered that set these in motion.
Thus one of the standard uses for cheap paper notebooks, from the late Renaissance straight through into the nineteenth century, was that once-familiar phenomenon, the commonplace book. Children were taught to start such a book as soon as they could write legibly, and a fair number of them kept at it straight through their adult lives. A commonplace book was a place to write down things you encountered that interested you, or set you thinking, or struck you as unusually true, or valuable, or beautiful. Short passages from books would go into a commonplace book; so would poems, your own or others’; so would recipes, household hints, useful facts; so would your own reflections on these things and others.
Notice what was going on here. These things were commonplaces, but they weren’t anyone else’s commonplaces. The standard practice was to encourage, even to require, each student to do the work himself or herself, so that each person ended up with a unique set of commonplaces. No doubt every single entry in a schoolchild’s commonplace book was shared with at least one other child in the same school, and a pretty fair number could be found in most—at times when books were still quite expensive, and students read a small number of them very carefully rather than reading a vast number superficially, the sources of raw material were rather more limited than they are today. Even so, each commonplace book turned into the anchor for a unique inner life, shaped by influences different from those that shaped any other person anywhere.
Many of the other common habits of the period from the Renaissance to the mid-nineteenth century had the same effect. One I find particularly intriguing is meditation. Since the rise of Theosophy in the late nineteenth century, most people in the western world have thought of meditation as something exotic, spooky, and foreign, practiced mostly by swamis in funny turbans and Shaolin monks seen through lenses smeared with Vaseline. They’ve also thought of it as something that works by turning off the thinking mind, because that’s the kind of Asian meditation that the Theosophists publicized. (There are many other kinds that work the thinking mind rather than trying to amputate it, but those methods aren’t what H.P. Blavatsky and her immediate followers wanted to talk about.)
Yet the word “meditation” is good English, and existed hundreds of years before Blavatsky launched modern alternative spirituality (and invented modern fantasy fiction, but that’s another story). It didn’t originally have anything to do with turning off your mind, either. When we say that a crime was premeditated, after all, we don’t mean that the perp chanted a mantra before committing it; we mean that he thought about it.
That’s what meditation used to mean. Christians practiced it all the time, back in the days before most varieties of Christianity capitulated to modern pop culture and let their faith be redefined by its enemies. Joseph Hall, an Anglican bishop of the late sixteenth and early seventeenth centuries, wrote several hugely popular textbooks on the subject. His method—which was standard at the time—was to calm body and mind by any of several methods that would be highly familiar to your average Zen monk, and then take a brief passage from the Bible or some other source, hold it in your mind, and think about it, keeping your awareness focused on the theme (the subject of the meditation) and the thoughts that unfolded from it.
The technical name for this kind of meditation is discursive meditation—so called because it very often takes the form of an inner discourse. (Something very similar, interestingly enough, was practiced by the Stoics back when the classical world’s age of reflection dawned—you can find the details in Pierre Hadot’s excellent works The Inner Citadel and Philosophy as a Way of Life.) It has the same benefits as other kinds of meditation, but it also feeds the same process as keeping a commonplace book: the nurturing of a uniquely personal inner life. Because you’ve taken the time to think through things that matter to you, turning them over in the silence of your mind, you aren’t dependent on the people around you and the chatter of the media for your opinions and your values; your inner life is your own, not just a wholly owned subsidiary of your neighbors’, or of the big corporations who get to decide what you watch on television.
Does it seem to you, dear reader, that we’ve gone as far as possible from the theme of mutual communication in an age when all abstractions have crumbled? Quite the contrary, we’ve finally reached the point where that becomes an option. Let’s say you and your neighbor want to talk about some issue that matters to you both, and let’s say both of you have your own unique inner lives, stocked with sets of personal commonplaces that might not have anything in common with each other’s, and that you also both have a habit of thinking thoughts you didn’t get off Twitter or the evening news of your choice. What’s more, you both know this.
Given that background, do you come on strong with some abstract generalization that you insist is the only valid frame in which the whole issue must be discussed? Or do you lead with a factoid that serves as a stalking horse for some such generalization? No, Socrates, you do not. You don’t do this because you know better—because you’ve measured the distance between your inner life and those of other people, and know that the presupposition of consensus that underlies those bad habits is a baldfaced lie.
Instead, you offer a commonplace that you think, on the basis of whatever knowledge you have of the other person, the two of you might be able to use as a mutually acceptable starting point. If that doesn’t work, you don’t throw a tantrum (or a cruise missile); you let the other person try—and you assess the commonplace that’s offered as though you were considering whether to put it into your commonplace book or use it as a theme for a meditation. Very often—because this works far more often than not—you start with your own experience, granting from the get-go that the other person’s experience may be different from yours. Bit by bit, the two of you build a foundation of shared experience from which you can discuss even a controversial subject in a thoughtful, reflective, and constructive manner.
There’s a name for this kind of reflective exploratory conversation, too; it’s called dialectic. (No, this has nothing to do with Marxist babble about “dialectical materialism.” One of the many reasons that Karl Marx deserves to be posthumously slapped is his role in making it harder to talk about classical dialectic.) It’s a fragile thing, and can easily be disrupted by people who want to do that, which is why it usually thrives best in formal or informal institutions specifically set up to foster good conversation and exclude those who won’t follow the rules. We’ll be talking about such institutions as we proceed; we’ll also be talking quite a bit about dialectic—but before we get to that, it’s going to be necessary to trace things back another way, and talk about how the robust and unique inner life we’re discussing gets its raw material.