When I mentioned in a post two weeks ago that America was heading into a new phase of its history, and that I would be offering some suggestions about what that next phase might look like, I was far from sure how to begin that conversation. As happens fairly often these days, however, current events came to my assistance. Late last month, rumors began to spread that the Trump administration was drafting an executive order setting out rules for new government buildings that would require them to be built in the Neoclassical style that used to be standard for government buildings all over the country.
Of course this caused a fine shrieking meltdown, not only in the architectural profession but in corporate media outlets such as the New York Times. The executive order was denounced in the usual shrill tones as a war on diversity, a heavy-handed attempt by Trump to stifle the creativity of architects, and of course racist—the corporate media has a bad case of political Tourette’s syndrome these days, and “racist” is one of the words it blurts out whenever it feels stressed. It so happens that there’s a very good reason for the temper tantrum; it’s just not any of the reasons the throwers thereof want to talk about.
At the root of the tantrum is the unspoken but ironclad rule of modern architecture: ordinary people must not be allowed any say in the built environment in which they live and work. Only architects and certain other allegedly qualified experts are allowed to have a voice in those decisions. Everyone else is supposed to shut up, grit their teeth, and put up with whatever steaming, smelly mass of cutting-edge postmodernity the architects happen to want to excrete this time. This rule applies above all to major public buildings—you know, the ones that you and I get to pay for with our tax dollars. The contract goes out, the architects do their thing and collect their money, and everyone is supposed to be grateful, no matter how sickeningly ugly, meaningless, and dysfunctional the building that results from this process happens to be.
What makes this a loaded issue, of course, is that “sickeningly ugly, meaningless, and dysfunctional” is a good capsule summary of the architecture of our time. Buildings don’t have to be like that. Good architecture is beautiful—that is, it evokes feelings of delight in the people who spend time around it, provided that their natural taste hasn’t been spoiled by snobbery or an architecture degree. Good architecture is meaningful—that is, it communicates to the people who spend time around it, using the visual language of an architectural style rooted in their history and experience. Good architecture is functional—that is, it supports and facilitates the human activities done in and around it, instead of interfering with them. It’s because so few modern buildings fulfill even one of these requirements that the vast majority of Americans loathe modern architecture. It’s because of this, in turn, that Trump’s executive order is clever politics: once again, he’s goaded his opponents into supporting something most voters oppose, and once again he’s tricked them into being smug and arrogant about it.
We’ll talk another time about the reasons behind the rise of what I’ve called Uglicism—the attitude, pervasive throughout the artistic mainstream these days, that deliberately rejects beauty and meaning, and instead glorifies ugliness and a flat refusal to communicate with anyone outside a narrow circle of cognoscenti defended by various modes of snobbery rooted in class privilege. Just now I want to talk about two other points. The first is that the tantrum over Trump’s executive order isn’t just a tempest in an unusually ugly teapot. It’s not only in architectural matters, after all, that ordinary people are supposed to sit down, shut up, and accept whatever their soi-disant betters decide they’re going to get. That’s been the basic attitude of the managerial class all along, and it explains the media panic: if the people get the right to decide what kind of built environment they live in, will they demand a voice in other decisions too?
Of course they will, and right there you can see one of the outlines of the next America taking shape around us. As I discussed two weeks ago, the seven decades since the managerial class took power have proven beyond any question that the most thoroughly educated experts in the world can embrace embarrassingly stupid policies with appalling human costs, especially if they go out of their way to shield their fellow experts from the consequences of their mistakes. That’s why past generations in this country embraced, however imperfectly and unsteadily, the idea of representative democracy: the principle that the people get to choose those who make decisions in their name, and can throw the decisionmakers out on their ears if they make bad decisions and don’t learn from their failures. It’s also why past generations in this country embraced, however imperfectly and unsteadily, the idea of liberty: the principle that individuals, families, and communities should have the right to make their own decisions in any matter that doesn’t conflict significantly with the rights of others—and no, “I’m outraged by the decision you made!” doesn’t count as a significant conflict.
At intervals in American public life, elite classes decide that they’ve had enough of democracy and liberty, and try to rig things so that all the decisions that matter are made by them or their flunkies. At intervals in American public life, the people get sick and tired of this, and demonstrate to the elite classes that they have another think coming. (One of the most common ways they do this involves finding someone the elite classes can’t stand and electing him to the presidency.) Over the decades immediately ahead, we can expect to see a fair amount of turmoil as politicians and voters tussle over who gets to make which decisions, which is business as usual in an open society; eventually it’ll settle down into a new elitism, but if things follow the usual rhythm, children born this year will be getting ready for retirement before that happens.
So that’s one of the processes brought to the surface by the squabble over architecture. The other requires a little more background.
In the United States, right back to the earliest days after independence, major government buildings were usually built in the Neoclassical style—that is to say, using a set of architectural forms meant to evoke the public buildings of Athens and the Roman Republic. That was a deliberate choice and it was made for good reasons. In 1792 nearly every other country on Earth was ruled by a monarch, and used public architecture to proclaim royal supremacy. The United States, in renouncing monarchical rule, used the heritage of Athens and Rome to remind the world that some of the greatest nations of the ancient world had done without kings and used architecture to celebrate the supremacy of the people instead.
Most people, provided that their natural taste has not been spoiled by snobbery or an architecture degree, find ancient Greek public architecture attractive and engaging. If you read The Ten Books On Architecture by Marcus Vitruvius Pollio, the only architect’s handbook to survive from the ancient world, you’ll find out why. The proportions of all those columns are derived from the proportions of the human body—the Doric style of Greek architecture is based on the male body, the Ionic and Corinthian styles on the female body. When you look at ancient Greek public architecture, in other words, with all those columns gathered together, what you’re seeing is a group of people standing around having a conversation. That’s why people in the United States recognized Neoclassicism for so long as the right style for their governmental buildings; it’s one of the very few examples of a monumental architecture that celebrates the people rather than glorifying the power of a monarch.
The rejection of Neoclassical architecture in favor of Uglicism after the Second World War also has a message to communicate. Look at a government building in the Uglicist style and you don’t see a group of people having a conversation. You see looming walls of concrete, glass, and steel shutting out the community, proclaiming a worldview that rejects everything human in favor of brute mechanical force. (It’s telling that Uglicist architecture is so often much more attractive on the inside than on the outside; what’s being communicated here, of course, is “we deserve comfort and beauty, but you don’t.”) There’s another thing you’re seeing, though, and that’s a systematic attempt to erase history.
An artistic style is a conversation across time. Over the long history of the Neoclassical style—a history that starts in the Renaissance, and draws on the early medieval Romanesque style as well as the Classical styles of Greece and Rome—architects carried on that conversation with each other, exploring what could be said with the rich architectural vocabulary of the Neoclassical style. What’s more, this was a conversation that people outside the architectural profession could follow and participate in. On the one hand, it was a normal part of public education to learn at least a little about the history of architecture; on the other, architects had not yet retreated from the public behind a barrier of class-conscious snobbery as harsh and forbidding as the exterior of a modern office building. At chautauquas—the public speaking venues, half instruction and half entertainment, that played so important a role in 19th century American public culture—talks on architecture were common fare, and so when a new state or federal building went up, most adults knew enough to have an intelligent opinion about its design.
That was one of the things that went out the window after the Second World War as Uglicism came to the fore. For a while there was an attempt at a coherent style—the International Style, barren and bleak as it was, still tried to express a shared symbolic language—but before long the last traces of the conversation across time disintegrated into the schizophrenic word salad of Postmodernism or the stark catatonia of Brutalism. Instead of building on the history of architecture, with its rich stock of inherited meanings and its centuries of experience with human functions, modern architects by and large go out of their way to design buildings that reject and erase the past. That’s not a bug, it’s a feature—and it comes out of a pervasive hostility to history that shapes much of the behavior of the managerial elite today.
The thing that interests me is that this hostility to history occurs in mirror-image form on both sides of the line that divides mainstream corporate liberals from mainstream corporate conservatives. To the first group, America was a monolithically horrible place until postwar liberals came along and started fixing it, and therefore we should only learn a highly edited and tendentious version of what happened before then. To the second, America was a monolithically wonderful place until postwar liberals came along and started wrecking it, and therefore we should only learn a highly edited and tendentious version of what happened before then. Watch current squabbles over such hot-button issues as the New York Times’ 1619 Project, a hamfisted attempt to rewrite American history to conform to the latest fashions in social justice ideology, and most of what you’ll see consists of all-out war between the partisans of two absurdly simple-minded caricatures of American history.
This is why, to my mind, one of the most revolutionary things any American can do right now is to put some serious time into learning the parts of this nation’s history that don’t conform to either of these caricatures. I mean that quite seriously. As George Orwell pointed out trenchantly in his novel 1984, whoever controls the past controls the future; convince people that their entire history points in a certain direction and you’ve got a pretty good chance of keeping them going in that direction, even if that leads them right off a cliff. That’s why the wars over American history have been so bitter in recent years—everyone involved knows that the stakes in those struggles are very high indeed.
Both sides in the history wars of our time are committed to the sort of history I critiqued at length in my book After Progress: that is to say, history as morality play, a stereotyped conflict between progress and stasis in which one side acts out the role of heroic visionaries heralding the oncoming wave of the future, while the other side acts out the role of defenders of the outworn status quo whose sentimental attachment to the existing order makes them resist the inevitable and fail. It’s because conservatives bought into this narrative just as thoroughly as liberals that the history of conservatism from 1932 to 2016 was one long litany of defeats. It’s because liberals bought into this narrative just as thoroughly as conservatives that so many people on the American left have gotten so unhinged since the 2016 election, when history stopped conforming to the stereotype and unexpectedly cast them in the role of defenders of a failed and outworn status quo.
That rigidly plotted morality-play view of history, however, can only be defended by an act of erasure that blots out most of American history and also—and crucially—does the same thing to even more of the history of the rest of the world. Take the endless posturing by liberal historians about the unequaled moral evil of the European conquest and settlement of the Americas. You would think, to read these diatribes, that no other group of people in all of history ever invaded and conquered lands where somebody else lived. Au contraire, unless all your ancestors happen to belong to one of a very small number of ethnic groups in a very few isolated corners of the world, it doesn’t matter what your skin color and ethnicity are: people in your family tree invaded and conquered somebody else’s territory. What’s more, to judge by every documented example known to history, they were just as brutal about it as the European settlers of the Americas.
It so happens that in the centuries before 1500, the warlike peoples of the bleak, mountainous peninsula that sticks off the western end of Asia—yes, that would be Europe—borrowed military and maritime technologies from the great band of civilized urban societies that stretched from China to West Africa, and as barbarians so often do, reworked them in new and lethally creative ways. It so happens that in the centuries between 1500 and 1900, those same warlike peoples swept out of their European homelands to conquer most of the world. It’s an old story, and it was already an old story when the Guti swept down on the city-states of Sumer four millennia ago.
Only three things make it distinctive just now. The first is that the borrowings and innovations that equipped the marauders from Europe made it possible for them to cover more territory than any previous set of barbarian invaders could manage. The second is that there was a massive bacteriological differential between the Old World and the New, so that Old World diseases wiped out around 95% of the native population of the Americas and Australasia. (If that hadn’t been the case, the history of those continents after European colonization would parallel that of India and Africa.) The third, of course, is that the world shaped by those invasions is the one we live in today.
A thousand years from now, in other words, historians will talk about the Europeans the way they now talk about the Mongols and the Huns. The events unfolding around us right now are just as instantly recognizable if you know how the latter days of these and other barbarian invaders unfolded in their turn: the softening and slow decay of the conquering nations, shifts in the balance of power as older societies regain their technological edge, the dissolution of vast but temporary empires and their replacement by successor states—again, it was already an old story when the Guti lost control of the lower Tigris-Euphrates valley to the resurgent city-states of Sumer.
Most of us are still too close to the events of the age of invasions to see things with that sort of clarity, though. This makes it difficult for people on all sides of the resulting conflicts to recognize that what happened was simply another round of history as usual. Human beings invade land claimed by other human beings and dispossess, enslave, or kill the previous inhabitants; that’s one of the things our species does—and despite the claims sometimes made by fans of hunter-gatherer societies, it’s something we’ve been doing since long before we finished becoming human. Chimpanzees in the wild have been documented doing exactly the same thing to other bands of chimpanzees, you know.
I’ve gone into this issue here at some length, because one of the standard mechanisms used to erase history these days is to point to something people did in the past that we now consider morally offensive, and insist that because people back then did that horrible thing, people now should be prevented from learning anything about people back then—other, that is, than hearing an endless braying litany about how horrible they were. It’s one variant of a broader strategy of erasure, which works by pulling certain historical events out of context, parading them around as though they made up the entire experience of a complex era, and using them to distract attention from other events that fail to support a preconceived narrative.
It’s important to be precise here. History is always a matter of picking and choosing, deciding which events are important and which are not. This is why it’s crucial, in learning about history, to come at it from a variety of directions, exploring competing narratives and criteria for what is and isn’t relevant—and also, of course, why the sort of dogmatic historical monoculture that’s so often seen in American education, for example, is so destructive and so dull. A history that doesn’t challenge your preconceptions from more than one direction has failed in its task.
Every movement toward a new future thus begins with a new understanding of the past. The next America that will begin taking shape over the decade ahead, as the technocratic America of the managerial elite finishes its one-way trip into history’s dumpster, is no exception to this rule. As a contribution to that process, I’m going to devote a certain number of posts this year to a major current in the history of this country that has been excluded from the narratives marketed by all sides in the recent culture wars. You already know about some of the people I’ll be discussing, though only a very few of you will have heard of some of the others, and the current that weaves them together is not something most American historians will talk about at all. Though the story doesn’t begin there, we’ll start our exploration with the arrival of a ship at the busy wharves of colonial Philadelphia in the year 1694.