If there’s one thing Donald Trump’s Twitter account should have taught us, it’s that allowing anyone to say anything, just because they have opposable thumbs and a pulse, may not be the most productive way to conduct the public discourse so essential to a functioning democracy. When the “marketplace of ideas” is full of hucksters and bullshitters and con artists, after all, it can be difficult for good, honest, thoughtful people to get a word in edgewise.
This didn’t use to be a problem.
Prior to the internet, legend has it, there were good, honest, thoughtful people everywhere, and conversations around the American dinner table were lively, respectful affairs conducted by engaged citizens who carefully considered every nuance of public policy at all levels of government—local, state, national, international (and, for Star Wars fans, interstellar). After years of debating the pros and cons of each and every policy proposal served up by legislators, as well as the merits of every judge, sheriff, and water commissioner in the region, these exceptionally well-informed citizens would then shuffle over to their nearest voting precinct and cast their ballot, secure in the knowledge that their vote counted and that their fellow citizens were voting in equally good faith, based on their own rigorous study of the issues and candidates, all of whom were extraordinarily well-qualified to lead our country to a better, brighter, more prosperous future, regardless of their party affiliation.
I jest, of course. In truth, the only place such passionate policy discussions ever took place was in the bedroom, after a pair of married, freedom-loving citizens attempted to procreate to the soaring strains of our national anthem, always timing their patriotism to coincide with those glaring red rockets, which of course symbolize the turgid excitement all Americans feel when democracy is under attack.
Again, I’m kidding. In bed, most American men can’t make it past the word “ramparts,” no matter how gallant their stream.
But you get the point. Many people in this country think they can recall a time when the so-called “public discourse” was so civil that people worried about it being “coarsened.”
In the rosy embers of the Boomer generation’s collective imagination, there are shrines dedicated to television truth-tellers like Walter Cronkite, Ted Koppel, Dan Rather, Tom Brokaw, and the incomparable Harry Reasoner—a man who had the word “reason” in his very name, for criminy’s sake! Journalism itself was the reliable “fourth estate,” the one institution in our democracy that could be counted upon to ask uncomfortable questions of politicians who kept pretending everything was going to be okay. In this informational utopia of yore, all an American citizen had to do in order to “stay informed” was spend twenty minutes reading the morning paper and half an hour watching the national news. In less than an hour a day, then, Americans had all the information they needed to make the world’s best choices on election day.
This is all a romantic fiction, of course.
No one in 1976 was better informed about politics than the average ten-year-old is today. The big difference is that learned adults in the 1960s and 1970s felt more informed. And they felt more informed because they trusted television and newspaper editors to sort through the detritus of reality and, every morning and evening, present them with a cohesive, neatly packaged narrative that fit nicely into the rhythm of their own day. Things made sense, in other words, because the people writing the so-called “first draft of history” worked very hard to craft a version of world events that seemed rational. The morning paper was not a chaotic jumble of suggestions for how to treat toenail fungus or identify which celebrities are vegan—it was a slowly evolving story of American exceptionalism that anyone could understand and follow. In journalism schools around the country, budding reporters were taught to write at a “sixth-grade level,” because it was understood back then that the average adult was not, scholastically speaking, a very good fit for democracy. Thus, the complexity and nuance of world events were communicated with roughly the same intellectual heft as a Hardy Boys novel, and the “news” was delivered each day in easily digestible chunks. And if it only took twenty minutes to read the morning paper, that left twenty-three hours and forty minutes to think about the news, to digest and process it, before another installment of disturbing information landed on the doorstep.
Then came CNN and the internet. Twenty-four-hour cable news made it possible for people to gorge as much as they wanted at the trough of news, and the internet gave people a giant slop bucket where they could spew the undigested chunks of speculation and innuendo that inevitably accumulated. The formula for reasonable discourse in society was suddenly reversed: Instead of spending less than an hour a day absorbing news and the rest of the day thinking about it, people started absorbing news all day, which didn’t leave time to think about much of anything. Then people discovered that if they could dispense with thinking about the news, there was nothing left to do but react to it. And, because reacting and spewing are so much more cathartic and satisfying than stepping back for several hours to give a matter some considered thought, the internet naturally became a cesspool of vomitous bile.
This wasn’t supposed to happen, of course. Back in the 1990s and early aughts, internet evangelists were absolutely certain that “democratizing” information would ultimately be good for society, and that allowing people from all over the world to communicate with each other—anytime, anywhere—would naturally foster a universal sense of understanding and compassion. A more informed world would be a better world, and the great hive mind of humanity would somehow usher in a new era of international cooperation. Like a beauty contestant searching for the right answer, some of these silicon prophets would even venture that global peace and prosperity for all lay just around the technological bend.
Back in 1995, before most people had even seen a web page, I sat in a room full of other reporters and listened to a panel of young technological visionaries describe the brave new world of communication they were gifting to the world, much of which has come true—and much of which has not. It is true that people all over the world can now communicate with each other cheaply and effortlessly. And it is true that, technically speaking, more information is now available to the average human being through their phone than has been available to any human in history. It is not true, however, that all of this additional communication and information has naturally led to a kinder, more compassionate world. Quite the opposite. More understanding and tolerance may have been the goal, but what none of these technological utopians seemed capable of anticipating was the collective upchuck of hatred and bigotry that their invention would unleash.
Any working journalist could have told them what they were up against. Back in the days of pre-internet journalism, my colleagues and I regularly got hate mail from angry readers who took exception to one thing or another we had written. The beat didn’t matter; nor did the subject.
If you expressed an opinion in print about something—anything—there was always a posse of readers who felt it was their solemn duty to write scathing, bilious denunciations of whatever you’d written. Then came the character assassination—you were an idiot, imbecile, moron, stooge, prick, asshole, monster . . . or whatever. And if you touched an especially sensitive nerve, there were always a few reliable malcontents who wanted you fired—or, worse, killed, always in some grisly, medieval fashion that would communicate in no uncertain terms just how angry you had made them.
On my first day on the job as a working journalist, a colleague of mine showed me a letter he had just received from one of his readers. The letter was scrawled in red crayon and said, “Fuck you for what you said about ‘Pump Up the Volume’. Eat shit and die!!!” He was the movie critic, and he had evidently expressed a cinematic opinion that one of his readers did not agree with—by, as it happened, liking the movie in question a bit too much. So, of course, this reader thought the logical punishment for the crime of not writing what he wanted to hear was . . . death. I myself have received numerous death threats over the years, from readers who felt my perspective on various civic and cultural matters was so noxious that I too should be exterminated.
Back then, we did not yet live in a world where journalists were occasionally abducted, hacked into pieces with a bone saw and dissolved in acid. Nor did we live in a world where people offended by a cartoon would barge into a newspaper’s office and mow down the editorial staff with an AK-47. Journalists did occasionally lose their lives while reporting from the front lines of some war in a dusty distant land, but for the most part, the only people who actively sought vengeance against journalists were dictators for whom the light of truth is like sunshine to a vampire.
Still, there were always plenty of readers who, when confronted with an idea they did not like, preferred the gallows to anything resembling reasonable public discourse. The general public was unaware of these people, however, because the only vehicle available for readers to express a contrary opinion at the time was the “Letters to the Editor” section, and newspaper and magazine editors typically refused to publish letters that were threatening, deranged, or illiterate. Indeed, as a class, editors thought of themselves as the gatekeepers of public discourse. And by not publishing specious invective, baseless allegations, or casual death threats, they typically felt as if they were providing a public service.
To maintain some semblance of civility and decorum in the public discourse, then, it was necessary to suppress some voices—voices that, editors felt, did not contribute anything constructive to the public “dialogue.” And as conscientious gatekeepers of the public discourse, editors felt it was part of their job to insulate the public from the simmering sewer of rage that burbled in some sectors of the citizenry, if only because it is difficult to have a dialogue with someone who is shouting at you.
Fast forward to 2022: one of the biggest differences between media consumption then and now is that everyone now has a “voice,” even if they only use it to disparage people on Facebook. People who don’t appreciate what mainstream newspapers like the New York Times and Washington Post have to say about the world also now have news outlets and websites that tell them what they want to hear. Not what is true, mind you, or what can be verified through responsible reporting—but whatever confirms their own biases and suspicions.
Thus, people with a different worldview are free to create their own informational ecosystem and tell themselves whatever they want to believe: the election was rigged, the coronavirus is a hoax, the climate isn’t changing, Vladimir Putin is a genius, and forest fires are created by Jewish lasers shot from space.
Again, this was not supposed to happen. When news outlets started creating websites and posting stories online, many an idealistic editor thought the “comments” section for each story would become a kind of electronic “public square,” where people could debate the pros and cons of any given story and thereby create the kind of open “discourse” that almost everyone agreed should be the cornerstone of a healthy democracy. Instead, given the chance, readers heaped so much profanity and invective on news articles that many newspapers started hiring small armies of interns to monitor the comments sections and weed out the most offensive posts. Dismayed by the crassness and incivility of their readers, many news websites stopped hosting comments altogether.
But the angry mobs did not go away—they simply gravitated toward media outlets that validated their anger and offered them an informational prism through which they could look and see that they were right and everyone else was wrong, just as they suspected. For some reason, these same people concluded that the mainstream media was lying to them, and that professional journalists whose livelihoods depend upon the accuracy of their reporting, and who have the guts to attach their name to each story they write, cannot be trusted. No, to these people, a story can only be trusted if it is being posted on a Reddit thread by an anonymous teenager whose job, if they have one, is definitely not on the line should it turn out that Hillary Clinton is not selling children for sex at the pizza parlor around the corner.
Trust in anything, consequently, is at an all-time low. Almost all “media” is now polarized along political battle lines, and each of the warring factions accuses the other side of being deceitful criminals and fascists who will destroy the country if they are ever allowed to pursue their evil agenda. In this paradigm, Congressional gridlock is the only thing saving American democracy from its supposed representatives, all of whom—if you believe the yowling from their respective media tribes—are hell-bent on ruining the country, one way or another.
So we are left with a conundrum: democracy in America seemed to work better—or at least more smoothly—when the public had less information at their fingertips, and the general arc of the media narrative was controlled by just a few large newspapers and television networks. Most people got the same general news from the same basic sources, so there wasn’t a tremendous amount of disagreement about the basic facts of any given discussion.
Then again, average voter turnout in presidential elections between 1960 and 1995 hovered around 53%, so it cannot be said that the citizenry was tremendously engaged in the electoral process. In 2020, however, total voter turnout nationwide was 66.8%—the highest voter turnout in 120 years. So, despite the fractured media landscape and seemingly batshit attitude toward governance displayed by the Trump administration (or perhaps because of it), participation in the democratic process during the 2020 election was at its highest level in more than a century. Americans may be polarized and angry, but it seems as if there is nothing quite like the threat of encroaching fascism from both ends of the political spectrum to energize the electorate and get out the vote.
So which is the better form of democracy? A comfortably centrist polity that lulls the population into a state of relative indifference, or one that swings wildly from left to right and back again, creating so much drama and alarm that voters turn out in droves?
The answer doesn’t really matter, because it is impossible to go back to the media landscape of the mid-twentieth century, when news editors waded through the messy nonsense of real life and presented the public with a highly selective slate of stories that seemed rationally connected to the world in which readers actually lived. Not so anymore. In the parlance of our beloved tech titans, the internet has set information “free,” and people are now rearranging their lives accordingly, based on whatever version of reality they prefer.
But fear not: if Americans end up not liking the 21st-century iteration of American democracy, they can always scrap it and build a society where everyone agrees on everything—you know, like China, North Korea, or Russia. Because if there’s one thing authoritarian dictators are good at, it’s restricting the flow of information so that their citizens don’t get too confused.
Unfortunately, if Americans want to add a little more authoritarianism to their daily lives, they are going to have to answer one crucial question: Which form of fascism do they prefer?