After the Fact
In the history of truth, a new chapter begins.
By Jill Lepore
Ted Cruz’s campaign autobiography is called “A Time for Truth.” “This guy’s a liar,” Donald Trump said at a recent G.O.P. debate, pointing at Cruz. Trump thinks a lot of people are liars, especially politicians (Jeb Bush: “Lying on campaign trail!”) and reporters (“Too bad dopey @megynkelly lies!”). Not for nothing has he been called the Human Lie Detector. And not for nothing has he been called a big, fat Pinocchio with his pants on fire by the fact-checking teams at the Times, the Washington Post, and PolitiFact, whose careful reports apparently have little influence on the electorate, because, as a writer for Politico admitted, “Nobody but political fanatics pays much mind to them.” “You lied,” Marco Rubio said to Trump during the truth-for-tat February debate. Cruz tried to break in, noting that Rubio had called him a liar, too. Honestly, there was so much loudmouthed soothsaying that it was hard to tell who was saying what. A line from the transcript released by CNN reads:
UNIDENTIFIED MALE: I tell the truth, I tell the truth.
Eat your heart out, Samuel Beckett.
On the one hand, not much of this is new. “Gen. Jackson is incapable of deception,” Andrew Jackson’s supporters insisted, in 1824. “Among all classes in Illinois the sobriquet of ‘honest Abe’ is habitually used by the masses,” a Republican newspaper reported of Lincoln, in 1860. The tweets at #DumpTrump—“This man is a hoax!”—don’t quite rise to the prose standard of the arrows flung at supporters of John Adams, who Jeffersonians said engaged in “every species of villainous deception, of which the human heart, in its last stage of depravity is capable.”
“When a President doesn’t tell the truth, how can we trust him to lead?” a Mitt Romney ad asked last time around, during an election season in which the Obama campaign assembled a so-called Truth Team to point out Romney’s misstatements of fact. Remember the Swift Boat Veterans for Truth, from 2004? This kind of thing comes and goes, and, then again, it comes. Cast back to Nixon: Among all classes, the sobriquet of “Tricky Dick” was habitually used by the masses. “Liar” isn’t what opponents generally called Ford or Carter or the first George Bush, but a Bob Dole ad, in 1996, charged that “Bill Clinton is an unusually good liar,” and much the same was said of Hillary Clinton, dubbed “a congenital liar” by William Safire. A Bernie Sanders campaign ad refers to him, pointedly, as “an honest leader”; his supporters have been less restrained. At a rally in Iowa, they chanted, “She’s a liar!”
On the other hand, some of this actually is new. When a sitting member of Congress called out “You lie!” during the President’s remarks before a joint session in 2009, that, for instance, was new. (“That’s not true,” Obama replied.) John Oliver’s #MakeDonaldDrumpfAgain campaign is both peerless and unprecedented. On HBO, Oliver checked Trump’s facts, called Trump a “litigious serial liar,” and dared him to sue. Also newish is the rhetoric of unreality, the insistence, chiefly by Democrats, that some politicians are incapable of perceiving the truth because they have an epistemological deficit: they no longer believe in evidence, or even in objective reality.
To describe this phenomenon, Democrats go very often to the Orwellian well: “The past was erased, the erasure was forgotten, the lie became truth.” Hillary Clinton has a campaign ad called “Stand for Reality.” “I’m just a grandmother with two eyes and a brain,” she says, which is an awfully strange thing for a former First Lady, U.S. senator, and Secretary of State to say. But what she means, I guess, is that even some random old lady can see what Republican aspirants for the Oval Office can’t: “It’s hard to believe there are people running for President who still refuse to accept the settled science of climate change.”
The past has not been erased, its erasure has not been forgotten, the lie has not become truth. But the past of proof is strange and, on its uncertain future, much in public life turns. In the end, it comes down to this: the history of truth is cockamamie, and lately it’s been getting cockamamier.
Most of what is written about truth is the work of philosophers, who explain their ideas by telling little stories about experiments they conduct in their heads, like the time Descartes tried to convince himself that he didn’t exist, and found that he couldn’t, thereby proving that he did. Michael P. Lynch is a philosopher of truth. His fascinating new book, “The Internet of Us: Knowing More and Understanding Less in the Age of Big Data,” begins with a thought experiment: “Imagine a society where smartphones are miniaturized and hooked directly into a person’s brain.” As thought experiments go, this one isn’t much of a stretch. (“Eventually, you’ll have an implant,” Google’s Larry Page has promised, “where if you think about a fact it will just tell you the answer.”) Now imagine that, after living with these implants for generations, people grow to rely on them, to know what they know and forget how people used to learn—by observation, inquiry, and reason. Then picture this: overnight, an environmental disaster destroys so much of the planet’s electronic-communications grid that everyone’s implant crashes. It would be, Lynch says, as if the whole world had suddenly gone blind. There would be no immediate basis on which to establish the truth of a fact. No one would really know anything anymore, because no one would know how to know. I Google, therefore I am not.
Lynch thinks we are frighteningly close to this point: blind to proof, no longer able to know. After all, we’re already no longer able to agree about how to know. (See: climate change, above.) Lynch isn’t terribly interested in how we got here. He begins at the arrival gate. But altering the flight plan would seem to require going back to the gate of departure.
Historians don’t rely on thought experiments to explain their ideas, but they do like little stories. When I was eight or nine years old, a rotten kid down the street stole my baseball bat, a Louisville Slugger that I’d bought with money I’d earned delivering newspapers, and on whose barrel I’d painted my last name with my mother’s nail polish, peach-plum pink. “Give it back,” I told that kid when I stomped over to his house, where I found him practicing his swing in the back yard. “Nope,” he said. “It’s mine.” Ha, I scoffed. “Oh, yeah? Then why does it have my name on it?” Here he got wily. He said that my last name was also the name of his baseball team in the town in Italy that he was from, and that everyone there had bats like this. It was a dumb story. “You’re a liar,” I pointed out. “It’s mine.” “Prove it,” he said, poking me in the chest with the bat.
The law of evidence that reigns in the domain of childhood is essentially medieval. “Fight you for it,” the kid said. “Race you for it,” I countered. A long historical precedent stands behind these judicial methods for the establishment of truth, for knowing how to know what’s true and what’s not. In the West, for centuries, trial by combat and trial by ordeal—trial by fire, say, or trial by water—served both as means of criminal investigation and as forms of judicial proof. Kid jurisprudence works the same way: it’s an atavism. As a rule, I preferred trial by bicycle. If that kid and I had raced our bikes and I’d won, the bat would have been mine, because my victory would have been God-given proof that it had been mine all along: in such cases, the outcome is itself evidence. Trial by combat and trial by ordeal place judgment in the hands of God. Trial by jury places judgment in the hands of men. It requires a different sort of evidence: facts.
A “fact” is, etymologically, an act or a deed. It came to mean something established as true only after the Church effectively abolished trial by ordeal in 1215, the year that King John pledged, in Magna Carta, “No free man is to be arrested, or imprisoned . . . save by the lawful judgment of his peers or by the law of the land.” In England, the abolition of trial by ordeal led to the adoption of trial by jury for criminal cases. This required a new doctrine of evidence and a new method of inquiry, and led to what the historian Barbara Shapiro has called “the culture of fact”: the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth and the only kind of evidence that’s admissible not only in court but also in other realms where truth is arbitrated. Between the thirteenth century and the nineteenth, the fact spread from law outward to science, history, and journalism.
What were the facts in the case of the nail-polished bat? I didn’t want to fight, and that kid didn’t want to race. I decided to wage a battle of facts. I went to the library. Do they even have baseball in Italy? Sort of. Is my name the name of a baseball team? Undeterminable, although in Latin it means “hare,” a fact that, while not dispositive, was so fascinating to me that I began to forget why I’d looked it up.
I never did get my bat back. Forget the bat. The point of the story is that I went to the library because I was trying to pretend that I was a grownup, and I had been schooled in the ways of the Enlightenment. Empiricists believed they had deduced a method by which they could discover a universe of truth: impartial, verifiable knowledge. But the movement of judgment from God to man wreaked epistemological havoc. It made a lot of people nervous, and it turned out that not everyone thought of it as an improvement. For the length of the eighteenth century and much of the nineteenth, truth seemed more knowable, but after that it got murkier. Somewhere in the middle of the twentieth century, fundamentalism and postmodernism, the religious right and the academic left, met up: either the only truth is the truth of the divine or there is no truth; for both, empiricism is an error. That epistemological havoc has never ended: much of contemporary discourse and pretty much all of American politics is a dispute over evidence. An American Presidential debate has a lot more in common with trial by combat than with trial by jury, which is what people are talking about when they say these debates seem “childish”: the outcome is the evidence. The ordeal endures.
Then came the Internet. The era of the fact is coming to an end: the place once held by “facts” is being taken over by “data.” This is making for more epistemological mayhem, not least because the collection and weighing of facts require investigation, discernment, and judgment, while the collection and analysis of data are outsourced to machines. “Most knowing now is Google-knowing—knowledge acquired online,” Lynch writes in “The Internet of Us” (his title is a riff on the ballyhooed and bewildering “Internet of Things”). We now only rarely discover facts, Lynch observes; instead, we download them. Of course, we also upload them: with each click and keystroke, we hack off tiny bits of ourselves and glom them on to a data Leviathan.
“The Internet didn’t create this problem, but it is exaggerating it,” Lynch writes, and it’s an important and understated point. Blaming the Internet is shooting fish in a barrel—a barrel that is floating in the sea of history. It’s not that you don’t hit a fish; it’s that the issue is the ocean. No matter the bigness of the data, the vastness of the Web, the freeness of speech, nothing could be less well settled in the twenty-first century than whether people know what they know from faith or from facts, or whether anything, in the end, can really be said to be fully proved.
Lynch has been writing about this topic for a long time, and passionately. The root of the problem, as he sees it, is a well-known paradox: reason can’t defend itself without resort to reason. In his 2012 book, “In Praise of Reason,” Lynch identified three sources of skepticism about reason: the suspicion that all reasoning is rationalization, the idea that science is just another faith, and the notion that objectivity is an illusion. These ideas have a specific intellectual history, and none of them are on the wane. Their consequences, he believes, are dire: “Without a common background of standards against which we measure what counts as a reliable source of information, or a reliable method of inquiry, and what doesn’t, we won’t be able to agree on the facts, let alone values. Indeed, this is precisely the situation we seem to be headed toward in the United States.” Hence, truthiness. “I’m no fan of dictionaries or reference books: they’re élitist,” Stephen Colbert said in 2005, when he coined “truthiness” while lampooning George W. Bush. “I don’t trust books. They’re all fact, no heart. And that’s exactly what’s pulling our country apart today.”
The origins of no other nation are as wholly dependent on the empiricism of the Enlightenment, as answerable to evidence. “Let facts be submitted to a candid world,” Thomas Jefferson wrote in the Declaration of Independence. Or, as James Madison asked, “Is it not the glory of the people of America, that whilst they have paid a decent regard to the opinions of former times and other nations, they have not suffered a blind veneration for antiquity, for custom, or for names, to overrule the suggestions of their own good sense, the knowledge of their own situation, and the lessons of their own experience?”
When we Google-know, Lynch argues, we no longer take responsibility for our own beliefs, and we lack the capacity to see how bits of facts fit into a larger whole. Essentially, we forfeit our reason and, in a republic, our citizenship. You can see how this works every time you try to get to the bottom of a story by reading the news on your smartphone. Or you can see it in the recent G.O.P. debate when Rubio said that Trump had hired Polish workers, undocumented immigrants, and Trump called him a liar:
TRUMP: That’s wrong. That’s wrong. Totally wrong.
RUBIO: That’s a fact. People can look it up. I’m sure people are Googling it right now. Look it up. “Trump Polish workers,” you’ll see a million dollars for hiring illegal workers on one of his projects.
In the hour after the debate, Google Trends reported a seven-hundred-per-cent spike in searches for “Polish workers.” “We rate Rubio’s claim Half True,” PolitiFact reported. But what you see when you Google “Polish workers” is a function of, among other things, your language, your location, and your personal Web history. Reason can’t defend itself. Neither can Google.
Trump doesn’t reason. He’s a lot like that kid who stole my bat. He wants combat. Cruz’s appeal is to the judgment of God. “Father God, please . . . awaken the body of Christ, that we might pull back from the abyss,” he preached on the campaign trail. Rubio’s appeal is to Google.
Is there another appeal? People who care about civil society have two choices: find some epistemic principles other than empiricism on which everyone can agree or else find some method other than reason with which to defend empiricism. Lynch suspects that doing the first of these things is not possible, but that the second might be. He thinks the best defense of reason is a common practical and ethical commitment. I believe he means popular sovereignty. That, anyway, is what Alexander Hamilton meant in the Federalist Papers, when he explained that the United States is an act of empirical inquiry: “It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.” The evidence is not yet in.