Jacques Derrida, the most famous French philosopher of his generation, died yesterday in a Paris hospital aged 74 after losing a battle against cancer of the pancreas.
The death of the founder of the philosophical school of "deconstruction" will be seen as French academia's most significant loss since Jean-Paul Sartre died in 1980.
As recently as last week, Prof Derrida was being tipped as a possible winner of this year's Nobel Prize for Literature, an award that was eventually made to Elfriede Jelinek.
Jacques Chirac, the French president, said yesterday that in Prof Derrida, "France has given the world one of its greatest contemporary philosophers, one of the major intellectual figures of our time".
Prof Derrida's work pioneered a complex and controversial form of philosophy which interpreted different kinds of human thought and knowledge as ambiguous "texts" with multiple and apparently endless layers of meaning. The method, though often impenetrable, had an enormous impact on literature, linguistics, philosophy, law and architecture.
Over a 40-year career, the flamboyant doyen of Parisian intellectuals became one of the best-known and most controversial philosophers in the world - loathed, adored and seldom fully comprehended.
Born into a Jewish family in Algeria in 1930, Prof Derrida began studying philosophy at the elite École Normale Supérieure in 1952 and taught at the Sorbonne in Paris from 1960 to 1964. From the early 1970s, he spent much of his time teaching at American universities, including Johns Hopkins and Yale.
While his followers acclaimed him a playful genius of language, critics said he merely created an obscure form of relativism, in which anything could mean anything. His famously difficult and literary style made him particularly unpopular among many English and American philosophers, most of them reared in the tradition of plain-speaking Anglo-Saxon thought.
Matters reached a head in 1992 when 20 philosophers, including the renowned formal logician, W V Quine, signed a letter to Cambridge University protesting at the award of an honorary doctorate to Prof Derrida.
On the continent, however, Prof Derrida was a celebrated figure - akin to a pop star among students.
In recent years, he began to intervene regularly in political debates. In a debate on global terrorism, he refused to describe the September 11 attacks as an act of "international terrorism", arguing that "an act of 'international terrorism' is anything but a rigorous concept that would help us grasp the singularity of what we are trying to discuss".
It is tempting to say that Jacques Derrida's death has been greatly exaggerated. The French philosopher was so closely associated with nihilism and metaphysical absence that it's perhaps worth wondering whether he ever lived at all. But reality contains some incontrovertible truths, and one of them is that Derrida passed away on Saturday in Paris, at the age of 74. He had suffered from pancreatic cancer.
For more than three decades, the dapper Frenchman with bushy eyebrows and a tan that would have made George Hamilton envious ruled the lecture halls of America's universities. His legions of New World acolytes treated him more like a rock star than a humanities professor. Even politicians admired him, or at least certain ones did. "In him, France gave the world one of the greatest contemporary philosophers, one of the major figures in the intellectual life of our time," said French president Jacques Chirac, whose office announced Derrida's death.
When Derrida burst onto the American scene in the 1960s, the reigning idols of academe, Freud and Marx, were losing their luster. The professoriate wanted a new intellectual hero. In Derrida, they found their man. He offered his fans everything they could have hoped for, from his handsome good looks to his knack for academic celebrity. His Garbo-like refusal to have his photograph taken for publication until 1979 only added to the allure.
Born in French Algeria, Derrida quickly became identified with the hip postwar café culture of Paris's Left Bank. His prose was famously impenetrable; Derrida didn't shrink from writing sentences that rambled on for two or three pages, and his books were abstruse and convoluted in the extreme. None of this put off his tweedy admirers, who regarded Derrida's density as further proof of his profundity.
But Derrida built no new intellectual edifice. His project was one of destruction - or "deconstruction." Derrida claimed to have discovered that all texts contain inherent contradictions that fatally compromise their ability to communicate meaning. The upshot was that the entire Western philosophical and literary tradition rested on an enormous fallacy. Fundamental concepts like logic and truth were illusions. Derrida himself wrote more than 50 books attempting to prove that nothing could be said.
Although dismissed by Derrida's fellow philosophers, deconstruction appealed to literary scholars and others in the humanities who wished to project their own beliefs (political and otherwise) onto the works they studied. It is perhaps revealing that Derrida chose to defend rather than censure the legacy of his most famous disciple, Paul de Man, after a Belgian scholar revealed that the Yale professor had written anti-Semitic tracts in a French-language collaborationist newspaper during the Second World War.
Undaunted by the obvious fact that their own works could be deconstructed and thus nullified by the same theory, professors dove headlong into the Western canon armed with what The Economist dubbed the "circumloquacious" writings of the great Frenchman. Derrida seemed at times to recognize the ludicrous implications of his theory: "What deconstruction is not? Everything, of course. What is deconstruction? Nothing, of course." Yet the true believers failed to understand that the joke was on them.
After deconstruction dethroned art and literature, what remained? Television, apparently. From the perspective of the deconstructionist, almost everything was a "text" - and Professor Derrida simply adored the boob tube. When he wasn't reviewing his travel itinerary or his lecture schedule, Derrida spent much of his free time riveted to the set. "I watch TV all the time," he once said. What kind of shows did he watch? "Anything." But television was not mere passive entertainment - not for a brilliant French intellectual. "I am critical of what I'm watching," Derrida insisted. "I am trying to be vigilant. I deconstruct all the time."
Academic fashions come and go, and deconstruction is now considered passé in many faculty lounges - but that's mainly because its central insights and prejudices have been so fully absorbed into the intellectual outlook of college humanities departments. Deconstruction is now a part of the modern academic's critical toolkit.
The Master is now absent. Unfortunately, his leveling children remain a powerful presence on campus.
John J. Miller is a writer for National Review and Mark Molesky is an assistant professor of history at Seton Hall University. Their new book, Our Oldest Enemy: A History of America's Disastrous Relationship with France, has just been published by Doubleday.
Anyone who tries to account for Jacques Derrida's success in North America is faced with a paradox. During the early 1980s, when his fortunes began to ebb precipitously in France--articles on his philosophy had slowed to a trickle of two or three per annum--in the United States deconstruction became something of an academic cottage industry. Translations of his books, conferences devoted to his thought, as well as endless commentaries trying to explicate the obscurities of "so-called deconstruction" proliferated.
The irony is that American academics--most of whom were clustered in comparative literature departments--attempting to ride the crest of the Parisian theoretical avant-garde were "always already" (to employ a pet Derrideanism) behind the times. For, by the mid-'70s, Derrida's exotic brand of "post-structuralism"--which had proclaimed that the ends of metaphysical "closure" pursued by first-generation, hard-core structuralists like Claude Lévi-Strauss, Jacques Lacan, and Michel Foucault could never be achieved--had become a dead letter.
Nineteen sixty-seven was Derrida's breakthrough year. He published three successful books and, for a brief, shining moment, became the toast of the Left Bank. Structuralism had become intellectually hegemonic. The claims of Derridean "différance"--viz., that all claims to determinate meaning were self-undermining--appeared revolutionary and refreshing. Yet, already by the following year, his hermetic, "negative semiotics"--a semiotics of "absence" rather than "presence"--had become an object of satirical derision. In Structuralist Mornings, the novelist Clément Rosset subjected deconstructionist pretense (specifically, the Derridean habitude of writing sous rature or crossing out words) to biting parody: "I write a first sentence, but in fact I should not have written it, excuse me, I will erase everything and I'll start over again; I write a second sentence, but after thinking about it, I should not have written that one either."
In France, the Derridean gambit foundered quite soon. Like the structuralists, Derrida prided himself on his discursive "illisibilité," or "unreadability." But after the outbreak of the May '68 revolt, when structuralist platitudes concerning the "end of history" and the "end of man" were refuted on the streets of the Latin Quarter, "unintelligibility" had become a distinct liability. In the eyes of the May generation, Derrida was associated with the structuralist old guard. Deconstruction was perceived, not unjustly, as part and parcel of an elitist, self-enclosed, mandarin academic idiom. In the eyes of his critics, Derrida was never able to live down his famous bon mot, "There is nothing outside the text." The exclusive emphasis on "textuality" in his work, combined with the studied indifference to the political exterior or "outside," constituted a final nail in deconstruction's coffin.
Toward the late '80s, deconstruction also underwent a major crisis in North America. In the eyes of his acolytes, the Master's frequent proclamations concerning the "death of the subject" seemed to malign and belittle the idea of human agency itself--and, thus, the prospect of progressive political change. If all meaning were, as Derrida claimed, indeterminate, if moral and epistemological questions were ultimately "undecidable," what was the point of political commitment? When all was said and done, wasn't deconstruction merely an elaborate and convoluted prescription for political quietism?
In 1987 the Paul de Man and Martin Heidegger scandals broke--coincidentally, within months of each other. In a stroke, deconstruction's key North American benefactor (de Man) and its leading philosophical inspiration (Heidegger) were exposed for their compromising associations with Nazism. Derrida did nothing to enhance deconstruction's credibility when he claimed: 1) that Heidegger had become a Nazi due to a surfeit of "metaphysical humanism" and 2) de Man's 1941 newspaper articles endorsing the deportation of Europe's Jews were actually the work of a closet résistant. Could it be that the claims and suspicions of deconstruction's vigorous detractors were true after all?
But the ultimate paradox besetting deconstruction lies elsewhere. It hinges on the fact that a methodology that promoted itself as "critical"--as the exemplar of political and textual criticism--quickly degenerated into a variant of run-of-the-mill academic corporatism. Each time deconstruction was exposed to criticism, the Derridean faithful predictably circled the wagons. Deconstruction had become a new Scripture or Holy Writ. And in the eyes of true believers, its progenitor could do no wrong. Anyone who dared to criticize the credo was branded as a heathen or non-believer. Deconstruction had its moment in the intellectual limelight. But, appropriately, that moment was fleeting.
Richard Wolin is the author of The Seduction of Unreason: The Intellectual Romance with Fascism from Nietzsche to Postmodernism (Princeton University Press).
The popularity of Jacques Derrida's philosophy among academics is hard to understand except as a symptom of decadence. Western intellectuals have never been more safe, more comfortable or more free - so they have turned to a wild, often absurd philosopher who trashes the intellectual foundation of the humanities (and any coherent political project) in a search for intellectual stimulation. As he is buried this week, it is time to ask whether his ideas - and the long, agonising postmodern intellectual spasm - should be buried with him.
I have friends who still wake weeping at 3am with nightmares about trying to understand Derrida in time for their final exams. It's true his writing is wilfully obscure, and at times he lapses into gibberish. But in fact, once you learn how to boil down his prose, his ideas are fairly simple - and pernicious.
Derrida believed Western thought has been riddled since the time of Plato with a cancer he called "logocentrism". This is, at its core, the assumption that language describes the world in a fairly transparent way. You might think that the words you use are impartial tools for understanding the world - but this is, Derrida argued, a delusion. If I describe, say, Charles Manson as "mad", many people would assume I was describing an objective state called "madness" that exists in the world. Derrida would say the idea of "madness" is just a floating concept, a "signifier", that makes little sense except in relation to other words. The thing out there - the actual madness, the "signified" - is almost impossible to grasp; we are lost in a sea of opposing words that prevent us from actually experiencing reality directly.
Derrida wants to break down the naive belief that there is an objective external reality connected to our words that can be explored through language, science and rationality. Any narrative we construct to understand the world will inevitably be built on suppressed violence and exclusion. So, for example, the narrative of 'madness' has been shown by Derrida's colleague and friend Michel Foucault to be a highly elastic concept that is used to stigmatize 'dissidents'; it is a category that serves the powerful. None of our words is immune to these power-games. There is tension, opposition and power in even the most simple of concepts.
So there are, Derrida concluded, no universal truths, no progress and ultimately no 'sense', only "decentred", small stories that are often silenced by a search for rationality and consistency. The search for intellectual coherence is 'violent' and must be shunned.
Derrida claimed he was offering a critique within the Enlightenment tradition - yet within his own explanatory framework he made Enlightenment values untenable. (He spoke openly of using tactics of "duplicity" and "the playing of a double game" to "challenge" the Enlightenment. He explained he was operating within the language of reason since there was no other, but he would try to lay traps for reason by posing it problems it could not answer. This was designed to expose the inherent contradictions in reason and ultimately destroy it.)
Most of his followers therefore work on the assumption that the Enlightenment - the 18th-century tradition that gave us our notions of rationality and progress - is just another empty narrative, a sweet set of delusions.
Behind every reasoned argument, Derrida believed, there is a raw decision with no rational or reasoned basis. Everything else is a polite excuse. So the foundations our Enlightenment culture is built on - the absolutely fundamental assumptions we act on every day - are rotten. All we can hope for is to destroy this "metaphysics of presence", which is the assumption that we can expect immediate access to meaning. Then we might be able to experience a few 'concepts' - somehow. Derrida's method for destroying language is deconstruction - a technique that makes us see that "signifiers" are so ambiguous and shifting that they can mean anything or nothing.
Derrida was, in short, the mad axeman of Western philosophy. He tried to hack apart the very basis of our thought - language, reason and the attempt to tell big stories about how we became as we are. All we are left with - if we accept Derrida's conclusions - is puzzled silence and irony.
If reason is just another language game, if our words cannot match anything out there in the world without doing 'violence' to others - what can we do except sink into nihilism, or turn to the supernatural?
The deconstructionist virus has swept through the humanities departments of universities across Europe and America. But the best way to demonstrate the intellectual collapse this has caused is by looking at the impact of postmodernism on fiction. The fiction that preceded postmodernism - for all its flaws - usually engaged with the world. At its best, it even tried to change it: John Steinbeck hitched a wagon across Depression-scarred California and found a family that became the subject for The Grapes of Wrath.
Compare that to postmodernist fiction, a form of torture so heinous that it surely contravenes the Geneva Convention. Look at the execrable novels of Thomas Pynchon or David Foster Wallace, trapped in self-referential Derridan word-games and irrelevance while a world warms and wails outside their pages. The critic Dale Peck has described the postmodern implosion of the novel perfectly: "This is a tradition that has systematically divested itself of any ability to comment on anything other than its own inability to comment on anything."
Now magnify that effect across the humanities: imagine this deflation happening in anthropology, sociology, philosophy ... you get the idea. There is nothing more depressing than meeting smart graduate students who should be researching really important subjects, only to find they are writing a postmodern deconstruction of the idea of happiness or wealth or human rights, or a thesis with a name like "Is Anthropology Really Possible in Post-Modern Space?". Of course we should always question the categories of our thought - but the wave of deconstruction seems to have reduced academics to doing nothing else. The passivity and irrelevance of European intellectuals and American universities over the past three decades is largely due to the wrong turn they have taken into masturbatory post-modernism; Derridan readings, in my experience, invariably encourage confusion and passivity in the face of injustice, rather than action.
To be fair to him, late in his life Derrida seems to have begun to understand the terrible forces of ultra-scepticism he unleashed. Very few people can actually bear to be nihilists; very few people can preach a message of paralysis and despair for long. So Derrida declared in the early 1990s that there are some "infinitely irreducible" ideas that should not be deconstructed - particularly justice and friendship.
But it was too late. Derrida had vandalised all the tools he could have used to make a case for justice. If reason is just an "exclusionary strategy", if words are mere symbols in a dense fog, if everything must be broken into warring fragments, how can he suddenly call a halt to the process of deconstruction when it comes to one particular value he happens to like? Is his use of the word "justice" somehow immune to all the rules he spent his career articulating? How could he formulate the concept without violently excluding, say, the unjust? How can the battle between those two words be saved from endless mutual obfuscation?
Derrida was left making the preposterous case that justice is a "Messianic" concept that would somehow be revealed to us once we stripped away language and reason.
I suppose it's touching that Derrida made a tragic final attempt to chain his own deconstructionist beast. But the time for him to dissociate himself from nihilism was decades earlier, when he first launched the idea of deconstruction. He should have admitted that, yes, division and tension can be found in all things - but sometimes we need to accept that and build larger categories anyway, without being accused of 'suppression'.
Buried in Derrida's philosophy there are small nuggets of insight: that the structure of language determines our thought much more than we understood before Wittgenstein, and that grand narratives are inherently dangerous unless their exponents admit that they are partial and always doomed to be (at best) necessary fictions. Derrida could have drawn the sane conclusions from this at the start of his career: that we should show a greater degree of scepticism both toward language and narratives than before. But Derrida always promoted a far more shrill and silly agenda to unpick and 'expose' the Enlightenment tradition.
And build what in its place? Derrida neglected to discuss alternatives except in language so opaque it is impossible to decipher. In the real world, the alternatives to reason (Divine revelation? Superstition? Pure will? Despair?) are even more flawed and even less likely to lead to the "liberation" Derrida claimed to seek.
We can see this in Derrida's personal route out of nihilism - through superstition. Not for nothing was Derrida described as "a Jewish mystic"; he even wrote about his belief in ghosts, which seems to be literal (if one can assume anything in Derrida is literal or rational).
When there are urgent crises in the world that need serious intellectual application, it is faintly disgusting for left-wing intellectuals to spend their time arguing about whether the world is really there at all or whether it can ever be described in language or whether there are ghosts about the place. To claim to do so in the name of "true justice" is simply insulting to the victims of injustice. No hungry person craves deconstruction. No tyrannised person feels they are trapped in a language game. It is not only shallow but decadent to claim to be on the left and to dedicate your energy to these demoralising intellectual games.
Terry Eagleton is a Marxist academic with whom I disagree on many things (like the Soviet Union) - but we have a shared belief in rational Enlightenment politics based on notions like evidence, truth and open dispute. He chides Derrida for believing in "the emptiness of desire, the impossibility of truth, the fragility of the subject, the lie of progress and the pervasiveness of power... [Derrida] greets the suggestion there has been any progress in human history with scorn while [he] regularly avails himself of anaesthetics and water closets."
Eagleton continues, "Derrida says there are moral judgements, but they lack any sort of moral or rational basis. There is no longer any relation, as there was for Aristotle or Marx, between the way the world is and how we ought to act within it, or between the way we are and what we ought to do... These judgements are left accordingly hanging in the air. For Derrida ethics is a matter of absolute decisions - decisions which are vital and necessary but also utterly 'impossible', and which fall outside all given norms, forms of knowledge and modes of conceptualisation. One can only hope he is not on the jury when one's case comes up in court."
Just so. There is no doubt a space for a continuing debate about post-modern thought in the more obscure philosophy departments, in the same way that some people still discuss Berkeley's idealism and other philosophical ideas that nobody would ever actually act on.
But to allow it to dominate so much of the humanities, as it has for decades, to allow it to paralyse thought at a moment when the world faces unprecedented crises, is almost pathologically deranged. Academics, novelists and serious thinkers have been parked in the Derridan dead-end for too long.
It's encouraging that so many news organizations offered stories correcting the untrue or half-true factual claims made by both candidates in the presidential debates. My favorite was George W. Bush's claim that John Kerry voted to waive budget caps "two hundred and seventy-seven times." That works out to more than once a month for Kerry's entire tenure in the Senate, which sounds implausible--unless Bush's figure is for procedural votes, not substantive votes. The president ought to know that not even the ponderous United States Senate calls the roll 277 times on any matter of substance. But Bush thought the line sounded good, so he used it regardless.
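The arithmetic is easy to check (my own back-of-the-envelope figures: Kerry took his Senate seat in January 1985, roughly 237 months before the October 2004 debates):

\[
\frac{277\ \text{votes}}{\approx 237\ \text{months}} \approx 1.17\ \text{votes per month}.
\]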
Is all that matters in contemporary culture whether a line sounds good? That's the thesis of an important, provocative new book, The Post-Truth Era, by Ralph Keyes: in the current ethos, whether something is believed has become more important than whether it's true. Keyes cites psychological research showing that people lie far more often than we'd like to think--constantly telling petty lies they think will never be detected and often telling whoppers, even to friends and loved ones. One study showed that 28 percent of conversations among friends contained conscious lies, and 77 percent of conversations between strangers did so. The lies were on matters of substance, not just "your column is good today" and the many similar prevarications intended to avoid hurt feelings.
So perhaps Americans are no longer outraged when politicians lie because we lie so often in our daily lives. Much everyday lying, Keyes says, concerns constructing attractive pasts for ourselves. "I was the quarterback on my high school football team" or "I have a master's degree" or "I had lots of proposals of marriage" or many other claims along these lines are told both to impress others and to make ourselves feel our own pasts were richer or more accomplished. As Paul Auster has written, "Memory is the place where a thing happens for the second time." But not necessarily accurately. Americans like and even admire personal mythmaking and thus don't seem to object much when political figures lie to puff up their pasts. Lyndon Johnson, for example, constantly told audiences his grandfather died at the Alamo; his grandfather died at home in bed, but an Alamo myth made Texas voters more comfortable with LBJ. Jesse Ventura elaborately claimed to have been a Navy SEAL and to have fought in Vietnam. Keyes contends that neither claim was true--but the mythical Ventura had proven attractive to voters. LBJ and Ventura, it must be noted, came out ahead by presenting personal histories they may have wished were true.
There are many other examples, and The Post-Truth Era collects dozens, making it an invaluable compendium of the decline of respect for verity in modern culture. Today many would rather watch a docudrama, in which viewers have absolutely no idea what is historical and what is imaginary, than read carefully researched history. The made-up version is more interesting! Many would rather listen to Michael Moore or the Swift Boat guys--Moore on the left and the Swifties on the right being current exemplars of post-truth politics--since the sort of arguments in which it doesn't matter what is true are more fun than tedious accuracy. The really disturbing trend, Keyes argues, is that so many figures in contemporary politics, literature, journalism, and other fields get away with so much lying about themselves. The public appears to prefer the post-truth version.
Keyes blames the decline of respect for truth partly on intellectual modernism and postmodernism. Intellectuals, he says, crusaded to convince people that there are no absolute truths, that everything is contingent or based on frames of reference. Calamity descended as people actually decided to believe this. Postmodernism's worst idea has infected popular culture, and now millions of Americans and Europeans believe that nothing is really true. Even though most people who watch docudramas or read self-serving "fictionalized" memoirs have never heard of Jacques Derrida or Paul Feyerabend, antitruth ideas they and others championed are loose in popular culture, driving discourse downward.
Since Derrida died nine days ago, it's fair to ask whether he should be assigned some blame for the post-truth state of public debate--intellectuals, after all, must accept responsibility if their ideas do harm rather than good. Derrida was a strangely polarizing figure: His followers considered him an oracle while his detractors viewed him with absurdly exaggerated alarm. Some of what Derrida maintained was inarguably true: for example, that writers can never really escape the confines of language structure nor free themselves of the conventional assumptions of society, which impose psychological limits on creativity. That's a powerful critique. Of course, if the critique is inarguably true, then how does it jibe with Derrida's additional contention that nothing can be inarguably true? Off you go into the postmodernism hall of mirrors, and pretty soon you are all the way back to fretting about whether the chair is actually there.
I think Derrida and others in his general camp do share some of the blame for declining public respect for the notion that some things are true and other things are not true. Intellectuals like to curse the benighted public for not grasping academic theories, but the worst aspect of postmodernism (which is now an old enough term that we ought to be saying après-modernism, perhaps) is that the public actually did grasp it. While the ideas of, say, metaphysicians currently have no bearing on public culture, the ideas of the deconstructionists and postmodernists are prevalent in movies, pop fiction, and politics. It's a worst-case outcome.
Though assigning Derrida some of the blame for the post-truth era, I would like to vindicate Werner Heisenberg, whose work is widely misunderstood, including by many well-educated people. The Heisenberg Uncertainty Principle holds that measuring certain systems alters the system, such that you can know either the position or the momentum of a subatomic particle precisely, but not both at once. Since Heisenberg began publishing his work in the 1920s, modern and then postmodern authors and thinkers have been insisting that the uncertainty principle is hard scientific evidence that no belief, statement, or even observation can be verified; nothing is definite, all is subject to uncertainty. This is total nonsense--because the uncertainty principle applies only to the quantum level, not the world of human senses.
Heisenberg's research concerned paradoxes of quantum mechanics, and quantum mechanics is the science of the incredibly small: of structures much, much tinier than atoms. For instance, a "quantum leap" is an infinitesimally small subatomic transition, not a big jump as the term is commonly misused. At the quantum level, researchers observe many strange effects and can barely guess what they are seeing; for instance, what the quark, the smallest observed unit of matter, is made of is anybody's guess. (My favorite theory is that quarks are made of very rapidly spinning nothing.) But quantum effects are never observed above the quantum level--that is, above the level of subatomic particles. Heisenberg's thesis has no relevance to the everyday world.
Here is an interesting paper (Cho, "Researchers Race to Put the Quantum Into Mechanics," Science 299 (2003): 36-37) on a University of California at Santa Barbara physicist who's been trying to determine why quantum effects are never observed in the macro world. Good luck, Professor Cleland, in puzzling this out. What matters is that nothing in Heisenberg's uncertainty principle applies to any object larger than a molecule. We may not be able to determine precisely where an electron is--but we know exactly where a rock, desk, or chair is. Uncertainty at the quantum level washes out when averaged across the quadrillions of quantum-sized particles in a baseball, whose position may then be precisely known. There is no uncertainty about most physics of the macro world, and no uncertainty about how we experience that world.
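To put rough numbers on why the uncertainty washes out (an illustrative estimate of my own, not drawn from the Science paper): the principle bounds the product of the position and momentum uncertainties,

\[
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}, \qquad \hbar \approx 1.05 \times 10^{-34}\ \mathrm{J\,s}.
\]

For a 0.145-kilogram baseball localized to within an atom's width, \(\Delta x = 10^{-10}\) meters, the minimum velocity uncertainty is

\[
\Delta v \;\ge\; \frac{\hbar}{2m\,\Delta x} \;=\; \frac{1.05 \times 10^{-34}}{2 \times 0.145 \times 10^{-10}} \;\approx\; 3.6 \times 10^{-24}\ \mathrm{m/s},
\]

some two dozen orders of magnitude below anything measurable. Run the same calculation for an electron, whose mass is about \(9.1 \times 10^{-31}\) kilograms, and the uncertainty comes out to hundreds of kilometers per second--which is why the principle matters for electrons and not for chairs.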
Theatergoers were done a great disservice by the recent hit play Copenhagen, which depicted a 1941 meeting in Denmark between Heisenberg and his teacher Niels Bohr and went on and on and on and on and on about how the uncertainty principle tells us that nothing is ever known or certain. I winced throughout this play at its distortions of science and wondered whether author Michael Frayn was a naïf with no grasp of physics or was deliberately misrepresenting physics in order to make his work trendy après-modernism. Critics who praised Copenhagen seemed to lack a grasp of physics too; I saw no review that noted the uncertainty principle has no application whatsoever to human experience. Audiences wanted, perhaps, to believe that science has proven there cannot be truth. I state, as an absolute truth, that this has not been proven and will never be proven. And I commend to readers The Post-Truth Era as an antidote.
Gregg Easterbrook is a senior editor at TNR and a visiting fellow at the Brookings Institution.