Sentimental Nihilism And Popular Culture
Commercial art forms may yet save the Western tradition from its propensity to self-destruction
If last year’s debates about Britishness demonstrated anything, it’s that a culture cannot be reduced to a checklist of its most popular dishes and landmarks. Society is built, instead, upon the countless habits and rituals of its members, both living and dead. Since collective identity emerges imperceptibly from these everyday experiences, our understanding of ourselves is always rather nebulous and imprecise — like one of those optical illusions that, when one focuses too hard, dissolves back into the page. As each generation passes, we forget something essential — if intangible — about ourselves. With the final breath of every dying person, some small spirit of the age escapes irretrievably into the air.
Throughout history, civilisations have compensated for this loss by stowing their shared memories in communal institutions. But today, for perhaps the first time in history, large chunks of our culture appear indifferent, even hostile, to their own past.
Look, for instance, at the art world. For many centuries, the West’s artistic traditions were held among its most precious assets, for they conveyed — by melody and brushstroke — so many things otherwise inexpressible about who we are. But at the beginning of the 20th century, culture suddenly took a different turn: artists, no longer content simply to loosen the ties and top buttons of convention, stripped themselves completely, doused their clothes in petrol, and set them alight.
Swept by the modernism surging through Europe’s veins, they sought to overturn and recreate everything anew. Declaring their own traditions irrelevant, they butchered them. Schoenberg irrevocably scrambled tonality. Duchamp scribbled a moustache on the Mona Lisa.
The great oaks of Western art were burned to the ground. Today, radical artists are left scouring the embers for the last traces of life. Their primary target is now the taboo — the unspoken memory of a once-communal system of values. Tracey Emin shows us her unmade bed, strewn with used condoms and bloodied underwear. Damien Hirst suggests that the 9/11 hijackers “need congratulating”. Every last inherited standard — every last comfort — must be torn from us once and for all.
But by trying so hard to wipe its own memory, art comes perilously close to losing its sense of self altogether. Once the shocks no longer shock, what does it stand for? A few generations after the narcotic highs of modernism, the art world has left itself largely brain-dead.
This tragedy acts as a miniature simulation of just how easily — and quickly — cultures can wither away. And it ought to alarm us to see the same pattern emerging right across Western society.
Consider the main philosophical movements of the 20th century. The majority followed the fearsome footsteps of Friedrich Nietzsche — the man who killed God and buried good and evil at His side. And though they grappled with his legacy in a variety of ways, they shared, more or less, the same key assumption: that the traditional pursuits of thought — truth, beauty, meaning — were fundamentally misguided. Philosophy, unable to comment on the world, turned instead to — and on — itself. “Having broken its pledge to be at one with reality,” Theodor Adorno wrote, “philosophy is obliged to ruthlessly criticise itself.”
At the same time, positivism — the belief that only empirical or logically deduced data have any real meaning — took hold among many of the West’s intellectual circles. A.J. Ayer and Bertrand Russell declared that, if we were ever to understand ourselves, it would be by scientific means alone. Cultural memory, which could not be reduced to testable propositions, was made entirely superfluous.
Wherever one looked, the West seemed to be in the midst of a curious experiment: can a civilisation survive on nothing but the impulse to debunk its own presuppositions?
Adorno and his co-author Max Horkheimer tried to tackle this question in Dialectic of Enlightenment. A bleak assessment of Western culture, it argued that modernism, nihilism and reductionism were symptoms of the same fundamental malady — the suicide of Enlightenment thinking. Our insatiable appetite for self-criticism, the monstrous alter ego of philosophical scepticism, was finally gnawing at the very foundations on which we stood.
Adorno and Horkheimer thought it unlikely we would survive, and predicted three historical steps that would see us collapse altogether. First, high culture — including art — would exhaust itself, taking with it any sense of a shared inheritance. Second, we would lapse into infantile solipsism, duped by the immediate gratifications of capitalism — in particular, cinema and popular music. Finally, society — stupefied by such pleasures — would topple at the first serious test of its walls. Adorno and Horkheimer saw a host of surrogate mythologies — most notably, Nazism — poised to flood into the vacuum left behind.
This final point seemed borne out by the events of the 1930s and 1940s. But then, as the war receded into the past, much of the West suddenly found itself reclining into an unprecedented period of peace and prosperity. To the baby boomers, Adorno and Horkheimer’s stuffy pessimism seemed laughably outmoded. And today, we assume — having never known any different — that this good fortune is simply here to stay. At a time of such global instability — with Putin and Islamism openly challenging our values — we urgently need to reconsider our confidence. Were the last 70 years really the final disproof of Adorno and Horkheimer’s pessimism, or did history merely postpone its judgment?
Let us begin with the charge of Western infantilism. Here, at least, Adorno and Horkheimer seem to have been rather prescient. The West is — for all its wealth today — far more childish than even they anticipated. This can be traced — I believe — to the reductionist narratives we adopted as our mantras during the last century.
Think about the social implications of Ayer’s philosophy, emotivism. According to Ayer, moral and aesthetic statements express nothing but the crudest of personal feelings — when I say “Theft is wrong,” all I really mean is “I don’t like theft.” That’s it. Arguments about the thorniest of ethical dilemmas or the most sublime of artworks are reduced to the level of a toddler’s tantrum. The evolutionary psychologists go even further: we’re not just children, they say; we’re animals. According to Richard Dawkins, “Our animal origins are constantly lurking behind, even if they are filtered through complicated social evolution.” Culture is just a long-winded mating game that, somewhere along the line, seems to have got a bit out of hand.
These are not niche ideas any more. Advertisements humorously depict us as bumbling primates, perhaps stumbling upon coffee or a microwave for the first time. Kurt Vonnegut wrote: “I was taught that the human brain was the crowning glory of evolution so far, but I think it’s a very poor scheme for survival.” Such writers give us absolutely no reason to cultivate virtue, no reason to refine our judgments, and every reason to ignore the past and dispense with our responsibilities.
Which, given the easy ride we’re getting, suits us just fine. We seek to make society blinkered, mindless and immature. Look at the way today’s businesses choose to market themselves. They invent names that imitate the nonsense words of babies: Zoopla, Giffgaff, Google, Trivago. They deliberately botch grammar in their slogans to sound naïve and cutesy: “Find your happy”, “Be differenter”, “The joy of done”. They make their advertisements and logos twee and ironic — a twirly moustache here, a talking dog there — just to show how carefree and fun they are.
Those in our society who actually still have children have them later and in smaller numbers than ever. Many simply choose to forgo the responsibilities of parenthood altogether. Marriage is an optional extra — one from which we can opt out at any point, regardless of the consequences for the children.
Students expect to be treated like five-year-olds: one conference recently prohibited applause for fear it would, somehow, trigger a spate of breakdowns. Many of my fellow twentysomethings reach adulthood believing they can recreate in their everyday lives the woolly comforts of social media. They discover, with some surprise, that they cannot simply click away real confrontation, and — having never developed the psychological mechanisms to cope with it — instead seek simply to ban it.
The effects of social media don’t end there. A Pew Research Center study last year found that regular social media users are far more likely than non-users to censor themselves, even offline. We learn to ignore, rather than engage with, genuine disagreement, and so ultimately dismantle the most important distinction between civil society and the playground — the ability to live respectfully alongside those with whom we disagree.
Social media assures us that the large civilisational questions have already been settled, that undemocratic nations will — just as soon as they’re able to tweet a little more — burst into glorious liberty, and that politics is, thus, merely a series of gestures to make us feel a bit better. Hence the bewildering range of global issues we seem to think can be somehow resolved with a sober mugshot and a meaningful hashtag.
In reality, our good fortune is an anomaly. We’ll face again genuine, terrifying confrontations of a kind we can scarcely imagine today. And we’ll need something a little more robust than an e-petition and a cat video.
Sadly, our philosophical approach seems to have been to paper over Nietzsche’s terrifying abyss with “Keep calm . . .” posters. If one were to characterise the West’s broad philosophical outlook today, it would be this: sentimental nihilism. We accept, as “risen apes”, that it’s all meaningless. But hey, we’re having a good time, right?
This is gleefully expressed by our society’s favourite spokespeople — comedians, glorifying the saccharine naivety of a culture stuck in the present. When the New York Times columnist Ross Douthat asked the comedian Bill Maher to locate the source of human rights, Maher simply shrugged his shoulders and said, “It’s in the laws of common sense.”
Unable to make sense — as Alasdair MacIntyre says — of the mutilated philosophical traditions that once gave our now everyday language its meaning, we curl up into our little corner of history and — fingers crossed behind our backs — resort to wishful assertions. As a classic sentimental nihilist, Stephen Fry, says: “I know that lies will always fail and indecency and intolerance will always perish.” Really? On what evidence?
Far more likely to perish, unfortunately, is the “open society”. As the philosopher Leszek Kolakowski wrote: “the extension and consistent application of liberal principles transforms them into their antithesis . . . [A]mong the dangers threatening the pluralist society from within . . . what seems to bode most ill is the weakening of the psychological preparedness to defend it.” Perhaps he had in mind Bertrand Russell’s boast “I would never die for my beliefs because I might be wrong”, which is today echoed by Ricky Gervais: “We have nothing to die for. We have everything to live for.” Will history be kind enough to let us get by on that alone?
If we are to equip ourselves for the challenges ahead, we urgently need to tackle this nihilism. For as long as we see ourselves as the spiritless inhabitants of a meaningless world, we will teeter precariously above a precipice of our own making.
We must also reverse our deep-set suspicion of history. Our universities have, for some time now, been expunging reams of “dead white males” from their reading lists. To a generation with fingers in their ears, such thinkers have nothing to say. Unable to sense the subtle threads that bind us all to a shared past, we latch on instead to whichever tags are dangled in front of us — feminism, transgenderism, post-colonialism. These labels are much easier to grasp, for they require no real knowledge of the past, only of present suffering.
We need some way to engage with each other as members of a common group once again. And though so much of our culture splintered over the last century, there is one strand that might provide us with a starting point: popular culture.
Anchored by the conservatism of public taste, most popular forms — film and music in particular — stayed the course of the 20th century much more successfully than their “higher” cousins. Many can trace an unbroken line back to the very traditions the modernists tried to sever us from. If a contemporary classical composer writes in a tonal style, it sounds peculiar to us: too self-conscious, too kitsch. But in popular music, the continued use of a harmonic system developed centuries ago sounds perfectly natural — precisely because it never tried fully to break away.
Indeed, far more of the West’s teleological code might have been smuggled in popular forms than their highbrow critics ever realised. Just as the eye seems to appear on whichever evolutionary branch one looks at, so the same trends that preoccupied Western musicians a hundred years ago are unfurling in pop music today. Melody strains against its rhythmic and harmonic leashes once again, threatening to snap free altogether. But while Schoenberg — motivated by political ideology — thrust this melodic “autonomy” onto his works, today it grows out of humanity’s simple desire to explore. The prognosis for today’s music is therefore, I believe, much better.
Popular culture crystallised archetypically Western tropes that, if nurtured, may still blossom again. It is probably the closest thing we have today to a myth about ourselves — we do not question, perhaps cannot question, the pre-rational sway it has over us. So ingrained in the public’s mind are the perfect cadence and the love story that not even the Enlightenment’s cynical ticks can burrow deep enough to suck them out. Today, like the lounge suit, their ubiquity conceals a quintessentially Western inheritance.
Which suggests that capitalism — for all Adorno and Horkheimer’s misgivings — might protect, rather than corrupt, culture. Kolakowski notes how totalitarian regimes reach a point of economic stagnation and collapse, taking their culture with them. Capitalism, by reflecting more accurately the intricate web of human relations, does a better — though not, of course, perfect — job of preserving our tastes and traditions.
But it cannot look after us alone. It is but one part of an urgently needed review of who we are and where we’re going. And to face the future with any confidence, we must begin with the memory of where we once came from.