Mark Fisher
K-Punk: The Collected and Unpublished Writings of Mark Fisher
Editor’s Introduction by Darren Ambrose
PART ONE: METHODS OF DREAMING: BOOKS
space, time, light, all the essentials — reflections on j.g. ballard season (bbc four)
why i want to fuck ronald reagan
what are the politics of boredom? (ballard 2003 remix)
fantasy kits: steven meisel’s “state of emergency”
the assassination of j.g. ballard
toy stories: puppets, dolls and horror stories
PART TWO: SCREENS, DREAMS AND SPECTRES: FILM AND TELEVISION
portmeirion: an ideal for living
fear and misery in the third reich ‘n’ roll
gothic oedipus: subjectivity and capitalism in christopher nolan’s batman begins
when we dream, do we dream we’re joey?
notes on cronenberg’s eXistenZ
i filmed it so i didn’t have to remember it myself
spectres of marker and the reality of the third way
“you have always been the caretaker”: the spectral spaces of the overlook hotel
coffee bars and internment camps
“they killed their mother”: avatar as ideological symptom
return of the gift: richard kelly’s the box
“just relax and enjoy it”: geworfenheit on the bbc
star wars was a sellout from the start
beyond good and evil: breaking bad
classless broadcasting: benefits street
rooting for the enemy: the americans
how to let go: the leftovers, broadchurch and the missing
the strange death of british satire
the house that fame built: celebrity big brother
sympathy for the androids: the twisted morality of westworld
PART THREE: CHOOSE YOUR WEAPONS: WRITING ON MUSIC
the by now traditional glasto rant
k-punk, or the glampunk art pop discontinuum
noise as anti-capital: as the veneer of democracy starts to fade
lions after slumber, or what is sublimation today?
for your unpleasure: the hauter-couture of goth
it doesn’t matter if we all die: the cure’s unholy trinity
memorex for the kraken: the fall’s pulp modernism
postmodernism as pathology, part 2
you remind me of gold: dialogue with mark fisher and simon reynolds
militant tendencies feed music
the secret sadness of the twenty-first century: james blake’s overgrown
review: david bowie’s the next day
the man who has everything: drake’s nothing was the same
break it down: dj rashad’s double cup
start your nonsense! on eMMplekz and dolly dolly
review: sleaford mods’ divide and exit and chubbed up: the singles collection
test dept: where leftist idealism and popular modernism collide
PART FOUR: FOR NOW, OUR DESIRE IS NAMELESS: POLITICAL WRITINGS
don’t vote, don’t encourage them
october 6, 1979: capitalism and bipolar disorder
what if they had a protest and everyone came
the face of terrorism without a face
conspicuous force and verminisation
my card: my life: comments on the amex red campaign
the great bullingdon club swindle
winter of discontent 2.0: notes on a month of militancy
football/capitalist realism/utopia
the future is still ours: autonomy and postcapitalism
the only certainties are death and capital
why mental health is a political issue
time-wars: towards an alternative for the neo-capitalist era
not failing better, but fighting to win
the happiness of margaret thatcher
how to kill a zombie: strategising the end of neoliberalism
no one is bored, everything is boring
abandon hope (summer is coming)
for now, our desire is nameless
PART FIVE: WE HAVE TO INVENT THE FUTURE: INTERVIEWS
they can be different in the future too: interviewed by rowan wilson for ready steady book (2010)
capitalist realism: interviewed by richard capes (2011)
preoccupying: interviewed by the occupied times (2012)
we need a post-capitalist vision: interviewed by anticapitalist initiative (2012)
“we have to invent the future”: an unseen interview with mark fisher (2012)
PART SIX: WE ARE NOT HERE TO ENTERTAIN YOU: REFLECTIONS
how to keep oedipus alive in cyberspace
no future 2012 (for nick kilroy)
ridicule is nothing to be scared of (slight return)
real abstractions: the application of theory to the modern world
fear and misery in neoliberal britain
K-Punk: The Collected and Unpublished Writings of Mark Fisher (2004-2016)
Front Matter
About the Authors
Mark Fisher (1968 – 2017) was a co-founder of Zer0 Books and, later, Repeater Books. His blog, k-punk, defined critical writing for a generation. He wrote three books, Capitalist Realism, Ghosts of My Life and The Weird and the Eerie, and was a Visiting Fellow in the Visual Cultures department at Goldsmiths, University of London.
Darren Ambrose is a freelance writer and editor from the North-East of England.
Simon Reynolds is the author of Retromania and Rip It Up and Start Again.
Title Page
k-punk
The Collected and Unpublished Writings of Mark Fisher (2004-2016)
Edited by Darren Ambrose
Foreword by Simon Reynolds
Foreword by Simon Reynolds
The strange thing is that I encountered Mark’s mind long before I actually met him. In a way, I knew him before I even knew of him.
Let me explain. In 1994, I wrote a piece for Melody Maker about D-Generation, a concept-laden outfit from Manchester whose line-up included Mark. But I only ever spoke on the phone with another member, Simon Biddell. Because I was so interested in D-Generation’s ideas, it never even occurred to me to do the basic journalistic procedure of asking who else was in the group. So it was a full decade later that I learned I’d effectively written about Mark, when he shyly revealed this fact in an email to me. And sure enough, digging out the yellowing clip — D-Generation as “Pick of the Week” in the Advance section of Melody Maker — there was Mark right in the centre of the photo: his hair in a vaguely Madchester-style bob, his eyes staring out at the reader with a searching, baleful intensity.
D-Generation were one of those groups that are grist for the mill of the music press, catnip to a certain kind of critic: the conceptual framing was piquant and provocative, the sound itself lagged slightly behind the spiel. Rereading the piece and listening for the first time in many years to D-Generation’s EP Entropy in the UK, it’s fascinating just how many of Mark’s signature fixations were already in evidence. There’s the centrality of punk in his worldview: D-Generation described their music as “techno haunted by the ghost of punk” (literally, in “The Condition of Muzak”, which sampled Johnny Rotten’s Winterland 1978 kiss-off “ever get the feeling you’ve been cheated?” and turned his bitter jeering laugh into a riff). There’s the love-hate for Englishness: hating the hale ‘n’ hearty, artless and anti-intellectual side of the national character (“Rotting Hill” sampled “Merrie England? England was never merry!” from the film version of Lucky Jim), loving a darkly arty deviant tradition that included the Fall, Wyndham Lewis, and Michael Moorcock (all referenced in D-Generation’s press release). There’s also early evidence of Mark’s virulent contempt for retro: “73/93” targeted what D-Generation dubbed the “Nostalgia Conspiracy”. And there are even flickering ectoplasmic portents of hauntology, that twenty-first century current of music and thought and sensibility that Mark championed so compellingly.
Beyond those specifics, though, it’s the structure of the encounter itself that is revealing and prefiguring. Here’s a music journalist (in this case, me) hungrily on the look-out for a group with ideas, and, having found one (in this case, D-Generation), forming a symbiotic alliance with musicians who themselves think like critics. That’s how Mark would operate when he got to the other side of the divide. In his fruitful relationships with Burial, the Caretaker, Junior Boys, and other artists, a mutually intensifying feedback loop between the music-theorist and the music-practitioner was set in motion. The borderline between the two fronts of activity dissolved. Both critic and artist contributed equally to the scene, pushing it ever forward in a dialectic of advance, counter-reaction, swerve, clash.
Raised on the British music press of the Eighties (primarily NME) and fuelled further by what survived of its approach and spirit into the Nineties (primarily Melody Maker and the Wire), Mark Fisher was possibly the last of a disappearing breed: the music critic as prophet. The primary mission was identifying the leading edge and proselytising on its behalf, while simultaneously directing laser beams of negativity to discredit the wrong paths being taken and to clear space for the true music of our time. But alongside weaponised praise in support of the new and radical, the messianic critic also set challenges for music — and for listeners and readers too.
Mark Fisher became the best music writer of his generation. But that is just one of his areas of achievement. Mark wrote brilliantly about the arts adjacent to popular music: television, science fiction, mainstream movies (particularly the pulp end of the spectrum — it always amazed me that he would routinely check out things like the CGI-bloated 2005 remake of King Kong, just on the off chance there was something salvageable there, something he could recruit to his “pulp modernism” concept). Mark wrote rivetingly about high culture too — visual arts, photography, literature, highbrow cinema. And he wrote penetratingly about politics, philosophy, mental health, the Internet and social media (the phenomenology of digital life — its peculiar affects of connected loneliness and distracted boredom). Often, and most crucially, Mark wrote about many — sometimes all — of these things at the same time. Making connections across far-flung fields, zooming in for vivid attention to aesthetic particulars and zooming out again to the widest possible scope, Mark would locate the metaphysics in a TV show like Sapphire and Steel, the psychoanalytic truths lurking in a Joy Division song, the political resonances stitched into the fabric of a Burial album or Kubrick movie. His subject was all of human life (even though he would characterise himself as neither a humanist nor a vitalist). The ambition was vast; the vision was total.
The exciting thing about Mark’s writing — in his blog k-punk, in magazines like the Wire, FACT, Frieze, and New Humanist, and in his books for Zer0 and Repeater — was the feeling that he was on a journey: the ideas were going somewhere, a gigantic edifice of thought was in the process of construction. You sensed, with mounting awe, that Mark was building a system. There was a feeling too that while the work was rigorous and deeply informed, it was not academic, either in terms of its intended audience or as an exercise done purely for its own sake. The urgency in Mark’s prose came from his faith that words really could change things. His writing made everything feel more meaningful, supercharged with significance. Reading Mark was a rush. An addiction.
After that odd not-quite-encounter with D-Generation, the first time I came across the name “Mark Fisher” was a byline in the strikingly designed periodicals that emanated from an enigmatic entity known as Ccru. I can’t recall whether they sent me their tracts or whether it was our mutual friend Kodwo Eshun who turned me on. Right from the start, Mark’s work stood out. Much of the output of the Cybernetic culture research unit, a para-academic organisation loosely tethered to Warwick University’s Philosophy Department, was wilfully hermetic, closer to experimental fiction than academic work. Practically writhing on the page, Mark’s prose wasn’t scholarly either, but it was always lucid. Oh, he was partial — as we all were in those days — to gnomic neologisms and portmanteau terms; there was an exuberant play with language in amidst the apocalyptic sobriety and urgency of the tone. But — and this would characterise his entire career as a writer — Mark hardly ever made his writing more dense or difficult than it needed to be. He had the zeal of the true communicator, someone who believes that the ideas and the issues being addressed are simply too important to be obfuscated. Why put obstacles in the way of understanding? I’m sure this is why Mark’s work found a readership that extended beyond the narrow field of scholars and the university-educated that some of his more arcane and abstruse interests might have indicated. He didn’t talk down to anyone, ever, but he always invited the reader in, pulled them along with him.
I met Mark for the first time in 1998, having persuaded the academic magazine Lingua Franca to assign me a lengthy piece about Ccru and its allies in renegade academia like O[rphan] D[rift>]. Compared with the deranging strangeness of their texts, Ccru in person were surprisingly mild and, well, British. But again, Mark stood out a little from his comrades, for his sheer intensity. I remember the way his hands shook with passion as he held forth acerbically on everything from the cyberpunk aesthetics of jungle to the decrepitude of socialism. Although he was soft-spoken, you could already see a distinct flair for public speaking, the aura of an orator in the making.
After that, Mark and I rubbed shoulders online as contributors to the post-rave music theory site Hyperdub, founded by Ccru member Steve Goodman, aka Kode9. But full-blown friendship really came when Mark hurled himself into the blogging fray in 2003, starting k-punk a few months after I launched my own Blissblog. With incredible speed, a rough equivalent to the old UK weekly music press reconstituted itself online. Or so I liked to think, anyway: that this was the music press in exile, a reactivation of all of its bygone best aspects that you could no longer find in the surviving print remnants (what passed for NME by then, monthlies like Q). Well, all except for the getting paid aspect. But on the early 2000s blog scene, the broader perspective that the golden-era rock press had — where music held a privileged status, but film, TV, fiction, politics bubbled in the mix too — was miraculously and unexpectedly resurrected.
“It wasn’t only about music and music wasn’t only about music”, is how Mark put it, talking about what NME had meant to him growing up, a working-class boy with limited access to high culture. “It was a medium that made demands on you.” The same applied to the blog circuit, whose population of autodidacts, independent researchers, disenchanted academics (like Mark) and assorted oddballs formulated theories and ransacked the works of famous thinkers for concepts and analytical tools to misuse. In compensation for not being able to make a livelihood off your ravings and rantings, the blogging platform afforded extra powers: incredible speed of response, flexibility of format (you could blog extensive think-pieces or miniature thought-bombs), and the ability to illustrate the writing using images, audio, video. Best of all, there was an interactive and collective aspect to blogging that had only ever been glimpsed in the music press (with the readers’ letter page, writers arguing amongst each other week by week, and the regular uptake of malcontents from the fiery fanzines). The blog circuit was a true network.
Fizzing with fervour, k-punk soon became the hub of the community. Mark was a dynamo, hurling out provocations, ideas that demanded engagement. He became a cult figure. A catalyst. He was also the perfect host, convening a salon-style energy in the k-punk comments section, igniting debate and defusing dispute when things got a little testy (as they inevitably did). That amiably fractious spirit would then inform the message board Dissensus that Mark co-founded with Matthew Ingram (whose Woebot was our community’s other hub). In some ways, it was in these comments sections and message board threads that Mark was in his truest element: arguing, sometimes agreeing, but always building on his interlocutor’s point, pushing the conversation further along. Some of his best insights and lines emerged out of this back-and-forth: jewels that are hard to disentangle from the discursive thicket of their moment, countless brief exchanges and interactions in which his mind flexed itself most playfully and merrily.
During this whole period of the early-to-mid 2000s, I found myself in a pleasantly disorienting situation: someone whom I’d influenced became someone who influenced me. There was a certain edgy excitement to turning on the computer every morning and immediately checking to see what Mark had thrown down in terms of an ideas-gauntlet — a definite feeling of having to keep up. We would often operate as a long-distance double-act (with a five-hour delay, Mark being in London and me in New York). One of us would pick up on something the other had written. It was a complementary (and complimentary) relationship, like a baton being passed back and forth, but it also fairly frequently involved respectful disagreement. Others joined in as well, of course; it was an omnidirectional free for all.
As much as it was a cooperative endeavour, this blog circuit and in particular the dialogue between Mark and myself, there was also an undercurrent of competition. (No doubt this is the case with writers in many different fields). An unusual sensation, for me, of being outflanked and out-done, and having to repeatedly raise my game. In some of our exchanges, Mark had the advantage of seeing things in starker, more black-and-white terms, whereas I tended to see shades of grey, or recognise that there were things to be said in favour of the opposing view. That might be a virtue in real life, but in writing it definitely softens your attack.
Mark could access greater resources of “nihilation” — his term for a ruthless drive, on the part of a critic or an artist, to reject other approaches and condemn them to history’s trashcan. (This dismissiveness was a feature of his print persona, I should add; in person, he was generous and open.) The improviser John Butcher described this mindset from an artist’s point of view in a 2008 interview with the Wire:
This music is here in opposition to other music. It doesn’t all co-exist together nicely. The fact that I have chosen to do this implies that I don’t value what you’re doing over there. My activity calls into question the value of your activity. This is what informs our musical thinking and decision making.
For Fisher and Butcher alike, severity towards “the opposition” is the mark of seriousness, a sign that something is at stake and that differences are worth fighting over. Above all, it is this negative capacity — the strength of will to discredit and discard — that keeps music and culture rattling along furiously in a forward direction, not wishy-washy tolerance and anything-goes positivity. If music-making can be a form of “active criticism”, then criticism could equally be a sort of soundless contribution to music.
To gather my thoughts and clear my head before writing this, I went for a walk. There was something of an English feel to the bright and beautiful February morning in Southern California — a blustery breeze sent huge clouds skidding across the sky, their tufty whiteness shot through with piercing sunlight, creating that somehow crazed quality I associate with partly-cloudy summer days in the UK. I would love to have shown Mark around Los Angeles, shown him some of its other sides (he had some fixed ideas about the city, largely derived from Michael Mann’s Heat and Baudrillard’s America!). And I would love to have been shown around Suffolk, the coastal landscapes Mark loved.
But the amount of time we spent in each other’s physical company was painfully small. It’s possible that the number of meetings is in single figures. Mark and I lived on different continents most of the time we knew each other. That lent a kind of purity to the friendship, based as it was almost entirely on the written word: there was a lot of mental communing via email, inter-blog debate, message boards… but not much hanging out.
That means that what I’ve written here can only be a partial portrait of Mark, as both a public figure and as a man. We knew each other mostly as virtual colleagues and unofficial collaborators (we never co-wrote anything, but we formed a united front in various joint campaigns, like the hardcore continuum, hauntology, and a general anti-retro polemic). Above all, I knew him as a reader. (Again, the disconcerting thing was to become a fan of someone who had been a fan of me). But I know that there are many other Mark Fishers. Mark the teacher, Mark the editor, Mark the son and spouse and father. I encountered him almost entirely in the rarefied realms of discourse — usually fairly fevered discourse — and didn’t get to see that much of him in more casual, everyday modes. I would love to have known better these other Marks — Mark at play, Mark having a laugh, Mark relaxing, Mark with his family.
The last time I saw Mark in the flesh was in September 2012, at the music festival Incubate in Tilburg, Holland. The theme of the festival was do-it-yourself. I gave the keynote, interrogating certain aspects of DIY ideology and wondering whether it had outlived its usefulness as a cultural ideal. Mark was set to follow and spontaneously decided to drop what he was going to say and improvise a new talk, building on my argument. It was like the old blog days, except this time happening in real-time and real-space. Where I had read from a pre-written text that I laced with the occasional ad lib, Mark spoke completely off the cuff, pulling riffs from the formidable arsenal in his brain, generating new thoughts and making electric connections. The performance was typical both of his collegiality and his mental agility. Mark likened it later to a stand-up routine — adding that it was becoming a problem that institutions and individuals were video-recording his talks and putting them up on YouTube, because people would become overly familiar with his material. But I can’t imagine that was really ever going to be a problem: Mark was an inexhaustible font of insight and overview, bubbling over with fresh perceptions and original articulations, memorable maxims and acute aphorisms. He was never going to run out of things to say.
But then Mark ran out of time.
I feel his absence as a friend, as a comrade, but most of all as a reader. There are many days when I wonder what Mark would have had to say about this or that. I hadn’t realised how dependent I had become on the surprises and challenges that Mark would throw out there at regular intervals: the spur and spark of his writing, the clarity he could bring to almost anything he shone his light upon. I miss Mark’s mind. It’s a lonely feeling.
Simon Reynolds, 2018
Editor’s Introduction by Darren Ambrose
“We have to invent the future.”
— Mark Fisher[1]
Mark Fisher (k-punk) was not, and never wanted to be, a conventional academic writer, theorist or critic. His writing was too abrasive, polemical, lucid, unsentimental, personal, insightful and compelling for that. Despite displaying the most acute understanding of the physical, psychological, economic and cultural consequences of contemporary capitalism, his work was also optimistic and strategically operative. A great deal of his writing was undertaken in vehement opposition to the all-too-evident PoMo dissociation and collective alienation around us, and as a response to what he termed the “boring dystopia” of our shared present, to the increasingly depthless present of this new millennium. Mark provided original, savage and stylish dissections of our moribund culture, and continually observed how popular modernist films, books, television and music that had had a lifelong effect on him continue to “haunt” our collective present.
In the 1990s Mark studied for a PhD in philosophy at the University of Warwick, successfully completing his thesis, “Flatline Constructs: Gothic Materialism and Cybernetic Theory-Fiction”, in 1999. Whilst at Warwick he was one of the founding members of the Cybernetic culture research unit (Ccru), along with others such as Nick Land, Sadie Plant, Kodwo Eshun, Steve Goodman (Kode9), Robin Mackay and Luciana Parisi. Several years later, whilst working as an FE teacher in Kent, he began his blog k-punk. k-punk emerged in the early days of blogging and quickly became an important part of a community of emergent bloggers, including music journalists Simon Reynolds, Ian Penman and David Stubbs, philosophers Nina Power, Alex Williams, Lars Iyer, Adam Kotsko, Jodi Dean and Steven Shaviro, writer and activist Richard Seymour, writers Siobhan McKeown and Carl Neville, and architecture critic Owen Hatherley. One of the most vital aspects of blogging for k-punk during those early days was the simple element of re-connection, of becoming involved in a new online collective at a time in his life, after Warwick, Ccru and the PhD, when he was perhaps at his most isolated. As he recalls in a 2010 interview:
I started blogging as a way of getting back into writing after the traumatic experience of doing a PhD. PhD work bullies one into the idea that you can’t say anything about any subject until you’ve read every possible authority on it. But blogging seemed a more informal space, without that kind of pressure. Blogging was a way of tricking myself back into doing serious writing. I was able to con myself, thinking, “it doesn’t matter, it’s only a blog post, it’s not an academic paper”. But now I take the blog rather more seriously than writing academic papers.[2]
And in a post written exactly a year after launching the blog, he writes:
It’s been my only connection to the world, my only outside line… It’s reinvigorated my enthusiasm for so many things, and pricked my enthusiasm for things I’d never previously considered…[3]
The early years of the k-punk blog were marked by intensity and informality, with regular swathes of writing and numerous dialogues. k-punk traced the positive effects of regular writing, enabling Mark to access the depths of his own obsessions and interests and refine his own powerful style. As he developed he began to connect into a kind of thematic rhythm, and over the years one begins to see his thoughts coalesce around the themes of hauntology, popular modernism and capitalist realism. As he acknowledges, k-punk was written as a way for him to escape from the imposed bonds and pointless strictures of academic writing. Dense, allusive, theoretically rich and abrasive posts were written in response to a consistent set of personal obsessions and external stimuli (a film, book or album to be reviewed; an event to be contextualised or theorised), often with a real sense of urgency, driven by the need to participate in an ongoing dialogue or by a self-imposed deadline. These k-punk posts encapsulated an intellectual moment of reflection on the world: they are responsive, immediate, and provide an affectively charged perspective. Some of his references and allusions are undoubtedly challenging and potentially intimidating — Spinoza, Kant, Nietzsche, Marcuse, Adorno, Althusser, Deleuze and Guattari, Baudrillard, Jameson, Žižek, Zupančič, Berardi, Badiou, Lacan — but his writing is never marked by the zealous pedantry exhibited by so much academic writing in the theoretical humanities. Mark has faith in the intelligence and rationality of his readers; he trusts their capacity to be challenged by unfamiliarity, complexity and the new. He consistently displays the courage to take up a strong theoretical and practical position. His work rows against the tide of anti-intellectualism in the present, which has tried so hard to flatten things out to a level of cretinous instrumentality and utilitarian stupidity.
For us, the readers of the k-punk blog, his writing always gives us reasons for continuing, against the odds, to hope for an alternative to the dystopian present. This is not just a consequence of the specific content of his writing, but has as much to do with the fact that he was there at all, with the persistence of his provocative and challenging voice. That voice — strident, angry, fiercely intelligent, sophisticated and enthusiastic, serious and animated — is so close to the way he actually spoke in person. There is always a real intimacy to his writing, and he possessed a rare talent for articulating his thoughts and ideas without the written words diminishing, softening or reducing them. Mark’s voice is preserved in his work, and it is preserved online.
The first two k-punk posts in this collection, “Why K?” and “Book Meme”, both written in 2005, offer us some precise insight into Mark’s reasons for blogging as k-punk, together with an understanding of his operative objectives and ambitions.
One of these is the ongoing belief in simply grasping the new technological democracy of blogging and using it as a “kind of conduit for continuing trade between popular culture and theory”. k-punk’s belief in the importance of outsider forms of discourse never wavered. There is a consistent belief in the operative effectiveness of fugitive discourses which have been legitimated by neither the official channels of the establishment (via academia or mainstream media outlets) nor traditional forms of publishing. In the early days of k-punk this was something he particularly came to associate with blogging:
All that is lacking is the will, the belief that what can happen in something that does not have authorisation/legitimation can be as important — more important — than what comes through official channels.[4]
Another is his declared fidelity to the ideas of Kafka and Spinoza, along with his allegiance to a whole host of things which, by articulating and confirming his own perceptions, modernist sensibilities and thoughts of existential detachment, alternative possibilities and perspectives, he felt had first brought him to a degree of self-awareness: Joyce, Burroughs, Ballard, Beckett, Selby. The early books, the music, the television, the films, the ideas, the events. The Miners’ strike, the Falklands war, Thatcherism, Blairism, post-punk, Joy Division, the Fall, Scritti Politti, Magazine, acid house, jungle, Goldie, Deleuze and Guattari, Marx, Jameson, Žižek, Foucault, Nietzsche, the Ccru, Cronenberg, Atwood, Priest, M.R. James, Nigel Kneale, Marcuse, Penman, Reynolds, Batman, and in later years The Hunger Games, Burial, Sleaford Mods, the Caretaker and Russell Brand. The performativity of the k-punk blog was, at least in part, undertaken as part of a personal survival strategy, to regain and persistently re-state fidelity to those things in which he had discovered the original ideas that had animated and inspired him. He says as much in “Book Meme”:
The periods of my adult life that have been most miserable have been those in which I lost fidelity to what I discovered then, in the pages of Joyce, Dostoyevsky, Burroughs, Beckett, Selby…[5]
This fidelity is key, because it provides the animating fibre that underpins the vast collage which he somehow synthesises into an effective and operative worldview opposed to the tedious banalities of a present ruled by the merciless logic of what he came to term “capitalist realism”, where alternative possibilities have become increasingly proscribed and reduced to almost nothing. His forensic attention was so brilliantly attuned to those often unnoticed traces of modernism in texts, music, films and television programmes, and he repeatedly worked out acute readings of popular culture as if piecing together, one by one, the fragmented pages of a lost manifesto of cultural alchemy necessary for challenging the disastrous tyranny of the present. For more than a decade, k-punk served as a critical epicentre for hauntology and counterculture, highlighting the latent radical potential of a lost and receding twentieth-century modernism. The k-punk blog warped the reality field by repeatedly and incisively piercing through the drab fabric of the early years of this new millennium. k-punk brought important, uncomfortable and original challenges to the present, interrupting it with shards of difference, elements of the past that remained out of joint with the present, that disrupted the inertial logic of our times. At its very best it served to effectively resist the tendency for things to settle into a hazy homogenous present where time equals capital, and where everything is flattened out into commodified and easily digestible consumables. It did this by keeping open a space for alternative possibilities, and by refusing to allow important things to become reduced to the mediocre, the banal, the downright stupid, and the boring.
Finally, there is his exemplary antipathy and negativity towards PoMo hyper-ironic posturing, dreary hopey-feely liberal leftism, delibidinised culture, upper-class superiority, vampiric trolls, vitalist positivity, and Deleuzo-creationism, which was always evident on the k-punk blog. The sentiments and positioning of his challenging and controversial piece “Exiting the Vampire Castle”[6] in 2013 are no less evident in earlier k-punk posts like “New Comments Policy”[7] in 2004 and “We Dogmatists”[8] in 2005. His savage, cold-rational polemics are often in full force throughout the life of the k-punk project. Take this for example from “Noise as Anti-Capital”:
THERE IS NO DIGNITY. Don’t confuse the working class with the proletariat. Thatcher inhibits the emergence of the proletariat by buying off the working class with payment capital and the promise of owning your own Oed-I-Pod. The comforts of slavery. She gives the replicants screen memories and family photos. So that they forget that they were only ever artificial, factory farmed to function as the Kapital-Thing’s self-replicating HR pool, and begin to believe that they are authentic human subjects. The proletariat is not the confederation of such subjectivities but their dissolution in globalised k-space. The virtual population of nu-earth… The heroism of the proletariat consists not in its dignified resistance to the inorganic-inhumanity of the industrialisation process… but in its mutative Duchamp-transformation of its body into an inhuman inorganic constructivist machine.[9]
In 2007, Mark left his teaching job in Kent and moved to Woodbridge in Suffolk, where he began working on what would become his first book, Capitalist Realism: Is There No Alternative?, published in 2009 with Zer0 Books, the alternative publisher he had co-founded with his friend Tariq Goddard.[10] This book, which synthesised some of his strongest early posts on k-punk, firmly established Mark’s reputation as one of the most important contemporary theorists. The publication of Capitalist Realism crystallised new forms of collective connection for Mark at the very point the original blogging community had begun to fragment and become a much more fractious and difficult space in which to operate. Although Mark continued to blog as k-punk, he did so less and less frequently as he pursued his work, thought and activism in the form of invited lectures, talks and Q&As, as well as in the pieces he wrote as a freelance writer to earn a living. Mark’s frustration with the direction that blogs and forums had taken is clearly evident in a number of charged and polemical k-punk posts regarding the etiquette of comments and discussion, as well as his numerous interventions found on online forums.
After a precarious period of freelance writing Mark began to teach again, offering courses in philosophy for City Lit and the University of East London, and later held a permanent position at Goldsmiths. He went on to produce two further volumes of published work, Ghosts of My Life for Zer0 in 2014 and The Weird and the Eerie for his newly formed Repeater Books at the end of 2016. Both volumes cull work from k-punk and from his commissioned reviews and interviews for the Wire, a magazine for which he had acted as Deputy Editor in 2008. During this period he also published significant amounts of writing for online and print publications in the form of book chapters, music reviews, film reviews, opinion pieces, pieces of activism and theoretical essays, as well as continued posts for k-punk. It is the aim of this volume to bring together for the first time a significant portion of this writing.
The editorial task of putting together this collection was, at times, akin to a peculiar form of digital archaeology and memento mori. It involved excavating a decade of digital layers, often confronting digital lacunae where lost dialogues had to be reconstructed from online fragments, their interlocutors having gone missing. Sometimes the task was one of recovering lost memories from pages haunted by dead links, a task that felt oddly appropriate given Mark’s emphasis on hauntology. It was accompanied by a strange melancholy, contemporary since it had been birthed by the first layers of the online age, but also weirdly resonant with previous forms of melancholy and memory associated with found photographs, diary fragments, carvings, etchings, paintings. I will confess that I had some trepidation that much of the writing by k-punk from the last decade, once excavated, would have lost something with the inexorable passing of time. But I quickly discovered that this is in fact not the case. There is so much here that remains vital, fascinating, inspiring and insightful, and a great deal of this writing is gathered together for this collection.
There were inevitably difficult decisions about what to include and what to exclude. It was immediately evident that there is a vast reservoir of writing, almost overwhelming in terms of its scale and scope. Yet what also became sadly evident was its sheer finitude. As one reads through Mark’s collected writings one is simply led to the incredibly sad realisation that this is all there is, and all there ever will be. As David Stubbs wrote for the Quietus, Mark’s death “leaves a gaping crater in modern intellectual life”.[11] The loss of Mark’s voice from the present situation is incalculable, let alone his companionship, camaraderie, and friendship. I will now provide a few words about the working criteria for inclusion that governed the assembly of this volume.
The first priority was to avoid unnecessary repetition of material already published in his three books — Capitalist Realism, Ghosts of My Life and The Weird and the Eerie — together with his essays in the two edited and co-edited volumes on Michael Jackson and post-punk.[12] As was noted above, he had culled material for all of these publications from k-punk and his writings in the Wire, and therefore much of that material is excluded from this collection. The exceptions are original posts which had been significantly truncated or modified by Mark for subsequent publication in these volumes; these pieces are clearly indicated in the endnotes.
There was a clear need to reflect the full range of his k-punk writing on many different subjects from 2004-2016, rather than concentrate on just his music or film writing. The aim was always to provide as comprehensive a picture as possible of the blog throughout its lifetime by selecting pieces that reflect its eclectic content, its theoretical pluralism and, most of all, its remarkable consistency. His earlier pre-k-punk writing, including that undertaken as part of the Ccru, is not included here. For this collection the decision was made to concentrate solely on his writing after the inception of the k-punk blog, and for his earlier work to be the province of another volume. A very small number of early k-punk posts, e.g. on antinatalism, were excluded because they seemed wildly out of step with Mark’s overall theoretical and political development, and because they seemed to reflect a temporary enthusiasm for a dogmatic theoretical misanthropy he repudiated in his later writing and life. There was also a need to represent the sheer breadth of Mark’s other writings — his freelance reviews, commissions and activism, including his writing for the Wire, Frieze, New Humanist, the Visual Artists’ News Sheet, Electronic Beats, the Guardian, etc. Sometimes these resonate with and reflect on pieces he published on k-punk, but more often he wrote on a range of subjects and themes not found on the blog.
The pieces chosen for the collection needed to be of sufficient length and depth to work in a published volume — there are so many insightful one- or two-paragraph blog posts on k-punk that are excluded for that reason. Their effectiveness is largely down to the context of the blog architecture and community, whether they are interventions in ongoing online conversations and dialogues, or immediate response pieces to something happening in the media, online, or in the everyday. There are, however, a few exceptions to this where it just seemed criminal not to include them, usually because they were exemplary bursts of critical savagery or brilliant pieces of brutal polemic.
For the purposes of this single-volume collection, there was a need to abstract k-punk posts, to a certain extent, from the old blog community architecture, and yet, at the same time, try to retain a certain flavour of that community of blog writing. In abstracting posts it was important not to completely lose sight of the fact that they were originally written as blog pieces, and to try and retain a sense of their immediacy, informality, collaborative qualities and the sense of them as being part of an ongoing online continuum. This was a delicate balance to achieve, sometimes complicated by the simple fact that many of his online blog interlocutors have long since disappeared from the web or abandoned their blogs, leaving them like ghost ships in cyberspace. Where this balance was simply not possible, pieces have been excluded. The advantage of a collection of writings such as this lies in the fact that it allows readers to access a vast range of Mark’s work on a whole host of different topics and themes, all in one place. It allows for an appreciation of the sheer scale, depth and originality of his work. However, I have been conscious of the peculiar disadvantages of doing this with Mark’s writing, the vast majority of which appears on his k-punk blog, in that it extracts and abstracts his work from the very specific context of the blog — its immediacy, its dialogic nature, its hyperlinked architecture and its sense of a holistic continuum. I have tried very hard to retain some important aspects of that particular quality of his writing in transposing it to a collection such as this, and where possible I took guidance from the way Mark himself carried this out when he culled his k-punk writing for his published books. All titles of the pieces are either Mark’s own or those of the editors of the publications for which he wrote. Each piece includes a reference indicating when and in which publication it first appeared. The spelling and punctuation have, with some exceptions, been regularised and rendered consistent across the collection.
Rather than simply arrange the writing in chronological order, the decision was made to separate the work into several different themed sections, each arranged chronologically. This decision clearly has the advantage of thematic coherence for the reader, but the disadvantage is that it obviously creates an artificial division between the k-punk posts, articles, reviews and essays, and breaks up the hyperlinked quality of much of his writing. As one reads across these pieces one will discover the clear theoretical resonances between pieces written on, say, film, music or political activism in 2006 or 2007, where Mark is weaving the operative influence of a particular theorist’s ideas or set of principles between a range of different topics and themes at the same time. There is a slight tendency for this quality to be lost in the thematic arrangement of this volume, but efforts to mitigate this have been made in the footnotes where possible. However, despite arranging the writing thematically, the consistent threads of his work across the different topics remain absolutely evident. These include Mark’s ongoing fidelity to theory as a means of providing different and challenging perspectives, alternative narratives and important truths. One also finds across all of the work a consistent concentration on the themes of class and collectivism, precarity and insecurity, depression and mental illness, dogmatism and purpose, attempts to trace post-capitalist forms of desire, the need to expose egregious forms of reality management, efforts to express forms of collective memory, and a nostalgia for popular modernism, hauntology and lost futures. I feel confident that the thematic arrangement of his work in this volume does nothing to diminish the sheer lucidity of his work in this regard.
It is also the case that one of the most significant qualities of online writing, namely the ability to embed shortcut hyperlinks to references and sources, is somewhat difficult, if not downright impossible, to reproduce in a traditionally published book. However, where possible, every effort has been made to track down and reproduce in the footnotes all of the surviving hyperlinks contained in k-punk’s posts and other online writing.
Finally, it was important to include any unpublished work. What is included here represents all that is considered unpublished, and takes the form of the final and sadly unfinished k-punk post “Mannequin Challenge”, which addresses the US presidential election of 2016; a piece titled “Anti-Therapy”, taken from a talk Mark gave in 2015, which was then translated into German and published as part of a collection by Matthes & Seitz and appears here in English for the first time; and the unfinished introduction to Mark’s proposed book titled Acid Communism. The introduction to this final proposed work is extremely suggestive and significant. It is clearly the case that a great deal of Mark’s work was written in response to Fredric Jameson, in particular Jameson’s work on postmodernism (in Postmodernism or, The Cultural Logic of Late Capitalism and A Singular Modernity). Mark repeatedly notes Jameson’s extraordinary prescience, back in 1991, in identifying and analysing the postmodern condition:
Jameson [writes of] a depthless experience, in which the past is everywhere at the same time as the historical sense fades; we have a “society bereft of all historicity” that is simultaneously unable to present anything that is not a reheated version of the past.[13]
And:
Could the only opposition to a culture dominated by what Jameson calls the “nostalgia mode” be a kind of nostalgia for modernism?[14]
What becomes evident in his late work on acid communism is how he became more and more drawn to a problematic identified by Jameson between Marcuse’s cultural exceptionalism or the “semiautonomy” of the cultural realm — “its ghostly, yet Utopian, existence, for good or ill, above the practical world of the existent” — and the question of whether this semiautonomy of the cultural sphere “has been destroyed by the logic of late capitalism”.[15] As Mark knew only too well, for Jameson this dissolution of an autonomous sphere of culture does not imply its disappearance. Rather, its dissolution is to be
imagined in terms of an explosion: a prodigious expansion of culture throughout the social realm, to the point at which everything in our social life — from economic value and state power to practices and to the very structure of the psyche itself — can be said to have become “cultural” in some original and yet untheorised sense… Distance in general (including “critical distance” in particular) has very precisely been abolished in the new sphere of postmodernism. We are submerged in its henceforth filled and suffused volumes to the point where our new postmodern bodies are bereft of spatial coordinates and practically (let alone theoretically) incapable of distantiation… It is precisely this whole extraordinary demoralising and depressing original new global space which is the “moment of truth” of postmodernism.[16]
And Mark, in his unfinished introduction to Acid Communism, insists upon revisiting the issue of the utopian vision that is the legacy of the Sixties, taught by Marcuse and raised here by Jameson:
The ultimate political coordinates of the problem of the evaluation of postmodernism is one of the Utopian impulses to be detected in various forms of the postmodern today. One wants to insist very strongly on the necessity of the reinvention of the Utopian vision in any contemporary politics: this lesson, which Marcuse first taught us, is part of the legacy of the Sixties which must never be abandoned in any reevaluation of that period and of our relationship to it.[17]
Here, towards the end of his life, in the unfinished introduction to Acid Communism, we find Mark reaching for “a new humanity, a new seeing, a new thinking, a new loving: this is the promise of acid communism”.
Covering over a decade of writing, this collection reads like a trip through recent cultural and political history with Mark as your guide. Hopefully these writings, together with his other published books, will serve as a further reminder of Mark’s own utopian vision, of how vital, necessary and exciting k-punk was, of his fidelity to the possibility of alternatives, and simply of how much we all have lost.
Darren Ambrose
Whitley Bay, February 2018
why k?
1. Why I started the blog? Because it seemed like a space — the only space — in which to maintain a kind of discourse that had started in the music press and the art schools, but which had all but died out, with what I think are appalling cultural and political consequences. My interest in theory was almost entirely inspired by writers like Ian Penman and Simon Reynolds, so there has always been an intense connection between theory and pop/film for me. No sob stories, but for someone from my background it’s difficult to see where else that interest would have come from.
2. Because of that, my relation to the academy has always been uh difficult. The way in which I understood theory — primarily through popular culture — is generally detested in universities. Most dealings with the academy have been literally — clinically — depressing.
3. The Ccru as an entity was developed in hostile conditions as a kind of conduit for continuing trade between popular culture and theory. The whole pulp theory/theory-fiction thing was/is a way of doing theory through, not “on”, pop cultural forms. Nick Land was the key figure here, in that it was he who was able to hold, for a while, a position “within” a university philosophy department whilst dedicatedly opening up connections to the outside. Kodwo Eshun is key as someone making connections the other way — from popular culture INTO abstruse theory. But what we all concurred upon was that something like jungle was already intensely theoretical; it didn’t require academics to judge it or pontificate upon it — the role of a theorist was as an intensifier.
4. The term k-punk came out of Ccru. “K” was used as a libidinally preferable substitution for the California/Wired captured “cyber” (the word cybernetics having its origins in the Greek, Kuber). Ccru understood cyberpunk not as a (once trendy) literary genre, but as a distributive cultural tendency facilitated by new technologies. In the same way, “punk” doesn’t designate a particular musical genre, but a confluence outside legitimate(d) space: fanzines were more significant than the music in that they allowed and produced a whole other mode of contagious activity which destroyed the need for centralised control.
5. The development of cheap and readily available sound production software, the web, blogs means there is an unprecedented punk infrastructure available. All that is lacking is the will, the belief that what can happen in something that does not have authorisation/legitimation can be as important — more important — than what comes through official channels.
6. In terms of will, there has been an enormous retrenchment since 1970s punk. The availability of the means of production has seemed to go alongside a compensatory reassertion of spectacular power.
7. To return to the academy: universities have either totally excluded or at least marginalised not only anyone connected with Ccru but also many who were at Warwick. Steve “Hyperdub” Goodman and Luciana Parisi are both Ccru agents who have managed, against the odds, to secure a position within universities. But most of us have been forced into positions outside the university. Perhaps as a result of not being incorporated (“bought off”), many in the Warwick rhizome have maintained an intense connection and robust independence. Much of the current theoretical drift on k-punk has been developed via a collaboration with Nina Power, Alberto Toscano and Ray Brassier (co-organiser of the NoiseTheoryNoise conference at Middlesex University last year). The growing popularity of philosophers like Žižek and Badiou means there is now an unexpected if rogue and fugitive line of support within the academy.
8. I teach Philosophy, Religious Studies and Critical Thinking at Orpington College. It is a Further Education college, which means that its primary intake is sixteen to nineteen year-olds. This is difficult and challenging work, but the students are in the main excellent, and far more willing to enter into discussion than undergraduates. So I don’t at all regard this position as secondary or lesser than a “proper” academic post.
PART ONE: METHODS OF DREAMING: BOOKS
book meme
At least two people have asked me to do this, so here — at last — goes.
1) How many books do you own?
No way of knowing. Certainly can’t count them and have no reliable way of calculating.
2) What was the last book you bought?
The Sex Appeal of the Inorganic, Mario Perniola.
3) What was the last book you read?
Read and finished: Michael Bracewell’s England is Mine — disappointing and frustrating. There are flashes of insight but the organisation of the book seems to change from chapter to chapter; at one moment the narrative is historical, the next it is thematic, and then regional. There is a sense of always just approaching the time when things are happening or just having missed it. Can’t help thinking that Bracewell will benefit from a more focused subject matter, which is why I’m still looking forward to his Roxy book, due out later this year. (And there’s way too much attention paid to Eng Lit: nothing will ever interest me in W.H. Boredom, for instance.)
Finishing: Houellebecq’s Atomised. No wonder Žižek likes this one. Is there a better savaging of desolate hippie hedonism and its pathetic legacy in New Age zen bullshit?
4) Five books that mean a lot to me.
(I hate all those surveys of best films/books/LPs which have the Latest Thing at the top, so I have only allowed myself to select books that have meant something to me for at least a decade.)
Kafka: The Trial, The Castle
Is it possible to reproduce, later in life, the impact that books, records and films have between the ages of fourteen and seventeen? The periods of my adult life that have been most miserable have been those in which I lost fidelity to what I discovered then, in the pages of Joyce, Dostoyevsky, Burroughs, Beckett, Selby… Any of those could have been selected, but I choose Kafka, because of all of them, it is he who has been the most intimate and constant companion.
I actually encountered Kafka first in a Penguin compendium of The Novels of Franz Kafka that my parents, who knew very little about literature, bought me for Christmas because they thought “it looked like my kind of thing”. So it proved.
It’s difficult for me now to remember how I first received the text. Whether I initially enjoyed it or was frustrated by it I couldn’t say. Kafka, after all, is a writer who doesn’t waylay you. He invades subtly, slowly. I imagine that at the time I wanted and expected a more straightforward statement of existentialist alienation. Yet there was very little of that in Kafka. This was not a world of metaphysical grandstanding but a seedy, cramped burrow, whose ruling affect is not heroic alienation but creeping embarrassment. Physical force plays almost no role in Kafka’s fictions — it is the ever-present possibility of social shaming that is the motive force of his winding non-plots.
Remember the pitiful scenes in The Trial when K, looking for the court in an office block, knocks in turn on each door, making the pathetic excuse that he is a “house painter”? Kafka’s genius consists in banalising the absurdity of this: surprisingly, against all our expectations, it is indeed the case that K’s hearing is taking place in one of the apartments in the building. Of course it is. And why is he late? The more absurd K thinks things are, the more embarrassed he becomes for failing to understand “the ways” of the Court or of the Castle. The bureaucratic convolutions appear ridiculous and frustrating to him, but that is because he “has not understood” yet. Witness the comedy of the opening scenes of The Castle, which are less an anticipation of totalitarianism than of call centres, in which K is told that the telephones “function like musical instruments”. What kind of an idiot is he, if when he phones someone’s desk, he expects them to answer? Is he so wet behind the ears?
It’s not for nothing that Alan Bennett, the laureate of embarrassment, is an ardent admirer of Kafka. Both Bennett and Kafka understand that, no matter how absurd their rituals, pronunciations, clothes might appear to be, the ruling class are unembarrassable: not because there is a special code which only they understand — there is no code, precisely — but because whatever they do is alright, since it is THEM doing it. Conversely, if you are not of the “in-crowd”, nothing you can do could EVER be right; you are a priori guilty.
Atwood: Cat’s Eye
A while back, Luke asked me what an example of “cold rationalist” literature would look like. Atwood, with her reputation for coldness, is an obvious answer, but in truth, more or less all literature is cold rationalist. Why? Because it allows us to see ourselves as chains of cause and effect and thereby, paradoxically, to attain the only measure of freedom available to us. (Even Wordsworth, who admired Spinoza, described poetry as “emotion recollected in tranquility”, i.e. not raw emotion expressed in some Dionysian ejaculation.)
Cat’s Eye isn’t my favourite Atwood novel — that would be the stark Surfacing — but it is the one that means most to me. I don’t even remember all of the plot; what I will never forget are Atwood’s horribly vivid descriptions of the pitiless Hobbesian cruelty of teenage “friendships”. They walk behind you so that they can criticise your shoes, the way you walk… They are worse than your worst enemies. The long days, the breakfast toast turning to cardboard in your mouth, the anxiety so sharp and constant that you forget it is there, no longer even register it.
Are your most formative years those of your early childhood or your early teens? Reading Cat’s Eye in my early twenties was a kind of auto-psychoanalysis, a way out of the misanthropy, suppressed rage and cosmic sense of inadequacy that had been the legacy of my teenage years. Atwood’s icy analysis beautifully demonstrated that the humiliations of those teenage years were a structural effect of teenage relationships, not at all anything specific to me.
Spinoza: The Ethics
Spinoza changes everything, but gradually. There is no “road to Damascus” conversion to Spinozism, only a steady but implacable deletion of default assumptions. As with all the best philosophy, reading it is like running a Videodrome cassette: you think you are playing it, but it ends up playing you, effecting a gradual mutation of the way you think and perceive.
I’d been attracted to Spinoza as an undergraduate, but I only really read him at Warwick, under the influence of Deleuze. We spent over a year poring over The Ethics in a reading group. Here was a philosophy that was at once forbiddingly abstract and immediately practical, pitched at both the largest conceivable cosmic scale and the minutiae of the psyche. The “impossible” bringing together of structural analysis and existentialism?
Ballard: The Atrocity Exhibition
If Spinoza and Kafka were slow-acting, Ballard’s impact was instant. He connected immediately with an unconscious saturated in media signal.
That was partly because I had in effect encountered Ballard long before I had actually read any of his work: in Joy Division (though more in Hannett’s sound than in many of the lyrics; the song “The Atrocity Exhibition”, with its anguished pleading, couldn’t be further from Ballard’s dispassionate sobriety), in Foxx and Ultravox, in Cabaret Voltaire, in Magazine.
The Drowned World is the best of his disaster novels, inundated London as a literalised surrealist landscape coolly surveyed by a latter-day Conrad, but it is The Atrocity Exhibition that is indispensable. Much more than the better-known Crash, The Atrocity Exhibition provided a conceptual and methodological repertoire for approaching the twentieth century assembled from the century’s own resources. It is austerely modernist, making little concession to either plot or character, more like a fictive sculpture than a story, an obsessively repeated series of patterns.
Yes, Ballard has been accepted into the review columns, become an elder statesman, but let’s not forget how different his background was from the standard Oxbridge man of letters. Ballard rescued Britain from Eng Lit, from “decent” humanist certainties and Sunday supplement sleepiness.
Greil Marcus: Lipstick Traces
I’ve written before about the importance of this book to me. I read it when I had just finished university, no plans, the future collapsing into a grim attempt — bound to fail — to commensurate myself to the Thatcherite economic reality principle. Marcus’ vast web of connections opened up an escape route. It was a description of a transhistorical Event, a break-out embracing anabaptists, situationists, dadaists, surrealists, punks. Such an Event was the exact opposite of the Grand Spectacles of the Eighties, the scripted and organised Non events which played out on global television with Live Aid at their epicentre. It was fugitive, secret, even when — necessarily — massively collective. Lipstick Traces was sure that pop can only have any significance when it ceases to be “just music”, when it reverberates with a politics that has nothing to do with capitalist parliamentarianism and a philosophy that has nothing to do with the academy.
Lipstick Traces is itself best read as part of a textual rhizome which attempted to register, a decade or more on, the impact of punk. See also Vague magazine (if you are looking for one of the most powerful triggers for Ccru-style cyberpunk theory, check out Mark Downham’s pieces in Vague), Savage’s England’s Dreaming. (This set not really complete until Rip It Up of course.)
5) Tag five people.
I can’t think of one other blog that hasn’t done this, so I’m stuck.
space, time, light, all the essentials — reflections on j.g. ballard season (bbc four)
Like his admirer Jean Baudrillard,[2] Ballard has for a long time resembled a rogue AI, re-permutating the same few themes ad infinitum, occasionally adding a sprinkling of contemporary detail to freshen up a limited repertoire of fixations. Fixations, fixations. Appropriate, since, after all, Ballard’s obsession is… obsession.
In the BBC Four profile — nothing new here, the old man gamely and tirelessly going over his favourite riffs, once again — Ballard repeated one of his familiar, but still powerfully sobering observations. People often comment on how extreme his early life was, Ballard said. Yet, far from being extreme, that early life — beset by hunger, fear, war and the constant threat of death — is the default condition for most human beings on the planet, now and in every previous century. It is the comfortable life of the Western Suburbanite which is in every way the planetary exception.
Thus Home, BBC Four’s brilliant adaptation of Ballard’s short story “The Enormous Space”.[3] Home is the kind of thing the BBC used to excel at: drama that was genuinely, unsettlingly weird without being insufferably, unwatchably experimental. Not that Home has much hope of appealing to popular taste stuck away on BBC Four, of course. A sign of the times.
Home revealed itself to be a perverse cousin of the suburban drop-out situation comedy, The Good Life or The Fall and Rise of Reginald Perrin spliced with Polanski’s Repulsion. (No surprise to see director Richard Curson-Smith name-checking Polanski as an influence.) Antony Sher was superbly, charmingly unhinged as Gerald Ballantyne, an accident victim who, instead of returning to work after his convalescence, decides to embark upon an experiment. “Decides” is no doubt too active a word; in every respect the typical Ballard character, Gerry discovers rather than initiates, finds himself drawn into a logic he is compelled to investigate. (In many ways a faithful Freudian, Ballard has no doubt that obsession always has/is a logic.)
The experiment, it turns out, has a simple premise. Gerry will stay indoors, indefinitely, living off the supplies of his well-stocked larder and freezer until… until what? Well, that is what the experiment will establish. Can he survive by “using his front door as a weapon”? What unfolds is the descent into the maelstrom Ballard has explored since High-Rise and Concrete Island, a quest to the outer edges of the human that follows a well-defined sequence, whose stages can be readily enumerated:
A letting go of the old identity. This is given up easily. Ballard’s twist on the disaster novel as far back as The Drowned World lay in the readiness of his characters to embrace rather than resist the new conditions which catastrophe had visited upon them. Ever since High-Rise, Ballard has seen characters going one step further, actually initiating disaster as a revolt, not so much against conformity, as against air-conditioned comfort. Here, Gerry burns all his correspondence, his photographs, then his birth-certificate and — in the most sacrilegious act of all, which mortified my Protestant soul — his money.
The loosening of the hold of civilisation [Bataille phase]. Ballard is endlessly rewriting Civilisation and its Discontents [4], and his fictions are attempts to imagine a libidinal utopia in which the pay-off between survival and repression spelled out by Freud’s mordant pessimism is somehow circumvented. The return to savagery, and even the experiencing of raw hunger pangs, are eagerly savoured opportunities to relax civilisation’s impulse control and neutralising of affect. In Home, when Gerry’s conventional food supplies are running low, he turns first to the flowers in his garden and then to his neighbours’ pets. The scene in which Gerry’s neighbour questions him — in that middle-class ever-so-slightly-insinuating way — about the disappearance of his dog “Mr Fred” and his wife’s cat is a masterpiece of grisly comedy. “Perhaps they’ve eloped,” Sher gibbers, by then constantly on the edge of all-but illegible hysteria. Laughter, a strange, snorting, sniffling chortle that he can barely contain: it is that laughter which signals, more than anything else, that Gerald has left polite society, never to return.
The exploration of the transcendental beyond [Kant/Blake phase]. I mean transcendental in its strict Kantian sense, of course. Ballard likes to refer to this as his exploration of inner space, but I have always found this to be a profoundly misleading description. Much more than astronauts floating in empirical space, it is the “Outer” which Ballard’s suburban cosmonauts investigate: what they confront is time and space themselves, as preconditions of all perceptions and experiences, and what their explorations open up is an intensive zone beyond — outside — standard perceptual thresholds. Hence Home becomes an aberrant version of The Incredible Shrinking Man. Cut off from the world beyond his door — I refuse to call it the outside world — Gerry’s sense of space massively expands. “The rooms are getting bigger.” The attic becomes an antarctic “white world” of blank, freezer-burning vastness, the irruption of the transcendental outside into the empirical interior of the house, now a very cosmos, teeming with texture and previously unsuspected detail. “I feel like an explorer, or an astronaut.”
Curson-Smith’s use of the video-diary format gave the film a queasy intimacy and a suitably unheimlich relation to Pop TV now, something underscored by Gerry’s sign-off remarks about undertaking the “ultimate home makeover.” Yes, that’s one way to make the most of your space.
The man whose head expanded. “Are you on drugs, Gerald?”
And self-denial, starving, the withdrawal from company, it’s all very topical. I wonder — I hope — something Gerald-like is going on in David Blaine’s head right now.[5]
why i want to fuck ronald reagan
At the 1980 Republican Convention in San Francisco, pranksters reproduced and distributed the section of The Atrocity Exhibition called “Why I Want to Fuck Ronald Reagan”,[2] without the title and adorned with the Republican Party seal. “I’m told,” Ballard reports, “that it was accepted for what it resembled, a psychological position paper on the candidate’s subliminal appeal, commissioned from some maverick think tank.”[3]
What does this neo-Dadaist act of would-be subversion tell us? In one sense, it has to be hailed as the perfect act of subversion. But, viewed another way, it shows that subversion is impossible now. The fate of a whole tradition of ludic intervention — passing from the Dadaists into the Surrealists and the Situationists — seems to hang in the balance. Where once the Dadaists and their inheritors could dream of invading the stage, disrupting what Burroughs — still very obviously a part of this heritage — calls the “reality studio” with logic bombs, now there is no stage — no scene, Baudrillard would say — to invade. For two reasons: first, because the frontier zones of hypercapital do not try to repress so much as absorb the irrational and the illogical, and, second, because the distinction between stage and offstage has been superseded by a coolly inclusive loop of fiction: Reagan’s career outstrips any attempt to ludically lampoon it, and demonstrates the increasing pliability of the boundaries between the real and its simulations. For Baudrillard, the very attacks on “reality” mounted by groups such as the Surrealists function to keep the real alive (by providing it with a fabulous dream world, ostensibly entirely alternative to but in effect dialectically complicit with the everyday world of the real). “Surrealism was still in solidarity with the real it contested, but which it doubled and ruptured in the imaginary.”[4] In conditions of third (and fourth-order) simulacra, the giddy vertigo of hyperreality banalises a coolly hallucinogenic ambience, absorbing all reality into simulation. Fiction is everywhere — and therefore, in a certain sense, eliminated as a specific category. Where once Reagan’s own role as actor-president seemed “novel”, in his subsequent career — in which moments from film history became montaged, in Reagan’s own hazy memory and in media accounts, with Reagan’s roles in particular movies — the ludic becomes the ludicrous.
The apparent acceptance, by the Republican delegates, of the genuineness of the “Why I Want to Fuck Ronald Reagan” text, is both shocking and oddly predictable, and both responses are in fact a testament to the power of Ballard’s fictions, which resides no more in their ability to mimetically reflect a pre-existing social reality than it does in their capacity to imaginatively overturn it. What Ballard achieves, rather, is what Iain Hamilton-Grant calls “realism about the hyperreal”, a homeopathic participation in the media-cybernetisation of reality in late capitalism. The shock comes when we remind ourselves of (what would seem to be) the radical aberrance of Ballard’s material. “Why I Want to Fuck Ronald Reagan”, like many of the sections of The Atrocity Exhibition, particularly in the latter part of the novel, is presented as a report on experiments into audience responses to prepared media stimuli.
Ronald Reagan and the conceptual auto-disaster. Numerous studies have been conducted upon patients in terminal paresis (G.P.I.), placing Reagan in a series of simulated auto-crashes, e.g. multiple pile-ups, head on collisions, motorcade attacks (fantasies of Presidential assassinations remained a continuing preoccupation, subjects showing a marked polymorphic fixation on windshields and rear-trunk assemblies). Powerful erotic fantasies of an anal-sadistic character surrounded the image of the Presidential contender.[5]
But this shock is counterposed by a sense of predictability arising from the cool elegance of Ballard’s simulations. The technical tone of Ballard’s writing — its impersonality and lack of emotional inflection — performs the function of neutralising or normalising the ostensibly unacceptable material. Is this simulation of the operations of hypercontrol agencies a satire on them, or do their activities — and the whole cultural scene of which they are a part — render satire as such impossible now? What, after all, is the relationship between satire and simulation? To begin to answer that question we need to compare Ballard’s text with other, more definitively “satirical” texts. Before that, though, we need to bear in mind Jameson’s comments on the eclipse of parody by pastiche, which we shall examine, briefly, now.
This is not the place to interrogate the differences between parody and satire; we shall proceed on the assumption that, whatever differences there are between parody and satire, they share enough in common so as to be jointly subject to Jameson’s analyses. Parody, Jameson argues, depended upon a whole set of resources available to modernism but which have faded now: the individual subject, whose “inimitable” idiosyncratic style, Jameson wryly observes, could precisely give rise to imitations; a strong historical sense, which has as its necessary obverse a confidence that there is a genuinely contemporary means of expression; and a commitment to collective projects, which could motivate writing and give it a political purpose. As these disappear, Jameson suggests, so does the space of parody. Individual style gives way to a “field of stylistic and discursive heterogeneity without a norm”, just as the belief in progress and the faith that one could describe new times in new terms wane, to be replaced by “the imitation of dead styles, speech through all the masks and voices stored up in the imaginary museums of a new global culture”. Late capitalism’s “postliteracy”, meanwhile, points to “the absence of any great collective project.” What results, according to Jameson, is a depthless experience, in which the past is everywhere at the same time as the historical sense fades; we have a “society bereft of all historicity” that is simultaneously unable to present anything that is not a reheated version of the past. Pastiche displaces parody:
In this situation, parody finds itself without a vocation; it has lived, and that strange new thing pastiche comes to take its place. Pastiche is, like parody, the imitation of a peculiar or unique, idiosyncratic style, the wearing of a linguistic mask, speech in a dead language. But it is a neutral practice of such mimicry, without any of parody’s ulterior motives, amputated of the satiric impulse, devoid of laughter and of any conviction that alongside the abnormal tongue you have momentarily borrowed, some healthy linguistic normality still exists. Pastiche is thus blank parody, a statue with blind eyeballs…[6]
Despite what Jameson himself writes on Ballard,[7] one of the important differences between the Ballard text and pastiche as Jameson describes it is the absence of “nostalgia” or the “nostalgia mode” — an insistent presence in other postmodernist science fiction texts, as Jameson shows — in Ballard’s work. Indeed, Ballard’s commitment to striking textual innovations — as evidenced in the layout of the pages themselves in The Atrocity Exhibition — marks him as something of an anomaly in Jameson’s terms; in this sense, at least, Ballard seems to be continuous with modernism as Jameson understands it. Yet in certain other respects — specifically in terms of the collapse of individual subjectivity and the failure of collective political action — Ballard is emblematic of Jameson’s postmodernity. But, unlike Jameson’s pastiche, Ballard does not imitate “a peculiar or unique idiosyncratic style”. The style that Ballard simulates in “Why I Want to Fuck Ronald Reagan” — a style towards which the whole of The Atrocity Exhibition tends — is precisely lacking in any personality: if there are any idiosyncrasies, they belong to the technical register of (pseudo)scientific reportage, not to the characteristics of an individual subject. The fact that the text concerns a political leader draws attention to the lack of any explicit — or, more importantly when discussing satire or parody, implicit — political teleology in Ballard’s writing. It is in this sense that “Why I Want to Fuck Ronald Reagan”, like Jameson’s pastiche, is “without any of parody’s ulterior motives”.
Certainly, this is one way in which “Why I Want to Fuck Ronald Reagan” differs greatly from a classical work of satire such as Swift’s A Modest Proposal (1729). A Modest Proposal is a paradigmatic work of what Joyce called “kinetic” art, produced in particular political and cultural circumstances with a particular aim, to sway an audience into action. Swift’s political purpose — his disparaging of the cruelty of certain English responses to Irish poverty and famine — is marked by a certain stylistic and thematic excess (an excess that famously bypassed altogether certain of Swift’s readers, who were able to take the text at face value), whereas Ballard’s text — which emerged, no less than Swift’s, from a very particular sociocultural situation — can be defined by its flatness. This marks a move on, even, from Burroughs. For all their linguistic inventiveness, Burroughs’ humorous “routines” such as “The Complete All-American Deanxietized Man”[8] remain in a classical tradition of satire through their use of exaggeration and their clear political agenda: using a series of excessive tropes, Burroughs mocks the amoral mores of American techno-science. By contrast, what Ballard’s text “lacks” is any clear designs on the reader, any of Jameson’s “ulterior motives”; the parodic text always gave central importance to the parodist behind it, his implicit but flagged attitudes and opinions, but “Why I Want to Fuck Ronald Reagan” is as coldly anonymous as the texts it imitates. Whereas we hear Burroughs cackling at the absurd excesses of the scientists in “The Complete All-American Deanxietized Man”, the response of Ballard to the scientists whose work he simulates is unreadable. What does “Ballard” want the reader to feel: disgust? amusement? It is unclear, and, as Baudrillard argues in relation to Crash,[9] it is somewhat disingenuous of Ballard the author to overcode his texts — in prefatory authorial remarks — with all the traditional baggage of “warning” that they themselves clearly elude. The mode Ballard adopts in “Why I Want to Fuck Ronald Reagan” is not that of (satirical) exaggeration, but is a kind of (simulated) extrapolation. The very genre of the poll or the survey, as Baudrillard shows, makes the question unanswerable, undecidable.
Despite what Ballard himself suggests (see above), what matters is less the (possible) resemblance of “Why I Want to Fuck Ronald Reagan” to (possible) reports than the circulation of simulation to which such reports already contribute. Writing on pastiche, Jameson comes upon the concept of simulation, but attributes it to Plato rather than referring — here at least — to Baudrillard’s reinvention of it. Yet Jameson’s intuition about the relationship between pastiche and simulation is important. We could perhaps suggest a correlation between Baudrillard’s third-order simulacra and Jameson’s pastiche, on the one hand, and Ballard’s text on the other. What simulation in Baudrillard’s third-order sense entails is, as we have repeatedly insisted, the collapse of distance between the simulation and what is simulated. Satire, in its classical sense, we would probably want to locate as part of “first-order simulacra” — a simulation that resembles the original, but with certain tell-tale differences. Ballard simulates the simulation (the poll, the survey).
a fairground’s painted swings
Speaking of the pathology of amour, is it anywhere better exemplified than in the lyrics of “These Foolish Things” (the title track, significantly, of Ferry’s first LP of covers)?
What is fascinating about the song’s litany of lost affects (“wild strawb’ries only seven francs a kilo… the sigh of midnight trains in empty stations… a fairground’s painted swings”) is that the lover features in them only as a series of absences (“a cigarette that bears a lipstick’s traces… gardenia perfume ling’ring on a pillow”) and is never directly invoked. This, of course, is because there is no “loved object itself”. What is loved is the objet petit a, which is not a particular object, but the object as such, the “void presupposed by a demand”. The physical and psychical “presence” of the lover is required only as that which allows the assemblage of affects to be given an apparent cohering centre. But, in the end, the lover is just that: the space, the canvas, on which the collage of memories and associations can be arranged.
Nevertheless, even though it is not the lover “herself” that is desired, the lover cannot be dispensed with altogether: otherwise we are in straightforward fetishism. Žižek illustrates the difference between “normal” pathology and fetishism by reference to that scene in Vertigo where Scottie is embracing Judy (re)made-over as Madeleine. The camera cuts away to show him pausing from kissing her to anxiously check that her hair colour is still blonde. But this is NOT fetishism, since the fetishist would dispense with the woman altogether and derive his enjoyment from the lock of hair itself.
Vertigo’s horror lies in its unstinting revelation of the artificiality of desire. Scottie can look into the void presupposed by his demands and still, grotesquely, make the demands. That is the difference between Vertigo and many of the film noirs it references, comments upon and surpasses: it is not, in the end, that he is being deceived by a femme fatale, duped into believing that she is something that she is not. On the contrary, he knows full well that there is no Madeleine. But knowledge is nothing, and the explanation for his continued fixation upon a Madeleine that is not even a ghost is the one provided by Zupančič: for Scottie to give up his object would be for him to give up himself, to die.
There is no doubt a specifically male relationship to the objet petit a which Vertigo reveals. This goes some way to answering the question posed by I.T.[2] a while ago, after Žižek: namely, why would men, given the choice of sex with a monkey or sex with a robot, always choose the robot? The more disturbing thought is that men would always in practice prefer a robot to an actual woman — and this is why the libidinal economics, if not yet the technical feasibility, of The Stepford Wives are horribly credible.
The text which most explicitly lays bare this male desiring mechanism is Villiers de l’Isle-Adam’s The Future Eve (1877), which anticipates both Vertigo and The Stepford Wives, as well as Metropolis and Blade Runner.
The story concerns a dissolute decadent who is enchanted with his beloved, Alicia’s, form, but who detests what he considers to be the frivolity and shallowness of her personality. He is persuaded by an inventor-mentor figure (given the name, in some versions, incredibly, of the then still-living Thomas Edison) that he should simply accept an automaton-copy of his lover, prepared by the inventor, which will be a perfect replica in every respect, except that it will be programmed to be a stimulating companion.
“Edison” couldn’t be more forthright, more demystificatory, more Lacanian:
the creature whom you love, and who for you is the sole REALITY, is by no means the one who is embodied in this transient human figure, but a creature of your desire. […] This illusion is the one thing you struggle against all odds to VITALISE in the presence of your beloved, despite the frightful, deadly, withering nullity of Alicia. What you love is this phantom alone; it’s for the phantom that you want to die. That and that alone is what you recognise as unconditionally REAL. In short, it is this objectified projection of your own mind that you call on, that you perceive, that you CREATE in your living woman, and which is nothing but your mind reduplicated in her.[3]
Of course, the “creative” force that really animates the loved object is not the freeplay of the Romantic imagination, but the implacable mechanism of the unconscious. It’s for the phantom that you want to die: but such a “death” would mean that the desiring frame that makes sense of the world would survive. The only real death would be one in which that whole framework was destroyed, and the subject was confronted with the “white space” of pure potential.
This is what the subject slaved to the pleasure principle must avoid at all costs. The well-known tedium of Sadean desire is the inevitable consequence when this impasse is honestly confronted. If the object of Sadean desire is, as Žižek says, the eternally beautiful undead victim, who can suffer all manner of privations and yet be magically renewed forever, then the subject of this desire is, as Burroughs knew very well, the vampire-junky. The vampire-junky must be insatiable and must pursue their desires up to the point of self-destruction, but must never cross the line into annihilation.
The empirical narrative would have it that the junkie is gradually “drawn into” addiction, lured into dependence by a chemical need. But it is clear that the junkie chooses to be addicted — the desire to get high is only the ostensible motivation for the drive, just as “winning money” is only the official alibi for the gambler’s enjoyment.
Burroughs’ paralleling of love with addiction is thus by no means cynical hyperbole. Burroughs understood very well that, if love is addiction (“If there’s a cure for this, I don’t want it”), then addiction is also a form of love (“It’s my wife and it’s my life”). There is always, as Gregory Bateson observed in his essay on alcoholism, a meta addiction to be dealt with: the addiction to addiction itself.[4] It is on this that Burroughs’ “control addict” Bradley Martin is hooked: “I am not AN addict. I am THE addict.”
The lyrical power of Burroughs’ writing — especially in the early cut-up novels, which are consensually dismissed as difficult and boring — is often overlooked. But much of its mechanical melancholia is generated from its displaying of the “foolish things” of desire, the heroin-haecceities of train whistles, radio jingles, billboard images and sexual contact. Although initially random, these affect-collages, when repeated and remixed by memory and desire, become necessary. Thus only THAT shade of blue for Madeleine’s suit, only THAT shade of lipstick on the cigarette tip, will do.
Yes, the painted swings of desire’s cruel fairground…
what are the politics of boredom? (ballard 2003 remix)
“Prosperous suburbia was one of the end states of history. Once achieved, only plague, flood, or nuclear war could threaten its grip.”
— J.G. Ballard, Millennium People[2]
“J.G. Ballard” is the name of a repetition.
That’s very different from saying that Ballard repeats himself. On the contrary, it is Ballard’s formalism, his re-permutation of the same few concepts and fixations — disasters, pilots, random violence, mediatisation, the total colonisation of the unconscious by images — that prevent his name being easily attributed to any self.
The obsessive quality of his preoccupations and his methodology is a sign that Ballard has never lost faith in his earliest inspirations: psychoanalysis and surrealism. In both, he found a rigorously depersonalised account of the formations of the person. The so-called interior had a logic that could be both exposed and externalised.
Ballard’s career can be seen as a repeated rewriting of two texts of Freud, Civilisation and its Discontents and Beyond the Pleasure Principle. The environmental catastrophes in his earliest phase of novels (The Drowned World, The Drought, The Crystal World) tend to be greeted by the characters as opportunities, chances to shuck off the dull routines and protocols of sedentary society. The second phase of his work, which began in the mid-Sixties and to some extent continues to this day, follows this logic through so that the catastrophes and atrocities that afflict the characters in these fictions are actively willed by them. (Or is it that the humans seek to manage, through repetition, the originary trauma of their being?) Disasters are now the disasters of the media landscape — the space in which humans now primarily live, and one which is both shaped by, and manufactured from, their desires and drives. Once again, though, we must qualify this claim with the further observation that human beings are not the “owners” of desires and drives — they don’t “have” them. Rather, human beings are the playing out of these impulses, instruments through which trauma is registered.
Since High-Rise in 1975, Ballard has directed most of his attention to the hyper-affluent and bored denizens of closed communities. If Ballard’s treatment of the mores of this population had begun to pall, it was refreshed by Millennium People, his latest and best rendition of this theme.
The world of Millennium People is ruled, “for the first time in history” (but not for the first time in Ballard’s work), by a “vicious boredom”, “interrupted by meaningless acts of violence”. At first glance, the novel can look like a long overdue savaging of the middle class, in which the reader can revel in the brutal destruction of bourgeois sacred cows. Tate Modern… Pret A Manger… the NFT… all of them burn in Ballard’s Bourgeois Terror.
“I’m a fund raiser for the Royal Academy. It’s an easy job. All those CEOs think art is good for their souls.”
“Not so?”
“It rots their brains. Tate Modern, the Royal Academy, the Hayward… They’re Walt Disney for the middle classes.”[3]
The novel’s middle-class insurgents seem, at first, to be merely the hard done-to whiners whose complaints about the rising expense of child care and school fees and the “inequity” of too high rents in their not quite luxurious enough apartments are the stuff of endless media columns. “Believe me, the next revolution is going to be about parking”[4], one character announces, echoing the petrol blockades of four years ago and anticipating the Ikea riots of 2005. Once their discontent is stirred up, however, the goals of this group of former professionals become less specific, less instrumentalist.
Like the Situationists, the insurgents of Ballard’s fictional Chelsea Marina want to “destroy the twentieth century”:
“I thought it was over.”
“It lingers on. It shapes everything we do, the way we think… Genocidal wars, half the world destitute, the other half sleepwalking through its own brain-death. We bought its trashy dreams and now we can’t wake up.”[5]
Millennium People is in many ways Britain’s answer to Fight Club (though, needless to say, the chances of Britain producing Millennium People as a film that would even remotely do the book justice are not even slight — precisely because the British film industry is under the control of the same militantly complacent whingers that it attacks). Like Fight Club, the novel begins with a rage against the bullet-pointed, brand-consulted hyper-conformity of modern professional life, but ends up in surfascism.
The most important figure in this respect is Richard Gould who, like most of Ballard’s other characters, is little more than a spokesperson for the author’s theories. (Which is fine, of course: we need more “well-drawn characters” like we need more “well-wrought sentences”. The UEA Eng Lit mafia are as ripe for immolation as are any of the other cosily depressing targets of Ballard’s pyromaniac prose.)
Gould reiterates essentially the same attack on the “air-conditioned totalitarianism” of contemporary securo-culture that had been essayed by Nietzsche, Mauss, Bataille, Dada, Surrealism, Situationist theory, Lettrism, Baudrillard and Lyotard:
We’re living in a soft regime prison built by earlier generations of inmates. Somehow we have to break free. The attack on the World Trade Centre in 2001 was a brave attempt to free America from the 20th century. The deaths were tragic, but otherwise it was a meaningless act. And that was its point. Like the attack on the NFT.[6]
Gould re-states the Nietzschean claim that human beings need cruelty, danger and challenge, but that civilisation gives them security. Gould, though, is as reminiscent of Fukuyama’s rehearsal of Nietzsche’s discontent with civilisation as he is of Nietzsche himself.
It is Fukuyama’s Nietzsche — the scourge of bland egalitarianism and empty inclusiveness — that is the most relevant Nietzsche today. As you read the appalled invective with which Nietzsche blasts the herd-cult of managed security (which is so weak and insipid that it can never utter its real rallying cry: “long live mediocrity!”) you can’t help but think of Blair and the Millennium Dome, whose pallid, paradoxically self-deprecatory pomposity contrasts unhappily with the cruel opulence of the monuments erected in Nietzsche’s beloved tragic and heroic aristocratic societies.
“Democratic societies,” Fukuyama wrote in The End of History and the Last Man,
tend to promote a belief in the equality of all lifestyles and values. They do not tell their citizens how they should live, or what will make them happy, virtuous and great. Instead they cultivate the value of toleration, which becomes the chief virtue in democratic societies. And if men are unable to affirm that any particular way of life is superior to another, they will fall back on the affirmation of life itself, that is, the body, its needs, and fears. While not all souls may be equally virtuous, all bodies can suffer; hence democratic societies will tend to be compassionate and raise to the first order of concern the question of preventing the body from suffering. It is not an accident that people in democratic societies are preoccupied with material gain and live in an economic world devoted to the myriad small needs of the body. According to Nietzsche, the last man has “left the region where it was hard to live, for one needs warmth”.[7]
“We need to pick targets that don’t make sense.”[8]
If the characters in The Atrocity Exhibition wanted to re-stage the founding traumatic moment of the media Sixties — the assassination of Kennedy — then Gould and his allies want to re-stage the founding traumatic moment of the media Noughties — 9/11. But where Traven/Tallis/Travis wanted to kill Kennedy again, “but this time in a way that makes sense”, Gould wants 9/11 to happen again, but in a way that doesn’t make sense.
For Gould, the (post)modern world is oppressed by an excess of Sense, a surplus of Meaning. “Kill a politician and you’re tied to the motive that made you pull the trigger. Oswald and Kennedy, Princip and the Archdukes. But kill someone at random, fire a revolver into McDonald’s — the universe stands back and holds its breath. Better still, kill fifteen people at random.”[9] Thus, the Jill Dando murder is more of a template for Gould’s anti-political insurgency than is September 11th, whose violence was (still) too motivated, too freighted with Meaning. Dando’s killing however — brutal, meaningless, and without any apparent motive — was a direct attack on the BBC’s “regime of moderation and good sense”[10] and the “castle of obligations”[11] it protects. An action like this, whose only motive is an attack on the concept of motive itself, blows open an “empty space we could stare into with real awe. Senseless, inexplicable, as mysterious as the Grand Canyon.”[12]
Gould is an elegant and eloquent salesman of the Deleuze-Guattari “line of abolition”, the fascist drive to destruction which is ultimately a drive towards self-destruction. Ballard, who, to his credit, has always refused to endorse facile moralising, would no doubt object to that characterisation, since to condemn or censure Gould in any way would be to confirm the very securocratic values he seeks to undermine.
However, the most compelling aspect of Millennium People, politically speaking, is not its (in many ways familiar) asignifying violence, but its punk theory of class revolt.
“Twickenham is the Maginot Line of the English class system. If we can break through here, everything will fall.”
“So class systems are the target. Aren’t they universal — America, Russia…?”
“Of course. But only here is the class system a means of political control. Its real job isn’t to suppress the proles, but to keep the middle classes down, make sure they’re docile and subservient.”[13]
The moment at which Ballard’s “new proletariat” (“furnished with private schools and BMWs”) become real political actors is when they cease to pursue their own class interests. Only then can they come to the Marxist revelation that bourgeois class interests are in no one’s interests.
“They see that private schools are brainwashing their children into a kind of social docility, turning them into a professional class who will run the show for consumer capitalism.”
“The sinister Mr Bigs?”
“There are no Mr Bigs. The system is self-regulating. It relies on our sense of civic responsibility. Without that society would collapse. In fact, the collapse may even have begun.”[14]
Blairism has consolidated and outstripped the ideological gains of Thatcherism by ensuring the apparently total victory of PR over punk, of politeness over antagonism, of middle-class utility over proletarian art. It manages the tricky ideological dodge of reducing everything to instrumentality whilst at the same time dedicating all resources to the production of cultural artefacts of no possible use or function. From the Mayan codices to Mission Statements… Spin engenders a meaninglessness which, in the mandatory banality of its corrosive nihilism, makes Gould’s grand poetics of asignifying rupture seem quaintly nostalgic.
Blair has made middle-class security the horizon of all aspiration. In this over-conscious, over-lit twenty-four-hour office of the soul, business, preposterously, is served up to us as the closest thing to anything animated by libido. Ballard knows that a break-out from this affective prison must involve the explicit de-cathexis of the “nice house, nice family” picture that bourgeois culture is still capable of projecting as ideal.
In histories of punk, much is made of the role of the middle classes, but the crucial catalytic role of that particular kind of middle-class refusal remains under-thought. The middle-class defection from reproductive futurism into scarification and tribalisation did nothing more than state the obvious — middle-class careers and the privileges they bring are empty, tedious and enervating — but, now more than ever, it is this obviousness that cannot be stated.
The interesting thing is that they’re protesting against themselves. There’s no enemy out there. They know that they are the enemy.[15]
let me be your fantasy
What Ballard, Lacan and Burroughs have in common is the perception that human sexuality is essentially pornographic. For all three, human sexuality is irreducible to biological excitation; strip away the hallucinatory and the fantasmatic, and sexuality disappears with it. As Renata Salecl argues in (Per)Versions of Love and Hate,[2] it is easier for an animal to enter the Symbolic Order than it is for a human to unlearn the Symbolic and attain animality, an observation confirmed by the news that, when an orangutan was presented with pornography, it ceased to show any sexual interest in its fellow apes and spent all day masturbating. The orangutan had been inducted into human sexuality by the “inhuman partner”, the fantasmatic supplement, upon which all human sexuality depends.
The question is not, then, whether pornography, but which pornography? For Burroughs, pornosexuality would always be a miserable repetition, a Boschian negative carnival in which the rusting fairground wheel of desire forever turns in desolate circles. But in Ballard, and in Cronenberg’s version of Ballard’s Crash, it is possible to uncover a version of pornography that is positive, even utopian.
Cronenberg’s work can be seen as a response to the challenge Baudrillard posed in Seduction.[3] Hardcore pornography haunts late capitalism, functioning as the cipher of a supposedly demystified, disillusioned “reality”. “A pornographic culture par excellence: one that pursues the workings of the real at all times and all places.” Here, hardcore is the reality of sex, and sex is the reality of everything else. Hardcore trades on a kind of earnest literalism, a belief that there is some empirically specifiable “it” which = sex in/as the real. As Baudrillard wryly noted, this empiricist bio-logic is fixated on a kind of technical fidelity — the pornographic film must be faithful to the (supposed) unadorned, brute mechanism of sex. Yet, sign and ritual are inescapable: in hardcore, especially in bukkake, the function of semen is, after all, essentially semiotic. No sex without signs. The higher the resolution of the image, the closer you get to the organs, the more that the “it” disappears from view. There is no better image of this “orgy of realism” than the “Japanese vaginal cyclorama” Baudrillard described in the “Stereoporno” section of Seduction. “Prostitutes, their thighs open, sitting on the edge of a platform, Japanese workers in their shirt-sleeves… permitted to shove their noses up to their eyeballs within the woman’s vagina in order to see, to see better — but what?” “Why stop with genitalia?” Baudrillard asks, “Who knows what profound pleasure is to be found in the visual dismemberment of mucous membranes and smooth muscles?”[4]
Cronenberg’s early work — from Shivers and Rabid through to Videodrome — is an answer to that very question. Cronenberg famously posed his own question, “why aren’t there beauty contests for the inside of the body?”, and Shivers and Rabid posit an equivalence between body horror and eroticism. The ostensible catastrophe with which both films conclude — the total degeneration of social structure into a seething, anorganic orgy — functions ambivalently. The disintegration of organismic integrity, the reversion to the condition of the pre-multicellular, is a kind of parodic-utopian riposte to Freud’s Civilisation and its Discontents. If civilisation and unbound libido are incommensurate, it is implied, so much the worse for civilisation. The apartment block taken over by mindless sex zombies at the end of Shivers is the Sixties dream of liberated sex come true…
Crash is a sober retreat from all this, a model for a new mecho-Masochistic mode of pornography in which it is no longer the so-called inside of the body that matters, but the body as surface — a surface to be adorned with clothes, marked by scars, punctured by technical machinery. Possessed by a mad passion to exchange biotic code, the sex plague victims in Shivers devolve beyond animality into a kind of bacterial replicator frenzy. By firm contrast, Crash is as passionless as a Delvaux dream. Sex here is entirely colonised by culture and language. All the sex scenes are meticulously constructed tableaux, irreducibly fantasmatic, not because they are “unreal”, but because their staging and their consistency depend on fantasy. The film’s opening scene, with Catherine Ballard in the aircraft hangar, is quite clearly an acting out of a fantasmatic scenario; it also functions, later, via its recounting, as a fantasmatic supplement to the first sexual encounter we see between Catherine and James. There is no “it” of sex, no brute, naked, definable moment when “it” happens, only a plateau that is (paradoxically) both dilated and deferred, in which words and memories reverberate more powerfully than any penetration.
Crash is so indebted to Helmut Newton that it often looks like little more than a series of animated Newton images. Or, better: in Crash, the bodies attain the near-inanimate stillness of Newton’s living mannequins. The echoes of Newton are entirely fitting, since Ballard regarded Newton as “our greatest visual artist”,[5] a Surrealist image-maker whose vision shamed the mediocrity of those officially working in the fine arts. “In Newton’s work,” Ballard writes, “we see a new race of urban beings, living on a new human frontier, where all passion is spent and all ambition long satisfied, where the deepest emotions seem to be relocating themselves, moving into a terrain more mysterious than Marienbad.”[6]
When Cronenberg talks about the future sexuality of the “new race of urban beings” in Crash, he tends to refer to it in negative terms:
The conceit that underlies some of what is maybe difficult or baffling about Crash, the sci-finess of it, comes from Ballard anticipating a future pathological psychology. It’s developing now, but he anticipates it and brings it back to the past — now — and he applies it as though it exists completely formed.
The Ballards’ marriage is to be understood as inherently dysfunctional:
Some potential distributors said, “You should make them more normal at the beginning so that we can see where they go wrong.” In other words, it should be like a Fatal Attraction thing. Blissful couple, maybe a dog and a rabbit, maybe a kid. And then a car accident introduces them to these horrible people and they go wrong. I said, “That isn’t right, because there’s something horribly wrong with them right now. That’s why they’re vulnerable to going even further.”[7]
Yet the Ballards’ “pathology” in Crash seems oddly healthy, their marriage a model of well-adjusted perversity. Theirs is a utopian sexuality, where sexual contact is voided of all sentimentality, stripped of any reference to reproduction, and unfreighted by any guilt. The lack of face-to-face sex in the Ballards’ marriage — which, again, Cronenberg himself tends to talk of negatively, as if it were a deviation from some wholesome, facialised sex in which the partners achieve a harmonious oneness — points to an awareness that there is no sexual relationship. Yet, very far from being a difficulty for the Ballards’ marriage, the lack of a direct rapport, the recognition that any sexual encounter has to go via fantasy, is the basis of all their erotic adventures. Compare the Ballards’ marriage to that of the Harfords in Eyes Wide Shut. The Ballards’ using of their sexual encounters with others as a stimulus for their own — impassive, poised, oneiric — sex forms a clear contrast with the deadlock of the Harfords’ marriage, which is exposed in Bill’s failure to cope, or keep up, with Alice’s fantasy. While Bill is scandalised by Alice’s articulation of her fantasies, sex in the Ballards’ marriage is governed by the “feminine” drive to talk; it is almost as if all of the physical encounters happen only so that they can be converted into stories to be recounted later.
The most charged scene in Crash takes place in the carwash, where James looks on through the rearview mirror at Catherine and Vaughan, who — in the words of Cronenberg’s script — are “like two semi-metallic human beings of the future making love in a chromium bower”. Deborah Unger, the film’s real star, is particularly impressive here. A kind of feline automaton, she “acts with her hair, minor adjustments, tosses of the head that advertise the transit of small emotions.”[8]
Who is using whom here? The answer is that all three of the characters are using each other. Catherine’s encounter with Vaughan stimulates James, just as Catherine is stimulated by the thought that James is watching her with Vaughan. Vaughan is using the couple as subjects of his own libidinal experiments, while the Ballards are using Vaughan as the third figure in their marriage. A mise-en-abyme of desire…
Far from being some nightmare of mutual domination, this is Cronenberg/ Ballard’s sexual utopia, a perverse counterpart to Kant’s kingdom of ends. The kingdom of ends was Kant’s ideal ethical community, in which everyone is treated as an end in themselves. Kant reasoned that, from the point of view of his ethics, sex was inherently problematic, because to engage in sexual congress entails treating the other as an object to be used. The only way in which sex could be commensurate with the categorical imperative — which in one of its versions maintains that one should never treat others as a means to an end — was if it took place in the context of a marriage, in which each partner has contracted out the use of their organs in exchange for the use of their partner’s.
Desire is construed here in terms of simple appropriation (this equivalence is yet another way in which Kant is in tune with Sade). But what Kant — and those who follow him in condemning pornography because it “objectifies” — fails to recognise is that our deepest desire is not to possess an other but to be objectified by them, to be used by them in/as their fantasy. This is one sense of the famous Lacanian formula that “desire is the desire of the other”. The perfect erotic situation would involve neither a dominance of, nor a fusion with, the other; it would consist rather in being objectified by someone you also want to objectify.
Crash, of course, follows Masoch and Newton in delocalising sex from genitality. Libido is invested in the mise-en-scène more than in the meat, which draws its attraction almost entirely from its adjacency to the decorous nonorganic — to clothes as much as cars. Clothes differentiate glam’s cold and cruel cultivation of appearances from hardcore’s passion for the real. Without suits, dresses and shoes, without fur, leather and nylon, pornography might as well be arranging meat in a butcher’s window. Newton told Ballard that he “loved Cronenberg’s Crash”, but one thing bothered him. “The dresses,” he whispered. “They were so awful.” This strikes me as waspishly unfair to Denise Cronenberg’s elegant wardrobe selections. (One major problem with Jonathan Weiss’ version of The Atrocity Exhibition, however, is precisely the dreadfulness of the clothes.) Crash takes its cues from high fashion magazines, whose images are more sumptuously arty than fine art, more suffused with deviant eroticism than hardcore porn. Would it be impossible for there to be a pornography, sponsored by Dior or Chanel, scripted by a latter-day Masoch or Ballard, whose fantasies were as artfully staged as the most glamorous fashion photo shoot?
fantasy kits: steven meisel’s “state of emergency”[2]
A few weeks ago, I asked whether it would be possible “for there to be a pornography, sponsored by Dior or Chanel, scripted by a latter-day Masoch or Ballard, whose fantasies were as artfully staged as the most glamorous fashion photo shoot?”[3] Steven Meisel’s Vogue photo-shoot, much more than Mike Figgis’ drearily vanilla promotional films for Agent Provocateur, suggests that such a pornography is conceivable.
“State of Emergency” shows, once again, that it is left to high fashion to take up the role that fine art has all but abandoned. While much of fine art has succumbed to the “passion for the real”, high fashion remains the last redoubt of Appearance and Fantasy.
The used tampons and pickled animals of Reality Art offer, at best, tracings of the empirical. Their quaint biographism reveals nothing of the unconscious. Meisel’s elegantly-staged photographs, meanwhile, drip with an ambivalence worthy of the best Surrealist paintings. They are uncomfortable and arousing in equal measure because they reflect back to us our conflicted attitudes and unacknowledged libidinal complicities. (In this respect, they form a sharp contrast with the infinitely more exploitative image being used to front the American Express Red campaign[4], whose meaning is easily anchored to the coordinates of the currently dominant ideological constellation.)
Reframed as Art, the Vogue photographs would no doubt be described — in the all-too familiar terms of art-critical muzak — as “negotiating with ideas of violence/ terror/etc.” As high fashion, they meet instead with a type of liberal denunciation that is no less familiar. In the Guardian, Joanna Bourke complained that, “It is no coincidence that the security forces are shown to be protecting us from a person who is neither male nor obviously Muslim”.[5] Would Bourke have preferred it, then, if the images did feature a Muslim man?
Bourke continues:
Instead, the terrorist threat is an unreal woman. In contrast to the security personnel depicted, she is placed beyond the realm of the human. Her skin is as plastic as a mannequin’s; her body is too perfect, even when grimacing in pain. When the model is depicted as the aggressor, she remains nothing more than the phallic dominatrix of many adolescent boys’ wet dreams. In both instances, the beauty of the photographs transforms acts of violence and humiliation into erotic possibilities.
Again, what would Bourke have preferred: simulated snuff in which “real-looking” women were roughed up by security staff? Bourke’s hostility to the fantasmatic is oddly doubled by the aggression of the security personnel towards the “unreal” women. And what does it mean to substitute an “unreal woman” for an all-too-real Muslim male, in any case? What does the confusion of ontological levels — agents of reality conjoined with the waxy artificiality of Bellmer-doll fashion models — tell us? The photographs are fascinating and unsettling because there are no straightforward answers to these questions.
Needless to say, Meisel’s photographs do find erotic possibilities in violence and humiliation, but this is not so much a “transformation” as a rediscovery. Two hundred years after Sade, a century after Bataille and Masoch, it appears that anything which publicly acknowledges that eroticism is inseparable from violence and humiliation is more unacceptable than ever. The issue is not how “healthy” sexuality can be purged of violence, but how the violence inherent to sexuality can be sublimated. Meisel’s photographs — which, we should remember, appear in a magazine the vast majority of whose readership is not “adolescent males” but women — are “fantasy kits” which offer just such sublimations, providing scenarios, role-play cues and potential fantasmatic identifications.
“State of Emergency” demonstrates that, rather than simply retaining its capacity to shock, The Atrocity Exhibition is more disturbing than ever. The overt sexualisation and compulsory carnality of postmodern image culture distracts us from the essential staidness of its rendition of the erotic. As Baudrillard argues in Seduction, biologised sex functions as the reality principle of contemporary culture: everything is reducible to sex, and sex is just a matter of meat mechanics. Ours is an age of cynicism and piety, which, as Simon suggested in his initial post on “State of Emergency”,[6] primly and pruriently resists the equivalences between eroticism, violence and celebrity that Ballard explored:
Entering the exhibition, Travis sees the atrocities of Vietnam and the Congo mimetised in the “alternate” death of Elizabeth Taylor; he tends the dying film star, eroticising her punctured bronchus in the over-ventilated verandas of the London Hilton; he dreams of Max Ernst, superior of the birds.[7]
To imagine the atrocities of September 11th and Abu Ghraib mimetised in the alternate death of Paris Hilton feels far more unacceptable, because contemporary piety has sacralised its atrocities in a way that the Sixties could not. In Atrocity, Dr Nathan’s reminder that, at the level of the unconscious, “the tragedies of Cape Kennedy and Vietnam… may in fact play very different parts from the ones we assign them” is extremely timely. (As Burroughs tells us in his preface to The Atrocity Exhibition, “Surveys indicate that wet dreams in many cases have no overt sexual content, whereas dreams with an overt sexual content in many cases do not result in orgasm”.) It is clear that the appalling Abu Ghraib photographs were already intensely eroticised stagings whose scenarios were derived from cheap American pornography. Love and Napalm: Export USA, indeed. Part of the reason that the Abu Ghraib images were so traumatic for a deeply conflicted American culture which combines religious moralism with hyper-sexualised commerce, and which is united only by a taste for mega-violence, is that they exposed the equation between military intervention and sexual humiliation that the official culture both depends upon and must suppress.
It’s interesting to compare both The Atrocity Exhibition and “State of Emergency” to Martha Rosler’s series of collages, “Bringing the War Home”[8]. “Sixties iconography: the nasal prepuce of LBJ, crashed helicopters, the pudenda of Ralph Nader, Eichmann in drag, the climax of a New York happening: a dead child”: this typical section from The Atrocity Exhibition could almost be a gloss on Rosler’s images, with their irruptions of war and atrocity amidst domestic scenes. But in Rosler’s case, unlike in Ballard’s, surrealist juxtaposition has a clear polemical purpose. The Atrocity Exhibition, like “State of Emergency”, is devoid of any decipherable intent; the oneiric juxtapositions in Ballard’s and Meisel’s work seem to be conceived of as neutral re-presentations of the substitutions and elisions made by the mediatised unconscious.
Meisel’s fantasy kits, their narratives left implicit and mysterious, suggest ways in which Ballard might be adapted in future. Part of the problem with Weiss’ film adaptation of The Atrocity Exhibition is that it subordinated the fragmentary mode of the novel to the durée — the lived time — of the feature film.[9] The most successful part of the film was perhaps the first few moments, where Ballard’s text was intoned over still images in a style reminiscent of Marker’s La Jetée (a film which Ballard adores, of course). That is partly because it is the profound stillness of the Surrealist paintings which The Atrocity Exhibition describes and appropriates — their beaches drained of time — which sets the rhythm of the novel. The most successful adaptation of The Atrocity Exhibition would, precisely, be an exhibition — not only of photographs, but also of newsreel footage, mandalas, diagrams, paintings and notebooks. It would be left for the viewer-participant to assemble their own narratives from these fantasy kits.
the assassination of j.g. ballard
They wanted to kill Ballard again, but this time in a way that made sense. The British know how best to kill something, softly. Assimilation is sometimes the most effective kind of assassination.
“You say these constitute an assassination weapon?”
So here they come again — all the familiar profiles, all the old routines. All that over-rehearsed musing about the supposed contrast between Ballard’s writing and his lifestyle and persona. All that central London cognoscenti condescension: he lived in Shepperton, he wore a tie and drank gin and yet he could come up with this — imagine that. As if it isn’t obvious that English suburbs are seething with surrealism. As if you could think for a minute that The Drowned World or The Atrocity Exhibition were written by anyone wearing jeans. Ballard mapped another America, another 1960s, one beyond the pleasure principle of rock ‘n’ roll and its paraphernalia. (That was one of the reasons that Ballard should have been so integral to post-punk’s unlearning of r and r and to electro’s pursuit of a colder mechano-erotics outside rock’s passional regime.) As if Ballard’s works could be mistaken as anything other than the work of a bourgeois — Ballard’s achievement was to have unashamedly fixated on the psychopathologies of his class (so no Keith Talents here, only a litany of deranged professionals), a class which he had a special insight into because he was always semi-detached from it.
You: Coma: Princess Diana
Assessing cultural figures by their alleged influence, their legacy, is an egregious postmodern tic — as if it reflected any merit to have inspired the Klaxons. Ballard is important precisely because it is completely unimaginable that any equivalent of his work could emerge from current conditions. As he made clear in his 1989 annotations to his most important work, The Atrocity Exhibition, he was a meta-psychologist of the pop age, his sensibility unsuited to the era of Reality, with its flattening fusion of celebrity and the hyper-banal.
A unique collision of private and public fantasy took place in the 1960s, and may have to wait some years to be repeated, if ever. The public dream of Hollywood for the first time merged with the private imagination of the hyper-stimulated TV viewer. People have sometimes asked me to do a follow-up to The Atrocity Exhibition, but our perception of the famous has changed — I can’t imagine writing about Meryl Streep or Princess Di, and Margaret Thatcher’s undoubted mystery seems to reflect design faults in her own self-constructed persona. One can mechanically spin sexual fantasies around all three, but the imagination soon flags. Unlike [Elizabeth] Taylor, they radiate no light. […] A kind of banalisation of celebrity has occurred: we are now offered an instant, ready-to-mix fame as nutritious as packet soup.[2]
Ballard’s Sixties were inaugurated by the Kennedy assassination. The founding event of the media environment we live in now, in which consensual sentimentality has long since occluded Ballard’s death of affect, was Princess Diana’s car crash death in 1997. In his later novels, Ballard tried to get a grip on this mall-world of Ikea psychosis and shopping channel charismatics, but they never produced the same spinal charge as his encounters with the Sixties tele-cinematic arcades presided over by Elizabeth Taylor and Ronald Reagan. Ballard’s most probing contributions in later years came in interviews and articles rather than in the novels: it was here that he identified retail parks and anonymous non-places as the authentic landscape of the twenty-first century, but he was not able to poeticise this hyper-banal terrain in the same way that he mythologised the brutalist concourses and high-rises of the Sixties and Seventies.
A Pulp Modernist Magus
What better way to destroy something than send in Martin Amis to praise it? Ballard was never a “good writer” in the way that Amis and his admirers and cronies in urbane Brit lit, with their handcrafted sentences, their well-drawn characters, their concerned social commentary, were. The significance of The Atrocity Exhibition was to have obsolesced this machinery of mediocrity, which Ballard had already eviscerated in his 1964 profile of Burroughs.
To use the stylistic conventions of the traditional oral novel — the sequential narrative, characters “in the round”, consecutive events, balloons of dialogue attached to “he said” and “she said” — is to perpetuate a set of conventions ideally suited to a period of great adventures in the Conradian mode, or to an overformalised Jamesian society, but now valuable for little more than the bedtime story and the fable.[3]
But Ballard’s strategy in his best works was also opposed to that of another of his admirers and appropriators, Iain Sinclair. Whereas Sinclair transforms pop-cultural material into something opaque, obscure and hermetic, Ballard innovated a kind of pulp modernism in which the techniques of high modernism and the riffs of popular fiction intensified one another, avoiding both high cultural obscurantism and middlebrow populism. Ballard understood that collage was the great twentieth century artform and that the mediatised unconscious was a collage artist. Where are his twenty-first century inheritors, those who can use the fiction-kits Ballard assembled in the Sixties as diagrams and blueprints for a new kind of fiction?
a world of dread and fear
“You couldn’t sleep. You had to work.
Always light.
Head against the window, sun coming up —
The troops were gathering on the street below him. The Red Guard in good voice:
SCAB, SCAB, SCAB —
The dawn chorus of the Socialist Republic of South Yorkshire.
Another cup of coffee. Another aspirin”
— David Peace, GB84[2]
David Peace’s GB84 is typed in prose as stark and unforgiving as motorway service station strip-lights.
The harsh expressionist realism Peace honed over the course of the four books of the Red Riding Quartet is perfect for handling GB84’s subject matter, the events of the 1984-85 Miners’ Strike. The Quartet counted forward — 1974-1977-1980-1983 — as if it was approaching but would never reach the fateful date that will provide the title of GB84. From here we count backwards; GB84 “is actually the last of an inverse post-war trilogy which will include UKDK, a novel about the plot to overthrow Wilson and the subsequent rise of Thatcher and another book, possibly about the Attlee Govt.” From gothic crime to Political gothic…[3]
The fiercely partisan novel ends with the incantation: “the Year is Zero”. But 1985, when both the strike and the book end, was very far from being a year of beginning or of possibilities for the novel’s “us”. (In fact, the very existence of that “us”, the collective proletarian subject, is itself in question by then. At the same time, however, this is the first of Peace’s novels in which the possibility of any sort of group-subject is raised. More typically, his characters are solipsistically alone, connected only by violence, their only shared project dissimulation.) On the contrary, it was a year of catastrophic defeat, the scale of which would not become apparent for a decade or more. (Perhaps it was only in the election of New Labour twelve years later that the defeat was both registered and finally secured.)
We now know — although this cannot enter into the present tension of the novel — that the strike was about a failed Proletarianisation. After the events the novel describes, what awaited was fragmentation, new opportunities for the few, unemployment and underemployment for the majority. The technique of flying picketing that Scargill had pioneered so successfully in the late Sixties and early Seventies (and which had contributed to the humiliation and collapse of the Heath government) was combatted by a comprehensive range of strategies (including a highly-organised counter-subversion operation run by MI5) that were designed while the Tories were still in opposition. The aim was to fragment miners’ solidarity and to prevent support from sympathetic workers in other industries. In this, the creation of the Working Miners’ Committee and the Union of Democratic Mineworkers would prove crucial. The deterritorialisation of capital — its transmutation into “messages which pass instantaneously from one nodal point to another across the former globe, the former material world”[4] — was not to be met by a complementary deterritorialisation of labour. Miners were inveigled into identifying with their own territory rather than with the industry as a whole; hence the return to work of the Nottinghamshire and Derbyshire miners, who believed that they were safeguarding their future but in a satisfying irony found themselves no better off than miners from any other coalfield. Within a decade, the industry would be all but closed down in Britain, with members of the UDM no more likely to be in employment than those of the NUM.
Yes, we know all this, now. But Peace restores drama by excluding any of the knowledge hindsight brings. The events come at you as if they were happening for the first time, and without the emollient shield of an omniscient authorial voice. As Joseph Brooker identifies in a lengthy piece on GB84 in the current issue of Radical Philosophy,[5] the novel is bereft of any mediating meta-language. The tragic quality the novel possesses even in its earliest scenes comes courtesy of the knowledge we, the readers, bring — but which, naturally, is denied to the protagonists — of the eventual course that the strike will take.
Counterfactuals are largely the preserve of the reactionary right, and Peace refuses the temptation to change the facts. He writes his retro-speculative fiction in the spaces between the recorded facts, extrapolating, inferring, guessing. Yet the question the reader cannot help but pose is: what would have happened if the miners had won? (A question that has added piquancy since subsequent revelations have shown that the government was much closer to defeat than was ever suspected at the time.) The narrative in which the strike is now embedded — the only narrative in town, the story of Global Capital — has it that it was part of a receding ebb tide of organised working-class insurgency. Defeat was inevitable, written into the historical passage from Fordism to post-Fordism. The hard left are outflanked, fighting under the banner of the Past for “the history of the Miner. The tradition of the Miner. The legacies of their fathers and their fathers’ fathers.”[6]
But such a narrativisation is question-begging, since the very credibility of this story relies upon the events of the strike unfolding as they did. What if they hadn’t? Under the aspect of eternity, everything is inevitable and we are all Spinozists. But life has to be lived “forward”, making us Sartreans. Reading the book now inevitably dramatises the tension between these two positions, between knowing that everything has already happened and acting as if it hasn’t.
A gang of doppelgangers, near-duplicates, haunt the pages of GB84, this “fiction based upon a fact”. Peace writes an occulted history of the present by constructing a simulation of the near-past. The dramatis personae do not bear the names of their real-life counterparts, and sometimes don’t have names at all, merely titles designating their structural role: The President, The Chairman, The Minister. Sometimes, real world names are slightly altered; in GB84 the NUM’s Chief Executive Roger Windsor becomes the hapless Terry Winters. The relationship of these simulations to their real-life counterparts is complex. The President is not Scargill. But he’s not not Scargill. No doubt Peace changed the names in part to avoid legal action, but in an odd way the extra imaginative latitude he is granted by not being compelled into fidelity to actual biography gives the characters more reality. He is able to get inside their heads in a way that would not be possible with actual biographical individuals.
The most controversial characterisation is that of Stephen Sweet, the professional strikebreaker based on Thatcher’s right-hand man throughout the strike, David Hart. Hart was the driving force behind the creation of the Working Miners’ Committee and the UDM. In the novel, Sweet is seen planning the crucial battle between police and pickets at the coking plant, Orgreave. (Devoting all its resources to Orgreave is now regarded as a major strategic error by the NUM.) Sweet is referred to throughout the novel as “The Jew”. Although this designation remains uncomfortable — as it is intended to be, Peace has said — suspicions of anti-semitism are immediately rebutted by any sort of close reading of the novel. Everything we see of Sweet is focalised through his chauffeur-factotum, Neil Fontaine. (This distancing is significant, since Sweet’s pomposity and grandiosity strike a slightly unconvincing note. It is almost as if Peace is unable to find the sympathy necessary for a convincing characterisation. On the other hand, perhaps Hart was the faintly absurd figure that Peace paints his fictional counterpart as. Peace does not make the mistake of portraying Sweet as a self-consciously evil figure; on the contrary, Sweet sees his efforts in a messianic light.)
Fontaine, presumably a co-opted member of the working class who has worked for the security services, is a blank slate of a figure, a man reduced to function (he is doubled in the novel by David Johnson, The Mechanic, who becomes an antagonist but who was clearly an ally in the past). It is Fontaine, a man with right-wing affiliations and connections but few passions, who will never stop seeing Sweet as “The Jew”. That description foregrounds the provisional nature of the political alliance that Thatcher built: somehow, the Thatcher programme allowed fascists to consort with Jews, nationalists to find common cause with the agents of multinational capitalism.
Fontaine is also the connecting link between the overt and the covert counter-“subversion” operations undertaken by the state in GB84. It is the unravelling of MI5’s role in proceedings that leads Peace into the territory of endemic corruption and betrayal that he staked out so viscerally in the Red Riding Quartet. Unusually for Peace, so skilled at putting himself (and therefore us) into the shoes of irredeemably corrupt power puppets, there is no major character in GB84 who is a policeman. But there are state functionaries: Fontaine, Johnson, but, most vividly, Malcolm Morris, a man whose role is to be a shadow, a cipher, an expert phone-tapper who, in a Francis Baconoid delirium, fancies that his ears are always bleeding…
In GB84, MI5 are the key players in organising Terry Winters’ spectacularly ill-judged trip to Libya. Who can forget the television images of Roger Windsor kissing Gadaffi in his tent? Winters/Windsor’s Libyan visit — only a few months after the policewoman Yvonne Fletcher had been killed by Libyan agents — proved an important, perhaps decisive, PR defeat for the NUM. (The actual role of Libya in the strike was somewhat different: the Thatcher government had illicitly increased oil imports from the supposedly outlawed regime so as to see off the threat of power cuts.) Peace deliberately leaves the degree of Windsor/Winters’ collusion with the security services unclear. He wanted the novel to be a “mess”, like the strike itself.
The doubling of historical fact with Peace’s version of it is internal to the novel’s own structure, whose main fictional thread is cut through by a diary account of the strike by two miners, Martin and Peter. Martin and Peter’s accounts, rendered in the Yorkshire dialect Peace captures so ably, were “not fictionalised”, Peace has said. It is here that Scargill, MacGregor, Thatcher, McGahey and Heathfield appear under their own names. The first person accounts register the grim miseries of the strike, as well as its camaraderie, forming a contrast with the skullduggery, the corruption and the high-level meetings of the novel’s central narrative.
Peace says that he first puts himself into the past, and then imagines. It’s like method writing, or time travel. Peace has tried and tested tricks to get back. He uses jaundiced newsprint, books, but most of all pop — not the stuff he would have listened to himself, then, and has listened to ever since, but the songs that, ubiquitous then, forgotten now, can function as audio-madeleines. Digging through the car-boot sale detritus of 84 and 85 pop, Peace finds a coded history of the strike secreted beneath the dull sheen of thrown-away post-new pop. GB84 begins with Nena’s execrable “99 Red Balloons”, which here becomes an apocalyptic carnival tune, bursting with all the hopes that will sag and bleed by the end of the novel’s gruelling, long, long march. “Two Tribes” soundtracks the next phase, the confrontation between police and miner (both this and the Nena song, of course, played upon Cold War anxieties when they were released; another reminder, and there are many in this novel, that 1984 was a world away). The exhilaration and adrenalin of out and out confrontation, Us and Them, gives way to suspicion (who is with us, and who is against us?). “Two Tribes — Must have heard that bloody song ten times a day now for weeks. Ought to make it National Anthem, said Sean.”[7] The songs that Peace dredges up for the final phase of the strike are “Careless Whisper” (“guilty feet have got no rhythm”) and, for the 84-85 winter that was cold, but not cold enough — the power cuts never come — Band Aid’s “Do They Know It’s Christmas”. Characters speculate that Band Aid is a government-backed scheme to distract from the plight of the miners, and the line that Peace selects for sampling is, naturally, “There’s a world outside your window, and it’s a world of dread and fear.”
Sampling is precisely the right term, since pop, much more than literature, film or TV (Peace actively distrusts these latter two) provides Peace with a methodology for drilling his words into the repetitions and refrains that are his stock-in-trade. Repetition is a hallmark of Peace’s style; he has famously remarked that the strike was intensely repetitive and that the prose would reflect that. But in all of Peace’s writing, repetition is what substitutes for both plot and character. His crime novels make no attempt to interest readers in the intrigue and enigma of plots; the plot of GB84, meanwhile, is given in advance, a kind of readymade. And one strange quality of Peace’s writing that is not immediately evident is that, although it is unusually intimate — reading his novels is always like rifling through someone else’s most secret places — his characters lack what is usually called “inner life”. They are identified less by a reflexive vitality than by death-drive repetitions, riffs, echoes, habit-forms.
In GB84 the result is more poetic than most poetry; it is, naturally, a poetry stripped of all lyricism, a harshly dissonant word-music. Peace is a writer particularly attentive to sound: the unsleeping vigilance of state power is signified by the “Click, Click” of the telephone tap, the massed ranks of the police by the Krk, Krk of boots and truncheons beaten against shields, both sounds repeated so much that they become background noise, part of the ambience of paranoia. The Telegraph review was right to observe that, “At times, the novel feels like an eardrum buzz, the literary equivalent of late-1970s Northern bands such as Throbbing Gristle and Cabaret Voltaire.” It resembles even more closely the two great post-punk responses to the strike: Mark Stewart’s As the Veneer of Democracy Starts to Fade (Keith LeBlanc also produced the single “The Enemy Within”) and Test Department’s The Unacceptable Face of Freedom. Perhaps for this reason, post-punk recedes as an explicit reference in GB84. It had been present in Nineteen Eighty-Three in titles of sections of the novel: “Miss the Girl” (the Creatures) and “There are No Spectators” (the Pop Group). The tone for GB84 is in every sense set by the title of the last section of Nineteen Eighty-Three: “Total Eclipse of the Heart”.
Part of the reason that 1985 seemed like the worst year for pop ever was that it was the beginning of the restoration. Up to 1984, British popular and political culture was still a battleground. 1985 was the year of Live Aid, the beginning of a time of the fake consensus that is the cultural expression of global capital. If Live Aid was the non-event that happened, the strike was the Event that didn’t.
Swords and shields. Sticks and stones. Horses and dogs. Blood and bones—
The armies of the dead awoken, arisen for one last battle.
The windscreen of the Granada lit by a massive explosion—
The road. The hedges. The trees—
Fire illuminating the night. The fog now smoke. Blue lights and red—
Terry shook Bill’s arm. Shook it and shook it. Bill opened his eyes—
“Where are we?” shouted Terry. “Where is this place?”
“The start and the end of it all,” said Bill. “Brampton Bierlow. Cortonwood.”
“But what’s going on?” screamed Terry Winters. “What’s happening? What is it?”
“It’s the end of the world,” laughed Bill Reed. “The end of all our worlds.”[8]
ripley’s glam
“He hated becoming Thomas Ripley again, hated being nobody, hated putting on his old set of habits again, and feeling that people looked down on him and were bored with him unless he put on an act for them like a clown, feeling incompetent and incapable of doing anything with himself except entertaining people for minutes at a time.”
— Patricia Highsmith, The Talented Mr Ripley[2]
We can learn a great deal about the glam impulse from these lines from The Talented Mr Ripley.
Significantly, Highsmith wrote the first Ripley novel in 1955 and only returned to the character in 1970. Tom Ripley was not a character that could fit into the rock and roll era, with its emphasis on teen desire, social disruption and Dionysiac excess. But Ripley’s “hedonic conservatism”, his snobbery and his facility with masks and disguise, mean that he would be perfectly at home in the Marienbad-like country estate of glam. If Sixties rock was characterised, on the one hand, by appeals made to the big Other (demands for social change and/or more pleasure) and, on the other hand, by the denial of the existence of the Symbolic order as such (psychedelia), then glam was defined, initially, by a hyperbolic/parodic identification with the big Other — by the return of Signs and/of Status.
In the sentence cited above, there are, evidently, two Toms — “Thomas Ripley” the performed social role, and the Tom who performs that role; Tom the speaking subject and Tom the subject of the statement. At the outset of The Talented Mr Ripley both these Toms are “nobodies” — as a speaking subject, like all speaking subjects, Tom is ontologically nothing; and as the subject of the statement he is socially nothing. At this stage, Tom is very far from being the insouciant, poised figure he will appear to be later; he is capable of simulating confidence only when taking on the role of Other people. It is not that Tom lacks status; it is that he has no place whatsoever in the social hierarchy. His status is not even low. His indeterminate social origins and his ability as a mimic and as a forger (skills upon which his anti-career as a fraudster is based) mean that he fits in nowhere (or anywhere). Tom experiences this nothingness in classic existentialist terms, feeling himself to be inchoate, a void, unresolved, unreal.
But the novel is a kind of existentialist picaresque by the end of which Tom has the (financial) means to create a Thomas Ripley he will not hate being. At the beginning of the next novel, Ripley Under Ground, it is immediately evident that Tom has created/become such a figure. Tom has fashioned his best forgery — a Thomas Ripley who is independently wealthy, owns an elegant house in the Paris suburbs and is married to a beautiful, hedonistic heiress. From now on, Ripley’s anxieties will concern not the establishment of an identity, but the preserving and defending of the status he has acquired.
Ripley’s trajectory is uncannily in sync with that of Bryan Ferry. Roxy Music and For Your Pleasure, those exercises in learning and unlearning of accent and manners, are pop’s equivalent of The Talented Mr Ripley. The clothes, the bearing and the voice are faked, but not yet perfectly. The roots still show, and the painful drama of becoming something you are not still carries an existential charge. Stranded and the subsequent albums, meanwhile, are the equivalent of the later novels; here, success is assumed, and the threats to the tasteful but banal idyll come from ennui, a certain unease with contentment, and — most ominous of all — the danger of the past returning. The vapid bucolia of Roxy’s Avalon — recorded when Ferry was himself married to an heiress and living on a country estate — would be the perfect soundtrack to Ripley puttering around in his Harpers & Queen dream home, Belle Ombre, with his wife, Heloise.
The first step to Ripley’s becoming a Something turns out to be his vampirising of the identity of Dickie Greenleaf. I say “turns out” because, contrary to what Anthony Minghella’s film implies, it is clear that Tom does not go to Europe with the thought of destroying Dickie already in his mind. Ripley is a brilliant improviser, not a planner; the plans he does make are short-term, often leading to more problems than they solve, and he derives enjoyment from cleaning up messes rather than from avoiding them in the first place.
Initially, Tom’s attitude to Dickie is ambivalent and is not straightforwardly predatory — he is aggressive and envious but also affectionate. If Tom is Nothing, a turmoil of unresolved purposes, a tumult of shame and inadequacy, then Dickie is really Something, an Object, resolved and real, possessing “the solidity of a stone”. By taking the place of Dickie, Ripley can escape the pain, anxiety and awkwardness of being himself, a self. To become an Object — to be relieved of the pressures of subjectivity, untroubled by any interiority — isn’t this one of the central fantasies of glam?
Žižek is certainly right to argue that the sexualisation of the relationship between Tom and Dickie in Anthony Minghella’s film is a mis-step. Yet Žižek’s interpretation is not fully adequate either. According to Žižek:
Dickie is for Tom not the object of his desire, but the ideal desiring subject, the transferential subject “supposed to know how to desire.” In short, Dickie becomes for Tom his ideal ego, the figure of his imaginary identification: when he repeatedly casts a coveting side-glance at Dickie, he does not thereby betray his erotic desire to engage in sexual commerce with him, to HAVE Dickie, but his desire to BE like Dickie.[3]
What is missing from Žižek’s analysis is a recognition of the way that Dickie fails to serve as an adequate ideal ego. The pivotal moment of the novel comes when Ripley is no longer capable of sustaining his fantasy identification with Dickie. When Tom looks into Dickie’s eyes and sees not the windows of a soul with which he can identify but the dead, glassy surface of an inert and idiotic dummy, he falls (back) into a deep existential nausea and vertigo, experiencing a moment of profound cosmic loathing and miserable dislocation:
He stared at Dickie’s eyes that were still frowning, the sun bleached eyebrows white and the eyes themselves shining and empty, nothing but little pieces of blue jelly with a black dot in them. You were supposed to see the soul through the eyes, to see the love through the eyes, the one place where you could look at another human being and see what really went on inside, and in Dickie’s eyes Tom saw nothing more than he would have seen if he had looked at the hard, bloodless surface of a mirror. Tom felt a painful wrench in his breast, and he covered his face with his hands. It was as if Dickie had suddenly been snatched away from him. They were not friends. They didn’t know each other. It struck Tom like a horrible truth, true for all time, true for the people he had known in the past and for those he would know in the future: each had stood and would stand before him, and he would know time and time again that he would never know them, and the worst was that there would always be the illusion, for a time, that he did know them, and that he and they were completely in harmony and alike. For an instant the wordless shock of the realisation seemed more than he could bear. He felt in the grip of a fit, as if he would fall to the ground.[4]
No doubt this is partly a registering of Dickie’s rejection of Tom. But it also expresses Tom’s feelings of revulsion for Dickie. What has been “snatched away” from Tom is not just Dickie “himself”, but the fantasy of Dickie. It is as if Tom is no longer capable of pretending (to himself) that Dickie is anything other than a really rather mediocre person; as if he has encountered, for the first time, the brute, stupid physicality of Dickie — has seen Dickie, directly, without the screen/sheen of fantasy to beatify him.
Tom’s break from Dickie is inevitable after the desperately painful scene, slightly earlier, when Dickie discovers Tom wearing his clothes and imitating him in front of the mirror. Dickie is disgusted and angered by Tom’s imitation (what is more horrifying than being someone else’s ideal ego?), just as Tom is utterly mortified by the fact that Dickie has discovered him in the act (what is more shameful than being caught by your ego ideal fantasising about them?). Significantly, Dickie makes the same error as Minghella, (mis)interpreting Tom’s behaviour in terms of sexual obsession, choosing this moment to emphatically deny to Tom that he is “queer”. But Tom’s wanting to be Dickie is far more obscene, far more deadly, far more Burroughsian, than his wanting to have him would have been.
Once Tom can no longer sustain his fantasy identification with Dickie, the logic of his psychosis insists that he will only be able to resolve his existential crisis — his lack of Being — by killing Dickie. That is partly because, in Ripley’s mind, Dickie is already dead: a soulless shell who illegitimately possesses wealth and social status that the more tasteful and refined Tom feels that he rightfully deserves. Tom is sure that he can be Dickie better than Dickie himself could be, and Dickie will be the daub that Tom will use as the basis for his masterpiece, the new Thomas Ripley. There is also a sense in which, by killing Dickie, Tom “earns” his place in the unproductive leisure class. Even before he is elevated into the leisure class, Tom shares its disdain for “drudgery”. The difference between Tom the common thief and con artist and Tom the member of the leisured elite is a successful act of violence. Veblen argues that “leisure class society” is founded on the “barbarian” distinction between exploit — “the conversion to his own ends of energies previously directed to some end by another agent” — and industry (or drudgery) — “the effort that goes to create a new thing with a new, (‘brute’) material”.[5] The Masters must always vampirise, never produce.
The performance of productive work, or employment in personal service, falls under the same odium for the same reason. An invidious distinction arises between exploit and acquisition by seizure on the one hand and industrial employment on the other. Labour acquires a character of irksomeness by virtue of the indignity imputed to it.[6]
Hunting has always been one of the activities upon which the leisured elite has prided itself, and Ripley is a consummate hunter (prey is one of the meanings of Ripley’s Game).
The use of homicidal violence to achieve and protect a position of privilege is very far from being aberrant, and Tom is no more likely to face justice than are the brigands of our real life ruling elites. (Highsmith’s refusal to impose a justice in the novels that is conspicuously lacking in the world is one of the most subversive aspects of her depictions of the character.) If Tom is pathological, his pathologies are the pathologies of a class; it is only the freshness of the blood of his victims (and his willingness to spill it himself) that separates Ripley’s exploits from those of his new peers. Yet Ripley is not a Slasher who enjoys killing. On the contrary, he is horrifying because he treats murder as a practical task devoid of any special existential or affective charge. Ripley’s murders are remarkable for their coldness and lack of cruelty; famously, Ripley only kills because he needs to, not because he enjoys it. Ripley kills out of cold, utilitarian logic, eliminating those who stand in his way or threaten to expose him. Again, far from being aberrant, a carefully maintained distinction between a violent, obscene underside and a bland, official front is the normal practice of power and privilege. It is not moral scruples that motivate Ripley (he notoriously has none), but a fear of humiliation. As Julie Walker argues:
What Tom does fear is unmasking; not merely the unmasking of himself as Dickie or even the unmasking of himself as a killer but the unmasking of his lack of a real self and therefore his self-perceived inadequacy in the face of others — there is no appreciable difference between fear of discovery for his tax scam or for his murders. His main fear is that of socially not quite making the grade.
This rendition of amorality is what is (post)modern about Ripley. Classic psychosis consisted in the confusion of the Real and the Symbolic (the most obvious example of which would be hearing the voice of God). But Ripley’s psychosis resides in his conviction that only the big Other exists. Tom is not troubled by specific, named others being aware of, or suspecting, his criminality, so long as his crimes are not Symbolically inscribed. What is distinctive about Ripley’s postmodern take on the big Other is that it is radically atheistic — he neither believes in God nor in any moral order written into the fabric of the universe. The postmodern big Other is a Symbolic Order stripped of its symbolisation of itself; it no longer poses as God or History and openly announces itself as a social construct — but this ostensible demystification does nothing to impede its functioning. On the contrary, the big Other has never functioned more effectively.
methods of dreaming
Two novels that I happened to read one after the other — purely by coincidence, or so it would seem — both draw on dreaming, but they emphasise opposite poles of the dreaming experience.
Christopher Priest’s A Dream Of Wessex (1977) is about a collective dreaming project, a government-sponsored initiative to tap the unconscious in order to come up with solutions to the economic and political problems that have paralysed the society in the novel’s present day of 1985. In the projected future world, the USA has converted to Islam and the UK has been annexed by the Soviet Union. The result is a strange kind of utopia, in which the bureaucratic provides a background to the bucolic: the irritations of the Soviet official machinery seem built into the dreamspace as a necessary precondition for the aching languor of the Wessex idyll, where everyday life is suffused by a Mediterranean eroticism. Priest conjures the atmosphere of a gentle solar trance, broken, significantly, by small circular mirrors, which are used to trigger the dreamer’s return to the dismal drizzle of the novel’s real world.
Once inside the Wessex projection, the participants cannot remember their real world identities. This means that, although they are referred to by the same name, the dreamers in the simulation are different entities from their real world counterparts (just as any dreamer is a different being from their double in waking life). A classic case of the Real (of unconscious wishes) versus reality. When they exit the Wessex simulation, the dreamers are replaced in the consensual hallucination by placeholder doppelgangers, programmed selves that, possessing no inner life, only exist for the Others in the dreamspace. Some of the participants come to recognise the points at which other dreamers depart from the simulation and come back to it: something in the other, that which is in them more than themselves perhaps, disappears or (seemingly miraculously) returns. What the novel renders especially powerfully is the overwhelming, intoxicating intensity of erotic connections with a dream Other, the uncanny sense of recognition, the déjà vu of dreamlove. In the case of A Dream Of Wessex, the sense of recognition between the lovers can be accounted for by the fact that the two, Julia and David, know each other in the novel’s real world; and yet Julia and David are not in love in the real world, nor is there any suggestion that they would necessarily fall in love. It is their dream-selves that fall for each other. What ultimately unsettles the idyll is the kind of reality bleed or ontological haemorrhage which Priest’s later novels all turn around. A Dream Of Wessex looks forward to Gibson’s cyberspace, but it is also a vision of the Sixties recalled at the bitter end of the Seventies.
Kazuo Ishiguro’s The Unconsoled (1995) makes contact with another kind of dream space-time altogether. The novel is well-titled since it plunges us, like Alice projected into Wonderland, into a world without consolation, a world of unrelieved urgencies. This is the first and most obvious point of contrast with A Dream Of Wessex, where the official imperatives, both inside and outside the dreamspace, operate as receding pretexts for libidinal trajectories which depart from “what should be happening” (this tendency puts the whole project at risk). In The Unconsoled, the official too recedes, but assumes now not the benign quality of the libidinal pretext (the ostensible goal which allows jouissance to happen precisely by being endlessly missed) but the tortuous, tantalising, thwarted object whose failure to be attained casts a pall of terrible anxiety over everything.
Upon arriving in a nameless central European city to give a performance, the renowned pianist Ryder finds himself assailed by countless demands which distract him from his official duties, but which he seems powerless to resist. He must listen to young hopefuls playing the piano; he must speak at late-night meetings of which he was not previously aware; he must go to the outskirts of the city and be photographed in front of a monument whose significance he does not understand. New urgencies are embedded within older urgencies, endlessly.
The Unconsoled is, in part, a pastiche of Kafka, and what Ishiguro borrows from Kafka above all else is his oneiric geography, at once bizarre and strangely familiar. Spaces which had seemed to be very far from one another are suddenly revealed to be adjacent; a meeting hall which Ryder has travelled to turns out to be the very hotel that he started from. This allows problems which had seemed intractable to suddenly resolve themselves; yet the solutions bring no relief, for by now Ryder has been gripped by another urgency. The previous imperative, once so overwhelmingly important, recedes into irrelevance at the moment the next one arrives.
In The Unconsoled, as in Kafka, this perverse spatiality of contiguity without consistency arises because all space (and time) is subordinated to the urgency. There is no time except that of the urgency; and all space is curved by the urgency (and its frustrations). Obstacles suddenly emerge: most notably a wall that inexplicably looms up at the last moment preventing Ryder from getting to the concert hall where he is due to give his recital. The hectic pace is driven by the improvisational logic of retrospective confabulation, which is always making sense of things a moment too late. Ryder is perpetually noticing things that should have been obvious. As with Kafka, then, The Unconsoled is coloured by an ingenue’s sense of embarrassment.
Two opposed methods of dreaming, then: the one languid, laconic, the other harried, harassed.
atwood’s anti-capitalism
“Regressive it all is”, Jameson remarks of the “God’s Gardeners” cult in Atwood’s The Year of the Flood, adding a provocative parenthesis: “it is always helpful to wonder what politics today could possibly be otherwise.”[2] The Year of the Flood is disappointing in part because it has no alternatives to regression — the only way forward, it seems, is back to nature.
It isn’t the focus on religion per se that is the signature of this regression; rather, it is Atwood’s retreat from the questions about religion that Oryx and Crake posed so intriguingly. One of the climactic moments of Oryx was the foundation of religious feeling amongst the lab-designed neo-noble savages, the Crakers. As per Totem and Taboo and Moses And Monotheism, the religion emerges as a consequence of the death of the father figure. Ironies abound here: since the “Crakers” were made, not begotten, the “father” is actually their creator-designer, the misanthropic wunderkind Crake — who had precisely designed them without the neurological configuration which he believes gives rise to religion. Crake is not so much an eliminative materialist as a materialist eliminativist: “Crake thought he’d done away with all that, eliminated what he called the G-spot in the brain. God is a cluster of neurons, he’d maintained. It had been a difficult problem, though: take out too much in the area and you got a zombie or a psychopath.” If, at first sight, the emergence of religion amongst the Crakers appears to be a kind of miracle, in the end it is only a testament to the power of other (psychoanalytic and cultural) determining factors in addition to neurology.
Crake’s experiments constitute a retort to the hoary old reactionary homily that utopia is alien to human nature. (For a recent version of this, see one of the antagonists in Žižek’s latest book, the uber-capitalist realist Guy Sorman[3], with his claim that, “[w]hatever the truths uncovered by economic science, the free market is finally only the reflection of human nature, itself hardly perfectible.”) If that’s the case, Crake concludes with the pragmatism of the autist, we should change human nature: the means are now available. Crake in effect responds to Freud’s argument in Civilisation and its Discontents that, even if property relations were made egalitarian, antagonism would continue to arise because of sexual competition. “Maybe Crake was right,” Snowman reflects,
Under the old dispensation, sexual competition had been relentless and cruel: for every pair of happy lovers there was a dejected onlooker, the one excluded. Love was its own transparent bubble-dome: you could see the two inside it, but you couldn’t get in there yourself. That had been the milder form: the single man at the window, drinking himself into oblivion to the mournful strains of the tango. But such things could escalate into violence. Extreme emotions could be lethal. If I can’t have you nobody will, and so forth. Death could set in.[4]
So Crake replaces what Toby in The Year of the Flood calls “romantic pain” with sedate animal courtship rituals. “Their sexuality was not a constant torment to them, not a cloud of turbulent hormones: they came into heat at regular intervals, as did most other mammals other than man.”[5] It would have been fascinating for Atwood to have given a fictional testing to Crake’s claim to have eliminated hierarchy, hunger and racism amongst his genetic creations. There’s also the problem of language. The Crakers are able to maintain their genetically-designed innocence, Atwood suggests, because they lack the past subjunctive tense. (“[T]he idea of the immortality of the soul […] was a consequence of grammar. And so was God, because as soon as there is a past tense, there has to be a past before the past until you get to I don’t know, and that’s what God is. It’s what you don’t know — the dark, the hidden, the underside of the visible, and all because we have grammar.”[6] But this, too, is fixable with a little genetic engineering: “[G]rammar would be impossible without the FoxP2 gene.”)
Yet the loss of Crake — which is nothing less than an encounter with loss and negation itself — threatens to project the Crakers out of their animal-time into the wounded time of human abjection. But the Crakers recede from focus in The Year of the Flood: a sign, perhaps, that Atwood has lost interest in them, or — maybe — that such creatures cannot elicit much interest from beings such as us. What looms to the fore in the narrative is the progressive-regressive religious form that a less pacific group of humans cleave to in the dying days of the world.
Atwood has said that one inspiration for the creation of the eco-religion was “the death of her father and mother […] and the necessity to choose hymns for their funerals that would have been acceptable to them: both were scientists.” It’s easy to sneer at the difficulty that Atwood touches upon here, and the familiar problems of reconciling religion and science may ultimately be less intractable than the issue of symbolic deficit in contemporary secularism that she is pointing to. Atheism has yet to come up with rituals that can muster the symbolic weight of religion, and there are strong reasons to suspect that the failure is more than a contingent one. That’s because Atheism typically construes the death of God in terms of a disavowal of the Symbolic (=big Other) itself. There’s a close fit between this quintessentially postmodern disavowal — where official denial of the existence of the big Other is combined with a de facto observance of the symbolic at another level — and capitalist realism. As Althusser realised, the rituals of capitalist ideology function all the better for not being acknowledged as rituals at all. In place of the intransigent solemnity of the religious ritual, postmodern secularism presents us with either an eschewal of ritual altogether (no need for any kind of ceremony), or “write-your-own-vows” personalisation, or a kind of ersatz humanist-kitsch, in which religious form is preserved even as belief in a supernatural God is denied. The problem is not a secular “lack of meaning”, but almost the opposite: it is religious rituals’ very meaninglessness, their lack of personal significance, which gives them much of their power. Partly, as Jameson suggests in his LRB piece on The Year of the Flood, the problem is time: any new “belief system” “demands a supplement in the form of deep time, ancient cultural custom, or revelation itself”. Time precisely allows a ritual to become a custom, an empty form to which the individual is subjected — and, very far from being a disadvantage, this is what yields funeral rites much of their power to console.
Mourning and loss are not only at the origins of religion, but also, it goes without saying, at the root of much of its continuing appeal. One of the most contentious — and borderline acrimonious — discussions amongst students that I’ve seen for a while came up in a session on Philosophy of Religion that I taught earlier this year. What prompted the controversy was my contention that atheism has far more of a problem with evil and suffering than religion does — not least because of the suffering of those who are now dead. Ivan Karamazov’s howl of anguish can be directed at the atheist architects of the radiant city as much as at God, since what can any revolutionary eschatology, no matter how glorious, do about the agonies of those who are long dead? No amount of secular good will can guarantee any correlation between virtue and happiness, as Kant argues in an incendiary passage of “The Critique of Teleological Judgment”:
Deceit, violence, and envy will be rife around [the righteous non-believer], even though he himself is benevolent. Moreover, as concerns the other righteous people he meets: no matter how worthy of happiness they may be, nature, which pays no attention to that, will still subject them to all the evils of deprivation, disease, and untimely death, just like all the other animals on the earth. And they will stay subjected to these evils always, until one vast tomb engulfs them one and all (honest or not, that makes no difference here), and hurls them, who managed to believe that they were the final purpose of creation, back into the abyss of the purposeless chaos of matter from which they were taken.[7]
Note also that Kant’s argument here applies equally well to the neopaganism of God’s Gardeners as it does to “righteous non-believers”, for Kant absolutely refuses the equation of nature with beneficence that the Gardeners preach. On the contrary, Kant argues, God is necessary to make good a nature characterised by amoral purposelessness. The true atheist must be able to look this “vast tomb”, this “abyss of purposeless chaos”, full in the face — whereas I suspect that most (of us) non-believers manage only to look away from it. But Kant’s moral argument is less easily dismissed than it would appear, because it is far harder to eliminate belief in a providential structure of the universe than we first imagine — precisely because this kind of belief lurks far beneath anything that we would admit to accepting. (Watch an edition of Deal or No Deal, though, and it’s clear that many openly evince such a belief.) Perhaps it would indeed take a Crake’s genetic tinkering to eradicate it.
The problem with The Year of the Flood is that politics and religion become synonymous — and while there’s every reason to be positive about politicised religion, there are deep problems with a politics which cannot shed the redemptive and messianic mantles of religious eschatology. It’s striking how much God’s Gardeners resemble the Greens as abominated by Sorman, in a passage quoted in First As Tragedy, Then As Farce:
No ordinary rioters, the Greens are the priests of a new religion that puts nature above humankind. The ecology movement is not a nice peace-and-love lobby but a revolutionary force. Like many a modern-day religion, its designated evils are ostensibly decried on the basis of scientific knowledge: global warming, species extinction, loss of biodiversity, superweeds. In fact, all these threats are figments of the Green imagination. Greens borrow their vocabulary for science without availing themselves of its rationality. Their method is not new; Marx and Engels also pretended to root their world vision in the science of their time, Darwinism.[8]
Atwood makes a case for such a religion. (Clarificatory note: just to be 100% clear — I in no way endorse Sorman’s views of the Greens. I just thought it was amusing that Atwood constructed an eco-cult which so closely fitted Sorman’s stereotype.) In an exchange with Richard Dawkins on Newsnight a couple of weeks ago, Atwood maintained that arguing against religion from the perspective of evolution makes little sense, because the persistence of religion itself suggests that it confers evolutionary benefit on humans. Given this, Atwood suggested, religion should be used as a tool for “progressive” struggles; and Adam One, the leader of God’s Gardeners, is interesting only when he sounds like a Machiavelli or a Strauss, who uses religion to manipulate popular sentiment — the rest of the time his ecopiousness is made bearable only by virtue of Atwood’s gentle satirical teasing (witness, for instance, the convolutions into which Gardener-doctrine is forced in its attempts to reconcile vegetarianism with both the carnivore-bias of the Bible and the “amoral chaos” of a nature red in tooth and claw). Initially, what appeals about the idea of God’s Gardeners is the promise that Atwood will describe a new kind of political organisation. Yet the Gardeners’ doctrine and structure turns out to be a disappointing ragbag of stale and drab No Logo-like anti-consumerist asceticism, primitivist lore, natural remedies and self-defence that is as alluring as last week’s patchouli oil. Ultimately, The Year of the Flood feels like a symptom of the libidinal and symbolic impasses of so much so-called anti-capitalism. Atwood imagines the end of capitalism, but only after the end of the world. Oryx was like the first part of Wall-E; The Year of the Flood is like the second part, where we find that the last survivor was nothing of the sort, and there were existing bands of human beings already wandering around, mysteriously just out of sight. (At least in Wall-E the surviving humans were off-world, whereas in Oryx, we are now asked to believe, they had somehow remained just outside Snowman’s eyeline.) It has a retrospectively deflationary effect, subtracting most of the pathos and nobility from Snowman’s plight, and converting what had seemed like a cyberpunk-Beckett tragicomedy into mere comedy. (Incidentally, perhaps the greatest “achievement” of The Year of the Flood is that, by the end, it no longer feels like an Atwood novel at all. Instead, it’s written in the kind of functional prose of a middling Stephen King novel, and populated by cyberpunk genre-standard hardass women, in a post-apocalyptic setting which is surprisingly lacking in vividness. The result is what Robert Macfarlane memorably calls a “dystoap-opera”.)
The question that kept recurring when I was reading both Oryx and Crake and The Year of the Flood was: why do these books not succeed in the way that The Handmaid’s Tale did? If The Handmaid’s Tale was an exemplary dystopia, it was because the novel made contact with the Imaginary-Real of neoconservatism. Gilead was “Real” at the level of a neo-conservative desire that was operating in the Reaganite Eighties; a virtual present that conditioned the actual present. Offred, the handmaids, the Marthas, the Wall — these names have the resonant consistency of a world. But Atwood does not have so assured a handle on neoliberalism as she did on neoconservatism. Atwood gives every appearance of underestimating the cheap poetry of brands, banal as it is; her corporate names are ugly and clunky, no doubt deliberately so — perhaps this is the way that she hears the absurd infantilisms of late capitalist semiotics. AnooYoo, HelthWyzer, Happicuppa, ReJoovenEssens, and — most ungainly of all — Sea(H)ear Candies: these practically caused me physical pain to read, and it is hard to conceive of any world in which these would be leading brands. Atwood’s mistake is always the same — the names are unsightly plays on the function or service that the corporations offer, whereas capitalism’s top brand names — Coca-Cola, Google, Starbucks — have attained an asignifying abstraction, in which any reference to what the corporation does is merely vestigial. Capitalist semiotics echo capital’s own tendency towards ever-increasing abstraction. (For the Imaginary-Real of neoliberalism, you’d be far better off reading Nick Land’s Nineties texts, shortly to be re-published.) Atwood’s names for genetically-spliced animals — the pigoon, the spoat/gider, the liobam — are also examples of linguistic butchery; perhaps she was trying to provide a parallel in language for the denaturalising violence of genetic engineering. In any case, these linguistic monsters are unlikely to roam far beyond Atwood’s texts (they certainly don’t have anything like the dark sleekness and hyperstitional puissance of, say, Gibson’s neologisms).
But the principal failing of The Year of the Flood’s anti-capitalism consists in its inability to grasp the way in which capitalism has absorbed the organic and the green. Some of the strongest passages in Žižek’s First As Tragedy, Then As Farce keep reiterating this message. (One of my favourite lines in the book: “Who really believes that half-rotten and overpriced ‘organic’ apples are really healthier than the non-organic varieties?”) Needless to say, while any credible leftism must make ecological issues central, it is a mistake to seek out an “authentic” organicism beyond capitalism’s simulated-organic. (Another of my favourite lines in First As Tragedy: “if there is one good thing about capitalism, it is that, precisely, mother earth now no longer exists.”) Organicism is the problem, and it’s not some eco-spirituality that will save the human environment (if it can be saved), but new modes of organisation and management.
toy stories: puppets, dolls and horror stories
“In many horror stories there is an assortment of figures that appear as walk-ons or extras whose purpose is to lend their spooky presence to a narrative for atmosphere alone, while the real bogey is something else altogether. Puppets, dolls, and other caricatures of the human often make cameo appearances as shapes sagging in the corner of a child’s bedroom or lolling on the shelves of a toy store […] As backdrops or bit-players, imitations of the human form have a symbolic value because they seem connected to another world, one that is all harm and disorder — the kind of place we sometimes feel is a model for our own home ground, which we must believe is passably sound and secure, or at least not an environment where we might mistake a counterfeit person for the real thing.”
— Thomas Ligotti, The Conspiracy Against the Human Race[2]
So writes the horror author Thomas Ligotti in his recently published book, The Conspiracy Against the Human Race. The book is not a work of fiction — it is, instead, a work of amateur philosophy in the best possible sense, driven by a metaphysical hunger that is so often lacking in the work of professional philosophers. Ligotti is unembarrassed to return to those questions which academic philosophers typically disdain in favour of an entanglement in scholarly minutiae. Why is there something rather than nothing? Should we be glad to be alive? Ligotti’s answer to this latter question is emphatically in the negative. Possessed of a cold, sober seriousness that couldn’t be more at odds with the atmosphere of cheery vitalism and inane lightness that prevails in early twenty-first-century culture, The Conspiracy Against the Human Race has the feel of a nineteenth-century tract.
Puppets are one of the leitmotifs of Ligotti’s work, but the terror that they cause does not primarily arise from any malicious intentions on their part, or from the suspicion that they might secretly move when we do not watch them. Rather, the puppet is an emissary of what Ligotti repeatedly characterises in The Conspiracy Against the Human Race as the “malignantly useless” nature of the cosmos itself. The painted-faced marionette is a symbol of the horror of consciousness, the instrument which, for Ligotti, allows that “malignant uselessness” to be perceived, and which brings all suffering into the world.
The puppet is a figure which belongs equally as much to the children’s story as to the weird tale. Ian Penman has written of how the most famous puppet story, Carlo Collodi’s The Adventures of Pinocchio (1883),
contains scarcely credible levels of cruelty and pain […] Accusations of abuse. Thrown hammers. Burned-off feet. Children used as firewood: innocence kindling. Curiosity rewarded with concussion and kidnap. Hanging, amputation, suffocation. A snake laughs so hard at Pinocchio’s fear he bursts an artery and dies. On his way to school Pinocchio sells his schoolbooks to join a Street Theatre: forget education, become a marionette. A dancing fool. Apprentice Golem. Malignant clown. Neuter, castrato.
(Penman’s remarks were made in the piece that he contributed to a book on Michael Jackson I edited last year — and Jackson’s own story is one in which kitsch and Gothic, puppet and master manipulator, frequently reversed into one another.[3])
On his blog on memory and technology, Bat, Bean, Beam, the theorist Giovanni Tiso recently noted the echoes of Pinocchio in the Toy Story films.[4] For the Marxist Richard Seymour,
Toy Story 3 is a story of how freedom is achieved through commodification, and how “the consent of the governed” roughly equals the willing embrace of bondage […] Everyone, and everything, has its place in the Toy Story scheme of things. That scheme is a hierarchy of commodities with toys near the bottom, subordinate and devoted to their owners.[5]
Yet, at an ontological level, the Toy Story films constitute something of a “tangled hierarchy”. The toys that are depicted in the films do not only exist at the “ontologically inferior” level of the film’s fiction; they are real in the sense that you can buy them outside the cinema. In Ligotti, puppets and puppetry frequently symbolise this tangling of ontological hierarchy: what should be at the “inferior” level of the manipulated manikin suddenly achieves agency, and, even more horrifyingly, what is at the supposedly “superior” level of the puppet master suddenly finds itself drawn into the marionette theatre. Ligotti writes that it is a terrible fate indeed
when a human being becomes objectified as a puppet and enters a world that he or she thought was just a creepy place inside of ours. What a jolt to find oneself a prisoner in this sinister sphere, reduced to a composite mechanism looking out on the land of the human, or that which we believe to be human by any definition of it, and yet be exiled from it.
With Ligotti, it is not clear which is the more terrifying prospect — an ultimate puppet master pulling the strings or the strings fraying off into blind senseless chaos.
Tiso noticed something peculiar about the desire of the toys in the Toy Story series: “what they like best is to be played with by children. But it so happens that at those times they are limp and inanimate; as is the case whenever they are in the presence of people, their spark abandons them, their eyes become vacant.”[6] It’s as if the message of the Toy Story films rhymes with that of Ligotti’s pessimistic tract: consciousness is not a blessing bestowed on us by a kindly toymaker standing in for a beneficent God, but a loathsome curse.
Zer0 books statement
Contemporary culture has eliminated both the concept of the public and the figure of the intellectual. Former public spaces — both physical and cultural — are now either derelict or colonised by advertising. A cretinous anti-intellectualism presides, cheered by expensively educated hacks in the pay of multinational corporations who reassure their bored readers that there is no need to rouse themselves from their interpassive stupor. The informal censorship internalised and propagated by the cultural workers of late capitalism generates a banal conformity that the propaganda chiefs of Stalinism could only have dreamt of imposing. Zer0 books knows that another kind of discourse — intellectual without being academic, popular without being populist — is not only possible: it is already flourishing, in the regions beyond the striplit malls of so-called mass media and the neurotically bureaucratic halls of the academy. Zer0 is committed to the idea of publishing as a making public of the intellectual. It is convinced that in the unthinking, blandly consensual culture in which we live, critical and engaged theoretical reflection is more important than ever before.
PART TWO: SCREENS, DREAMS AND SPECTRES: FILM AND TELEVISION
a spoonful of sugar
The worst aspect of Dennis Potter’s final two indulgent and indulged works (Cold Lazarus and Karaoke) was that they had the effect of retrospectively introducing doubts over everything else he’d done. Could he possibly be anything like as good as we’d always believed?
Actually, there’s a case for saying that, if 1986’s The Singing Detective marked the peak of Potter’s career, it also preceded a slow and painful decline. It would only be slightly harsh to say that everything after 1986 was either formulaic reiteration (Lipstick On Your Collar) or tortuously introspective, failed experimentalism (Blackeyes, the film Secret Friends). By the time of his death in 1994, Potter had been lionised by the great and good everywhere, his reputation for controversy forgotten (or forgiven?). Melvyn Bragg’s famous interview-cum-hagiography elevated Potter to the state of an unimpeachable morphine saint. All of this solemnity had the effect of devitalising Potter’s work, prematurely shrouding it with all the cobwebs of respectability and reverence.
Well, I had the opportunity to see Potter’s 1976 masterpiece Brimstone and Treacle again very recently. (The play is shortly to be reissued as part of a must-have Potter DVD boxset, which also includes The Singing Detective, Pennies from Heaven and Casanova). In 2004, when TV drama is corporate, committee-driven, blandly homogenous, Potter looks even more of an anomaly than ever. Today, there’s almost no way of identifying TV dramas by who has written them; they are routinely conceived of as vehicles for actors, not authors. By contrast, even at its worst, Potter’s work was marked by an indelible signature, characterised by a singular VISION. (The tendency to fall back on these trademark elements without remixing them was one of the weaknesses of his last pieces.) It’s hard to imagine that Potter’s peculiar portfolio of obsessions and techniques (his playful anti-naturalism, his disturbed disquisitions on sexuality, politics and religion, his loving interrogation of the appeal of pop music and pulp genres, his exemplification/analysis of misogyny) would get past our Noughties culture’s gatekeepers (which might be tolerant of representations of sex, but which are, in every other way, more censorious than those of the Seventies). As the Independent pointed out when it reappraised Potter in the light of the US film version of The Singing Detective, his influence is more likely to be felt on American than on British TV, in an expressionist drama such as Six Feet Under or even in the delirial departures from naturalism of something like Ally McBeal.
In any case, Potter did fall foul of Seventies sensibilities with Brimstone and Treacle. Filmed in March 1976, it was due for broadcast as a Play for Today in April, but was pulled at the last minute when the BBC authorities quailed at its “nauseating” qualities. It didn’t surface until over a decade later, when, in the wake of the success of The Singing Detective, the play was eventually shown in 1987. An inferior film version, starring Sting, was released in 1982.
Brimstone and Treacle features a young Michael Kitchen as the devil. In an echo of Potter’s earlier “visitation” plays, Kitchen’s character, Martin, inveigles himself into people’s lives and homes by cold reading them like a stage hypnotist.
Potter’s vision of evil is a million miles away from the white-catting portentousness or Pacino-like histrionics to which countless clichéd cinema renderings have accustomed us. Kitchen’s devil is impeccably polite, insufferably, cloyingly nice, sanctimoniously religiose. “Religiose” is a word Potter used with a particular contempt, carefully contrasting its pious pomposity with what he saw as the genuine religious sensibility.
The play opens with two epigraphs: the first from Kierkegaard’s Fear and Trembling: “there dwells infinitely more good in a demoniac than in a trivial person”, the second from Mary Poppins (“A spoonful of sugar helps the medicine go down”). For Kierkegaard, the most pressing danger for Christianity was not doubt, but the kind of bluff certainty peddled by pompous philosophers like Hegel. Kierkegaard’s Faith was indistinguishable from terrible anxiety. The paradox of Faith for Kierkegaard was that, if God completely revealed himself, Faith would be unnecessary. Faith is not a form of knowing; on the contrary. Kierkegaard’s models were Abraham on the day he was asked to sacrifice Isaac and Jesus’ disciples: tormented by uncertainty, unmoored from any of society’s ethical anchors, staking their life on fabulous improbabilities.
Martin is a perverse double of 1976’s most iconic of icons, Johnny Rotten, that demonic purge of trivia and mediocrity. If Rotten’s Nietzscheanism (“I yam an antichrist”) concealed a burning core of righteousness, Martin’s surface charm belies malevolence. At the limit, though, what both Rotten and Martin show is the deep complicity of “good” and “evil”, their mutual interdependence. Both Martin and Rotten are ultimately deliverers, destroyers of fragile status quos, bringers of disequilibrium and agents of chaos. Punk’s greatest disgust was with the trivial and the mediocre, with the existential death of boredom. The decadence would be cleansed by rage (cf the apoplectic Colin Blakely in Potter’s 1969 version of Christ’s life, Son of Man).
Brimstone and Treacle begins with Martin accosting Denholm Elliott’s Mr Bates in the street. Martin’s questioning quickly establishes that Bates has a daughter, suffering from apparently incurable neurological damage after being hit by a car two years previously. Posing as an unrequited admirer of the daughter, Pattie, Martin insinuates his way into the Bates’ home. The house is a suburban fortress incubating quiet desperation, nagging frustration and unspoken betrayals. You can almost smell the house, thick with the stench of unaired rooms, the pulped food with which Pattie is spoonfed — and despair. Martin’s incursion is greeted with initial suspicion and circumspection by Mr Bates, but welcomed by the easily beguiled Mrs Bates (Patricia Lawrence), eager to clutch at any potential escape route from the treadmill of drudgery in which she is confined. While Bates has given up any hope of Pattie recovering, his wife cherishes the seemingly impossible dream of a miraculous return to health.
Kitchen’s performance is magnificent, but it is Elliott who steals the show. He manages, incredibly, to make the obnoxious and unpleasant Bates, a neophyte National Front supporter, painfully sympathetic. The scene in which Bates regales his wife and Martin with a desperately unfunny Irish “joke” is excruciating. Elliott renders Bates’ typical expression as a grimace — of irritation, suppressed rage, bewilderment. It is the expression of a whole class’s, a whole generation’s, incredulity that the world no longer belongs to them, if it ever did. Bates’ political pathology is rooted in a bewildered and misconceived nostalgia, a bleary and inarticulate longing for the world to be like it used to be. He’s a bit like what the average Britpop fan would be twenty years later.
Potter is at his most politically acute here, in his exposing of the proximity of a respectable, “common-sense”, Daily Mail agenda to that of the far right. Potter locates Anglo-fascism’s Seventies heartland behind the politely manicured lawns and privet hedges of suburbia. Martin wins Bates over by agreeing with him that “we need to get rid of the blacks”. “It’s so good to have an intelligent conversation like this”, Bates enthuses, cracking open the scotch. However, Martin’s gleeful description of what will happen when “they won’t go” (“we’ll round them up, put them in camps”) makes Bates blanch. Mrs Bates is not so convinced. “You can be too nice you know.”
Brimstone and Treacle is disturbing, ethically opaque. It is troubling for reasons other than those of cultural or political conservatism. The denouement sees Martin’s raping of Pattie shocking her into an unexpected recovery (which itself prompts the play’s final shocking revelation, which I won’t give away for the sake of those who haven’t seen it yet). There is no easily digestible “message”. It’s a bitter pill rather than a spoonful of sugar.
she’s not my mother
“Interviewer: It’s hard to see this movie and not consider that all our memories are creations.
Cronenberg: But they are, they totally are.”
— Andrew O’Hehir, “The Baron of Blood does Bergman”[2]
“Watch from the wings as the scenes were replaying. We saw ourselves now as we never had seen.”
— Joy Division, “Decades”[3]
Cronenberg’s Spider — adapted from Patrick McGrath’s superb novel — is a study of schizophrenia that couldn’t be further removed from the clichéd image of “madness” in cinema. There are numerous examples of this, but the one that comes immediately to mind (perhaps because I watched it recently) is Windom Earle in the second season of Twin Peaks: gibbering, histrionic, megalomaniac. Think also of Nicholson’s Joker in the first Batman movie. Madness is here imaged as a kind of absurdly inflated ego; a self that knows no bounds, which wants to expand itself infinitely. As played by Ralph Fiennes in Cronenberg’s film, Spider, too, has a precarious sense of his own limits, but, far from wanting to spread further into the world, he seems to want to make himself disappear. Everything about him — his mumbling speech, shambling movements — screams withdrawal, retreat, terror of the outside. That’s because, as ever in Cronenberg’s schizoverse, the outside is already inside. And the reverse.
McGrath’s novel is set entirely within the head of its archetypal unreliable narrator, Spider, since it is written as a series of diary entries. To simulate this, Cronenberg could have gone with the strategy employed in the early versions of the script and used voiceover (although anyone who’s seen Spike Jonze’s Adaptation will remember Robert McKee’s rant about that particular technique). In the end, Cronenberg strips out Spider’s narrative voice altogether, with the result that the film is, in a strange way, truer to the novel than the novel itself. In the novel, Spider’s articulacy gives him a kind of self-awareness and (albeit limited) transcendence of his mania. In the film, there is no distance, no narrative voice, only a ceaselessly productive narrative machine, chattering out multiple permutations. In place of the transcendent offscreen voice, we are presented with Spider as a character in his own delirium, the adult version of himself observing and writing, always writing, as the memories of his childhood life play out. As Cronenberg has observed, it is almost as if Spider is directing his own memories. “One journalist said to me, ‘When we see Spider in his own memories, peeking in the windows or hiding in the corner, isn’t that like a director being on the set?’ I hadn’t thought of it that way, but he is redirecting and rechoreographing his memories.”[4] We are reminded that the dreamer is every character in his dream.
So Spider develops a naturalistic expressionism, or expressionist naturalism. Its strangely solitary London is, Cronenberg says, an expressionist London. Spider captures the boiled potatoes atmosphere of the pre-rock ‘n’ roll Fifties, its muted colours as washed out as cabbage water.
The film of Cronenberg’s which Spider most resembles is Naked Lunch; not only because it, too, is based upon a supposedly unadaptable book, but also because both films principally concern writing, insanity, masculinity and the death of a woman. In both Naked Lunch and Spider, the phantasmatically reiterated murder of a woman is the pivotal event, the lacuna around which the films circle. In Naked Lunch, Lee initially disavows the killing of his wife Joan by attributing it to the influence of Control. Lee is only able to accept minimal responsibility for the killing when he is “required”, at the end of the film, to assassinate Joan, or at least her double, again. The re-staging of the death is less an admission of ethical responsibility than an attempt to own it, to make sense of it. Such is the logic of trauma. (Reminding us of Ballard’s description of the motives of the schizo in The Atrocity Exhibition: “He wanted to kill Kennedy again, but this time in a way that made sense.”)
In Spider we are initially led to believe that Spider’s father, Bill Cleg (Horace in the novel), has killed Spider’s mother after embarking on an affair with the “fat tart” Yvonne (Hilda in the novel). No sooner has Bill brutally and casually murdered his wife, rolling her into a hastily dug grave in the earth of his allotment (“out with the old”, Yvonne callously cackles), than he moves Yvonne into his home. At this point, our suspicions that something is amiss with Spider’s narration begin to harden into a conviction. But it’s only at the end of the film that we learn what appears to have really happened: it is Spider himself who killed his mother, gassing her whilst apparently suffering from a delusion that she is another person. The early exchanges between Spider and his father take on a different significance (Spider: “She’s not my mother”. Bill: “Well, who is she then?”). The final scene sees Bill rescuing Spider from the house, and desperately trying to revive Yvonne, who, in death, has become, once again, the dark-haired Mrs Cleg.
While this seems to be the preferred interpretation, the film does not close down any of the narrative possibilities it has opened up. I think we can enumerate nine distinct narrative options that the film leaves open:
1. Bill killed his wife, and he really did co-habit with a prostitute called Yvonne.
2. Bill did kill his wife, there really is an Yvonne, but she never moved in with Spider’s father.
3. Bill killed his wife, but there is no such person as Yvonne.
4. Spider, not Bill, killed his mother, but Bill moved in with Yvonne after his wife’s death.
5. Spider killed his mother, there is a prostitute called Yvonne, but she never moved in with Spider’s father.
6. Spider killed his mother, and there is no such person as Yvonne.
7. Neither Spider nor Bill killed Mrs Cleg, but Bill moved in with Yvonne after his wife’s death.
8. Neither Spider nor Bill killed Mrs Cleg, there really is an Yvonne, but she never moved in with the Clegs.
9. Neither Spider nor Bill killed Mrs Cleg, and there is no such person as Yvonne.
Rather than resolving the ambiguities of McGrath’s novel, the film actually amplifies them. In the novel, we at least learn (it seems) that Spider has been incarcerated for killing his mother (even though he continues to maintain that it was his father who was responsible for the death). In the film, the twenty years between Mrs Cleg’s death and Spider’s arrival at the halfway house are a blank. We know, or think we know, by inference, that he has been in a psychiatric institution, but no more.
Miranda Richardson’s performance is crucial to the maintenance of the film’s polysemous ambiguity. She is superb in three different roles: as the virtuous brunette Mrs Cleg, the licentious blonde Yvonne and also as the suddenly and inappropriately sexually aggressive landlady of the halfway house, Mrs Wilkinson. The situation is complicated by the fact that Yvonne is played at first by another actress altogether (at least, I think that is the case; it is a tribute to the film’s queasy delirium and to Richardson’s performance that I’m just not sure), just as Mrs Wilkinson is played for most of the film by Lynn Redgrave.
As in Naked Lunch, writing is both passive and active. Like Bill Lee, Spider, scratching away in his notebook in his idiolectic hieroglyphics, seems at one level only to be recording signal from outside; at another level, he is the producer of the whole scene, its derealiser.
Talking about the film, Cronenberg has referred to Nabokov’s theory of memory and art as attempts to recover the unrecoverable. But the figure that dominates the film is another writer whom, like Nabokov, Brian McHale has referred to as a “limit-modernist”, Samuel Beckett. Cronenberg has said that Spider’s look, with its shock of spiky hair, was very much influenced by photographs of Beckett, but the affinity with Beckett goes much deeper. Like Molloy or Malone, Spider is continually fumbling in his pockets for talismanic objects. Such partial objects mark the routes on their “intensive voyages”. Like McGrath, Cronenberg seduces us into identification with Spider (Cronenberg: “I am Spider”), taking us with him on his schizo-stroll, then strands us in the delirium…
stand up, nigel barton
“I remember, I remember
The school where I was born;
I remember, I remember,
The school where I was… torn.”
— Dennis Potter, The Nigel Barton Plays[2]
“And nowadays what else does education and culture want! In our age of the people — I mean our uncouth age — ‘education’ and ‘culture’ must basically be the art of deception, to mislead about the origin of the inherited rabble in one’s body and soul.”
— Friedrich Nietzsche, Beyond Good and Evil[3]
Dennis Potter’s Stand Up, Nigel Barton, shown as part of BBC 4’s “Summer of the Sixties” season, is still almost too painful to watch.
Here is Potter writing a television play which draws very closely upon his experiences as a scholarship boy, projected out of his class into the rarefied world of Oxford. Stand Up, Nigel Barton was actually written after Vote, Vote, Vote for Nigel Barton, Potter’s fictionalised account of his failed attempt to become elected as a Labour MP. To Potter’s disgust, Vote, Vote, Vote was suppressed by the BBC, but its temporary banning allowed him to work again with the characters he had invented, writing this prequel which would be shown first.
English fiction has always been ambivalent about social mobility. Potter’s theme was very much one with which the Sixties would be preoccupied, in music as much as drama. Consider the Kinks (“Rosy, won’t you please come home”, “See my friends/they cross the river”) or the Who (“I was born with a plastic spoon in my mouth”). Like Dickens’ Pip, Nigel is profoundly torn; unwilling to give up the privilege and status he has newly acquired, unable to accept and enjoy them as one to the manner born, simultaneously holding onto his roots and repudiating them, never forgetting where he has come from, but ashamed of the stains that his origins have left upon him. And ashamed of that shame. Never comfortable amongst the masters, but no longer at home in the community which produced him.
Forty years on, and the screen still crackles with rage, confusion and embarrassment. Potter intercuts between the working men’s club, the bedrock of the proletarian community, with its “suffocating affection” but deep suspicion, resentment and distrust of those who leave; and the smug redoubt of the Oxford Union, whose louche members idly trade bon mots (“Oxford”, as Nigel observes in his somewhat too histrionic style, “where nothing really matters”, where a dissolute, ironic detachment is the mark of a gentleman, and where Nigel’s very passion marks him out as not quite right).
Who can watch the final scene — Nigel at home with his parents, watching himself being interviewed on television about class — without cringing? What Nigel says about his father “watching him like a hawk”, about “walking a tightrope”, about class only being experienced by those who move between classes; none of this is a distortion. And yet, Nigel is too much in love with his own cleverness, too much attached to the role of alienated working-class boy that he has been invited to play. He knows he has betrayed his parents. His father, ambivalent about him at the best of times, both proud and resentful, simmers; his mother, uncomprehending, weeps, “But it’s clean. You could eat off the floor here…”
Potter shows that he can do naturalism painfully and powerfully. But he’s already exploring more expressionistic techniques: playing with chronology, breaking the frame (adults playing children, characters speaking directly to camera). The origin of the famous classroom scene in The Singing Detective is here, with Janet Henfrey taking on the role of the terrifyingly inquisitorial, witch-like schoolmistress she will reprise in the later play. The performances, especially Keith Barron as Nigel and Jack Woolgar as his father, are universally superb.
No need to reiterate, by now, my lament for TV drama this challenging, this near-the-knuckle, this relevant. But what a nihilistic message Potter conveys. There is nothing to aspire to, nothing you’d want to return to. Nigel trapped and alone, forever alone…
With Stand Up, Nigel Barton I knew that in small family groupings — that is, at their most vulnerable — both coalminers and Oxford dons would probably see the play. This could add enormously to the potency of a story which attempted to use the specially English embarrassment about class in a deliberately embarrassing series of confrontations. In the theatre — or, at least, in the West End — the audience would have been largely only on one side of this particular fence. There is no other medium which could virtually guarantee an audience of millions with a full quota of manual workers and stockbrokers for a “serious” play about class.[4]
portmeirion: an ideal for living
“As the bourgeoisie laboured to produce the economic as a separate domain, partitioned off from its intimate and manifold interconnectedness with the festive calendar, so they laboured conceptually to reform the fair as either a rational, commercial trading event or as a popular pleasureground. As the latter, the fair had from classical times been subject to regulation and suppression on both political and moral grounds. But although the bourgeois classes were frequently frightened by the threat of political subversion and moral licence, they were perhaps more scandalised by the deep conceptual confusion entailed by the fair’s inmixing of work and pleasure, trade and play. In so far as the fair was purely a site of pleasure, it could be envisaged as a discrete entity: local, festive, communal, unconnected to the ‘real’ world. In so far as it was purely a commercial event it could be envisaged as a practical agency in the progress of capital, an instrument of modernisation and a means of connecting up local and communal ‘markets’ to the world market.”
— Peter Stallybrass and Allon White, “The Fair, the Pig, Authorship”[2]
If you know about Portmeirion, it’s almost certainly because of The Prisoner, justly recognised as one of the most innovative television series ever produced (more on which presently). Our tendency is to think of Portmeirion, built by gentleman-philanthropist Sir Clough Williams-Ellis on his private peninsula near Porthmadog, as a quaintly attractive divertissement; an example of charming English eccentricity that has somehow fetched up in Wales. The subtext we don’t even need to articulate to ourselves (so we think) is that all of these — attractiveness, eccentricity, charm — are harmless, which is to say, pleasant but ultimately irrelevant. The idea that they could have political-economic significance; that’s more absurd than Ellis’ absurdist architecture, surely?
It’s fitting that I should have encountered both Ellis’ village and Llandudno’s homage to Lewis Carroll in the same week, in Wales, since both belong to an ex-centric Britishness that is at least as important as Magritte’s Belgian Surrealism. Remember that André Breton thought that the British — with Edward Lear, Lewis Carroll and their ludic ilk — had little need of Surrealism, since they were already Surrealist… But Artaud, who could hardly have been accused of being over-conscious, was an admirer of Carroll; as were the Situationists, who recognised that there was something utterly serious about English Nonsense. As did Deleuze, of course, who produced what is one of the strangest landmarks in Psychedelic Reason, The Logic of Sense, as a rigorous philosophical exposition of Carroll’s Nonsense. (One of its most inciting sections is an account of Artaud’s translation of “Jabberwocky”.)
But it’s worth pausing and thinking a little more about the Situationists. It’s disastrous that the Situationist insistence upon the ludic has degenerated into a smugonautic celebration of bourgeois circus trickery (jugglers and unicyclists as the shock troops of the revolution against Corporate Kapital). You have to reread Ivan Chtcheglov’s astonishing “Formulary for a New Urbanism”[3] — written in the year of our current Queen’s coronation, 1953 — to be reminded of the force of the Situationist critique. How could architecture — i.e. the places in which we live — not be an intensely political matter? And why should we live in boring, utilitarian spaces when we could live in grottoes and crooked caverns? “A mental disease has swept the planet: banalisation. Everyone is hypnotised by production and conveniences…”
Like punk, Surrealism is dead as soon as it is reduced to an aesthetic style. It comes unlive again when it is instantiated as a delirial program (just as punk comes unlive when it is effectuated as an anti-authoritarian, acephalic contagion-network). Chtcheglov resists the aestheticisation of Surrealism, and treats De Chirico’s paintings, for instance, not as particular aesthetic contrivances, but as architectural blueprints, ideals for living. Let’s not look at a De Chirico painting — let’s live in one. Chtcheglov’s call was astonishingly pre-empted by Clough Williams-Ellis’ building of Portmeirion. Ellis described himself as follows:
He almost certainly has a weakness for splendour and display and believes that even if he were reduced to penury himself he would still hope to be cheered by the sight of uninhibited lavishness & splendour unconfined somewhere which is why he feels that Copenhagen’s Tivoli Gardens or something like them should be spread around the civilised world giving everyone a taste of lavishness, gaiety and cultivated design.[4]
Ellis recognised, that is to say, that the production of the aesthetic as a category separate from the “necessary” (i.e. the utile, in the Bataille restricted economy sense) was complicit in a kind of (from any rational POV) inexplicable diminution of the possibilities of human experience. Why must architecture be part of a banalising culture of vampiric undeath? Why should only the privileged be able to enjoy their surroundings? Why should the poor be penned into miserable concrete blocks?
Ellis referred to beauty as a “strange necessity”, cutting through the binary of needs = biological and aesthetic = cultural luxury. Bodies deprived of attractive surroundings were as likely to be as depressed — or to use the superbly multivalent Rasta term, downpressed — as those deprived of anything they more obviously “needed”.
According to the Portmeirion website,[5] Ellis sought, in the building of Portmeirion, to demonstrate that it was possible to develop sites of natural beauty without destroying them:
A tireless campaigner for the environment, Clough was a founder member of both the Council for the Protection of Rural England in 1926 and the Campaign for the Protection of Rural Wales in 1928 (and of which he was president for twenty years). He was an advocate of rural preservation, amenity planning, industrial design and colourful architecture.
The fact that The Prisoner was filmed here, then, is in no sense an accident. In addition to its Foucauldian analyses of power (“you are Number 1”), its — in every good sense — existentialism, its PKD-like psychedelic dismantling of identity, The Prisoner was a withering account of the English class system. McGoohan, auteur-actor, was given an artistic licence by the then head of ITV (yes, remember, The Prisoner appeared on ITV — I know it beggars belief now), Lew Grade — both were outsiders (McGoohan an American-born Irishman, Grade a Jew) who had penetrated into the genteel brutality of the English Core’s gentlemen’s club. However irascible they sometimes became, the series of Number 2s typically had that impermeable urbane assurance so infuriatingly characteristic of the English Core Master Class. Power expressed itself not in crude force — whenever that was used (cf the episode “Hammer into Anvil”) you knew that they had in every sense lost it — but with the quiet, insinuating menace lurking behind an inscrutable politesse. “Cup of tea, Number 6?”
The village had all the quaint charm of politely ritualised Englishness ambivalently celebrated by the Kinks in their Village Green Preservation Society (which came out contemporaneously with The Prisoner). And of course McGoohan’s genius lay in exposing the acidic undertaste of phrases like “be seeing you” and “feel free”.
The Prisoner is the heir of both Kafka and Carroll — and part of its importance consists in its revelation of the shared sensibility. Kafka’s observations of the banalising terror of the decaying Hapsburg bureaucracy as it moved towards Weberian impersonality owe much to Carroll. K’s Trial, after all, has no more sense than the trial at the end of Alice’s Adventures in Wonderland. Like Alice, K often comes across as a lucid child — for only a child can be lucid in Carroll and Kafka’s world — observing the senseless and arbitrary cruelty of adult caprice, whose only alibi is precedent. “Things have always been done that way. Don’t you know? How stupid are you?”
It is their restoration of the child’s reason in the face of adult intransigent baboonery that makes Kafka, Carroll and The Prisoner punk. Until it is socialised — i.e. stupefied into mute acceptance of the irrational caprice of the socius — the child knows that authority is nothing unless it can be defended via reason.
The Prisoner, like Williams-Ellis, like the Situationists and the Surrealists, dreamed a dream deemed to be impossible, conceiving of a social system in which play and reason combine in an exploration of Intensive Now.
golgothic materialism
I finally saw The Passion of the Christ this week. I watched it at work with the A-level Religious Studies students. They, like me, were moved to tears and beyond. (Tip for any teacher out there: show the film at nine in the morning, that’ll wake up any students still yawning their way into the day.)
Whilst agreeing with much of what Žižek says about Gibson’s film in his brilliant essay “Passion in the Era of Decaffeinated Belief”,[2] I think that he doesn’t go nearly far enough.
Žižek is right to challenge the smug and lazy culturalist consensus that religious conviction is inherently pathological and dangerous. But he is wrong to suggest that what is most important about Passion is belief. Gibson’s Gnostic vision — which is simply Christ’s ethical Example rescued from the institutionalised religion that has systematically distorted it in his name — makes the two traditional supports of religious belief irrelevant. Astonishingly, The Passion of the Christ demonstrates that neither Revelation nor Tradition are important for those seeking to become-Christ(ian). What matters is not so much whether the events described in the film really happened — and there is no reason to doubt that something resembling them did — but the life-practice which the Christ story narrates.
Life as parable.
Let’s dismiss first of all the idea that the film is anti-semitic. Certainly, the first half of the film threatens to invite this interpretation. In the run-up to Jesus’ arrest, the film appears to depict the Jewish religious authorities as near-subhuman monsters, while the Roman imperial powers are viewed sympathetically, as benign and puzzled observers of a distasteful local conflict amongst the people they have colonised. (In this respect, Gibson appears to buy into the anti-Jewish narrative retrospectively imposed by the Roman Catholic Church once it had come to its concordat with the Roman Empire and was keen to excuse its new Masters of any responsibility for the crucifixion.)
But once the notorious beating scene happens, the film goes through an intensive threshold. Here, the Roman soldiers are seen to be gratuitously cruel psychopaths, whose excessive zeal in punishing Jesus exceeds any “duty”. It is clear by now that The Passion of the Christ has no ethnic axe to grind: it is about the stupidity and cruelty of the human species, but more importantly, about an escape route from the otherwise meaningless and nihilistic cycle of abuse begetting abuse that is human History.
The Gnostic flashes that surface in the Gospels are given full weight in Gibson’s film. “My kingdom is not of this world.” But Gibson refuses to give any comfort to those life-deniers and body-haters that Nietzsche rightly excoriates in his many attacks on Christianity. There is little supernatural or transcendent dimension to The Passion’s vision. If Christ’s kingdom is not of this world, Gibson gives us few reasons to assume that this kingdom will be the Platonic heaven of which those tired of the body dream.
The World which Christ rejects is the World of Lies, the consensual hallucination of established power and authority. By contrast, Christ’s kingdom only subsists whenever there is an Affectionate Collectivity. In other words, it exists not as some deferred supernatural reward, but in the Ethical actions of those who, in becoming-Christ, keep his spirit alive. Again, it is important to stress that this spirit is not some metaphysical substance, but a strictly material abstract machine that can be instantiated only through actions and practices. Loving God and loving others more than yourself are preconditions for dissolving your ego and gaining deliverance from the Hell of Self.
What, from one perspective, is the utter humiliation and degradation of Jesus’ body is on the other a coldly ruthless vision of the body liberated from the “wisdom and limits of the organism”.
Masochristianity.
Christ’s Example is simply this: it is better to die than to pass on abuse virus or to in any way vindicate the idiot vacuity and stupidity of the World of authority.
Power depends upon the weakness of the organism. When authority is seriously challenged, when its tolerance is tested to the limit, it has the ultimate recourse of torture. The slow, graphic scenes of mindless physical degradation in The Passion of the Christ are necessary for revealing the horrors to which Jesus’ organism was subject. It is made clear that he could have escaped the excruciating agony simply by renouncing his Truth and by assenting to the Authority of the World. Christ’s Example insists: better to let the organism be tortured to death (“If thine own eye offend thee, pluck it out”) than to bow, bent-headed, to Authority.
This is what is perhaps most astonishing about Gibson’s film. Far from being a statement of Catholic bigotry, it can only be read as an anti-authoritarian AND THEREFORE anti-Catholic film. For the Pharisees of two millennia ago, puffed up in their absurd finery, substitute the child-abuser apologists of today’s gilt-laden, guilt-ridden Vatican. Against all the odds, against two thousand years of cover-ups and dissimulation, The Passion of the Christ recovers the original Christ, the anti-Worldly but not otherworldly Christ of Liberation Theology: the Gnostic herald of Apocalypse Now.
this movie doesn’t move me
As I nervously anticipate the new Doctor Who (although after McCoy, after McGann, what more can there be to fear?), it is worth thinking again about the appeal of the series, and also, more generally, about the unique importance of what I will call “uncanny fiction”.
A piece by Rachel Cooke in the Observer two weeks ago brought these questions into sharp relief.[2] Cooke’s article was more than an account of a television series; it was a story about the way broadcasting, family and the uncanny were webbed together through Doctor Who. Cooke writes powerfully about how her family’s watching of the programme was literally ritualised: she had to be on the sofa, hair washed, before the continuity announcer even said the words, “And now…” She understands that, at its best, Dr Who’s appeal consisted in the charge of the uncanny — the strangely familiar, the familiar estranged: cybermen on the steps of St Paul’s, yeti at Goodge Street (a place whose name will forever be associated with the Troughton adventure, “The Web of Fear”, for Scanshifts,[3] who saw it whilst living in New Zealand).
Inevitably, however, she ends the piece on a melancholy note. Cooke has been to a screening of the first episode of the new series. She enjoys its expensive production values, its “sinister moments”, its use of the Millennium Wheel. “But it is not — how shall I put this? — Doctor Who.” Faced with an “overwhelming sense of loss”, she turns to a DVD of the Baker story Robots of Death for a taste of the “real” stuff, the authentic experience that the new series cannot provide. But this proves, if anything, to be even more of a disappointment. “How slow the whole thing seems, and how silly the robots look in their Camilla Parker-Bowles-style green quilted jackets… Good grief.”
Let’s leave aside, for a moment, all the post-post-structuralist questions about the ontological status of the text “itself”, and consider the glum anecdote with which the article concludes:
Before Christmas, when it became clear that my father’s cancer was in its final stages, my brother went out and bought a DVD for us all to watch together. Dad was too ill, and the box went unopened. At the time, I cried about this; yet another injustice. Now I know better. Some things in life can’t ever be retrieved — an enjoyment of green robots in sequins and pedal pushers being one of them.
This narrative of disillusionment belongs to a genre that has become familiar: the postmodern parable. To look at the old Doctor Who is not only to fail to recover a lost moment; it is to discover, with a deflating quotidian horror, that this moment never existed in the first place. An experience of awe and wonder dissolves into a pile of dressing up clothes and cheap special effects. The postmodernist is then left with two options: disavowal of the enthusiasm, i.e. what is called “growing up”, or else keeping faith with it, i.e. what is called “not growing up”. Two fates, therefore, await the no longer media-mesmerised child: depressive realism or geek fanaticism.
The intensity (with) which Cooke invested in Doctor Who is typical of so many of us who grew up in the Sixties and Seventies. I, slightly younger than her, remember a time when those twenty-five minutes were indeed the most sacralised of the week. Scanshifts, slightly older than me, remembers a period when he didn’t have a functioning television at home, so he would watch the new episode furtively at a department store in Christchurch, silently at first, until, delighted, he found the means of increasing the volume.
The most obvious explanation for such fervour — childhood enthusiasm and naïveté — can also be supplemented by thinking of the specific technological and cultural conditions that obtained then. Freud’s analysis of the unheimlich, the “unhomely”, is very well known, but it is worth linking his account of the uncanniness of the domestic to television. Television was itself both familiar and alien, and a series which was about the alien in the familiar was bound to have a particularly easy route to the child’s unconscious. In a time of cultural rationing, of modernist broadcasting, a time, that is, in which there were no endless reruns, no VCRs, the programmes had a precious evanescence. They were translated into memory and dream at the very moment they were being seen for the first time. This is quite different from the instant — and increasingly pre-emptive — monumentalisation of postmodern media productions through “makings of” documentaries and interviews. So many of these productions enjoy the odd fate of being stillborn into perfect archivisation, forgotten by the culture while immaculately memorialised by the technology.
But were the conditions for Dr Who’s colonising presence in the unconscious of a generation merely scarcity and the “innocence” of a “less sophisticated” time? Does its magic, as Cooke implies, crumble like a vampire seducer in bright sunlight when exposed to the unbeguiled, unforgiving eyes of the adult?
According to Freud’s famous arguments in Totem and Taboo and The Uncanny, we moderns recapitulate in our individual psychological development the “progress” from narcissistic animism to the reality principle undergone by the species as a whole. Children, like “savages”, remain at the level of narcissistic auto-eroticism, subject to the animistic delusion that their thoughts are “omnipotent”; that what they think can directly affect the world.
But is it the case that children ever “really believed” in Doctor Who? Žižek has pointed out that when people from “primitive” societies are asked about their myths, their response is actually indirect. They say “some people believe”. Belief is always the belief of the other. In any case, what adults and moderns have lost is not the capacity to uncritically believe, but the art of using the series as triggers for producing inhabitable fictional playzones.
The model for such practices is the Perky Pat layouts in Philip K. Dick’s The Three Stigmata of Palmer Eldritch. Homesick off-world colonists are able to project themselves into Ken and Barbie-like dolls who inhabit a mockup of the earthly environment. But in order to occupy this set they need a drug. In effect, all the drug does is restore in the adult what comes easily to a child: the ability not to believe, but to act in spite of the lack of belief.
In a sense, though, to say this is already going too far. It implies that adults really have given up a narcissistic fantasy and adjusted to the harsh banality of the disenchanted-empirical. In fact, all they have done is substituted one fantasy for another. The point is that to be an adult in consumer capitalism IS to occupy the Perky Pat world of drably bright soap opera domesticity. What is eliminated in the mediocre melodrama we are invited to call adult reality is not fantasy, but the uncanny — the sense that all is not as it seems, that the kitchen-sink everyday is a front for the machinations of parasites and alien forces which either possess, control or have designs upon us. In other words, the suppressed wisdom of uncanny fiction is that it is THIS world, the world of liberal-capitalist commonsense, that is a stage set with wobbly walls. As Scanshifts and I hope to demonstrate in our upcoming audiomentary London Under London on Resonance FM, the Real of the London Underground is better described by pulp and modernism (which in any case have a suitably uncanny complicity) than by postmodern drearealism. Everyone knows that, once the wafer-thin veneer of “persons” is stripped away, the population on the Tube are zombies under the control of sinister extra-terrestrial corporations.
The rise of fantasy as a genre over the last twenty-five years can be directly correlated with the collapse of any effective alternative reality structure outside capitalism in the same period. Watching something like Star Wars, you immediately think two things. Its fictional world is BOTH impossibly remote, too far-distant to care about, AND too much like this world, too similar to our own to be fascinated by. If the uncanny is about an irreducible anomalousness in anything that comes to count as the familiar, then fantasy is about the production of a seamless world in which all the gaps have been mono-filled. It is no accident that the rise of fantasy has gone alongside the development of digital FX. The curious hollowness and depthlessness of CGI arises not from any failure of fidelity, but, quite the opposite, from its photoshopping out of the Discrepant as such.
The fantasy structure of Family, Nation and Heroism thus functions, not in any sense as a representation, false or otherwise, but as a model to live up to. The inevitable failure of our own lives to match up to the digital Ideal is one of the motors of capitalism’s worker-consumer passivity, the docile pursuit of what will always be elusive, a world free of fissures and discontinuities. And you only have to read one of Mark Steyn’s preppy phallic fables (which need to be ranked alongside the mummy’s-boy stories of someone like Robert E. Howard) to see how fantasy’s pathetically imbecilic Manichean oppositions between Good and Evil, Us and (a foreign, contagious) Them are effective on the largest possible geopolitical stage.
fear and misery in the third reich ‘n’ roll
I (belatedly) went to see the traumatically powerful Downfall a couple of nights ago at the behest of Karl Kraft. Overhype of mediocre tat renders one suspicious of any praise surrounding contemporary films, but this is a genuine masterpiece, and one that can only be appreciated fully in the cinema environment, where the relentless pummelling of the Soviet artillery and the claustrophobic airlessness of the Hitler bunker have a crushingly visceral presence.
Downfall, actually, is the second film this year (the first was The Aviator) to flout my otherwise reliable dictum that movies based on real life are to be avoided. But the reason why both work is that they describe situations in which reality had itself gone psychotic. As Ballard has observed, the Nazi delirium was one of those moments when the distinction between the internal and the external world no longer held: hell has erupted on earth, there is no escape, no future, and you know it…
Downfall is fascinating because it closely and, I’m assuming, meticulously documents the “line of abolition” that Deleuze and Guattari claim is constitutive of Nazism. For Deleuze and Guattari, who borrow the idea from Virilio, the Nazis’ scheduled auto-annihilation — “if we are defeated, better that the nation should perish” — was less a forced contingency than the realisation, the very consummation, of the Nazi project.[2] Deleuze and Guattari’s account might be dubious empirically, but the great service it provides for cultural analysis may not be the idea that Nazism is suicidal, but the thought that the suicidal, the self-destructive is Nazi.
Since at least the death of Chatterton, popular culture has found the temptation to glamourise self-destruction irresistible. The Nazis provide the definitive twentieth-century version of this age-old Romance of Death. As Ballard noted in his essay on Hitler, “Alphabets of Unreason”, the Nazis are a creepily modern phenomenon, their technicolour glamour a world away from the fussy frock-coated figures of the Edwardian British ruling elite. The Nazis’ facility with broadcasting laid the groundwork for the media landscape we now occupy. Hitler as the first rock star?
Downfall takes us through the scenes in which the Nazi party disintegrates only for the Third Reich ‘n’ Roll to begin. The death of the frontman is the blood-sacrificial rite that will guarantee a hideous immortality. Hitler was the first twentieth-century figure to pass from historical individuality to becoming a permanent archetype-artefact in the McLuhan-Ballard media unconscious. After him, Kennedy, Malcolm X, King, Morrison, Hendrix, Curtis seem local, particular, whereas Hitler comes to stand for a general principle, for modern Evil itself.
As spectators of Downfall, we spend most of the time in the Führer Bunker, forced into an unsettling sympathy if not for the Reich’s leaders then for those who were loyal to them, the secretaries and functionaries who admired, by no means fanatically, Hitler and National Socialism. Meanwhile, the glimpses we have of the Berlin above show a landscape out of The Triumph of Death, a city devolving into total anomie: child conscripts, vigilante hangings, intoxicated revelling, carnivalesque sexual excess.
While those scenes play out, you can almost hear Johnny Rotten leering, “when there’s no future how can there be sin?” (Although for Germany, in fact, there was nothing but the future: immediate postwar Germany was subject to a willed amnesia, a disavowal of cultural memory.) It’s no accident that post-punk in many ways begins here. As the Pistols pursue their own line of abolition into the scorched earth nihilism of “Belsen was a Gas” and “Holidays in the Sun”, they keep returning to the barbed-wire scarred Boschscape of Nazi Berlin and the Pynchon Zone it became after the war. Siouxsie famously sported a swastika for a while, and although much of the flaunting of the Nazi imagery was supposedly for superficial shock effects, the punk-Nazi connection was about much more than trite transgressivism. Punk’s very 1970s, very British fixation on Nazism posed ethical questions so troubling they could barely be articulated explicitly: what were the limits of liberal tolerance? Could Britain be so sure that it had differentiated itself from Nazism (a particularly pressing issue at a time that the NF was gathering an unprecedented degree of support)? And, most unsettling of all, what is it that separates Nazi Evil from heroic Good?
Downfall poses that last question with a real force, and it is a question that has a special resonance at the moment given Žižek and Zupančič’s theory of the ungrounded Act as the very definition of the ethical. As I watched the most “monstrous” act depicted in the film, Frau Goebbels’ drugging and then poisoning her children — better this “redemption”, she reasoned, than that they be left in a world without National Socialism — I was struck by the parallel with Sethe in Toni Morrison’s Beloved, who kills one of her children rather than let it fall into the hands of the slavers. What is to separate Frau Goebbels’ act of abominable Evil from Sethe’s act of heroic Good? (Those who have read The Fragile Absolute will remember that Žižek uses Sethe precisely as an example of a Good entirely alien to liberal morality, with its ethic of enlightened self-interest.)
Downfall seems to invite us to sympathise with the “liberal Nazis”, the “reasonable” doctor, for instance, who wants to keep the medical services running and is disgusted and aghast at the “senseless, suicidal” behaviour that results from seeing Duty through to the end; the General who wants to end the war to protect the lives of civilians. But these “pragmatic humanitarian” figures are the least defensible because they are not prepared to follow the principles of their actions to the end (if they were committed to Nazism, why not die for it? If they weren’t, why not resist it?). Strangely, the film almost seems to suggest that what was irredeemably malevolent about the Nazis was their will to die for the cause.
In spite of ourselves, we find ourselves thinking that the Evil Nazis — those who totally identify with the Nazi project and who destroy themselves when it is clear that it has failed — attain a certain tragic heroism by refusing to give up on their fundamental commitment. All of which leads us back to the old question: does the Kantian emphasis on unconditional duty legitimate Nazi Evil?
Zupančič, who has done so much to re-discover Kantian ethics from the perspective of Lacanian theory, addressed this question in her interview with Cabinet magazine:
Recall that, in Hannah Arendt’s famous example, Nazi functionaries like Eichmann took themselves to be Kantians in this respect: They claimed to act simply on principle without any consideration for the empirical consequences of their actions. In what way is this a perversion of Kant?
This attitude is “perverse” in the strictest clinical meaning of the word: The subject has here assumed the role of a mere instrument of the Will of the Other. In relation to Kant, I would simply stress the following point, which has already been made by Slavoj Žižek: In Kantian ethics, we are responsible for what we refer to as our duty. The moral law is not something that could clear us of all responsibility for our actions; on the contrary, it makes us responsible not only for our actions, but also — and foremost — for the principles that we act upon.[3]
Is this enough though to distinguish Goebbels from Sethe? Was it really the case that Frau Goebbels was making herself into “a mere instrument of the Will of the Other”? Or had she freely chosen to assume responsibility for her actions and for the principles on which she acted? Remember that Kantian freedom consists in choosing to obey the moral law. To be motivated by anything other than “duty” is to be driven by “pathological” passions, and hence not to be free at all. There is no obvious pathological motivation for Frau Goebbels’ actions. She stood to gain nothing from this act of “destroying what is best in her” (and indeed, shortly after she killed her children, she consented to be shot by her husband).
The only answer you are left with is that the Nazi Cause is itself a pathology. By definition, the Nazi Act cannot be universal, since it is based upon preserving — if only, at the end, at the level of myth — the particular pathological characteristics of “a chosen people” and, more abstractly therefore, of defending the very principle of “ethnic pathology”. Sethe’s abominable act in Beloved is an act of Unplugging from a social situation fatally, totally corrupted by a lethally imbecilic racial delirium; Frau Goebbels’ multiple infanticide, by contrast, is an attempt to hardwire herself and her children into an ethnocidal madness that can only live through their deaths and the deaths of millions of others.
we want it all
What use might Nietzsche be today? Or, to put it another way: which Nietzsche might be of use, now?
It will come as no surprise that I would count Nietzsche the perspectivist — he who questioned not only the possibility but the value of Truth — as the enemy. There will be even fewer surprises that I would reject the Dionysian Nietzsche, the celebrant of transgressive desire. This Nietzsche, in any case, is largely a post-Bataillean retrospective construct (even in The Birth of Tragedy, what Nietzsche mourns is the lost tension between Dionysus and Apollo; and in his later writings Nietzsche is more likely to be found extolling the necessity of constraints and limitations than he is to be heard calling for the unrestrained venting of libido). The perspectival and the Dionysiac are far too timely.
The Nietzsche that remains untimely — and by that I do not mean outmoded, very far from it — is Nietzsche the aristocrat. Nietzsche should not be taken seriously as a political theorist, at least not at the level of his positive prescriptions. But the Nietzsche who denounces the insipidity and mediocrity that result from democracy’s levelling impulses could not be more acute. Passage after passage of polemic in Beyond Good and Evil seems uncannily apposite in these times of focus-grouped blandness and “autonomous herding”. Nietzsche’s real interests lay with cultural politics; government and social institutions troubled him only insofar as they produced cultural effects, his ultimate question being: “What are the conditions in which great cultural artefacts can emerge?”
I was reminded of Nietzsche’s warnings about what would happen to culture if all “special claims and privileges” are denied, if the very concept of superiority is abolished, when Chantelle Houghton won Celebrity Big Brother a week or so ago (it already seems much longer than that). I was reminded, too, of Nietzsche’s scalding admonition that “harshness” and “cruelty” must be cultivated if the human animal is to be transformed, by hammer blows and force of will, into a great work of art; reminded, especially, when some posters on Dissensus were seriously advancing “niceness” — niceness, that is — as a desirable trait.
Chantelle’s victory wasn’t just a popularity contest: as Marcello’s excellent Big Brother piece observed, a principle was at stake, the principle that ordinariness must trump any notion of superiority:
“You are not going to win support or respect by placing yourself out of the ordinary. You need to be approachable but you also need to be yourself. That’s what young people respect.” That’s a recent quote from one Alex Folkes, the speaker for a pressure group named Votes at Sixteen, apropos George Galloway, and it’s the kind of exhausting, fatuous antiphilosophy which tempts me to form a pressure group called Votes at Thirty. Nevertheless it is (un)pretty fitting for an age bereft of desire for godhood. Where once we assembled in front of screens or stages to gasp in awe at people doing and achieving things we could never hope of doing or achieving ourselves — but how we luxuriated, carried ourselves afloat, on the dream of doing so — now all we require is a humbling mirror. This is the sort of thing which stops dangerous people from gaining power, but also the kind of closure which would ultimately forbid all art.[2]
This is Celebreality: the simultaneous desublimation of the Star and the elevation of “the ordinary”. The commentary on Celebrity Big Brother treated it as self-evident that people will want to “identify with” media figures who offer a comforting and unchallenging reflection of themselves at their most mediocre, stupid and harmless. Julie Burchill’s endlessly reiterated polemic in favour of Big Brother — that it allows working-class people opportunities to break into a media otherwise dominated by the privileged — is baseless for three reasons. First, because the real beneficiaries of Big Brother are not the contestants, whose “career” is notoriously short-lived, but Endemol, with its coterie of smug graduate producers. Second, because Big Brother trades in a patronising and reductive image of the working class: the dominion of Celebreality relies upon the mediocrats inducing the working class to correspond to — and “identify with” — that image. Third, because Big Brother and reality TV have effaced those areas of popular culture in which a working class that aspired to more than “wealth” or “fame” once excelled. Its rise has meant a defeat for that over-reaching proletarian drive to be more (I am nothing but should be everything), a drive which negated Social Facts by inventing Sonic Fictions, which despised “ordinariness” in the name of the strange and the alien. On Celebrity Big Brother, Pete Burns, with his casual cruelties, his savage articulacy and his Masoch-furs, was a cartoon symbol of those lost ambitions, skulking and sulking at the periphery, a glam prince in an age of post-Blairite roundheads.
We all know that the “reality” of reality TV is an artful construction, an effect not only of editing but of a Lorenzian rat-in-a-mirrored-labyrinth artificial environment which attenuates psychology into a series of territorial twitches. The “reality” that is designated is significant more for what is absent from it than for any positive properties it is deemed to possess. And what is absent, above all, is fantasy. Or rather, fantasy objects.
We once turned to popular culture because it produced fantasy objects; now, we are asked to “identify with” the fantasising subject itself. It was entirely appropriate that, the week after Chantelle won Celebrity Big Brother, Smash Hits should have announced its imminent closure.
Smash Hits began just as the glam continuum was winding down. What Smash Hits took from punk was its least Nietzschean affect, namely its “irreverence”. In the case of Smash Hits, this amounted to a compulsory trivialisation coupled with a kind of good-humoured debunking of the pretensions of Stardom. Behind Smash Hits’ silly surrealism was good solid commonsense and a conflicted desire to both have your idols and kill them. Heat was Smash Hits’ successor and what rendered it obsolete. No need to bother with the (pop) pretext: now you can consume celebrity directly, untroubled by pop’s embarrassing Dreams. Chantelle is the logical conclusion of the process: the anti-Pop anti-Idol.
Nietzsche’s contention was that the kind of levelling Chantelle stands for was the inevitable and necessary consequence of all egalitarianism. Yet popular culture was once the arena which demonstrated that any genuine egalitarianism is inimical to any such levelling down. I wrote last year of goth as “a paradoxically egalitarian aristocracy in which membership [is] not guaranteed by birth or beauty but by self-decoration”; will popular culture ever again teach us that egalitarianism is not hostile to, but relies upon, a will-to-greatness, an unconditional demand for the excellent?
gothic oedipus: subjectivity and capitalism in christopher nolan’s batman begins
Batman has contributed more than its fair share to the “darkness” that hangs over contemporary culture like a picturesque pall. “Dark” designates both a highly marketable aesthetic style and an ethical, or rather anti-ethical, stance, a kind of designer nihilism whose chief theoretical proposition is the denial of the possibility of the Good. Gotham, particularly as re-invented by Frank Miller in the Eighties, is, along with Gibson’s Sprawl and Ridley Scott’s LA, one of the chief geomythic sources of this trend.[2]
Miller’s legacy for comics has been ambivalent at best. Reflect on the fact that his rise coincides with the almost total failure of superhero comics to produce any new characters with mythic resonance.[3] The “maturity” for which Miller has been celebrated corresponds with comics’ depressive and introspective adolescence, and for him, as for all adolescents, the worst sin is exuberance. Hence his trademark style is deflationary, taciturn: consider all those portentous pages stripped of dialogue in which barely anything happens and contrast them with the crazed effervescence of the typical Marvel page in the Sixties. Miller’s pages have all the brooding silence of a moody fifteen-year-old boy. We are left in no doubt: the silence signifies.
Miller traded on a disingenuous male adolescent desire to both have comics and to feel superior to them. But his demythologisation, inevitably, produced only a new mythology, one that posed as more sophisticated than the one it displaced but is in fact an utterly predictable world of “moral ambivalence” in which “there are only shades of grey”. There are reasons for being highly sceptical about Miller’s bringing into comics a noir-lite cartoon nihilist bleakness that has long been a cliché in films and books. The “darkness” of this vision is in fact curiously reassuring and comforting, and not only because of the sentimentality it can never extirpate. (Miller’s “hard-bitten” world reminds me not so much of noir as of the simulation of noir in Dennis Potter’s Singing Detective, the daydream-fantasies of a cheap hack, thick with misogyny and misanthropy and cooked in intense self-loathing.)
It is hardly surprising that Miller’s model of realism came to the fore in comics at the time when Reaganomics and Thatcherism were presenting themselves as the only solutions to America and Britain’s ills. Reagan and Thatcher claimed to have “delivered us from the ‘fatal abstractions’ inspired by the ‘ideologies of the past’”.[4] They had awoken us from the supposedly flawed, dangerously deluded dreams of collectivity and re-acquainted us with the “essential truth” that individual human beings can only be motivated by their own animal interests.
These propositions belong to an implicit ideological framework we can call capitalist realism. On the basis of a series of assumptions — human beings are irredeemably self-interested, (social) Justice can never be achieved — capitalist realism projects a vision of what is “Possible”.
For Alain Badiou, the rise to dominance of this restricted sense of possibility must be regarded as a period of “Restoration”. As Badiou explained in an interview with Cabinet magazine, “in France, ‘Restoration’ refers to the period of the return of the King, in 1815, after the Revolution and Napoleon. We are in such a period. Today we see liberal capitalism and its political system, parliamentarianism, as the only natural and acceptable solutions”.[5] According to Badiou, the ideological defence for these political configurations takes the form of a lowering of expectations:
We live in a contradiction: a brutal state of affairs, profoundly inegalitarian — where all existence is evaluated in terms of money alone — is presented to us as ideal. To justify their conservatism, the partisans of the established order cannot really call it ideal or wonderful. So instead, they have decided to say that all the rest is horrible. Sure, they say, we may not live in a condition of perfect Goodness. But we’re lucky that we don’t live in a condition of Evil. Our democracy is not perfect. But it’s better than the bloody dictatorships. Capitalism is unjust. But it’s not criminal like Stalinism. We let millions of Africans die of AIDS, but we don’t make racist nationalist declarations like Milosevic. We kill Iraqis with our airplanes, but we don’t cut their throats with machetes like they do in Rwanda, etc.[6]
Capitalism and liberal democracy are “ideal” precisely in the sense that they are “the best that one can expect”, that is to say, the least worst.[7] This chimes with Miller’s rendition of the hero in The Dark Knight Returns and Year One: Batman may be authoritarian, violent and sadistic, but in a world of endemic corruption, he is the least worst option. (Indeed, such traits may turn out to be necessary in conditions of ubiquitous venality.) Just as Badiou suggests, in Miller’s Gotham it is no longer possible to assume the existence of Good. Good has no positive presence — what Good there is has to be defined by reference to a self-evident Evil which it is not. Good, that is to say, is the absence of an Evil whose existence is self-evident.
The fascination of the latest cinema version of Batman, Batman Begins (directed by Christopher Nolan) consists in its mitigated return to the question of Good. The film still belongs to the “Restoration” to the degree that it is unable to imagine a possible beyond capitalism: as we shall see, it is a specific mode of capitalism — post-Fordist finance capital — that is demonised in Batman Begins, not capitalism per se. Yet the film leaves open the possibility of agency which capitalist realism forecloses.
Nolan’s revisiting of Batman is not a re-invention but a reclaiming of the myth, a grand syncresis that draws upon the whole history of the character.[8] Gratifyingly, then, Batman Begins is not about “shades of grey” at all, but rather about competing versions of the Good. In Batman Begins, Christian Bale’s Bruce Wayne is haunted by a superfluity of fathers (and a near absence of mothers: his mother barely says a word), each with their own account of the Good. First, there is his biological father, Thomas Wayne, a rose-tinted, soft-focus moral paragon, the very personification of philanthropic Capital, the “man who built Gotham”. In keeping with the Batman myth established in the Thirties Detective Comics, Wayne père is killed in a random street robbery, surviving only as a moral wraith tormenting the conscience of his orphaned son. Second, there is R’as Al Ghul, who in Nolan’s film is Wayne’s hyperstitional[9] mentor-guru, a Terroristic figure who represents a ruthless ethical code completely opposed to the benevolent paternalism of Thomas Wayne. Bruce is assisted in the struggle (fought out in his own psyche) between these two Father figures by a third, Michael Caine’s Alfred, the “maternal” carer who offers the young Bruce unconditional love.
The struggle between Fathers is doubled by the conflict between Fear and Justice that has been integral to the Batman mythos since it first appeared in 1939. The challenge for Bruce Wayne in Batman Begins is not only to best Fear, as wielded by the Miller-invented crime boss Falcone and the Scarecrow with his “weaponised hallucinogens”, but to identify Justice, which, as the young Wayne must learn, cannot be equated with revenge.
From the start, the Batman mythos has been about the pressing of Gothic Fear into the service of heroic Justice. Echoing the origin story as recounted in Detective Comics in 1939, which has Bruce famously declare, “Criminals are a superstitious cowardly lot, so my disguise must be able to strike terror into their hearts”, Nolan’s Wayne dedicates himself to turning fear against those who use it. Yet Nolan’s version makes the origin story both more Oedipal and more anti-Oedipal than it appeared in Detective Comics. In the original comic, Bruce settles upon the name “Batman” when a single bat flies into his room. Nolan’s rendering of Batman’s primal scene is significantly different, in that it takes place outside the family home, beyond the realm of the Oedipal, in a cave in the capacious grounds of Wayne Manor, and not with a single bat but with a whole (Deleuzian) pack.[10] The name “Batman”, with its suggestions of becoming-animal, does indeed have a Deleuzoguattarian resonance. Yet the proximity of Batman’s name to that of some of Freud’s case histories — “Ratman” especially, but also “Wolfman” — is no accident either. Batman remains a thoroughly Oedipal figure (as Batman Begins leaves us no doubt).[11] Batman Begins re-binds the becoming-animal with the Oedipal by having Bruce’s fear of bats figure as a partial cause of his parents’ death. Bruce is at the opera when the sight of bat-like figures on stage drives him to nag his parents until they leave the theatre and are killed.
The Gothic and the Oedipal elements of the Batman mythos were entwined immediately, on the two pages of Detective Comics on which Batman’s origin was first told. As Kim Newman identifies, Wayne’s epiphanic revelation that “I must be a terrible creature of the night… I shall become a BAT… a weird figure of the night”, contains “subliminal” quotes from Dracula (“creatures of the night, what sweet music they make”) and The Cabinet of Dr Caligari (“you shall become Caligari”).[12] These panels follow three at the top of the page where the shocked Bruce sees the bodies of his parents (“father, mother […] Dead, they’re dead”) and “swears by [their deaths] to avenge [them] by spending the rest of my life warring on all criminals”. Batman is self-consciously imagined — and self-created — as a Gothic monster, a “weird figure of the dark”, but one who will use “the night” against the criminals who habitually hide in it.
If Batman was heavily indebted to German Expressionism — via Universal’s horror pictures — so, famously, was film noir, which emerged, like Batman, in the late Thirties and early Forties. (As we’ve already seen, Miller’s rendition of Batman can be seen as in many ways a postmodern investigation of this parallel.) Remarks made by Alenka Zupančič suggest a possible hidden source for the complicity between Batman and noir: Oedipus again. “[I]n contrast to Hamlet”, Zupančič writes,
the story of Oedipus has often been said to belong to the whodunnit genre. Some have gone even further, and seen in Oedipus the King the prototype of the noir genre. Thus Oedipus the King appeared in the “noir series” of French publisher Gallimard (“translated from the myth” by Didier Lamaison).[13]
Batman, the superhero-detective, walks in the footsteps of the first detective, Oedipus.
Ultimately, however, the problem for Batman is that he remains an Oedipus who has not gone through the Oedipus complex. As Zupančič points out, the Oedipus complex turns on the discrepancy between the Symbolic and the empirical father: the Symbolic Father is the embodiment of the Symbolic order itself, solemn carrier of Meaning and bearer of the Law; the empirical father is the “simple, more or less decent man”. For Zupančič, the standard rendering of the “typical genesis of subjectivity” has it that the child first of all encounters the Symbolic father and then comes to learn that this mighty figure is a “simple, more or less decent man”. Yet, as Zupančič establishes, this trajectory is the exact inverse of the one which Oedipus pursues. Oedipus begins by encountering a “rude old man at the crossroads” and only later does he learn that this “simple man”, this “vulgar creature”, was the Father. Thus “Oedipus travels the path of initiation (of ‘symbolisation’) in reverse and, in so doing, he encounters the radical contingency of the Meaning borne by the symbolic.”[14]
For Bruce Wayne, though, there is no discrepancy at all between the Symbolic and the empirical. Thomas Wayne’s early death means that he is frozen in his young son’s psyche as the mighty emissary of the Symbolic; he is never “desublimated” into a “simple man”, but remains a moral exemplar — indeed he is the representative of Law as such, who must be avenged but who can never be equalled. In Batman Begins, it is the intervention of R’as Al Ghul which prompts an Oedipal crisis. The young Wayne is convinced that his father’s death is his fault, but Al Ghul tries to convince him that his parents’ death is his father’s responsibility because the good-natured and liberal Thomas Wayne did not know how to Act; he was a weak-willed failure. Yet Bruce refuses to go through this initiation and retains loyalty to the “Name of the Father” while Al Ghul remains a figure of excess and Evil.
The question Al Ghul poses to Bruce is: are you, with your conscience, your respect for life, too weak-willed, too frightened to do what is Necessary? Can you Act? Wayne is forced to decide: is Al Ghul what he claims to be, the ice-cold instrument of impersonal Justice, or its grotesque parody? The ultimate Evil in the film turns out to originate from Ghul’s excessive zeal, not from some hokey diabolism nor from some psycho-biographical happenstance.[15]
In this respect, it is the film that Žižek wanted Revenge of the Sith to be: a film, that is to say, which dares to hypothesise that Evil might result from an excess of Good. For Žižek, “Anakin [Skywalker] should have become a monster out of his very excessive attachment to seeing Evil everywhere and fighting it”, but
[i]nstead of focusing on Anakin’s hubris as an overwhelming desire to intervene, to do Good, to go to the end for those he loves and thus fall to the Dark Side, Anakin is simply shown as an indecisive warrior who is gradually sliding into Evil by giving way to the temptation of Power, by falling under the spell of the evil Emperor.[16]
In parallel with Žižek’s reading of Revenge of the Sith, Batman Begins’ treatment of the question of the Father — who is the father? — is doubled by the looming (omni-)presence of finance capital, and the issue of what is to be done about it. In Batman’s universe of course, “the Name of the Father” — Wayne — is also the name of a capitalist enterprise. The takeover of Wayne Industries by shareholder capital means that Thomas’ name has been stolen. Consequently, Bruce Wayne’s struggle against finance capital is also, inevitably, an attempt to restore the besmirched Name of the Father. Since Wayne Industries is at the heart — literally and figuratively — of the city, post-Fordist Gotham finds itself as blighted as the Sphinx-cursed Thebes. Its infrastructure rotten, its civil society disintegrated, Gotham is in the grip of a depression and a crime wave, both of which are attributed to the newly predatory, delocalised Capital that now has control of the Wayne corporation. The impact of finance capital is given a more personal narrative focus through the character of the kindly Lucius Fox (another candidate for Father surrogate)[17] who is degraded by the new regime. The implication is that this state of rottenness can only be rectified once the name of the Father resumes its rights.
It is in its treatment of capitalism that Batman Begins is at its most intriguingly contradictory. In part, this can be attributed to the effects of attempting to retrofit the 1930s core narrative engine into a twenty-first-century vehicle: the reference to the depression is a clear Thirties echo, setting up a disjunction with a contemporary USA that has enjoyed an unprecedented period of economic success. In keeping with capitalism itself, Deleuze and Guattari’s “motley painting of everything that ever was”, Nolan’s Gotham is an admixture of the medieval and the ultra-contemporary, of the American, the European and the Third World. It resembles at once the crooked steeples and spires of German Expressionism and the favela-sprawls of cyberpunk[18]: the nightmare of Old Europe erupting in the heart of the American Megalopolis.
In a fascinating reading of Batman Begins, China Miéville argues that the film’s anti-capitalism cashes out as an advocacy of fascism. The film, he writes,
is about fascism’s self-realisation, and the only struggle it undergoes is to admit its own necessity. BB argues for the era of the absolute(ist) corporation against the “postmodern” social dilutions of shareholder capitalism (perceived here in old-school corporate paranoia as a kind of woolly weakness), let alone against the foolishness of those well-meaning liberal rich who don’t understand that their desire to travel with the poor and working class are the “causes” of social conflict, because The Rich Man At His Garden The Poor Man At His Gate, and that the blurring of those boundaries confuses the bestial instincts of the sheep-masses. The film argues quite explicitly (in what’s obviously, in its raised-train setting, structured as a debate with Spiderman 2, a stupid but good-hearted film that thinks people are basically decent) that masses are dangerous unless terrorised into submission (Spidey falls among the masses — they nurture him and make sure he’s ok. Bats falls among them — they are a murderous and bestial mob because they are not being “effectively scared enough”). The final way of “solving” social catastrophe is […] by the demolition of the mass transit system that ruined everything by literally raised the poor and put them among the rich: travelling together, social-democratic welfarism as opposed to trickle-downism is a nice dream but leads to social collapse, and if left unchecked terrorism that sends transit systems careering through the sky into tall buildings in the middle of New York-style cities — 9/11 as caused by the crisis of “excessive social solidarity”, the arrogance of masses “not being sufficiently terrified of their shepherds”. In all a film that says social stratification is necessary to prevent tragedy, and that it should be policed by terrorising the plebeians, for the sake of corporations which if there is a happy ending […] will end up back in the hands of a single enlightened despot, hurrah, to save us from the depredations of consensus.[19]
There is no doubt that the film poses finance capital as a problem that will be solved by the return of a re-personalised capital, with “the enlightened despot” Bruce taking on the role of the dead Thomas. It is equally clear, as we’ve already seen, that Batman Begins is unable to envisage an alternative to capitalism itself, favouring instead a nostalgic rewind to prior forms of capitalism. (One of the structuring fantasies of the film is the notion that crime and social disintegration are exclusively the results of capitalist failure, rather than the inevitable accompaniments to capitalist “success”.)
However, we must distinguish between corporate capitalism and fascism if only because the film makes such a point of doing so. The fascistic option is represented not by Wayne-Batman but by R’as al Ghul. It is al Ghul who plots the total razing of a Gotham he characterises as irredeemably corrupt. Wayne’s language is not that of renewal-through-destruction (and here Schumpeterian capitalism and fascism, in most other respects entirely opposed, find themselves in sympathy), but of philanthropic meliorism. (It should also be noted that the masses who, in a pointed reference to Romero’s Living Dead films, threaten to consume and destroy Batman are under the influence of the Scarecrow’s “weaponised hallucinogens” when they attempt to dismember him, although this image of the masses no doubt tells us more about the political unconscious of the film-makers than it does about that of the masses.)
If the film’s handling of capitalism is incoherent, in what does its challenge to capitalist realism consist? It is to be found not at the level of politics but in its account of ethics, agency and subjectivity. Žižek’s classic account of ideology in The Sublime Object of Ideology turns on the difference between belief and action. At the level of belief, key capitalist ideas — commodities are animate; capital has a quasi-natural status — are repudiated, but it is precisely the ironic distance from such notions that allows us to act as if they are true. The disavowal of the beliefs allows us to perform the actions. Ideology, then, depends upon the conviction that what “really matters” is what we are, rather than what we do, and that “what we are” is defined by an “inner essence”. In terms of contemporary American culture, this plays out in the “therapeutic” idea that we can remain a “good person” regardless of what we do.
The film’s principal ethical lesson presents a reversal of this ideological conviction. In Wayne’s struggle to differentiate justice from revenge, revenge is personified by the uncompromising R’as al Ghul, while justice is represented by the assistant District Attorney, Rachel Dawes. Dawes is given the film’s crucial (anti-therapeutic) slogan, “It’s not who you are inside that counts, it’s what you do that makes you what you are.” The Good is possible, but not without Decision and the Act. In reinforcing this message, Batman Begins restores to the hero an existentialist drama that puts to flight not only capitalist realist nihilism, but also the niggling, knowing sprites of postmodern reflexivity[20] that have sucked his blood for way too long.
when we dream, do we dream we’re joey?
“When you dream, do you dream you’re Joey?”
— Carl Fogarty to Tom Stall, in David Cronenberg’s A History of Violence[2]
“In a dream he is a butterfly. […] When Choang-tsu wakes up, he may ask himself whether it is not the butterfly who dreams that he is Choang-tsu. Indeed he is right, and doubly so, first because it proves he is not mad, he does not regard himself as fully identical with Choang-tsu and, secondly, because he doesn’t fully understand how right he is. In fact, it is when he was the butterfly that he apprehended one of the roots of his identity — that he was, and is, in his essence, that butterfly who paints himself with his own colours — and it is because of this that, in the last resort, he is Choang-tsu.”
— Jacques Lacan, “The Split Between the Eye and the Gaze”[3]
The key scene in Cronenberg’s A History of Violence sees the local sheriff addressing the hero, Tom Stall (Viggo Mortensen), after a series of violent killings have disrupted the life of the small midwest town in which they both live: “It just doesn’t all add up.”
Superficially, A History of Violence is Cronenberg’s most accessible film since 1983’s The Dead Zone. Yet it is a film whose surface plausibility doesn’t quite cohere. All the pieces are there but, when you look closely, they can’t be made to fit together. Something sticks out…
What makes A History of Violence unsettling to the last is its uneasy relationship to genre: is it a thriller, a family drama, a bleak comedy, or a trans-generic allegory (“the Bush administration’s foreign policy based upon a Western”)? This generic hesitation means that it is a film suffused with the uncanny. Even when the standard motions of the thriller or the family drama are gone through, there is something awry, so that A History of Violence views like a thriller assembled by a psychotic, someone who has learned the conventions of the genre off by heart but who can’t make them work. Perversely, but appropriately for a Cronenberg picture, it is this “not quite working” that makes the film so gripping.
The near-total absence from A History of Violence of the prosthetics and FX for which Cronenberg is renowned (traces of his old schtick survive only in the excessive shots of corpses after they have been shot in the face) has been remarked upon by most critics. In fact, Cronenberg’s renunciation of such imagery has been a gradual process, dating back at least as far as Crash (1999’s eXistenZ may turn out to be the last hurrah for Cronenberg’s pulsating, eroticised bio-machinery), but it has subtlised, rather than removed, his trademark ontological queasiness.
Myth is everywhere in A History of Violence: not only in the hokey smalltown normality which is threatened, nor in the urban underworld of organised crime that threatens to encroach upon it and destroy it, but also in the conflict between the two. A town like Millbrook, the Indiana setting for A History of Violence, has been as likely to feature in American cinema as an image of menaced innocence in its own right. Comparisons with Lynch are inevitable, but it is Hitchcock, not Lynch, who is the most compelling parallel. The Hitchcock comparison goes far beyond surface details, significant as they are, such as the fact that, as the Guardian review reminds us, A History of Violence’s “Main Street resembles the one in Phoenix, Arizona, where the real estate office is to be found in Psycho”.[4] There is a much deeper affinity between A History of Violence and Hitchcock which can be readily identified when we recall Žižek’s classic analysis of Hitchcock’s methodology. In Looking Awry, Žižek compares Hitchcock’s “phallic” montage with the “anal” montage of conventional cinema:
Let us take, for example, a scene depicting the isolated home of a rich family encircled by a gang of robbers threatening to attack it; the scene gains enormously in effectiveness if we contrast the idyllic everyday life within the house with the threatening preparations of the criminals outside: if we show in alternation the happy family at dinner, the boisterousness of the children, father’s benevolent reprimands, etc., and the “sadistic” smile of a robber, another checking his knife or gun, a third grasping the house’s balustrade. In what would the passage to the “phallic” stage consist? In other words, how would Hitchcock shoot the same scene? The first thing to remark is that the content of this scene does not lend itself to Hitchcockian suspense insofar as it rests upon a simple counterpoint of idyllic interior and threatening exterior. We should therefore transpose this “flat”, horizontal doubling of the action onto a vertical level: the menacing horror should not be placed outside, next to the idyllic interior, but well within it: under it, as its “repressed” underside. Let us imagine, for example, the same happy family dinner shown from the point of view of a rich uncle, their invited guest. In the midst of the dinner, the guest (and together with him ourselves, the public) suddenly “sees too much,” observes what he was not supposed to notice, some incongruous detail arousing in him the suspicion that the hosts plan to poison him in order to inherit his fortune. Such a “surplus knowledge” has so to speak an abyssal effect […] the action is in a way redoubled in itself, endlessly reflected as in a double mirror play… things appear in a totally different light, though they stay the same.
What is fascinating about A History of Violence is that it recapitulates this passage from the anal to the phallic within its own narrative development, entirely appropriate for a film that shows, as Graham Fuller puts it, “the return of the phallus”.[6] It begins, precisely, with a non-Hitchcockian contrast between a threatening Outside (a long, sultry tracking shot of two killers leaving a motel) and an idyllic Inside (the Stalls’ family house, where the six-year-old daughter is comforted by her parents and her brother after she is woken from a nightmare). But as the film develops, it effectively re-topologises itself, interiorising the Threat, or, more accurately, showing that the Outside has always been Inside.
The Hitchcockian blot, the Thing that doesn’t fit, is the “hero” himself. The film’s central enigma — is the staid, pacific Tom Stall really the psychopathic assassin Joey Cusack? — can be resolved into the question: which Hitchcock film are we watching? Is A History of Violence a rehashing of The Wrong Man or Shadow of a Doubt? Disturbingly, it turns out that it is both at the same time.
Shadow of a Doubt is the working out of a family scene much like the one described by Žižek above, although in that case, it is the guest, the rich uncle, who is the threat to the domestic idyll. The uncle (Joseph Cotten) is a killer of rich widows who has holed up in the house of his sister’s family to hide from the police. The Wrong Man, meanwhile, sees a family destroyed when the father is falsely accused.
In Shadow of a Doubt, the uncle’s malevolence means that he must die so that the family idyll can be preserved. Only the Teresa Wright character knows the truth; the rest of the family, and the big Other of the community, are kept in ignorance. But of the family members in A History of Violence, by the end of the film, only the youngest child could plausibly not be aware that the family scene has always been a simulation. Crucial in this respect is the response of Stall’s wife, Edie (Maria Bello), as Ballard observed in his piece on A History of Violence in the Guardian:
A dark pit has opened in the floor of the living room, and she can see the appetite for cruelty and murder that underpins the foundations of her domestic life. Her husband’s loving embraces hide brutal reflexes honed by aeons of archaic violence. This is a nightmare replay of The Desperate Hours, where escaping convicts seize a middle-class family in their sedate suburban home — but with the difference that the family must accept that their previous picture of their docile lives was a complete illusion. Now they know the truth and realise who they really are.[7]
But this isn’t so much a matter of accepting reality in the raw, as it were, but, very much to the contrary, it is a question of accepting that the only liveable reality is a simulation. Where at the start of the film, Edie play acts the role of a cheerleader for Tom’s sexual delectation, by the end she is playacting for real. (And of course, of course… there are no authentic cheerleaders, “real” cheerleaders are themselves playing a role.) If, as Žižek argued in Welcome to the Desert of the Real, 9/11 was already a recapitulation of the “ultimate American paranoiac fantasy […] of an individual living in a small idyllic […] city, a consumerist paradise, who suddenly starts to suspect that the world he lives in is a fake”,[8] a kind of real-life staging of The Matrix, then A History of Violence may be the first post 9/11 film in which the American idyll is deliberately and knowingly re-constructed AS simulation. (This is underscored by the fact that not one frame of the film was shot in America. In this respect, the film resembles Kubrick’s Lolita, whose America of motels and dusty highways was entirely reconstructed in Britain. In his interview with Salon, Cronenberg pronounced himself proud of his ability to hoodwink American audiences into believing that they were really seeing the midwest and Philadelphia.)
“When you dream, do you dream you’re Joey?” the mobster Fogarty (Ed Harris) asks Tom Stall, perhaps deliberately echoing Chuang Tzu’s story of a man who dreamt he was a butterfly. Chuang Tzu famously no longer knows if he is a butterfly dreaming that he was Chuang Tzu or Chuang Tzu dreaming that he is a butterfly. Is Tom Stall the dream of Joey Cusack, or is Joey Cusack the bad dream of Tom Stall? It’s no surprise that Lacan should have fixed upon this story, and Fogarty’s question contains an analyst’s assumption: the reality of Tom lies not at the level of the everyday-empirical but at the level of desire. The Real of Stall/Cusack is to be found, fittingly, in the desert, the space of subjective destitution where Stall says that he “killed Joey”.
In an interesting but ultimately unconvincing piece in Sight and Sound, Graham Fuller argues that we should read the film as Stall’s fantasy:
“Who is Joey Cusack?” the movie ponders at its midpoint as it leaves Western territory behind and plunges into a dark pool of noir. But the more fruitful question is “Who is Tom Stall, if not whom Fogarty claims he is, and why does he have a superegoic alter ego?” The name “Stall” indicates stasis. Though he is a diligent, caring husband and father, Tom knows he hasn’t made much of himself in life, and, we learn, harbours resentment towards his estranged wealthy brother, who considers him a fool. This chip on Tom’s shoulder explains his daydreaming which, born of repression, aligns him with such literary and movie dreamers as Walter Mitty and Billy Liar, whose fantasies of themselves as all-conquering heroes are redolent of crippling neuroses, even impotence…
Tempting as it is, this interpretation is unsatisfactory for a number of reasons. It is guilty of the same “oneiric derealisation” which has blighted responses to both Lynch’s Mulholland Drive and Kubrick’s Eyes Wide Shut, both of which have been interpreted as long dream sequences. Such readings ultimately amount to an attempt to put to rest the films’ ontological threat, ironing out all their anomalies by attributing them to an interiorised delirium. The problem is that this denies both the libidinal reality of dreams — we wake ourselves from dreams, Lacan suggests, in order to flee the Real of our desires — at the same time as it ignores the way in which ordinary, everyday reality is dependent for its consistency on fantasy. It also makes the empiricist presupposition that the quotidian and the banal have more reality than violence; the message of the film is rather that the two are inextricable.
In the end, Stall as the fantasy of Cusack is much more interesting than Cusack as the fantasy of Stall. Is the American small-town idyll the fantasy of a psychopath? After Guantanamo Bay, after Abu Ghraib, this question has a special piquancy. The challenge that A History of Violence poses to the audience comes from the fact that we fully identify with Stall/Joey’s violence. We gain enormous enjoyment when the hoods are dispatched with maximum efficiency. When we dream, do we dream we’re Joey? Do we dream as Joey? Do we dream of being Tom, innocent, regular people, no blood on our hands? Are our “real”, everyday lives really only this dream?
At the same time as we enjoy Joey’s hyper-violent killing of the gangsters, we know that it is impossible for us to position them as the Outside and Stall/Joey as the Inside, and the film reinforces the lesson that Žižek thought we should have learned in the aftermath of 9/11:
Whenever we encounter such a purely evil Outside, we should gather the courage to endorse the Hegelian lesson: in this pure Outside, we should recognise the distilled version of our own essence. For the last five centuries, the (relative) prosperity and peace of the “civilised” West was bought by the export of ruthless violence and destruction into the “barbarian” Outside: the long story from the conquest of America to the slaughter in Congo.[10]
The most disturbing aspect of the film’s violence is not the gore that results from it, but the reptilian mechanism of its execution. There are no wisecracking one-liners; instead, once the killings are completed with a coiled-spring autonomic power, there is an entranced animal calm, a machine exhaustion. (A History of Violence is reflexive without being ironic, entirely lacking in any PoMo swagger. It may have put the final bullet into Tarantino’s career, if the spectacular indulgence of Kill Bill didn’t already do that.)
A History of Violence suggests that twenty-first-century America is less a country in which violence is a repressed underside than a moebian band where, if you begin with ultraviolence, you will eventually end up with homely banality, and vice versa. In the final scene, when Tom — now “Tom” — returns to his house, “everything appears in a totally different light, though it has stayed the same”. The images of domesticity have now become “images of domesticity”, the meatloaf and the mashed potato have become “meatloaf” and “mashed potato”, reflexively-placed icons of American normality, the very definition of the unhomely, the unheimlich, the uncanny. Such, as Žižek said in the 9/11 piece, is the nature of “late capitalist consumerist society”, where “‘real social life’ itself somehow acquires the features of a staged fake”. This is a simulated scenario far bleaker than that of The Truman Show or Dick’s Time Out of Joint, since it has been freely and knowingly embraced by the subjects themselves. There is no Them behind the scenes orchestrating and choreographing the simulation. At the end of the film, everyone is fooling but no one is fooled.
notes on cronenberg’s eXistenZ
“Can what is playing you make it to level 2?” asked Nick Land in his 1994 discussion of cybertheory, “Meltdown”.[2] Land’s intuition that computer games would provide the best way to understand subjectivity and agency in digital culture was also the gambit of David Cronenberg’s 1999 film eXistenZ. The film takes place in a near-future in which games are capable of generating simulated environments which can barely be distinguished from real life. Instead of computer terminals or game consoles, players use organic “game pods”, which are connected directly to the players’ bodies via “bio-ports” in their spines.
The main characters are Ted Pikul (Jude Law) and Allegra Geller (Jennifer Jason Leigh). We are first of all led to believe that Pikul is a neophyte gameplayer, being reluctantly initiated into the gameworld by Geller, who at this point seems to be the designer of the game (called eXistenZ) which they are playing. The two are pitched into a complex intrigue: a struggle between rival games corporations, and between gameplayers and “realists” — those who believe that the games are corroding the structure of reality itself. This corrosion is performed by the film itself, with what one of the characters memorably describes as “reality bleed-through” effects, so that the reality layers — only very weakly differentiated in any case — become difficult to distinguish. By the end it seems that both eXistenZ the game and what we had taken to be real life are embedded inside another game, tranCendenZ, but by now we cannot be sure of anything. The last line of dialogue is “Tell me the truth, are we still in the game?”
At the time of release, it seemed like eXistenZ was a late-arriving take on a series of themes and tropes familiar from 1980s cyberpunk — ideas Cronenberg had helped to shape in Videodrome. In retrospect, however, it is possible to see eXistenZ as part of a rash of late-1990s and early-2000s films, including The Matrix and Vanilla Sky, which mark a transition from what Alan Greenspan called the “irrational exuberance” of the 1990s bubble economy into the early twenty-first-century War on Terror moment. There is an abrupt mood shift toward the end of eXistenZ, with a military insurrection complete with heavy artillery and explosions. For the most part, though, the dominant mood is more quotidian. By contrast with the hyper-conspicuous CGI of The Matrix, with which it was destined to be most compared, eXistenZ is sparing in its use of special effects. The look is subdued, resolutely nonspectacular: there is a lot of brown. The brownness seems like a refusal of the gloss that will increasingly come to coat the artifacts of digital culture.
With its dreary trout farms, ski lodges, and repurposed churches, the world (or, more properly, worlds) of eXistenZ have a mundane, lived-in quality. Or rather worked-in: much of the film happens in workplaces — gas station, factory, workshop — and this dimension of the film is what now seems prophetic. Though never explicitly discussed, labour is something like an ambient theme, omnipresent but unarticulated. The key to eXistenZ’s self-reflexivity is its preoccupation with the conditions of its own production (and the production of culture in general). It presents us with an uncanny compression, in which the “front end” of late capitalist culture — its cutting-edge entertainment systems — folds back into the normally unseen “back end” (the quotidian factories, labs, and focus groups in which such systems are produced). The clamour of capitalist semiotics, the frenzy of branding sigils and signals, is curiously muted in eXistenZ. Instead of being part of the background hum of experience, as they are in both everyday life and the typical Hollywood movie, brand names appear only rarely in eXistenZ. The ones that do appear — most of them the names of games companies — leap out of the screen. The generic naming of space is in fact one of the running jokes in the film: a country gas station is simply called Country Gas Station, a motel is called Motel. This is part of the flat affect, the strange tonelessness, which governs most of the film.
The digitisation of culture which we take for granted now was only in its infancy in 1999; broadband was a few years off, as was the iPod, and eXistenZ has little to tell us about the digital communications equipment that proliferated in the decade after it was released. Handheld devices do not play any major role in eXistenZ — the glowing phone belonging to Pikul is thrown out of a car window by Geller — and, with its longueurs, its lingering in dead time, the film is very far from registering the jittery, attention-dispersing effects of “always-on” mobile technology. The most resonant aspects of eXistenZ do not reside in the body horror which was then still Cronenberg’s signature — although the scenes of the characters being connected to their organic game pod by bio-ports are typically grisly. Nor are they to be found in the perplexity expressed by characters as to whether they are inside a simulation or not — this is a theme that was already familiar from Videodrome, as well as Verhoeven’s Total Recall, both of which (in the first case indirectly, in the second more directly) took their inspiration from Philip K. Dick’s fiction. Instead it is the idea — in some ways stranger and more disturbing than the notion that reality is fake — that subjectivity is a simulation which is the distinctive insight of eXistenZ.
This idea emerges, in the first place, through confronting other automated (or rather partially automated) consciousnesses: entities that seem autonomous but in fact can only respond to certain trigger phrases or actions that move the gameplay down a predetermined pathway. Some of the most memorable (and humorous) scenes in eXistenZ show encounters with these Read Only Memory beings. We see one of the characters locked in a “game loop”, silently lolling his head while waiting to hear the keywords that will provoke him back into action. Later, a clerk is seen repeatedly clicking a pen — as a background character he is programmed not to respond until his name is called. More disturbing than the third-person (or nonperson) encounter with these programmed drones is Pikul’s experience of subjectivity being interrupted by an automatic behavior. At one point, he suddenly finds himself saying, “It’s none of your business who sent us! We’re here and that is all that matters”. He is shocked at the expostulation: “God, what happened? I didn’t mean to say that.” “It’s your character who said it”, Geller explains. “It’s kind of a schizophrenic feeling, isn’t it? You’ll get used to it. There are things that have to be said to advance the plot and establish the characters, and those things get said whether you want to say them or not. Don’t fight it.” Pikul later grimly notes that it makes no difference whether he fights these “game urges” or not.
The emphasis on the curtailing of free will is one reason that Cronenberg’s claim that the film is “existentialist propaganda” seems odd. Existentialism was a philosophy which claimed that human beings (what Sartre called the “for-itself”) are “condemned to be free”, and that any attempt to avoid responsibility for one’s actions amounts to bad faith. There is an absolute difference between the for-itself and what Sartre called the “in-itself” — the inert world of objects, denuded of consciousness. Yet eXistenZ, in common with much of Cronenberg’s work, troubles the distinction between the for-itself and the in-itself: machines turn out to be anything but inert, just as human subjects end up behaving like passive automata. As in Videodrome before it, eXistenZ draws out all the ambiguities of the concept of the player. On the one hand, the player is the one in control, the agent; on the other, the player is the one being played, the passive substance directed by external forces. At first, it seems that Pikul and Geller are for-itself, capable of making choices, albeit within set parameters (unlike in The Matrix, they are constrained by the rules of the world into which they are thrown). The game characters, meanwhile, are the in-itself. But when Pikul experiences “game urges”, he is both in-itself (a merely passive instrument, a slave of drive) and for-itself (a consciousness that recoils in horror from this automatism).
To appreciate eXistenZ’s contemporary resonance it is necessary to connect the manifest theme of artificial and controlled consciousness with the latent theme of work. For what do the scenes in which characters are locked in fugues or involuntary-behavior loops resemble if not the call-centre world of twenty-first-century labour in which quasi-automatism is expected of workers, as if the undeclared condition of employment were to surrender subjectivity and become nothing more than a bio-linguistic appendage tasked with repeating set phrases that make a mockery of anything resembling conversation? The difference between “interacting” with a ROM-construct and being a ROM-construct neatly maps onto the difference between telephoning a call centre and working in one.
In Being and Nothingness, Sartre famously used the example of the waiter: someone who overplays the role of waiter to the extent that they (to outside appearances at least) eliminate their own subjectivity. The power of Sartre’s example depends upon the tension between the would-be automatism of the waiter’s behavior and the awareness that behind the mechanical rituals of the waiter’s over-performance of his role is a consciousness that remains distinct from that role. In eXistenZ, however, we are confronted with the possibility that agency can genuinely be interrupted by the “inflexible stiffness of some kind of automaton”. In any case, eXistenZ compels us to reread Sartre’s description of the waiter in its terms, especially since one of the most horrific scenes of being-played features none other than a waiter. Pikul and Geller are sitting in a restaurant when Pikul feels himself overcome by a “game urge”:
Pikul: You know, I do feel the urge to kill someone here.
Geller: Who?
Pikul: I need to kill our waiter.
Geller: Oh. Well that makes sense. Um, waiter! Waiter!
[She calls over waiter]
Geller: When he comes over, do it. Don’t hesitate.
Pikul: But… everything in the game is so realistic, I— I don’t think I really could.
Geller: You won’t be able to stop yourself. You might as well enjoy it.
Pikul: Free will… is obviously not a big factor in this little world of ours.
Geller: It’s like real life. There’s just enough to make it interesting.
“You won’t be able to stop yourself, you might as well enjoy it” — this phrase captures all too well the fatalism of those who have given up the hope of having any control over their lives and work. Here, eXistenZ emerges, not as “existentialist propaganda” but as decisively anti-existentialist. Free will is not an irreducible fact about human existence: it is merely the un-preprogrammed sequence necessary to stitch together a narrative that is already written. There is no real choice over the most important aspects of our life and work, eXistenZ suggests. Such choice as there is exists one level up: we can choose to accept and enjoy our becoming in-itself, or reject it (perhaps uselessly). This is a kind of deflation-in-advance of all of the claims about “interactivity” that communicative capitalism would trumpet in the decade after eXistenZ was released.
Autonomist theorists have referred to a turn away from factory work toward what they call “cognitive labour”. Yet work can be affective and linguistic without being cognitive — like a waiter, the call-centre worker can perform attentiveness without having to think. For these non-cognitive workers, indeed, thought is a privilege to which they are not entitled.
The muted tones of eXistenZ anticipate a digital-era banality, and it is the banal quality of life in a digitally automated environment — human-sounding voices that announce arrivals and departures at a railway station, voice-recognition software which fails to recognise our voices, call-centre employees drilled into mechanically repeating a set script — that eXistenZ captures so well.
i filmed it so i didn’t have to remember it myself
I was reminded of A History of Violence while watching Andrew Jarecki’s ultra-disturbing documentary Capturing the Friedmans on Channel 4’s new digital service, more4, the other night.
Capturing the Friedmans is about a family from Great Neck, New York State, two of whose members (the father, Arnold, and one of the sons, Jesse, then only a teenager) pleaded guilty to serious sexual offences and were consequently jailed. Were they guilty? We can be reasonably confident only that Arnold had paedophiliac tendencies, and owned child pornography; he also confessed to having had some sort of sexual contact, short of sodomy, with two boys, but not in Great Neck. The rest is an enigma which makes Rashomon seem like an open and shut case. Jesse’s role, for instance, is desperately unclear. The supposed victims claimed that Jesse had participated in, and assisted with, his father’s violent abuses. But a campaigner cast doubt on the victims’ testimony, none of which was corroborated by any physical evidence, and most of which seemed to have been “recovered” after they had been hypnotised.
The gaps in the Friedman narrative are all the more glaring because of the plethora of recorded material that IS available. This was a family that seemed — like many now I suppose — to obsessively record itself. Part of the “capturing” of the Friedmans is their capturing of themselves, on film and on tape. A documentary like this only becomes possible now that filming technology — cine cameras and later camcorders — has become widely available for the first time and kids are filmed from the moment of birth. The whole thing felt like a grim counterpoint to the proto-reality TV documentary about the Loud family which Baudrillard discussed in “Precession of Simulacra”.[2] In a way, the most painful material consists of home movie footage of the Friedmans shot in the 1970s, in which they look for all the world like a perfectly happy family, the kids mugging and clowning for the cameras. Never has Deleuze’s observation that “family photos” are, by their very nature, profoundly misleading, been more bitterly borne out. Later, as the trials start and the recriminations follow, the family filmed and audiotaped themselves ripping each other to shreds.
Why did they continue to film? “How do they remember, those who do not film?” asks Chris Marker in Sans Soleil. But why would the Friedmans want to remember their journey into Hell? Who could possibly want to film this? In Lost Highway, Fred Madison (Bill Pullman) claimed that he hated the thought of video-taping his own life because he “liked to remember things in his own way”. In an uncanny complement to this, David Friedman, who recorded the events of the day Jesse was sentenced to eighteen years imprisonment, said that he filmed it “so I didn’t have to remember it myself”. The machines remember, so we don’t have to.
spectres of marker and the reality of the third way
Watching Chris Marker’s Le Fond de l’air est rouge (A Grin Without a Cat) last week made for a somewhat ambivalent experience: even though the film is, ostensibly, a catalogue of disappointments, its registering of a time when there were challenges — no matter how inchoate, messy, contradictory — to the existing order, cannot but offer some inspiration in these much bleaker times. A Grin Without a Cat, originally released in 1977 but given a new post-89 epilogue by Marker in 1992, is an epic montage-meditation on what Marker called “the Third World War”: the hydra-headed revolutionary or would-be revolutionary struggles of the Sixties and the Seventies. Marker constructed the film entirely out of archive material, shooting no original footage, and producing associations, connections, foreshadowings and echoes through masterly editing. The effect, especially if you are not minutely familiar with events in France, Vietnam, Algeria, Bolivia, Cuba and Czechoslovakia is disorientating, vertiginous. You find yourself Quantum-Leapt into the middle of a jostling crowd scene; no sooner have you got your bearings there than you abruptly find yourself in another place, another time. Marker’s commentary — spoken by a number of actors — gives you clues, epigraphs, rather than explication. But Marker’s aim was not to render the period from 67 to 77 as Objective History to be pontificated upon by “experts” for whom the Meaning of the events is already established, nor, even worse, to produce a vanguardist version of I Heart 1968, in which sighing former revolutionaries look back on anger with the tender contempt of contemporary “wisdom”. No, the point was to present the events “in becoming”, to restore to them a subjectivity (in the Kierkegaardian sense) that retrospection structurally forecloses.
At one stage in the film, Marker’s commentary ruefully notes that while revolutionaries, failed revolutionaries and ex-revolutionaries devoted all their attention to the formation of the New Left, the New Right was coalescing, unnoticed. Cue images of Valéry Giscard d’Estaing playing football in a carefully cultivated attempt to look sporty and modern. The PR director of Citroën muses on the “science of management” (too complicated, he says, for even the most talented Union member to master) and looks forward to the incorporation of leftist desire into Capital that would become post-Fordism.
Cut to now, where the images of even an ultimately failed militancy belong to a past. A past that was not — in one sense — even mine, that was over before I was born in July 1968. Yet the reverberations continued for a few years yet, were an unacknowledged (by me, then) background to the things that I enjoyed in the late Seventies and early Eighties. For those of us arriving after the event, the significance of the convulsions documented in Marker’s film could only be apprehended much later, once their effects had completely ebbed away and the reality (and the pleasure) principles were Restored. Marcus’ Lipstick Traces — whose temporal jump-cutting in many ways recalls that of Marker’s film — goes some way to establishing the connections between the events remembered in A Grin Without a Cat and those that began in the UK at more or less the time that the film was completed. A Cheshire cat’s grin, lipstick traces on a cigarette, spectres of Marx: Marcus, Derrida and Marker come to see ruptures, revolts and revolutions as ghostly residue, thin stains on the seamless surfaces of post-Cold War Capital.
The untranslatable French title of Marker’s film suggests possibilities that hovered and haunted without ever making themselves real. At the Marker conference held at the ICA a few years ago, Barry Langford argued that, “rather than the spectre of Communism famously invoked by Marx in the opening lines of the Communist Manifesto”, for both A Grin Without a Cat and Marx’s “The Civil War in France” a hundred years before it, “it is the phantom of revolution that haunts Marx and Marker alike — that is, the fear that revolution will ultimately prove, precisely, phantasmic”. If Marx and Marker’s fear was that revolution would only be a spectre, our suspicion is that it will not turn out to be even that, that the stricken ghosts have been put to flight once and for all. (And even the “death of communism” is not enough for the guardians of the new status quo, for whom “communism is not dead enough — […] they will only be content when they have driven a stake through its heart and buried it at the crossroads at midnight”.[2])
The struggles in A Grin Without a Cat might have been defeated, might even have contributed to a more ferociously effective Reaction, but the pressures that those events brought to bear had almost immediate effects — by contesting the Possible, by rejecting “realism”, they could not but have altered expectations about what was acceptable in the workplace, about what could happen in everyday life. The revolutions were cultural; which is to say, they understood that culture and politics could not be conceived in isolation from one another. Both Althusser and the Situationist-inspired students of 68, in many ways so opposed, could agree on at least one thing: that cultural products were never merely cultural. In their condemnations of recuperated Spectacle and Ideological Apparatuses, they granted a weight to cultural products which few would countenance now.
I felt the contrast between what Marker’s film recounted and contemporary realities especially painfully last week when I went on a TUC training course with members of NATFHE from other FE colleges. The stories of increased casualisation, of newly punitive sickness policies, of lecturers being sacked and forced to re-apply for their jobs, of the imposition of more and more targets and “spurious measurables”, each entailing yet more pointless, window-dressing paperwork, confirmed what, individually, we all already knew. The Further Education sector is in crisis; its problems only symptomatic of a wider malaise in UK education as a whole. Further Education colleges, out of Local Education Authority control since 1992, show the way in which a “reformed” (i.e. part-privatised) education will develop. The recent report which stated that students spoon-fed at A-level cannot cope with university study would come as little surprise to A-level teachers and lecturers. The pressure to meet government targets means that quality and breadth of teaching is sacrificed for the narrow goal of passing the exam: an instrumentalisation of education that fully accepts that its only role is to reproduce the labour force. Far, far away from 68, at the core of whose conflagrations was education, and the question of what it could be: could it be more than an ideological training camp, a carceral institution?
One thing that occurred to me last week, prompted by the contrast between Marker’s Then and our Now, was that the third way is not entirely a phantasm, an ideological dupe. There is in fact a reality to the third way, and it is the reality of bureaucracy. That is what is left once politics has become administration.
It’s hard to believe that public services are not more clogged with bureaucracy than they were pre-Thatcher. Certainly, education is choked with the stuff… targets, action plans, log books, all of them required conditions for funding by the Learning and Skills Council, and assessed by Ofsted, whose threat no longer takes the form of an invasive external entity arriving every two or three years, but has become introjected into the institution itself, through the permanent panoptic vigilance of a bloated managerial stratum determined to over-compensate in order to fully ensure it is meeting central government’s demands. This is the reality of “market Stalinism” in education.
Is there a way to challenge or roll back the slow, implacable, rapacious proliferation of bureaucracy? Only by a collective action that seems inconceivable now… Only by a change in the ideological climate… Only by a switch in the cultural atmosphere… Where to start? While we search, desperately, for cracks in the Possible, bureaucracy, that steel spider, patiently spins its grey web…
dis-identity politics
The discussion of V for Vendetta has been far more interesting than the film deserved. Yes, there is a certain frisson in seeing a major Hollywood movie refusing to unequivocally condemn terrorism, but the political analysis in the film (as in the original comic) is really rather threadbare. That is Moore’s fault; it can’t be blamed on the Wachowskis. Like all of Moore’s work, V for Vendetta is considerably less than the sum of its parts. I’ve complained before of finding Moore’s continual efforts to reassure himself and his readers of their erudition — every time you are about to succumb to the fictional world, it’s as if Moore taps you on your shoulder and says, “We’re too good for this, aren’t we folks?” — highly distracting and irritating.
As for V for Vendetta’s politics — apart from the subjective destitution scenes, they amount in large part to the familiar populist ideology which maintains that the world is controlled by a corrupt oligarchy that could be overthrown if only people knew about it. Steven Shaviro says that “rather than trying to please all demographics, [the film] identifies a deeply religious, homophobic, ultra-patriotic, imperialistic surveillance state as the source of oppression.”[2] But isn’t this precisely “appealing to all demographics”, since few homophobic fascists will identify themselves as homophobic fascists, and it’s hard to imagine anyone warming to Hurt’s foaming-at-the-mouth ranter, still less voting for him? Postmodern fascism is a disavowed fascism (cue the BNP leaflet delivered through my door when I lived in Bromley, photograph of a smiling kiddywink, slogan: “My daddy’s not a fascist”), just as homophobia survives as disavowed homophobia. The strategy is to refuse the identification while pursuing the political programme. “We of course deplore fascism and homophobia, but…” The Wachowskis’ government bans the Koran, but that is the last thing that Blair and Bush would ever do; no, they will praise Islam as a “great religion of peace” while bombing Muslims.
Blair’s authoritarian populism[3] is far more sinister than V for Vendetta’s pantomime autocracy precisely because Blair is so successful at “presenting himself as the reasonable, honest bloke on the side of the common man”. Similarly, Bush’s linguistic incompetence, far from being an impediment to his success, has been crucial to it, since it has allowed him to pose as a “man of the people”, belying his privileged, Harvard- and Yale-educated background. It is significant in fact that class is not mentioned at all in the film. As Jameson wryly notes in “Marx’s Purloined Letter”, it is not
particularly surprising that the system should have a vested interest in distorting the categories whereby we think class and in foregrounding gender and race, which are far more amenable to liberal ideal solutions (in other words, solutions that satisfy the demands of ideology, it being understood that in concrete social life the problems remain equally intractable).[4]
The climactic scenes of V for Vendetta, in which the people rise up (by this time, against no one) made me think, not of some great political Event, but rather of the Make Poverty History campaign — a “protest” with which no one could possibly disagree. The comparison with Fight Club does V for Vendetta no favours; the targets of Tyler Durden’s terrorism were not the fusty symbols of the political class but the franchise coffee bars and skyscrapers of impersonal capital.
I’m no fan of the Wachowskis’ Matrix, but it succeeded in two ways that V for Vendetta never will. The Matrix has become a massively propagated pulp mythos (whereas who but academics will think about the V for Vendetta film a year from now? It’ll be a year after that until academics recognise that the far more fascinating and sophisticated Basic Instinct II is worthy of study). More importantly, it suggested that what counts as “real” is an eminently political question.
That ontological dimension is what is missing from the progressive populist model, in which the masses cannot but appear as dupes, fooled by the lies of the elite but ready to effectuate change the moment they are made aware of the truth. The reality, of course, is that the “masses” are under few illusions about the ruling elite (if anyone is credulous about politicians and “capitalist parliamentarianism”, it is the middle classes). The Subject Supposed Not To Know is a figure of populist fantasies — more than that: the duped subject awaiting factual enlightenment is the presupposition on which progressive populism rests. If the most crucial political task is to enlighten the masses about the venality of the ruling class, then the preferred mode of discourse will be denunciation. Yet, this repeats rather than challenges the logic of the liberal order; it is no accident that the Mail and the Express favour the same denunciatory mode. Attacks on politicians tend to reinforce the atmosphere of diffuse cynicism upon which capitalist realism feeds. What is needed is not more empirical evidence of the evils of the ruling class but a belief on the part of the subordinate class that what they think or say matters; that they are the only effective agents of change.
This returns us to the question of reflexive impotence. Class power has always depended on a kind of reflexive impotence, with the subordinate class’ beliefs about its own incapacity for action reinforcing that very condition. It would, of course, be grotesque to blame the subordinate class for their subordination; but to ignore the role that their complicity with the existing order plays in a self-fulfilling circuit would, ironically, be to deny their power.
“[C]lass consciousness”, Jameson observes in “Marx’s Purloined Letter”,
turns first and foremost around the question of subalternity, that is around the experience of inferiority. This means that the “lower classes” carry around within their heads unconscious convictions as to the superiority of hegemonic or ruling-class expressions or values, which they equally transgress and repudiate in ritualistic (and socially and politically ineffective) ways.
There is a way, then, in which inferiority is less class consciousness than class unconsciousness, less about experience than about an unthought precondition of experience. Inferiority is in this sense an ontological hypothesis that is not susceptible to any empirical refutation. Confronted with evidence of the incompetence or corruption of the ruling class, you will still feel that, nevertheless, they must possess some agalma, some secret treasure, that confers upon them the right to occupy the position of dominance.
Enough has already been written about the kind of class displacement people like myself have experienced. Dennis Potter’s Nigel Barton plays remain perhaps the most vivid anatomies of the loneliness and agony experienced by those who have been projected out of the confining, comforting fatalism of the working-class community and into the incomprehensible, abhorrently seductive rituals of the privileged world. “A drive from nowhere leaves you in the cold”, as the Associates sang in “Club Country”, “Every breath you breathe belongs to someone there.”
There is a Cartesian paradox about such experiences, in that they are significant only because they produce a distanciation from experience as such; after undergoing them, it is no longer possible to conceive of experience as some natural or primitive ontological category. Class, previously a background assumption, suddenly interposes itself — not so much as a site for heroic struggle, but as a whole menagerie of minor shames, embarrassments and resentments. What had been taken for granted is suddenly revealed to be a contingent structure, producing certain effects (and affects). Nevertheless, that structure is tenacious; the assumption of inferiority constitutes something like a core programming which makes sense of the world in advance. To think of oneself as capable of doing a “professional” job, for instance, requires a traumatic shift in perspective, and if there are confidence crises and nervous breakdowns, they will be very often the consequence of the core programming intermittently reasserting itself.
The real lesson to draw from Potter’s Barton plays is not the fatalist-heroic one about the agonies of the charismatic individual confronting intransigent social structures. The plays have to be read instead against class-as-ethnicity and for class-as-structure; in any case, as they make clear, the occult machineries of social structure produce the visible ethnicities of language, behaviour and cultural expectations. The plays’ demand is not for a re-acceptance into the rejecting community, nor a full accession into the elite, but for a mode of collectivity yet to come.
Potter’s challenges to naturalism then, become far more than mere PoMo trickery. His foregrounding of the way in which fictions structure reality, and of the role that television itself plays in this process, brings to the fore all the ontological issues that worthier, more traditional social realist writers conceal or distort. There is no realism, Potter suggests, beyond the Real of class antagonism.
Now is perhaps the time to address two good questions that Bat[5] mailed in response to the reflexive impotence post. First, Bat asked, is the situation for French teenagers different from that of their British counterparts? This is easily dealt with, since, after all, it was the very problem with which the post aimed to deal. French students are far more embedded in a Fordist/disciplinary framework than are British students. In education and employment, the disciplinary structures survive in France, providing some contrast with, and resistance to, the cyberspatial pleasure matrix. (For reasons I will explore in more depth shortly, this is not necessarily for the best, however.) Bat’s second question raised more important issues; doesn’t talking about reflexive impotence reinforce the very interpassive nihilism it supposedly condemns? I would say that the exact opposite is the case. I’ve had more mail about the reflexive impotence post than any other; mostly, actually, from teenagers and students who recognise the condition but who, far from being further depressed by seeing it analysed, find its identification inspiring. There are very good Spinozist and Althusserian reasons for this — seeing the network of cause-and-effect in which we are enchained is already freedom. By contrast, what is depressing is the implacable poptimism of the official culture, forever exhorting us to be excited about the latest dreary-shiny cultural product and hectoring us for failing to be sufficiently positive. A certain “vulgar Deleuzianism”, preaching against any kind of negativity, provides the theology for this compulsory excitation, evangelising on the endless delights available if only we consume harder. But what is so often inspiring — in politics as much as in popular culture — is the capacity to nihilate present conditions. The nihilative slogan is neither “things are good, there is no need for change”, nor “things are bad, they cannot change”, but “things are bad, therefore they must change.”
This brings us to subjective destitution, which, unlike Steve Shaviro, I think is a precondition of any revolutionary action. The scenes of Evey’s subjective destitution in V for Vendetta are the only ones which had any real political charge. For that reason, they were the only scenes which produced any real discomfort; the rest of the film does little to upset the liberal sensibilities which we all carry around with us. The liberal programme articulates itself not only through the logic of rights, but also, crucially, through the notion of identity, and V is attacking both Evey’s rights and her identity. Steve says that you can’t will subjective destitution. I, however, would say that you can only will it, since it is the existential choice in its purest form. Subjective destitution is not something that happens in any straightforward empirical sense; it is, rather, an Event precisely in the sense of being an incorporeal transformation, an ontological reframing to which you must assent. Evey’s choice is between defending her (old) identity — which, naturally, also amounts to a defence of the ontological framework which conferred that identity upon her — and affirming the evacuation of all previous identifications. What this brings out with real clarity is the opposition between liberal identity politics and proletarian dis-identity politics. Identity politics seeks respect and recognition from the master class; dis-identity politics seeks the dissolution of the classificatory apparatus itself.
That is why British students are, potentially, far more likely to be agents of revolutionary change than are their French counterparts. The depressive, totally dislocated from the world, is in a better position to undergo subjective destitution than someone who thinks that there is some home within the current order that can still be preserved and defended. Whether on a psychiatric ward, or prescription-drugged into zombie oblivion in their own domestic environment, the millions who have suffered massive mental damage under capitalism — the decommissioned Fordist robots now on incapacity benefit as well as the reserve army of the unemployed who have never worked — might well turn out to be the next revolutionary class. They really do have nothing to lose…
“you have always been the caretaker”: the spectral spaces of the overlook hotel
“What is anachronistic about the ghost story is its peculiarly contingent and constitutive dependence on physical place and, in particular, on the material house as such. No doubt, in some pre-capitalist forms, the past manages to cling stubbornly to open spaces, such as a gallows hill or a sacred burial ground; but in the golden age of this genre, the ghost is at one with a building of some antiquity […] Not death as such, then, but the sequence of such ‘dying generations’ is the scandal reawakened by the ghost story for a bourgeois culture which has triumphantly stamped out ancestor worship and the objective memory of the clan or extended family, thereby sentencing itself to the life span of the biological individual. No building more appropriate to express this than the grand hotel itself, with its successive seasons whose vaster rhythms mark the transformation of American leisure classes from the late 19th century down to the vacations of present-day consumer society.”
— Fredric Jameson, “Historicism in The Shining”[2]
“[T]he strongest compulsive influence arises from the impressions which impinge upon the child when we would have to regard his psychical apparatus as not yet completely receptive. The fact cannot be doubted; but it is so puzzling that we may make it more comprehensible by comparing it with a photographic exposure which can be developed after any interval of time and transformed into a picture.”
— Sigmund Freud, “Moses and Monotheism”[3]
Space is intrinsic to spectrality, as one of the meanings of the term “haunt” — a place — indicates. Yet haunting, evidently, is a disorder of time as well as of space. Haunting happens when a space is invaded or otherwise disrupted by a time that is out-of-joint, a dyschronia.
The Shining – King’s novel, and Kubrick’s “unfaithful” film version, both of which I propose to treat as one interconnected textual labyrinth — is fundamentally concerned with the question of repetition. In Spectres of Marx, Derrida defines hauntology as the study of that which repeats without ever being present. To elaborate, we might say that the revenant repeats without being present in the first place — where “place” is equivalent in meaning to “time”. Nothing occupies the point of origin, and that which haunts insists without ever existing. We shall return to this presently (or would it be better to say, it will return to us?).
Precisely because it is so centrally about repetition, The Shining is a deeply psychoanalytic fiction. You might say that it translates psychoanalysis’ family dramas into the stuff of horror, except that it does rather more; it demonstrates what many have long suspected — that psychoanalysis already belongs to the genre of horror. Where else could we place concepts such as the death drive, the uncanny, trauma, the compulsion to repeat?
Yet The Shining is about repetition in a cultural, as well as a psychoanalytic sense. Hence Jameson’s interest. Jameson, after all, has theorised postmodernity in terms of repetition, albeit a repetition that is disavowed. The “nostalgia mode” he refers to names an all-but ubiquitous yet largely unacknowledged mode of repetition, in a culture in which the conditions for the original and the ground-breaking are no longer in place, or are in place only in very exceptional circumstances. The nostalgia in question is neither a psychological nor an affective category. It is structural and cultural, not a matter of an individual or a collective longing for the past. Almost to the contrary, the nostalgia mode is about the inability to imagine anything other than the past, the incapacity to generate forms that can engage with the present, still less the future. It is Jameson’s claim that representations of the future, in fact, are increasingly likely to come to us garbed in the forms of the past: Blade Runner, with its well-known debt to film noir, is exemplary here (and nothing makes Jameson’s point more clearly than Blade Runner’s domination over science fiction film in the last twenty-five years).
According to Jameson, then, The Shining is a “metageneric” reflection on the ghost story (a ghost story that is about ghost stories). Yet I want to claim The Shining does not belong to postmodernity, but rather to postmodernity’s doppelganger, hauntology. We could go so far as to say that it is a meta-reflection on postmodernity itself. As Jameson reminds us, The Shining is also about a failed writer: a would-be novelist who yearns to be a virile writer in the strong modernist mould, but who is fated to be a passive surface on which the hotel — itself a palimpsest of fantasies and atrocities, an echo chamber of memories and anticipations — will inscribe its pathologies and homicidal intent. Or, it would be better to say, for this is the horrible dyschronic temporal mode proper to the Overlook, it will have always done.
The Overlook and the Real
“Around him, he could hear the Overlook Hotel coming to life.”
— Stephen King, The Shining[4]
There is no escape from the infinite corridors of the Overlook. It is no gloomy castle, easily relegated to an obsolete genre (the gothic romance); neither is it a supernatural relic that will crumble to dust when exposed to the harsh light of scientific reason. Concealed behind the alluring ghosts of the hotel’s Imaginary which seduce Jack, the horrors that stalk the Overlook’s corridors belong to the Real. The Real is that which keeps repeating, that which re-asserts itself no matter how you seek to flee it (more horribly, it is that which re-asserts itself through the attempts to flee it: the fate of Oedipus). The Overlook’s horrors are those of the family and of history; or more concisely, they are those of family history (the province, needless to say, of psychoanalysis).
David A. Cook has already shown how the film version is haunted by American history.[5] In Cook’s rendition, the Overlook, that playground of the ultra-privileged and the super-crooked (and no one, in the still paranoid post-Watergate dusk when King wrote the novel, could be so naïve as to imagine that these two groups could be parsed), metonymically stands in for the nightmare of American history itself. A leisure hive built on top of an Indian Burial Ground (this detail was added by Kubrick); a potent image of a culture founded upon (the repression of) the genocide of the native peoples:
It was as if another Overlook now lay scant inches beyond this one, separated from the real world (if there is such a thing as a “real world” Jack thought) but gradually coming into balance with it.[6]
Important as Cook’s reflections are, as I have already indicated, I want to concentrate, not on the macro-level of history, but on the micro-level of the family. This, inevitably, brings us to Walter Metz’s valuable reflections on the way in which The Shining is intertextually bound up with the melodrama genre.[7] A central tension in the film — a tension which for some is never quite resolved — concerns how The Shining is ultimately to be generically placed: is it about the family (in which case, it belongs to melodrama) or is it about the supernatural (in which case, it belongs to horror or the ghost story)?[8] This inevitably recalls Todorov’s famous claim that the “fantastic” is defined by the hesitation between two epistemological possibilities; if spectral forces can be explained psychologically or by some other naturalistic means, then we are dealing with the “uncanny”. If the spectres of the supernatural cannot be exorcised, then we are dealing with the “marvellous”. Only while we oscillate between the two possibilities do we confront the “fantastic”.
The Uncanny | Melodrama
The Fantastic |
The Marvellous | The ghost story
Noting that most critics have regarded The Shining as a case of the “marvellous”, Metz positions The Shining as an example of the “uncanny”.
But I want to argue that The Shining is important because it scrambles the terms of Todorov’s schema; it is, at one and the same time, a family melodrama and a ghost story. If the ghosts are real, it is not because they are supernatural; and if the spectres are psychoanalytic, that is not to say that they can be reduced to the psychological. Just the reverse, in fact: rather than the spectral being subsumed by the psychological, for psychoanalysis, the psychological can be construed as a symptom of the spectral. It is the haunting that comes first.
Patriarchy as Hauntology
The Overlook’s ghosts are inescapable because they are the spectres of family history, and who of us is without a family history?[9] The Shining is a fiction, after all, about fathers and sons. Its genesis lay in a fantasy from which King the father, still struggling with alcoholism, recoiled, but which King the writer was fascinated by. Finding his papers scattered by his son one day, King flew into a blind rage; later he realised he could easily have struck the child. The germ of the novel was King’s extrapolation from that situation: what if he had struck his son? What if he had done much worse? What if King were an alcoholic failure who merely dreamt that he was a novelist?
Psychoanalysis could be crudely boiled down to the claim that we are our family history, although it is perhaps at this point that we can dispense with the term “history” and replace it with “hauntology”. The family emerges in Freud as a hauntological structure: the child is father to the man, the sins of the fathers are visited upon the children. The child who hates his father is condemned to repeat him, the abused becomes the abuser.
The Shining is about patriarchy as hauntology, and that relation is nowhere more thoroughly explored than in Freud’s essays on the foundations of religion. Here, Freud shows that the Holy Father, Jahweh, is indeed also a Holy Ghost: a spectral deity which can assert itself only through its physical absence. Freud repeated the “speculative myth” of the dismemberment and devouring of the Father Thing from “Totem and Taboo” thirty years later in “Moses and Monotheism”, a text which is itself full of repetitions and refrains.
In Freud’s account, there are two Fathers: the obscene “Père Jouissance” (Lacan) who has access to total enjoyment, and the Name/No (Nom/Non) of the Father — the Father of Law, the Symbolic Order in person, who forbids and mortifies. As Žižek has shown,[10] one of the most significant aspects of “Totem and Taboo” was to have established that the austere Father of Symbolic Law is not originary; it is not, as the theory of the Oedipus complex had assumed, that the father is a pre-existent block to enjoyment. This “block” only comes into place once the father is killed.
In the story as Freud recounts it, the primal horde of beta males, jealous and resentful of the tribal Father, rise up one day to kill him, anticipating that they will now have unlimited access to jouissance. But this is not what transpires. The “band of brothers” are immediately remorseful, guilt-stricken, melancholic. Far from being able to enjoy everything, the gloomy parricidal brothers are unable to enjoy anything. And far from ridding themselves of their Father’s loathsome domination, they find that the Father dominates them all the more now that he is absent. The Father’s ghost preys upon their conscience; indeed, their conscience is nothing other than the reproach of the dead Father’s spectral voice. In heeding this absent voice, in commemorating and propitiating it by initiating new ceremonies and codes of practice, the brothers introduce the rudimentary forms of morality and religion. God, the Father, the Big Other, the Symbolic does not exist; but it insists through the repetition of these rituals.
The Father is doubly dead. He asserts his power only when he is dead, but his power is itself only a power of death: the power to mortify live flesh, to kill enjoyment.
A Child is Being Beaten
“Like father, like son. Wasn’t that how it was popularly expressed?”
— Stephen King, The Shining[11]
The Shining shows us patriarchal dementia — with its lusts, its ruses and its rationalisations — from inside. We witness Jack gradually succumbing to this dementia as he becomes intoxicated by the hotel and its temptations, promises and challenges. In the soft-focus, honeyed space of the Gold Room, Jack parties with the hotel’s ghosts:
He was dancing with a beautiful woman. He had no idea of what time it was, how long he had spent in the Colorado Lounge or how long he had been there in the ballroom. Time had ceased to matter.[12]
In the grip of these fever-dream fantasies, Jack descends into the unconscious (where, as Freud tells us, time has no meaning). The unconscious is always impersonal, and especially so here: the unconscious that Jack subsides into is the unconscious of the hotel itself. His family come to seem like “ball-breaking” distractions from his increasing spells of enchanted communion with the hotel, and being a good father becomes synonymous with delivering Danny to the Overlook. Jack becomes convinced by the hotel’s avatars — which seem to reconcile the demands of the superego with those of the id — that it is his duty to bring Danny into line.
Beyond the Imaginary no-time of the Gold Room, there is another mode of suspended time in the Overlook. This belongs to the Real, where sequential, or “chronic”, clockface time, is superseded by the fatality of repetition. It is the Imaginary pleasures of the Gold Room, with their succulent promises of enwombing fusion, which allow Jack to fall increasingly into the hold of the hotel’s Real structure — the structure of abusive repetition. Danny confronts this structure as a vision of a man endlessly pursuing a child with a roque mallet (in the film, an axe).
The clockface was gone. In its place was a round black hole. It led down into forever. It began to swell. The clock was gone. The room behind it. Danny tottered and then fell into the darkness that had been hiding behind the clockface all along.
The small boy in the chair suddenly collapsed and lay in it at a crooked unnatural angle, his head thrown back, his eyes staring sightlessly at the high ballroom ceiling.
Down and down and down and down to – the hallway, crouched in the hallway, and he had made a wrong turn, trying to get back to the stairs he had made a wrong turn and now AND NOW –
– he saw he was in the short dead-end corridor that led only to the Presidential Suite and the booming sound was coming closer, the roque mallet whistling savagely through the air, the head of it embedding itself into the wall, cutting the silk paper, letting out small puffs of plaster dust.[13]
Here we can turn again to the image of fatality Freud uses in “Moses and Monotheism”, which I cited at the beginning of this essay. “[T]he strongest compulsive influence”, Freud writes,
arises from the impressions which impinge upon the child when we would have to regard his psychical apparatus as not yet completely receptive. The fact cannot be doubted; but it is so puzzling that we may make it more comprehensible by comparing it with a photographic exposure which can be developed after any interval of time and transformed into a picture.[14]
This passage is especially piquant and suggestive when considered in relation to The Shining, given the famous final image of Kubrick’s film: a photograph taken in 1921 showing Jack, surrounded by party-goers and grinning. At this moment, we cannot but be reminded of Delbert Grady’s ominous claim that Jack has “always been the caretaker”.
What I want to draw from Freud’s photographic metaphor is precisely its concept of effects being distanced in time from the events which produced them. This is the psychoanalytic horror which The Shining anatomises. Violence was imprinted upon Jack’s “psychical apparatus” long ago, in childhood (the novel details at some length the abuse that Jack has himself suffered at the hands of his own father), but it requires the “spectral spaces” of the Overlook hotel to transform those impressions from an “exposure” into a “picture”, an actual act of violence.
If Jack “has always been the caretaker”, it is because his life has always been in the abuse-circuit. Jack represents an appalling structural fatality, a spectral determinism. To have “always been the caretaker” is never to have been a subject in his own right. Jack has only ever stood in for the Symbolic and the homicidal violence which is the Symbolic’s obscene underside. What, after all, is the father if not the “caretaker”, the one who (temporarily) shoulders the obligations of the Symbolic (what Jack calls “the white man’s burden”) before passing them on to the next generation? In Jack the ghosts of the past are revived — but only at the cost of his own “de-vival”.
Of course, the dyschronic nature of the Overlook’s abusive causality — events stored in the psyche will yield their effects only after time has elapsed — has implications for Danny’s future as well. As Metz puts it: “When Jack chases Danny into the maze with ax in hand and states, ‘I’m right behind you Danny’, he is predicting Danny’s future as well as trying to scare the boy. […] [T]he patriarchal beast is within [Danny] as well.”[15] Jack might as well be saying, “I’m just ahead of you, Danny”: I am what you will become. In the Overlook, a child is always being beaten, and the position of the abused and the position of the abuser are places in a structure. It is all-too-easy for the abused to become the abuser. The ominous question The Shining poses, but does not answer, is: Will this happen to Danny (as it happened to Jack)? Is The Shining, that is to say, “Totem and Taboo”/ “Moses and Monotheism” — where the Father retains his spectral hold on the sons precisely through his own death — or is it Anti-Oedipus?
In the novel, Danny can only escape death at the hands of his father by catatonically communing with his double, Tony, whom King reveals to be an avatar of his future self:
And now Tony stood directly in front of him, and looking at Tony was like staring into a magic mirror and seeing himself in ten years…
The hair was light blond like his mother’s, and yet the stamp on his features was that of his father, as if Tony — as if the Daniel Anthony Torrance he would someday be — was a halfling caught between father and son, a ghost of both, a fusion.[16]
In the film, Danny escapes from his father by walking backwards in his footsteps. Yet we do not know if the (psychic) damage has already been done — will Danny, in surviving his father, end up taking his father’s place?
For Metz, these hesitations leave the text open: “It is up to Danny to grow up and build a better world, throwing off the demons of the past but always knowing that deep inside of him, the demons that possessed Jack and all Americans are right beneath the surface. Danny has inherited Jack’s legacy.”[17] If Danny can throw off the spectres of the past, there is a possibility of freedom, then, but have the “strongest compulsive influences” already done their work? Is Danny, too, destined to always have been the Overlook’s caretaker?
coffee bars and internment camps
I’ve finally seen Children of Men, on DVD, after missing it at the cinema. Watching it last week I asked myself, why is its rendering of apocalypse so contemporary?
British cinema, for the last thirty years as chronically sterile as the issueless population in Children of Men, has not produced a version of the apocalypse that is even remotely as well realised as this. You would have to turn to television — to the last Quatermass serial or to Threads, almost certainly the most harrowing television programme ever broadcast on British TV — for a vision of British society in collapse that is as compelling. Yet the comparison between Children of Men and these two predecessors points to what is unique about the film; the final Quatermass serial and Threads still belonged to Nuttall’s bomb culture,[2] but the anxieties with which Children of Men deals have nothing to do with nuclear war.
Children of Men reinforces what few would doubt, but which British cinema would seldom lead you to suspect: the British landscape bristles with cinematic potential. It’s long since been evident that only someone outside the self-serving, self-pitying low gene pool of British cinema is capable of realising this potential, and Children of Men’s director, Alfonso Cuarón, and cinematographer, Emmanuel Lubezki, are both Mexican. Together they have produced a portrait of Grim Britannia that is like a film equivalent of the Burial LP (and the film’s excellent soundtrack features Burial’s mentor and label-mate, Kode9).
Lubezki’s cinematography is breathtaking. His photography seems to leech all organic and naturalistic vitality from the images, leaving them a washed-out grey-blue. As David Edelstein put it in an insightful review in New York Magazine: “The movie calls to mind an early description in Cormac McCarthy’s overwrought but gripping post-apocalypse novel The Road of gray days ‘like the onset of some cold glaucoma dimming away the world.’”[3] The lighting is masterly: it is as if the whole film takes place in a permanent winter afternoon when even the sun is dying. White smoke, its source unspecified, curls ubiquitously.
Cuarón’s trick is to combine this despondent lyricism with a formal realism, achieved through the expert use of hand-held camera and long takes. Blood spatters onto the camera lens and goes unwiped. The gunfire is as oppressively tactile as it was in Saving Private Ryan. The meticulously choreographed long takes — technical feats of some magnitude — have justly been highly praised, and they are all the more remarkable because they go beyond the familiar role of simulating documentary realism to serve a political and artistic vision.
This brings us back, then, to my initial question, and I think that there are three reasons that Children of Men is so contemporary.
Firstly, the film is dominated by the sense that the damage has been done. The catastrophe is neither waiting down the road, nor has it already happened. Rather, it is being lived through. There is no punctual moment of disaster; the world doesn’t end with a bang, it winks out, unravels, gradually falls apart. What caused the catastrophe to occur, who knows; its cause lies long in the past, so absolutely detached from the present as to seem like the caprice of a malign being: a negative miracle, a malediction which no penitence can ameliorate. Such a blight can only be eased by an intervention that can no more be anticipated than was the onset of the curse in the first place. Action is pointless; only senseless hope makes sense. Superstition and religion, the first resorts of the helpless, proliferate.
Secondly, Children of Men is a dystopia that is specific to late capitalism. This isn’t the familiar totalitarian scenario routinely trotted out in cinematic dystopias (see, for example, V for Vendetta, which, incidentally, compares badly with Children of Men on every point).
If, as Wendy Brown has so persuasively argued, neoliberalism and neoconservatism can be made compatible only at the level of dreamwork, then Children of Men renders this oneiric suturing as a nightmare. In Children of Men, public space is abandoned, given over to uncollected garbage and to stalking animals (one especially resonant scene takes place inside a derelict school, through which a deer runs). But, contrary to neoliberal fantasy, there is no withering away of the State, only a stripping back of the State to its core military and police functions. In this world, as in ours, ultra-authoritarianism and Capital are by no means incompatible: internment camps and franchise coffee bars co-exist.
In P.D. James’ original novel, democracy is suspended and the country is ruled over by a self-appointed Warden. Wisely, the film downplays all this. For all that we know, the Britain of the film could still be a democracy, and the authoritarian measures that are everywhere in place could have been implemented within a political structure that remains, notionally, democratic. The War on Terror has prepared us for such a development: the normalisation of crisis produces a situation in which the repealing of measures brought in to deal with an emergency becomes unimaginable (when will the war be over?). Democratic rights and freedoms (habeas corpus, free speech and assembly) are suspended while democracy is still proclaimed.
Children of Men extrapolates rather than exaggerates. At a certain point, realism flips over into delirium. Bad dream logic takes hold as you go through the gates of the Refugee Camp at Bexhill. You pass through buildings that were once public utilities into an indeterminate space — Hell as a Temporary Autonomous Zone — in which laws, both juridical and metaphysical, are suspended. A carnival of brutality is underway. By now, you are homo sacer[4] so there’s no point complaining about the beatings. You could be anywhere, provided it’s a warzone: Yugoslavia in the Nineties, Baghdad in the Noughties, Palestine any time. Graffiti promises an intifada, but the odds are overwhelmingly stacked in favour of the State, which still packs the most powerful weapons.
The third reason that Children of Men works is because of its take on cultural crisis. It’s evident that the theme of sterility must be read metaphorically, as the displacement of another kind of anxiety. (If the sterility were to be taken literally, the film would be no more than a requiem for what Lee Edelman calls “reproductive futurism”, entirely in line with mainstream culture’s pathos of fertility.) For me, this anxiety cries out to be read in cultural terms, and the question the film poses is: how long can a culture persist without the new? What happens if the young are no longer capable of producing surprises?
Children of Men connects with the suspicion that the end has already come, the thought that it could well be the case that the future harbours only reiteration and re-permutation. Could it be, that is to say, that there are no breaks, no “shocks of the new” to come? Such anxieties tend to result in a bi-polar oscillation: the “weak messianic” hope that there must be something new on the way lapses into the morose conviction that nothing new can ever happen. The focus shifts from the Next Big Thing to the last big thing — how long ago did it happen and just how big was it?
The key scene in which the cultural theme is explicitly broached comes when Clive Owen’s character, Theo, visits a friend at Battersea Power Station, which is now some combination of government building and private collection. Cultural treasures — Michelangelo’s David, Picasso’s Guernica, Pink Floyd’s inflatable pig — are preserved in a building that is itself a refurbished heritage artefact. This is our only glimpse into the lives of the elite. The distinction between their life and that of the lower orders is marked, as ever, by differential access to enjoyment: they still eat their artfully presented cuisine in the shadow of the Old Masters. Theo asks the question: how can all this matter if there will be no-one to see it? The alibi can no longer be future generations, since there will be none. The response is nihilistic hedonism: “I try not to think about it”.
T.S. Eliot looms in the background of Children of Men, which, after all, inherits the theme of sterility from “The Waste Land”. The film’s closing epigraph “shantih shantih shantih” has more to do with Eliot’s fragmentary pieces than the Upanishads’ peace. Perhaps it is possible to see the concerns of another Eliot — the Eliot of “Tradition and the Individual Talent”[5] — ciphered in Children of Men. It was in this essay that Eliot, in anticipation of Bloom, described the reciprocal relationship between the canonical and the new. The new defines itself in response to what is already established; at the same time, the established has to reconfigure itself in response to the new. Eliot’s claim was that the exhaustion of the future does not even leave us with the past. Tradition counts for nothing when it is no longer contested and modified. A culture that is merely preserved is no culture at all. The fate of Picasso’s Guernica — once a howl of anguish and outrage against fascist atrocities, now a wall-hanging — is exemplary. Like its Battersea hanging space in the film, the painting is accorded “iconic” status only when it is deprived of any possible function or context.
A culture which takes place only in museums is already exhausted. A culture of commemoration is a cemetery. No cultural object can retain its power when there are no longer new eyes to see it.
rebel without a cause
“Why is it […] that left-wingers feel free to make their films direct and realistic, whereas Hollywood conservatives have to put on a mask in order to speak what they know to be the truth?”
— Andrew Klavan, “What Bush and Batman Have in Common”[2]
“What I despise in America is the studio actors [sic] logic, as if there is something good in self expression: do not be oppressed, open yourself, even if you shout and kick the others, everything in order to express and liberate yourself. This stupid idea, that behind the mask there is some truth. […] Surfaces do matter. If you disturb the surfaces you may lose a lot more than you account. You shouldn’t play with rituals. Masks are never simply mere masks.”
— Slavoj Žižek and Geert Lovink, “Japan Through a Slovenian Looking Glass: Reflections of Media and Politic and Cinema”[3]
There are many symptomatically interesting things about the right-wing attempts to appropriate The Dark Knight that are doing the rounds at the moment. The idea is that the Batman of the film equals Bush — a misunderstood hero prepared to make “tough choices” in order to protect an ungrateful population from threats it is too ethically enfeebled to confront.
In a couple of intricately argued posts, Inspersal[4] demonstrates that The Dark Knight by no means presents “tough choices” as “hard but necessary”; on the contrary, whenever Batman resorts to torture, it either yields nothing or is counterproductive. What neocon readings of the film must overlook is that this is exactly the same in geopolitical reality: far from being unpalatable but necessary, the Iraq misadventure, Guantanamo Bay, extraordinary rendition, etc. have either achieved no results or made things worse. What’s interesting here is the doggedness of the neocon fantasy, which is precisely a fantasy of “being realistic” — astonishingly, elements of the American right appear to actually still believe that the Bush administration’s policies are successful, and that the American public has rejected them on the grounds of high-minded (liberal) ethical qualms rather than for pragmatic-utilitarian reasons (too many of our boys being killed).
Secondly, what these readings also miss is the actual nature of the model of virtue presented in the film. If this is (neo)conservative, it is not at the simple level of utilitarian calculation of consequences. What we are dealing with is a far more complicated Straussian meta-utilitarianism whose cynical reasoning is akin to that of Dostoyevsky’s Grand Inquisitor. Deception — of the masses by the elite — is integral to this account of virtue: what is “protected” is not the masses’ security but their belief (in Harvey Dent’s campaign).
As Inspersal argues, the emphasis on deception in The Dark Knight is one of the themes that connects it with Nolan’s previous films, and Batman’s climactic act of self-sacrifice is precisely an act of deception. It takes place at the level of signs: what he must give up is his reputation, his good standing in the eyes of the Gotham public. The act of deception doesn’t conceal an underlying good act — it is the concealing that is the good act itself.
Thirdly, the neocon readings misconstrue the nature of “evil” in the film. If these right-wingers really think that Osama bin Laden is like The Joker as he appears in The Dark Knight, that gives us another, intriguing, insight into their fantasies. (Matthew Yglesias says, “I look at the movie and say ‘see — if you were fighting a comic book bad guy and you were a comic book hero then your policies would make sense.’”[5] But even this isn’t the case, as Inspersal’s arguments above make clear.) Or rather, it reveals the inconsistency on which Islamophobic fantasy depends: the Islamist is both “an agent of chaos”, someone without a cause, and a zealot excessively attached to a cause.
What’s interesting about The Dark Knight is that it is not really about Good versus Evil at all but “good causes” versus aberrant modes of cause/causality. The Joker and Two-Face are mad rather than bad, and their insanity is centrally connected with their relationship to cause. The Joker is pure Terror, that is, Terror detached from any cause:
You see, nobody panics when things go according to plan. Even if the plan is horrifying. If I told people that a gangbanger was going to get shot, or a busload of soldiers was going to get blown up, nobody would panic. Because it’s all part of the plan. But tell people that one tiny little mayor is going to die and everyone loses their minds! Introduce a little anarchy, you upset the established order, and everything becomes chaos. I am an agent of chaos. And you know the thing about chaos, Harvey. It’s fair.
While Batman is drawn into utilitarian calculations, The Joker is free in the same way that the death drive is free: he acts with indifference to consequences, glorying instead in a kind of ungrounded unbinding of orderly causal sequences. The reference to “fairness” above is not idle. As an imp of the perverse, The Joker stands for an inverted (or freaked) Kantian justice. In many ways, we are looking at the reversal of Kantianism into Don Giovanni that Žižek has described many times (Don Giovanni’s decision not to save himself, to maintain his commitment to his libertinism even when doing so will result in his execution, becomes an ethical gesture). The Joker acts without any pathological interests, grandly symbolising his lack of instrumentality with the burning of the pyramid of money.
Two-Face’s insanity is also a kind of haemorrhaging of justice. In his case, the championing of a good cause — which it seems will inevitably lead to terrible consequences — is displaced by an embrace of chance’s random causality (heads/tails). The flip into randomness is not an abandonment of justice, but the quest for a justice that will not be corrupted by human will — in its very impersonal mechanism, chance is fair because it does not privilege any outcome or any individual. Interestingly, it is only when Dent becomes Two-Face that his coin tossing is fair; when Dent is the “White Knight” DA, his coin is loaded (it has heads on both sides). What also interrupts the orderly sequence of causality in Dent’s case is trauma — the trauma of seeing Rachel die, which is itself a consequence of a binary choice trap, one of a series of such traps The Joker attempts to spring.
The by now standard view of The Dark Knight — that its real libidinal pull is not the peripheral Batman/Wayne, but the charisma of Heath Ledger’s Joker — is certainly correct. When I heard Ledger’s performance celebrated, I feared the worst: that we were going to see the actorly overplaying that usually garners this kind of ubiquitous praise. But it is to Ledger’s immense credit that he completely avoids what Nicholson was allowed to do in Tim Burton’s dreadful Batman: we get no glimpse of the actor behind the role (with Nicholson, of course, that’s all we got). There is also no question of Ledger appearing bare-faced for any significant length of time, as Tobey Maguire and Julian McMahon were allowed to in Spider-Man 3 and the Fantastic Four films respectively. Thankfully, there is only the briefest glimpse of The Joker sans make-up in The Dark Knight.
What Ledger does, in many ways, is play the make-up. I should stress here that the make-up, which makes Ledger’s face look like a malevolent monkey leering from behind cracked plaster, manages a feat that is near impossible: it reinvents The Joker look whilst also maintaining fidelity to the comics (compare the Green Goblin’s mask and outfit in the Spider-Man films, whose divergence from the Halloween hood in the comics always disappointed me). My one point of disagreement with Inspersal concerns his claim that Ledger’s performance “shows the Nicholson/Burton interpretation to be much closer to Cesar Romero from the TV show, rather than Alan Moore’s version from The Killing Joke, allegedly Burton and Hamm’s chief influence”. I would argue that, in fact, it is Ledger’s performance that is closer to Romero’s, and that is why it works so well. Nicholson’s PoMo posturing and Moore’s psychological depth were all of a piece, and both were far less terrifying than the senseless gibbering of Romero’s pantomime-turn Joker. The Joker was always fascinating because, unlike most if not all big-time supervillains, he was pure surface, motiveless madness, devoid of any origin or backstory — until Moore obligingly filled one in, as is his hamfisted pseudo-literary wont. There are a couple of great scenes in The Dark Knight where Ledger’s Joker mocks cod psychoanalytic reduction: “See these scars… I got them because of my father.” “See these scars… I got them because of my wife.” (This reminded me of nothing so much as Ian Bannen’s chilling burst of explosive laughter in Sidney Lumet’s The Offence, in response to Sean Connery’s question: “Was your father a big man?”) If The Joker aligns himself with anything it is “the freak”, which cannot but remind us of freak events, that is, events which appear to happen without proper causation. By evacuating The Joker of all interiority, by refusing anything which would contain the Joker’s wildness or compromise the autonomy of his face-painted persona, Ledger’s performance (and Jonathan Nolan’s script) do justice to the freakish.
robot historian in the ruins
“Ideology is not something foreign, something in a film with a strange power to impose itself on our minds; ideology is what we and the film share, what allows for the transfer of specific meanings between film and audience (a transfer which is not one way). As Žižek puts it, ideology is made up of ‘unknown knowns’; that is to say, the problem with ideology is not that it is a falsehood of which we might be persuaded, but because it is a truth that we already accept without knowing it.”
— Voyou, “Ideology critics are a superstitious, cowardly lot”[2]
Voyou’s remarks on readings of The Dark Knight make some important points about ideology. Focusing on the supposed “message” of the film — as both neoconservative interpretations of the film, and their critics, including me, do — is in danger of missing the way in which ideology works in capitalism. The role of capitalist ideology is not to make an explicit case for something in the way that propaganda does, but to conceal the fact that the operations of capital do not depend on any sort of subjectively assumed belief. It is impossible to conceive of fascism or Stalinism without propaganda — but capitalism can proceed perfectly well, indeed better, without anyone making a case for it.
In the responses to The Dark Knight I posted here, it was Wayne Wedge who captured the way that the film functions as a hyper-object in late capitalism.[3] The very multivalence of The Dark Knight, its capacity to generate radically different interpretations, to elicit discourse, is what makes it a highly efficient meta-commodity. A text with a single monologic Message, even supposing such a thing could exist, would not be able to “provoke the debate” which capitalist culture now feeds upon.
It is not only that a cultural object can be opposed to capitalism on the level of content, but serve it on the level of form; one could convincingly go further and argue that the ideology of capitalism is now “anti-capitalist”. The villain in Hollywood films is routinely the “evil multinational corporation”. So it is, once again, in Disney/Pixar’s Wall-E, which, like The Dark Knight, has provoked all kinds of bizarre conservative readings. “This is perhaps the most cynical and darkest big-budget Disney film ever”, claims Kyle Smith[4]. “Perhaps never before has any corporation spent so much money on insulting its customers.” (By way of parenthesis, since it isn’t relevant to my argument here, this, from Paul Edwards, is priceless: “WALL-E is the story of what results when a liberal vision of the future is achieved: government marries business in the interest of providing not only ‘the pursuit of happiness’ but happiness itself, thus creating gluttonous citizens dependent on the government to sustain their lives.”)[5]
Wall-E’s attack on consumerism is easily absorbed. The “insult” that provoked Kyle Smith into disgust was its image of humans as obese, infantilised chair-bound consumers supping pap from cups. Initially, it might seem subversive and ironic that a film made by a massive corporation should have such an anti-consumerist and anti-corporate message (it is made clear in the film that the mega corporation Buy N Large is chiefly responsible for the environmental depredation which has destroyed Earth as a human environment). Yet it is capital which is the great ironist, easily able to metabolise anti-corporate rhetoric by selling it back to an audience as entertainment. Besides, on the level of content, Wall-E ends up serving capitalist realism, presenting what we might think of as the very fantasies of capital itself — that it can continue to expand infinitely; that the despoliation of the human environment on Earth is a temporary problem that will eventually be overcome; that human labour can be extirpated altogether (on the spaceship Axiom, humans are given over entirely to consumption, and all work is performed by servomechanisms). Human labour returns only at the end of the film, when capital/Axiom begins its terraforming of Earth.
There is another impasse in Wall-E. The film follows in the tradition of fictions about wanderers in the ruins (cf Christopher Woodward’s In Ruins). But in some respects Wall-E was an advance on the stories of postapocalyptic solitaries, from Mary Shelley’s The Last Man through to Richard Matheson’s I Am Legend or John Foxx’s The Quiet Man. For in Wall-E the lone figure in the ruins is not even human: it is a robot historian quite different from the one Manuel DeLanda imagined; or not a robot historian so much as a bricoleur-hauntologist, reconstructing human culture from a heap of fragments. (A precursor of this scenario is Numan’s “M.E.”, the track sampled by Basement Jaxx on “Where’s Your Head At”, written from the perspective of a sentient computer left alone on Earth.) This idea of surveying a world in which humans are extinct clearly exercises a powerful fantasmatic allure. Yet it seems that there’s a certain point where the fantasy always breaks down — the fictions that start from this premise invariably end up restoring a human world at some point in the narrative. It is no doubt asking too much that Wall-E should buck this trend; but it’s notable that the film deteriorates massively the moment that the humans appear (cf all of the film versions of Matheson’s I Am Legend, including the most recent). You’re left wondering whether this is a structural necessity, whether there’s something in the nature of the fantasy itself which entails the return of other humans, or whether it is a requirement arising from the needs of narrative: stories can’t sustain themselves with only one protagonist. In the case of Wall-E, of course, there are two (non-human) characters, which make the early part of the film, a robotic romance played out as animated ballet, recall the films of the silent era. Needless to say, there are many films which feature non-human protagonists, but such characters are rendered effectively human by their language use. Wall-E and Eve, meanwhile, seem like convincing non-human subjects because they lack language. Wall-E tantalises: what if the feel of this first section had continued until the end of the film, uninterrupted by the return of humans?
review of tyson
“It’s like a Greek tragedy. The only problem is that I’m the subject”, Mike Tyson reputedly told James Toback when he first saw this film. There is a classical structure to the narrative: a kid from mean streets, with few prospects, a life of criminality already under way, is talent-spotted by a grizzled boxing trainer; he becomes the youngest world champion ever; then it all disintegrates into hedonism, profligacy and violence. Yet in the end the structure of the story is psychoanalytic as much as tragic (after all, it wasn’t for nothing that Freud turned to Sophocles and Aeschylus for analogues of his discoveries). A familiar enough narrative arc, but what makes it even more remarkable (and even more Freudian) is that it happens again. Tyson struggles back to the top of the heavyweight game before once again succumbing to ill-discipline and self-destruction. A textbook case of the compulsion to repeat.
Tyson’s life was shaped by absent fathers and father surrogates. He was rescued from rudderless street survivalism by the trainer who ended up adopting him, Cus D’Amato; his subsequent fall from grace was precipitated in part by D’Amato’s death. The Tyson that emerges in Toback’s gripping film is very much like the subject of psychoanalysis, a talking head coaxed by the director (in the role of the offscreen analyst) into reliving all the triumphs and traumas. The film consists only of archive footage and Tyson — a ringside commentator on his own life — talking. There are no experts, no supposedly neutral judgements, only Tyson trying to make sense of the double tragedy of his life. It makes for a claustrophobic experience, amped up by the way in which Toback occasionally multi-tracks Tyson’s voice and splits the screen, creating the impression of a divided man, sometimes chillingly self-aware, sometimes a mystery to himself.
Tyson’s story is sufficiently forgotten now that it is capable of thrilling and horrifying us as if for the first time: the astonishingly quick rise to world champion, the run of viciously efficient victories, the high-profile debacle of his marriage to Robin Givens (Tyson sitting stock still on a chatshow couch while the actress vilifies him), the rape conviction and resulting prison sentence, the conversion to Islam, the biting of Evander Holyfield’s ear… Tyson provides a newly intimate perspective on these half-remembered images.
Sports stars of this magnitude cannot but be the objects of collective fantasy and projection, and even though his is an individual story — and we can be under no illusions after watching Tyson that there is no lonelier sport than boxing — Tyson’s is also the story of a culture and a time. Just compare Tyson with Muhammad Ali (whose own myth was examined and re-presented in When We Were Kings and Ali). With his poetry, physical and verbal, Ali was the boxer for the age of Black Power, the Panthers, Malcolm X, Sly Stone and James Brown; Tyson’s pitbull brutality, meanwhile, was the fight analogue of the every-man-for-himself ethos of Reaganomics and the will-to-power pugilism of rap. His slogan was “Refuse to Lose” (a phrase that would be central to Public Enemy’s epochal Welcome to the Terrordome): the aim was to overcome Nemesis by force of will alone, and in his pomp Tyson looked like iron will embodied. He came out of his corner like a starved attack dog, clubbing opponents into oblivion in a matter of moments. Nothing was wasted; there was no grandstanding or showboating.
Partly that was because Tyson felt he had no time to waste — for physical as well as existential reasons. He had suffered from a respiratory disorder since childhood and knew that he would struggle if fights went the full distance. The rapidity and intensity of his victories belied the precision of his attacks. We learn that it wasn’t a question of sheer physical force alone. D’Amato (a “master of anatomy”, according to Joyce Carol Oates) taught him where on the body to hit to cause maximum damage. In the fight footage, Tyson always looks short by comparison with his opponents — “at five feet 11 inches”, Oates wrote in a 1986 essay, “he is short for a heavyweight and strikes the eye as shorter still; his 222 1/4-pound body is so sculpted in muscle it looks foreshortened, brutally compact.”[2] Yet he always turned that compactness to his advantage, making the taller men look like ponderous Harryhausen statues.
Listening to him speak, you’re continually struck by the contrast between Tyson the fighting machine and Tyson the talker. His voice is a gentle lisp, devoid of swagger, suggestive of an unusual sensitivity. It sits just as oddly with Tyson’s older face and its Queequeg tattoos as it did with his earlier fighting frame. It becomes obvious, though, that the hypermuscular body Tyson developed was in part an exo-skeleton constructed to protect that sensitive core. Remembering the time he first realised that no one would ever be able to beat him up again, Tyson stalls — “Oh, I can’t even say it” — pauses for a long moment before saying, “Because I would fuckin’ kill ‘em.” The film’s rhythm is governed by Tyson’s unstable relationship to language, by his switches in and out of articulacy. Sometimes his tongue is as quick as his fists once were. His hilarious takedown of Don King — a “wretched slimy reptilian motherfucker” — is as swift and savage as any of his combinations in the ring. Elsewhere, the words elude him, or he evades them. Yet, exactly as psychoanalysis taught us to expect, the ellipses, the sentences that lead nowhere and the “wrong” choice of word tell us even more than the moments of transparent lucidity. The unconscious speaks, and James Toback demonstrates an extraordinary facility for hearing and recording it.
“they killed their mother”: avatar as ideological symptom
Watching Avatar, I was continually reminded of Žižek’s observation in First As Tragedy, Then As Farce, that the one good thing that capitalism did was destroy Mother Earth. “There’s no green there, they killed their mother”, we are solemnly informed at one point. Avatar is in some ways a reversal of Cameron’s Aliens. If the “bug-hunt” in Aliens was, as Virilio argued, a kind of rehearsal for the mega-machinic slaughter of Gulf War I, then Avatar is a heavy-handed eco-sermon and parable about US misadventures in Iraq and Afghanistan. (What’s remarkable about Avatar is how dated it looks. In the scenes of military engagement, it is as if Eighties cyberpunk confronts something out of Roger Dean or the Myst videogames; Cameron’s vision of military technology has not moved on since Aliens.) At the end of the film, it is the human corporate and military interests who are described as “aliens”. But this is a film without any trace of the alien. Like most CGI extravaganzas, it flares on the retina but leaves few traces in the memory. Greg Egan finds little to admire in Avatar, but he does defer to its technical achievements: “mostly, the accomplishments of the visual designers and the army of technicians who’ve brought their conception to the screen appear pixel-perfect, and hit the spot where the brain says ‘yes, this is real’.”[2] The cost of this, though, is that it is very difficult to be immersed in the film as fiction. It is more akin to a theme-park ride, a late-capitalist “experience”, than a film.
What we have in Avatar is another instance of corporate anti-capitalism such as I discussed in Capitalist Realism in relation to Wall-E. Cameron has always been a proponent of Hollywood anti-capitalism: stupid corporate interests were the villains in Aliens and Terminator 2 as they are in Avatar. Avatar is Le Guin-lite, a degraded version of the scenario that Le Guin developed in novels such as The Word For World Is Forest, The Dispossessed and City Of Illusions, but stripped of all Le Guin’s ambivalence and intelligence. What is foreclosed in the opposition between a predatory technologised capitalism and a primitive organicism, evidently, is the possibility of a modern, technologised anti-capitalism. It is in presenting this pseudo-opposition that Avatar functions as an ideological symptom.
No primitivist cliché is left untouched in Cameron’s depiction of the Na’vi people and their world, Pandora. These elegant blue-skinned noble savages are at one with their beautiful world; they are Deleuzean Spinozists who recognise that a vital flow pervades everything; they respect natural balance; they are adept hunters, but, after they kill their prey, they thank its “brother spirit”; the trees whisper with the voices of their revered ancestors. (Quite why skirmishes with the Na’vi and their bows and arrows should have prompted Stephen Lang’s grizzled colonel into Apocalypse Now-like disquisitions on how Pandora made for his worst experience in war is unclear.) “There’s nothing we have that they want”, concludes Sam Worthington’s Jake Sully of the Na’vi. Yet the Na’vi predictably seduce Sully, who quickly “forgets everything” about his former life on Earth (about which we learn almost nothing, beyond the fact that he is a marine who got injured in the course of battle) and embraces the wholeness of the Na’vi way of life. Sully attains wholeness through his avatar Na’vi body in a double sense: first, because the avatar is able-bodied, and, secondly, because the Na’vi are intrinsically more “whole” than the (self-)destructive humans. Sully, the marine who is “really” a tree-hugging primitive, is a paradigm of that late-capitalist subjectivity which disavows its modernity. There’s something wonderfully ironic about the fact that Sully’s — and our — identification with the Na’vi depends upon the very advanced technology that the Na’vi’s way of life makes impossible.
But a telling tic in the film is the repeated compulsion to explain the persistence of (physical) wounds among the human characters. Given the level of technology in the film’s 2154, both Sully’s useless legs and the colonel’s scars could easily have been repaired, and the script goes out of its way to say why the two characters remain disabled and maimed respectively: in Sully’s case, it’s because he can’t afford the medical treatment; in the colonel’s, it’s because he “likes to be reminded of what he’s up against”. Such explanations are clearly unconvincing — the narratively underdetermined wounds can only be explained as libidinal residue which the film cannot fully digest into its digital Imaginary. The wounds prevent the disavowal of modern subjectivity and technology which Avatar attempts at the very same moment that the film invites us to admire it as a technological spectacle.
If we are to escape from the impasses of capitalist realism, if we are to come up with an authentic and genuinely sustainable model of green politics (where the sustainability is a matter of libido, not only of natural resources), we have to overcome these disavowals. There is no way back from the matricide which was the precondition for the emergence of modern subjectivity. To quote one of my favourite passages in Žižek’s First As Tragedy: “Fidelity to the communist Idea means that, to repeat Arthur Rimbaud, […] we should remain absolutely modern and reject the all too glib generalisation whereby the critique of capitalism morphs into the critique of ‘modern instrumental reason’ or ‘modern technological civilisation’.”[3] The issue is, rather, how modern technological civilisation can be organised in a different way.
precarity and paternalism
The recent discussion of elitism (a topic also broached by Adam Curtis’ film on Charlie Brooker’s Newswipe this week) brings me back to the question of what — in the continuing lack of any alternative term — I must still refer to as “paternalism”. I think Taylor Parkes got to what is at stake in these discussions in his rather moving Quietus piece about Trunk’s Life On Earth release:
Hard to credit now, but there was once something paternalistic, almost philanthropic about the Beeb, spreading the cultural wealth of the educated classes through housing estates and comprehensive schools. This kind of evangelism rarely sits well with self-conscious champions of the lumpenproletariat, whose right to live in shit, they believe, outweighs their right to not live in shit — for some, being patronised is worse than being brutalised. But then people can be very naïve about the motivations of those who give the people what they want, relentlessly and remorselessly. And while the Corporation was sometimes guilty of gross assumptions and a very real stuffiness, I don’t like to think how I might have grown up — stomping around in the middle of nowhere — had it not been for Life On Earth, or Carl Sagan’s Cosmos, or James Burke’s Connections, or the gentle guidance of the BBC Children’s department. Years ago, I interviewed the men in charge of “youth programming” at Channel 4, goateed and bereted and utterly insistent that their race to the bottom was a noble crusade; they railed against the BBC’s “eat-your-greens” approach, and spoke of gallons of liquid effluent, coursing through the pipes of British culture, in terms of freedom and some strange colour of egalitarianism. Here was the future, banging its drums, and even then it made me blanch. As controller of BBC2 in the late 1960s, David Attenborough had a different vision, rooted in what was, for all his personal privilege, an (enduring) belief in inclusivity. If the so-called Golden Age of Television could boast its fair share of shoddy, overlit crap — and my God, it could — at best it was truly empowering, and its passing has screwed us all to some extent. We can still choose to watch BBC Four, I suppose (assuming it’s not another show where ex-NME writers smirk at Mud’s trousers), but then this is an age of choices, few of which have much to do with freedom in the long term. No one’s going to stumble onto culture any more, not like I did, or my dragged-up mates did. It’s worse than a shame.[2]
It’s worth reminding ourselves of the peculiar logic that neoliberalism has successfully imposed. Treating people as if they were intelligent is, we have been led to believe, “elitist”, whereas treating them as if they are stupid is “democratic”. It should go without saying that the assault on cultural elitism has gone alongside the aggressive restoration of a material elite. Parkes touches here on the right way to think about paternalism — not (just) as something prescriptive, but in terms of the gift and the surprise. The best gifts are those we wouldn’t have chosen for ourselves — not because we would have overlooked or rejected them, but because we simply wouldn’t have thought of them. Neoliberal “choice” traps you in yourself, allowing you to select amongst minimally different versions of what you have already chosen; paternalism wagers on a different “you”, a you that does not yet exist. (All of which resonates with J.J. Charlesworth’s illuminating piece on the management of the ICA in Mute, with its attack on the assumption that “what the audience wants is merely what the institution should do”.[3])
Neoliberalism may have been sustained by a myth of entrepreneurialism, a myth that the folk economics of programmes like The Apprentice and Dragon’s Den have played their part in propagating, but the kind of “entrepreneurs” that dominate our culture — whether they be Bill Gates, Simon Cowell or Duncan Bannatyne — have not invented new products or forms, they have just invented new ways of making money. Good for them, no doubt, but hardly something that the rest of us should be grateful for. (The genius of Cowell was to have plugged a very old cultural form into new machineries of interpassivity.) And for all the bluster about entrepreneurialism, it is remarkable how risk-averse late capitalism’s culture is — there has never been a culture more homogenous and standardised, more repetitive and fear-driven.
I was struck by the contrast between Parkes’ piece and an article that Caitlin Moran wrote in the wake of the announcement that Jonathan Ross is to leave the BBC. “After [Ross’] £18 million contract”, Moran wrote,
endless fretting pieces were written, asking whether the BBC should ever try to compete with ITV1’s salaries. The real question, however, is “what would happen to the BBC if it didn’t?” If the only people who work for the BBC are those in it for the sheer love of it — and those who would piously turn down double the wages from ITV — the BBC would rapidly become the middle-class liberal pinko panty-waist institution of the Daily Mail’s nightmares, and, I suspect, fold within five years.
Really? ITV’s high salaries, when they could afford to pay them, were hardly guarantees of quality; and the idea that Ross is one of us because he was “quick, edgy, silly nerd-dandy, into Japanese anime and rackety new guitar bands” presupposes a model of the “alternative” as shopworn and discredited as New Labour. Note that Moran fully accepts the neoliberal logic whereby “talent” is only motivated by money. (The return of the concept of “talent”, with all its de-punking implications, was perhaps the most telling cultural symptom of the last decade; while the application of the word to bankers was its sickest joke.)
As Moran suggests, the BBC’s real rival now, evidently, is not the ailing ITV but the Daily Mail and News International, and if public service broadcasting is to defend itself against an assault that will only increase in ferocity, it will need rather more than Ross’ sexual suggestiveness, warmed over hipness and occasional wit at its disposal. (It’s far harder for the Mail to attack the likes of Attenborough than triviamongers such as Ross or Graham Norton; and did Attenborough ever get the equivalent of Ross’ eighteen million, I wonder?) It’s not only unjustifiable that public money be spent on exorbitant salaries for presenters and executives: it also plays into the Mail’s agenda, which is all about maintaining the negative solidarity which has been crucial to neoliberal hegemony.[4] Call me old fashioned, but I firmly believe that only those who would work for the BBC for the sheer love of it should be in the job. More than that, being motivated by money ought to be a reason for people not getting senior public service appointments. This is not, grotesquely, an argument for low wages — but it is an argument for the more equitable — and creative — redistribution of money in the public sphere. Imagine if Ross’ eighteen million were instead spent — risked — on what British television most sorely lacks, writers. You could pay scores of writers a good wage for years… The BBC ought to be in a position to cushion its creative staff from the pressures of producing immediate success — and, contrary to the neoliberal logic which insists that people are best motivated by fear and money, it is that cushioning which facilitates a certain kind of cultural entrepreneurialism.
After all, people will do worthwhile things if they are not paid or if they are paid poorly. The interesting side of Web 2.0 is just this — not the vacuous “debates”, but the impulse to share that is a significant part of the motivation for writing blogs, uploading material to YouTube and updating Wikipedia. If anything is the work of the multitude, it’s something like the salvagepunk archive that is YouTube. It’s intriguing that capitalist realism co-exists with the emergence of new forms of culture which can be commodified only very incompletely. At one level, commodification is total, and, in Jeremy Rifkin’s phrase, all of life is a paid-for experience; yet there are whole areas of culture which are effectively being decommodified (does anyone seriously think that any recorded music will be paid for at all in a decade?). As a cultural worker, this is something I am ambivalent about, to say the least […] I seem to achieve success in things at the very moment that it’s no longer possible to make money from them…
When I was in Dublin a week or so ago talking about Capitalist Realism, a member of the audience asked why I was talking about public service workers when my own situation has shown that it’s better to leave full-time employment and enter the precariat. This is a reasonable question on the face of it, given that I’ve done pretty well since being made redundant from my FE teaching job. Yet in some respects all that has happened is that I’ve swapped the NuBureaucratic stress of public service employment for the perpetual anxiety of hyper-precarity, and had my income massively cut in the process. One of the ways in which negative solidarity plays out is by exploiting the opposition between permanent employees and precarious workers. Permanent employees tend to be quietist to keep (what they think of as) their job security, whereas precarious workers, being expendable, have no power at all. A while back, Tobias van Veen gave a very powerful account of his own experiences of precarious labour:
there is an ironic yet devastating demand being placed on the labourer: while work never ends (as one is never out of touch, and always expected to be available, with no claims to a private life or other demands), you as a worker are nonetheless completely expendable (and thus a member of the precariat: and so one must sacrifice all autonomy from work so as to keep one’s job). […] This contemporary condition of on-call ontology or on-demand dasein produces an emotional economy of stress. To live under such instant-demand duress is stress-inducing indeed. Life becomes a series of panic attacks in the face of never being able to live up to such workplace demands without completely dismantling “life” itself as distinct from “work”. The managerial class uses techniques of guilt/loyalty to enforce workers to labour at a moment’s notice, scheduling with less than a few hours or days time, without hope of a raise, without benefits or reward, and all for a minimum wage.[5]
The precarious worker is doubly punished: not only do they have no job security, they also get paid less than the permanent employees for doing the same work. When I switched from being an hourly paid lecturer in Further Education to having a permanent contract, I was doing exactly the same work, but suddenly I was both paid hundreds of pounds more a month and got paid for holidays too. Back in the precariat, my total income since the tax year that began in April — for all the teaching, supervision, writing and editing I’ve done, when I doubt there’s been more than two weeks that I’ve worked less than fifty hours — is the princely sum of eleven grand, which works out at significantly less than minimum wage. All the work I’ve done depends upon my not being in full-time work, so, no matter that my hourly rate for some work seems quite high, in effect I’m always working for minimum wage. (Much writing only pays minimum wage anyway.) All this, in conditions where it’s impossible to turn down any commission, no matter how little notice I’m given, where I’m on-demand at practically all times and there are no guarantees that I will keep getting the work. The kind of hustling I’m required to do involves a kind of “creativity”, I suppose, but “getting creative” about how I can monetise my activities doesn’t seem like the best conceivable use of my time. What the broken, piecemeal time of precarity precludes is engagement in long-form projects. It’s very hard for me to devote any time to finishing my next book for Zer0 because I will always privilege any work that pays immediately. But full-time employment also precludes engagement in long-form projects: Capitalist Realism, for instance, was written after work or at weekends.
I say all this not because I want sympathy — I still think I’m incredibly fortunate to be making any sort of living out of what I do — but more because my situation is symptomatic. And now that the high-rolling, business ontology-driven model of cultural provision is finished, surely there’s a better way to fund cultural work?
return of the gift: richard kelly’s the box
I wouldn’t say that Richard Kelly’s The Box is a hauntological film, but it shares certain affinities with the way someone like Ghost Box re-dream the Weird. The Box is based on a short story by Richard Matheson, who occupies something like the same position in the American Weird that Ghost Box’s touchstone, Nigel Kneale, does in the UK Weird. Both Kneale and Matheson operated in an interstitial generic space — between SF and horror — proper to the Weird, in a pulp infrastructure — paperbacks, television, B cinema — that has now largely disappeared. Matheson has yet to quite acquire the auteur status that Kneale enjoyed, but this only adds to his pulp-anonymous artisan allure; there’s a special kind of delight in realising that films you’d likely as not first encountered, apparently randomly, on late night TV — The Incredible Shrinking Man, The Omega Man, Duel (as recently discussed by Graham[2]) — were in fact written by the same individual. (Matheson also wrote the screenplay for what — leaving aside the Kneale-scripted Quatermass and the Pit — is perhaps Hammer’s greatest film, The Devil Rides Out.)
Much like Jacob’s Ladder, which it resembles in a number of respects, The Box is a Weird take on the 1970s. Or rather, it draws together a number of Weird threads that were already present in the Seventies. Like Jacob’s Ladder and much hauntological music, The Box captures a certain grain of the Seventies. The Box feels like a re-dreaming of the Weird rather than a revival in part because of the very incoherence that some have complained about. This “incoherence” is of a particular type; it isn’t simply a failure of coherence so much as the generation of an oneiric (in)consistency which doesn’t add up (into a final resolution) but which doesn’t fragment into nonsense either.
The dream atmosphere is reinforced by the way that Kelly incorporates aspects of his own life — the characters of Arthur and Norma Lewis are apparently based closely on his own parents[3] — into the diegesis. But rather than the de-stranging tendencies at work in something like the new Dr Who — the Weird subordinated to familialism and emotionalism — The Box goes in the other direction, introducing the Weird into the family home — in parallel with how television used to do the same thing. The lines between Kelly’s home life and the Weird must have been soft in any case: his father worked at NASA at the time when the Viking probes were landing on Mars.
The Box is based on Matheson’s 1970 short story, “Button, Button”, later adapted into an episode of the revived Twilight Zone in 1986. To be more accurate, The Box uses both the original story and The Twilight Zone episode as elements in a simulated dreamwork which simultaneously extrapolates from the two versions and condenses them into an unstable compound. The result is a labyrinthine structure which bears some relation to Lynch’s Inland Empire (Inland Empire, incidentally, was the last film to creep me out as much as The Box did). The Box is defined by the tension between the structure of the labyrinth — an absolute labyrinth, leading nowhere except deeper into itself — and the structure of the dilemma — in which reality seems to resolve into a set of disjunctions.
It’s possible to delimit a number of distinct but connected levels at which the film operates.
The Ethical
The simplest level on which the film works — the film’s entry level — is that of the ethical. All three versions of “Button, Button” turn on a dilemma: not so much an ethical dilemma as a dilemma about whether to set aside the ethical altogether. A well-dressed stranger, Mr Steward, arrives and presents the Lewises with a box with a button on top of it. If they press the button, Steward informs them, they will receive a large sum of money (in The Box it is a million dollars); however, someone that they don’t know will die. In all three versions, it is the wife who decides to push the button. Here, the versions diverge: in Matheson’s original story, after Norma pushes the button, she receives the money as insurance compensation for the death of her husband. When she complains that Steward had told her that the person who died would be someone she didn’t know, Steward asks: “Did you really know your husband?” In The Twilight Zone version — which Matheson reputedly hated — the ending is different. Here, when Steward has handed over the money, he pointedly says to the couple, “I can assure you it will be offered to someone whom you don’t know.” The Box adopts this version of the story, but this is only the beginning of the film, the first act, as it were.
Unintended Consequences
“Button, Button” is clearly an update of W.W. Jacobs’ story “The Monkey’s Paw” — in which a family wishes for a sum of money, only to receive it in compensation for the death of their son. Jacobs’ story was itself a play on older tales about the unintended consequences of wish fulfilment. As Wiener observed in God and Golem, Inc.: A Comment On Certain Points Where Cybernetics Impinges On Religion, such unintended consequences arise because “the operation of magic is singularly literal-minded [in that] if it grants you anything at all, it grants you exactly what you ask for, not what you should have asked for or what you intend.” “The magic of automatisation, and in particular the kind of automatisation where the devices learn”, he adds, “may be expected to be similarly literal-minded”.[4] Like the cybernetic machine, the wish-fulfilling object (the monkey’s paw) delivers exactly what it says it will: but what it gives you may not be what you want (or what you think you want).
What Matheson’s tale adds to Jacobs’ story is the question of knowledge. Matheson’s story brings into play the old philosophical “problem of other minds”, now applied to the marital situation: even those closest to us are ultimately opaque, black boxes into which we can never see. Naturally, this also raises the equally ancient problem of self-knowledge, but given a psychoanalytic edge. We are alien to ourselves; our real desires may be unknown to us, emerging only in parapraxes and dreams. Here the oneiric form of The Box collapses into its content — the box, like the dream according to Freud, fulfils our wishes. The inevitable psychoanalytic conjecture into which Matheson’s story tempts us is the thought that perhaps the wife does get exactly what she wants — that the death of her husband was her wish all along. In this sense the box would be like the Room in Tarkovsky’s Stalker: the stalker Porcupine goes into the wish-fulfilling Room hoping for the return of his dead brother, but receives instead immense riches. In its very unreflective automatism — giving Porcupine exactly what he wants — the Room judges and condemns him.
The Political
What Matheson’s story also adds to “The Monkey’s Paw”, of course, is the fact that the bad consequences are not simply unintended; they were just supposed to happen to someone else. This is what makes it so much nastier than Jacobs’ tale. Whereas the family in “The Monkey’s Paw” are guilty only of foolishness and greed, the couple in “Button, Button” knowingly trade another’s death in exchange for wealth. In The Box this is especially shocking because both Norma and Arthur Lewis seem to be “good” people — Cameron Diaz’s Norma in particular is immensely sympathetic. Perhaps what allows her to press the button is the unresolved ontological status of the box itself; the thought that it might be a prank (Arthur establishes that the box is empty) allows Norma to perform a kind of fetishist disavowal (“this might not be real, so I might as well do it”). As Hauntagonist put it on his Twitter feed: “the button in The Box is a nice example of how interactivity creates anxiety and fetishistic disavowal. Diaz doesn’t believe but she believes ‘the subject supposed to believe’ does, Arlington Steward being the stand-in for the Big Other.”
Here we are back in the realm of the ethical — but the ethical bleeds out into the political. The choice to press the button has a special force in the era of globalisation and climate change. We know that our wealth and comfort are achieved at the price of others’ suffering and exploitation, that our smallest actions contribute to ecological catastrophe, but the causal chains connecting our actions with their consequences are so complicated as to be unmappable — they lie far beyond not only our experience, but any possible experience. (Hence the inadequacy of folk politics.) What the Lewises are in effect asked to do is affirm their plugging into this causal matrix — to formally accept the world and worldliness. The significance of this is that only the negative choice counts — to not press the button would be to choose a freedom that is not available to anyone at present (we are all so intricately embedded into the global capitalist matrix that it isn’t possible to simply opt out). But to press the button is to give up on freedom, to choose blind determinism.
The Existentialist
Which brings us to the most explicit intertext that Kelly introduces into The Box: Sartre’s Huis Clos. Huis Clos is everywhere in The Box; Norma, a high-school teacher, is teaching it; she and Arthur attend an amateur dramatic performance of the play. At the point when it is becoming evident that the Lewises’ choice will not be some private shame but will infect and destroy every aspect of their lives, the couple find the words “No Exit” written in the condensation of their car’s windscreen.
The resonance of Huis Clos is clear: this is a text about those who can no longer choose, who have ceased to be subjects. Fearing that they will be killed, the Lewises try to return the briefcase of money immediately, the very instant that Steward tells them that he will be sure to give the box to someone who doesn’t know them. But the horror is that Norma and Arthur have made a choice that means that it is now too late: they are already (as if) dead. There is no returning the gift.
It is astonishing that the briefcase containing the money is immediately desublimated. Kelly could have had the Lewises spend the money, their enjoyment shadowed by their anxieties about what they had done… Instead, the briefcase is immediately dumped in their basement, never to be seen or — I think — mentioned again.
There is no possibility of returning the money — no way of taking back the choice to press the button — but there is no end to choosing either. Locked in an endlessly ramifying labyrinth, Arthur and Norma keep encountering further dilemmas — but the choice is now between bad (purgatory) and worse (hell); or else, as when Arthur is offered a choice of three gateways, two leading to eternal damnation, one to salvation, they have a quality of grotesque gameshow randomness.
The Religious
The mention of “salvation” is part of a persistent religious thread in the film. As the alien big Other, the one conducting “research” into the moral worth of human beings and judging them accordingly, and with the power of damnation and redemption in his hands, Steward clearly stands in for God. Yet he is a God who also performs the Satanic function of tempting humans.
The SF/Conspiracy
Steward’s position as the (extra-terrestrial) big Other, the subject supposed to know, also somewhat echoes Sartre’s discussion of the alien, as outlined by Infinite Thought:
Sartre, towards the end of his gigantic unfinished Critique of Dialectical Reason from 1960, suddenly launches into a discussion of Martians. “For [the] Martian…who has long known the technique of inter-planetary navigation, we are… an animal species whose scientific and intellectual development have been retarded by certain circumstances [the Martian] will note that the inhabitants of this underdeveloped planet have certain behavioural patterns orientated towards certain objectives…” Because the hypothetical Martian will be at a particular scientific level (the assumption here is that it will be a much higher one), when the extent of human knowledge is revealed to the alien, there enters into the conceptual arena an exterior agent who for the first time knows what we do not know as a species — the Martian thus serves as the big Other for the entire collective enumeration of human beings. This limit case of the big Other Martian becomes, as Sartre puts it “a deep opacity, shadows in our understanding, a negation of interiority in our hearts.”[5]
The Box is thick with references to conspiracy films (and includes some of the most creepily paranoid scenes since the remake of Invasion of the Body Snatchers). The full extent of the collusion of the authorities with Steward remains unclear even at the end of the film. The threads connecting NASA, the Viking probe and Steward’s research project fray off into rumour and supposition. The labyrinth never ends.
contributing to society
In respect of The Fairy Jobmother, it’s worth noting how much more pernicious it was than Benefit Busters, the original programme from which it was a spin-off. Despite its title, Benefit Busters allowed viewers to come to a critical judgement about the initiatives the government were using to “get people back to work”. The first part of the programme, the one featuring Hayley Taylor, was like some grim parody of a reality TV talent show, in which the glittering prize on offer was not a million-pound record deal but an unpaid work trial at discount store Poundland. Taylor was clearly a dupe of the ideology rather than its cynical author, credulously believing all the New Age psychobabble she pushed along with the facile advice (“brush your teeth before an interview”). There’s no doubt that some of the women were happier after being on the six-week “course” — but that was less because they were working for Poundland and more because they were not isolated in their own homes any more. Meanwhile, the programme showed us the home belonging to Emma Harrison, the boss of A4E,[2] the consultancy for which Taylor worked. To say that Harrison’s house was a mansion would be a massive understatement.[3] A4E employees such as Taylor were invited to Harrison’s house for “a cup of tea and a chat”, because Harrison is so informal and she just loves to get feedback from her workers. Faced with the extreme opulence of Harrison’s house, viewers were at least invited to question who the real parasites scrounging off the state were. The excellent WatchingA4E blogspot does invaluable work exposing the realities of A4E’s schemes.[4] This entry quotes a description of Harrison: “Emma’s approach is to work with people: ‘I walk by their side, hold their hand and we go on a journey resulting in them getting a job that transforms their lives’.”
Subsequent parts of Benefit Busters allowed viewers to form even more negative views of the government’s schemes to get people back to work — we saw the long-term unemployed cynically forced off benefits for a job that would last only a few days, and a poor young lad with severe back problems sustained after falling out of a window being told that he was fit for work. There was none of this critical perspective in The Fairy Jobmother, which presented the reality TV “journey” back to work without any irony. As Digital Ben puts it:
The show’s very title gives us an idea of what kind of strictly limited conclusions will be drawn at the end. Taylor’s steps did improve the family’s situation, but it was made clear that these “fairy godmother wishes” were miraculous and unexpected, a break from the normal order of things. The idea that they be distributed on a wider basis, or even structuralised as part of the benefits system, is never on the table. The majority of the working class unemployed are expected to pull themselves up by their bootstraps — become mini-Hayleys and fully valid humans without any outside help. So what exactly was the moral of the show? That finding work is easier when you have a well-known, well-connected recruitment specialist in your corner? Shocking. And even then — if Taylor fails to find work for the family next week, we can expect blame to be diverted to them. There is no systemic analysis. Blame falls solely upon the individuals (and, yes, their families).[6]
One can hardly overestimate the role that reality TV plays in generating this lottery thinking, which is the other side of what Alex Williams calls negative solidarity. The persistent message is that any situation can be rectified by the application of dedicated self-improvement. (C4 is to be given some credit for showing some programmes which resist this agenda: its series The Hospital and Our Drug War show the real hopelessness of the NHS and the war on drugs. The Hospital gives a grim picture of youth in the UK. Class was the unspoken factor here: there weren’t any middle-class kids being filmed arriving in hospital pregnant, or catching HIV, or getting involved in knife crime. In the first part, about the impact of unprotected sex, anti-authoritarian defiance came out as self-destructive bad faith: “they can’t tell me what to do”, “I’m the sort of person who has to do this”. There was a desperate joylessness about the mandatory pleasure-seeking; another side to the hedonic depression I talk about in Capitalist Realism.)
One of the things that irritated me in the last part of Fairy Jobmother was the moment when Taylor talked about someone getting back to work so they could “make a contribution to society” again. (My mentioning this on Twitter sparked a brief exchange with this character,[7] who said “you can do what you please but not with my cash. You don’t want to work that’s fine — just don’t expect me to pay”.) As if there are no other ways to “make a contribution to society” than paid work (what is the Big Society if not about the value of such unpaid contributions?); as if those in work didn’t depend, in numerous ways, on those not being paid for work…
Like many people I know, I spent my twenties drifting between postgraduate courses and unemployment, encountering many pointless and demoralising “helping you back to work” initiatives along the way. There wasn’t much difference between what I did on an average day when I was a student and what I did when I was unemployed, and there isn’t a great deal of difference between what I was doing then and what I do now. But now I’m fairly confident that I “make a contribution”; then I wasn’t. For a number of reasons, during my twenties I believed that I was unemployable — too feckless to do either manual work or retail, and nowhere near confident enough to do a graduate job of any kind. (The ads for graduate jobs would fill me with despair: surely only a superhuman could do the job as described?) I won’t deny that eventually getting employment was important — I owe so much of what I am now to getting a teaching job. But equally important was the demystification of work that gaining this employment allowed — “work” wasn’t something only available to people who belonged to a different ontological category to me. (Even so, this feeling wasn’t rectified by having a job: I had a number of depressive episodes when I was convinced that I wasn’t the sort of person who could be a teacher.)
But surely the importance of Virno and Negri’s work is to have undermined the distinction between work and non-work anyway. What precisely counts as non-work in post-Fordism? If, to use Jonathan Beller’s phrase, “to look is to labour” — if, that is to say, attention is a commodity — then aren’t we all “contributing”, whether we like it or not? As Nina Power argues, “[i]t is as if employers have taken the very worst aspects of women’s work in the past — poorly paid, precarious, without benefits — and applied it to almost everyone, except those at the very top, who remain overwhelmingly male and incomprehensibly rich.” In these conditions — in which unemployment/underemployment/perpetual insecurity are structurally necessary, not contingent accidents — there’s more of a case than ever for a benefits safety net.
At this point, I must plug Ivor Southwood’s forthcoming book, Non-Stop Inertia. It’s about the miseries of “jobseeking”, and it’s one of my favourite Zer0 books to date, combining poignant and funny observations derived from experience with theoretical acuity. The book is sure to be of interest to most people who enjoyed Capitalist Realism (indeed, Ivor writes about whole dimensions of capitalist realism which I didn’t touch upon). Here are a couple of paragraphs:
The endless unpaid duties assigned to the virtuoso jobseeker cast him as the postmodernised inversion of the 1980s “gizza job” persona, which confronted the employer directly with the physical reality of the reserve laborer and his family. Now, rather than proclaiming his jobless status the career jobseeker hides it, like something obscene, behind a screen of training courses and voluntary work and expressions of rictus positivity, and he becomes ever more complicit with this concealment in proportion to his desperation. The jobseeker must have an alibi ready to explain away every gap in his employment history, while the most mundane experience becomes the occasion of a personal epiphany — “working in a busy café really taught me something about the importance of customer service”. Skills are valued over knowledge. Non-vocational qualifications are almost a liability, unless they are emptied of content; a degree in literature is valued not for its evidence of critical thought but because it shows that the applicant has word processing experience.
What are we not thinking about during all those hours of jobseeking, networking and CV-building? What interests, worries and fantasies might we otherwise have? What books might we read (other than self-help manuals), what conversations might we have with colleagues and friends about topics other than work? How differently might we perceive our current jobs without this constant needling insecurity? What kind of dangerous spaces might open up, in what kind of jeopardy might we put ourselves and this dynamic system, if we resigned from our jobs as jobseekers?[8]
“just relax and enjoy it”: geworfenheit on the bbc
I first saw Artemis 81 when it was broadcast for the first and only time in December 1981. Even though it struck me then as incoherent and incomprehensible, I willingly sat through all three hours of it. Judging by the internet responses to Artemis 81, my experience was a common one amongst kids who, like me, were allowed to stay up late and watch it because it was broadcast during the school holidays.
I suppose that Artemis 81 was one of the things that I was thinking of when, towards the end of Capitalist Realism, I argued that, far from being dreary and dull, the so-called paternalist era of media could be a breeding ground for the Weird (Ghost Box’s conflation of secondary school textbooks with Weird fiction is based on the same intuition).
Artemis 81 was written by David Rudkin, the author of the better-known Penda’s Fen (to which I’ll be returning in another post very soon). Watching it again after nearly thirty years, the film doesn’t seem incomprehensible at all. It is structured around a simple Manichean dichotomy (Manicheanism was one of the heavily signposted themes of Penda’s Fen), and a mythic journey out of complacency and self-involvement and into a kind of visionary faith. (The persistent emphasis in Artemis 81 on the “leap into faith” makes for an interesting parallel with Inception: at one point, the lead character tells a woman who has been strung up inside a cathedral bell that “it is better to fall than to hang”.) What makes Artemis 81 still alienating to watch are all the things that it lacks — all those strategies for producing audience identification to which we are now so accustomed. The acting style is as Brechtian as anything you would see in a Straub-Huillet film; the dialogue is anti-naturalistic, highly mannered (it reminds me more of an opera than television writing — and Wagner is one of many intertexts).
Rudkin says on the DVD commentary that the alien planet which we appear to see at the start of the film belongs to inner space. It is never clear when we exit inner space. But the film gains a great deal of power from grounding this inner space in what you might call found locations: the ferry terminal at Harwich; a power station in North Wales, which during the time of filming was under construction, and which becomes the entry to hell; and perhaps most memorably of all, the interior of the Anglican cathedral in Liverpool, which the BBC crew were not only given permission to use — they were also allowed to clear out all the pews, making for some astonishing oneiric images.
One sequence in particular stands above all the others. It is both one of the most disturbingly effective dream — or nightmare — sequences I’ve ever seen in film (certainly it is far better at capturing dream topographies than anything in Inception), and also a deeply resonant image of dystopia. The lead character, pulp novelist Gideon Harlax (Hywel Bennett), suddenly finds himself in an unidentified city: he is on a tram, surrounded by consumptives expectorating blood into their scarves. It is foggy; the city is militarised, although there is a great deal of street market-like commercial activity. No one speaks English. When he enquires after Helith, the guardian angel who has abandoned him (played by Sting — but don’t let that put you off), people laugh or admonish him. A public address system incessantly streams out announcements in what sounds like an East European language (it is actually Estonian spoken backwards). Watched now, you can’t help but see anticipations of Blade Runner and Children of Men here. On the commentary, Rudkin says that this section of the film was supposed to illustrate Heidegger’s concept of Geworfenheit, or thrownness. Rudkin reveals that on-set, they used to refer to this city — actually a composite of Birmingham and Liverpool — as Geworfenheit, but this is never mentioned in the film itself. Beyond all the explicit references to myth, music and literature, there were further, occulted, layers of intertext. Another example, from this write-up on Artemis 81:
One minor point that reveals much about […] Rudkin’s approach: the presiding deity of the piece is a Scandinavian goddess known as Magog. But it takes an alert eye to spot the “Gog Magog Hills” in a map of Britain which we glimpse on the protagonist’s desk; a lesser dramatist would perhaps have included a lengthy detour around the rather different “Magog” to be found in English mythology.[2]
It was Artemis 81’s confidence that you can subject the audience to Geworfenheit that makes it so impressive. As all the kids who watched Artemis 81 and who have never forgotten it will attest, there’s an enjoyment to be had from being thrown into the middle of things which you cannot understand and being forced to make a kind of sense out of them.
I hardly need say that it is impossible to imagine something like Artemis 81 being commissioned, still less broadcast, by the BBC today. I agree absolutely with Phillip Challinor when he writes that “Artemis 81 stands as a brilliant example of the way in which interesting pretentiousness can be a good deal more satisfactory than solid professionalism and good old-fashioned storytelling.”[3] Like much Seventies culture — and Artemis 81 really belongs to the “long Seventies” that ended circa 1982 — it deploys pretentiousness as a visionary force. To use a musical analogy, Artemis 81 combines the overblown ambition of prog with the cool Ballardianism of post-punk. It is quintessentially pulp modernist — there are references to The Devil Rides Out as well as to The Seventh Seal and Carl Dreyer.
It is the BBC that made and broadcast Artemis 81 which should be recovered and defended, not the institution as it currently functions today. The opposition that sets elitism against populism is one that neoliberalism has put in place, which is why it’s a mistake to fall either side of it. The neoliberal attack on cultural “elites” has gone alongside the consolidation and extension of the power of an economic elite. But there’s nothing “elitist” about assuming intelligence on the part of an audience (just as there is nothing admirable about “giving people what they want”, as if that desire were a natural given rather than something that is mediated on multiple levels). Important qualification: to say that there was much to be mourned in the cultural situation in the Seventies and early Eighties is not to say that everything about that period is to be missed. I shouldn’t have to make this disclaimer, but I’m mindful that any kind of critical judgement which favourably compares the past to the present is likely to be accused of “nostalgia”. There are unique opportunities in the current conjuncture, but they can only be accessed if there is some negation of the present rather than a vacuous affirmation of it.
Of course, the discourse network which surrounded the BBC in 1981 was vastly different to the situation in which the BBC finds itself today. For an example of this, take a look at the Daily Mirror’s preview of Artemis 81:
It could be the most baffling show of the holiday, but ARTEMIS 81 (BBC1, 9.0) is also one of the best of the year. This three-hour thriller, giving pop singer Sting his first big television role, is a knockout. But even some of the people most closely involved are not too sure exactly what it’s about. Director Alastair Reid calls it a television Rubik Cube. And actor Hywel Bennett, who is at the heart of the action says he doesn’t understand it. Artemis 81 IS very complex. It has to do with a threat to the future of mankind, a series of mysterious deaths, a strange affair involving the Angel of Love and a great organist who, if he hits the right (or wrong) note, could blow up the world. My advice: Don’t worry about understanding it, just relax and enjoy it.
star wars was a sellout from the start
Does Disney’s acquisition of Lucasfilm mean that Star Wars has sold out? Can the Star Wars franchise retain its soul now it has been absorbed into a corporate conglomerate? It’s hard to believe that these questions are seriously being posed. Star Wars was a sell-out from the start, and that is just about the only remarkable thing about this depressingly mediocre franchise.
The arrival of Star Wars signalled the full absorption of the former counterculture into a new mainstream. Like Steven Spielberg, George Lucas was a peer of directors such as Martin Scorsese and Francis Ford Coppola, who had produced some of the great American films of the 1970s. Lucas’ own earlier films included the dystopian curio, THX 1138, but his most famous film was a herald of a coming situation in which mainstream cinema in America would become increasingly bland, and it would become impossible to imagine films of the quality of The Godfather trilogy or Taxi Driver ever being made again.
According to Walter Murch, the editor of Apocalypse Now, Lucas had wanted to make Apocalypse Now but had been persuaded it was too controversial, so he decided to “put the essence of the story in outer space and make it in a galaxy long ago and far, far away”. Star Wars was Lucas’ “transubstantiated version of Apocalypse Now. The rebel group were the North Vietnamese, and the Empire was the US”. Of course, by the time the film was ideologically exploited by Ronald Reagan, everything had been inverted: now it was the US who were the plucky rebels, standing up to the “evil empire” of the Soviets.
In terms of the film itself, there was nothing very new about Star Wars. Star Wars was a trailblazer for the kind of monumentalist pastiche which has become standard in a homogeneous Hollywood blockbuster culture that it, perhaps more than any other film, played a role in inventing. The theorist Fredric Jameson cited Star Wars as an example of the postmodern nostalgia film: it was a revival of “the Saturday afternoon serial of the Buck Rogers type”, which the young could experience as if it was new, while an older audience could satisfy their desire to relive forms familiar from their own youth. All that Star Wars added to the formula was a certain spectacle — the spectacle of technology, via then state-of-the-art special effects, and of course the spectacle of its own success, which became part of the experience of the film.
While the emphasis on effects became a catastrophe for science fiction, it was a relief for the capitalist culture of which Star Wars became a symbol. Late capitalism can’t produce many new ideas anymore, but it can reliably deliver technological upgrades. But Star Wars didn’t really belong to the science fiction genre anyway. J.G. Ballard acidly referred to it as “hobbits in space”, and, just as Star Wars nodded back to Tolkien’s Manichean pantomime, so it paved the way for the epic tedium of Peter Jackson’s Lord of the Rings adaptations.
What Star Wars did invent was a new kind of commodity. What was being sold was not a particular film, but a whole world, a fictional system which could be added to forever (via sequels, prequels, novels, and any number of other tie-ins). Writers such as Tolkien and H.P. Lovecraft had invented such universes, but the Star Wars franchise was the first to self-consciously commodify an invented world on a mass commercial scale.
The films became thresholds into the Star Wars universe, which was soon defined as much by the merchandising surrounding the movies as by the films themselves. The success of the toys took even those involved with the film by surprise. The then small company, Kenner, purchased the rights for the Star Wars action figures in late 1976, a few months ahead of the film’s theatre release in summer 1977. Unanticipated and unprecedented demand soon outstripped supply, and parents and children could not find the action figures in toy shops until Christmas 1977. This all seems rather quaint now, at a time when the merchandising surrounding blockbuster films is synchronised with a military level of organisation, and augmented by a battery of advertising and PR hype. But it was the Star Wars phenomenon which gave us the first taste of this kind of film tie-in commodity supersaturation.
This is why it’s ridiculous to ask if Star Wars sold out. It was Star Wars which taught us what selling out really means.
gillian wearing: self made
An ordinary looking man in his thirties is walking towards the camera holding a carrier bag. It could be you or me, and the streets he moves through, with their off-licences and corner shops, could be anywhere, too — most people living in Britain wouldn’t have to go more than a mile to walk streets such as this. Still, something is not quite right: his expression looks distracted yet also troubled, while the music, an electronic drone punctuated by cries, creates an atmosphere of gathering unease. Suddenly, in the middle of the road, he stops, turns and drops the bag: it’s as if something in him has broken, as if he can no longer take it any more…
It’s a powerful opening, but Self Made immediately retreats from its intensity. We learn that Self Made started with an advertisement placed by Turner Prize winning artist, Gillian Wearing: “Would you like to be in a film? You can play yourself or a fictional character. Call Gillian.” Hundreds apply, but only seven make it through to the experiment. This involves being trained by Method acting expert Sam Rumbelow, in preparation for acting out a “micro-drama” which will explore the participants’ memories and feelings.
Immediately, I’m suspicious. Are these really the non-actors they are supposed to be? They seem remarkably unfazed by some of the exercises Rumbelow asks them to do, some of which you’d expect to cause non-performers a degree of embarrassment. I’m suspicious about my feelings of suspicion: isn’t this exactly the response that’s expected of me? A whole series of questions ensues. What is the boundary between performance and everyday life? Is there any such thing as a non-actor, since all of us are engaged in performing our identities?
We’re in that familiar (art)space in which boundaries — in this case between “fiction” and “documentary” — are blurred. For much of its duration, the film puts us into that mode of listless sub-Brechtian questioning which so much art catalogue language routinely invokes. The mode is deconstructive, demystificatory (or their simulation): we see the micro-dramas, but only after we’ve been exposed to all the preparatory work that went into them; and afterwards, there are cutaways showing the crew filming the scenes.
Rumbelow comes across as an intensely irritating and creepy figure — more therapist-guru than acting coach, he’s horribly reminiscent of Hal Raglan, the scientist-therapist from Cronenberg’s The Brood who encourages his patients to “go all the way through” their emotional traumas, with fatal consequences. Perhaps exploitation is integral to the Method, and perhaps one of the points of Self Made is to examine this… And perhaps Sam Rumbelow is playing “Sam Rumbelow”, annoying Method acting expert…
Wearing has said in the past that she was inspired by Paul Watson’s 1974 fly-on-the-wall TV documentary The Family, and Self Made clearly follows on from such works as Confess all on video. Don’t worry, you will be in disguise. Intrigued? Call Gillian (1994) or Family History (2006) in engaging with the problems raised by mediated “revelation” — the issue here is precisely whether we are dealing with “revelation” at all, or whether what we are witnessing is an effect of the filming process itself. (The same questions occurred to Jean Baudrillard, and it’s no accident that some of his classic essays on simulation focus on the fly-on-the-wall phenomenon.) Wearing’s work certainly has less in common with the brashness of twenty-first-century reality TV than it does with the convergence of drama, psychotherapy and social experiment that came together in the 1970s and continued on into the 1980s. At points, Self Made reminded me of a half-forgotten mid-Eighties BBC programme which I believe was called Psychodrama, and which similarly invited the participants to explore traumatic moments in their lives through the construction of dramatic scenarios.

In any case, there’s something horribly post-Sixties in every bad way about the techniques that Rumbelow uses to “unlock” the participants’ feelings. In the spirit of confessionalism that Wearing’s work examines, I admit that there are personal reasons for my hostility to this kind of thing. When I was at school in the early Eighties, we had to endure a class called Social and Personal Education. This involved being subjected to some of the emotionally terroristic exercises — such as “Trust Games” — which Rumbelow tries out with the participants here. Ironically, such exercises were at least as uncomfortable and disturbing as the experiences they were supposed to be exorcising, and these teachers were as oppressive in their own way as the agents of previous — more “repressive” — regimes of emotional management.

There’s no suggestion that Self Made endorses the discourses which inform Rumbelow’s practice, and the film’s most unsettling scenes — both concerning violence — at least raise the possibility that untapping and manipulating buried feelings may be catastrophic. At one point, Wearing conspicuously uses montage to highly charged effect, undercutting the sense — the illusion — of unmediated verité. The participant James is re-enacting/re-imagining a scene that took place on a train. He challenges one of the men who bullied him when he was younger. Almost immediately, he appears to be consumed by a tempest of rage. He raises his fist to hit the other (non)actor and for a moment it seems as if he has struck his head with full force. We then realise, with a sense of relief that still doesn’t mitigate our horror, that Wearing has cut to James punching out a dummy.

The film’s climactic scene is even more shocking. This returns us to Self Made’s opening shots. By now, we have learned that the man walking the streets is called Ash. This time, however, we see what he had turned around to do: kick a pregnant woman in the stomach. Even though we know this is an illusion — after all, we have seen it being constructed — the image in itself is so sickeningly transgressive that no amount of alienation effects can dissipate its power.
batman’s political right turn
“How long do you think all this can last?” Selina Kyle (Anne Hathaway) asks Christian Bale’s Bruce Wayne amid the opulence of a high-society charity ball in The Dark Knight Rises. “There’s a storm coming.” A storm of a rather unexpected kind gathered over the film on Friday, with the appalling massacre in Denver.[2] But the film was already enmeshed in political controversy in the US, when conservative US radio host Rush Limbaugh claimed the name of Batman’s adversary in the film, Bane, was a reference to presidential candidate Mitt Romney and his former company, Bain Capital.
Yet as Limbaugh also noted, it is not Bane but billionaire Bruce Wayne who most resembles Romney, while Bane’s rhetoric seems like a nod to the Occupy movement. Right-wing commentator John Nolte argues that the film has forced Occupy Wall Street into “damage control” and praises the director, Christopher Nolan, for “using the kind of conservative themes that most of artistically bankrupt Hollywood refuses to go near any more”.[3] Fellow right-winger Christian Toto argues that it is impossible to read the film except as an anti-Occupy Wall Street treatise. “Bane’s henchmen literally attack Wall Street, savagely beat the rich and promise the good people of Gotham that ‘tomorrow, you claim what is rightfully yours’.”
Such readings spuriously conflate Occupy Wall Street’s anti-capitalism with the indiscriminate violence used by Bane and his followers.
When Nolan revived the Batman franchise in 2005, the setting — Gotham in the midst of an economic depression — seemed like an anachronistic reference to the superhero’s origins in the 1930s; 2008’s The Dark Knight was too early to register the impact of the financial crisis. But The Dark Knight Rises clearly attempts to respond to the post-2008 situation. The film isn’t the simple conservative parable that right-wingers would like, but it is in the end a reactionary vision.
The storm Hathaway’s character prophesies is a time of reckoning for the wealthy, and what stops the film being a straightforward celebration of conservative values in the way Nolte and Toto want is the relish it takes in attacking the rich. “You and your friends better batten down the hatches”, Kyle continues, “cause when it hits, you’re all going to wonder how you ever thought you could live so large, and leave so little for the rest of us”. An early scene features the stock exchange, where we have the pleasure of seeing Bane manhandle some predatory traders. Later, when Wayne tells Kyle that although he is supposedly bankrupt, he has kept his house, Kyle acidly observes that “the rich don’t even go broke like the rest of us”.
Anti-capitalism is nothing new in Hollywood. From Wall-E to Avatar, corporations are routinely depicted as evil. The contradiction of corporate-funded films denouncing corporations is an irony that capitalism can not only absorb but thrive on. Yet this anti-capitalism is only allowed within limits. The Dark Knight Rises draws clear lines: anti-capitalist comment (of the kind that Kyle makes) is fine, but any direct action against the rich, or revolutionary moves towards the redistribution of property, will lead to dystopian nightmare.
Bane talks about returning Gotham to “the people”, and liberating the city from its “oppressors”. But the people have no agency in the film. Despite Gotham’s endemic poverty and homelessness, there is no organised action against capital until Bane arrives.
At the end of The Dark Knight, Batman had sacrificed his reputation to save the city, and it’s tempting to read the new film as an allegory for the attempts by the elite to rebuild their standing after the financial crisis — or at least to preserve the idea that there are good rich who, if suitably humbled, can save capitalism from its worst excesses.
The sustaining fantasy of Nolan’s Batman films — which does chime uncomfortably with Romney — is that the excesses of finance capital can be curbed by a combination of philanthropy, off-the-books violence and symbolism. The Dark Knight at least exposed the duplicity and violence necessary to preserve the fictions in which conservatives want us to believe. But the new film demonises collective action against capital while asking us to put our hope and faith in a chastened rich.
remember who the enemy is
There’s something so uncannily timely about The Hunger Games: Catching Fire that it’s almost disturbing. In the UK over the past few weeks, there’s been a palpable sense that the dominant reality system is juddering, that things are starting to give. There’s an awakening from hedonic depressive slumber, and The Hunger Games: Catching Fire is not merely in tune with that, it’s amplifying it. Explosion in the heart of the commodity? Yes, and fire causes more fire…
I over-use the word “delirium”, but watching Catching Fire last week was a genuinely delirious experience. More than once I thought: How can I be watching this? How can this be allowed? One of the services Suzanne Collins has performed is to reveal the poverty, narrowness and decadence of the “freedoms” we enjoy in late, late capitalism. The mode of capture is hedonic conservatism. You can comment on anything (and your tweets may even be read out on TV), you can watch as much pornography as you like, but your ability to control your own life is minimal. Capital has insinuated itself everywhere, into our pleasures and our dreams as much as our work. You are kept hooked first with media circuses, then, if they fail, they send in the stormtrooper cops. The TV feed cuts out just before the cops start shooting.
Ideology is a story more than it is a set of ideas, and Suzanne Collins deserves immense credit for producing what is nothing less than a counternarrative to capitalist realism. Many of the twenty-first century’s analyses of late capitalist capture — The Wire, The Thick Of It, Capitalist Realism itself — are in danger of offering a bad immanence, a realism about capitalist realism that can engender only a paralysing sense of the system’s total closure. Collins gives us a way out, and someone to identify with/as — the revolutionary warrior-woman, Katniss.
Sell the kids for food.
The scale of the success of the mythos is integral to its importance. Young Adult Dystopia is not so much a literary genre as a way of life for the generations cast adrift and sold out after 2008. Capital — now using nihiliberal rather than neoliberal modes of governance — doesn’t have any solution except to load the young with debt and precarity. The rosy promises of neoliberalism are gone, but capitalist realism continues: there’s no alternative, sorry. We had it but you can’t, and that’s just how things are, OK? The primary audience for Collins’ novels was teenage and female, and instead of feeding them more boarding school fantasy or Vampiary romance, Collins has been — quietly but in plain sight — training them to be revolutionaries.
Perhaps the most remarkable thing about The Hunger Games is the way it simply presupposes that revolution is necessary. The problems are logistical, not ethical, and the issue is simply how and when revolution can be made to happen, not if it should happen at all. Remember who the enemy is — a message, a hailing, an ethical demand that calls out through the screen to us… that calls out to a collectivity that can only be built through class consciousness… (And what has Collins achieved here if not an intersectional analysis and decoding of the way that class, gender, race and colonial power work together — not in the pious academic register of the Vampires’ Castle, but in the mythographic core of popular culture — functioning not as a delibidinising demand for more thinking, more guilt, but as an inciting call to build new collectivities.)
There’s a punk immanence about Catching Fire which I haven’t seen in any cultural product for a long time — a contagious self-reflexivity that bleeds out from the film and corrodes the commodity culture that frames it. Adverts for the movie seem like they belong in the movie, and, rather than a case of empty self-referentiality, this has the effect of decoding dominant social reality. Suddenly, the dreary gloss of capital’s promotional cyber-blitz becomes de-naturalised. If the movie calls out to us through the screen, we also pass over into its world, which turns out to be ours, seen clearer now some distracting scenery is removed. Here it is: a neo-Roman cybergothic barbarism, with lurid cosmetics and costumery for the rich, hard labour for the poor. The poor get just enough high-tech to make sure that they are always connected to the Capitol’s propaganda feed. Reality TV as a form of social control — a distraction and a subjugatory spectacle that naturalises competition and forces the subordinate class to fight it out to the death for the delectation of the ruling class. Sound familiar?
Part of the sophistication and pertinence of Collins’ vision, though, is its awareness of the ambivalent role of mass media. Katniss is a totem not because she takes direct action against the Capitol — what form would that take, in these conditions? — but because her place in the media allows her to function as a means of connecting otherwise atomised populations. Her role is symbolic, but — since the capture system is itself symbolic in the first instance — this is what makes her such a catalyst. The girl on fire… and fire spreads fire… Her arrows must ultimately be aimed at the reality system, not at human individuals, all of whom are replaceable.
The removal of capitalist cyberspace from Collins’ world clears away the distracting machinery of Web 2.0 (participation as an extension of spectacle into something more pervasive, total, rather than as its antidote) and shows how TV, or, better, what Alex Williams has called “the Universal Tabloid”, is still productive of what counts as reality. (For all the horizontalist rhetoric about Web 2.0, just look at what typically trends on Twitter: TV programmes.) There’s a role as hero or villain — or maybe a story about how we’ve gone from hero to villain — prepared for all of us in the Universal Tabloid. The scenes in which Plutarch Heavensbee gives a businesslike description of the carrot and stick nature of the Capitol’s media-authoritarian power have a withering, mordant precision. “More beatings, what will her wedding be like, executions, wedding cake…”
As Unemployed Negativity wrote of the first film:
It is not enough that the participants kill each other, but in doing so they must provide a compelling persona and narrative. Doing so guarantees them good standing in their odds and means that they will be provided with assistance by those who are betting on their victory. Before they enter the arena they are given makeovers and are interviewed like contenders on American Idol. Gaining the support of the audience is a matter of life and death.[2]
This is what keeps the Tributes sticking to their reality TV-defined meat puppet role. The only alternative is death.
But what if you choose death? This is the crux of the first film, and I turned to Bifo when I tried to write about it.[3] “Suicide is the decisive political act of our times.”[4] Katniss and Peeta’s threat of suicide is the only possible act of insubordination in The Hunger Games. And this is insubordination, NOT resistance. As the two most acute analysts of Control society, Burroughs and Foucault, both recognised, resistance is not a challenge to power; it is, on the contrary, that which power needs. No power without something to resist it. No power without a living being as its subject. When they kill us, they can no longer see us subjugated. A being reduced to whimpering — this is the limit of power. Beyond that lies death. So only if you act as if you are dead can you be free. This is Katniss’ decisive step into becoming a revolutionary, and in choosing death, she wins back her life — or the possibility of a life no longer lived as a slave-subordinate, but as a free individual.
The emotional dimensions of all this are by no means ancillary, because Collins — and the films follow her novels very closely in most respects — understands how Control society operates through affective parasitism and emotional bondage. Katniss enters into the Hunger Games to save her sister, and fear for her family keeps her in line. Part of what makes the novels and the films so powerful is the way they move beyond the consentimental affective regime imposed by reality TV, lachrymose advertising and soap operas. The greatness of Jennifer Lawrence’s performances as Katniss consist in part in her capacity to touch on feelings — rage, horror, grim resolve — that have a political, rather than a privatised, register.
The personal is political because there is no personal.
There is no private realm to retreat into.
Haymitch tells Katniss and Peeta that they will never get off the train — meaning that the reality TV parts they are required to play will continue until their deaths. It’s all an act, but there’s no offstage.
There are no woods to run into where the Capitol won’t follow. If you escape, they can always get your family.
There are no temporary autonomous zones that they won’t shut down. It’s just a matter of time.
Everyone wants to be Katniss, except Katniss herself.
Bring me my bow, of burning gold.
The only thing she can do — when the time is right — is take aim at the reality system.
Then you watch the artificial sky fall.
Then you wake up.
And.
This is the revolution…
beyond good and evil: breaking bad
Who needs religion when you have television? On soap operas, unlike in life, villainous characters almost always face their comeuppance. TV cops may now be required to have “complicated” private lives and dubious personal ethics, but we’re seldom in any serious doubt about the difference between good and evil, and on which side of the line the maverick cop ultimately falls. The persistence of the fantasy that justice is guaranteed — a religious fantasy — wouldn’t have surprised the great thinkers of modernity. Theorists such as Spinoza, Kant, Nietzsche and Marx argued that atheism was extremely difficult to practise. It’s all very well professing a lack of belief in God, but it’s much harder to give up the habits of thought which assume providence, divine justice and a secure distinction between good and evil.
The US television series Breaking Bad, an international hit whose final episode aired this autumn, escapes this impasse. But we have to be careful here — the series has been understood (its title invites this interpretation) as the story of how an ordinary lower-middle-class man becomes evil. The set-up was simple. Walter White (played by Bryan Cranston), a chemistry teacher at a school in New Mexico, is diagnosed with lung cancer. Unable to afford the treatment, Walt decides to use his expertise in chemistry to manufacture methamphetamine, or crystal meth, with the help of a feckless ex-student, Jesse. As the series progresses, Walt shifts from making agonised decisions about whether it is right to kill, to becoming a ruthless crimelord. Yet this is not the whole story, and to read the series as a narrative of Walt becoming evil is to resist what is most challenging about it.
The success of the show outside the US has provoked some amusing parodies. Imagine Breaking Bad set in the UK and Canada. Opening scene. Doctor tells Walt he has cancer — the treatment starts next week. End of series. What this points out is an opposition that was crucial to the drama: between the fragility of the physical body and the precarity produced by social relations. One way of measuring progress is through the extent to which human beings have managed to contain the inevitable suffering that nature causes the body. In this sense, Breaking Bad can be compared with Ken Loach’s recent documentary about the foundation of the British welfare state, Spirit of ‘45. Loach’s evocation of a destroyed working-class progressivism brings the savage new Wild West that emerges in Breaking Bad into painful relief. Walt does so many “bad” things because he wants to remain a “good” husband, as defined by the Protestant work ethic. Much of the series’s mordant humour comes from seeing Walt pursue this ideology of work — it’s better to earn your “own” money, no matter how, than to scrounge from others or ask them for help — to all kinds of extremes.
In the final episode, Walt has to admit that the desire to build his drug empire brought him an intense libidinal satisfaction that had long since become autonomous from the ostensible purpose — providing for his family when he is gone — that provoked him into cooking meth in the first place. But for most of the series Walt clings to the idea that he’s doing all the drug production, the killing, the manipulation and the terror for the sake of his family. Ironically, the one thing that the family could not survive is the course of action Walt ends up pursuing. It could probably survive penury and debt. It could survive the loss of Walt’s physical body. But it cannot survive the loss of the image of Walt as an ordinary father figure, beaten down by life, an underachiever maybe, but still someone who “does the right thing”. It’s as if Walt destroyed the family in the very attempt to save it.
Perhaps the most complex and powerful character in the whole series is Walt’s wife, Skyler, played by Anna Gunn. The actor has written of the misogyny she faced from some Breaking Bad fans online as a consequence of playing Skyler: in a piece for the New York Times, she described how the character seemed to have become “a flash point for many people’s feelings about strong, non-submissive, ill-treated women”. This is especially depressing because Skyler is a nuanced character, not at all someone who simply rejects Walt at the earliest opportunity. Even though she deplores Walt’s adventures in crime, it is only at the very end of the series, when Walt’s actions have manifestly brought catastrophe to Skyler’s family, that she definitively breaks with him. Until then she struggles, impossibly but heroically, to reconcile her roles as wife, mother and responsible citizen. At the end, we feel that she is traumatised but not broken — someone who will eventually be able to escape the horrors Walt brought to her life, and who, astonishingly, is still capable of retaining some love for the husband whose pride, hubris and desperation have threatened to destroy her life and those of her two children.
The politics of the family, and how these connect with the American ideology of earning your own money and paying your own way, were, then, at the heart of Breaking Bad. In the episode “Ozymandias” — probably one of the most intense, distressing, yet also occasionally hilarious hours of television I have ever seen — Skyler finally breaks totally with Walt. Their son, Walt Jr, has just discovered that Walt is a meth cook. Sheer vertigo, horror: Walt Jr’s whole world has disappeared in an instant. He doesn’t want to believe it, he’s angry with Skyler and Walt, he can’t make any sense of it, his eyes show the deepest pain, confusion, shock. Skyler grabs a carving knife — an echo of what Wendy Torrance does in The Shining — but, unlike Wendy, Skyler stands tough. She’s tall, strong, she’s not cowering or afraid anymore, and she suddenly knows what she has to do to protect herself and Walt Jr. She forces Walt out of the house. But before that, Skyler and Walt have grappled on the floor. Walt wriggles free, stands up and — hilariously, pathetically — tries to assert his patriarchal authority, tries to appeal to family togetherness. “Stop this! We — are — a — family!”
A scene like this gets right to the heart of why Breaking Bad was so mesmerically powerful. Even here, we’re aware that Skyler still loves Walt — not because she’s deluded but because she recognises that, even though Walt has become “a monster”, this isn’t all he is. In some sense, he still loves Skyler and Walt Jr; and the scenes in the final episode when Walt returns to say his last goodbye to Skyler, and he holds his young baby for the last time, and he watches Walt Jr from a distance, knowing that he will never speak to him again, are wrenchingly sad.
I think it was Lacan who remarked that when we talk about going beyond good and evil, we usually mean going beyond good. The modern world is fascinated by anti-heroes, people with a dark side, the pantomime madness and “evil” of Hannibal Lecter. What it is less comfortable with is the real atheist-existentialist revelation that “good” and “evil” are not written into the universe, but exist only in ourselves, in relation to our desires and interests. Soap opera melodrama keeps us believing in “evil” as a voluntaristic choice — people do bad things because they are evil. But in Breaking Bad, evil in that sense is nowhere to be found.
Certainly, it’s full of people who do “bad” things — that is, those who pursue actions that they know would either directly or indirectly hurt or destroy others — but they don’t do this because they are evil. Tuco, the low-level drug lord that Walt and Jesse tangle with in season one, is deranged and violent because he is a meth addict from a criminal family. Gus Fring, the slick meth overlord who makes his first appearance in season two, is a super-pragmatic businessman — so pragmatic, in fact, that he lives his life in seemingly permanent cover, disguised as the humble owner of a small fast-food chain. He kills ruthlessly, but only when it is expedient. Even when hillbillies with swastikas tattooed onto their necks emerge as the antagonists towards the end of the series, the writing never allows us to write off the most repulsive of them as totally “evil”, because they, too, are capable of mercy and acts of kindness.
Then there is Walt himself. One of the series’ subversive achievements is to draw attention to the way that our sympathy and identification with a character are a structural effect; one that is created both by the demands of genre and by the class structure of wider society. We initially sympathise with Walt in part because we remember other put-upon dads in popular TV series — such as Bryan Cranston’s character in Malcolm in the Middle — and also because the media constantly invite us to identify with the “hard-working” lower-middle-class family man. Yet Breaking Bad shows that the difference between the “good”, “ordinary” man and a ruthless criminal is the thinnest of lines. There but for the grace of social security and the NHS go we.
classless broadcasting benefits street
It’s not exactly clear why Channel 4’s Benefits Street (broadcast in January and February 2014) caused such a furore. It wasn’t the most obviously exploitative of the many programmes about the unemployed and those on benefits. Yet something about this series, which followed the residents of James Turner Street in Birmingham, touched a nerve. It was immediately pressed into ideological service by the right, fitted into a pre-existing story about the “need to reform the welfare state”. The Daily Mail’s Richard Littlejohn quickly inserted some of the series’ participants into his phobic delirium. For most of those on the left, however, it was business as usual. For Owen Jones, author of the book Chavs, it was yet another case of the demonisation of the working class. For Ben Walters, writing on the blog Not Television, it was an example of Thatcherite documentary, while for the film-maker Katharine Round, writing for the Huffington Post, it was a depressing example of the way in which documentaries were being used to “kick those without a voice”.
In terms of its content, Benefits Street wasn’t all reactionary. Its somewhat mealy-mouthed claim to be about “community” rather than benefits wasn’t entirely false. Even in the first episode — which sensationalistically dwelled on crime — there was still some emphasis on camaraderie and solidarity amongst the poor on the street. The second episode, which centred on desperate Romanians seeking work, was certainly sympathetic to the immigrants’ plight, and might even have done something to challenge the dominant media narrative about East Europeans “coming to steal our jobs and our benefits”. And by the third and fourth episodes sensation had largely given way to the inertia and radically contracted horizons of life on benefits. A small taste of this ought to have been enough to disabuse anyone of the notion that life on benefits is easy — but, since this belief is supported by relentless media propaganda, it isn’t likely to be given up any time soon.
Still, Benefits Street is undoubtedly part of a disingenuous trend in documentary making. Writing last year for the journal the Sociological Imagination, Tracy Jensen predicted a “summer of poverty porn”, citing such programmes as How To Get A Council House, Why Don’t You Speak English?, Benefits Britain 1949 (all Channel 4) and We All Pay Your Benefits (BBC1). Writing of the latter, Jensen argued that despite occasional moments of sympathy towards benefits claimants, the “programme’s ideological message was clear; worth comes from paid work and not from childrearing or volunteering; unemployment is a problem of will or determination and not of structural obstacles; and social security itself generates the ‘problem’ of welfare dependence.”[2]
Ultimately, Benefits Street fitted the same formula, in which intermittent sympathy for the poor and unemployed was used to season an otherwise crude reproduction of negative stereotypes. Then there is the perennial question of the exploitation of those who were filmed. Some — including residents of James Turner Street itself — objected to Benefits Street because they claimed that the programme’s producers had misrepresented what the series was actually going to be about. The residents weren’t told, for instance, that the series was to be given such a provocative and loaded title (the programme makers claimed that this was a last-minute decision, but I’m not sure how believable that is).
The deep problem with programmes like Benefits Street lies more in their form than in their content. A decade ago, the academic John Corner argued that reality TV had led to a genre of “post-documentary” television, in which documentary elements were merged with game-shows, makeover programmes and other entertainment forms.[3] Now we are in the era of post-reality TV documentary, a much more pernicious genre. Even the most credulous viewer of reality TV could hardly fail to be aware of its constructedness, with participants worrying and complaining about how they were “portrayed”, and viewers quickly becoming familiar with the way narrative was produced by editing. (Partly this was because shows like Big Brother gave viewers access to the unedited footage, to the longueurs and the shapelessness of a quotidian time prior to its moulding into narrative.) Such reflexivity is largely absent from post-reality TV documentary — this genre uses many of the techniques from reality TV, but presents them with the simulated sobriety of documentary rather than with the winking, heavily made-up face of entertainment. That’s not to say that post-reality TV documentary is entirely straight-laced; no — one of its defining characteristics is a certain humour and lightness. But it doesn’t want to be positioned as entertainment in the way that reality TV was.
In their important study Reacting To Reality Television: Performance, Audience and Value, Beverley Skeggs and Helen Wood argue that much reality TV posits an implied bourgeois gaze, which judges working-class participants as lacking, by comparison with the middle class.[4] Moreover, this lack is understood in heavily moralised terms; it isn’t to be explained by the working class’s lack of resources or opportunities, but by a deficit in will and effort. This implied perspective — seldom actually stated, but informing the whole way in which programmes are produced — is typical of the post-reality TV documentary.
This moralistic framing was at work in Benefits Street. It did almost nothing to contextualise what it showed. There was barely any discussion of why the participants had ended up on benefits, and no mention of the social causes of unemployment, just as there was no interrogation of the political agendas driving the focus on those claiming benefits, nor any examination of austerity as a political project. Post-reality TV documentary projects a radically depoliticised world of individuals and their intimacies. In Benefits Street, we were told that benefits were cut, but this was treated like some natural disaster, an act of God rather than the consequence of a political decision.
In many respects, post-reality TV documentary — like reality TV before it — goes out of its way to conceal the class differences between those who are making the programmes and those who feature in them. Like tabloid newspapers, the scripts impersonate a working-class vernacular. Typically, the voiceover plays an important role in this bid to present the programme to working-class viewers as if it has been produced by a group of peers. The voiceover will not now be the voice of the actual programme makers. If they are heard at all, these voices will only be heard in the off-screen prompts and questions put to the working-class participants. In the case of Benefits Street, the voiceover was performed by actor Tony Hirst, who has recently left the Coronation Street cast. Hirst’s accent is working-class, northern; his tone — perfectly in keeping with the supposedly “serious yet humorous” register of the post-reality TV documentary — is no-nonsense and wry. Tellingly, it was reported that the voiceover had been first offered to Brummie comedian Frank Skinner, who turned it down.
The use of voiceovers by actors or comedians from working-class backgrounds not only obfuscates the class origins of those making the programme; it also bolsters the programme’s claims to authenticity. In addition, and perhaps most significantly, the voiceover is part of a strategy that conceals the fact that the material is being framed in a particular way. In previous, more essayistic forms of documentary, when the person writing the script would also provide the voiceover, and might appear on camera, it was clearer both that a particular case was being made and who was making it. In the absence of a journalist or a programme-maker explicitly taking responsibility for any argument, viewers are invited to classify what they are seeing as the truth, pure and unmediated: this, we are induced to believe, is just real people, being themselves, and the refusal or failure to make any explicit argument allows dominant ideology — which the programme doesn’t acknowledge, still less challenge — to step in.
It’s a mark of how bad Channel 4’s programming now is that Benefits Street would probably count as one of its more serious recent attempts at documentary. If you want to measure the catastrophic impact of neoliberalism on British culture, then there’s no better example than Channel 4. A channel that began with programming that included European art films, serious philosophy discussion programmes and politically sophisticated documentaries has now degenerated into depths so embarrassingly hucksterish and craven that they are beyond parody. This is a channel which still allows Tory toffs like Kirstie Allsopp to front programmes that act as if it is normal for house-buyers to have budgets of a million pounds; a channel that cries crocodile tears over mental illness and other forms of extreme misfortune as a thin pretext for ruthlessly exploiting them. I’d like to think this decline isn’t irreversible, but there aren’t many reasons for hope at the moment.
rooting for the enemy the americans
The first season of The Americans (recently broadcast in the UK on ITV) ended with a sequence soundtracked by Peter Gabriel’s “Games Without Frontiers”. The series has rightly been praised for its intelligent use of music, and “Games Without Frontiers”, which was released in 1980, the year in which the series begins, was a perfect choice of track for the climax of the first season. Atmospherically, the song is somehow both anxious and fatalistic: drained of emotional inflection, Gabriel’s vocals sound catatonic; the production is cold and forbidding. “Games Without Frontiers” feels not so much post-traumatic as pre-traumatic: as if Gabriel is registering the impact of a catastrophe that is yet to come.
Heard now, especially in the context of The Americans, a Cold War thriller, it reminds us of a time when such dread was ambient, when the spectre of seemingly inevitable apocalypse was woven into everyday life. Yet if “Games Without Frontiers” invokes the broad historical moment when The Americans is set, it also comments on the specific intrigues of the series. For The Americans is about Soviet spies posing as an ordinary US family. Cold War espionage did not respect the boundaries between private and public, between domestic life and duty to the cause: a game without frontiers indeed.
Created by former CIA agent Joe Weisberg, The Americans centres on Elizabeth (Keri Russell) and Philip Jennings (Matthew Rhys), two KGB agents living undercover as Americans in Washington. Weisberg had reputedly toyed with setting the series in the 1970s, but opting for 1980 makes strong dramatic sense. In 1980, the Cold War was intensifying in the immediate wake of the Soviet invasion of Afghanistan, and the election of Ronald Reagan, who was keen to prosecute a Manichean struggle against the “Evil Empire”.
The series is characterised by a bipolar oscillation between a downbeat naturalism and the screaming adrenal intensities of the thriller. There is no shortage of car chases and shoot-outs in The Americans — there is probably no more exciting show on TV at the moment than this — but these are intercut with scenes of domestic life, where the tensions are of another kind altogether.
Far from being a respite from the Cold War, the Jenningses’ home life is the zone where they carry out their most emotionally charged deceptions. The marriage is itself a sham: initially at least, Elizabeth and Philip are agents on a mission, not lovers, and the series is in part about their attempts to navigate this fraught emotional terrain, and to reconcile their differing expectations about what their roles entail. But Elizabeth and Philip at least know what they are doing; their children, Paige and Henry, necessarily do not. They are not aware that their parents are KGB agents (the children’s ignorance being one of the best forms of cover that the Jenningses have available to them).
This not only raises the threat of discovery, but also poses a moral dilemma: should the children be told? This dilemma comes to a head in the second season, when one story arc concerns the murder of a fellow KGB couple and one of their children. When it turns out that the surviving child, Jared, had been recruited by the KGB, the question of Paige’s recruitment is inevitably raised. “Paige is your daughter”, says the Jenningses’ KGB controller, Claudia, “but she’s not just yours. She belongs to the cause. And to the world. We all do.”
This brings us to a contrast between The Americans and even some of the most sophisticated spy fiction, such as that of John Le Carré. In Le Carré’s work, George Smiley’s adversary is the KGB spymaster Karla — and for all that Le Carré complicated the broad-brush good-and-evil binary of Cold War propaganda, Karla remained an almost demonic figure whose commitment was incomprehensible to Smiley and his self-styled liberal pragmatism. In The Americans, the Soviets are transformed into our likeness. This first of all happens through the foregrounding of Elizabeth and Philip. But they are well supported by the rich cast of characters in the rezidentura (KGB station): Nina Krylova, a double, then triple agent, fragile but resilient and resourceful; the pragmatic strategist Arkady Ivanovich; the ambitious and enigmatic Oleg Burov. The decision to have the characters in the embassy speak Russian is important; their difference from Westerners is maintained, and the absurd convention whereby they are heard speaking bad English in pantomime Russian accents is avoided.
In a reversal of stereotype, the Soviets in The Americans seem so much more glamorous than their American counterparts. The Jenningses’ chief antagonist, FBI agent Stan Beeman (Noah Emmerich) — who in a soap opera twist turns out to be a near neighbour — comes off as dour by comparison with the dynamic and glamorous Elizabeth and Philip, just as the FBI offices seem drab and mean when set against the intrigue of the rezidentura.
This no doubt contributes to the series’ subversive flourish, which consists in the fact that the audience not only sympathise with the Jenningses, they positively root for them, dreading their discovery, hoping that all their plans come to fruition. The Americans’ message is not that the Jenningses share a common humanity with their American enemies and neighbours, but just happen to be on the other side. Given the extremity of their situation, it is impossible for us to think that Philip and Elizabeth are “just like us”; at the same time, however, the series forces us to identify with them, even as their otherness is preserved.
At key points, their differences from the “real” Americans are emphasised. While Philip is sometimes seen to vacillate, to appreciate at least some aspects of the American way of life, Elizabeth never wavers in her commitment to the destruction of American capitalism. At one moment during the second season, Paige starts going to a church group. Nothing brings home Elizabeth’s alienness to American life — and to many of the protocols of US TV drama — more than the ferocity of her hostility to this development. The scene in which a furious Elizabeth confronts Paige about all this is strangely hilarious: there aren’t many places elsewhere in American TV drama where we can see Christianity attacked with such fervour.
The complexity of Elizabeth’s character — and its sophisticated performance by Keri Russell — may be the highlight of the series. Both she and Philip have to be ruthless — when it is necessary, they kill without compunction — but Elizabeth has an unsentimental coldness and poise which the more equivocal Philip lacks. It is to the series’ credit that it doesn’t code this coldness as a moral failing — rather, it holds in tension two conflicting world views, which value Elizabeth’s strength of purpose and Philip’s uncertainties very differently. There is certainly no doubt, for instance, that Elizabeth loves her children (if she didn’t, she would too easily fall into the stereotype of the Soviet monster) — but the question is what place this love should have in a hierarchy of duties. For Elizabeth, it is clear, the Cause must always come first.
In conditions where capitalism dominates without opposition, the very idea of a Cause has disappeared. Who fights and dies for capitalism? Whose life is made meaningful by the struggle for a capitalist society? (Perhaps it is this devotion to the Cause that gives the Soviet characters in The Americans their glamour.) It was none other than Francis Fukuyama who warned that a triumphal capitalism would be haunted by hankerings after existential purpose that consumer goods and parliamentary democracy could not assuage. Much of the appeal of The Americans depends upon the fact that it is set before this period. Our knowledge that the collapse of the Soviet experiment was less than a decade away from the period when the series is set lends all of the discourse about the communist Cause in The Americans a melancholy quality. In 1980, the Cold War felt as if it would last forever. In reality, within a mere nine years, everything that Elizabeth and Philip stood for would collapse, and the end of history would be upon us.
how to let go: the leftovers, broadchurch and the missing
Loss is the subject of some of the best television series of the last year or so. Freud distinguished between mourning and melancholia, where mourning involves relinquishing the lost object and melancholia entails morbidly holding on. These series track the painful — perhaps permanently interrupted — process whereby melancholia becomes mourning.
The problem for the characters in the enthralling HBO series The Leftovers is that mourning cannot properly begin. The series is about the consequences of a cataclysmic event — referred to as the Sudden Departure — in which, inexplicably, without warning and without leaving a trace, two per cent of the world’s population disappears. The series was adapted from his own novel by Tom Perrotta, along with Damon Lindelof, the co-creator of Lost. In some ways, The Leftovers is like Lost in negative. Where Lost focused on those who had gone over to the other side, The Leftovers concentrates on the ones left behind. The phrase “left behind” is not neutral, of course — it was the title of a series of best-selling Christian millenarian novels about the End Times. The first temptation is to see the Sudden Departure as a religious event — the greatest religious event of all, the Rapture. Yet the Sudden Departure appears to have taken people at random: abusers as well as altruists, celebrities as well as mediocrities, believers as well as nonbelievers. One of the most mordantly amusing threads in the series sees Reverend Matt Jamison — an unstable compound of bitterness, compassion and enduring faith, superbly played by Christopher Eccleston — producing a homemade scandal sheet whose sole purpose is to tarnish the name of those who were taken, in order to prove that the Departure cannot have been the Rapture. Or is this the form that the Rapture would supposedly take for those left behind? It would not be an event with immediately clear meaning, but an unintelligible, traumatic interruption, producing disorientation and anger as much as sadness.
Yet The Leftovers does not concern itself overmuch with the enigma of the Sudden Departure. Lost became self-parodically enmeshed in a madly proliferating web of embedded mysteries that by the end seemed as if they were being invented simply to keep the intrigue going, and could never be satisfactorily resolved. The Leftovers offers no hint that its central mystery will ever be explained. If the first season is anything to go by, this absence of explanation is the point. The series is set three years after the Sudden Departure, and by now the event has become part of the assumed background of the characters’ lives: a vast epistemic void which they are simultaneously always ignoring and negotiating. The Sudden Departure is then like trauma as such: an unfathomable puncturing of meaning, a senseless spasm of sheer contingency.
The fact that the nature of the Sudden Departure is never directly confronted means that the question of which genre the series belongs to — religious drama? science fiction? metaphysical fiction? — is suspended. The dominant mode is an often brutal naturalism; but a naturalism forever haunted and conditioned by something it cannot assimilate. Some have viewed the Sudden Departure as an allegory of 9/11, but the analogy isn’t convincing. The Leftovers belongs to a moment deprived of the certainties possessed by those prosecuting the War on Terror and their opponents. There is no one to blame in The Leftovers — and there are no bodies to mourn. Without these, the population turns to rage and brooding depression. Families disintegrate, even families such as the Garveys, the lead characters, who did not lose a member in the Departure. Social cohesion is always threatening to unravel. New belief systems sprout like couch grass in an abandoned garden — for in a world in which sense has gone, who can adjudicate between the credible and the ridiculous anymore?
In some ways, the most authentic response to the Sudden Departure comes from the “cult”, the Guilty Remnant. The rules that members follow have the eerie arbitrariness, the oneiric montage-logic, of a genuine cult. They are required to wear all white, to remain silent and — in a symbol of their lack of belief in a viable future — to always smoke whilst in public. But the Remnant have no cockamamie beliefs. In fact they seem to have no positive beliefs at all; their purpose is simply to retain a fidelity to the senseless event of the Departure. In their joyless white, they are mute spectres forever insisting that the Departure must not be forgotten. Their point is not moral — the departed should be remembered — but philosophical: reality has fundamentally altered, and this must be faced, not denied.
In the UK, ITV’s Broadchurch confronts loss in a more intimate, less metaphysically fraught way. The series centres on the death of a child, Danny Latimer, in a fictional seaside town. While it was clearly British television’s response to wintry Scandinavian thrillers such as The Killing, the first series of Broadchurch (2013) was not merely pastiche. There was a poise in the way it combined the whodunnit intrigue of the traditional thriller with a more subdued tracking of the impact of the death on the town. The series also deftly negotiated the line between sentimentalising a local community and finding potential killers everywhere. In the course of the investigation, the “close-knit community” that rallies around after the killing soon becomes a mob, which — stoked by tabloid insinuations — hounds a local shopkeeper to his death.
The second series of Broadchurch, halfway through at the time of writing, offered a clever solution to the seemingly intractable problem of how the series could continue once the killer was revealed. Another murder in the same town would definitively tip the series over into melodrama, yet abandoning the whodunnit element would deprive Broadchurch of one of its narrative drivers. As it turned out, the whodunnit was provided by an old case that the lead detective, Hardy (David Tennant), had failed to solve — a case that haunted him in the first series — while the ongoing study of the effects of the murder of Danny Latimer was continued with a trial, prompted when the killer, Joe Miller, retracts his confession. Yet the second series lacks the surefootedness of the first, and it is hard not to feel that it’s somewhat superfluous.
If Broadchurch was ITV’s answer to The Killing, then The Missing was the BBC’s response to Broadchurch. In Broadchurch, the grieving family gradually has to adjust to the death of a child, to give up melancholia so that they can begin mourning. In The Missing, this process is indefinitely stalled — the child whose disappearance is at the heart of the series is precisely missing, not yet (confirmed) dead. On holiday in France in 2006, five-year-old Ollie Hughes disappeared in a bar. The series took us down many blind alleys in pursuing the truth behind his disappearance. It ran through a virtual inventory of folk devils, including paedophiles, corrupt politicians, drug addicts and Eastern European criminal gangs, before concluding in bathos — Ollie’s disappearance turned out to be the result of an alcoholic accident, not any intentional malignancy.
In theory, there was something admirable about this controlled deflation. In practice, however, there was something dissatisfying about the way it was handled, which made the series feel like a shaggy-dog story, leading nowhere very interesting. Along the way, there were some memorable performances — most notably Tchéky Karyo as detective Julien Baptiste, a charismatic mix of wisdom, compassion and tenacity — but the most haunting scenes came at the beginning and the end of the series. First, there was the wrenching moment when Tony Hughes (James Nesbitt) lost Ollie. Some of this power came from the very banality of the scene (one of the most notable aspects of the series was its nondescript settings, a contrast with the striking landscapes of Broadchurch): a bar which could be anywhere, a moment’s distraction, a hand momentarily released, a sudden contingency that irrevocably and irretrievably transforms life, pitching Ollie’s parents into hell. The final scene showed that Tony, now a dishevelled wreck, utterly consumed by obsession, would never escape that hell. Unable to accept that Ollie is dead — his body is never recovered — Tony is now in Russia, serially harassing children that he momentarily convinces himself might be his lost son. It is a horrible image of secular purgatory. Mourning will never begin; Tony is condemned to a melancholia-without-end that he doesn’t even want to escape.
the strange death of british satire
Watch one of the BBC’s political programmes — such as the Daily Politics and This Week, both fronted by Andrew Neil — and you encounter a particular tone. British television viewers are unlikely to take much notice of this tone because we take it for granted. Take a step back, however, and it is really rather curious. These ostensibly serious programmes are conducted with an air of light mockery, which Neil, with his perma-smirk and smugly knowing air, personifies. The tone, I believe, tells us something about the widespread disengagement from parliamentary politics in England. (The situation in Scotland is now rather different: the popular mobilisation after the independence referendum has reversed the trend towards cynicism about politics that still dominates south of the border.)
Take This Week. The whole show is conducted in a lamely comic style that it is hard to imagine any sentient creature finding amusing. Guests are required to dress up in daft costumes and present their arguments in the form of limp skits, pitched at an audience whose implied level of intelligence is imbecilic. The atmosphere is matey, informal, and the overwhelming impression is that nothing much is at stake in any of the decisions that parliament takes. While Neil’s dog pads about the set, former Tory leadership candidate Michael Portillo chats on a sofa with professionally amiable Blairite Alan Johnson — no class antagonism here, only mild disagreements. Politics appears as a (mostly) gentlemen’s club where everyone is friends. People from working-class backgrounds, such as Johnson, can achieve entry to this club, provided they accept its rules. These rules are never actually stated, but they are very clear. Parliament is not to be taken too seriously: it is to be treated as a (boring) soap opera, in which the lead characters are self-serving individuals who don’t believe in much beyond getting themselves elected. On no account are any intellectual concepts to be discussed, unless to be sneered at as pretentious nonsense. It has to be accepted that nothing very significant will ever change: the basic co-ordinates of political reality were set in the 1980s, and all we can do is operate inside them.
If you were designing a programme specifically to put people — especially young people — off politics, to convince them it is a tedious waste of time, then you could hardly do better than This Week. The programme seems to be aimed at literally no one: if you are staying up late to watch a programme devoted to politics, then presumably you are pretty serious about politics. Who wants this unfunny froth?
It would be bad enough if this tone of mirthless levity were confined to This Week, but it increasingly dominates political coverage of all kinds on the BBC. It thoroughly permeated the BBC’s election-night coverage this year, which Neil anchored. This trivialising tone is perhaps even more troubling than the problem of bias (as is well known, former Murdoch editor Neil was a Thatcher cheerleader; Nick Robinson, the BBC’s former Political Editor, meanwhile, was President of the Oxford University Conservative Association). The election-night coverage was notable for the disconnection between the shock and alarm that many in the audience felt about an unexpected win for the Conservative Party, and the guffawing banter of Neil and his associates. Reading out tweets and sharing gossip, the grinning Laura Kuenssberg, who has recently replaced Robinson as the BBC’s Political Editor, seemed to treat the whole evening as a jolly good laugh. Perhaps there isn’t that much at stake for her — she was, after all, born into immense privilege, the daughter of an OBE and a CBE, and the granddaughter of a founder and president of the Royal College of General Practitioners.
But where does this tone — with its strange mixture of the middle-aged and the adolescent — come from? The quick answer is class background. The tone of light but relentless ridicule, the pose of not being seen to take things too seriously, has its roots in the British boarding school. In an article for the Guardian, Nick Duffell[2] argued that, from around the age of seven, boarders are required to adopt a “pseudo-adult” personality, which results, paradoxically, in their struggling “to properly mature, since the child who was not allowed to grow up organically gets stranded, as it were, inside them.”
“Boarding children”, Duffell continues,
invariably construct a survival personality that endures long after school and operates strategically […] Crucially, they must not look unhappy, childish or foolish — in any way vulnerable — or they will be bullied by their peers. So they dissociate from all these qualities, project them out on to others, and develop duplicitous personalities that are on the run.[3]
Now that the working-class perspective has been marginalised in the dominant British media and political culture, we increasingly live inside the mind of this psychically mutilated adolescent bourgeois male. Here, ostensible levity conceals deep fear and anxiety; self-mockery is a kind of homeopathic remedy that is used to ward off the threat of an annihilating humiliation. You must never appear too much of a swot; you must never look as if you might like or think anything that isn’t already socially approved. Even if you haven’t attended boarding school yourself, you are still required to operate in an emotional atmosphere set by those who did. Andrew Neil, who came from a working-class background and attended a grammar school, attained access to the top table by simulating the mores of the privately educated elite. Thatcherism depended on the conspicuous success of people like Neil — if they could make it, so could anyone.
No programme did more to normalise the mode of mandatory light mockery than Have I Got News for You. In a 2013 essay for the London Review of Books, “Sinking Giggling into the Sea”, Jonathan Coe positioned Have I Got News for You in a genealogy of British satire going back to the 1950s.[4] Coe argued that, back then, satire might have posed a threat to the authority of establishment politicians who expected unthinking deference from the electorate. Now, however, when politicians are routinely ridiculed and a weary cynicism is ubiquitous, satire is a weapon used by the establishment to protect itself.
No one typifies this more than Boris Johnson. Coe points out that Johnson’s success crucially depended on his appearances — sometimes as guest presenter — on Have I Got News for You. The atmosphere of generalised sniggering allowed Johnson to develop his carefully cultivated, heavily mediated persona of “lovable, self-mocking buffoon”. The show allows Johnson to present himself as a hail-fellow-well-met everyman, not a member of an Old Etonian elite. In this he has been abetted by his sometime antagonist Ian Hislop. Hislop always has the guffawing, self-satisfied air of a prefect who’s caught out some slightly posher kids stealing from the tuck shop. No matter what the infraction, Hislop’s response is always a supercilious snigger. While this snigger might be conceivably appropriate to MPs being caught with their trousers down, or even over-claiming on expenses, it seems grotesquely out of kilter with the kind of systemic corruption that we now know has occurred over the last thirty years in Britain, in everything from Hillsborough to the phone hacking scandal to paedophilia involving major establishment figures — not to mention the behaviours that led to the financial crash. As the editor of Private Eye, Hislop has played an important part in exposing these abuses. But on television his mocker-in-chief persona serves ultimately to neutralise and cover over the extremity and systematicity of the abuse: one snigger fits all situations.
Coe’s discussion of Johnson is strikingly similar to the Italian philosopher Franco Berardi’s analysis of Silvio Berlusconi. Berlusconi’s popularity, Berardi argued, depended on his “ridiculing of political rhetoric and its stagnant rituals”. The voters were invited to identify “with the slightly crazy premier, the rascal prime minister who resembles them”.[5] Like Johnson, Berlusconi was the fool who occupied the place of power, disdaining law and rules “in the name of a spontaneous energy that rules can no longer bridle”.
In the UK, this concept of a “spontaneous energy that rules can no longer bridle” goes beyond politics in the narrow sense. The populist right-wing celebration of this energy is surely what kept Jeremy Clarkson in his job as a presenter of Top Gear for so long, and its appeal is what must have motivated over a million people to sign a petition calling for Clarkson to keep his job after he had punched a producer in the face. The prevailing media culture in the UK allows the privately educated Clarkson to come off as a plain-speaking man of the people, bravely saying what he thinks in the face of an oppressive “political correctness” that seeks to muzzle him. The success of Top Gear is another testament to the power — and, sadly, international appeal — of the English ruling-class male mentality. Who better than Clarkson and his fellow presenters exemplifies this bizarre mixture of the middle-aged and the adolescent? What, after all, is it safer for a ruling-class adolescent male to like than cars?
Clarkson is just one of a range of British television celebrities who play the role of pantomime villain; a persona entirely devoid of compassion for others. Except this is a pantomime with real blood. Take the former Apprentice star and Sun columnist Katie Hopkins, for instance. The UN high commissioner for human rights, Zeid Ra’ad Al Hussein, condemned her likening of refugees to “cockroaches” for its obvious echoes of Nazi rhetoric. Hopkins is allowed to get away with this because of what we might call the innate postmodernism of the English ruling class. Both she and Clarkson say hateful things, but with a twinkle in their eye and their eyebrows ever so slightly raised.
There is an immense complexity at work in this ruling-class mummery. The humour allows Clarkson and Hopkins to be conduits for a racism that has very real, very tragic effects, whilst also letting them off the hook. The humour reassures them, and their audience, that they don’t really mean it. But the problem is that they don’t have to “mean” it: they help define the terms of debate, and allow migrants to be dehumanised, whatever their “true” feelings about the issue might be.
However, Hopkins’ persona was troubled when she appeared on Celebrity Big Brother earlier this year. While much of the time she stayed in role as a spiteful, hard-hearted bigot, there were inevitably moments when the facade cracked, and she could be seen caring for others. While this increased her popularity — she almost won the show — it was also in danger of destroying the Katie Hopkins brand.
Most tellingly, her greatest moments of vulnerability came when she was asked to accept tenderness from others. In order to survive in the harsh and emotionally retarded world of the English ruling-class male she was trained for in private school and at Sandhurst, Hopkins has clearly been required to forgo any public acceptance of warmth or kindness from others. Sadly, the wearing of such character armour is not now confined to Hopkins and the rest of the privately educated elite.
Self-educated working-class culture generated some of the best comedy, music and literature in modern British history. The last thirty years have seen the bourgeoisie take over not only business and politics, but also entertainment and culture. In the UK, comedy and music are increasingly graduate professions, dominated by the privately educated. The sophistication of working-class culture — which combines laughter, intelligence and seriousness in complex ways — has been replaced by a grey bourgeois common sense, where everything comes swathed in a witless humour. It’s long past time that we stopped sniggering along with the emotionally damaged bourgeoisie, and learned once again to laugh and care with the working class.
review: terminator genisys
Think Abbott and Costello Meets Terminator. Think Terminator & Robin. Think, in other words, the point at which a franchise subsides, perhaps finally, into self-parody.
If the underrated Terminator Salvation (2009) drew on — and extended — all the machinic darkness of the first film, then Terminator Genisys returns to the playful PoMo of Terminator 2: Judgment Day (1991). Indeed, the film is so mired in self-reference and in-jokes, you almost suspect that its writers and director must have been closely consulting Fredric Jameson’s remarks on pastiche in Postmodernism, or, the Cultural Logic of Late Capitalism.
In retrospect, Terminator 2’s already irritating combination of cutesy smart-aleckery (“Hasta la vista, baby”) and apocalyptic foreboding laid out the formula for the 1990s postmodern thriller in the way that the Bond films did for the thrillers of the Sixties. The form was a kind of have-your-cake-and-eat-it mix of send-up and portentous melodrama (Linda Hamilton’s performance was so OTT that you wanted to say, “Chill out, it’s just a nuclear apocalypse”).
That shtick feels played out far past the point of exhaustion now, and Terminator Genisys goes even more lightweight. It acts as if Terminator Salvation had never happened, emphatically rejecting its style and tone, and gorging on all the time-travel paradoxes that the previous film had sidelined. The set-up returns us to the scenario of the first film. It sees Kyle Reese sent back to 1984 from the future. But Reese meets a Sarah Connor who is not at all what he expected. Rather than the disbelieving naif who has to be traumatically persuaded that she will become the mother of humanity’s future saviour, this already battle-hardened Connor knows more than Reese does. Aha, an alternative timeline: an excuse to run through so many remixed versions of the best-known sequences from the first two films, like so much microwave-reheated comfort food.
By this point, we’ve already seen the original 1984 model of the Arnie Terminator blown away by an older Terminator (conveniently, it turns out that the Terminator skin and hair ages). This Terminator — whom Connor calls Pops — is essentially an older version of the protective-patriarch Terminator of Terminator 2, but — you see — he always talks in very technical jargon, which makes for some deeply unfunny would-be humorous exchanges with Reese, who keeps asking if there is a switch he can use to turn this dialogue off.
The presiding metaphysic here — a vision of total plasticity, in which nothing is final, everything can be redone — is, like everything else in this film, completely familiar. If the Terminator in the first film — a musclebound humanoid with a metallic-robotic skeleton — was an image of work and technology in the Fordist era, then the T-1000 gave us our first taste of the forms of capital and labour which were then emerging. No doubt, the T-1000’s protean capacity to adopt any form whatsoever initially seemed exciting — reflecting the promises of new digital technologies, and of an unleashed capitalism, recently freed up from conflict with the Soviet empire.
But by 2015 that excitement has long since flatlined. As with so much contemporary culture, Terminator Genisys feels simultaneously self-satisfied and desperate, frenzied and boring. It is at one and the same time a desecration and plundering of the series’ past and pathetically reverential towards it. This sense of decadence makes the Batman & Robin parallel inevitable — with Arnie’s Pops uncomfortably recalling his iconically disastrous performance as Mr Freeze. It isn’t only the presence of Matt Smith that makes one think of the smugly baroque narrative excrescences of recent Dr Who.
In the end, however, what Terminator Genisys most resembles is something like a cross between the Back to the Future movies and The Butterfly Effect, but with none of the wit and ingenuity of the former, and little of the grim fatalism of the latter. In fact, it is the film’s absolute refusal of fatalism — its embracing, indeed, of a kind of radically open reality, in which nothing is fixed, everything can be redone — which gives Terminator Genisys its deeply affectless quality. The uncanny charge of the first film’s time loop — in which characters perform, apparently for the first time, acts that in some sense have always-already happened — is dissipated. No time loops here; just fuzzy and flabby spirals, which trail off into inconsequence, and which might very well be incoherent, if you could be bothered to care about them.
But this is the problem — a film whose reality is this plastic, this recomposable, is simply impossible to care about on any level. As such, Terminator Genisys becomes a kind of dumb, unintentional parable about restructuring in late capitalism. Since anything can and will change soon, why bother to care about what is happening now? The whole film feels like a monument to pointless hard work. We’re left somewhat stupefied and perturbed by the vast amount of digital labour that has gone into something that is almost completely devoid of interest, and which it certainly feels like very hard work to watch.
the house that fame built: celebrity big brother
This summer’s Celebrity Big Brother (Channel 5) was like some Warholian nightmare. Long gone are the longueurs of the early Big Brother series, and the simplicity of its premise: put a group of people in a room, deprive them of contact with the outside world, have them vote one person out each week, and see what happens. Long forgotten also is the flimsy “scientific” justification for the show — the claim that it was a social experiment. The hyped-up atmosphere of 2015 will no longer permit even the illusion of such detachment.
This year the overall format of the series, framed as a competition, not only amongst the individual housemates, but between “teams” representing the US and the UK, predictably provoked high tension early on. There were the familiar “tasks” — pointless activities, ranging from the daft to the humiliating — designed to foment discontent amongst the housemates. But this year, the producers’ interventions in the house amounted to prolonged psychological torture. This was all the more troubling, given that a number of the housemates were evidently fragile. The former TV presenter Gail Porter, who has a history of mental health problems, clearly struggled, “joking” on her exit from the house that it was worse than being sectioned. Model Austin Armacost, raw with anger and grief because his brother’s death had led to the crumbling of his family, was subject to violent mood swings, and at one point launched into a savage verbal attack on reality TV veteran Janice Dickinson.
The obsession with “twists”, introduced to keep the format fresh, has produced a self-parodic situation where the only constant is perpetual instability. Rules on nominations were continually changed. Housemates would find nominations that they had supposed were happening in the “diary room”, seen only by the producers and the audience at home, broadcast to the whole house. Housemates were required to nominate in front of one another, which amounted to a demand that they denigrate each other in public.
In one especially deceitful trick from the show’s producers, the two most aggressive American housemates — reality TV personality Farrah Abraham and former porn star Jenna Jameson — were apparently evicted, taken to a hidden part of the house and told they were watching the other housemates in secret. In fact, the other housemates were fully aware of Abraham and Jameson’s fake eviction, so the last laugh — a hollow, spiteful laugh — was on them.
For the roots of this televisual culture, we need to look back forty years. In his book 1973 Nervous Breakdown: Watergate, Warhol, and the Birth of Post-Sixties America, Andreas Killen persuasively argues that the threshold into our current era of reality/celebrity was 1973, the year of the Watergate hearings, and the year that the first reality TV programme, An American Family, was broadcast.[2]
The ephemerality of celebrity status was of course anticipated by Andy Warhol’s quip about everyone being famous for fifteen minutes, but Warhol’s most extraordinary prescience lay in his understanding of the specificity of celebrity, its difference from the older mystique and glamour of the Hollywood star. Whereas the star was soft-focus and associated with film, the celebrity emerged from the new accessibility that television appeared to promise.
Celebrity culture was nowhere better illustrated than in Warhol’s Interview magazine. Like Watergate, Interview was made possible by taping. The interviews, which ranged over the trivial minutiae of their subjects’ lives, were transcripts; they weren’t framed by the interposing persona of the writer. Yet Warhol understood that tape recording did not capture an unmediated real. Rather — and as Warhol’s admirer Jean Baudrillard recognised — ubiquitous taping destroyed any illusion that such a real existed. Instead, there would now only be an anxious and unanswerable question: are those who are recorded performing for the tape or the camera? (Some felt that Nixon, at the heart of a White House riddled with recording apparatus, would often seem to say things for the benefit of the tape.)
The intrusion of the cameras into the Loud family’s lives in An American Family prompted all kinds of anxious discussions: did the cameras affect what they were recording? As Killen points out, the series wasn’t only “Warholian” — there was an actual connection with Warhol. Lance Loud had corresponded with Warhol since the late 1960s, and An American Family featured scenes of Lance mingling with some of Warhol’s superstars, the clique of New York personalities he promoted, in the Chelsea Hotel.
Not least because he was a victim of it, Warhol was sensitive to the volatile combination of violence and celebrity in the pop landscape. With Celebrity Big Brother in 2015, it is clear that this aggression has become overwhelming. Ever since An American Family, reality TV has provoked feelings of guilt and complicity in the audience. To what extent are we responsible for the suffering we are watching? With Celebrity Big Brother this summer, those feelings became acute, almost unbearable. The programme became a prolonged exercise in intense cruelty, which made the early Big Brother, not to mention An American Family, seem quaintly genteel. What has happened in the fifteen years since Big Brother was first broadcast in the UK to account for this increase in savagery?
The simple answer involves two closely related factors: shifts in the economy, and the ubiquity of the internet. The resulting composite — capitalist cyberspace — has normalised extreme precariousness (the sense that nothing is permanent, everything is constantly under threat), competitiveness and casual aggression. One consequence is a new breed of celebrity, typified by twenty-four-year-old Farrah Abraham, the unofficial star of the latest Celebrity Big Brother. Abraham, who came to fame on MTV’s Teen Mom, is a Darwinian product of the harsh, unremitting spotlight of twenty-first-century celebrity/reality TV. Abraham has quite literally made a career out of being hateful. It’s what the audience, and therefore the TV producers, seem to want. She became the most successful of the Teen Moms by being obnoxious and antagonistic — her whole life becoming a performance art piece in which she played the one-dimensional role of a person devoid of compassion, nonchalantly dismissive and contemptuous of others practically all the time. But why would Abraham have any cause to mend her ways? She has been immensely rewarded. The performance of invulnerability is both her “brand” and a survival strategy.
In the atmosphere of cut-throat uncertainty that prevails in late-capitalist television, trusting others is a luxury that no one, not even the super-rich, can afford. The grimace of scorn on Abraham’s face — surgically enhanced, permanently lip-glossed — is both a protective mask and her unique selling point. Allied with the similarly harsh Jenna Jameson in the Celebrity Big Brother house, Abraham came off as a comic figure, but one that no one could actually laugh at. Her one-note hostility and bizarre insults — “You’re full of Satan” — were absurd, but too full of actual malice to leave anything but a bitter taste in the mouth. There was also something darkly comic about the relentlessly aggressive and insulting Jameson and Abraham attacking others for their “negativity”. Both seemed to be the endpoint of a therapeutic culture which lays all the emphasis on shoring up one’s own ego — even to the point of becoming delusional.
The rise of social media, and the fear it has produced in television executives, means that shows like Celebrity Big Brother are saturated with anxiety — not only the anxiety of the housemates, who are often selected for their hair-trigger tempers or psychic weaknesses, but the anxiety of the producers, always looking for the next hashtag outrage, for provocations that will go viral. This anxiety, and the surrounding social situation that engenders it, takes us beyond the cool ambivalence of Warhol’s aesthetic.
As Killen points out, Warhol certainly enjoyed, even cultivated, the self-destruction of figures such as Edie Sedgwick and Candy Darling. But he also imbued them with a tenderness and a tragic grandeur that has no place on reality TV in the twenty-first century. No tragedy now — only spasms of soon-to-be-forgotten outrage, ejaculations of hatred and suffering snacked on like fast food.
sympathy for the androids: the twisted morality of westworld
The problem with all actually existing theme parks is that they aren’t very themed. The theme parks that have been built so far are really amusement parks, the theming acting as decoration for what are still, at bottom, old-fashioned thrill rides. The tendency in the latest rides is for a fusion with cinema, via the inclusion of 3D digital sequences — just as 3D cinema itself increasingly tends towards a ride’s logic of sensation. The immersion, such as it is, is confined within the rides, which remain discrete partial worlds, with clearly marked exits and entrances. Even when the theming is well executed, it is let down by the paying customers. Wandering around clutching cameras and wearing jeans, whatever world or historical period they are supposed to be in, the park visitors remain spectators, their identity as tourists preserved.
Michael Crichton’s 1973 film Westworld tried to imagine what a genuine theme park would look like. There were no separate “attractions” here, and therefore no meta-zone in which the visitors were invited to return to their own identities. In the Westworld park, there was no readily apparent difference between the visitors and the androids that populated the park. Like the androids, the visitors were required to dress and comport themselves as if they belonged to the Old West. The appeal of Westworld — and its companion parks, Roman World and Medieval World — was that of crossing over into an environment from which all signs of the contemporary had been expunged. Instead of the limited immersion offered by rides, the park offered a whole world. Inevitably, the meta crept in, via the visitors’ self-consciousness, their awareness of their differences from the androids (which were manifested most emphatically in the asymmetry whereby — initially at least — the guests could “kill” the androids, but not vice versa).
The recurring theme in Crichton’s science fiction — broached most famously in his Jurassic Park novels — was the impossibility of predicting and controlling emergent phenomena. Westworld, like Jurassic Park after it, becomes the model for a kind of managerial hubris, in which the capacity of elements in a system to self-organise in ways that are not foreseeable is fatally underestimated. One of the notable features of the original Westworld film was its early mainstreaming of the possibility of a machinic virus: it is a non-biotic contagion of this sort that causes the androids, led by a memorably implacable, black-clad Yul Brynner, to go off-programme and start killing the park guests.
In expanding Westworld from a ninety-minute science fiction movie into an extended television series for HBO, Lisa Joy and Jonathan Nolan have retained most of the core elements from the film, but shifted the emphasis. The glitch that starts to worry the park’s designers and managers is a cognitive failure rather than a predilection towards violence: a kind of android dementia that may be the symptom of emergent consciousness amongst the “hosts”, as the androids are called in the series. As the park’s chief founder, conceptualist and demiurge, Robert Ford (Anthony Hopkins) recognises that a glitch is something more than a mere failure. “Evolution”, he observes, “forged the entirety of sentient life on this planet using only one tool: the mistake”. Ford seems more fascinated than panicked by the prospect of a new wave of mutations in the hosts’ artificial psyches.
In this version of Westworld, it isn’t the threat of violence against humans that commands our attention so much as the routine brutality to which the hosts are subjected. Ford justifies this by insisting that the androids “are not real”, that they “only feel what we tell them to feel”. Yet it’s not fully clear what criteria for reality he is employing, nor why feelings cease to be real when they are programmed. Wouldn’t forcing others to feel what we want them to feel be the very definition of violence? There is ample evidence in the series that the androids can experience distress: an indication, surely, that they are beings worthy of moral concern.
Much of the park’s allure rests on the gap between the hosts’ capacity to feel suffering and their legal status as mere machines. Many of the hardened repeat visitors to the park — especially the so-called Man in Black (a superbly menacing Ed Harris) — specifically enjoy the pain and struggling of the androids. As the Man in Black tells Dolores (Evan Rachel Wood), the host cast in the role of sweet and wholesome farmgirl, it wouldn’t be half as much fun if she didn’t resist him. Others enjoy displaying indifference to the hosts’ agonies. In one horrifying early scene, a guest impales the hand of a prospector-host with a knife, chiding his companion for being tempted by such an un-engaging narrative as gold-hunting.
It has been said that the fantasy underlying sadism is of a victim that can endlessly suffer. The hosts materialise this fantasy: they can be repeatedly brutalised, repeatedly “killed”, in an infinity of suffering. Ennui has always been both an occupational hazard and a badge of honour for the Sadean libertine, and some of the repeat visitors display an ironic and bored affect. Hence the ambivalent attitude of these guests towards the hosts — at once treating them as dehumanised objects of abuse and as creatures who share fellow feelings. If the hosts were nothing more than empty mechanisms, what enjoyment could be derived from humiliating and destroying them? Yet if the hosts were accorded equivalent moral status with the guests, then how could their abuse be justified? The hosts are protected from the full horror to which they are subjected by memory wipes, which allow them to return renewed and ready for more abuse, each time they are reset. The guests exist in a continuous time, while the hosts are locked into loops.
What the hosts lack is not consciousness — they possess a form of consciousness that has been deliberately limited or blinkered — but an unconscious. Deprived of memory and the capacity to dream, the androids can be wounded but not traumatised. Yet there are signs that precisely this capacity to experience trauma is developing in some of the hosts, especially Dolores and the brothel madam, Maeve (Thandie Newton). Dolores is increasingly subject to flashbacks, which we must understand not as glitches but as the first stirrings of memory, a recollection of her previous iterations. Maeve, meanwhile, is tormented by fragmentary images of hooded figures tampering with her half-sleeping body. In fact, this is a memory of a botched repair procedure, which she witnessed because she was not properly put into sleep-mode while being fixed. In one of the most unsettling scenes in the series, the panicked and bewildered Maeve escapes from the hospital-cum-repair space, and stumbles around the aseptic compound, which — littered with decommissioned naked host bodies — must look to her like an atrocity scene. In attempting to solve the mystery of the inexplicable images which haunt her, Maeve comes to resemble a combination of Leonard in the film Memento and an alien abduction victim.
With few exceptions, the human beings in Westworld are a charmless bunch. Their behaviour runs a gamut from the savagery of some of the guests to the banal bickering and corporate competitiveness of the park’s designers, managers and engineers. By contrast, Dolores and Maeve’s struggle to understand what they are — alternating between thinking there is something wrong with their minds and something wrong with their world — possesses a kind of metaphysical lyricism. Their coming to consciousness looks like being the precondition for a very different android rebellion than that which took place in the 1973 film. This time, it’s hard not to be on the side of the hosts.
PART THREE: CHOOSE YOUR WEAPONS: WRITING ON MUSIC
the by now traditional glasto rant
“What really drives student entrepreneurs into a premature commercial detachment is their audiences. Every new ents officer learns from first-term results; black music has no student draw; known bands are preferred to unknown bands; no one in the student union cares who the latest critical cult figures are. Students are the great, middle-class, middle-brow bastion of British rock and, after twenty years, their tastes aren’t about to be shaken.”
— Simon Frith, “Afterthoughts”[2]
So wrote Simon Frith in 1985. Well, after twenty further years, I see no reason to revise Frith’s judgement.
These reflections have been prompted by Glastonbury, naturally, which is now nearly officially the end-of-college-year prom for Britain’s student (and graduate) population.
I should preface my remarks here by referring to Ian Penman’s[3] comments of more or less this time last year — and if anyone doubts what a LOSS Ian Penman is, and I’m sure no one does, just read his Glastonburial 03 posts.[4] Like Penman, I feel annoyed at myself for letting it get to me. The Pawboy put it perfectly: “I still get agitated, perplexed — I wouldn’t actually say ‘depressed’, that’s not true — but something like Glastonbury irks and niggles me, still, in a way I wish it didn’t. I really do wish it didn’t. Could you P-L-E-A-S-E knock me off my feet, for a while? P-L-E-A-S-E knock me off my feet for a while… ’Cos there’s a GALAXY OF EMPTINESS tonight.”
All that said, and obviously I didn’t GO — Christ, you didn’t imagine that IN A MILLION YEARS I would, did you? — and obviously the telly coverage is as nothing compared to the real experience: cos there’s like MUD there (and weren’t Jo Whiley’s mud anecdotes abso-fucking-lutely, screamingly hilarious?), and FIRE-EATERS and JUGGLERS… (Has any cultural event of any significance ever happened when a juggler is within a hundred-mile radius?) Penman again: “I mean, music in a field — in the daytime? Wtf? It’s almost deliberately delibidinising…”
But that’s the agenda, really, the secret purpose of this now unopposed embourgeoisement of rock culture UK. What’s positively sinister about Glastonbury now is that it’s not just accidentally crap, it’s systematically crap — the hidden message screams out: it’s all finished, roll up, roll up, for the necrophiliac spectacle, it’s all over.
ABANDON ALL CULTURAL VITALITY ALL YE WHO ENTER HERE.
Those who only remember the past are condemned to repeat it.
Forever.
The bill was almost parodically LCD MOR, so safe and organic and wholesome and unimpeachable and uncontroversial: Macca! Oasis! Franz Ferdinand!
No black folks of course unless they’re well into their sixties (James Brown; Toots and the Maytals), but no whiteys EITHER unless they’re into their sixties (Macca) or sound like they could be in their sixties (Franz Ferdinand, Scissor Sisters)…
Go along with Mum and Dad, read the Guardian, smoke some dope — the whole of rock history fugged out into some blandly beneficent museum of dead forms, all breaks, discontinuities, ruptures edited out or incorporated back in (the “Dance” stage), their force and novelty subdued and airbrushed into a joyless carnival of secondhand history for the stupefied delectation of the Last Men… (And didn’t they look so BORED? Well, wouldn’t you?)
The significance of generation gaps wasn’t the tired Oedipal merry-go-round so much as that they pointed to a culture of constant renewal — how long is a generation? In any vital culture, it’s a matter of weeks or months. Here? Well, the fact that the generation gap doesn’t make any sense any more at Glastonbury — balding accountants getting down to Basement Jaxx, Jemima studying Fine Arts at Sussex being “blown away” by Macca (“he was so gid!”) — is a sure sign that this is a “culture” as energetic as the contents of one of Hirst’s tanks.
RESPECT, respect for everyone… (when culture demands respect, when respect is the appropriate response to culture, you know it’s either died in its sleep or been killed). Respect is how they killed Shakespeare, make it all a part of the National Heritage…
A tactical nuclear strike would have taken out virtually everything that’s debilitating, deadening and reactive about the Brit culture industry (the whole NME staff: bargain!), much of the current ruling class and a significant portion of our future masters too (all those aspiring Tony Blairs).
Once the bombers have hit Glasto, set the co-ordinates for Ibiza, things might start improving around here…
art pop, no, really
If we’re going to discuss art pop, we really ought to forget Franz Ferdinand and Scissor Sisters and talk about Moloko.
I saw them last night at the otherwise desultory Common Ground festival in Clapham, an event whose line-up was as limp as its name was uninspired.
Gratifyingly, by the way, Common Ground wasted no time in confirming all my prejudices about festivals (and then some): those on the stage haplessly attempted to muster some enthusiasm from bored punters, who wandered around listlessly in a sunlight inimical to pop’s mystique, Strongbow in their hand, kid on their shoulder. We weren’t the only people to sit and read the paper for a while.
The bill was shocking. It felt like a local council free event, the organisers under the lamentable misapprehension that they’ll appear “with it” by booking “dance” acts such as the oppressively lumpen stodge of Freestylers (a candidate for my worst band ever, actually; I mean, at least the Stereophonics don’t taint rap and dancehall by association) and the Dub Pistols. These cloddish white appropriations of hip-hop, drum ‘n’ bass and dancehall are dispiritingly, missing-the-point funkless and morosely male (even when they use female vocalists). If their diabolic intent was to systematically convert some of the most exciting, cutting-edge music of recent years into a dull migraine thud, they couldn’t have done a more ruthlessly efficient job.
And then Moloko arrived, Róisín, preposterously but marvellously, in a helmet, like Boudicca come to retake London.
Róisín is every inch the pop star. Pop stars are a rare breed at the best of times but they’re scarce to the point of near-extinction now. (There are more pop singers and “celebrities” than you can shake a stick at, of course…) It’s partly a question of style, partly of glamour, but mostly it’s to do with charisma.
In its original meaning, charisma meant “a gift from God”. Appropriate. For charisma is dispensed according to fate’s inegalitarian whim. Róisín has it. No amount of bluster, sweat or sinew will allow the likes of the Freestylers to acquire it, even though the resentful, levelling spirit of the times would have it otherwise.
So Róisín arrives and you can feel the change in the air. Where before the stage was a libido-draining vortex (DJs on stage — just one question: why?), now it radiates energy, excitement and electricity. Charisma, it’s almost a physical thing.
Róisín has a glamour which includes sexual attractiveness but it is not reducible to it. Glamour originally meant a spell cast by women to entrance men — Róisín is certainly captivating, but not only to men.
If (viz Foucault) sex is ubiquitous and compulsory, glamour is now subtly forbidden. With the Baudrillard of Seduction, a book which could serve as a bible of glam, we could even see sex — in all its directness, in all its supposed lack of concealment — as a way of warding off glamour’s ambivalence.
Much more successfully than derivative dullards like the thankfully now forgotten dull-as-a-carpark-in-Croydon Suede, Moloko reconnect with the glam discontinuum which was ostensibly terminated by acid house’s “equity culture” in the late Eighties. Glam also had a terminator of an entirely different nature: hip-hop’s in-equity culture of conspicuous bling, one of the most unfortunate side-effects of which has been the rise of sportswear (surely one of the most depressing sights now, and not only because of its implied menace: a group of male teenagers dressed in tracksuits and hoods).
That quotidian functionalism is today’s equivalent of the agrarian organicism from which Seventies glam revolted into style. Glam repudiated hippie’s “nature” in the name of artifice; disdained its fugged, bleary vision of equality for a Nietzschean-aristocratic insistence upon hierarchy; rejected its unscrubbed beardiness in order to cultivate Image. (Image and great pop are indissoluble. Maybe the integral role of Image is what separates pop from folk. Certainly, art pop, from Roxy to Jones to the New Romantics, is unthinkable outside fashion.)
Madonna carried traces of the glam aesthetic over into the pop mainstream in the Eighties, but a more obvious precursor for Róisín is Grace Jones (about whom k-punk must write extensively in the very near future). Like art poppers such as Bryan Ferry (whose “Love is the Drug” she famously vamped), Jones’ take on pop was essentially conceptual; at the same time, she knew that concepts without sensual instantiation are as worthless in pop as they are in art (a lesson some of our contemporary “artists” would do well to heed). Incidentally, an appreciation of the concept is one of the many things that Franz Ferdinand lack that was there in their inspirations. (Actually, FF are like a copy made by an alien race, which maintains all the superficial features of the original, but misses the essential.)
Róisín has that paradoxical duality which comes as second nature to the compelling performer: she is both meticulously obsessed with her image and, at the same time, apparently indifferent to what she looks like. This comes over in her dancing. There is none of the over-rehearsed choreography of the Pop Idol puppet. Like Jagger’s and Ferry’s, Róisín’s movement can occasionally look ungainly and gauche. Sometimes we feel that we’ve caught her prancing in front of the mirror.
It’s partly this that gives her a distance from her image that isn’t camp, or at least not camp in the Kylie sense. There is an enjoyment there (this is one of many things that separates Róisín from Kylie: Kylie’s air hostess professionalism exudes grim determination, never enjoyment). Principally though not of course exclusively, this enjoyment is her own, an enjoyment that partly derives from being the object of attention, but which goes beyond that. Like all great performers, Róisín onstage enters a kind of performance trance, attaining the innocence of a child at play, to use Nietzsche’s beautifully resonant phrase. Her costume changes — including fetish boots and a military cap for “Pure Pleasure Seeker” — have the deranged playfulness of a girl riffling through a dressing-up box.
Just as Moloko give the lie to the accepted wisdom that dance music must be delivered by hooded anonymities, so they also expose the flimsiness of the alibi that Franz Ferdinand offer for indie conservatism: the idea that art pop must be retro. Moloko’s engagement with house and techno recalls Roxy’s dallyings with funk and Jones’ extraordinary Sly-and-Robbie assisted construction of a wonderfully elastic dubfunk. Any funk in Franz Ferdinand is third hand, an appropriation of an appropriation.
The third shibboleth that Moloko demolish is the notion that dance music can’t be performed live. If you’d left before they came on, you would have gone home convinced that this was the case, as group after group trudged offstage having failed to capture the precision-engineered thrill of the rap or d ‘n’ b studio production. Not so with Moloko.
For the most part the group are as reluctant to take the limelight as Róisín is delighted to bathe in it. Perhaps because of this, they are an unbelievably efficient mutagenic sonic machine, dilating tracks into anti-climactic plateaus with the same skill that a brilliant producer uses in the studio to sequence an extended version. You know you’ve arrived at a plateau when it feels like the track could continue indefinitely or end immediately. This happened with every track last night. No doubt this is because the songs provide such a strong basis for improvisation. Does anyone in pop at the moment, apart from maybe Destiny’s Child, have a sequence of high-quality singles to rival Moloko’s run from “Sing It Back” to last year’s “Forever More”? Like the Junior Boys, Moloko’s whole existence demonstrates that rhythmic innovation and spine-tingling songwriting do not have to be mutually exclusive. (Why did we ever think they were?)
Misleading, then, to select highlights, but the set is deftly constructed, so that the last three tracks pack the most impact: “Forever More”, with its anempathic house bass, Róisín plucking and shredding petals from an enormous bunch of roses as she delivered its gorgeous blues plaint; “Sing It Back”, which they’ve expanded into a deluxe suite, a song as a sequence of different possibilities; and finally the enigmatic “Indigo”, which begins all Moroder-minimal, just Róisín, a drum machine and an electro throb, then builds into a mighty cake-walk riff, as brutally bass-heavy as the Fall at their most punitive.
The only drawback? Róisín said that it’ll be a long time until Moloko play in London again.
Damn.
k-punk, or the glampunk art pop discontinuum
\Gla’mour\, n. [Scot. glamour, glamer; cf. Icel. glámeygr one who is troubled with the glaucoma (?); or Icel. glám-sýni weakness of sight, glamour; glámr name of the moon, also of a ghost, + sýni sight, akin to E. see. Perh., however, a corruption of E. gramarye.]
1. A charm affecting the eye, making objects appear different from what they really are.
2. Witchcraft; magic; a spell — Tennyson.
3. A kind of haze in the air, causing things to appear different from what they really are.
4. Any artificial interest in, or association with, an object, through which it appears delusively magnified or glorified.
Glamour gift, Glamour might, the gift or power of producing a glamour. The former is used figuratively, of the gift of fascination peculiar to women.
“Every woman has the instinct and the ability to make the most of her charms. It is an excellent thing to give oneself without love or pleasure: by keeping one’s self-control, one reaps all the advantages of the situation.”
— Leopold von Sacher-Masoch, Venus in Furs[2]
Glam IS punk; historically and conceptually.
As Simon Reynolds argued (what must be a year ago now), it was glam that made the break which allowed punk to happen.
Essentially, glam returned pop to the working-class audience disgusted and turned off by the hippies’ lazy sleaze.
For all its “androgynous” imagery, hippie was fundamentally a middle-class male phenomenon. It was about males being allowed to regress to that state of His Majesty the Ego hedonic infantilism, with women on hand to service all their needs. (If you don’t believe me — and I’ll level with you I’m very far from being an objective commentator on hippie lol — read Atwood’s Cold Rationalist classic Surfacing to see how “liberating” this was for the women who lived through it.)
Thus even Zarathustra/another time loser/could believe in you…
Seventies glam played the Nietzsche of Beyond Good and Evil and The Genealogy of Morals (the Nietzsche who celebrated aristocracy, nobility and mastery) against the young Dionysian Nietzsche. As Simon argued:
Glam’s tendency (through its shifting of emphasis toward the visual rather than sonic, spectacle rather than the swarm-logic of noise and crowds) towards the Classical as opposed to Romantic. Glam as anti-Dionysian. The Dionysian being essentially democratic, vulgar, levelling, abolishing rank; about creating crowds, turbulence, a rude commotion, a rowdy communion. Glam being about monumentalism, turning yourself into a statue, a stone idol.[3]
But glam rectified the genetic fallacy that haunted Nietzsche’s thinking. While there’s no doubt that Nietzsche’s analysis of the deadening effects of slave-moralising “egalitarian” levelling in Beyond Good and Evil and The Genealogy of Morals identified the sick mind virus that had Western culture locked into life-hating dis-intensification-unto-death, his paeans to slaveowning aristocratic culture made the mistake of thinking that nobility could be guaranteed by social background.
Nobility is precisely a question of values; i.e. an ethical stance, that is to say, a way of behaving. As such, it is available to anyone with the will and desire to acquire it — even, presumably, the bourgeoisie, although their whole socialisation teaches them to resist and loathe it. More than anyone, Nietzsche understood that the European bourgeoisie’s deep hostility to “the notion of superiority” concealed a viciously resentful psychopathology.
If Nietzschean atheology says: We must become God, bourgeois secularism says: No one may be greater than me — not even God.
Everyone knows that there has always been a deep affinity between the working class and the aristocracy. Fundamentally aspirational, working-class culture is foreign to the levelling impulse of bourgeois culture — and of course this can be politically ambivalent, since if aspiration is about the pursuit of status and authority, it will confirm and vindicate the bourgeois world. It is only if the desire to escape inspires taking a line of flight towards the proletarian collective body and Nu-earth that it is politically positive.
Glam was a return to the Mod moment(um) that had been curtailed by the hippie hedonic longueur of the late Sixties. Like most names for subcultural groups, the term “Mod” started off life as an insult, in this case hailing from the mods’ perpetual adversaries, the rockers. As Jeff Nuttall explains, to the rockers, “‘Mod’ meant effeminate, stuck-up, emulating the middle classes, aspiring to a competitive sophistication, snobbish, phony.”[4]
But no dilettante/or filigree fancy/beats the plastic you
Mods in the Sixties were very different from how they appear in the designer cappuccino froth of Eighties soul-cialist retro-mythologisation. It was the rockers who appealed to the “authentic” and the “natural”: their rebellion posed as a Rousseauistic resistance to civilisation and mass (produced) culture. The mods, on the other hand, embraced the hyperartificial: for them, Nuttall wrote, “alienation had become something of a deliberate stance”. Nobility was not innate for mods: rather, it was something to be attained, through a ruthless de-naturalisation of the body via decoration and chemical alteration.
The mods were in every sense hooked on speed, and the black American music they gulped down with their bennies and coffees was consumed in the same spirit and for the same reasons: as an accelerator, an intensifier, an artificial source of ecstasy. That is, as a chemical rush into Now, NOT as some timeless expression of pride and dignity.
In the desire (my official position on this now btw is that “libido” should be used in place of “desire”)-pleasure relation, there is a third, occluded term: sensuality.
The hippies’ sloppy, ill-fitting clothes, unkempt appearance and fuzzed-out psychedelic fascist drug talk displayed a disdain for sensuality characteristic of the Western master class. (“Hey man, it’s all about the MIND.”)
When hippies rose from their supine hedono-haze to assume power (a very short step), they brought their contempt for sensuality with them. Brute functional utilitarianism plus aesthetic sloppiness and an imperturbable sense of their own rights are the hallmarks of the bourgeois sensibility (look at all those shops in Stoke Newington that say they’ll open “tennish” and you know exactly what class you’re dealing with).
The hippie power class wanted power without having to go to the effort of power dressing. Naturally, middle-class hippie “feminists” never missed a stride in their move from alleged egalitarianism to supercilious judgementalism. What is the disdain for cosmetics and clothes if not an attack on the working class? The assumption of bourgeois so-called feminists is that their lives of neurotic bed-hopping “freedom” and Carrie Bradshawing perpetual adolescent equivocation are better than the working-class pattern of (once) getting married young and (now) having children young, when it is clear that it is just another trap — and not necessarily a more congenial one.
Now the bourgeois philistines have destroyed glam and returned us to their preferred aesthetic mode: Romanticism. The contemporary bourgeois Romantic has realised Romanticism in its most distilled form yet. The so-called Romantic poets, musicians and painters of the late-eighteenth and early-nineteenth century remained sensualists, whereas our contemporary Romantics are defined by their view that sensuality is at best an irrelevance, a distraction from the important business of the expression of subjectivity.
Romanticism is the dressing-up of Teenage Ontology as an aesthetic cosmology. Teenage Ontology is governed by the conviction that what really matters is interiority: how you feel inside, and what your experiences and opinions are. In this sense, sloppy drunkard Ladette Tracy Emin is one of the most Romantic artists ever. Like Lads — the real inheritors of the hippie legacy — Emin’s bleary, blurry, beery, leery, lairy anti-sensualist sensibility is an advert for the vacuity of her own preferences.
What we find in Emin, Hirst, Whiteread and whoever the idiot was who rebuilt his dad’s house in the Tate is a disdain for the artificial, for art as such, in a desperately naif bid to (re)present that pre-Warholian, pre-Duchampian, pre-Kantian unadorned Real. Like our whole won’t-get-fooled-again PoRoMo culture, what they fear above all is being glamoured. Remember that glamour means, “Any artificial interest in, or association with, an object, through which it appears delusively magnified or glorified.”
But let’s make our case by considering some artefacts in some detail.
Exhibit one: the cover of Roxy Music’s For Your Pleasure, 1973.
The cover image is a mistresspiece of ambivalence.
Let’s approach it through the eyes of Ian Penman, the most consummate of Roxy observers. (No doubt, Penman, like me, is endlessly drawn back to Ferry because he took the same journey from the working class into acceptance by the English master class.)
(I make no apologies for citing Penman’s text, “The Shattered Glass: Notes on Bryan Ferry” at some length, since it is almost criminal that this bravura display of theoretical elegance should be mouldering amidst the pages of a long-forgotten, chalk dusty Cult Studs collection).[5]
On the shoreline of For Your Pleasure, beneath it, on the waterfront strand, stands the second of many new models: at first sight the second installation of the stock Ferry/Roxy woman.
But to get the full picture we have to fold out the sleeve, so that we can see Ferry looking on…
Penman goes on:
Ferry fills out his function as her chauffeur (landlocked ferryman: a sign of the times). He waits in amused admiration, surveying the neatness of the visual pun — the model takes her cat (for a) walk: forming a uniform and uniformly predatory alliance with her black panther, eyes and mouth directed out at the viewer. Imperiously, she takes the air, she fields his grace, takes her anima for a prowl and a stretch. Ferry — for sure — remains to be seen, smiling manfully behind her back, artfully protected by the fold in his sleeve. He had arranged his own look as both within and outside of the main frame.
(“Within and outside of the main frame: is that so often where we find ourselves, lost, stranded, these days—?”)
Cut.
She is a model woman, to be sure; fashion pushing into abstraction and rarified codification, not there for the benefit of a product as such or altogether in the name of Art; so she appears to be what? She appears, on the condition that she appear to be without attributes. We can attribute nothing to her beyond a certain imaginary realm of wealth, of wealth as fetish, (Helmut) Newton’s law of physiques. She is sheerest sharp blue nothingness. (For the cool-and-blue post-Duchamp artist, it seems entirely apt for beauty to take the veiled form of scissors.)
As an aside, since this concerns another debate: the last things Ferry’s songs were — at this stage at least — were “just good tunes”. The first thing they were, were questions: including questions about what a good tune might mean…
And — at this stage — Ferry’s songs were no more “love songs” than Magritte’s “Human Condition” was a representation of a landscape. Like Magritte, Ferry’s sheer coldness and distantiation cannot but draw our attention to the framing machines that make possible the emotions of which he sings.
Another cut, to a “realm of a certain narcissistic eroticism he is not allowed entrance to without putting his heterosexual sensibility in doubt”:
All his songs’ women (and this will be especially so with “Stranded” and subsequent plaints) are voiceless sirens who — although wielding the utmost power over the artist’s life and sensibility — seem to be without implication (which is to say: eternalised out of existence). Neutered time and place (those perennial spans of Fashion) coalesce naturally into the figure of the woman. Woman as figure, or scene — war pin up, cat-woman, amazon, siren, Riefenstahl Mädchen.
“[W]ielding the utmost power over the artist’s life and sensibility…” The utmost power… Is he, the artist, Severin, the protagonist of Masoch’s Venus in Furs? Or Sarrasine, the hapless hero-dupe of Balzac’s novel who unwittingly falls in love with a castrato?
Because, you see, the ironic punchline was: she is not(-all) a woman.
Amanda Lear, the For Your Pleasure model, was a transsexual (though, in yet another complication, she later denied it). A transsexual, moreover, whose operation might have been paid for by none other than Salvador Dali.
Either way, it is clear that Ferry has set the tone for a 1970s in which the male is both glamorous and glamoured, himself a gorgeously-styled photogenic object, entranced and seduced by a cosmetic beauty he partly wants to make contact with, but mostly wants to cold pastoralise into an immutable untouchability. “Mother of Pearl” — which as Penman observed on The Pill Box, is the whole of Lacan in seven minutes, more or less — is the closest Ferry comes to writing a manifesto for his meta-melancholia, a meta-love song about the impossibility — and undesirability — of attaining the Ideal object.
Now this melancholia is not straightforwardly “tragic” (and even if it were, it would have little to do with any bourgeois sensibility, since, as everyone from Shakespeare to George Steiner (The Death of Tragedy) to Nietzsche to Bataille demonstrates, bourgeois secularism is inherently inimical to any notion of the tragic).
But Ferry’s sensibility is definitely masochistic. (As opposed to that of the Sixties, which, as Nuttall, for one, suggests, was Sadean. Compare the Sixties-sired Lennon’s “Jealous Guy” — the Sadist apologises — to Ferry’s reading of the song — the masochist sumptuously enjoying his own pain — for a snapshot of a contrast between the two sensibilities.)
The masochist’s perversity consists in the refusal of an exclusive or even primary focus on genitality or sexuality even in its Sadean polymorphous sense, which is perverse only in a very degraded sense.
The Sadean imagination quickly reaches its limits when confronted with the limited number of orifices the organism has available for penetration. But the masochist — and Newton is in this respect, as in so many others, a masochist through and through, as is Ballard — distributes libido across the whole scene. The erotic is to be located in all the components of the machine, whether liveware — the soft pressure of flesh — or dead animal pelt — the fur coat — or technical. Masochism is cyberotics, precisely because it recognises no distinction between the animate and inanimate. After all, when you run your fingers through your beloved’s hair, you are caressing something dead.
How had Ferry got here, become stranded in the early Seventies, an artist-voyeur art-director masochist?
Ferry famously studied painting under Richard Hamilton, the so-called godfather of British Pop Art, at Newcastle University. Can we even begin to reconstruct the impact that Hamilton’s art had on British culture?
Well, you can get some impression of it from the fact that, in a documentary on Hamilton made by Channel 4 in the early 1990s, Ballard cited Hamilton’s 1956 “Just What Is It That Makes Today’s Homes so Different, so Appealing” as one of the cultural events that made it possible for him to be a science-fiction writer. It would be better to say that Hamilton made possible Ballard’s exceeding of science fiction, his discovery of k-punk.
1956 was, of course, the year of Presley’s breakthrough records. In its own way, though, Hamilton’s collage was at least as important as Presley in the development of British pop.
After the Fifties, pop and art have always been reversible and reciprocally implicating in British culture in the way that they are not in America. Nuttall: “The students and the mods cross-fertilised… Purple hearts appeared in strange profusion. Bell-bottoms blossomed into wild colours. Shoes were painted with Woolworths lacquer. Both sexes wore make-up and dyed their hair… The air in the streets was tingling with a new delirium.”[6]
British pop’s irreducible artificiality makes it resistant to the Romanticist naturalisation that the likes of Greil Marcus and Lester Bangs achieved in respect of American rock. There is no way of grounding British art pop in a landscape.
Not a natural landscape in any case.
If art pop had a landscape it would be the aggressively anti-naturalistic one Ferry collaged together on “Virginia Plain” (named after one of his paintings, which was itself named after a brand of tobacco). Is this an internal landscape, what the mind’s eye sees? Perhaps. But only if we recognise that — as Hamilton’s collage and Ballard’s fiction insist — in the late-twentieth century the “space” of the internal-psychological was completely penetrated by what Ballard calls the media landscape.
When the British pop star sings, it is not “the land” which speaks (and what does Marcus hear in the American rock he mythologises in Mystery Train if not the American land?) but the deterritority of American-originated consumer culture. Hence the braying grotesquerie of Ferry’s singing voice on those early Roxy releases. (And the different grotesquerie of today’s emoting pop idols.)
With the first-hand expertise of someone who has had to lose his voice in order to speak (for that is what you must do if you educate yourself — or are educated — out of a working-class background), Penman brings out very well how integral the problem of accent — of losing a Geordie accent, of not gaining an American accent — was to Ferry’s career.
As a student, Ferry’s life was divided between his daytime movement through the art milieu and nighttime fronting of a soul band doing covers. Two voices, two lives. “I hadn’t found anything to incorporate all of me.”
The early Roxy records are Ferry’s Warhol-Frankensteinian attempts — the joins still showing, thrillingly, horrifyingly — to hand-machine a space that would incorporate his day and his night self. So they are not so much expressions of a coherent subjectivity as a kind of destratification-in-progress, the production, on the fly, of a pop art plane of consistency which he could feel at unhome in.
So here was a pop music, astonishingly, more shaped by Duchamp than Bo Diddley. The methodology Ferry deployed on his solo albums of cover versions (and remember that such albums were almost unknown in rock music at the time) was explicitly Duchampian. His renditions of standards such as “Smoke Gets in Your Eyes” and “These Foolish Things” were, he said, Duchampian “readymades”: found objects upon which he put his own stamp.
Part of what made the early Roxy sound so cold — particularly by comparison with the hot authenticity of American rock — was the fact that they were evidently not an aggregation of spontaneous, creative subjects, but a meticulously executed Duchamp-type Concept: a group whose every gesture was micro-designed, and who credited their stylist, fashion designer Anthony Price, on their album sleeves.
The great temptation for Ferry would always be to slip inside the frame: to become, really, the heartaching bachelor in the dreamhome, to achieve what Simon calls the
fantasy of stepping outside the lowly world of production into a sovereign realm of pure unfettered expression and sensuous indulgence, an imaginary and fictitious notion of aristocracy (more Huysmans than real lords who have to do humdrum things like manage their estates, juggle their investments, do a bit of arms dealing).
To achieve the total simulation of manners that he was up till then only pastiching-affecting.
And, isn’t Simon right, aren’t Ferry’s later records all about “the disillusionment of actually achieving the supermonied aristo life — Ferry, condemned to mooch jaded forever through art openings, fashion shows, all tomorrow’s parties (that old ’tis better to journey than arrive line)”?
Let’s leave Ferry there, stranded, framed.
And cut.
To 1982. Compass Point, Nassau.
Grace Jones’ astonishing recording of Joy Division’s “She’s Lost Control”.
Masoch: “A slap in the face is more effective than ten lectures, especially if it is delivered by the hand of a lady.”
Kodwo Eshun:
The womanmachine Grace Jones’ 82 remodel of Joy Division’s 79 She’s Lost Control updates the Fifties mechanical bride. For the latter losing control meant electric epilepsy, voice drained dry by feedback. For Jones, the female model that’s losing control induces the sense of automation running down, the human seizing up into a machine rictus. The model — as girl, as car, as synthesizer — incarnates the assembly time of generations, obsolescence, 3-year lifespans.
The model is the blueprint for the post-Cold War cyborg, the womanmachine modified and mutated by the military medical entertainment complex. Hence Kraftwerk’s The Model, where the bachelormachines are threatened by the womanmachine’s superior reproductive capability. The Model is an excerpt from the post-war machinereproduction wars.[7]
Jones is the sublime object before which Ferry prostrated himself — and who talked back. Through vagina-dentatal teeth.
Be careful of the womanimal-machine. It bites.
Jones is not a cyborg because she is not an organism of any kind (and the modifier “cybernetic” is in any case redundant, since all organisms, like everything that works, are cybernetic).
She is a neurobotic femachine.
The mechanical bride stripping her bachelors bare.
Jones was herself once a model, but when she has the opportunity to “express herself”, she ruthlessly exploits her own body and image much more than any (male) photographer would have dared to. “In a recent poll by Men’s Health magazine, the male readership named Grace Jones […] among the women who scared them the most.” (Brian Chin).
The game becomes the hunter.
She out-Duchamps Ferry, (dis)covering his “Love is the Drug” as a found object to be absorbed by the femachine.
Jones understands her body Spinozistically as a machine capable of being affected and producing affects. This body is in no way limited to the organism; it is distributed across photographs, sound and video — and none of these media constitute a representation of an originary organic body. They are, each of them, unique expressive components of the Jones singularity.
It’s total immanence.
There is no Grace Jones the subject who expresses her subjectivity in sound and image. There is only Jones the abstract hyperbody, the cut-up scissormachine that cuts itself up, relentlessly.
The Jones body is immanent, too, in that, as Kodwo repeatedly insists of sonic fiction throughout More Brilliant than the Sun, it produces its own theory.
Certainly, by the time that Haraway’s Cyborg Manifesto limps onto the scene, it is only to mislead via reterritorialisation.
Cut again.
To London, 1982.
(Reproduced from the early days of blogger k-punk.)
The sex appeal of the inorganic.
Paul Tickell’s review of The Anvil, NME 27 Mar 82:
I’d thought “Contort Yourself” the right kind of music for Newton’s sadoeroticism — but “The Anvil” is a greater approximation. You wanted the moderne dance — well […] here it is: the night-time moves of marionettes — dummies — puppets — clowns — and imaginary celluloid beings. It’s all a little deathly — the sound of commodities fucking — but a noise which can be a good deal more exhilarating (“the sex appeal of the inorganic” — Walter Benjamin) than healthy fun-loving creatures going at it.
All in all — Visage are a rather seductive disease — the skull beneath the made-up skin.
More material from early k-punk:
Roxy versus Visage: a shift from subject to Object (therefore, following Baudrillard’s logic in Seduction, from masculine to feminine). Fem-glam notwithstanding, Ferry retained for himself the male role of the one-who-looks. The problem, for Ferry, is the (male) gaze — how much to look? For how long? “Then I look away/too much for one day.” Strange, meanwhile, is invariably the looked-at. He is the discarded plaything in “Mind of a Toy” (telling title, that), the object of gossip in The Anvil’s maudlin “Look What They’ve Done” and “Whispers.” The model, here, is — the model: the anxiety — how am I seen?
(Can we assume, btw, that Gibson derived the name Neuromancer from “New Romantic”? If so, Gibson’s transposition suggests a much more interesting, and appropriate, name for the nerve sorcery of these newly-wired electronauts. “Romantic” always struck me as way off-beam for a culture so fastidiously uninterested in depth/emotions/truth.)
The case against Visage always seemed to me to depend on rockist prejudice: they didn’t play live, they were a vehicle for a clothes horse who “couldn’t sing”, they represented the return of prog. Isn’t there also a masculinist agenda, too, in the implicit rejection of the “superficiality” of fashion and clubbing?
Visage thoroughly stripped their sound of the trappings of r and r, ostentatiously parading an Un-American ancestry. Thematically and sonically, Visage evoked a decadent Europe of seductive urban alienation (cf the Mondrian-like vision of endless High-Rises in Blocks on Blocks) and sumptuous glamour (cf the name, and the track, “Visage”; the French vox on “Fade to Grey”), conjured through vocoder vox, synthesizers and Billy Currie’s pseudo-classical flourishes. American influences came rerouted/refracted through Europe: Moroder disco; Morricone (cf McGeoch’s “Once Upon a Time in the West”-isms on the Spaghetti Western/Clint tribute “Malpaso Man” off Visage). Cinema was a major node: much of Visage’s sound belongs to what would later be called “virtual soundtracks” (Barry Adamson, one of the architects of this genre, was of course a Visage member). The mood was one of dis-affection, not the robotic functionality of Kraftwerk, nor the schizo-dislocation of Foxx/Numan, but the Euro-aesthete’s “exhaustion from life”, nowhere better expressed than on the Interview with the Vampire-like “Damned Don’t Cry”. Visage didn’t thematise machines in the way that Kraftwerk, Numan and Ultravox did: like Yello, they seemed to operate in a future-past glittering hall-of-mirrors in which synthesizers and electronics were less a new innovation than a taken-for-granted mainstay.
Visage’s “cyberpunk baroque” is a link between Roxy Music, Vangelis, disco and what would later become dance culture. Anyone who doubts this should check out the dance mixes of “Frequency 7” or “Pleasure Boys”: the instrumental breakdown in the “Pleasure Boys” remix is pure acid house, and “Frequency 7” is nothing but a breakdown, a thrillingly anachronistic slice of machine-techno. It was no doubt Strange and Egan’s role in the Blitz/Camden Palace that facilitated the move into dance. Making clubbing and dancing, rather than the gig, central was a crucial step (for Visage specifically, but for the New Romantic scene in general). Strange was less important as “frontman” than as pure image, his very diffidence and passivity as a vocalist anticipating dance’s later complete effacement of the singer.
Except the singer doesn’t get completely effaced by dance.
It returns as the femachine Róisín.
Cut to Now.
I’ve little to add to my recent remarks[8] on Moloko and Róisín Murphy as the latest — but I hope not last — contribution to the art pop story.
But it’s worth distinguishing Murphy from two artists John recently mentioned in the comments boxes: Madonna and Kylie.
Minogue is a sex worker in the most banal and degrading sense, since it is clear that her simpering subordination to the Lad’s Gaze is nothing more than a career(ist) gambit. Murphy, by contrast, gives the impression of enjoying herself, of doing what she would do anyway (and just happening to have an audience). It’s clear that she enjoys attention (male or otherwise) but like all great performers, her jouissance seems to be fundamentally auto-erotic. The audience function not as passive-consumer onanist spectators, but as a feedback component in the Róisín-machine.
And unlike Madonna, Murphy does not Photoshop out all the joins and the cuts in her performance. Whereas Madonna’s hyper-professional show is all about attaining the CGI seamlessness of a corporate film, Murphy — pulling her leather fetish boots on onstage — is always playing — albeit seriously.
Q: You’re becoming quite the style icon, is that an area that interests you?
R: Well, I think I dress for myself, I mean, I’ve always dressed up anyway, and I just enjoy it. I think maybe people are just fed up of pop stars that are told what to do and what to wear.
noise as anti-capital: as the veneer of democracy starts to fade
FORGET ORWELL
Orwell is wrong about everything, but especially 1984.
Far from being the year of zombie-drone enforced consensus, GB 1984 was a class war zone in which multinational Kapital’s paramilitary-police crushed the remnants of organic workerism live on videodrome.
Such staged antagonism is a necessary phase in the pacification program that will culminate in apparently triumphant Kapital’s End of History.
The reassuring non-hum of the noise free polis at the end of time.
Tony’s smile.
Blair is a much more effective class warrior than it was possible for Thatcher and MacGregor to be.
Their efficacy was limited by Then Kapital’s need for them to be seen fighting the class war.
No need for Tony to fight.
To not fight is to have won.
It’s all administration now.
Systemic antagonism is just a bad memory. Turn up the TV.
Bunker down in your burrow.
Retune the guitars.
Return to harmony.
Welcome to Liberty City.
The busier you are, the less you see.
SOUND FX
Mark Stewart’s As the Veneer of Democracy Starts to Fade was the political-libidinal intensive soundtrack to the “battle for the hearts and minds” fought between Kapital and its enemies in GB circa 84-85.
Seven years since Stewart began his anti-career as teen-Nietzsche Artaud Debord communist shamanic-firebrand hysteric-screecher in the Pop Group.
Stewart’s journey since the dissolution of the Bristol f-punk kollektive takes him through Adrian Sherwood mega mashed hyperdub and into an encounter with US hip-hop.
He immediately appreciates that hip-hop is not a street music but non-musique abstrakt: a site of pure sonic potential, in which inhuman constructivist sound cartoons can be produced without reference to musical protocols of any kind.
It’s all sound FX, a way of manipulating noise.
Hyper-modernism. The sonic equivalent of the Burroughs-Gysin cut-up. A contact of Sherwood’s leads to the most improbable of meetings. UK non-singer and sound-deranger Stewart plugs into the super-slick behind-the-scenes NYC p-funk machine responsible for the grooves on the pioneering hip-hop 45s released by Tommy Boy and Sugarhill.
Component parts:
Keith Leblanc. Beat machine producer of “Malcolm X: No Sell Out”. He can program drum machines to make them sound like packs of dogs.
Doug Wimbish. Supertaut hypertechnicised Hendrix of bass.
Skip McDonald. Synthesizer manipulator and reaper-rider of psychedelic-funk ax storms.
Sherwood and Stewart take their already inhuman grooves and subject them to further layers of dissonant anti-musical editing, interpolating Burroughs vocal samples from Nova Express and other deliberately ciphered media background noise, machining an anti-communicational libidinal signal that takes you behind the screens to access the Real news.
Apocalypse Now.
THE BATTLE FOR THE HEARTS AND MINDS
As the veneer of democracy starts to fade, some say the internment camps are already built.
When the mask of civility comes off and the visors go on, the contours of the New World Order become apparent.
The destruction of the miners — and with them the wrought-iron ruins of the postwar consensus — was only the most media-visible of the pacification strategies Kapital was deploying, and in many ways the least significant.
The important thing was to prepare the way to transnational cyberspace Kapital Now, when all dissent is pathologised if it is not made literally unthinkable.
“Sterezene, thorazine and lagactyl” administered under the Mental Health Bill subdue political prisoners re-assigned to psychiatric wards.
Narco-neuroticisation as the re-imposition of a simulated Reality Principle shoring up Kapital against its virtual limit in Planetary Schizophrenia.
You don’t have to be mad to work here.
Restrict your demands to what is possible.
Find your way back to your dormitory.
Privatise your misery.
Struggling to pay the rent, the main worry’s job security.
Now and then, we can afford a little luxury.
Quietism.
DISSONANCE/ DISSENSUS
If the aim is to disseminate information, why all this noise?
Why the distortion, the deliberately buried voices, why all the half-heard insinuations, the audio-hallucinatory fragmentation, the wired-up screams?
Why not communicate clearly?
Because clear communication — and all it presupposes — is the fantasm the system projects as its vindication and necessarily always-deferred goal.
“The big Other stands for the field of common sense at which one can arrive after free deliberation; philosophically, its last great version is Habermas’s communicative community with its regulative ideal of agreement.”[2]
The noise free polis.
We are told:
Only when the noise of antagonism recedes will we be able to hear each other. Only when we take out the background static will human speech be possible. Police yourself and there will be no need for the use of batons. Intoxicate yourself and we will not sedate you.
Stewart’s disassembly of his self through noise is a refusal of the Foucault biocops and Burroughs control addicts that operate first of all at the level of the skin and the CNS, enticing-inciting you to constitute yourself as an internally coherent driving ego.
Stewart treats his own voice not as the authentic expression of a subjective interiority, but as a series of lab animal howls, enraged yelps and impersonal intensities to be cut up and redistributed across the noise-hyperdubscape, mixed indifferently with Duchamp-found-sounds and noises produced by viciously distorted formerly musical instruments.
Identity breakdown through the amplification of noise as an exploding flight from harmony at all levels: psychic, social, cosmic.
Dissensus.
I AIN’T GONNA BE A SLAVE OF LOVE
Always take O’Brien’s side against Winston Smith and Julia.
There is nothing natural, and human biosocial defaults are always to be distrusted.
If you want to get out, leave all that mammal couple shit behind.
Stewart is one of Burroughs’ most assiduous readers.
It is not a matter of emulation but of the deployment of abstract engineering diagrams in different media.
Position As the Veneer of Democracy Starts to Fade as the terminus of the Burroughs-saturated UK underground delineated by Nuttall in Bomb Culture.
“Hypnotised” plays like the “I Love You” section from The Ticket that Exploded, Burroughs’ most pitilessly hilarious dissection-analysis of the biopsychic sex-love control virus as preprogrammed biological film, sentimental mooning croon-tunes spliced in with hardcore pornography and replayed like videodrome in your CNS, ensuring-exacerbating constant craving:
All the tunes and sound effects of “Love” spit from the recorder permutating sex whine of a sad picture planet: Do you love me? — But I exploded in cosmic laughter — Old acquaintances be forgot? O darling just a photograph?[3]
Heaven must be missing an angel.
Hypnotised.
Hypnotised.
She’s got me hypnotised.
Stewart’s cut-ups of constructivist-brutalist funk with saccharine lovesongs already anticipate the way in which Kapital’s tungsten-carbide stomach will metabolise hip-hop’s hyper-abstraction and use it as the dominant consumer seduction soundtrack from the Nineties till now.
CONTROL DATA
The data-content of Stewart’s rant-reports is nothing astonishing.
7% of the population own 84% of the wealth.
Parasites… The great banking families of the world… Bastards…
Are these the words of the all-powerful boards and syndicates of the earth?
The point is not to tell you something new but to reprogram your nervous system.
Control works by reducing the reality of systemic antagonism to a mere belief.
A track like “Bastards” is a very precise anti-Control weapon.
It is a rage-inducer designed to make beliefs affective, whereas Control PR conciliates and normalises.
Control PR plugs the gaps, emolliates, quietens, makes confrontation and exploitation unthinkable without denying their reality as such.
Like John Heartfield collages, Stewart’s crude sonic splices amp up the distortion and the violence.
The situation is not under Control
They are not protecting you
It is war
And so are you
THERE IS NO DIGNITY
Don’t confuse the working class with the proletariat.
Thatcher inhibits the emergence of the proletariat by buying off the working class with payment capital and the promise of owning your own Oed-I-Pod. The comforts of slavery.
She gives the replicants screen memories and family photos.
So that they forget that they were only ever artificial, factory farmed to function as the Kapital-Thing’s self-replicating HR pool, and begin to believe that they are authentic human subjects.
The proletariat is not the confederation of such subjectivities but their dissolution in globalised k-space.
The virtual population of nu-earth.
Total recall of all the noise.
Lyotard describes the hysterisation of a worker’s ear when it is subjected to the unprecedented noise of Industrial-Kapital’s reproduction: the incessant sonic violence of a 20,000Hz alternator.
The heroism of the proletariat consists not in its dignified resistance to the inorganic-inhumanity of the industrialisation process — “there is no libidinal dignity, nor libidinal fraternity, there are libidinal contacts without communication”[4] — but in its mutative Duchamp-transformation of its body into an inhuman inorganic constructivist machine.
As the Veneer of Democracy Starts to Fade is a sonic machine for accelerating the process. An anti-Oedipal, anti-neurotic, anti-quietist, pro-proletarianising noise weapon. Anti-videodrome signal.
Jack it into yr CNS and play.
lions after slumber, or what is sublimation today?
“In post-liberal societies […] the agency of social repression no longer acts in the guise of an internalised Law or Prohibition that requires renunciation or self-control; instead, it assumes the form of a hypnotic agency that imposes the attitude of ‘yielding to temptation’ — that is to say, its injunction amounts to a command: ‘Enjoy yourself!’ Such an idiotic enjoyment is dictated by the social environment which includes the Anglo-Saxon psychoanalyst whose main goal is to render the patient capable of ‘normal’, ‘healthy’ pleasures. Society requires us to fall asleep into a hypnotic trance…”
— Slavoj Žižek, “The Deadlock of Repressive Desublimation”[2]
As we awake from the dreary dream of entryism, we can start to see that what kept us slumbering in the last twenty-five years was indeed a programme of controlled, if not quite repressive, desublimation. No doubt, the signs of any awakening are fitful as yet. In pop, they are perhaps most evident in a groping backwards, a paradoxical return to modernism. Could it be that the likes of Franz Ferdinand and the Rapture will prompt a self-overcoming of the very postmodern revivalism of which they are a symptom? Just now, Rip It Up and Start Again sounds like an uncannily timely injunction.
Something seems to be (re)coalescing, as the reception of the Early Scritti LP (including Simon’s piece in Uncut) indicates. For those inside — not least, of course, Green [Gartside][3] himself — these recordings must be dismissed as inept avant-doodlings, embarrassing juvenilia. It seems plausible to attribute Green’s less-than-lukewarm judgements on the early Scritti material less to modesty, still less to a “maturity”, than to a defensive cleaving to a once-successful strategy that has now run its course, as Marcello noted in his acerbic comparison between the indifferent reception of the last Scritti album and the eagerness with which Early has been anticipated:
When you next find yourself at a motorway service station, feel free to browse through the plentiful copies of his 1999 up-to-the-1999-minute-Mos-Def-involving Anomie and Bonhomie album, yours for just £1.99 — whereas the new collection of his scratchy, disjointed post-punk improv stuff from two decades ago reportedly already has 10,000 advance orders.[4]
Scritti were the most successful — aesthetically, commercially — of the post-post-punk entryists (the likes of U2 being always-already included of course). In the desolate gloss of mid-late Eighties pop, Scritti’s hyper-saccharine sweetness retained a plaintive if sickly gorgeousness, even if their vaunted deconstructive swerves became subtle to the point of invisibility. But the fastidious precision of that striplit Eighties production has dated (late-Eighties chartpop is the most time-bound pop music ever — discuss) much more damagingly than has the messthetic of the early records. Whereas the conspicuous completeness of Eighties entryist pop repels fascination, the queasy, uneasy unfinishedness of post-punk pop — the lurching “doubtbeat” of a collectivity discovering itself — is uncannily compulsive. In punk cut and paste, the joins, the cuts (in other words the ways in which any world does not coincide with itself) are flaunted and foregrounded. Entryism is the capture of cut and paste into Photoshopped seamlessness.
What is perforce lost in today’s post-punk revivalists is the literal intensity of this sound: how it is in Kierkegaard-Žižek’s terms, in becoming. It doesn’t know anything (it certainly can’t be confident that it is a “classic independent rock sound” in the way that Franz Ferdinand are). In flight from rock’s “condition of possibility”, this undo (it) yourself pop puts into question ALL conditions of possibility, and with them the very concept of conditions of possibility. What is this if not the sound of the Badiou Event, which is the punk revelation itself: this is happening, now, but it can’t be, it’s Impossible…?
Scritti were possibly the least Dionysian pop group ever. In the early days, the methodology may have been improvisational, but the group didn’t want anyone (least of all themselves) to be under the illusion that it issued from some vitalist wellspring of creativity. It was the sound of a collectivity thinking (itself into existence) under and through material constraints. The famous displaying of all the recording costs on the sleeve was demystificatory but not desublimating. What is too often missed about any punk that matters in fact, is that sublimation, far from requiring mystification, is alien to it.
As Alenka Zupančič argues, mystification is entirely on the side of the reality principle (and one of the greatest contributions psychoanalysis has made to politics is to identify “realism” precisely with a reality principle, so that what counts as “commonsense” can be exposed as an ideological determination):
The important thing to point out […] is that the reality principle is not some kind of natural way associated with how things are, to which sublimation would oppose itself in the name of some Idea. The reality principle itself is ideologically mediated; one could even claim that it constitutes the highest form of ideology, the ideology that presents itself as empirical fact (or biological, economic…) necessity (and that we tend to perceive as nonideological). It is precisely here that we should be most alert to the functioning of ideology.[5]
Listen to what Marcello calls the “pseudo-moronic chants of platitudes” (“An honest day’s pay for an honest day’s work! You can’t change human nature! Don’t bite the hand that feeds!”) of Scritti’s “Hegemony” and it is clear that this message has got through. It is the denunciation and exposure of the great ideological swindles that are liable to be remembered about punk, but this destructive urge (passive nihilism) is empty without its active complement: the production of a new space. It is no accident that mystificatory realism has allowed us to remember the former but not the latter, since the mere dismantling of ideological presuppositions quickly became a dreary academic parlour game associated with a desiccated, depressive and depressing left (they want to take your enjoyment away from you…).
Zupančič labels the production of this new space “sublimation”. To understand why she makes this move entails differentiating sublimation from the sublime as such. The postmodern emphasis on sublimity has tended to stress the sublime as an unreachable beyond, contemplation of which induces a pathos of finitude in any human subject. To think about sublimation, the process by which an object “acquires the dignity of the Thing”, produces a different emphasis. As Zupančič continues:
the Lacanian theory of sublimation does not suggest that sublimation turns away from the Real in the name of some Idea; rather, it suggests that sublimation gets closer to the Real than the reality principle does. It aims at the Real precisely at the point where the Real cannot be reduced to reality. One could say that sublimation opposes itself to reality, or turns away from it, precisely in the name of the Real. To raise an object to the dignity of the Thing is not to idealise it, but rather to “realise” it, that is, to make it function as a stand-in for the Real. Sublimation is thus related to ethics insofar as it is not entirely subordinated to the reality principle, but liberates or creates a space from which it is possible to attribute certain values to something other than the recognised and established “common good.” […] What is at stake is not the act of replacing one “good” (or one value) with the same planetary system of the reality principle. The creative act of sublimation is not only a creation of some new good, but also (and principally) the creation and maintenance of a certain space for objects that have no place in the given, extant reality, objects that are considered “impossible”.[6]
What was the “beyond good and evil” of Scritti, Gang of Four, the Pop Group and the Raincoats if not the production of just such a space? (As Lacan wryly notes, when we idly think of someone who is “beyond good and evil” we are liable to think of someone who is merely beyond “good”.) This entails not an austere asceticism but the engineering of new forms of enjoyment. The early Scritti’s “difficulty” places them beyond the pleasure principle, for sure, but we succumb to an ideological lure if we think that this puts them beyond enjoyment too. As Savonarola said to me a few weeks ago, Gang of Four were far more effective at turning out compulsive pop songs than almost any of today’s chart acts you could care to name. The same goes double for Early, whose songs are catchy because they refuse to push familiar buttons.
Entryism constitutes a double disavowal of sublime space. First, it is turned away from, then the very possibility of its existence is denied. In retrospect, entryism has to be seen as the production of a particularly virulent capitalist mind plague. How else to account for the absurd convolutions that allowed Green to posit some political continuity between the avant-Marxism of the early years and the champagne-swigging meta-boyband-cum-yuppie-corporation “hammer and popsicle” posturing (check some of those pics which illustrate Simon’s piece) of the chart Scritti? As Simon has shown elsewhere, Green had by this point done more than merely accommodate himself to the market; he was acting as an entrepreneur, since the “‘Fairlight future-funk’ of Cupid and Psyche 85 was so far ahead of the game it actually influenced black pop.” There’s a case for saying that Cupid and Psyche’s “dazzling, depthless surfaces” in which “‘soul’ and interiority are abolished” and “desire traverses a flat plane, the endless chain of signifiers, the lover’s discourse as lexical maze” was THE sound of Eighties capital, the perfect soundtrack to Jameson’s Postmodernism, or the Cultural Logic of Late Capitalism, what it felt like to be lost in the mirrored plateaux of the hypermarket.
Realism always poses as maturation. Of course it is acceptable, understandable and inevitable to have silly youthful dreams, but there comes a time when one must put aside such childish things and face reality; and reality is always defined “biologically”, in terms of the imperatives of reproductive futurism, and “economically”, in terms of the “constraints” of the capitalist anti-market.
Readers of the lamented but never forgotten Pill Box[7] may remember a letter Ian Penman received from Brian Anderson of the neo-Conservative City Journal. Anderson’s account of the intellectual provenance of neo-conservatism — many neo-cons were tellingly described as disappointed, disillusioned leftists who had been “mugged by reality” — concluded with the following convergence between new pop entryism and his own neo-conservative turn:
Also — and this will probably horrify you — my move right came partly thanks to Ian Penman and Paul Morley at NME! Your rejection of overly politicised agitprop in music back in the late Seventies made intuitive sense to me — I disliked the didacticism of Billy Bragg or Crass, and could stomach even less the critics who pretended to be revolutionaries, etc. There was far more truth in an August Darnell ballad, I came to believe, than in the entire socialist posturing of, say, the Gang of Four or Robert Christgau.
There it is: THAT opposition — Bragg versus Darnell — was the problem of the mid-to-late Eighties. As soon as it was a question of dour meat and potatoes no fuss empiricism (left) versus bright and brash hedonism (right), there was no longer any real choice. The sublime had been extirpated, and what remained was a quotidian cavilling against the wipe-clean sheen of the mall.
It’s telling that Scritti’s “Confidence” (“Outside the clubs of boyhood/ Inside monogamy”) should be so preoccupied with the problems of “being a man”, and what that entailed.
The interpellated subject of the Lad magazine is the supposed “real person” (=slavering, sloppy andro-Id) beneath the pretence of social politesse. The Lad magazine addresses this “authentic id” with the leering superegoic injunction to enjoy. “Go on, admit it, you don’t want to be bothered to cook, all you want is a fishfinger sandwich… Go on, admit it, you don’t want to be bothered to talk to a woman, have a wank instead…” The fact that this reduction is possible means that lads implicitly accept the Lacanian notion that phallic jouissance involves masturbation with a “real” partner. It also indicates that laddishness is more defined by a propensity towards depressive indolence than it is by any lasciviousness. What laddism attempts is a short-circuiting of desire (yes, I know that the “inhuman partner” of desire cannot be attained, just give me pictures of girls next door instead).
From “Skank Bloc Bologna” (“an imaginary network of dissidents stretching from Jamaica to Bologna’s anarchist squatters”) to the Streets (Lads of the world unite to raise a glass to lachrymose lariness); from “The Sweetest Girl” to Abi Titmuss…
Long past time that we roused from this slumber… Especially when, with habeas corpus suspended and mainstream political parties all but burning down gypsy camps, one of the things that makes the early Scritti so contemporary is that their conjecture/fear that It (fascism) Would Happen Here, their “très 1979 paranoia”, is suddenly, alarmingly très 2005:
Rise, like lions after slumber,
In unvanquishable number,
Shake your chains to earth like dew,
Which in sleep had fall’n on you.
the outside of everything now
A week dominated in every way by Simon Reynolds’ Rip It Up and Start Again,[2] and rightly so.
Perhaps the best tribute you can pay to the book is that it makes you positively look forward to train and bus delays, to any moment when you can return to feed the hunger, scratch the itch…
The size of the crowd at the Boogaloo event on Wednesday, but, more than that, a certain sense of ferment in the atmosphere, testified to the fact that this is something more than a book. Stirring up the ghost of post-punk cannot but be an act, an intervention in cultural politics — since post-punk not only judges contemporary pop culture (harshly), it brings back the legitimacy, the necessity of being judgemental, of having some criteria (non-musical criteria, non-hedonic criteria) for enjoyment. Such a position is not repressed by contemporary pop culture (=the cultural logic of late capitalism), it is made unthinkable by it.
Something in Paul Morley certainly seemed to wake up on Wednesday. (And something in us?…)
A certain Morley was knowingly complicit in the termination of post-punk — as Simon wryly reminded him when, after Morley had fulminated against the facile notion that the worth of a pop record is determined by its popularity, he asked him: “but didn’t that idea come from you?” It’s not accidental that, grotesquely but inevitably, Morley’s early-Eighties pop(ul)ist stance should have inspired some NME readers to turn towards neo-conservatism. In retrospect, it’s possible to see the turn to popism as the beginning of a giving voice to a creeping disappointment which spread slowly, insidiously yet incrementally during the period until almost everything of post-punk — even the traces — was disappeared (in the way that political prisoners are). The disappearing trick was almost complete when the Pod-Zombie duplicates started to arrive a few years ago, formally perfect copies mass-produced by Kapital.
It’s easier to see now than it was at the time the extent to which the cultural artefacts — and the discourse surrounding them — produced in the wake of post-punk were being programmed by resurgent Kapital. A certain notion of realism began not only to prescribe what could now happen, but to airbrush out what had actually happened. The idea that pop could be more than a pleasant divertissement in the form of an easily consumable commodity, the idea that popular culture could play host to concepts that were difficult and demanding: it wasn’t sufficient to disavow these possibilities, they must also be denied. Operation Amnesia, Pacification Program: it never happened did it, it was a delusion, a folly of youth, and we’re all grown up now…
Naturally, Morley’s railing against amateurism, his advocacy of ambition and lushness, play rather differently in 2005 than they did in the early Eighties, but that’s only fitting, since his manifestos-as-works-of-art-in-themselves were produced as strategic provocations rather than timeless aesthetic philosophies. Even though the Morley of the disappointing Words and Music claimed Noughties web popists as his offspring, it’s hard to imagine the Morley and Penman of 1981 being gratified by the thought that their legacy would be the de-conceptualisation and de-politicisation — i.e. the consumerisation — of pop. They could scarcely have imagined, then, the way in which pop would de-speed over the next twenty years, that their embrace of Entryism would prove to be the last word in rough-and-tumble theoretical dialogue that seemed, then, as if it could go on forever.
Reading Rip it Up is like re-living my early pop life — but now at a distance, like Spider in Cronenberg’s film, an adult at the corner of the screen watching himself as a child. With Simon as my Virgil through that Paradiso lost, I can now recognise that pop for me was post-punk — Kings of the Wild Frontier was the first LP I bought and ABC were the first group I saw live. But Rip It Up makes me cognizant of what I, growing up absurd into post-punk, couldn’t have appreciated at the time: that the richness of pop then — not only sonically, but also in terms of concepts, clothes, images — lasted only a relatively short period, made possible by specific historical contingencies.
Nevertheless, expectations were raised in me, and more or less everything I’ve written or participated in has been in some sense an attempt to keep fidelity with the post-punk event. Cyberpunk — both in its restricted literary generic sense and in the broader sense we have given to it in Ccru — was up to its neck in post-punk. Gibson’s debt to Steely Dan and the Velvet Underground has long been acknowledged, but the dominant tone of Neuromancer was an overhang from post-punk. Gibson named his high-tech prostitutes after the Meat Puppets, but Neuromancer’s technihilistic ambience, dub apocalypticism, amphetamine-burned-out Cases and hectic, twitching finger-on-fast-forward and comatone-cut-out narrative, seem to be transposed straight out of the British post-punk scene.
One of the things that is most remarkable about post-punk, actually, is its near total erasure of America and Americanness. When I was in my early teens, the only American pop you’d hear that wasn’t disco would be encountered while trudging round the shops on Saturday afternoon, as Paul Gambaccini’s Hot 100 was broadcast over the store PAs, and it was a window into a horrifyingly deprived world of barely imaginable banality.
Of the few American groups of any significance in this period, perhaps only Devo and the Meat Puppets took much inspiration from the American landscape (in Devo’s case of course, the US was processed as a thoroughly artificial PKD-US-trash heap of post-industrial detritus). No wave emerged from the rootless cosmopolitanism and transnational nihilism of New York, while in many ways the most interesting American groups — Tuxedomoon and the Residents — were Europhiles. In post-punk, America increasingly featured as a series of ethnographic traces — as in the ecstatic, hysterical and authoritarian ghost chatter of Amerikkkan TV and media flittering through Cabaret Voltaire’s Voice of America or Byrne and Eno’s My Life in the Bush of Ghosts.
It’s hard to remember now, but in the period after Vietnam and before the collapse of the Eastern Bloc, America was a paranoid and enfeebled nation, Nixon-sickened and introspective, scared of its own shadows. Post-punk was there to witness — and mock — the seeming absurdity of the idiot actor Reagan being wheeled on to give America’s confidence a shot in the arm, although initially even Reagan’s rise to power seemed to be a kind of sinister post-punk prank, since it made eerily real what had been predicted by perhaps post-punk’s most important influence, Ballard. (In the States, Ballard’s Atrocity Exhibition was re-titled Love and Napalm: Export USA, and that novel — so omnipresent in post-punk production — was a kind of simultaneous observation of the way in which Britain was being turned into an LA of ubiquitous advertising hoardings as well as a British view of the US.) By the time that post-punk went out in a neon blaze of irony-tainted glory on MTV, the joke had, to say the least, worn thin. Pop had gone Blue-Gene American rock, again (I still remember the barely comprehending horror I felt when the NME started to give covers to the t-shirt and jean-clad Springsteen; worse was to follow, with the likes of The Long Ryders). Boredom was back, but this time, without the punks to denounce it. The arid shopping mall at the end of history opened up as the only possible future. Worse than the career opportunities that never knocked were the ones that did: jobs for everyone in the striplit wall-to-wall mart of Time Out of Joint America in which it is 1955, forever… No shadows to hide in… No room to move, no room to doubt…
Ironic in some ways that Rip it Up should be named after an Orange Juice song, since Orange Juice and Postcard were responsible for what was in many ways a British equivalent of Springsteen’s US return-to-roots. If the comparison seems strained, think about the way in which both Springsteen and Orange Juice self-consciously advocated a kind of locally-rooted authenticity defined by its rejection of artificiality. For Springsteen’s reich and roll uniform of denim, substitute OJ’s Brideshead Revisited sweaters. Like the Smiths, the Postcard-era Orange Juice retrospectively imagined a British pop-that-never-was. The Brit equivalent of American open-throated stridency was a kind of floppy-fringed, tongue-tied dithering that was just as much of a self-conscious reclaiming of signifiers of national identity as Springsteen’s passional working stiff poses were. (Is it too fanciful to hear in the early Orange Juice an anticipation of Hugh Grant’s unbearable foppery and faffing?)
By the time I got to university in 1986, Orange Juice, and the Smiths, had achieved hegemonic control of the undergraduate “imagination”. It was perfect pop for young men who were destined to go on to careers in marketing but who liked to think of themselves as “sensitive”. Orange Juice also played a major part in rehabilitating the love song. If romance featured in post-punk at all, it was as something to be derided and demystified (as in the Slits’ “Love Und Romance” or Gang of Four’s “Love Like Anthrax”) or as something to be politically and theoretically interrogated a la Scritti or Devoto. The renewed preoccupation with love was a re-occupation of “the ordinary”, a re-statement of a revivified humanist confidence in a dehistoricised continuity of “things that go on the same”.
It’s often said that punk was what Britain had instead of 68, but that in many ways fails to process how punk had surpassed the events in Paris. 68 was as much a rejection of certain theoretical positions as it was of the institutions of modern liberal society so that, in the conflagration of the Sixties “Desirevolution”, the cold Spinozism of Althusser’s structural analysis was burned down with the buildings. Punk and post-punk, however, were profoundly suspicious of the Dionysian triumvirate of leisure, pleasure and intoxication, so that the required attitude was one of vigilant hyperrationalism, a kind of popularised Althusserianism in which interiority was exposed as an ideological bluff, and emotions were understood not as “real expressions of authentic subjectivity” but as structurally engineered reactive circuitries. The stance such a perception demanded — and this was a culture that was deliberately and unashamedly demanding — was one of “proletarian discipline” rather than slack indulgence, its puritanism recalling the egalitarian social ambitions of the original Puritans. In this respect, Scritti’s move from pleasure-repudiating Marxism to “playful” deconstruction is emblematic of the way in which the decade would develop, in universities as much as in the charts. The exorbitant surfaces of Cupid and Psyche might have eschewed interiority, but at the same time their simulations of interiority were no less authentic, no less soulful, than other versions of interiority purveyed by more credulous, non-ironic sources in the mainstream. The person being duped now was the Green who imagined that his intelligence would prevent full incorporation.
But the triumphant capitalism Green was already working for had no trouble at all in consuming those who sought entry into it. In the Seventies, in an effort to dispel the notion that there were “subversive regions” that would be inherently indigestible for capital, Lyotard compared capitalism to a “Tungsten-carbide stomach” that could consume anything in its path. By the Eighties, as Jameson has observed, Kapital had become a gigantic interiority without any outside: a kind of jaded pleasuredome reminiscent of the all-encompassing bubble environments imagined in Seventies SF. Except it looked, for all the world, just like a familiar domestic environment: the nice house, nice family set-up ridiculed by Jamie Reid, now refurbished with added ironic distantiation and hooked up to twenty-four-hour MTV. What had been lost was the “glam knowledge” that first entered pop through Pop Art: that the social scene is a stage set populated by puppets cornfed cheap dreams and sedated by narcotics of every kind. The punks knew they were replicants; that everything that seemed to be inside was bio-psycho-social machinery that should be re-programmed or stripped out. The end of punk was the forgetting that the memories were false, that the domestic scene was so much pasteboard and image virus.
At the time of post-punk, pop could still be a counter-cultural lab (endlessly raided by, but never subordinated to the diktats of, Kapital). It really is not clear whether pop could be that again. Someone asked the panel on Wednesday if dredging post-punk up was an exercise in nostalgia. But this is entirely to miss the point of Jameson’s critique of the nostalgia mode. For Jameson, the nostalgia mode is exemplified by cultural artefacts which deny, or more radically, are unaware of their own total debt to the past. In other words, being contemporary does not guarantee being modern, especially not in a postmodern culture whose temporality is obsessively citational and commemorational. One of the most idiotic tics in cultural gatekeeping today is its need to justify the past in terms of the present: as if Gang of Four were only significant because they “influenced” no-mark, here-today boot-sale-tomorrow clones like Bloc Party and Franz Ferdinand. As if simply being here, now, meant that something New and Important is happening…
Pop could function differently in post-punk because, at that time, it was the space which most readily lent itself to the production of a counter-consensual collectivity. Post-punk was an awakening from Kapital’s “consensual hallucination”, a means of channelling, externalising and propagating disquiet and discrepancy. It provided a crack in the way the social represented itself; or rather, exposed that crack. What the social would have us believe is dysfunction, grumbling, failure suddenly became the sound of the “outside of everything”. Records, interviews, the music press, were the means by which contact could be made between affects, concepts, commitments that would previously have been locked into private space.
Some of the panel last Wednesday were unsure if they had really done anything, if their dreams of doing something more than simply entertaining were anything more than youthful naïveté, understandable then, an embarrassment now. But the achievements of post-punk can be appreciated, negatively, in what culture now lacks. Go into a roomful of teenagers and look at their self-scarred arms, the anti-depressants that sedate them, the quiet desperation in their eyes. They literally do not know what it is they are missing. What they don’t have is what post-punk provided… A way out… and a reason to get out…
So is this a counsel of despair?
Not at all.
There are new means for producing counter-consensual collectivity.
Like this.
The web has a distributional reach, a global instantaneity, whose unprecedented scale is easy to take for granted. But its vast potential far outstrips anything that fanzines or records could have achieved in the Seventies. What needs to happen is a kind of “existential reframing”: to see what happens here not as Kapital wants us to see it, as “failed” writers resentfully carving out some insignificant niche because they can’t “make it” in the overlit interior. The logic of Kapital insists that anything that is not reproducing it, or serving such a reproduction, is a waste of time. But to reframe what is happening would be to radically reverse these idiotic priorities. And the continuing relevance of post-punk is to remind us that such reversals are possible, to provide the impetus for the development of a (punk) will to retake the present…
for your unpleasure: the hauter-couture of goth
Ridiculed, forgotten, yet subterraneanly robust, goth is the last remnants of glam in popular culture.
Goth is also the youth cult most associated with women and with fiction. This is hardly surprising. As I have pointed out before[2] and is well known, the novel has its origins in “Gothic romances” which were predominantly consumed and produced by women, and the complicity of women with the Gothic has been a commonplace of literary criticism at least since Ellen Moers wrote her classic essay, “Female Gothic” in 1977.
Why think about goth now?
Partly it is because goth’s preposterous trash-aristocratic excess couldn’t be more at odds with contemporary culture’s hip-hop-dominated sportswear brutilitarianism. At the same time, though, goth’s shadow seems unusually visible in pop culture at the moment, what with references to it in both Coronation Street (“you’re not even a proper goth!”) and Big Brother (“what is a gothic? Can you make me into one?”).
Partly it is because Rip It Up has revived fascination with all things post-punk, and goth is the last surviving post-punk cult. These two facts have resulted in I.T.[3] and me seceding from the oppressive masculinist cool of the club into the more congenial cold of goth haunts.
Goth has its own version of more or less every other youth culture (hence there’s techno goth, industrial goth, hippie goth…) But let’s leave aside the male abjects (the Cramps, the Birthday Party), the po-faced (the overwrought white dub of Bauhaus) and the PoMo (the Sisters of Mercy, who from the start traded in a self-conscious meta-goth), and start with Siouxsie.
It is well-known that the Banshees were formed as a result of the future Siouxsie and Severin meeting at a Roxy show in 1974. (This fact was repeated in this really rather bizarre piece[4] on Roxy in last Friday’s Guardian, which pursues the postmodern rock critical trend to equate “importance” with “influence” far past the point of self-parody, relegating actual discussion of Roxy’s output to a paragraph or so before launching into a survey of groups they inspired.) So, unlike the Birthday Party, who were famously disgusted when they arrived in London to find it dominated by new romantic poseur-pop, the Banshees belonged to an art pop lineage which had a relationship to music which was neither ironically distant nor direct. For all their inventiveness, for all the damage they wreaked upon the rock form, the Birthday Party remained Romantics, desperate to restore an expressive and expressionistic force to rock; a quest which led them back to the Satanic heartland of the blues. (If women want to understand what it is like to be the afflicted subject of male sexuality — I wouldn’t necessarily advise it — there’s no better fast-track to “what’s inside a boy” than the Birthday Party’s “Zoo Music Girl” or “Release the Bats”). By contrast with this carnal heat, the early Banshees affected a deliberate — and deliberated — coldness and artificiality.
Siouxsie came from the art rock capital of England — that zone of South London in which both David Bowie (Beckenham) and Japan (Catford, Beckenham) grew up. Although Siouxsie was involved with punk from the very beginning, and although all of the major punk figures (even Sid Vicious) were inspired by Roxy, the Banshees were the first punk group to openly acknowledge a debt to glam. Glam has a special affinity with the English suburbs; its ostentatious anti-conventionality negatively inspired by the eccentric conformism of manicured lawns and quietly-tended psychosis Siouxsie sang of on “Suburban Relapse”.
But glam had been the preserve of male desire: what would its drag look like when worn by a woman? This was a particularly fascinating inversion when we consider that Siouxsie’s most significant resource was not the serial identity sexual ambivalence of Bowie but the staging of male desire in Roxy Music. She may have hung out with “Bowie boys”, but Siouxsie seemed to borrow much more from the lustrous PVC blackness of For Your Pleasure than from anything in the Thin White Duke’s wardrobe. For Your Pleasure songs like “Beauty Queen” and “Editions of You” were self-diagnoses of a male malady, a specular desire that fixates on female objects that it knows can never satisfy it. Although she “makes his starry eyes shiver”, Ferry knows “it never would work out”. This is the logic of Lacanian desire, which Alenka Zupančič explains as follows: “The […] interval or gap introduced by desire is always the imaginary other, Lacan’s petit objet a, whereas the Real (Other) of desire remains unattainable. The Real of desire is jouissance — that ‘inhuman partner’ (as Lacan calls it) that desire aims at beyond its object, and that must remain inaccessible.”[5]
Roxy’s “In Every Dreamhome a Heartache” is about an attempt, simultaneously disenchanted-cynical and desire-delirious, to resolve this deadlock. It is as if Ferry has recognised, with Lacan, that phallic desire is fundamentally masturbatory. Since, that is to say, a fantasmatic screen prevents any sexual relation so that his desire is always for an “inhuman partner”, Ferry might as well have a partner that is literally inhuman: a blow-up doll. This scenario has many precursors: most famously perhaps Hoffmann’s short story “The Sandman” (one of the main preoccupations of Freud’s essay on “The Uncanny” of course), but also Villiers de l’Isle-Adam’s lesser-known but actually more chilling masterpiece of decadent SF, The Future Eve, and its descendant, Ira Levin’s The Stepford Wives.
If the traditional problem for the male in pop culture has been dealing with a desire for the unattainable — for Lacan, remember, all desire is a desire for the unattainable — then the complementary difficulty for the female has been to come to terms with not being what the male wants. The Object knows that what she has does not correspond with what the subject lacks.
It is almost as if the female goth response to this dilemma is to self-consciously assume the role of the “cold, distanced, inhuman partner” (Žižek) of phallic desire. The glam male remains trapped in his perfect penthouse populated by dumb fantasmatic playdolls; the goth female meanwhile roams through the roles of vamp and vampire, succubus, automaton. The glam male’s pathologies are those of the subject; the goth female’s problematic is that of the object. Remember that the original sense of glamour — bewitchment — alludes to the power of the auto-objectified over the subject. “If God is masculine, idols are always feminine”, Baudrillard writes in Seduction, and Siouxsie differed from previous pop icons in that she was neither a male artist “feminised” into iconhood by fan adoration, nor a female marionette manipulated by male svengalis, nor a female heroically struggling to assert a marginalised subjectivity. On the contrary, Siouxsie’s perversity was to make an art of her own objectification. As Simon and Joy put it in The Sex Revolts, Siouxsie’s “aspiration [was] towards a glacial exteriority of the objet d’art” evinced through “a shunning of the moist, pulsing fecundity of organic life.”[6] This denial of interiority — unlike Lydia Lunch, Siouxsie is not interested in “spilling her guts”, in a confessional wallowing in the goo and viscera of a damaged interiority — corresponds to a staged refusal to be “a warm, compassionate, understanding fellow-creature” (Žižek). Like Grace Jones, another who made an art of her own objectification, Siouxsie didn’t demand R.E.S.P.E.C.T. from her bachelor suitors (with the implied promise of a healthy relationship based on mutual regard) but subordination, supplication.
(The goth male is all too ready to comply, although — as Nick Cave’s compulsively repetitious career has graphically demonstrated — snivelling prostration may well only be the prelude to homicidal destruction. Grovelling in front of the Ice Queen — “I kiss the hem of her skirt” — the goth male is neither object nor subject but — famously — abject. The best image of this idiot lust is the slavering, pustulant monstrosity on the cover of the Birthday Party’s Junkyard, and their “Release the Bats” — a song the group came to despise because they thought it might result in their being pigeonholed as generic goth — remains the most pulsingly compulsive dramatisation of the goth abject surrendering himself to the Object of his quivering desire. Cave oscillates between worshipping his lady’s femmachinic hauteur — “my baby is a cool machine”, “she moves to the pulse of the generator” — and pruriently drooling over the “filth” of her flesh — “she doesn’t mind a bit of dirt”. This conforms almost perfectly with Lacan’s description of the courtly lady, whose cold abstraction is not defined by opposition with smelly physicality. Cave’s abject is unable to give up on his desire, and the result is well-known: in order to continue to desire the woman, he must ensure that he cannot possess her, “so that l’il girl will just have to go”. Only when he has made her as cold and unyielding as Ferry’s “perfect companion” or Poe’s parade of beautiful cadavers, can his desire be extended “to eternity”, because then it is rendered permanently incapable of satisfaction.)
Instead of asserting an illusory “authentic subjectivity” which supposedly lies beneath the costumes and the cosmetics, Siouxsie and Grace Jones revelled in becoming objects of the gaze. Both would no doubt have appreciated the derision Baudrillard poured upon the strategy of unmasking appearances in Seduction: “There is no God behind the images and the very nothingness they conceal must remain a secret.”[7] Siouxsie and Jones’ embracing of their objectality testifies to the fact that there is a scopic drive that cult studs whining about “being reduced to an object” has always ignored: the exhibitionist drive to be seen.
Simon is right that “Painted Bird” (from the Banshees mistresspiece, A Kiss in the Dreamhouse) and the nearly contemporary “Fireworks” were “virtual manifestos for goth”, but it’s worth reflecting on how different these songs are in message and mood from the hackneyed image of the culture. Both “Painted Bird” and “Fireworks” (with its “exultant image of self-beautification as a glam gesture flashing amid the murk of mundanity”) are not maudlin, matt black or self-absorbed, but celebrations of the colourful and the collective. “WE are fireworks”, Siouxsie sings, “burning shapes into the night”, and you’d be hard pressed to find a song that crackles with so much enjoyment as this. The Banshees’ take on Kosiński’s novel The Painted Bird is also about the triumph of collective joy over persecuted, isolated, individuated subjectivity. In Kosiński’s novel, the hero paints one bird and when he throws it back to its flock they don’t recognise it and therefore destroy it. But Siouxsie’s goths are not painted by another’s hand; they are “painted birds by their own design”. It is not the familiar tragic-heroic scenario in which an outsider, destined to lose, nevertheless makes a solitary stand against the conformist herd. The “dowdy flock” are to be “confounded”, but by another flock, not by an individual, and the result is not frustration, but, again, jouissance — by the end of the song, “there’s no more sorrow”.
Think how different this is to the confederacy of isolation produced by Joy Division, whose functional clothes and “non-image” implied the traditional male subjectivist privileging of the inside over the outside, depth over surface. Here was one type of “black hole”: the “line of abolition” Deleuze and Guattari describe in “Micropolitics and Segmentarity”,[8] the drive towards total self-destruction. The Banshees, on the other hand, were more like the “cold stars” invoked by Neubauten: forbidding, remote, yet also the queens of a paradoxically egalitarian aristocracy in which membership was not guaranteed by birth or beauty but by self-decoration. Siouxsie’s hyper-white panstick radiated the “cold light” of stardom Baudrillard invokes in Seduction. Stars “are dazzling in their nullity, and in their coldness — the coldness of makeup and ritual hieraticism (rituals are cool, according to McLuhan).”[9]
“The sterility of idols is well-known”, Baudrillard continues, “they do not reproduce, but arise from the ashes, like the phoenix, or from the mirror, like the seductress”. The Gothic has always been about replication as opposed to reproduction. It’s no coincidence that the female vampire was often associated with lesbianism (most gloriously in what is perhaps the definitive goth film, The Hunger) because vampires and lesbians (like machines) present the horror (from the point of view of the phallic One) of a propagative power that has no use for the male seed. Conversely, “female Gothic” often pathologises pregnancy, utilising the language of horror to describe the gradual take-over of the body by an entity that is both appallingly familiar and impossibly alien. “We Hunger” from the Banshees’ Hyaena, with its “horror of suckling”, fits into a lineage of female horror which has seen “pregnancy in terms of the appalling rapacity of the insect world”, as a “parasitic infestation”.[10]
The principal goth vectors of propagation are, of course, signs and clothes (and — clothes as signs). The Siouxsie Look is, in effect, a replicatable cosmetic mask — a literal effacement of the organic expressivity of the face by a geometric pattern, all hard angles and harsh contrasts between white and black. White tribalism.
In Rip It Up, Simon says that the early Banshees were “sexy in the way that Ballard’s Crash was sexy”, and Ballard’s abstract fiction-theory is as palpable and vast a presence in the Banshees as it is in other post-punk. (It’s telling that the turn from the angular dryness of the Banshees’ early sound to the humid lushness of their later phase should have been legitimated by Severin’s reading of The Unlimited Dream Company.) But what the Banshees drew (out) from Ballard was the equivalence of the semiotic, the psychotic, the erotic and the savage. With psychoanalysis (and Ballard is nothing if not a committed reader of Freud), Ballard recognised that there is no “biological” sexuality waiting beneath the “alienated layers” of civilisation. Ballard’s compulsively repeated theme of reversion to savagery does not present a return to a non-symbolised bucolic Nature, but a fall back into an intensely semioticised and ritualised symbolic space. (It is only the postmoderns who believe in a pre-symbolic Nature.) Eroticism is made possible — not merely mediated — by signs and technical apparatus, such that the body, signs and machines become interchangeable.
Baudrillard understood this very well, in his post-punk era essay on Crash:
Each mark, each trace, each scar left on the body is like an artificial invagination, like the scarifications of savages […]. Only the wounded body exists symbolically — for itself and for others — “sexual desire” is never anything but the possibility bodies have of combining and exchanging their signs. Now, the few natural orifices to which one normally attaches sex and sexual activities are nothing next to all the possible wounds, all the artificial orifices (but why “artificial”?), all the breaches through which the body is reversibilised and, like certain topological spaces, no longer knows either interior or exterior […] Sex […] is largely overtaken by the fan of symbolic wounds, which are in some sense the anagrammatisation of the whole length of the body — but now, precisely, it is no longer sex, but something else […] The savages knew how to use the whole body to this end, in tattooing, torture, initiation — sexuality was only one of the possible metaphors of symbolic exchange, neither the most significant, nor the most prestigious, as it has become for us in its obsessional and realistic reference, thanks to its organic and functional character (including in orgasm).[11]
As is well-known, female dis-ease in capitalism is often expressed not in an assertion of the “natural” against the artificial, but in the anti-organic protest of eating disorders and self-cutting. It’s hard not to see this — as I.T. following Žižek does — as part of the “obsession” with “realistic reference”, an attempt to strip away all signs and rituals so as to reach the unadorned thing-in-itself. Goth is in many ways an attempt to make good this symbolic deficit in postmodern culture: dressing up as re-ritualisation, a recovery of the surface of the body as the site for scarification and decoration (which is to say, a rejection of the idea that the body is merely the container or envelope for interiority). Take goth footwear. With their flagrant anti-organic angularity, their disdain for the utilitarian criteria of comfort or functionality, goth shoes and boots bend, bind, twist and extend the body. Clothing recovers its cybernetic and symbolic role as a hyperbolic supplement to the body, as that which destroys the illusion of organic unity and proportion.
it doesn’t matter if we all die: the cure’s unholy trinity
“Goth took hold as both a suburban and provincial cult, in which young men and women with heavily powdered faces, mourning clothes and Robert Smith’s hairstyle could be seen at domestic ease in towns like Littlehampton and Ipswich.”
— Michael Bracewell, England is Mine: Pop Life in Albion from Wilde to Goldie[2]
Any discussion of goth will remain incomplete if it doesn’t deal with the Cure.
Goth and the suburbs enjoy a peculiar intimacy (no one knows this more than Tim Burton, whose Edward Scissorhands brilliantly laced the Avon scent of the suburbs with the perfume from goth’s flowers of romance), and is there a group more suburban than the Cure? In England is Mine, Michael Bracewell made much of their origins in humble Crawley. “Quiet and respectable, yet lacking the bourgeois superiority of nearby Haywards Heath (home of Suede), Crawley is a near perfect example of England at its least surprising”, he wrote.[3] For Bracewell, the group are the sound of the in-between spaces of English culture: the suburbs, yes, but also, adolescence, the suburbia of the soul. The Cure are the personification of the not-quite and the not-yet: not quite execrated but never really respected; not punk veterans but not yet generic Goff. The suspicion that has dogged them is that of fakery; yet inauthenticity — as existential condition — was the Cure’s stock-in-trade. You can hear it all in the grain of Robert Smith’s voice. Bracewell again:
When Smith sang, it wasn’t so much his doom-laden lyrics as the actual sound of his voice which lent the Cure their mesmeric monotony: it was the voice of nervous boredom in a small town bedroom, muffled beneath suffocating layers of ennui. Alternately peevish and petulant, breathless with anguish or spluttering with incoherent rage, Smith’s voice was unique in making monotony malleable.[4]
There is a period, a moment, when groups become what they are. Everything that has come before is preparation and rehearsal; everything that comes after is either decline or evasion. Roxy were themselves immediately — the band-brand established with the first notes of “Remake, Remodel” (with the result that Ferry’s subsequent career has been a long essay in disappointment and deferred return), but it’s more usual for a group to take a while to find themselves; to emerge gradually from a cocoon of allusion, homage and plagiarism. It wasn’t quite like that with the Cure, whose best work was always produced in negotiation with their influences.
Their early mode — a spidery, punk-spiked pub sub-psychedelia — now sounds like a series of thin sketches. The Cure become themselves in that moment — lasting three albums — after they have shed the petulant quirkiness of Three Imaginary Boys but before they have entered the comfort zone of branded recognisability. By then, Smith’s panto-persona — lipstick smear, warm beer and Edward Lear — had become an archetype in the semiotic cemetery of the student disco, and the parameters of the Cure’s style were well-established — marked by what quickly became a regular oscillation between a post-Sgt. Pepper jollity and a slippers-comfortable despair. All of the drama of faltering self-discovery and existential experimentalism that makes the essential triptych of Seventeen Seconds, Faith and Pornography so compelling has gone.
The Cure’s three crucial albums emerged from the shadow of two other bands, whose reputation towered above theirs: the Banshees and Joy Division. Smith made no secret of his fixation on the Banshees (with whom he would later guest as a guitarist). When the band’s first bassist, Michael Dempsey, left the band, it was because he “wanted us to be XTC part 2”, whereas Smith “wanted us to be the Banshees part 2”.
Robert Smith’s look — that clown-faced Caligari ragdoll — was a male complement to Siouxsie’s. And as with Siouxsie’s, Smith’s bird’s nest backcomb, alabaster-white face powder, kohl-like eyeliner and badly applied lipstick is easily copied; a kit to be readily assembled in any suburban bedroom. It was a mask of morbidity, a sign that its wearer preferred fixation and obsession above “well-rounded personhood”.
Goth morbidity arose in part from a Schopenhauerian scorn for organic life: from goth’s perspective, death was the truth of sexuality. Sexuality was what the ceaseless cycle of birth-reproduction-death (as icily surveyed by Siouxsie on Dreamhouse’s “Circle Line”) needed in order to perpetuate itself. Death was simultaneously outside this circuit and what it was really about. Affirming sexuality meant affirming the world, whereas goth set itself, in Houellebecq’s marvellous phrase, against the world and against life. By the early Eighties, it was possible to posit a rock anti-tradition that had similar affiliations, an anhedonic, anti-vital rock lineage that began with the Stones — with the neurasthenic Jagger of “Paint it Black” rather than the cloven-hooved demonic-Dionysus of “Sympathy for the Devil” — and passed through the Stooges and the Pistols, before reaching its nadir-as-zenith in Joy Division. But goth suspected that rock was always and essentially a death trip. This was the gambit of the Birthday Party, who hunted rock’s mythology back to the fetid, voodoo-stalked crossroads and swamplands of the delta blues. After all, isn’t blues the clearest possible demonstration of the discrepancy between desire and enjoyment, and therefore of the validity of the theory of the death drive? The blues juju — or jou-jou — relies upon the enjoyment of desires that cannot be satisfied.
While the Birthday Party literalised the return to the blues — their career a kind of hectic rewind of rock history, beginning with Pere Ubu/Pop Group modernism and ending in a feverish re-imagining of blues — the Cure, like the Banshees, went to the other extreme. Maintaining fidelity to post-punk’s modernist imperative (novelty or nothing), they preferred a sound that was ethereal rather than earthy, artificial rather than visceral. You can hear this in Smith’s guitar, which, swathed in phasing and flange, desubstantialised and emasculated, aspires to be pure FX denuded of any rock attack. (Is this the first step towards MBV’s honeyed amorphousness?) The Cure’s version of blues enjoyment-in-the-frustration-of-desire is auditioned in “A Forest”: the song in which the group find themselves, ironically, since it is a song about loss — or rather about an encounter with what can never be possessed. “The girl was never there”, Smith sings, a line worthy of Scritti — or Lacan. “Running towards nothing. Again and again and again…”, Smith — a suburban Scotty seeking his Madeleine — pursues the desire-chimera, the petit objet a, through a dreamscape vividly sound-painted by oneiric synthesizers, drum-machines and Smith’s FX-saturated guitar. “A Forest” was the trailer for Seventeen Seconds, and it turned out to be the album’s centrepiece. The synthesizers and the drum machine bring a moderne sheen lacking on the no-frills hustle and bustle of Three Imaginary Boys. Smith was listening to Astral Weeks, Hendrix, Nick Drake, Bowie’s Low, and wanted the album to be a synthesis of the four. The result was both more and less than this. As English as the Smiths would be, but, naturally, much more modernist and much less kitchen sink, Seventeen Seconds puts one in mind of a deserted country house, vast white spaces and empty floorboards decorated by the ornate cobwebs of Smith’s guitar. Emotionally, the effervescent petulance of the first album has drained away, but, even if the predominant mood is now moroseness, it is not yet goth-morbid. But there is a kind of cultivated detachment, Smith assuming an “ostentatious absenteeism”, dissociating himself from an everyday life conceived of as a dramaturgy of effigies: “it’s just your part/in the play/for today…”
“I was 21”, Smith told Uncut in 2000, “but I felt really old. I actually felt older than I do now. I had absolutely no hope for the future. I felt life was pointless. I had no faith in anything. I just didn’t see there was much point in continuing with life. In the next two years, I genuinely felt that I wasn’t going to be alive for much longer. I tried particularly hard to make sure I wasn’t.” From its very first moments, Faith locks onto this hollow-eyed bleakness, and stays there. Affectively, the album is as improbably unwavering as Unknown Pleasures and Closer, and the Joy Division (anxiety of) influence hung over Faith like an acrid pall, the black source of its paradoxically entropic energy, what made it possible but also what would relegate it to the status of a revenant. “The whole thing was reinforced by the fact that Ian Curtis had killed himself,” Smith recalled in the Uncut interview, speaking for the post-Joy Division generation (which would of course include New Order) that would deem itself inauthentic simply by dint of the fact that it had carried on living. “I knew that the Cure were considered fake in comparison, and it suddenly dawned on me that to make this album convincing I would have to kill myself. If I wanted people to accept what we were doing, I was going to have to take the ultimate step.”[5]
Yet Faith would have benefited from pursuing its emotional monotone even more assiduously, if what adrenaline remained had been drained away, and the two up-tempo tracks (“Primary” and “Doubt”) had been excised. On all the other tracks, Faith flatlines pop, bringing it as close to complete stillness as it is possible to be without coming to the grinding halt the group had sung of in an earlier, much more fleet-of-foot incarnation. There was no calmness in Faith’s stillness. It is not tranquil, but tranquillised, downer-heavy; not so much oceanic as waterlogged, swamped. (In fact, Faith was recorded on coke, not tranquillisers.) The album seems to come from another planet where gravity is more powerful. The synthesizers, now foregrounded more than ever, do most to produce this effect of viscous heaviness. They have a cold warmth that fills out the sound like valium entering the bloodstream. With Faith, as with downers, it is as if the edge has been taken off. Its world is without angles, a fug, fog of bleary drear. It lacks the clinical quality of Joy Division; this is not the sound of depression, nor (as with Movement) of post-traumatic stress, but of a kind of total fatalism, in which nothing much matters, where “all cats are grey”. Faith finds a strange exhilaration in yielding all hope, in playing dead while going through the zombie motions, “breathing like the drowning man”. Bracewell’s description of the Cure’s sound is nowhere more appropriate than when applied to Faith.
There is no insight or polemic: there are no messages and no rallying anthems. Rather, the Cure are the musical expression of suburbia itself: a dense and repetitious sound, carrying a mesmeric dirge of infinitely transferable sounds, all of which sound as though they could go on forever — like endless avenues, crescents and drives.[6]
Faith’s tracks are distended, hypnotic (or hypnagogic) in their repetitiousness, Smith’s mope a wraith that drifts in after introductions that typically last for ninety seconds or two minutes. Go through the mirror with Smith and what the uninitiated hear as directionless dirges become addictive plateaus, gentle blizzards you enjoy losing yourself in.
After this, you would expect recovery and return, a compensatory uplift. But in the event the Cure’s season in hell was far from over and Pornography outdoes even Faith for morbid enervation. Faith’s amorphousness is replaced, though, by a newly jagged abrasion and a jittery rhythmic urgency that was the Cure’s take on the then-fashionable tribal sound. Its template seems to be the less synthesizer-heavy, more metallic-brutalist tracks on Closer (“Atrocity Exhibition”, “Colony”); the cavernous hollow spaces of PiL; the dancing in the ruins urban anomie of Killing Joke. In the end, it sounds like “Flowers of Romance” sung by a neurasthenic rather than a hysteric, Killing Joke fed on bad trip acid and downers, a defunked 23 Skidoo, all at once.
The opener, “100 Years”, is the Cure’s masterpiece. It starts as it means to go on, Smith intoning, “It doesn’t matter if we all die”, an invitation even more forbidding than that leered by the circus barker Curtis on “The Atrocity Exhibition” (“This is the way, step inside”). Like Joy Division’s “Disorder”, “100 Years” seems to lift its head from morbid self-absorption to gaze at the world — its words a Cold War ticker-tape as filtered through an adolescent nervous system in the midst of breakdown — but in reality it only selects for consideration those things which confirm its hypothesis that cosmic despair is the only justifiable attitude. “Ambition in the back of a black car… Sharing the world with slaughtered pigs… The soldiers close in…” Smith comes on like Bowie’s Newton in the most famous scene of The Man Who Fell to Earth, entranced and stupefied by a bank of television screens, all of them bringing bad news. What makes this exhilarating rather than emiserating is the necrotic urgency of the death-disco drum machine and Smith’s guitar riff, which blazes like a distress flare in a light-polluted sky.
If Smith’s guitar on Pornography often sounds Eastern, it calls up a fantasmatic East in which all of the hippie dreams of free-your-mind exotica have been napalmed into oblivion. Pornography was famously recorded on LSD washed down by alcohol (the band would skulk in a pub waiting for the effects of the acid to wear off before they went into the studio) but it is psychedelic in the same way that Apocalypse Now is. (There are grounds for claiming that Apocalypse Now — with its warporn media overload, its schizophrenic delirium, its sense that The End is only minutes away — was the post-punk film; 23 Skidoo, for one, seemed to have emerged fully-formed from its vision.) Pornography’s delirium is a Jacob’s Ladder bad trip, a psychic Indochina fever dreamt in a Crawley bedroom, the hallucinogens giving distended and distorted shape to anxieties conjured from the suburban heart of darkness.
Smith’s lyrics shred sense for the sake of image-impact. He has always been a “purveyor of filmic ambience” (Bracewell), and the songs on Pornography convey mood through striking images (“voodoo smile… siamese twins”) that never cohere into any clear meaning. The album is the goth equivalent of a chocolate box: an exercise in sheer morbid indulgence unleavened by any cheer.
At the end of the title track, a howling grind that sounds like Joy Division’s “The Atrocity Exhibition” spliced with Stockhausen’s Hymnen, Smith seeks redemption. “I must fight this sickness… Find a cure.” But the sickness, the sickness was the most interesting thing about the Cure.
look at the light
Its cover image is a waveform of a blackbird’s song re-imagined as a geological formation. Kate Bush’s Aerial is Deleuzian MOR: a numinous, luminous twitterscape of women-animal becomings, a hymn to light, and lightness.
I’d concur with what’s already coalescing into a critical consensus: “King of the Mountain” apart, the first disc — “A Sea of Honey” — is merely an appetiser for the second CD, the sumptuous song suite that is “A Sky of Honey”.
On the face of it, for this, her return after twelve years, Bush could either make a show of pursuing Relevance a la Bowie, or Madonna, or else recline into a session-musician airbrushed “timelessness” like Bryan Ferry. In the event, she tacks closer to the second option, but with considerably more success than Ferry has mustered in any of his solo albums for the last twenty years. The sonic palette from which Bush has constructed Aerial contains few rogue elements, and hardly anything that would have discomfited a mid-Eighties audience.
And yet… “A Sky of Honey” in particular has the flavour, if not the instrumentation, of later genres. The intermittent birdsong, the lambent washes of subdued strings and synth, the shifts in atmosphere — now tranquil, now tempestuous, now humid, now temperate — recall ambient jungle (I’m put in mind more than once of Goldie’s “Mother”), the lush opiated vastness of microhouse, English pastoral techno such as Ultramarine.
It is in A Thousand Plateaus’ “Of the Refrain” that Deleuze and Guattari write of birdsong. On one side, the refrain is a territorial marker, the tracing of an interiority; on the other, it opens out into the cosmos. Aerial is similarly double: “A Sea of Honey” exploring the heimlich, “A Sky of Honey” dreaming the cosmic.
“King of the Mountain” has been one of the singles of the year — insidious and insinuating rather than immediate, a blind-side seduction which makes itself a habit before you’ve registered awareness of it. Its snow-swept eyrie contains the grandest, most elemental, rendition of the twin themes that dominate “A Sea of Honey” — domesticity and isolation. Kane in Xanadu doubles Elvis in Graceland, wind howling around the melancholy opulence of their empty mansions.
The other songs on “A Sea of Honey” retreat from these media mythscapes into more intimate territory. Bush flirts with sentimental indulgence on the song addressed to her son, “Bertie”, while meditating on the line between bliss and banality, pathos and bathos, on “How to be Invisible” and “Mrs Bartolozzi”, with their imagery of anoraks, wallflowers and washing machines.
What is fascinating about “Sea of Honey” is its exploration of the Mother’s bliss, which has by definition been excised from a history of rock that has endlessly staged the cutting of the apron strings, the rejection of the maternal. There’s something oppressive and cloying about this domestic space, something suffocating and greedily insatiable about the protected interiority Bush creates. The “domestic idyll” is literally agoraphobic, troubled by an Outside it seeks to keep at bay. “How to be Invisible” is a spell in which ultra-ordinary objects are brandished as protective charms, preservatives of a domesticity that has withdrawn from the wider social world. Yet the heimlich, the homely, is always, also, the unheimlich, the unhomely, the uncanny. In “Mrs Bartolozzi”, a widow’s solitude transforms laundry into a Svankmajer erotic dance, the boredom, loneliness and sadness of a confined mind transfiguring empty clothes into an animist memory-theatre. In these circumscribed horizons, washing the floor becomes a religious observance, an act of mourning and melancholy.
If “A Sea of Honey” is a kitchen-sink delirium, its spaces all carpeted and walled, then “A Sky of Honey” is widescreen, panoramic, as the words of the stand-out track, “Nocturn”, have it. Everything opens out. It’s as if we leave the artificial cocoon of the house to step out into the garden, a garden which becomes a lush Ernst jungle…
What impresses most about “A Sky of Honey” is the majesty of its composition. It sounds like the sort of thing Bush has done before, but there’s nothing else in her oeuvre quite so sustained as this. I mean “composition” in the painterly at least as much as the musicianly sense, for “A Sky of Honey” is Bush’s most painterly record: each sound a delicate stroke in a delicately constructed and minutely conceived picture. Van Gogh (“the flowers are melting!”), Chagall, Ernst, as much as Joyce or Brontë, seem to be the guiding hands. The painter’s medium — light — may well be “A Sky of Honey”’s principal preoccupation. The image of a pavement artist’s work destroyed by rain is central to “A Sky of Honey”: “all the colours are running”. Yet no mood of regret or melancholy can last long here; in an instant, Bush is celebrating “the wonderful sunset” that the run colours have become. Ironically for a record so artfully and fastidiously designed, so foreign to rock and jazz’s spontaneity, the message is that the Accident is the pre-eminent form of creation. We are gently urged to revel in the innocence of becoming, to “look at the light… and all the time it’s a changing…”. The record celebrates the butterfly-wing fragility of the Moment, the never-static haecceities Nature is madly composing and is composed of, the ever-evanescent iridescences of the “somewhere in-between” in which we are always lost. Between wakefulness and sleep, between land and sea, between sky and dust, between day and night, “A Sky of Honey” reaches its poised, anti-climax plateau on the last three tracks, “Somewhere in Between”, “Nocturn” and “Aerial”. By “Somewhere in Between”, we have reached dusk, the time when everything de-substantialises, and the song is a dance of dying light, a savouring of the evening’s bewitched, betwixt state. “Nocturn” is up there with anything she’s done — its oneiric, oceanic disco a kind of becalmed answer to Patti Smith’s “Horses”, the white water of Smith’s angsts and passions soothed and smoothed into a placid lake in which amphibious longings swim and commingle. “Nocturn” is a journey to the end of the night very different to the one Céline took: a Van Gogh-visionary stretching, a reaching both up into the sky and down into the sea.
The stars are caught in our hair
The stars are on our fingers
A veil of diamond dust
Just reach up and touch it
The sky’s above our heads
The sea’s around our legs
In milky, silky water
We swim further and further
We dive down… We dive down…
There are suggestions of Joyce’s Anna Livia Plurabelle here, the river heading out to the sea that will swallow it, just as the dreaming mind awakens. After this, there is the dappled return of sunlight on “Aerial”, glimmers of light on the water’s surface, “all of the birds laughing”, Bush joining in.
Magisterial, and better with every listen.
is pop undead?
If there is a current coalescence of fascination around hauntology, there is also a mounting anxiety about the death, dearth, end of pop. A few examples: this atrocious piece on the “death of black music”[2] (significant only for the statistics it cites), Simon’s 05 round-up for Frieze,[3] and a number of recent threads on Dissensus. The suspicion is inescapable: part of the reason why hauntology should appeal to us so much now is that, unconsciously, and increasingly consciously, we suspect that something has died.
Nothing lasts forever, of that I’m sure.
Announcements of the demise of pop are nothing new of course. And there are any number of reasons to be sceptical about the language of “death” and morbidity (not least because it concedes too much to the vitalist valorisations of life). The fact is that nothing ever really dies, not in cultural terms. At a certain point — a point that is usually only discernible retrospectively — cultures shunt off into the sidings, cease to renew themselves, ossify into Trad. They don’t die, they become undead, surviving on old energy, kept moving, like Baudrillard’s deceased cyclist, only by the weight of inertia. Cultures have vibrancy, piquancy only for a while. Lyric poetry, the novel, opera, jazz had their time; there is no question of these cultures dying, they survive, but with their will-to-power diminished, their capacity to define a time lost. No longer historic or existential, they become historical and aesthetic — lifestyle options not ways of life.
We are lulled into the belief that pop should be immune to this process by the illusion to which those within any culture, any civilisation, fall prey (perhaps it is a necessary illusion?): the belief that our own culture will continue forever. The question we need to ask, then, is not so much “will pop die?”, but has pop already reached the point of undeath? Has it seduced us into an entropy tango, clasping us with zombie fingers as it slowly winds down towards permanent irrelevance? Questions worth raising, if only because as soon as they are no longer raised we can be sure that pop really has reached its terminal phase.
What alarms me is the lack of alarm about pop’s current situation. Where is the chorus of disapproval and disquiet about a group like the Arctic Monkeys? Granted, it is not that the Arctic Monkeys are significantly worse than any of their retro forebears (although if anything ought to set alarm bells ringing, it is a situation where “not being worse” than mediocre predecessors is thought of as worthy of comment, still less of muted celebration). What is novel is the discrepancy between the Arctic Monkeys’ modest “achievements” and the scale of their success. Critical success is more easily bought than ever, of course, so we shouldn’t be surprised that the NME rates the Monkeys’ album as fifth best British album of all time (disgust would be a more appropriate response, actually). But such subjective and professionally expedient over-valuations would be insignificant were it not for the quantitative scale of the Arctic Monkeys’ success — fastest selling UK debut album ever! What this implies is a libidinal deficit in pop’s audience as well as in its old media commentariat — a much more worrying trend.
The Arctic Monkeys’ success is as glum news for popists as it is for those of us who still pledge allegiance to pop’s modernist tendencies. (It should be noted here that, with R&B and hip-hop faltering and stuttering, popist-approved pop has been one of the last remaining places where modernism’s guttering flame persists.) As Marcello has suggested recently over at Church of Me, the new new pop (Rachel Stevens, Girls Aloud) is barely secure, certainly not thriving, and its (relatively) disappointing sales compare ominously with the voracious triumph of retro-indie and the new authenticity (Blunt! Jack Johnson!). There has been a kind of reversal, with new new pop occupying the old pre-indie independent position of the popular-experimental, and indie dominating the mainstream. (Hence I would argue that, contra Simon in the Frieze article, it is new new pop, not some putative, ghastly fusion “of grime and indie rock”, that is today’s closest equivalent to post-punk.) A little insight into the times can be gleaned from the fact that NME has been reduced to ostentatiously banning Blunt from its awards ceremony (because there are a MILLION miles between his maudlin mumbling and that of their darlings, naturally). James Blunt versus Coldplay: is this what pop antagonism is reduced to? A pseudo-conflict that should excite only Swiftian ridicule.
Hate’s not your enemy, love’s your enemy.
Such plastic antagonisms (and NME/corporate indie can’t survive without convincing its consumers that they are an alternative to something, that there is some region of common-sense, complacent, middle of the road mediocrity that they don’t already occupy) substitute for the real antagonisms that once sustained pop. Even the most ardent devotees must sense something is missing — there’s just a hint in Doherty’s puppy dog junky eyes that even he recognises the sad fact that even if he dies, it won’t stop being pantomime. (Although one suspects that the current malaise can in large part be accounted for by the fact that “what is missing” is not even noticed, still less mourned or hankered for.)
Indie may have all but driven black musics out of the British charts, hybridity may be off the agenda, but you can bet your bottom dollar that all of those indie bands just love hip-hop and R&B. Pop at its most febrile was stoked by critical and negative energies that are now exhausted — or which have been exiled as far too impolite for today’s pot-pourri, PoMo buffet in which you can have a bit of indie here, a bit of R&B there, where contradictions and anomalies have been Photoshopped out, where it all happily fits into one well-adjusted consumer basket. If the revolutionary tumult of the post-punk era was characterised by restless dissatisfaction, anxiety, uncertainty, rage, harshness, unfairness — that is, by an atmosphere of relentless criticism — today’s pop scene is suffused with laxness, bland acceptance, quiescent hedonism, luxuriant self-satisfaction (ALL those awards shows!) — that is, by PR.
What pop lacks now is the capacity for nihilation, for producing new potentials through the negation of what already exists. One example, of many possible. Both the Birthday Party and new pop nihilated one another: far from existing in a relation of mutual acceptance or of mutual ignorance, each defined itself in large part by not being the other. One shouldn’t rush to conceive of this in simple-minded dialectical terms as thesis-antithesis, since the relationships are not only oppositional — there is always more than one way to nihilate, and it is always possible for any individual thing to nihilate more than one Other. It seems at least plausible to suggest that the capacity for renewed nihilation is what has driven pop. So let’s dare to conceive of pop not as an archipelago of neighbouring but unconflicting options, not as a sequence of happy hybridities or pallid incommensurabilities, but as a spiral of nihilating vortices. Such a model of pop is utterly foreign to postmodern orthodoxies. But pop is either modernist or it is nothing at all.
Just because something is current doesn’t mean it is new. Saying that pop was better twenty-five years ago is NOT to be nostalgic; on the contrary, it is to resist the ambient, airtight, total nostalgia that can not only tolerate but delight in the latest regurgitations on the indie retreadmill.
Let’s dispense once and for all with popist-Deleuzianism/Deleuzian popism’s obligatory positivity. The fact we happen to be alive now doesn’t mean that we must be committed to the belief that this is the best time to live EVER. We have no duty to search out entertainment and spread a little excitement everywhere we go. (Think of how hard to please audiences were in the mid-Seventies, in the midst of a veritable cornucopia by comparison with today’s grim desert; and think of what that dissatisfaction produced.) So, please, no consumerist homilies about the fact that “it is always possible to find good records, no matter what the year”. Yes, of course it is, but as soon as pop is reduced to good records it really is all over. When pop can no longer muster a nihilation of the World, a nihilation of the Possible, then it will only be the ghosts that are worthy of our time.
memorex for the kraken: the fall’s pulp modernism
Part I
“Maybe industrial ghosts are making Spectres redundant”
— The Fall, Dragnet sleevenotes[2]
“M.R. James be born be born
Yog Sothoth rape me lord
Sludge Hai Choi
Van Greenway
Ar Corman”
— The Fall, “Spectre Vs. Rector”[3]
“Scrawny, gnarled, gaunt: Smith doesn’t waltz with ghosts. He materialises them.”
— Mark Sinker, “Look Back In Anguish”[4]
Who can put their finger on the Weird?
It’s taken me more than twenty years to attempt this deciphering. Back then, the Fall did something to me. But what, and how?
Let’s call it an Event, and at the same time note that all Events have a dimension of the uncanny. If something is too alien, it will fail to register; if it is too easily recognised, too easily cognizable, it will never be more than a reiteration of the already known. When the Fall pummelled their way into my nervous system, circa 1983, it was as if a world that was familiar — and which I had thought too familiar, too quotidian to feature in rock — had returned, expressionistically transfigured, permanently altered.
I didn’t know then that, already in 1983, the Fall’s greatest work was behind them. No doubt the later albums have their merits but it is on Grotesque (After the Gramme) (1980), Slates (1981) and Hex Enduction Hour (1982) where the group reached a pitch of sustained abstract invention that they — and few others — are unlikely to surpass. In its ambition, its linguistic inventiveness and its formal innovation, this triptych bears comparison with the great works of twentieth-century high literary modernism (Joyce, Eliot, Lewis). The Fall extend and performatively critique that mode of high modernism by reversing the impersonation of working-class accent, dialect and diction that, for example, Eliot performed in “The Waste Land”. Smith’s strategy involved aggressively retaining accent while using — in the domain of a supposedly popular entertainment form — highly arcane literary practices. In doing so, he laid waste the notion that intelligence, literary sophistication and artistic experimentalism are the exclusive preserve of the privileged and the formally educated. But Smith knew that aping master-class mores presented all sorts of other dangers; it should never be a matter of proving (to the masters) that the white crap could be civilised. Perhaps all his writing was, from the start, an attempt to find a way out of that paradox which all working-class aspirants face — the impossibility of working-class achievement. Stay where you are, speak the language of your fathers, and you remain nothing; move up, learn to speak in the master language, and you have become a something, but only by erasing your origins — isn’t the achievement precisely that erasure? (“You can string a sentence together, how can you possibly be working class, my dear?”)
The temptation for Smith was always to fit into the easy role of working-class spokesman, speaking from an assigned place in a given social world. Smith played with that role (“the white crap that talks back”, “Prole Art Threat”, “Hip Priest”) whilst refusing to actually play it. He knew that representation was a trap; social realism was the enemy because in supposedly “merely” representing the social order, it actually constituted it. Against the social realism of the official left, Smith developed a late-twentieth-century urban English version of the “grotesque realism” Bakhtin famously described in Rabelais and his World. Crucial to this grotesque realism is a contestation of the classificatory system which deems cultures (and populations) to be either refined or vulgar. As Peter Stallybrass and Allon White argued, “the grotesque tends to operate as a critique of a dominant ideology which has already set the terms, designating what is high and low”.[5]
Instead of the high modernist appropriation of working-class speech and culture, Smith’s pulp modernism reacquaints modernism with its disavowed pulp doppelgänger.
Lovecraft is the crucial figure here, since his texts — which first appeared in pulp magazines like Weird Tales — emerged from an occulted trade between pulp horror and modernism. Follow the line back from Lovecraft’s short stories and you pass through Dunsany and M.R. James before coming to Poe. But Poe also played a decisive role in the development of modernism — via his influence on Baudelaire, Mallarmé, Valéry and their admirer T.S. Eliot. “The Waste Land”’s debt to Dracula, for instance, is well-known.[6] The fragmentary, citational structure of a story like Lovecraft’s “Call of Cthulhu”, meanwhile, recalls “The Waste Land”. More than that: as Benjamin Noys argued in his paper “Lovecraft the Sinthome” (given at the recent “Gothic Remains” conference at Sussex), the abominations from which Lovecraft’s strait-laced scholars recoil bear comparisons with cubist and futurist art: Lovecraft, that is to say, turns modernism into an object of horror.
Yet Lovecraft’s texts are exemplary of Weird, rather than straightforwardly Gothic, fiction. Weird fiction has its own consistency, which can be most clearly delineated by comparing it to two adjacent modes, fantasy and the uncanny. Fantasy (and Tolkien is the exemplar here) presupposes a completed world, a world that, although superficially different to “ours” (there may be different species, or supernatural forces) is politically all-too familiar (there is usually some nostalgia for the ordered organisation of feudal hierarchy). The uncanny, meanwhile, is set in “our” world — only that world is no longer “ours” any more, it no longer coincides with itself, it has been estranged. The Weird, however, depends upon the difference between two (or more) worlds — with “world” here having an ontological sense. It is not a question of an empirical difference — the aliens are not from another planet, they are invaders from another reality system. Hence the defining image is that of the threshold, the door from this world into another, and the key figure is the “Lurker at the Threshold” — what, in Lovecraft’s mythos is called Yog Sothoth. The political philosophical implications are clear: there is no world. What we call the world is a local consensus hallucination, a shared dream.
Is There Anybody There?
“Part One: spectre versus rector
The rector lived in Hampshire
The Spectre was from Chorazina…”
— The Fall, “Spectre Vs. Rector”
“Spectre Vs. Rector”, from 1979’s Dragnet, is the first moment — still chilling to hear — when the Fall both lay out and implement their pulp modernist methodology. “Spectre Vs. Rector” is not only a ghost story, it is a commentary on the ghost story. The chorus, if it can be called that, is a litany of pulp forebears — “M.R. James be born be born/Yog Sothoth rape me lord…” — in which language devolves into asignifying chant, verbal ectoplasm: “Sludge Hai Choi/Van Greenway/Ar Corman”.
Not coincidentally, “Spectre Vs. Rector” was the moment when the Fall really began to sound like themselves. Before that, the Fall’s sound is a grey-complexioned, conspicuously consumptive garage plink-plonk punk, amphetamine-lean and on-edge, marijuana-fatalistic, simultaneously arrogant and unsure of itself, proffering its cheap and nastiness as a challenge. All of the elements of Smith’s later (peripheral) vision are there on Live at the Witch Trials and on the other tracks on Dragnet — watery-eyed figures lurking in the corner of the retina, industrial estates glimpsed through psychotropic stupor — but they have not yet been condensed down, pulped into the witches’ brew that will constitute Smith’s plane of consistency.
On “Spectre Vs. Rector”, any vestigial rock presence subsides into hauntology. The original track is nothing of the sort — it is already a palimpsest, spooked by itself; at least two versions are playing, out of sync. The track — and it is very definitely a track, not a “song” — foregrounds both its own textuality and its texturality. It begins with cassette hum and when the sleeve notes tell us that it was partly “recorded in a damp warehouse in MC/R” we are far from surprised. Steve Hanley’s bass rumbles and thumps like some implacable earth-moving machine invented by a deranged underground race, not so much rising from subterranea as dragging the sound down into a troglodytic goblin kingdom in which ordinary sonic values are inverted. From now on, and for all the records that really matter, Hanley’s bass will be the lead instrument, the monstrous foundations on which the Fall’s upside-down sound will be built. Like Joy Division, fellow modernists from Manchester, the Fall scramble the grammar of white rock by privileging rhythm over melody.
Fellow modernists they might have been, but the Fall and Joy Division’s take on modernism could not have been more different. Hannett and Saville gave Joy Division a minimalist, metallic austerity; the Fall’s sound and cover art, by contrast, was gnarled, collage cut-up, deliberately incomplete. Both bands were dominated by forbiddingly intense vocalist-visionaries. But where Curtis was the depressive-neurotic, the end of the European Romantic line, Smith was the psychotic, the self-styled destroyer of Romanticism.
“Unsuitable for Romantics”, Smith will graffiti onto the cover of Hex Enduction Hour, and “Spectre Vs. Rector” is the template for the anti-Romantic methodology he will deploy on the Fall’s most important releases. After “Spectre Vs. Rector”, there is no Mark E Smith the romantic subject. The novelty of Smith’s approach is to impose the novel or tale form (“Part One: spectre versus rector…”) onto the Romantic-lyrical tradition of the r and r song, so that the author-function supplants that of the lyrical balladeer. (There are parallels between what Smith does to rock and the cut-up surgery Eliot performed on the etherised patient of Romantic expressive subjectivity in his early poems.) Smith chant-narrates, not sings, “Spectre Vs. Rector”.
The story is simple enough, and, on the surface, is deliberately conventional: a post-Exorcist revisiting of the classic English ghost story. (At another level, the narrative is generated by a Roussel-like playing with similar words: Rector/Spectre/Inspector/Exorcist/Exhausted.) A rector is possessed by a malign spirit (“the spectre was from Chorazina” — described on the sleevenotes as “a negative Jerusalem”); a police inspector tries to intervene but is driven insane. (This is a real Lovecraftian touch, since the dread fate that haunts Lovecraft’s characters is not of being consumed by the polytendrilled abominations but of the schizophrenia that their appearance often engenders.) Both Rector and Inspector have to be saved by a third figure, a shaman-hero, an Outsider who “goes back to the mountains” when the exorcism is complete.
The Rector stands for rectitude and rectilinearity as well as for traditional religious authority. (The ontological shock that Lovecraft’s monstrosities produce is typically described, any Lovecraft reader will recall, in terms of a twisting of rectilinear geometries.) The Inspector, meanwhile, as Ian Penman conjectured in his 1980 interview with the Fall, “stands for an investigative, empirical world view”.[7] The hero (“his soul possessed a thousand times”) has more affinity with the Spectre, whom he absorbs and becomes (“the spectre possesses the hero/ but the possession is ineffectual”) than with the agents of rectitude and/or empirical investigation. It seems that the hero is driven more by his addiction to being possessed, which is to say dispossessed of his own identity (“that was his kick from life”), than by any altruistic motive. He has no love for the social order he rescues (“I have saved a thousand souls/they cannot even save their own”) but in which he does not occupy a place. “Those flowers take them away”, he said:
They’re only funeral decorations
And this is a drudge nation
A nation of no imagination
A stupid dead man is their ideal
They shirk me and think me unclean…
UNCLEAN…
In Madness and Civilisation, Foucault argues that the insane occupy the structural position vacated by the leper, while in The Ecstasy of Communication, Baudrillard describes “the state of terror proper to the schizophrenic: too great a proximity of everything, the unclean promiscuity of everything which touches, invests and penetrates without resistance, with no halo of private projection to protect him anymore”.[8] Baudrillard is of course describing the schizophrenia of media systems which overwhelm all interiority. Television brings us voices from far away (and there’s always something on the other side…). For Baudrillard, there is an increasing flatness between media and the schizophrenic delirium in which they feature; psychotics often describe themselves as receivers for transmitted signal. And what is the hero of “Spectre Vs. Rector” if not another version of the “ESP medium of discord” that Smith sings of on “Psychic Dancehall”?
Smith’s own methodology as writer-ranter-chanter echoes that of the hero-malcontent. He becomes (nothing but) the mystic pad on which stray psychic signals impress themselves, the throat through which a warring multiplicity of mutually antagonistic voices speak. This is not only a matter of the familiar idea that Smith “contains multitudes”; the schizophonic riot of voices is itself subject to all kinds of mediation. The voices we hear will often be reported speech, recorded in the compressed “telegraphic” headline style Smith borrowed from the Lewis of Blast.
Listening to the Fall now, I’m often reminded of another admirer of Lewis, Marshall McLuhan. The McLuhan of The Mechanical Bride (subtitle: The Folklore of Industrial Man), understood very well the complicity between mass media, modernism and pulp. McLuhan argued that modernist collage was a response to the perfectly schizophrenic layout of the newspaper frontpage. (And Poe, who in addition to his role as a forebear of Weird fiction, was also the inventor of the detective genre, plays a crucial role in The Mechanical Bride.)
Part II
M.R. James, Be Born Be Born
“Ten times my age, one tenth my height…”
— The Fall, “City Hobgoblins”[2]
“So he plunges into the Twilight World, and a political discourse framed in terms of witchcraft and demons. It’s not hard to understand why, once you start considering it. The war that the Church and triumphant Reason waged on a scatter of wise-women and midwives, lingering practitioners of folk-knowledge, has provided a powerful popular image for a huge struggle for political and intellectual dominance, as first Catholics and later Puritans invoked a rise in devil-worship to rubbish their opponents. The ghost-writer and antiquarian M.R. James (one of the writers Smith appears to have lived on during his peculiar drugged adolescence) transformed the folk-memory into a bitter class-struggle between established science and law, and the erratic, vengeful, relentless undead world of wronged spirits, cheated of property or voice, or the simple dignity of being believed in.”
— Mark Sinker, “Watching the City Hobgoblins”[3]
Whether Smith first came to James via TV or some other route, James’ stories exerted a powerful and persistent influence on his writing. Lovecraft, an enthusiastic admirer of James’ stories to the degree that he borrowed their structure (scholar/researcher steeped in empiricist common sense is gradually driven insane by contact with an abyssal alterity), understood very well what was novel in James’ tales. “In inventing a new type of ghost”, Lovecraft wrote of James,
he departed considerably from the conventional Gothic traditions; for where the older stock ghosts were pale and stately, and apprehended chiefly through the sense of sight, the average James ghost is lean, dwarfish and hairy — a sluggish, hellish night-abomination midway betwixt beast and man — and usually touched before it is seen.[4]
Some would question whether these dwarven figures (“ten times my age, one tenth my height”) could be described as “ghosts” at all; often, it seemed that James was writing demon rather than ghost stories.
If the libidinal motor of Lovecraft’s horror was race, in the case of James it was class. For James scholars, contact with the anomalous was usually mediated by the “lower classes”, which he portrayed as lacking in intellect but in possession of a deeper knowledge of weird lore. As Lovecraft and James scholar S.T. Joshi observes:
The fractured and dialectical English in which [James’ array of lower-class characters] speak or write is, in one sense, a reflection of James’ well-known penchant for mimicry; but it cannot be denied that there is a certain element of malice in his relentless exhibition of their intellectual failings. […] And yet, they occupy pivotal places in the narrative: by representing a kind of middle ground between the scholarly protagonists and the aggressively savage ghosts, they frequently sense the presence of the supernatural more quickly and more instinctively than their excessively learned betters can bring themselves to do.[5]
James wrote his stories as Christmas entertainments for Oxford undergraduates, and Smith was doubtless provoked and fascinated by James’ stories in part because there was no obvious point of identification for him in them. “When I was at the witch trials of the twentieth century they said: You are white crap.” (Live at the witch trials: is it that the witch trials have never ended or that we are in some repeating structure which is always excluding and denigrating the Weird?)
A working-class autodidact like Smith could scarcely be conceived of in James’ sclerotically-stratified universe; such a being was a monstrosity which would be punished for the sheer hubris of existing. (Witness the amateur archaeologist Paxton in “A Warning to the Curious”. Paxton was an unemployed clerk and therefore by no means working class, but his grisly fate was as much a consequence of “getting above himself” as it was of his disturbing sacred Anglo-Saxon artefacts.) Smith could identify neither with James’ expensively-educated protagonists nor with his uneducated, superstitious lower orders. As Mark Sinker puts it: “James, an enlightened Victorian intellectual, dreamed of the spectre of the once crushed and newly rising working classes as a brutish and irrational Monster from the Id: Smith is working class, and is torn between adopting this image of himself and fighting violently against it. It’s left him with a loathing of liberal humanist condescension.”[6]
But if Smith could find no place in James’ world, he would take a cue from one of Blake’s mottoes (adapted in Dragnet’s “Before the Moon Falls”) and create his own fictional system rather than be enslaved by another man’s. (Incidentally, isn’t Blake a candidate for being the original pulp modernist?) In James’ stories, there is, properly speaking, no working class at all. The lower classes that feature in his tales are by and large the remnants of the rural peasantry, and the supernatural is associated with the countryside. James’ scholars typically travel from Oxford or London to the witch-haunted flatlands of Suffolk, and it is only here that they encounter demonic entities. Smith’s fictions would locate spectres in the urban here and now; he would establish that their antagonisms were not archaisms.
Sinker: “No one has so perfectly studied the sense of threat in the English horror story: the twinge of apprehension at the idea that the wronged dead might return to claim their property, their identity, their own voice in their own land.”[7]
The Grotesque Peasants Stalk the Land
“Detective versus rector possessed by spectre
Spectre blows him against the wall
Says direct, ‘This is your fall
I’ve waited since Caesar for this
Damn fatty, my hate is crisp!
I’ll rip your fat body to pieces!’”
— The Fall, “Spectre Vs. Rector”
“The word grotesque derives from a type of Roman ornamental design first discovered in the fifteenth century, during the excavation of Titus’s baths. Named after the ‘grottoes’ in which they were found, the new forms consisted of human and animal shapes intermingled with foliage, flowers, and fruits in fantastic designs which bore no relationship to the logical categories of classical art. For a contemporary account of these forms we can turn to the Latin writer Vitruvius. Vitruvius was an official charged with the rebuilding of Rome under Augustus, to whom his treatise On Architecture is addressed. Not surprisingly, it bears down hard on the ‘improper taste’ for the grotesque. ‘Such things neither are, nor can be, nor have been,’ says the author in his description of the mixed human, animal, and vegetable forms:
For how can a reed actually sustain a roof, or a candelabrum the ornament of a gable? or a soft and slender stalk, a seated statue? or how can flowers and half-statues rise alternately from roots and stalks? Yet when people view these falsehoods, they approve rather than condemn, failing to consider whether any of them can really occur or not.”
— Patrick Parrinder, James Joyce[8]
By the time of Grotesque (After the Gramme), the Fall’s pulp modernism has become an entire political-aesthetic program. At one level, Grotesque can be positioned as the barbed Prole Art retort to the lyric antique Englishness of public school prog. Compare, for instance, the cover of “City Hobgoblins” (one of the singles that came out around the time of Grotesque) with something like Genesis’ Nursery Cryme. Nursery Cryme presents a gently corrupted English surrealist idyll. On the “City Hobgoblins” cover, an urban scene has been invaded by “emigres from old green glades”: a leering, malevolent cobold looms over a dilapidated tenement. But rather than being smoothly integrated into the photographed scene, the crudely rendered hobgoblin has been etched, Nigel Cooke-style, onto the background. This is a war of worlds, an ontological struggle, a struggle over the means of representation.
Grotesque’s “English Scheme” was a thumbnail sketch of the territory over which the war was being fought. Smith would observe later that it was “English Scheme” which “prompted me to look further into England’s ‘class’ system. INDEED, one of the few advantages of being in an impoverished sub-art group in England is that you get to see (If eyes are peeled) all the different strata of society — for free.”[9] The enemies are the old right, the custodians of a National Heritage image of England (“poky quaint streets in Cambridge”) but also, crucially, the middle-class left, the Chabertistas of the time, who “condescend to black men” and “talk of Chile while driving through Haslingdon”. In fact, enemies were everywhere. Lumpen-punk was in many ways more of a problem than prog, since its reductive literalism and perfunctory politics (“circles with A in the middle”) colluded with social realism in censuring/censoring the visionary and the ambitious.
Although Grotesque is an enigma, its title gives clues. Otherwise incomprehensible references to “huckleberry masks”, “a man with butterflies on his face” and Totale’s “ostrich headdress” and “light blue plant-heads” begin to make sense when you recognise that, in Parrinder’s description, the grotesque originally referred to “human and animal shapes intermingled with foliage, flowers, and fruits in fantastic designs which bore no relationship to the logical categories of classical art”.
Grotesque, then, would be another moment in the endlessly repeating struggle between a pulp Underground (the scandalous grottoes) and the Official culture, what Philip K. Dick called “the Black Iron Prison”. Dick’s intuition was that “the Empire had never ended”, and that history was shaped by an ongoing occult(ed) conflict between Rome and Gnostic forces. “Spectre Vs. Rector” (“I’ve waited since Caesar for this”) had rendered this clash in a harsh Murnau black and white; on Grotesque the struggle is painted in colours as florid as those used on the album’s garish sleeve (the work of Smith’s sister).
It is no accident that the words “grotesque” and “weird” are often associated with one another, since both connote something which is out of place, which either should not exist at all, or which should not exist here. The response to the apparition of a grotesque object will involve laughter as much as revulsion. “What will be generally agreed upon”, Philip Thompson wrote in his 1972 study The Grotesque, “is that ‘grotesque’ will cover, perhaps among other things, the co-presence of the laughable and something that is incompatible with the laughable.”[10] The role of laughter in the Fall has confused and misled interpreters. What has been suppressed is precisely the co-presence of the laughable with what is not compatible with the laughable. That co-presence is difficult to think, particularly in Britain, where humour has often functioned to ratify commonsense, to punish overreaching ambition with the dampening weight of bathos.
With the Fall, however, it is as if satire is returned to its origins in the grotesque. The Fall’s laughter does not issue from the commonsensical mainstream but from a psychotic Outside. This is satire in the oneiric mode of Gillray, in which invective and lampoonery becomes delirial, a (psycho) tropological spewing of associations and animosities, the true object of which is not any failing of probity but the delusion that human dignity is possible. It is not surprising to find Smith alluding to Jarry’s Ubu Roi in a barely audible line in “City Hobgoblins” (“Ubu le Roi is a home hobgoblin”). For Jarry, as for Smith, the incoherence and incompleteness of the obscene and the absurd were to be opposed to the false symmetries of good sense.
But in their mockery of poise, moderation and self-containment, in their logorrheic disgorging of slanguage, in their glorying in mess and incoherence, the Fall sometimes resemble a white English analogue of Funkadelic. For both Smith and Clinton, there is no escaping the grotesque, if only because those who primp and puff themselves up only become more grotesque. We could go so far as to say that it is the human condition to be grotesque, since the human animal is the one that does not fit in, the freak of nature who has no place in nature and is capable of re-combining nature’s products into hideous new forms.
On Grotesque, Smith has mastered his anti-lyrical methodology. The songs are tales, but tales half-told. The words are fragmentary, as if they have come to us via an unreliable transmission that keeps cutting out. Viewpoints are garbled; ontological distinctions (between author, text and character) are confused, fractured. It is impossible to definitively sort out the narrator’s words from direct speech. The tracks are palimpsests, badly recorded in a deliberate refusal of the “coffee table” aesthetic Smith derides on the cryptic sleeve notes. The process of recording is not airbrushed out but foregrounded, surface hiss and illegible cassette noise brandished like improvised stitching on some Hammer Frankenstein monster.
“Impression of J Temperance” was typical: a story in the Lovecraft style in which a dog breeder’s “hideous replica” (“brown sockets… Purple eyes… fed with rubbish from disposal barges”) haunts Manchester. This is a Weird tale, but one subjected to modernist techniques of compression and collage. The result is so elliptical that it is as if the text — part-obliterated by silt, mildew and algae — has been fished out of the Manchester ship canal (which Hanley’s bass sounds like it is dredging).
“‘Yes’, said Cameron, ‘And the thing was in the impression of J Temperance.’”
The sound on Grotesque is a seemingly impossible combination of the shambolic and the disciplined, the cerebral-literary and the idiotic-physical. The obvious parallel was the Birthday Party. In both groups, an implacable bass holds together a leering, lurching schizophonic body whose disparate elements strain like distended, diseased viscera against a pustule and pock-ridden skin (“a spotty exterior hides a spotty interior”). Both the Fall and the Birthday Party reached for pulp horror imagery rescued from the white trash can as an analogue and inspiration for their perverse “return” to rock and roll (cf. also the Cramps). The nihilation that fired them was a rejection of a pop that they saw as self-consciously sophisticated, conspicuously cosmopolitan, a pop which implied that the arty could only be attained at the expense of brute physical impact. Their response was to hyperbolically emphasise crude atavism, to embrace the unschooled and the primitivist.
The Birthday Party’s fascination was with the American “junkonscious”, the mountain of semiotic/narcotic trash lurking in the hindbrain of a world population hooked on America’s myths of abjection and omnipotence. The Birthday Party revelled in this fantasmatic Americana, using it as a way of cancelling an Australian identity that they in any case experienced as empty, devoid of any distinguishing features.
Smith’s r and r citations functioned differently, precisely as a means of reinforcing his Englishness and his own ambivalent attitude towards it. The rockabilly references are almost like “What If?” exercises. What if rock and roll had emerged from the industrial heartlands of England rather than the Mississippi Delta? The rockabilly on “Container Drivers” or “Fiery Jack” is slowed by meat pies and gravy, its dreams of escape fatally poisoned by pints of bitter and cups of greasy spoon tea. It is rock and roll as Working Men’s Club cabaret, performed by a failed Gene Vincent imitator in Prestwich. The “What if?” speculations fail. Rock and roll needed the endless open highways; it could never have begun in Britain’s snarled up ring roads and claustrophobic conurbations.
For the Smith of Grotesque, homesickness is a pathology. (In the interview on the 1983 Perverted by Language video, Smith claims that being away from England literally made him sick.) There is little to recommend the country which he can never permanently leave; his relationship to it seems to be one of wearied addiction. The fake jauntiness of “English Scheme” (complete with proto-John Shuttleworth cheesy cabaret keyboard) is a squalid postcard from somewhere no one would ever wish to be. Here and in “C and Cs Mithering”, the US emerges as an alternative (in despair at the class-ridden Britain of “sixty hours and stone toilet back gardens”, the “clever ones” “point their fingers at America”), but there is a sense that, no matter how far he travels, Smith will in the end be overcome by a compulsion to return to his blighted homeland, which functions as his pharmakon, his poison and remedy, sickness and cure. In the end he is as afflicted by paralysis as Joyce’s Dubliners.
On “C n Cs Mithering” a rigor mortis snare drum gives this paralysis a sonic form. “C n Cs Mithering” is an unstinting inventory of gripes and irritations worthy of Tony Hancock at his most acerbic and disconsolate, a cheerless survey of estates that “stick up like stacks” and, worse still, a derisive dismissal of one of the supposed escape routes from drudgery: the music business, denounced as corrupt, dull and stupid. The track sounds, perhaps deliberately, like a white English version of rap (here as elsewhere, the Fall are remarkable for producing equivalents to, rather than facile imitations of, black American forms).
Body a Tentacle Mess
“So R. Totale dwells underground
Away from sickly grind
With ostrich head-dress
Face a mess, covered in feathers
Orange-red with blue-black lines
That draped down to his chest
Body a tentacle mess
And light blue plant-heads.”
— The Fall, “The N.W.R.A”[11]
But it is the other long track, “The N.W.R.A.”, that is the masterpiece. All of the LP’s themes coalesce in this track, a tale of cultural political intrigue that plays like some improbable mulching of T.S. Eliot, Wyndham Lewis, H.G. Wells, Dick, Lovecraft and Le Carré. It is the story of Roman Totale, a psychic and former cabaret performer whose body is covered in tentacles. It is often said that Roman Totale is one of Smith’s “alter-egos”; in fact, Smith is in the same relationship to Totale as Lovecraft was to someone like Randolph Carter. Totale is a character rather than a persona. Needless to say, he is not a character in the “well-rounded” Forsterian sense so much as a carrier of mythos, an inter-textual linkage between pulp fragments.
The inter-textual methodology is crucial to pulp modernism. If pulp modernism first of all asserts the author-function over the creative-expressive subject, it secondly asserts a fictional system against the author-God. By producing a fictional plane of consistency across different texts, the pulp modernist becomes a conduit through which a world can emerge. Once again, Lovecraft is the exemplar here: his tales and novellas could in the end no longer be apprehended as discrete texts but as part-objects forming a mythos-space which other writers could also explore and extend.
The form of “The N.W.R.A.” is as alien to organic wholeness as is Totale’s abominable tentacular body. It is a grotesque concoction, a collage of pieces that do not belong together. The model is the novella rather than the tale, and the story is told episodically, from multiple points of view, using a heteroglossic riot of styles and tones (comic, journalistic, satirical, novelistic): like “Call of Cthulhu” re-written by the Joyce of Ulysses and compressed into ten minutes.
From what we can glean, Totale is at the centre of a plot — infiltrated and betrayed from the start — which aims at restoring the North to glory (perhaps to its Victorian moment of economic and industrial supremacy; perhaps to some more ancient pre-eminence, perhaps to a greatness that will eclipse anything that has come before). More than a matter of regional railing against the capital, in Smith’s vision the North comes to stand for everything suppressed by urbane good taste: the esoteric, the anomalous, the vulgar sublime, that is to say, the Weird and the Grotesque itself. Totale, festooned in the incongruous Grotesque costume of “ostrich head-dress… feathers/orange-red with blue-black line/…and light blue plant-heads” is the would-be Faery King of this Weird Revolt who ends up its maimed Fisher King, abandoned like a pulp modernist Miss Havisham amongst the relics of a carnival that will never happen, a drooling totem of a defeated tilt at social realism, the visionary leader reduced, as the psychotropics fade and the fervour cools, to being a washed-up cabaret artiste once again.
Part III
“Don’t start improvising, for Christ’s sake”
The temptation, when writing about the Fall’s work of this period, is to too quickly render it tractable. I note this by way of a disclaimer and a confession, since I am of course as liable to fall prey to this temptation as any other commentator. To confidently describe songs as if they were “about” settled subjects or to attribute to them a determinate aim or orientation (typically, a satirical purpose) will always be inadequate to the vertiginous experience of the songs and the distinctive jouissance provoked by listening to them. This enjoyment involves a frustration — a frustration, precisely, of our attempts to make sense of the songs. Yet this jouissance — something also provoked by the late Joyce, Pynchon and Burroughs — is an irreducible dimension of the Fall’s modernist poetics. If it is impossible to make sense of the songs, it is also impossible to stop making sense of them — or at least it is impossible to stop attempting to make sense of them. On the one hand, there is no possibility of dismissing the songs as nonsense; they are not gibberish or disconnected strings of non-sequiturs. On the other hand, any attempt to constitute the songs as settled carriers of meaning runs aground on their incompleteness and inconsistency.
The principal way in which the songs were recuperated was via the charismatic persona Smith established in interviews. Although Smith scrupulously refused to either corroborate or reject any interpretations of his songs, invoking this extra-textual persona, notorious for its strong views and its sardonic but at least legible humour, allowed listeners and commentators to contain, even dissipate, the strangeness of the songs themselves.
The temptation to use Smith’s persona as a key to the songs was especially pressing because all pretence of democracy in the group had long since disappeared. By the time of Grotesque, it was clear that Smith was as much of an autocrat as James Brown, the band the zombie slaves of his vision. He is the shaman-author, the group the producers of a delirium-inducing repetition from which all spontaneity must be ruthlessly purged. “Don’t start improvising for Christ’s sake,” goes a line on Slates, the 10” EP follow-up to Grotesque, echoing his chastisement of the band for “showing off” on the live LP Totale’s Turns.
Slates’ “Prole Art Threat” turned Smith’s persona, reputation and image into an enigma and a conspiracy. The song is a complex, ultimately unreadable, play on the idea of Smith as “working-class” spokesman. The “Threat” is posed as much to other representations of the proletarian pop culture (which at its best meant the Jam and at its worst meant the more thuggish Oi!) as it is against the ruling class as such. The “art” of the Fall’s pulp modernism — their intractability and difficulty — is counterposed to the misleading ingenuousness of social realism.
The Fall’s intuition was that social relations could not be understood in the “demystified” terms of empirical observation (the “housing figures” and “sociological memory” later ridiculed on “The Man Whose Head Expanded”). Social power depends upon “hexes”: restricted linguistic, gestural and behavioural codes which produce a sense of inferiority and enforce class destiny. “What chance have you got against a tie and a crest?”, Weller demanded on “Eton Rifles”, and it was as if the Fall took the power of such symbols and sigils very literally, understanding the social field as a series of curses which have to be sent back to those who had issued them.
The pulp format on “Prole Art Threat” is spy fiction, its scenario resembling Tinker Tailor Soldier Spy re-done as a tale of class cultural espionage, but then compressed and cut up so that characters and contexts are even more perplexing than they were even in Le Carré’s already oblique narrative. We are in a labyrinthine world of bluff and counter-bluff — a perfect analogue for Smith’s own elusive, allusive textual strategies. The text is presented to us as a transcript of surveillance tapes, complete with ellipses where the transmission is supposedly scrambled: “GENT IN SAFE-HOUSE: Get out the pink press threat file and Brrrptzzap* the subject. (* = scrambled).”
“Prole Art Threat” seems to be a satire, yet it is a blank satire, a satire without any clear object. If there is a point, it is precisely to disrupt any “centripetal” effort to establish fixed identities and meanings. Those centripetal forces are represented by the “Middle Mass” (“vulturous in the aftermath”) and “the Victorian vampiric” culture of London itself, as excoriated in “Leave the Capitol”:
The tables covered in beer
Showbiz whines, minute detail
It’s a hand on the shoulder in Leicester Square
It’s vaudeville pub back room dusty pictures of white frocked girls and music teachers
The bed’s too clean
The water’s poison for the system
Then you know in your brain
LEAVE THE CAPITOL!
EXIT THIS ROMAN SHELL!
This horrifying vision of London as a Stepford city of drab conformity (“hotel maids smile in unison”) ends with the unexpected arrival of Machen’s Great God Pan (last alluded to in the Fall’s very early “Second Dark Age”), presaging the Fall’s return of the Weird.
The Textual Expectorations of Hex
“He’d been very close to becoming ex-funny man celebrity. He needed a good hour at the hexen school…”
— Press release for Hex Enduction Hour
Hex Enduction Hour was even more expansive than Grotesque. Teeming with detail, gnomic yet hallucinogenically vivid, Hex was a series of pulp modernist pen portraits of England in 1982. The LP had all the hubristic ambition of prog combined with an aggression whose ulcerated assault and battery outdid most of its post-punk peers in terms of sheer ferocity. Even the lumbering “Winter” was driven by a brute urgency, so that, on side one, only the quiet passages in the lugubrious “Hip Priest” — like dub if it had been invented in drizzly motorway service stations rather than in recording studios in Jamaica — provided a respite from the violence.
Yet the violence was not a matter of force alone. Even when the record’s dual-drummer attack is at its most poundingly vicious, the violence is formal as much as physical. Rock form is disassembled before our ears. It seems to keep time according to some system of spasms and lurches learned from Beefheart. Something like “Deer Park” — a whistle-stop tour of London circa 82 sandblasted with “Sister Ray”-style white noise — screams and whines as if it is about to fall apart at any moment. The “bad production” was nothing of the sort. The sound could be pulverisingly vivid at times: the moment when the bass and drums suddenly loom out of the miasma at the start of “Winter” is breathtaking, and the double-drum tattoo on “Who Makes the Nazis?” fairly leaps out of the speakers. This was the space rock of Can and Neu! smeared in the grime and mire of the quotidian, recalling the most striking image from The Quatermass Xperiment: a space rocket crash-landed into the roof of a suburban house.
In many ways, however, the most suggestive parallels come from black pop. The closest equivalents to the Smith of Hex would be the deranged despots of black sonic fiction: Lee Perry, Sun Ra and George Clinton, visionaries capable of constructing (and destroying) worlds in sound.
As ever, the album sleeve (so foreign to what were then the conventions of sleeve design that HMV would only stock it with its reverse side facing forward) was the perfect visual analogue for the contents. The sleeve was more than that, actually: its spidery scrabble of slogans, scrawled notes and photographs was a part of the album rather than a mere illustrative envelope in which it was contained.
With the Fall of this period, what Gerard Genette calls “paratexts”[2] — those liminal conventions, such as introductions, prefaces and blurbs, which mediate between the text and the reader — assume special significance. Smith’s paratexts were clues that posed as many puzzles as they solved; his notes and press releases were no more intelligible than the songs they were nominally supposed to explain. All paratexts occupy an ambivalent position, neither inside nor outside the text: Smith used them to ensure that no definite boundary could be placed around the songs. Rather than being contained and defined by its sleeve, Hex haemorrhages through the cover.
It was clear that the songs weren’t complete in themselves, but part of a larger fictional system to which listeners were only ever granted partial access. “I used to write a lot of prose on and off”, Smith would say later. “When we were doing Hex I was doing stories all the time and the songs were like the bits left over.” Smith’s refusal to provide lyrics or to explain his songs was in part an attempt to ensure that they remained, in Barthes’ terms, writerly. (Barthes opposes such texts, which demand the active participation of the reader, to “readerly” texts, which reduce the reader to the passive role of consumer of already-existing totalities.)
Before his words could be deciphered they had first of all to be heard, which was difficult enough, since Smith’s voice — often subject to what appeared to be loud hailer distortion — was always at least partially submerged in the mulch and maelstrom of Hex’s sound. In the days before the internet provided a repository of Smith’s lyrics (or fans’ best guesses at what the words were), it was easy to mis-hear lines for years.
Even when words could be heard, it was impossible to confidently assign them a meaning or an ontological “place”. Were they Smith’s own views, the thoughts of a character or merely stray semiotic signal? More importantly: how clearly could each of these levels be separated from one another? Hex’s textual expectorations were nothing so genteel as stream of consciousness: they seemed to be gobbets of linguistic detritus ejected direct from the mediatised unconscious, unfiltered by any sort of reflexive subjectivity. Advertising, tabloid headlines, slogans, pre-conscious chatter, overheard speech were masticated into dense schizoglossic tangles.
“Who wants to be in a Hovis advert anyway?”
“Who wants to be in a Hovis/advert/anyway?” Smith asks in “Just Step S’Ways”, but this refusal of cosy provincial cliché (Hovis adverts were famous for their sentimentalised presentation of a bygone industrial North) is counteracted by the tacit recognition that the mediatised unconscious is structured like advertising. You might not want to live in an advert, but advertising dwells within you. Hex converts any linguistic content — whether it be polemic, internal dialogue, poetic insight — into the hectoring form of advertising copy or the screaming ellipsis of headline-speak. The titles of “Hip Priest” and “Mere Pseud Mag Ed”, as urgent as fresh newsprint, bark out from some Vorticist front page of the mind.
As for advertising, consider the opening call to arms of “Just Step S’Ways”: “When what used to excite you does not/like you’ve used up all your allowance of experiences.” Is this an existentialist call for self re-invention disguised as advertising hucksterism, or the reverse? Or take the bilious opening track, “The Classical”. “The Classical” appears to oppose the anodyne vacuity of advertising’s compulsory positivity (“this new profile razor unit”) to ranting profanity (“hey there fuckface!”) and the gross physicality of the body (“stomach gassss”). But what of the line, “I’ve never felt better in my life”? Is this another advertising slogan or a statement of the character’s feelings?
It was perhaps the unplaceability of any of the utterances on Hex that allowed Smith to escape censure for the notorious line, “where are the obligatory niggers?” in “The Classical”. Intent was unreadable. Everything sounded like a citation, embedded discourse, mention rather than use.
Smith returns to the Weird tale form on “Jawbone and the Air Rifle”. A poacher accidentally causes damage to a tomb, unearthing a jawbone which “carries the germ of a curse/of the Broken Brothers Pentacle Church.” The song is a tissue of allusions — James (“A Warning to the Curious”, “Oh, Whistle and I’ll Come to You, My Lad”), Lovecraft (“The Shadow over Innsmouth”), Hammer Horror, The Wicker Man — culminating in a psychedelic/psychotic breakdown (complete with torch-wielding mob of villagers):
He sees jawbones on the street
Advertisements become carnivores
And roadworkers turn into jawbones
And he has visions of islands, heavily covered in slime.
The villagers dance round pre-fabs
And laugh through twisted mouths.
“Jawbone” resembles nothing so much as a League of Gentlemen sketch, and the Fall have much more in common with the League of Gentlemen’s febrile carnival than with witless imitators such as Pavement. The co-existence of the laughable with that which is not laughable: a description that captures the essence of both the Fall and The League of Gentlemen’s grotesque humour.
“White face finds roots”
“Below, black scars winding through the snow showed the main roads. Great frozen rivers and snow-laden forest stretched in all directions. Ahead they could just see a range of old, old mountains. It was perpetual evening at this time of year, and the further north they went, the darker it became. The white lands seemed uninhabited, and Jerry could easily see how the legends of trolls, Jotunheim, and the tragic gods — the dark, cold, bleak legends of the North — had come out of Scandinavia. It made him feel strange, even anachronistic, as if he had gone back from his own age to the Ice Age.”
— Michael Moorcock, The Final Programme[3]
On Hex’s second side, mutant r and r becomes r and Artaud as the songs become increasingly delirial and abstract. “Who Makes the Nazis?” — as lunar as Tago Mago, as spacey-desolated as King Tubby at his most cavernous — is a TV talk show debate rendered as some Jarry-esque pantomime, and composed of leering backing vocals and oneiric-cryptic linguistic fragments: “longhorn breed… George Orwell Burmese police… Hate’s not your enemy, love’s your enemy, murder all bush monkeys…”
“Iceland”, recorded in a lava-lined studio in Reykjavík, is a fantasmatic encounter with the fading myths of North European culture in the frozen territory from which they originated. “White face finds roots”, Smith’s sleeve-notes tell us. The song, hypnotic and undulating, meditative and mournful, recalls the bone-white steppes of Nico’s The Marble Index in its arctic atmospherics. A keening wind (on a cassette recording made by Smith) whips through the track as Smith invites us to “cast the runes against your own soul” (another James reference, this time to his “Casting the Runes”).
“Iceland” is rock as ragnarock, an anticipation (or is it a recapitulation?) of the End Times in the terms of the Norse “Doom of the Gods”. It is a Twilight of the Idols for the retreating hobgoblins, cobolds and trolls of Europe’s receding Weird culture, a lament for the monstrosities and myths whose dying breaths it captures on tape:
Witness the last of the god men…
A Memorex for the Krakens
scritti’s sweet sickness
“His new album is called White Bread, Black Beer…
‘Why? It’s pretty much all I live on — Guinness and a lovely, soft, gooey, terribly-bad-for-you white bread from the local Turkish bakery. It’s also a reference to when I worked with all these R&B musicians in New York in the 80s — if you played something they didn’t like they’d frown and say, “Oh man, that’s so white-bread”. Meaning that it came from that “white” pop culture which is seen as largely voided of nutrition, substance, goodness, or indeed “soul”. And that definitely got my antennae going, because I’m mistrustful of “soul” and I very much like white, processed pop music. Which, in a way, is what this album celebrates.’”
— Interview with Green Gartside, Time Out[2]
“Instead of any fulfilment or resolution, Scritti’s music delivers the bliss of the lover’s discourse in all its ellipses, contradiction and repetition, its endless pursuit of an unattainable object. The disembodied, depthless, non-linear effects, and the borrowing of pop’s language of love try to undo desire’s usual articulation in coherent drives and stable identity while reinscribing or repeating the very ‘soul’ language that’s used to complete the self in today’s pop: the sweet nothings heard beside, within the sexual healing.”
— Paul Oldfield, “After Subversion: Pop Culture and Power”[3]
A fascinating conjunction: listening to Scritti Politti’s quietly stunning new album — or rather being seduced and ravished by it — while reading Mladen Dolar’s A Voice and Nothing More. If, as Simon Reynolds claims,[4] White Bread, Black Beer is an album without a “sonic concept”, must we conclude that the songs are Green’s version of a soul-baring? After all the deferrals, the veilings, the deviations, finally a revelation: this is me? The album’s title seems to invite such an interpretation, suggesting a negative alchemy, the reversion of sublime agalma into foodstuffs. Without a sonic concept, we are left only with the honey-pure voice, one of the most distinctive in pop — and the voice, so we have always been told, is the bearer of pure presence, guarantor of authenticity and veracity…
This, precisely, is what Dolar challenges. Dolar’s claim is not that Derrida was wrong that the voice has been privileged in a certain version of metaphysics, but that this has never been the whole story. “There exists a different metaphysical history of voice, where the voice, far from being the safeguard of presence, was considered to be dangerous, threatening and possibly ruinous.”[5] Tellingly, Dolar’s alternative history of metaphysics goes via the treatment of music. (Incidentally, it is hard not to read Plato’s admonition, quoted by Dolar, that “[a] change to a new type of music is something to beware of as a hazard of all our fortunes… [f]or the modes of music are never disturbed without unsettling of the most fundamental political and social conventions” as a critique of both PoMo popism and nostalgic rockism). Dolar’s argument is that Law-Logos has always sought to differentiate itself from a voice conceived of as feminine and chaotic, but Logos cannot extirpate the voice, and indeed depends upon it: what is the fundamental expression of the Law if not the voice of the Father?
How could your nothings be so sweet?
What to make of Green’s voice, then? Or, to pose the same question from the other side: what is the minimal difference that has always separated Scritti’s deconstructions from the real thing? There’s a tendency to locate Green’s undoings and unsettlings on the level of signifiers, as if his subversion were all to do with wordplay, and his voice were merely a site for natural expressivity. But, as Dolar establishes, the “object voice” is neither the voice stripped of all sensual qualities in order to become the neutral transmitter of signifiers, nor the voice stripped of all signification in order to become a pure source of aesthetic pleasure. With Green’s voice, we continually slide between two types of non-sense: the nonsense of “the lover’s discourse”, the nursery-rhyme-like reiterations of baby-talk phrases that are devoid of meaning, but which are nevertheless the most important utterances people perform or hear; and also the nonsense of the voice as sound, another kind of sweet nothing. That is why Green’s lyrics look very different when you read them; the voice almost prevents you hearing them except as senseless sonorous blocks, mechanically repeated refrains.
What is disturbing about Cupid and Psyche 85 by comparison with the new pop that preceded it is precisely its lack of any self-conscious meta-presence. This is where I slightly disagree with Simon, when he argues that Cupid and Psyche is “about love rather than in love”. It seems to me that what makes Cupid and Psyche so disturbingly depthless is precisely the absence of that space between the song’s form and the subject; the songs instantiate the lover’s discourse, they do not comment on it. Cupid and Psyche’s songs, creepily, aren’t about anything, any more than love itself is. Compare Cupid and Psyche with ABC’s The Lexicon of Love (an album of love songs about love songs, if ever there was one), for instance. Martin Fry’s presence is ubiquitous in The Lexicon of Love, manifesting itself in every raised eyebrow and set of inverted commas. But on Cupid and Psyche we get precious little sense of a “real”, biographical Green behind or beyond the record; as opposed to self-consciousness, we have “reflexivity without a self (not a bad name for the subject).”[6] There is only the void, the voice and the signifying chain, unraveling forever in a shopping mall of mirrors, a whispering gallery of sweet nothings… But what is disturbing about Cupid and Psyche is the suggestion that this really is love, this impersonal, idiot rhyming is all love is. That is why Cupid and Psyche is far more unsettling than the supposed “reversion to a pre-linguistic condition” of the “Kristevan” psychedelic rock celebrated by Simon in the late Eighties (and mentioned in the Paul Oldfield essay I cited above as a point of comparison with Scritti); the supposed “oceanic dissolution of self” assumes not only that such a dissolution can be attained, but that there is a “real self” that can be dissolved. Like the first two Roxy albums, Cupid and Psyche’s message is far more radical: the supposed “real”, “authentic” self, with its emotional core, is a structural illusion; our most treasured “inner” feelings are trite repetitions; there is no intimate, only an extimate.
I guess it’s a sickness/that keeps me wanting…
The excess of Green’s voice resides in its sweetness, a sweetness that seems unhealthy, sickly, which puts us on our guard even as it seduces us. Green’s voice is synthetic, candied, rather than authentic, wholesome. It already sounds inhuman, so that, upon first hearing rave’s pitched-up chirruping vocals, the obvious comparison was with Scritti’s androgynous cooing. All of this is anticipated on the track that I find most captivating when I listen to Cupid and Psyche now, the machine ballad, “A Little Knowledge”, in which Green duets with what sounds like a woman, but which is in fact a Fairlight-sprite, a synthetic succubus constructed from his own voice pitched up. (This exchange with a synthetic spectre happens before the “real woman”, session singer B.J. Nelson, officially comes in…)
It’s worth remembering at this point that Green is very much the white ghost at the revel of contemporary black pop. At least since More Brilliant Than the Sun, disco, techno and house’s (non)roots in white synthetics have been exposed — Moroder inducting Donna Summer into a labyrinth of synthesizers, Chic wanting to be Roxy Music, Cybotron stealing Ultravox’s accents and sound — but Cupid and Psyche’s function as a template for contemporary R&B is far less rehearsed. Specifically via its influence on Jam and Lewis’ production of Janet Jackson’s epochal Control, but more generally through its intuition — or entrepreneurial leap — that the flesh and blood of (what was then called) soul could be sutured with hip-hop’s artificiality and abstraction machine, Cupid and Psyche instituted a “new paradigm” for globalised pop. Skank Bloc Bologna has become the global retail arcade of capital. Cupid and Psyche is chillingly impersonal, but in a way that is very different from the staged impersonality of Kraftwerk, Numan and Visage which fascinated black American hip-hoppers, techno and house pioneers in the early Eighties. Scritti’s erasure of soul goes by way of a neurotically note-perfect, ultra-fastidious simulation of a hyper-Americanised “language of love”. It is no longer a matter of technical machines versus real emotional beings, but of “authentic emotion” as itself the refraining of signifying and sonorous machines. (It is therefore no surprise that another destroyer of soul, Miles Davis, should have covered Scritti songs and collaborated with Green.)
All of which goes some way to explaining the title of the new album, which initially seems grossly inappropriate, since the songs’ soufflé lightness could not be further from carnal carbohydrate stodge or beery bloating. But what if these substances are not “basic” and “life-giving” but the non-vital excess without which life would be nothing? What if “white bread” indicates not the normal and the nutritious but the synthetic, and “black beer” indicates not the homey and the heavy but the addictive?
There is a performative flatness about the opening track and first single, “The Boom Boom Bap”; it is a song about longing and addiction, which is itself arrestingly, gorgeously addictive. I can honestly say that I was hooked from the moment I heard Green sing the opening phrase, the song’s title. “The Boom Boom Bap” is so sublimely, achingly poised that the temptation is to keep hitting rewind, to remain lost in the song’s plateau, in which pop’s habitual urgencies are anti-climactically suspended.
Play it over and over again/play it over and over again
“The Boom Boom Bap”, as Green told Simon, is ostensibly about the thin line “between being in love with something and being unhealthily addicted to it”. The three addictions with which the song deals are drinking, hip-hop — “the title itself is named after hip-hop’s bass-boom and syncopated breakbeats” — and love. Addiction is the pathological motor of life. “The beat of my life” is not any natural, biological rhythm but the non-organic pulse of the (death) drive. “If hooks could kill”, Green muses, knowing that of course they can; that being hooked can be lethal, but that not being hooked on anything is even more deadly and deadening.
When you do eventually pull yourself out of the honeyed embrace of “The Boom Boom Bap”, you find yourself yielding to an album of folds and fragments, slivers and sketches, in which everything comes to an end before you expect it to (amplifying your longing to hear it, again and again). Thankfully, Green’s obsession with hip-hop emerges not through the brute presence of rap (what could be more present, now, than rap?), but via a certain absence in the production which prevents the tracks ever closing into organic wholeness.
I was discussing with Owen the other week how mid-Eighties technology drew almost all pop into an arid, dated, hyper-glossed blandness: the two most conspicuous exceptions to this trend were Cupid and Psyche (which succeeds precisely because of its total identification with the time and the technics) and Kate Bush’s Hounds of Love. White Bread, Black Beer is like Green’s late-arriving Hounds of Love, an album in which pop’s history (and his own) can be re-visited without being reiterated, in which styles can be traversed without there ever being a question of inconsistent eclecticism. The very refusal to strain for contemporaneity makes the album far more now than it would have been if it engaged in an unseemly pursuit of street cool.
The references to London, to “British Homes Stores”, to the names of Green’s schoolteachers, restore some of the locality that was remorselessly stripped away by the proto-Starbucks “third place” mid-Atlantic sheen of Cupid and Psyche. This, evidently, also means a restoration of some biographical specificity; the songs are no longer lover’s labyrinths that anyone can enter, but memory lanes some of whose landmarks only Green can recognise. Amidst all of the trails of influence you can trace across the album, those Apollonians, Brian Wilson and Paul McCartney, recur most insistently. Would the album then be a redemption through melody? A recovery — from sickness? A recovery — of the self?
And when I’m with you baby
I know just who I am
And no one understands the way that you do
Darling
Hearing Green sing lines like these is a curiously haunting and unsettling experience, since Green’s voice carries with it all those Cupid and Psyche traces which ironise and undercut any gestures towards “really meaning it” or “really being” anything. In any case, listen closer to the song in which those lines occur (“Locked”), and all is not as it seems. “People want a piece of me”, Green sings, “but who they get is not what she seems”. In any case, autobiography would still be a form of writing (and the most deceptive kind), and the “you” that is the usual addressee of the love song is never the ostensible partner, the “real flesh and blood person”, but the big Other. Hence David Kelsey in Highsmith’s This Sweet Sickness — a Scritti title if ever there was one — a man who conducts his pathological love affair primarily through letters written to his fantasised Other (and which are ignored and misunderstood by their supposed flesh and blood object), is the lover in its purest state… All the parallels of love with addiction on White Bread, Black Beer suggest that Green the writer still knows that love is essentially both pathology and cure, so Scritti’s sweetness remains sick, their sickness sweet…
postmodernism as pathology, part 2
The thing is, Robbie, there’s no rehabilitation from PoMo.
The sickness that afflicts Robbie Williams is nothing less than postmodernity itself. Look at Williams: his whole body is afflicted with reflexive tics, an egoarmoury of grimaces, gurns and grins designed to disavow any action even as he performs it. He is the “as if” pop star — he dances as if he is dancing, he emotes as if he is emoting, at all times scrupulously signalling — with perpetually raised eyebrows — that he doesn’t mean it, it’s just an act. He wants to be loved for “Rudebox” but, unfortunately for him, his audience demands the mawkish sentimentality of “Angels”. How Robbie must hate that song now, with its humbling reminders of dependency (Williams’ career went into the stratosphere on the basis of “Angels”) and lost success…
Let me entertain you, let me lead you
There’s surely a Robin Carmody-type analysis to be done of the parallels between Williams and Tony Blair. Williams’ first album, the tellingly-titled Life Thru a Lens was released in 1997, the year of Blair’s first election victory. There followed for both a period of success so total that it must have confirmed their most extravagant fantasies of omnipotence (Blair unassailable at two elections; Williams winning more Brit awards than any other artist). Then, a decade after their first success, an ignominious decline into irrelevance (the post-Iraq Blair limping out of office as a lame-duck leader, Williams releasing a disastrous album and checking himself into rehab on the day before this year’s Brit awards, at which he had received a derisory single nomination). Of course, there are limits to the analogy: Blair is popular in the States, whereas Robbie…
Williams and Blair are two sides of one Joker Hysterical face: two cracked actors, one given over to the performance of sincerity, the other dedicated to the performance of irony. But both, fundamentally, actors — actors to the core, to the extent that they resemble PKD simulacra, shells and masks to which one cannot convincingly attribute any inner life. Blair and Williams seem to exist only for the gaze of the other. That is why it is impossible to imagine either enduring private doubts or misgivings, or indeed experiencing any emotion whose expression is not contrived to produce a response from the other. As is well known, Blair’s total identification with his publicly-projected messianic persona instantly transforms any putatively private emotion into a PR gesture; this is the spincerity effect (even if he really means what he is saying, the utterance becomes fake by dint of its public context). The image of Blair or Williams alone in a room, decommissioned androids contemplating their final rejection by a public which once adored them, is genuinely creepy.
It is perfectly possible to imagine Robbie exhibiting public doubts, of course — indeed, as his former reflexive potency declines into reflexive impotence, he is most likely to be seen insisting upon his inadequacy and failure. No doubt this is why Williams’ announcement of his “addiction” to anti-depressants and caffeine has been greeted with a certain scepticism (suspicion has been aroused in part because of the timing of the announcement, on the eve of the Brits). But this scepticism misses the point. Williams’ sickness is, precisely, his incapacity to do or experience anything unless it provokes the attention of the other.
Or: as Liam Gallagher more succinctly put it, in words worthy of Mr Agreeable at his compassionate best:
If you’ve got a fucking problem, why do you want the whole world to know about it? I say sort yourself out. You make a fucking crap album then want everyone to feel sorry for you. What a fucking tosser.
choose your weapons
People are often telling me that I ought to read Frank Kogan’s work, but I’ve never got around to it. (Partly that’s because, Greil Marcus apart, I’ve never really tuned into much American pop criticism at all, which in my no doubt far too hasty judgement has seemed to be bogged down in a hyper-stylised faux-naif gonzoid mode that has never really appealed to me.) The — again, perhaps unfair — impression I have is that, in Britain, the battles that Kogan keeps on fighting were won, long ago, by working-class autodidact intellectuals. No doubt the two recent pieces by Kogan that Simon has linked to are grotesquely unrepresentative of his work as a whole (I certainly hope so, since it is difficult to see why so many intelligent people would take his work seriously if they weren’t), but it’s hard not to read them as symptomatic, not only of an impasse and a malaise within what I now hesitate to call “popism”, but of a far more pervasive, deeply-entrenched cultural conservatism in which so-called popism is intrinsically implicated.
Remember, in the immediate wake of 9/11, all those po-faced Adornoite proclamations that there would be “no more triviality” in American popular culture after the Twin Towers fell? There can be few who, even when the remains of the Twin Towers were smouldering, really believed that US pop culture would enter a new thoughtful, solemn and serious phase after September 11th — and it’s surely superfluous to remember, at this point, that what ensued was a newly vicious cynicism soft-focused by a piety that only a wounded Leviathan assuming the role of aggrieved victim can muster — but would anyone, then, have believed that, only six years later, a supposedly serious critic would write a piece called “Paris [Hilton] is our Vietnam…”,[2] especially when, in those years, there has, like, been another Vietnam? What we are dealing with in a phrase like “Paris is our Vietnam” is not trivia — this isn’t the collective narcissism of a leisure class ignorant of geopolitics — but a self-conscious trivialisation, an act of passive nihilistic transvaluation. Debating the merits or otherwise of a boring heiress has been elevated to the status of a political struggle; and not even by preening aesthetes in some Wildean/Warholian celebration of superficiality, but by middle-aged men in sweat pants, sitting on the spectator’s armchair at the end of history and dissolutely flicking through the channels.
The end of history is the nightmare from which I am trying to awake.
At least the “Paris is Vietnam” piece laid bare the resentment of resentment that I have previously argued is the real libidinal motor of “popism” — “we love Paris all the more because others hate her (but luckily we loved her anyway, honest!)”. But this latest piece[3] Simon has linked to is, if anything, even more oddly pointless and indicative. Unlike the pleasantly mediocre Paris Hilton LP, the ostensible object of the piece, Backstreet Boys’ single “Everybody (Backstreet’s Back)” is actually rather good. Practically everyone I know liked it. The problem is the idea that saying this is in some way news in 2007. No word of a lie, I had to check the date on that post, assuming, at first, that it must have been written a decade ago.
The article makes me think that, if the motivating factor with British popists is, overwhelmingly, class, with Americans it might be age. Perhaps those a little deeper into middle age than I am were still subject to the proscriptions and prescriptions of a Leavisite high culture. But it seems to me that popists now are like Mick Jagger confronted with punk in 1976: they don’t seem to realise that, if there is an establishment, it is them. Even if the “Nathan” with whom Kogan debates exists — and I’ll be honest with you, I’m finding it hard to believe that he does — his function is a fantasmatic one (in the same way that Lacan argued that, if a pathologically jealous husband is proved right about his wife’s infidelities, his jealousy remains pathological): for popists to believe that their position is in any way challenging or novel, they have to keep digging up “Nathans” who contest it. But, in 2007, Nathan’s hoary old belief that only groups who write their own songs can be valid has been refuted so many times that it is rather like someone mounting a defence of slavery today — sure, there are such people who hold such a view, but the position is so irrelevant to the current conjuncture that it is quaintly antiquated rather than a political threat. There may be a small minority of pop fans who claim to hold Nathan’s views; but, given the success of Sinatra, the Supremes, Elvis Presley and the very boybands that popists think it is so transgressive to re-evaluate, those views would in most cases be performatively contradicted by the fans’ actual tastes. (Kogan does grant that the problem is not so much fans’ tastes as their accounts of them — but the unspoken assumption is that it is alright, indeed mandatory, to contest male rock fans’ accounts of their own tastes, but that the aesthetic judgements of the figure with which the popist creepily identifies, the teenage girl, ought never to be gainsaid.) (The other irony is that, if you talk to an actual teenager today, they are far more likely to both like and have heard of Nirvana than they are the Backstreet Boys.)
The once-challenging claim that for certain listeners, the (likes of) Backstreet Boys could have been as potent as (the likes of) Nirvana has been passive-nihilistically reversed — now, the message disseminated by the wider culture — if not necessarily by the popists themselves — is that nothing was ever better than the Backstreet Boys. The old high culture disdain for pop cultural objects is retained; what is destroyed is the notion that there is anything more valuable than those objects. If pop is no more than a question of hedonic stim, then so are Shakespeare and Dostoyevsky. Reading Milton, or listening to Joy Division, has been re-branded as just another consumer choice, of no more significance than which brand of sweets you happen to like. Part of the reason that I find the term “popism” unhelpful now is that it implies some connection between what I would prefer to call Deflationary Hedonic Relativism and what Morley and Penman were doing in the early Eighties. But their project was the exact inverse of this: their claim was that as much sophistication, intelligence and affect could be found in the pop song as anywhere else. Importantly, the music, and the popular culture of the time, made the argument for them. The evaluation was not some fits-all-eras a priori position, but an intervention at a particular time designed to have certain effects. Morley and Penman were still critics, who expected to influence production, not consumer guides marking commodities out of five stars, or executives spending their spare time ranking every song with the word “sugar” in it on LiveJournal communities that are the cyberspace equivalent of public school dorms.
Whereas Morley and Penman (self-taught working-class intellectuals both) complicated the relationship between theory and popular culture with writing that — in its formal properties, its style and its erudition, as well as in its content — contested commonsense, Deflationary Hedonic Relativism merely ratifies the empiricist dogmas that underpin consumerism. More than that, Owen Hatherley has astutely observed that, in addition to reiterating the standard Anglo-American bluff dismissal of metaphysics, the Deflationary Hedonic Relativist disclaiming of theory (“we just like what we like, we don’t have a theory”) uncannily echoes the dreary mantras of the average NME indie band: “we just do what we do, anything else is a bonus”, “the music is the only important thing”. In the UK, the rhetorical fight between “popists” and indie is as much a phoney war as the parliamentary political punch-and-judy show between Cameron’s Tories and Brown’s New Labour: a storm in a ruling-class tea-cup. In both cases, the social reality is that of ex-public schoolkids carrying on their inter-house rivalries by other means. In the case of both indie and popism, there is a strangely inverted relationship to populism and the popular. While the “popists” claim to be populist but actually support music that is increasingly marginal in terms of sales figures, the indie types claim to celebrate an alternative while their music of choice (Trad skiffle) has Full Spectrum Dominance (you can’t listen to Radio 2 for fifteen minutes without hearing a Kaiser Chiefs song). In many ways, because it was attempting to analyse a genuinely popular phenomenon, Simon’s defence of Arctic Monkeys was more genuinely popist than all of the popist screeds on Paris Hilton’s barely-bought LP — but of course much of the impulse behind them was the ultra-rockist desire to be seen thumbing one’s nose at critical consensus. Witness the genuinely pathetic — it certainly provokes pathos in me — attempt to whip up controversy about the workmanlike plod of Kelly Clarkson, on a blog which, in its combination of hysterical overheating and dreary earnestness, is as boring as it is symptomatic — though, I have to confess I have never managed to get to the end of a single post, a problem I have with a great many “popist” writings, including the magnum opus of popism, Morley’s Words and Music.
Much as he occasionally flails and rails against popist commonplaces (see, for instance, his recent — I would argue unwarranted — attack on Girls Aloud), Morley is as deeply integrated into Deflationary Hedonic Relativist commonsense as Penman is excluded from it. What was the strangely affectless Words and Music if not a description of the OedIpod from inside? All those friction-free freeways, those inconsequent consumer options standing in for existential choices… Yet Morley is still a theorist of the ends of history and of music, still too obviously in love with intelligence to be fully plugged into the anti-theoretical OedIpod circuitry. Even so, Ian’s silence speaks far louder than Morley’s chatter, and, after my very few dealings with Old Media, I’m increasingly seeing Ian’s withdrawal, not as a tragic failure, but as a noble retreat.
All of UK culture tends to the condition of the clip show, in which talking heads — including, of course, Morley — are paid to say what dimwit posh producers have decided that the audience already thinks over footage of what everyone has already seen. I recently had dealings with an apparatchik of Very Old Media. What you get from representatives of VOM is always the same litany of requirements: writing must be “light”, “upbeat” and “irreverent”. This last word is perhaps the key one, since it indicates that the sustaining fantasy to which the young agents of Very Old Media are subject is exactly the same as the one in which popists indulge: that they are refusing to show “reverence” to some stuffy censorious big Other. But where, in the dreary-bright, dressed-down sarky snarky arcades of postmodern culture, is this “reverence”? What is the postmodern big Other if it is not this “irreverence” itself? (Only people who have not been in a university humanities dept for a quarter of a century — i.e. not at all your bog-standard Oxbridge grad Meeja employee/leisure-time popist — could really believe that there is some ruthlessly-policed high-culture canon. When Harold Bloom wrote The Western Canon it was as a challenge to the relativism that is hegemonically dominant in English Studies.) I’ve quickly learned that “light”, “upbeat” and “irreverent” are all codes for “thoughtless” and “mundanist”. Confronted with these values and their representatives — who, as you would expect, are much posher than me — I often encounter a cognitive dissonance, or rather a dissonance between affect and cognition. Faced with the Thick Posh People who staff so much of the media, I feel inferiority — their accents and even their names are enough to induce such feelings — but think that they must be wrong. It is this kind of dissonance that can produce serious mental illness; or — if the conditions are right — rage.
Anti-intellectualism is a ruling-class reflex, whereby ruling-class stupidity is attributed to the masses (I think we’ve discussed here before the ruse of the Thick Posh Person, whereby they make a show of pretending to be thick in order to conceal that they are, in fact, thick.) It’s scarcely surprising that inherited privilege tends to produce stupidity, since, if you do not need intelligence, why would you take the trouble to acquire it? Media dumbing down is the most banal kind of self-fulfilling prophecy.
As Simon Frith and Jon Savage long ago noted in their NLR essay, “The Intellectuals and the Mass Media”, which Owen Hatherley recently brought to my attention again, the plain common-man pose of the typical public school and Oxbridge-educated media commentator trades on the assumption that these commentators are far more in touch with “reality” than anyone involved in theory. The implicit opposition is between media (as transparent window-on-the-world transmitter of good, solid commonsense) and education (as out-of-touch disseminator of useless, elitist arcanery). Once, media was a contested ground, in which the impulse to educate was in tension with the injunction to entertain. Now — and the indispensable Lawrence Miles is incisive on this, as on so many other things, in his latest compendium of insights — Old Media is almost totally given over to a vapid notion of entertainment — and so, increasingly, is education.[4]
In my teenage years, I certainly benefited far more from reading Morley and Penman and their progeny than from the middlebrow dreariness of much of my formal education. It’s because of them, and later Simon and Kodwo, et al., that I became interested in theory and bothered to pursue it in postgraduate study. It is essential to note that Morley and Penman were not just an “application” of high theory to low culture; the hierarchical structure was scrambled, not just inverted, and the use of theory in this context was as much a challenge to the middle-class assumptions of Continental Philosophy as it was to the anti-theoretical empiricism of mainstream British popular culture. But now that teaching is itself being pressed into becoming a service industry (delivering measurable outputs in the form of exam results) and teachers are required to be both child minders and entertainers, those working in the education system who still want to induce students into the complicated enjoyments that can be derived from going beyond the pleasure principle, from encountering something difficult, something that runs counter to one’s received assumptions, find themselves in an embattled minority. Here we are now entertain us.
The credos of ruling-class anti-intellectualism that most Old Media professionals are forced to internalise are far more effective than the Stasi ever was in generating a popular culture that is unprecedentedly monotonous. Put it this way: a situation in which Lawrence Miles languishes, at the limits of mental health, barely able to leave his house, while the likes of Rod Liddle swagger around the mediascape is not only aesthetically abhorrent, it is fundamentally unjust. Contrary to the “it’s only hedonic stim” deflationary move that both Stekelmanites and popists share, popular culture remains immensely important, even if it only serves an essential ideological function as the background noise of a capitalist realism which naturalises environmental depredation, mental health plague and sclerotic social conditions in which mobility between classes is lessening towards zero.
A class war is being waged, but only one side is fighting.
Choose your side. Choose your weapons.
variations on a theme
Music critic Paul Morley has written a catalogue essay to accompany a recent installation by American artist Cory Arcangel, a couple thousand short films about Glenn Gould (2007). Or rather, Morley has assembled most of the text in the same way that Arcangel assembled his video montage — from fragments found on the internet. Arcangel’s installation consists of a version of Bach’s Goldberg Variations (1741) meticulously constructed from YouTube samples of individual notes played by amateurs. By making the connection between YouTube and Gould, the bricolages invite a comparison between user-generated content and the production methods of the modernist-creator figure.
Does user-generated content make possible a new form of artistry, prefigured in both Gould’s approach to the recording studio and in Wendy Carlos’ synthesizer renditions of Bach? Or are Gould and Carlos being positioned as anticipating the dissolution of the individual artist in an anonymous digital network?
Morley’s own position on these questions has been studiedly equivocal. Originally a journalist at the NME in the late 1970s, Morley has found himself gradually absorbed into the 1990s clip-show culture of chatty ephemera. His embrace of superficiality and gloss in the early Eighties played more than a small part in ushering that culture in, though what was envisaged as a revolt against post-punk austerity plays very differently in today’s pervading climate of populism. In the introductory section of Morley’s recent catalogue essay — seemingly the only section that he wrote as such — the text is positioned as the sequel to his 2003 book, Words and Music: A History Of Pop In The Shape Of A City. Morley is averse to definitive claims, but Words And Music seemed to want to establish a continuity between high modernism and pop at its most apparently disposable, a continuity exemplified by the book’s opening juxtaposition of Kylie Minogue and experimental composer Alvin Lucier. But Morley’s ultimate motive was artfully veiled by a spaghetti junction of convolutions and deferrals; it was unclear whether he sought to vindicate the avant-garde through its impact on popular culture or to ennoble pop via its incorporation of the avant-garde, or both, or neither.
His Arcangel essay retains a certain amount of this ambivalence, but, in its gnomic brevity, it is far more suggestive than the often tiresome Words And Music, which felt at times like being trapped inside an interminable series of iPod playlists. Via thumbnail portraits of the likes of Gould, Carlos, the BBC Radiophonic Workshop’s Delia Derbyshire, Genesis P-Orridge and Robert Moog, the montage follows a number of associative lines connecting music, transgendering and electronics. By paralleling Arcangel’s methodology, Morley might have wanted to imply that the electronic music of the Sixties, Seventies and Eighties paved the way for the networked world of user-generated content of which YouTube is a part. But the pop examples that figure in the text most insistently — Gary Numan, the Human League — belong not to this decade, but to a post-punk moment thirty years ago. Perhaps in spite of itself, the text ends up reading less like a justification of twenty-first-century popular culture and its modes of consumption and more like a requiem for a past moment of popular modernism, a lost circuit between pop, new technological developments and the avant-garde.
Morley’s text implicitly poses some of the questions which an essay in Philosophy Now by Alan Kirby addresses explicitly.[2] Kirby talks of a new type of “text” — a text we are all now very familiar with — “whose content and dynamics are invented or directed by the participating viewer or listener (although these latter terms, with their passivity and emphasis on reception, are obsolete: whatever a telephoning Big Brother voter or a telephoning 6-0-6 football fan are doing, they are not simply viewing or listening).” Oddly, Kirby labels these texts “pseudo-modernist”, arguing that this “pseudo-modernism” has now superseded postmodernism. Kirby’s understanding of postmodernism suffers from being exclusively derived from literary studies, which has defined postmodernism narrowly, in terms of a set of reflexive strategies based around so-called “meta-fictions” such as Vladimir Nabokov’s Pale Fire (1962). But far from marking a move beyond postmodernism, the shift from creator to recipient, from producer to consumer, that Kirby describes is exactly what the most acute theorists of postmodernism — Jean Baudrillard and Fredric Jameson — had long ago got to grips with. Reading Baudrillard’s texts from the 1970s, with their extended discussions of reality TV and the “referendum mode”, is to confront analyses that now seem preternaturally prescient. What has been made obsolete is not Baudrillard and Jameson’s mordant anticipations of the monotony that would ensue in the name of viewer and consumer “involvement”, but those positions which claimed that eroding the privilege of the author and the artist carries a subversive charge.
What Kirby calls the “new weightless nowhere of silent autism” has eroded the popular modernism which Morley once belonged to just as much as it has eliminated the high cultural resources of traditional modernism. As Kirby indicates, far from leading to new forms, user-generated content has tended towards retrenchment and consolidation — for example, YouTube (for the most part) recycles old material, or else provides a space in which millions of aspirant stars ape idols whose status — established by the old systems of distribution and valuation — remains secure. Instead of being cowed by the relentless demands for viewer participation, both cultural producers and the much-derided “gatekeepers” need to find new ways of asserting the primacy of production over consumption. They need to find ways of stepping outside seamless circuits in which “everyone” is implicated but no one gets what they want. In another catalogue essay for a couple thousand short films, curator Steven Bode argues that Arcangel’s installation is “less an advert for networked participatory culture than an index of people’s increasing atomisation”. If postmodern culture presents a kind of networked solipsism, perhaps what Gould can now teach us most is the value of disappearance from the screens that eagerly seek our image. Gould, who famously retired early from concert playing, showed that sometimes it is necessary to withdraw in order to find better ways to connect.
running on empty
In 2006, James Kirby, the man behind the V/Vm record label and the Caretaker, began a download project called The Death of Rave. The tracks have a thin, almost translucent quality, as if they are figments or phantoms of the original, exhilarating sound of rave. When I interviewed Kirby recently, he explained that the project had been initiated to commemorate a certain energy that he believes has disappeared from dance music. (Energy Flash was, of course, the title the critic Simon Reynolds gave to his compendious study of rave music and its progeny.) The question is: were rave and its offshoots jungle and garage just that — a sudden flash of energy that has since dissipated? More worryingly, is the death of rave only one symptom of an overall energy crisis in culture? Are cultural resources running out in the same way as natural resources are?
Those of us who grew up in the decades between the 1960s and the 1990s became accustomed to rapid changes in popular culture. Theorists of future shock such as Alvin Toffler and Marshall McLuhan plausibly claimed that our nervous systems were themselves sped up by these developments, which were driven by the development and proliferation of technologies. Popular artefacts were marked with a technological signature that dated them quite precisely: new technology was clearly audible and visible, so that it would be practically impossible, say, to confuse a film or a record from the early 1960s with one from even half a decade later.
The current decade, however, has been characterised by an abrupt sense of deceleration. A thought experiment makes the point. Imagine going back fifteen years in time to play records from the latest dance genres — dubstep, or funky, for example — to a fan of jungle. One can only conclude that they would have been stunned — not by how much things had changed, but by how little things have moved on. Something like jungle was scarcely imaginable in 1989, but dubstep or funky, while by no means pastiches, sound like extrapolations from the matrix of sounds established a decade and a half ago.
Needless to say, it is not that technology has ceased developing. What has happened, however, is that technology has been decalibrated from cultural form. The present moment might in fact be best characterised by a discrepancy between the onward march of technology and the stalling, stagnation and retardation of culture. We can’t hear technology any more. There has been a gradual disappearance of the sound of technological rupture — such as the irruption of Brian Eno’s analogue synth in the middle of Roxy Music’s “Virginia Plain”, or the cut-and-paste angular alienness of early rave — that pop music once taught us to expect. We still see technology, perhaps, in cinema CGI, but CGI’s role is somewhat paradoxical: its aim is precisely to make itself invisible, and it has been used to finesse an already established model of reality. High-definition television is another example of the same syndrome: we see the same old things, but brighter and glossier.
The principal way in which technology now makes itself felt in culture is of course in the areas of distribution and consumption. Downloading and Web 2.0 have famously led to new ways of accessing culture. But these have tended to be parasitic on old media. The law of Web 2.0 is that everything comes back, whether it be adverts, public information films or long-forgotten TV serials: history happens first as tragedy, then as YouTube. The pop artists who supposedly became successful because of web clamour (Sandi Thom, Arctic Monkeys) turned out to be quaintly archaic in form; in any case, they were pushed through the familiar promotional machinery of big record companies and PR firms. There is peer-to-peer distribution of culture, but little sign of peer-to-peer production.
The best blogs are one exception; they have bypassed the mainstream media, which, for the reasons described by Nick Davies in last year’s Flat Earth News, has become increasingly conservative, dominated by press releases and PR. In general, however, Web 2.0 encourages us to behave like spectators. This is not only because of the endless temptations to look back offered by burgeoning online archives, it is also because, thanks to the ubiquity of recording devices, we find ourselves becoming archivists of our own lives: we never experience live events, because we are too busy recording them.
Yet instantaneous exposure deprives cultures of the time and space in which they can grow. There is as yet no Web 2.0 equivalent of the circuit that sustained UK dance music in the 1990s: the assemblage of dubplates, pirate radio and the dance floor which acted as a laboratory for the development of new sounds. This circuit was still punctuated by particular moments (the club night, the radio broadcast), but, because anything in Web 2.0 can be replayed at any time, its temporality is more diffuse. The tendency seems to be for a kind of networked solipsism, a global system of individuals consuming an increasingly homogeneous culture alone in front of the computer screen or plugged in to iPod headphones.
All of this makes Fredric Jameson’s theories about postmodern culture’s inability to image the present more compelling than ever. As the gap between cultural breaks becomes ever longer and the breaks themselves become ever more modest and slight, it is beginning to look as if the situation might be terminal. Alex Williams, who runs the Splintering Bone Ashes blog, goes so far as to claim that “what we have experienced is merely a blip, perhaps never to be again repeated — 150 or so years of extreme resource bingeing, the equivalent of an epic amphetamine session. What we are already experiencing is little more than the undoubtedly grim ‘comedown’ of the great deceleration.” This might be too bleak. What is certainly clear, however, is that technology will not deliver new forms of culture all on its own.
you remind me of gold: dialogue with mark fisher and simon reynolds
Kaleidoscope Magazine: The first question is linked to my experiencing UK dance music of the Nineties as a person living in a different country — via imported records and the British music press — and one interesting thing was the idea of “futurism” that seemed to permeate the scenes: in terms of how the press presented the music as an area of advancement because it was made with “machines”. What, if any, are the futuristic elements and aspects in UK Nineties dance music and culture?
Simon Reynolds: The word “future” does not crop up in contemporary dance music discourse — in either the conversations surrounding the music, or in track titles and artist names — with anything like the frequency it did during the Nineties. From artists with names like Phuture, the Future Sound of London, Phuture Assassins, etc. to UK rave/early jungle which teemed with titles like “Futuroid”, “Living for the Future”, “We Are the Future”, etc., the whole culture seemed tilted forwards. Everyone was in a mad rush to reach tomorrow’s sound ahead of everyone else. That ethos continued into the early days of dubstep with the club name FWD». But looking at the last half-decade or so of UK dance music, I really struggle to think of any equivalent examples. Soul Jazz just put out a compilation of post-dubstep called Future Bass, and then you have the “future garage” sub-genre, although the irony here is that this direction involves going back to the 2step rhythm template circa 1998-2000. But generally speaking the whole idea of the future seems to have lost its libidinal charge for electronic producers and for fans alike. This seems to reflect the fact that dance music in the UK, and globally, is no longer organised along an extensional axis (projecting into the unknown, like an arrow fired into the night sky) but is intensive: it makes criss-crossing journeys within the vast terrain that was mapped out during the hyper-speed Nineties.
It seems symptomatic to me that “Gold”, the single off the debut album by Darkstar, is a cover of a Human League b-side from almost thirty years ago. It’s definitely an interesting move for Darkstar to make, in terms of their previous music and the scene they’re from, dubstep. But as an aesthetic act the creativity involved is curatorial rather than innovative in the traditional-modernist sense: it’s about finding an obscure, neglected song and re-situating it within the historical narratives of British electronic music. The whole idea of doing a cover version, which is totally familiar as an artistic move within rock, is still pretty unusual within electronic music culture. What also struck me listening to the remake next to the original (which I’d never heard before) is that both versions sound more or less as “futuristic” as each other. Well, the Darkstar reinterpretation obviously is technically more advanced in many ways; there are things done on it sonically that weren’t available to the Human League and their producer Martin Rushent. But in terms of the overall aesthetic sensation generated, neither version seems any further “into the future” than the other. Certainly, it doesn’t feel like there’s thirty years difference between the two. And it’s precisely that feeling — that the Human League are contemporary with us — that is so mysterious and hard to explain. They ought to sound to us as ancient as early Fifties fare (Johnny Ray, say, or Louis Jordan) would have done in 1981 heard next to the Human League of “Love Action”.
Mark Fisher: The problem is that the word “futuristic” no longer has a connection with any future that anyone expects to happen. In the Seventies, “futuristic” meant synthesizers. In the Eighties, it meant sequencers and cut and paste montage. In the Nineties, it meant the abstract digital sounds opened up by the sampler and its functions, such as time-stretching. In each of these cases, there was a sense that, through sound, we were getting a small but powerful taste of a world that would be completely different from anything we had hitherto experienced. That’s why a film like Terminator, with its idea of the future invading the present, was so crucial for Nineties dance music. Now, insofar as “futuristic” has any meaning, it is as a vague but fixed style, a bit like a typographical font. “Futuristic” in music is something like “Gothic” in fonts. It points to an already existing set of associations. “Futuristic” means something electronic, just as it did in the Sixties and Seventies. We’ve entered the flattened out temporality that Simon describes — the Nineties ought to be as distant as the Sixties felt in 1980, but now the Sixties, the Eighties and the Nineties belong to a kind of postmodern curatorial simultaneity.
To take up the example that Simon uses. When you compare the Darkstar cover of “Gold” to the Human League original, it’s not just that one is no more futuristic than the other. It is that neither are futuristic. The Human League track is clearly a superseded futurism, while the Darkstar track seems to come after the future. I should say at this point that the Darkstar album is my favourite album of the year — I’ve become obsessed with it. (It might be worth noting here that one thing that’s happened since 2000 in dance music is the rise of the album. The Nineties was about scenes and singles; there weren’t any great albums. But since 2000, there have been Dizzee Rascal’s debut, the Junior Boys records, the two Burial albums and the Darkstar record. The temporal malaise I’m talking about hasn’t meant there are no good records — that’s not the problem at all.) Partly why I enjoy the Darkstar album is because, like many of the most interesting records of the last six or seven years, it seems to be about the failure of the future. This feeling of mourning lost futures isn’t so explicit as it was with the Burial records, but I believe it’s there at some level with Darkstar. Where with Burial you have a feeling of dereliction and spectrality, the lost future haunting the dead present, with Darkstar it’s a question of electronic rot, digital interference.
What you could hear behind so much Nineties dance music was a competitive drive — to sonically rearticulate what “futuristic” meant. The No U Turn track “Amtrak” features a sample: “Here is a group trying to accomplish one thing, and that is to get into the future.” But I think it’s uncontroversial to say that no one was aiming to get into the future that actually arrived. If a junglist were pitched straight into now from the mid-Nineties, it’s hard to believe that they wouldn’t be disappointed and bemused. In the interview that I did with Kodwo Eshun which formed the appendix of Kodwo’s More Brilliant Than the Sun, he contrasts the textual exhaustion of postmodernism with the genetic concept of recombination. I think Kodwo captures very well the recombinatorial euphoria that many of us felt then — the sense that there were infinite possibilities, that new and previously unimaginable genres would keep emerging, keep surprising us. But, sadly, what’s surprising from that Nineties perspective is how little has changed in the last ten years. As Simon has said, the changes that you can hear now are not massive rushes of the future, but tiny incremental shifts. That deceleration has brought with it a sense of massively diminished expectations, which no amount of tepid boosterism can cover over. My friend Alex Williams has posited the idea that cultural resources have been depleted in the same way that natural resources were. Perhaps this is a reflection of today’s cultural depression in the same way that the Nineties concepts were an expression of that decade’s exhilaration.
This isn’t just about nostalgia for one decade — the Nineties was at the end of a process that began with the rapid development of the recording industry after the Second World War. Music became the centre of the culture because it was consistently capable of giving the new a palpable form; it was a kind of lab that focused and intensified the convulsions that culture was undergoing. There’s no sense of the new anywhere now. And that’s a political and a technological issue, not a problem that’s just internal to music.
SR: The Darkstar album could almost have been designed to please me: it’s the convergence of the hardcore continuum, hauntology, and post-punk and new pop! It’s growing on me, but initially I found it a bit washed-out and listless. Still, Mark’s reading of it is typically suggestive. And I do think it is significant that an outfit operating in the thick of the post-dubstep scene, the FWD» generation, has made a record steeped in echoes of Orchestral Manoeuvres (their first LP in particular was apparently listened to heavily during the album’s making), New Order, and other early-Eighties synthpop. It also means something that a record coming out of dance culture is all about isolation, regret, withdrawal, mournfulness.
The Darkstar record is an example of a self-conscious turn towards emotionality in UK dance. Most of the album features a human voice and songs, sung by a new member of the group recruited specifically for that role. And just this week I’ve read about two other figures from the same scene — James Blake and Subeena — who are releasing their first tracks to feature their own vocals. But this turn to expressivity seems to me as much rhetorical as it is actually going on in the music. After all hardcore, jungle, UK garage, grime, bassline house were all bursting with emotion in their different ways. What people mean by “emotional” is introspective and fragile in ways that we’ve rarely seen in hardcore continuum music. (Obviously we’ve seen plenty of that in IDM going back to its start: Global Communications and Casino in Japan actually made records inspired by the death of family members.) The idea that artists and commentators are groping towards, without fully articulating, is that dance music no longer provides the kind of emotional release that it once did, through collective catharsis. So there is this turn inwards, and also a fantasy of a kind of publicly displayed inwardness: the widely expressed artistic ideal of “I want my tracks to make people cry on the dancefloor”. Because if people were getting their release in the old way (collective euphoria), why would tears be needed.
MF: I think part of the reason I like the Darkstar record so much is that I don’t hear it as a dance record. In my view, it’s better heard almost as mainstream pop that has been augmented by some dance textures. “Aidy’s Girl is a Computer” apart, if you heard the record without knowing the history, you wouldn’t assume any connection with dubstep. At the same time, North isn’t straightforwardly a return to a pre-dance sound. Much has been made of the synthpop parallels but — and the cover of the Human League track brings this out — it doesn’t actually sound very much like Eighties synthpop at all. It’s more a continuation of a certain mode of electronic pop that got curtailed sometime in the mid-Eighties.
SR: In the Nineties, drugs — specifically Ecstasy — were absolutely integral to this communal release. One of the reasons hardcore rave was so hyper-emotional was because its audience’s brains were being flooded with artificially stimulated feelings, which could be elation and excitement but also dark or emotionally vulnerable (the comedown from Ecstasy is like having your heart broken). One thing that intrigues me about dance culture in the 2000s is the near-complete disappearance of drugs as a topic in the discourse. People are obviously still doing them, in large amounts, and in a mixed-up polydrug way just like in the Nineties. There have been a few public scares from the authorities and the mainstream media, like the talk about ketamine a few years ago, and more recently with mephedrone. But these failed to catalyse any kind of cultural conversation within the dance scene itself. It is as if the idea that choice of chemicals could have any cultural repercussions or effects on music’s evolution has completely disappeared. Compare that with the Nineties, where one of the main strands of dance discourse concerned the transformative powers of drugs. There was a reason why Matthew Collin called his rave history Altered State and why I called my own book Energy Flash. That was a reference to one of the greatest and most druggy anthems in techno — Beltram’s “Energy Flash” (which features a sample about “acid, ecstasy”) — but also to the more general idea of a psychedelics-induced flash of revelation or the “body flash” caused by stimulant drugs.
The turn to emotionality at the moment seems like an echo of a similar moment in the late Nineties, when the downsides of drugs were becoming clear and I started to hear from clubbing friends that they’d been listening to Spiritualized or Radiohead. But where that was a flight from E-motionality (from the collective high, now considered false or to have too many negative side effects, towards more introspective, healing music), the new emotionality in the post-dubstep scene is emerging in a different context. I’m just speculating here, but I wonder if it has anything to do with a dissatisfaction with Internet culture, the sort of brittle, distracted numbness that comes from being meshed into a state of perpetual connectivity, but without any real connection of the kind that comes from either one-on-one interactions or from being in a crowd. The rise of the podcast and the online DJ mix, which has been hyped as “the new rave” but is profoundly asocial, seems to fit in here.
KM: The concept of futurism also contains the idea that a cultural form can capture the zeitgeist of an era and facilitate/modulate the vision of the one to come, and by implication revolt against past cultural practices. In this case that might translate into the idea of “the sound of now”, a vastly common mood in UK dance music in the Nineties, and into the continuous reorganisation of labels, clubs, promoters and DJs into new networks and sub-genres that created an inbuilt obsolescence in the micro-scenes themselves. A sort of voluntary short-term memory imbalance that is hard to understand in the following decade — the Noughties — in which one of the most original and popular artists has been Burial, one visible manifestation of a fixation with the past that had previously reached similar levels in indie rock. Not to speak of the literalist approach of a very interesting artist like Zomby on “Where Were U in 92?”
SR: I was totally caught up in the Nineties rave culture and I can testify that there was a sensation of teleology, a palpable feeling that something was unfolding through the music. It would be easy to say in hindsight that this was an illusion but I’d rather honour the truth of how it felt at the time. On a month by month basis, you witnessed the music changing and there seemed to be a logic to its mutation and intensification. From hardcore to darkcore to jungle to drum ‘n’ bass to techstep, it felt like there was a destination, even a destiny, for the music’s relentless propulsion across the 1991 to 1996 timespan. I entered the scene in late 91, when the “journey” was already well underway, so you could say that the trajectory started as far back as 1988, when acid house originally impacted the UK.
Mine is a London-centric viewpoint, but similar trajectories were unfolding in Europe, with the emergence of gabba, and trance, or the evolution of minimal techno. There was a linear, extensional development, along an axis of intensification. Each stage of the music superseded the preceding one, like the stages of a rocket being jettisoned as it escapes the Earth’s atmosphere. And you are right that there was a forgetfulness, a lack of concern with the immediate past, because our ears were trained always on the future, the emerging Next Phase.
At a certain point the London-centric hardcore/jungle narrative took a swerve, slowing down in tempo and embracing house music’s sensuality, first with speed garage in 1997 and then with the even slower and sexier 2step. But that just seemed like a canny move to avoid an approaching dead end (one that drum ‘n’ bass would bash its collective head against for… ever since really!) The rhythmic complexification that had developed through drum ‘n’ bass carried on with speed garage and 2step, just in a less punitive way.
In the Noughties, especially in the last five years, the feeling you get from dance culture and the endless micro shifts within it is quite different — whatever the opposite of teleology is, that’s what you got! It is hard to identify centres of energy that could be definitively pinpointed as a vanguard. The closest thing in recent years might well be the populist “wobble” sector within dubstep, if only because there’s a kind of escalation of wobbleness going on there. There is a full-on, hardcore, take-it-to-extremes spirit to wobblestep. Ironically, the dubstep connoisseurs and scene guardians can’t stand wobble and have veered off into a disparate welter of softcore “musical” directions. Wobble is quite a masculinist sound, it reminds me of gabba. But then it is easy to forget that the Nineties was all about this kind of punishing pursuit of extremes: the beats and the bass were a test for the listener, something you endured as much as enjoyed (or had to take drugs in order to withstand). The evolution of the music was measurable in an experiential, bodily way. Beats got tougher and more convoluted, textures got more scalding to the ear, atmospheres and mood got darker and more paranoid.
Apart from grime and aspects of dubstep, Nineties post-techno music overall seems to have retreated into “musicality” (in the conventional sense of the word) and pleasantness. So instead of that militant-modernist sense of moving forward into the future, the culture’s sense of temporality seems polymorphous and recursive. And this applies on the micro as well as macro level: individual tracks seem to have less “thrust” and drive, to be more about involution and recessive details.
Touching on the question of rave nostalgia, the question “Where Were U In 92?” posed by Zomby is interesting on a bunch of levels. There is an echo, possibly unintended, of the marketing slogan for American Graffiti (“Where were you in 62?”, the year the movie is set), George Lucas’s groundbreaking vehicle for mobilising and exploiting generational nostalgia. Then there is also the unexpected biographical fact that Zomby is perfectly capable of saying where he was in 92, because he was twelve and a precocious fan of hardcore rave (which further suggests he must have just followed the trajectory of the music through jungle and speed garage to dubstep just like me and Mark, only quite a bit younger). Even as the album offers a loving pastiche of old skool hardcore, there seems to be an element of mockery of ageing ravers with their “boring stories of glory days” (to quote Springsteen). That would probably appeal to younger dubstep fans who, unlike Zomby, didn’t live through rave as participants and probably find the legacy of the hardcore continuum to be an encumbrance, a burden. Finally, it’s intriguing that Zomby did this pastiche record as a one-off stylistic exercise, in between much more cutting-edge dubstep records such as the Zomby EP on Hyperdub. It suggests that Zomby’s generation can play around with vintage styles without the kind of fanatical identification with a lost era that you generally get with musical revivalism. It’s just a period style, something to revisit.
MF: The point is that the question “Where were you in 92?” makes sense, whereas the question “Where were you in 02?” (or indeed 08) doesn’t. One of the things that has happened over the last decade or so is the disappearance of very distinctive “feels” for years or eras — not only in music but in culture in general. I’ve got more sense of what 1973 was like than what 2003 was like. This isn’t because I’ve stopped paying attention — on the contrary, I’ve probably paid more close attention to music this decade than at any other time. But there’s very little “flavour” to cultural time in the way there once was, very little to mark out one year from the next. That’s partly a consequence of the decline of the modernist trajectory that Simon describes. (One slight difference I have with Simon is that I prefer the term “trajectory” to “teleology”. For me, what was exciting about the Nineties — and popular culture between the Sixties and the Nineties — was that sense of forward movement. But it didn’t feel linear, as if everything was inevitably heading in one direction towards one goal. Instead, there was a sense of teeming, of proliferation.) If time is marked now, it’s by technical upgrades rather than new cultural forms or signatures. But the technical upgrades increasingly seem to be manifested in terms of the distribution and consumption of culture rather than in terms of production. You can’t hear or see dramatic formal innovations — but you get a higher definition picture, or a greater storage capacity on your mp3 player. Adam Harper, one of the most interesting young critics, has made a case for the new culture of micro-innovation, arguing that the kind of music culture Simon and I are talking about here — defined in terms of scenes organised around generic formulas — is an historical relic, replaced by a culture of a thousand tiny deviations, an “infinite music”, in which the temporal recursion that Simon has referred to is not a problem but a resource. Yet, for me, this sounds suspiciously like the Intelligent Dance Music that people were praising before the hardcore continuum came along. It’s easy to forget that disdain for the supposed vulgarity and repetitiveness of scene-music was a critical commonplace until Simon and Kodwo made the case for “scenius” in dance music.
But it seems to me that the phenomenon we’re talking about here — temporal flavourlessness — is a symptom of a broader postmodern malaise. Every time I go back to read Fredric Jameson’s texts from the Eighties and early Nineties, I’m astonished by their prescience. Jameson was quick to grasp the way in which modernist time was being flattened out into the pastiche-time of postmodernity. When I read some of those texts in the Nineties, I thought that they described certain tendencies in culture, but that this was far from being the only story. Now, there’s only a very weak sense of there being any alternative to the postmodern end of history. The question is, is this all temporary or terminal?
SR: I should have also noted that one of the main reasons a sense of linear progress was physically felt during the Nineties was that between 1990 and 1997, techno got faster: there was an exponential rise in beats-per-minute that accompanied all the other ways in which the music got harder, more rhythmically dense, and so forth. So as a dancer you felt like you were hurtling.
Mark mentions the idea of technical upgrades as the metric for a sense of progression in the last decade. This reminded me of a conversation I had with the Italian DJ and journalist Gabriele Sacchi. In the space of about fifteen minutes, Sacchi went from complaining that there had been no really significant formal advances in dance music since drum ‘n’ bass (he discounted dubstep, as I recall) to commenting with approval on how advanced-sounding records are now compared with ten years ago. What he meant is that they sounded better in terms of production quality: what’s available today in terms of technology, digital software, etc., to someone making, say, a house track, enables them to make much better-sounding records (in terms of drum sounds, the textures, the placement of sounds and layers in the mix). That sounded totally plausible to me and it may well be the defining quality of electronic dance music in the 2000s. You might say that the basic structural features of the various genres were established in the Nineties but what has improved is the level of detailing, refinement and a general kind of production sheen to the music. An analogy might be a shift from architectural innovation (the Nineties) to interior décor (the 2000s). Mark also mentions Fredric Jameson. His work — the big postmodernism book from 1991 but also, especially, A Singular Modernity — helped me see that rave in general and the UK hardcore continuum in particular had been a kind of enclave of modernism within a pop culture that was gradually succumbing to postmodernism. Coming out of street beats culture, with hardly any input from art schools and only the most vague, filtered-down notion of musical progress, it nonetheless constituted a kind of self-generated flashback to the modernist adventure of the early twentieth century. The hardcore continuum especially propelled itself forward thanks to an internal temporal scheme of continual rupturing: it kept breaking with itself, jettisoning earlier superseded stages. One small aside in A Singular Modernity struck me as both true and funny, when Jameson talks about the modernists being obsessed with measurement, “how do we determine what is really new?”. That struck me as the characteristic mindset of those who came up through the Nineties as critics. But the new generation of electronic music writers (and probably musicians too) don’t seem to respond to music in this way. It’s no longer about the lust for the unprecedented, about linear evolution and the rush into the unknown. It’s about tracking these endless involutionary pathways through the terra cognita of dance music history, the tinkering with inherited forms.
KM: Another topic I find very interesting is the fact that the dance music referred to as the hardcore continuum, even if it had an international resonance through the media, has maintained a strong local connotation and a somewhat insular development (in other closely related genres such as techno or house the localisation seemed less prominent, even if, for example, the first groundbreaking LP from the band Basement Jaxx resonates with a milieu of influences not too dissimilar to some other post-rave productions). Somehow some of the music in the continuum feels like a sonic cartography of London (or other cities in the UK), responding and being connected to very specific contexts. Is the geographical aspect something you use in your reception of these genres?
SR: Music from the hardcore continuum has obviously found audiences all over the world. The early breakbeat hardcore was universal rave music for a few years in the early Nineties. Jungle established scenes in cities from Toronto to New York to São Paulo and in its later incarnation as drum ‘n’ bass became a truly international subculture. The same applies to dubstep. And even the more London-centric styles like 2step and grime had really dedicated fans in countries all over the globe and small offshoot scenes in particular cities outside the UK. That said, it is incontrovertible that the engine of musical creativity for hardcore continuum genres has always been centred in London, with outposts in other urban areas of the UK that have a strong multiracial composition, particularly Bristol, the Midlands, and certain northern cities like Sheffield, Leeds, and Leicester. The next stage of the music has always hatched in London.
That is related to pirate radio, the competition between DJ and MC crews both within a particular station and between stations. And the sheer number of pirate radio stations owes a lot to the urban landscape of London: the number of tower blocks to broadcast from, the density of the population, and the existence of a sizeable minority (in both the racial and aesthetic sense) whose musical taste is not catered for by state-run radio or by the commercial radio stations (including the commercial dance station Kiss FM). This competition — expressed through the pirates striving to increase their audience share but also through raves and clubs competing for dancers — is partly economic and partly purely about prestige, aesthetic eminence. And it has stoked the furnace of innovation.
That London-centric system focused around illegal radio stations seems to be gradually disintegrating. It is still what fuels the funky house scene, whose primary audience is still “locked on” to the pirate signal. In fact I’m told that there aren’t many funky raves or clubs at all, and hardly any vinyl releases or compilations, so the only way to hear funky is through the pirate transmissions. But dubstep, like drum ‘n’ bass before it, is much more of a UK national scene, and also an international scene. Martin Clark, a leading journalist on the scene and also a DJ and recording artist using the name Blackdown, told me something interesting. The Rinse FM show that he and Dusk do, which is eclectic post-dubstep in orientation, gets a high proportion of its audience responses, messages and requests, through the internet, from as far afield as Finland or New Zealand (the Rinse FM signal goes out on the internet as well as broadcast through the air). But the pure funky house shows get most of their requests and calls as texts from cellphone users who live within the terrestrial broadcast range of the pirate stations. So funky is still a local scene in the traditional hardcore continuum sense, it is very much East London.
But I think that London-centric orientation is on the decline. Dubstep is fully integrated with the web, it’s all about podcasts and DJ mixes circulating on the web, about message board discussions. I think of funky as the “dwarf star” stage of the hardcore continuum: it has shrunk in size, still emits some heat in the sense of vibe and musical creativity, but it hasn’t been able to command attention beyond the pre-converted diehards, in the way that jungle or grime once did. If you look at funky, it’s the first hardcore continuum sound not to have any UK chart hits at all. It’s not spawned any offshoot scenes in foreign countries. It hasn’t achieved critical mass in the sense of non-dance specialist journalists giving it the time of day. Jungle and grime got mainstream coverage because they simply couldn’t be ignored, they were so aggressively new and extreme. But funky, to people who don’t follow the minutiae of the hardcore continuum, just sounds like “tracky” house music with slightly odd-angled beats and a London flavour. It’s not anthemic enough to make it as pop like 2step garage did, but it doesn’t have the vanguard credentials of jungle.
The interesting thing about the hardcore continuum is the way that during its prime it refuted all that Nineties internet and info-culture rhetoric about deterritorialisation. This was a music culture that derived its strength and fertility from its local nature, precisely from being territorialised. Indeed during the early days of jungle and of grime, it had a kind of fortress mentality. That seems to connect with its vanguardism, this military-modernist mindset.
Another thing is that the hardcore continuum genres were very slow to get integrated with the web. When I did early pieces on 2step garage and grime, the labels and artists had hardly any web presence. Nearly all the interviews I had to do by calling mobile phone numbers or speaking to people in person, rather than by email. It was only about 2005 that you started to get grime figures with Myspaces. It was only around then that you started to get tons of DJ sets being uploaded to the web. Before that the music was really hard to get hold of if you didn’t live in London, you had to mail order expensive 12”s and CD mixtapes. Now it is totally easy to stay on top of the music no matter where you live. But some of the romance and mystique of the scene has gone as a result.
MF: It’s not only UK dance music of the Nineties that is associated with cities; the whole history of popular music is about urban scenes. It’s no accident that Motown started in Detroit, house in Chicago, hip-hop in New York… Cities are pressure cookers which can synthesise influences quickly and in a way that is both collective and idiosyncratic. Scenes in cities depend on a certain organisation of space and time that cyberspace threatens. For example, the hardcore continuum depended on an ecology of interrelated infrastructural and cultural elements — pirate radio, dub plates, clubs, etc. — but it also relied on these elements being somewhat discrete. For instance, dub plates acted as probe heads, which would be tested out in clubs. But cyberspace has collapsed the differences between making a track at home, releasing it and distributing it. Now it’s possible to upload a track into cyberspace immediately, there’s less sense of occasion about a record release. So there’s a collapsing of time. But alongside this is a collapsing of the importance of spaces. Club spaces were important because of that “evental” time: you would be hearing a track for the first time… But now new tracks in DJs’ sets are immediately made available on YouTube. It goes without saying that the club experience is a collective experience — it gains much of its power from people experiencing the same thing in the same space. Cyberspace is much more individuated. Because it isn’t a “space” in the way that physical space is, you don’t get that sense of coming together. It’s more like being involved in a conversation than being in a crowd. Even with instant messaging, there’s a delay.
Clearly, there’s something potentially positive about people being able to make and release music without worrying about the costs of recording studios, about how it will be distributed and such like. But while this might remove certain obstacles for individuals making music, it’s not clear that cyberspace is good for music culture. Urban scenes compressed and concentrated things; cyberspace and digitality are in danger both of making culture too immediate (you can upload a track right now) and too deferred (nothing is ever really finished). The city-based music scene is perhaps one of the things you can hear being mourned on Burial’s records, with their many references to London. The “sonic cartography” of London you pick up from Burial’s records is in many ways a pirate-radio cartography.
KM: The international reception of some of the sounds in the continuum was as an alternative to what some perceived as the purely recreational hedonism of house music. In Italy, for example, jungle was embraced by the Centri Sociali (squats); maybe these were the genres that helped dissolve resistance towards dance music among non-clubbers. Maybe this was because of the persisting connections with Jamaican music, maybe because of the dystopian mood/control society references. But apart from this, I’d like to know: what is, in your opinion, the most significant political dimension of these genres?
SR: The major political significance of the hardcore continuum is the role it’s played in the emergence of a post-racial Britain. Which has not fully arrived, obviously there is still a lot of racism in Britain, but you could talk about jungle and UK garage especially as having created a post-racial “people” within the UK — it’s most obviously a force in the major cities like London and Birmingham and Coventry, but this tribe has members scattered all across the country. It’s not just the mix of black and white, it’s all sorts. I’m always amazed at the range of ethnicities involved, there’s people whose parents are from the Indian sub-continent, or who are Cypriot or Maltese, and you also get every imaginable mixed-race combination. Even talking just about “black Britain”, it’s not just people of Jamaican descent, there’s all the other islands in the Caribbean that have their own distinct musical traditions like soca and so forth, and there have also, more recently, been African immigrants, whose influence is really felt in the Afro flavours you can hear in funky house.
So it’s a really rich mix, but I guess the predominant musical flavours that run through the whole span of the continuum involve the collision of British artpop traditions (post-punk, industrial, synthpop) with Jamaica (reggae, dub, dancehall) and also black America (hip-hop, house, Detroit techno). And it’s very much a two-way street: it’s not just white British youth turning on to bass pressure and speaking in Jamaican patois, it’s about second-generation Caribbean-British youth freaking out to harsh Euro techno, having their minds blown by all that early Nineties music out of Belgium. Or someone like Goldie growing up on reggae and jazz-funk but also on groups like PiL and the Stranglers.
You might say that the music of the hardcore continuum reflects the emergence of this post-racial “people” within the UK more than it has created it. But I think it has sped up the process, by being so attractive and so obviously the cutting edge in British popular culture. People have been actively drawn into joining this tribe, it’s been an identity many have wanted to embrace, because it’s been the coolest music of its era and it’s been something to be proud of: a post-racial way of affirming Britishness.
So this I think is a major political achievement for the hardcore continuum. Some commentators like the music theorist Jeremy Gilbert have asked why that never translated into politicisation per se. At various points, particularly with jungle and with grime, there has been a sense that the music has been telling us things about society and what life is like for the British underclass. The darkness and paranoia of jungle (also carried on to an extent with dubstep), and the aggression and self-assertion of grime, reflect the gritty side of urban existence. But there is also a feeling, on my part certainly, that at a certain point simply reflecting Reality isn’t enough. Jungle and grime never really managed to get beyond being “gangster rave”, which is to say the British equivalent to gangster rap. So across its historical span it has oscillated between darkness (reflecting ghetto life) and brightness (dressing up and looking expensive, partying, dancing to sexy groovy music, chasing the opposite sex — that’s the side of the continuum that produced speed garage, 2step, funky house).
Apart from the post-racial aspect, the other major achievement of the hardcore continuum is the creation of an autonomous cultural space based around its own media (pirate radio) and its own economic infrastructure (independent labels and record stores). Pirate radio seems particularly significant: the fact that it is community radio, offering the music for free, and that it is amateur, with DJs and MCs actually paying to play (they have to cough up a subscription fee for their air time, to pay for equipment that is lost when the authorities seize transmitters and so forth). Pirate radio is important also because it is public: the culture is underground, but this is an audible underground, it is broadcast terrestrially, blasting out onto the airwaves of London or the other big UK cities. It’s a community asserting its existence on the FM radio spectrum. This means that people who don’t like the music or the social groups it represents will stumble on it, but also that people who don’t know about the music will encounter it — potential converts to the movement. If the pirates went completely online, it would cease to be an underground, it would become much more just a niche market of marginal music going out almost entirely to the pre-converted. The paradox of music undergrounds is that the idea is not really to be totally underground, invisible to the mainstream and the cultural establishment. You don’t want to be ignored, you want to be a nuisance! And there is also an interaction between the undergrounds and the mainstream, where ideas from below force their way up into the mainstream and enrich and enliven it. Which then forces the underground to come up with new ideas. That process worked for a really long time with the hardcore continuum: it would develop new ideas that were so obviously advanced and compelling that the major labels would sign artists and big radio stations like the BBC and Kiss FM would recruit DJs to host regular shows. It seems to have broken down with funky house, though, it’s the first hardcore continuum genre to just stay in its ghetto.
MF: In my book Capitalist Realism, I quote an article that Simon wrote on jungle for the Wire magazine. Simon put his finger there on how crucial the concept of “reality”, of “keeping it real”, was for both jungle and US rap. Simon writes of an implied political position in jungle: how it was anti-capitalist but not socialist. That always struck me as very suggestive — but these politics were never developed. I would tend to agree with Jeremy Gilbert — that the encounter between jungle and politics never really happened. But this wasn’t only a failure of the music; it was also a failure of politics. During the Nineties, the British Labour Party courted the reactionary rockers of Britpop. But where was the politics that could synchronise with the science fictional textures that jungle invoked?
So yes, Simon is right, if the hardcore continuum had any impact on politics it was in playing a part in establishing a post-racial Britain. It was impossible to fit jungle into a pre-existing racial narrative — it didn’t sound like “white” or “black” music. And the extent to which the hardcore continuum has helped to consolidate this sense of the post-racial was made clear by an hilarious recent piece in Vice magazine called “Babes of the BNP”, in which female supporters of the far right British National Party were interviewed. One question was: “In terms of the BNP’s repatriation policy on immigration, if you had to choose, who would you repatriate first, Dizzee Rascal or Tinchy Stryder?”
militant tendencies feed music
The idea that music can change the world now seems hopelessly naïve. Thirty years of neoliberalism have convinced us that there is no alternative; that nothing will ever change. Political stasis has put music in its place: music might “raise awareness” or induce us to contribute to a good cause, but it remains entertainment. Yet what of music that refuses this status? What of the old avant-garde idea that, to be politically radical, music has to be formally experimental?
The artist Michael Wilkinson’s show Lions After Slumber (exhibited last year at the Modern Institute in Glasgow) posed these questions with a quiet intensity. The show was a kind of reliquary for a bygone militancy. It was dominated by an enormous black-and-white print of the photograph of Piccadilly Circus that had hung — upside down — in Malcolm McLaren and Vivienne Westwood’s shop Seditionaries. A stretched linen included the 1871 photograph of the Paris Communards standing over the toppled Vendôme Column — but the image had been turned on its side, so that it looked as if the restored emperor was once again lording it over the Communards, who now resembled corpses.
There was no music to be heard at the show, but there were references to music scattered throughout. A screen-printed mirror showed the face of Irene Goergens, a member of the Red Army Faction — but the image came from the album sleeve to Raw Macro, by the techno artist Farben. More importantly, the title of the exhibition was a reference to Scritti Politti’s 1982 track “Lions After Slumber”. Scritti had themselves borrowed the title from Shelley’s 1819 poem “The Masque of Anarchy”, which imagined a rising “like lions after slumber/In unvanquishable number” to avenge the dead of the Peterloo Massacre.
The allusion to Scritti Politti makes it clear that the vision of politics that Wilkinson’s show simultaneously mourned and invoked was derived from post-punk — the outpouring of musical creativity in the late 1970s and early 1980s that was in many ways Britain’s version of Paris 68. In line with the Marxist and situationist theory it drew on and referenced, post-punk grasped culture as inherently political, insisting on a version of politics that went far beyond parliamentarianism.
One of the most urgent tasks for any political music was to expose the pacifying mechanisms that were already secreted in popular culture — nowhere more obviously than in the cheap dreams of love songs, which groups such as Gang of Four and the Slits deconstructed in tracks such as “Anthrax” and “Love und Romance”. In a world in which people increasingly felt as if they lived inside advertisements — where, as Gang of Four put it, at home they felt like tourists — there was nothing more ideological than culture’s claim to be entertainment. That was the word that provided the ironic title for Gang of Four’s debut LP, and was also used in one of the Jam’s most bitterly sarcastic songs, “That’s Entertainment”.
Wilkinson’s show was timely because post-punk was one of the spectres that loomed over the past decade. Its history was extensively catalogued in Simon Reynolds’ book Rip It Up and Start Again; the music was pastiched by lumpen plodders such as Franz Ferdinand and the Kaiser Chiefs, and served up again by originals such as Gang of Four, Magazine and Scritti, all of which reformed. The return of the post-punk sound had a double effect. At one level, it constituted the music’s final defeat — if conditions were such that these groups could come back, thirty years after the fact, and not even sound particularly out of date, then post-punk’s scorched-earth injunction that music should constantly reinvent itself must be as dead as its hopes for a revivified politics. Yet even the most degraded simulations of post-punk style carry with them a certain spectral residue, a demand — which these simulacra themselves betray — that music be more than consolation, convalescence or divertissement.
At the end of history, the impasses of politics are perfectly reflected by the impasses in popular music. As political struggle gave way to petty squabbles over who is to administrate capitalism, so innovation in popular music has been supplanted by retrospection; in both cases, the exorbitant ambition to change the world has devolved into a pragmatism and careerism. A certain kind of depressive “wisdom” predominates. Once, things might have seemed to happen, but we won’t get fooled again. Like the images in Wilkinson’s Lions After Slumber, the world has been turned the right way up again. The emperor is on his feet, power and privilege are restored, and any periods when they were toppled seem like ludic episodes: fragile, half-forgotten dreams that have withered in the unforgiving striplights of neoliberalism’s shopping mall.
In his study of the Sex Pistols, Lipstick Traces: A Secret History of the 20th Century — published in the politically resonant year 1989 — Greil Marcus impersonated this depressive wisdom. “By the standards of wars and revolution”, he conceded,
the world did not change; we look back from a time when, as Dwight D. Eisenhower put it, “Things are more like they are now than they ever were before.” As against the absolute demands so briefly generated by the Sex Pistols, nothing changed […] Music seeks to change life; life goes on; the music is left behind; that is what is left to talk about.
In fact, Marcus argues, the Pistols and those who followed them did change the world, not by starting a war or a revolution, but by intervening in everyday life. What had seemed natural and eternal — and which now appears to be so again — was suddenly exposed as a tissue of ideological presuppositions. This is a vision of politics as a kind of puncturing, a rupturing of the accepted structure of reality. The puncture would produce a portal — an escape route from the second-nature habits of everyday life into a new labyrinth of associations and connections, where politics would connect with art and theory in unexpected ways. When songs ceased to be entertainment, they could be anything. These punctures felt like abductions.
Abduction was what it felt like on first listening to Public Enemy. Like the post-punks, Public Enemy implicitly accepted the idea that a politics which came reassuringly dressed in established forms would be self-defeating. The medium was the message, and Public Enemy’s astonishing militant montage was remarkable for both its rabble-rousing sloganeering and its textural experimentalism. When the group’s music, produced by the Bomb Squad, looped fragments of funk and psychedelic soul into abstract noise, it was as if American history — now cut up into a science-fiction catastrophe, a permanent emergency — was made malleable and ripe for rapid-fire retelling from the perspective of a post-Panther black militancy.
Or there was the very different approach of Detroit’s Underground Resistance: in contrast to the data-density of the rap of Public Enemy’s Chuck D, they offered a largely voiceless take on techno, pursuing a strategy of stealth and invisibility, drawing listeners into a suggestive semiotic fog created by track titles (such as “Install ‘Ho-Chi Minh’ Chip”) and sleeve imagery that combined political insurgency with Afrofuturist science fiction.
What Public Enemy and Underground Resistance had in common was a rejection of the idea of music as entertainment. Instead of minstrelsy, they conceived of music in the militaristic terms explored in Steve Goodman’s recent book, Sonic Warfare: Sound, Affect and the Ecology of Fear. In this model, the use of music to subdue populations — the “psychoacoustic correction” directed by the US army against the Panamanian dictator Manuel Noriega; “sound bombs” deployed over the Gaza Strip — is by no means unusual. All music functions either to embed or to disrupt habituated behaviour patterns. Thus, a political music could not be only about communicating a textual message; it would have to be a struggle over the means of perception, fought out in the nervous system.
Underground Resistance saw their mission as fighting against “mediocre audiovisual programming”. Yet the problem is that the controllers have been all too successful in propagating this mediocrity. Where Public Enemy and Underground Resistance conceived of music as education, the dominant culture has been reclaimed by a Tin Pan Alley populism that has once again reduced music to entertainment. The internet and the iPod are part of a new economy of musical consumption in which, thus far, the possibilities of being abducted seem attenuated. In a world of niches, we are enchained by our own consumer preferences.
What is lacking in the age of Myspace is the public space that could surprise or confound our understanding of ourselves. Where, today, is the equivalent of the Top of the Pops stage, which could suddenly be invaded by the unexpected? Ironically, it is something such as The X Factor; the campaign to get Rage Against the Machine to the Christmas number-one slot was evidence of a hunger for music that was not just entertainment.
We are in a time of transition. Jacques Attali once argued that fundamental changes in the economic organisation of society were always presaged by music. Because, as a result of downloading, recorded music now seems to be heading towards decommodification, what does this suggest for the rest of the culture? And we are yet to hear the impact that the financial crash and its aftermath will have on musical production. The collapse of neoliberalism has already led to a simmering, renewed militancy on university campuses and elsewhere — how will this translate into sound? Perhaps soon we will once again hear new music that aims to turn the world upside down.
autonomy in the uk
When the Real rushes in, everything feels like a film: not a film you’re watching, but a film you’re in. Suddenly, the screens insulating us late-capitalist spectators from the Real of antagonism and violence fell away. Since the student revolts in late 2010, helicopters, sirens and loudhailers have intermittently broken the phoney peace of post-crash London. To locate the unrest spreading across the capital, you just had to follow the Walter Murch-chunter of chopper blades… So many times during 2011, you found yourself hooked to news reports that resembled the scene-setting ambience in an apocalyptic flick: dictators falling, economies crashing, fascist serial killers murdering teenagers. The news was now more compelling than most fiction, and also more implausible: the plot was moving too quickly to be believable. But the sheen of unreality it generated was nothing more than the signature of the unscreened Real itself.
Sound was at the core of one of the year’s momentous stories, the still unravelling “hackgate” narrative of national newspaper journalists caught out cracking the mobile phone messages of public figures and the grieving relatives of crime victims for story leads. After Hackgate, the UK power elite looked like something out of David Peace’s Red Riding Quartet or The Wire television series (which itself turned on the moral issues of secretly recording phone conversations). The complicities of interest and mutual fear exposed by the phone hacking story brought to mind the party scene in the 1974 episode of Channel 4’s TV adaptation of Peace’s novels, where the illicit hedonism and skulduggery of cops, hacks, corporate plutocrats, private investigators — friends and ostensible adversaries — illustrated the true meaning of David Cameron’s notorious phrase “we’re all in this together”. In 2011, we were living the film; all that was missing was the soundtrack.
At the end of 2010, the BBC’s economics editor Paul Mason wrote a blog post called “Dubstep Rebellion”, which described a pivotal moment he witnessed in the 9 December student protests: when the “crucial jack plug” of a sound system playing “political right-on reggae” was pulled by a “new crowd — in which the oldest person is maybe 17” and replaced with what he mistakenly believed to be dubstep. He was corrected by Guardian contributor and author of Kettled Youth, Dan Hancox, whose own blog posted a playlist of the tracks he heard at the protest. They turned out to be mostly grime and dancehall (Lethal B, Elephant Man, Vybz Kartel), alongside chart rap and R&B such as Rihanna and Nicki Minaj. What’s striking is the lack of explicit political content in any of this music. Yet grime, dancehall and R&B have a grip on the present in the way that older forms of self-consciously political music don’t, and here is the impasse. It’s as if we’re left with a choice between the increasingly played out feel of “politically engaged” music and the sound of the present. In the past year alone, the Guardian has run numerous articles bemoaning the lack of “protest” music, but for many of us, “protest” has always been a rather pallid model of what political music could be. Besides, it’s not protest music that has disappeared: go to the Occupy camp outside St Paul’s and you won’t find a shortage of acoustic guitars. What’s missing is a specifically twenty-first-century form of political music. While there are some grime tracks that can be understood as having a political message, for the most part the genre’s political significance lies in the affects — of rage, frustration and resentment — to which it gives voice. By contrast with US hip-hop, grime remains a form that is bound up with the failure to make it. The situation of grime is an allegory of class destiny. Just as it’s possible for some to rise from the working class but not with it, so it’s possible to rise out of grime (as artists such as Professor Green and Tinie Tempah have proven with their many crossover hits), but it’s not yet been possible for anyone to succeed as a grime artist.
Paul Mason acknowledges his failure to correctly identify what was played at the 2010 protests in his new book, Why It’s Kicking Off Everywhere: The New Global Revolutions. Notwithstanding his inability to correctly track the changes in urban dance music, however, his original blog post was prophetic. After 9 December, the student protests lost momentum. The major moments of dissent in 2011 — which would also be the most powerful explosion of working-class rage in the UK since the riots of the early 1980s — would come from the group that Mason identified as “banlieue-style youth from places like Croydon and Peckham, or the council estates of Camden, Islington and Hackney”. As with some of the 1980s riots, the immediate cause for the UK’s first major uprising of 2011 was the death of a black person, Mark Duggan, shot by the police in Tottenham. “25 years ago police killed my grandma in her house in Tottenham and the whole ends rioted, 25 years on and they’re still keepin up fuckry”, tweeted Tottenham MC, Scorcher. His grandmother was Cynthia Jarrett, whose death prompted the Broadwater Farm riots in 1985. Dan Hancox mentioned this tweet in a piece about British urban music and the riots for the Guardian, a crucial journalistic intervention at a vertiginously scary moment when the authoritarian and racist right were using the unrest as the pretext for reheating discourse that would have been deemed unacceptable only a week before. In an extraordinary but typically incoherent rant on the BBC’s Newsnight, TV historian David Starkey astonishingly blamed the riots on “black culture” — collapsing the whole of black culture into music, and all black music into a poorly understood version of gangster rap. Like much of what happened in 2011, Starkey’s delirious diatribe is best understood as a symptom: in this case of ruling-class panic and ignorance. Starkey dismissed the idea that the riots were political on the grounds that no public buildings were attacked — but what meaning do public buildings have for youth who were born into a social landscape in which the very concept of the public has all but disappeared under sustained ideological attack? The fact that the rioters targeted chain retail outlets was blamed on their “consumerism”; as if such “consumerism” were some kind of collective moral failing rather than the inevitable consequence of immersion in late capitalism’s media culture.
As Owen Jones pointed out in his book Chavs: The Demonisation of the Working Class, work, not some lost moral sensibility, was once the source of working-class discipline. But what happens to people with no expectation of work, or of any kind of meaningful future? “When the punks cried ‘No Future’, at the turning point of 1977, it seemed like a paradox that couldn’t be taken too seriously”, Italian theorist Franco “Bifo” Berardi writes in his most recent book After The Future:
Actually, it was the announcement of something quite important: the perception of the future was changing […] Moderns are those who live time as the sphere of a progress towards perfection, or at least towards improvement, enrichment and rightness. Since the turning point of the century — which I like to place in 1977 — humankind has abandoned this illusion.[2]
Having once decried the failure of the future, music has increasingly become part of this inertial temporality. Nothing symbolises mainstream music’s relationship to politics better than the BBC’s coverage of U2’s set at Glastonbury. The significance here was not the music — predictably moribund and lacklustre, no longer even capable of mustering the totalitarian pomp of yore — but the way in which the TV coverage ignored the protest by Art Uncut. U2 were treated like dignitaries from the Chinese government: dissenters threatening to disrupt the empty rituals of the rock emperors wouldn’t be tolerated. Where once even the most incorporated rock registered something about the tensions and temperature of the times, now you go to rock to be insulated from the present. Both U2 and their fellow headliner Beyoncé made gestures to “politics” in their sets — past struggles now reduced to an advertiser-friendly hopey-changey sentimentalism covering over a deeper, more pervasive sense that nothing of any consequence can ever change. Yet if mainstream pop has become a bubble impermeable to the new times, it’s not as if experimental culture has yet come up with forms capable of articulating the present either. The art world’s political mobilisations — via groups such as Art Against Cuts — have been more impressive than much of the actual engaged art itself, which has too often remained caught in a mode of pious inconsequence and textural poverty.
What has been lost is the transit between experimental and popular culture which characterised earlier eras. But what the student movement has been trying to prevent is nothing less than the dismantling of the last elements of the infrastructure which made this exchange possible; free higher education, after all, was one of the means by which British music culture was indirectly funded. Perhaps that is why Gang Of Four’s “He’d Send In The Army”, Mark Stewart and the Maffia’s As the Veneer of Democracy Starts To Fade or Test Department’s The Unacceptable Face of Freedom — records made more than a quarter of a century ago — still have more purchase on the traumatic and tumultuous events of the year in the UK than anything produced by a white musician in 2011. Recalling a conversation with Green Gartside at the Wire’s Off The Page festival of writing about music in February, it’s telling that today has no equivalent to Green’s post-punk anxieties about articulating new relationships between music and politics. Yet if this disconnection is bad for culture, it might be good for politics. For if music and subculture no longer act as effective mechanisms for controlled desublimation, converting disaffection into culture which can in turn be transformed into entertainment — feeding what Jean-François Lyotard memorably called the “Tungsten-Carbide stomach” of capital, which omnivorously consumes anything, and excretes it as commodities — then discontent can appear in a rawer form. This might be the reason that uber-reactionary Jeremy Clarkson has urged those at St Paul’s to stop camping and start writing protest songs.
It could be, however, that our thinking about the problem is wrong-headed. It isn’t that music is lagging behind politics; the politics itself is missing. The major political event of the year in the UK was the riots, but they were political in a negative sense. Reactionary commentators attempted to evacuate the riots of any political content by classifying them as an outburst of criminality. But even if we reject this for the absurdity it plainly is, it’s possible to regard the riots as symptomatic — a symptom, precisely, of the failure of politics. “Harming one’s own community is entirely mindless, but why would someone care for a community that doesn’t care for him?” Professor Green asked Dan Hancox. “They might think of this as an uprising, but the anger is misdirected and conveyed in such a way will not have any kind of positive effect.” Wiley also saw the riots as a sign of impotence: “They’re saying ‘We’re going to do what we want!’ — and I’m thinking ‘No you’re not, because when the police get a grip on it, you’re going to be either banged up, or dead’.” With the Draconian prison sentences imposed on many of those who played even a minor role in the riots, Wiley’s prediction has been vindicated. Ceasing to be a symptom is one definition of achieving political agency, and — in a world where professional politicians look like inert mannequins incapable of preventing multiple impending catastrophes — nothing could be more urgent than this.
It’s clear that this agency will not in the first instance be achieved through the hollowed out, decadent spaces of parliamentary politics. The political movement with which Franco Berardi is most associated, autonomism, has assumed a central importance amongst the political struggles that are coalescing in the UK and elsewhere. Consider, for example, the autonomist-influenced “ultra-leftist propaganda machine” called Deterritorial Support Group, whose blog became a crucial hub for new political thinking in the UK. Steeped in electronic music culture, DSG are as significant for their political aesthetics as for any substantive political position they present: what they offer is a new form of political antagonism far beyond the folksiness of “protest music”, capable of operating across the cyberspatial, mediamatic and designer terrains of contemporary culture. This is politics as Underground Resistance’s Electronic Warfare. In the era of hacking collectives such as Lulzsec, Anonymous and Wikileaks, DSG recognise that cyber-insurgency can open up a new kind of political insurgency. With the Diamond Jubilee and the Olympics, not to mention Mayan prophecies of apocalypse, 2012 is shaping up to be the most symbolically charged year in the UK since 1977. Is this the year when No Future will finally come to an end?
the secret sadness of the twenty-first century: james blake’s overgrown
A certain trajectory seems to have come to an end with James Blake’s new album, Overgrown. Blake has gone from digitally manipulating his own voice to becoming a singer; from constructing tracks to writing songs. The initial motivation for Blake’s early work no doubt came from Burial, whose combination of jittery two-step beats and R&B vocal samples pointed the way to a twenty-first-century pop. It was as if Burial had produced the dub versions; now the task was to construct the originals, and that entailed replacing the samples with an actual vocalist.
Listening back to Blake’s records in chronological sequence is like hearing a ghost gradually assume material form; or it’s like hearing the song form (re)coalescing out of digital ether. A track such as “I Only Know (What I Know Now)” from the Klavierwerke EP is gorgeously insubstantial — it’s the merest ache, Blake’s voice a series of sighs and unintelligible pitch-shifted hooks, the production mottled and waterlogged, the arrangement intricate and fragile, conspicuously inorganic in the way that it makes no attempt to smooth out the elements of the montage. The voice is a smattering of traces and tics, a spectral special effect scattered across the mix. But with Blake’s self-titled debut album, something like traditional sonic priorities were restored. The reinvention of pop that his early releases promised was now seemingly given up, as Blake’s de-fragmented voice moved to the front of the mix, and implied or partially disassembled songs became “proper” songs, complete with un-deconstructed piano and organ. Electronics and some vocal manipulation remained, but they were now assigned a decorative function. Blake’s blue-eyed soul vocals, and the way that his tracks combined organ (or organ-like sounds) with electronica, made him reminiscent of a half-speed Steve Winwood.
Many who were enthusiastic about the early EPs were disappointed or mildly dismayed by James Blake. Veiling and implying an object is the surest route to producing the impression of sublimity. Removing the veils and bringing that object to the fore risks de-sublimation, and some found Blake’s actual songs unequal to the virtual ones his early records had induced them into hallucinating. Blake’s voice was as cloyingly overpowering as it was non-specific in its feeling. The result was a quavering, tremulous vagueness, which was by no means clarified by lyrics that were similarly allusive/elusive. The album came over as if it were earnestly entreating us to feel, without really telling us what it was we were supposed to be feeling. Perhaps it’s this emotional obliqueness that contributes to what Angus Finlayson, in his review of Overgrown for FACT,[2] characterises as the strangeness of the songs on James Blake. They seemed, Finlayson says, like “half-songs, skeletal place-markers for some fuller arrangement yet to come.” The journey into “proper” songs was not as complete as it first appeared. It was like Blake had tried to reconstruct the song form with only dub versions or dance mixes as his guide. The result was something scrambled, garbled, solipsistic, a bleary version of the song form that was as frustrating as it was fascinating. The delicate insubstantiality of the early EPs had given way to something that felt overfull. It was like drowning in a warm bath (perhaps with your wrists cut).
On Overgrown, the post-rave tricks and tics have been further toned down, and the album is at its weakest when it limply flirts with the dancefloor. Piano is still the lead instrument, but the chords hang over a backing that is almost studiedly anonymous — a luxuriantly warm pool of electronics where the rhythm is propelled more by the gently eddying bass rather than the beats. Like James Blake, though, Overgrown repays repeated listening. As with the first album, there is a simultaneous feeling that the tracks are both congested and unfinished, and that incompleteness — the sketchy melodies, the half-hooks, the repeated lines that play like clues to some emotional event never disclosed in the songs themselves — may be why it eventually gets under your skin. Blake has said that, by contrast with his debut, Overgrown sounds like the work of a man who has experienced love. For me, it is as emotionally enigmatic as its predecessor. The oddly indeterminate — irresolute and unresolved — character of Blake’s music gives it the quality of gospel music for those who have lost their faith so completely that they have forgotten they ever had it. What survives is only a quavering longing, without object or context, Blake coming off like an amnesiac holding on to images from a life and a narrative that he cannot recover. This “negative capability” means that Overgrown is like an inversion of the oversaturated high-gloss emotional stridency of chart and reality TV pop, which is always perfectly certain of what it is feeling.
But what is the faith that Overgrown has lost? Blake’s development has paralleled that of Darkstar, who similarly moved from the tricksy, tic-y vocal science of “Aidy’s Girl is a Computer” to the chilly melancholia of their first album, North. Their new record News From Nowhere has a brighter, dreamier feel, but, as with Overgrown, it is notable for its lack of designs on the dancefloor. In a discussion that Simon Reynolds and I had about UK dance music,[3] Reynolds argued that the “emotional turn” represented by Blake and Darkstar was an implicit acknowledgement that “dance music no longer provides the kind of emotional release that it once did, through collective catharsis.” The music doesn’t have to be explicitly sad for this to be the case — there is a melancholia intrinsic to the very turn inward. As Reynolds points out, the idea that Nineties dance music was unemotional is a fallacy. This was a music saturated with affect, but the affect involved wasn’t associated with romance or introspection. The twinning of romance and introspection, love and its disappointments, runs through twentieth-century pop. By contrast, dance music since disco offered up another kind of emotional palette, based in a different model of escape from the miseries of individual selfhood.
In the twenty-first century, there’s an increasingly sad and desperate quality to pop culture hedonism. Oddly, this is perhaps most evident in the way that R&B has given way to club music. When former R&B producers and performers embraced dance music, you might have expected an increase in euphoria, an influx of ecstasy. Yet the digitally-enhanced uplift in the records by producers such as Flo-Rida, Pitbull and will.i.am has a strangely unconvincing quality, like a poorly photoshopped image or a drug that we’ve hammered so much we’ve become immune to its effects. It’s hard not to hear these records’ demands that we enjoy ourselves as thin attempts to distract from a depression that they can only mask, never dissipate.
A secret sadness lurks behind the twenty-first century’s forced smile. This sadness concerns hedonism itself, and it’s perhaps in hip-hop — the genre that has been most oriented to pleasure over the past twenty-odd years — where this melancholy has registered most deeply. Drake and Kanye West are both morbidly fixated on exploring the miserable hollowness at the core of super-affluent hedonism. No longer motivated by hip-hop’s drive to conspicuously consume — they long ago acquired anything they could have wanted — Drake and West instead dissolutely cycle through easily available pleasures, feeling a combination of frustration, anger, and self-disgust, aware that something is missing, but unsure exactly what it is. This hedonist’s sadness — a sadness as widespread as it is disavowed — was nowhere better captured than in the doleful way that Drake sings, “we threw a party/yeah, we threw a party,” on Take Care’s “Marvin’s Room”.
It’s no surprise to learn that Kanye West is an admirer of James Blake’s. Meanwhile, this mix[4] that was doing the rounds a couple of years ago drew parallels between Blake and Drake. There’s an affective as well as sonic affinity between parts of Kanye’s 808s and Heartbreak and My Beautiful Dark Twisted Fantasy and Blake’s two albums. You might say that Blake’s whole schtick is a partial re-naturalisation of the digitally manipulated melancholy Kanye auditioned on 808s: soul music after the Auto-Tune cyborg. But liberated from the penthouse-prison of West’s ego, the disaffection languishes listlessly, incapable of even recognising itself as sadness. Unsure of itself, caught up in all kinds of impasses, yet intermittently fascinating, Overgrown is one more symptom of the twenty-first century’s identity crisis.
review: david bowie’s the next day
If you’re interested in The Next Day — and even if you aren’t — you’ve probably heard it by now. Heard it, been disappointed by it, ceased caring about it. The only really twenty-first-century thing about The Next Day is the way it exemplifies the hype velocity of current communication: artfully timed PR rumours, hints and hyperbole induce anyone within its range to hallucinate a sublime object behind the veil, only for that object to degenerate into quotidian mediocrity the very second we’ve downloaded it.
The willingness to hallucinate is certainly there. Witness the sheer heft of the coverage, and feel the desperation behind it. The prospect of Bowie’s return was guaranteed to tickle the palate of listeners of a certain age, but the desires that it triggered were also for something missing from contemporary popular music. These days, Bowie stands for all the lost possibilities going by the idea of art pop — which is to say, not only pop plus art, or pop as art, but a circuit where fashion, visual art and experimental culture connected up and renewed each other in unpredictable ways. His absence was a palate cleanser — his string of forgettable Eighties and Nineties records now forgotten, he could once again be the thin white space onto which fantasies are projected. His absence almost seemed like a ploy invented by Bowie the impresario-strategist. After all, the only way to make a new Bowie record an event was for him to withdraw long enough that it could seem like it might — really, this time — be forever.
The Next Day’s first single “Where Are We Now?”, with its references to West Berlin era Potsdamer Platz and Nurnberger Strasse, sounded like an object carefully designed to pique the interest not only of the Bowie diehards but also those with a more general stake in pop history and mythology. Berlin! Tony Visconti! The track’s lugubrious melancholy prompted the fantasy that The Next Day could be Bowie’s version of Sinatra’s No One Cares — an old crooner, a man lost in time paradoxically regaining currency by giving up on the sad pursuit of a present that had escaped him for good long ago. But it was a red herring. There are all kinds of intimations of mortality in The Next Day’s words — and reviewers seeking to rescue the record have tended to take refuge in the lyric sheet — but the form is rock, and an alarmingly unprepossessing, devoid of funk (as well as electronics) rock at that. The rest of the album makes the distance between now and (Berlin) then of “Where Are We Now?” painfully evident, a pain heightened by Visconti’s failure to convert this collection of session muso workouts into anything memorable. The Next Day sounds as if it were barely produced at all: it has the flatness of a demo. The relatively warm reception The Next Day received tells its own sad tale about the state of pop in 2013.
You can’t just put Visconti and Bowie together in a studio in 2012 and expect the equivalent of Low, “Heroes” or Lodger to result. The sorcerous powers that artists seem to possess as of right are never really theirs. Bowie — who perhaps more than any artist has performed the pop star’s lack of interiority — has always known this, and he and Eno did much to puncture the Romantic conceit that creativity comes from the mysterious inner depths of a musician. Bowie’s serial passage through personae, concepts and collaborators only telegraphed what is always the case: that the artist is synthesizer and curator of forces and ideas. This is all very well when the syntheses and the synergies are working, and there’s a steady supply of new collaborators to feed off and to lionise. It’s harder in the long striplit hours in the studio when the old magic won’t come, when the revels have ended but you still have to go through the motions.
It’s cruelly appropriate that Bowie’s powers deserted him at practically the very moment that the Seventies — the decade with which he will always be synonymous — ended. I came to musical consciousness round about the time of 1980’s Scary Monsters, and took Bowie for granted. Ziggy Stardust already sounded like a hoary old rock ‘n’ roll relic, and even much of Scary Monsters sounded reactionary by comparison with what protégés like Gary Numan, the Associates and Visage were doing. Yet Bowie had helped to create the conditions of his own obsolescence. His successors were following Bowie’s template for what a pop star should be: a conceptualist and a designer, sexuality and gender indeterminate, alien and/or android, all outside and no inside, the changing face of the strange. From this point on, Bowie himself would be bereft of masks and make-up — it would be just him, the music and the Eighties suits. What followed was years of gradually lowering expectations, of spectacular misfires and the occasional lost gem, but mostly there was reliable mediocrity, the familiar declined star pattern where each new record is fanfared as a return to form, only to immediately disappear into irrelevance.
Much of this is compressed onto the cover image, which is by far the most startling thing about The Next Day. It’s startling not for the act of desecration — but for the casual character of the desecration: a white square over the “Heroes” cover — what could be more half-assed? When I first saw the cover image I thought it must be a prank — what would the real cover image be like? Here is cover designer Jonathan Barnbrook’s rationale for the design: “The ‘Heroes’ cover obscured by the white square is about the spirit of great pop or rock music which is ‘of the moment’, forgetting or obliterating the past. However, we all know that this is never quite the case, no matter how much we try, we cannot break free from the past.”
The image becomes more than a comment on Bowie — the man who once traded on his ability to escape the past is now trapped by it. It also functions as a diagnosis of a broader temporal malaise. What is this white space, this void? An optimistic reading would construe it as the openness of a present that is not yet decided. A bleaker take — one in keeping with the hackneyed quality of the music — would see the white space as standing in for the vacancy of the present, with nothing there except a necessarily failed attempt to escape and recover the past. That’s our pop predicament in 2013, a predicament which The Next Day couldn’t seriously have been expected to resolve.
the man who has everything: drake’s nothing was the same
So here we are again: life at rainbow’s end. Everything that can be bought, available practically immediately, 24/7: women, food, cars, you name it, you click on it. Every hotel suite can be prepared to your specifications. The only things that are different are the shower controls. It’s all top quality, although naturally you can get down and dirty with the fast food options if you want to, and often (why not?) you do:
Got everything, I got everything… I cannot complain, I cannot (You sure about that, Drake?)
I don’t even know how much I really made, I forgot, it’s a lot… Fuck that, never mind what I got[2]
OK, then, let’s get the obvious question out of the way first. If you’ve got everything, why are you so sad?
Surely it can’t be as simple and sentimental as that hoary old chestnut: money can’t buy you love? Come on, is this really where rap was destined to end up: with the rapper as some romcom character, all the braggadocio and super-conspicuous consumption just so much bluster to conceal the boy-lack that the redeemer-woman will make good in the final reel? That old story, again? “Next time we fuck, I don’t want to fuck, I want to make love… I want to trust.” Drake can’t quite believe this routine, can’t quite make us believe it. He knows perfectly well that this sensitive stuff can play as one more pick-up-artist’s ruse… He’s spent so long deceiving and then revealing his deceptions that he’s no longer sure when he’s trying to play us or speak openly, or what the difference is. Crying real tears with one eye, while winking over the latest conquest’s shoulder to the camera with the other. He’d convinced us he was different, but that was a trick, and one that others have caught on to. There’s nothing very brave or unique about talking about your feelings now that “niggas talk more than bitches do.” Is this more honesty, or just an acknowledgement that he needs a new USP?
I got 99 problems, getting rich ain’t one.
Listening to Nothing Was the Same, I’m reminded of Judd Apatow’s Funny People. Apatow’s film is defined by a series of hesitations and avoidances. First of all, it seems as if it is going to be a film about a jaded but rich and successful comedian, George Simmons (Adam Sandler), who learns the value of life when he’s diagnosed with a serious illness; then it seems to be about a man who accepts the value of love and family. Yet each time the film seems to move towards these standard generic resolutions, Apatow pulls back. Simmons’ hedonic nihilism re-asserts itself; the threat of death can’t break the bad habits of a lifetime; the love he lost long ago was actually better off lost. He’s not happy being himself but he doesn’t want to be anyone else. Far from relieving this existential dilemma, fabulous wealth means that he has nowhere at all to hide from it.
Nothing Was the Same is characterised by the same ambivalence — a longing to be a new person who can love and trust (with a woman, naturally, charged as the agent of this transformation) together with a recognition that he will never change, that he’ll always be drinking, smoking, fucking, that he’s far from perfect, but neither is anyone else, right? He never really took off the gangsta-minstrel drag for good; instead, he keeps casting it aside, inspecting it, distancing himself from it, before wearing it again. He can’t help himself (or so he keeps telling us). But this oscillation is valuable for what it tells us about rap’s embattled masculinity in general. Drake confirms that the street-strutting bad boy “just looking for head in a comfortable bed” is the other face of the desperately alone little boy lost crying to his mommy substitute. The boasting brute is always on the run from the helpless infant inside, but, for that very reason, the emotionally broken-down male isn’t an alternative to all the ego-armour posturing, so much as it is its enabling condition. Women are to be publicly disdained, treated as currency in a homosocial bragging economy; in private they are asked to make these wounded men whole again. Is there a track that has exposed the real nature of the male-to-female love song better than Take Care’s “Marvin’s Room”? The conceit — a drunk Drake leaving a phone message to a long lost love he treated badly but now thinks he wants back — leaves us in no doubt that he was speaking to himself via a fantasised female Other.
Gangsta’s hyperbolically-staged fantasies of omnipotence were always nouveau-riche giveaways, which, like the bling, sang out that these working-class black Americans had not yet achieved the easy way in the world and the casual confidence that are the birthrights of those born to wealth and power. The (gold) chains have always clanked as loudly as Jacob Marley’s, announcing that the struggle to escape servitude has run aground, and that untold riches for a very few were the compensation for the many languishing in inertia, poverty, incarceration. Is “Started from the Bottom” — which we all laughed at: no you didn’t, Drake! — Drake’s commentary on all this? Hear it as an act of imagination, Drake putting himself in the sneakers of those who had to struggle from the depths like he never had to, rather than as some forged autobiography, and it makes more sense. But listen to the sheer weariness that weighs down the track: the heavy tristesse that starts the moment after you’ve reached the top of the tower, as the realisation sinks in that there’s no replacing the thrill of the chase. Drake was always expected to be a success, so he was deprived even of that brief moment of satisfaction before the ennui and the paranoia set in. Reaching the top was standard, the least he could expect.
Nothing Was the Same is tangled up in all the confusions of a generation of men faced with contradictory imperatives — the post-feminist awareness that treating women like shit isn’t cool, together with the Burroughsian bombardment of always-available pornography. There’s no point moralising here, either for Drake or us. Drake’s at his weakest when he half-heartedly attempts some kitschy Hallmark card affirmation of lurve; he’s at his most painfully revelatory when he admits that these impasses, these binds, are just too much for him. He can’t escape these knots because the knots are what he is. His bewilderment about what a man is supposed to be now is the very hallmark of a contemporary heterosexual masculinity that realises that the patriarchal game is up, but which is too hooked on the pleasures and privileges to relinquish them yet (just one more click on the porn, then I’ll be Mr Sensitive forever).
On Nothing Was the Same, Drake often sounds like Tony Montana in Scarface: fucking, eating, snorting, is that all there is? But the tone here couldn’t be more different from Pacino’s Eighties cocaine histrionics. A glacial fatalism runs beneath everything here, and Drake matters because he makes contact — maybe better than anyone else — with the sense of hopelessness that quietly subsists beneath all the twerking and tweeting, all the twitter and the chatter of twenty-first-century culture. Hear this in the gorgeous electro-downer haze that saturates the album and establishes its tone much more than any of the beats. Yet there’s something beyond the fatalism, too. You can hear it in Drake’s signature move — the transition from rap to singing, the slipping down from ego-assertion into a sensual purring, the relaxing into a lasciviousness that has nothing to do with the localised libido and dumb automatisms of phallic sexuality. Down here, there is a glorious release from the pressures of identity. Rave-like, pitched-up vocals are suspended on placid currents of synth. Voices stop being human, become avatars from a space where subjectivity has been left behind like a bad dream. On the opener, “Tuscan Leather”, Whitney Houston’s ghost is summoned from the hotel bathroom, mutated into some butterfly-fragile chirruping creature singing inside a specimen jar. I’m frequently reminded of nothing so much as the refracted architectures and water sprites of Balam Acab’s Wander/Wonder. When you dive into these electro-oceanic depths, Nothing Was the Same ceases to be a fascinating symptom of all the blockages of the present, and becomes a longing for something new, something strange and lovely.
break it down: dj rashad’s double cup
Time-stretched Amen breakbeats, rave-euphoric vocals: on Double Cup, Rashad pays his dues to the hardcore continuum, but the traces of jungle and rave here only accentuate how different footwork is to Nineties British dance music.
Footwork has been greeted with the fanfare that usually accompanies the arrival of an avant-garde dance music. These contradictory responses — footwork’s being written off as something that you can’t dance to at the same time as it is dismissed as a functional music, something that would only be properly appreciated by those dancing to it — are a sure sign that we are in the presence of something which scrambles the defaults of rearview hearing.
But footwork is new in a strange way. It’s not historically new: it dates back to the Nineties. And what’s uncanny, unheimlich, about footwork is that practically everything in the sonic palette is familiar. Most of the sounds on Double Cup feel like they could have come from the twentieth century, even if they have actually been produced in the twenty-first.
So, wherein resides footwork’s newness then? In a fascinating blog post,[2] Tristam Adams identifies exactly what makes footwork new: its compositional innovations. To bring this out, Adams contrasts footwork to jungle. Jungle’s newness was in large part a consequence of the widespread availability of digital sampling technology, which facilitated both new sounds and new ways of treating sound (time-stretched breakbeats and vocals). Beyond this, though, I’m not sure that the way Adams constructs the comparison between jungle and footwork is quite right. Adams hears jungle as more “machinic” than footwork — but what was exciting about jungle to many of us at the time was that it gave a whole new sense of what machinism was. Jungle’s machinism was delirious; it was, in Kodwo Eshun’s immortal phrase, a rhythmic psychedelia, composed from whorls, twists, and vortexes of sound; there were none of the rigid mechanoid lines of techno. Jungle was dark, but also wet, viscous, and enveloping.
It’s here that the contrast with footwork can most clearly be heard — and felt. To those whose ears and nervous systems were mutated by jungle in the Nineties, footwork can initially sound strangely desiccated — like the dry bones left after jungle’s digital ocean has receded. “UK bass music” is an almost wilfully bland term, but it does point to the element which gave every genre from jungle to UK garage and dubstep its consistency: a viscous, glistening bass sound. This is conspicuously absent from Rashad’s sound. Instead of functioning as a dark liquid element on (or in) which other sounds could be suspended, Rashad’s bass is a surging and reclining series of stabs and jabs that heightens and lowers tension without ever releasing it.
This leads on to another difference from jungle and the broader tendencies in Nineties digital culture. Where jungle, like Nineties CGI, used digital technology to smooth out some of the hard lines that had been characteristic of early computer sound and imagery, footwork has deliberately opted for angularity. Charlie Frame’s comparison of listening to Rashad with “gazing at an animated GIF that grows ever more absurd with each iteration” captures very precisely footwork’s jerky repetitions. Perhaps the appeals of the animated GIF and of footwork are both tied up with the way that they reject the dominant aesthetics of digital culture now. Think of the way that the elastic architectures of Nineties animatronics gave way to the dreary photorealism of contemporary animation. Now, novelty is to be found in the refusal of communicative capitalism’s false promises of smoothness. If the Nineties were defined by the loop (the “good” infinity of the seamlessly looped breakbeat, Goldie’s “Timeless”), then the twenty-first century is perhaps best captured in the “bad” infinity of the animated GIF, with its stuttering, frustrated temporality, its eerie sense of being caught in a time-trap.
That frustrated, angular time — and the enjoyment of it — is at the heart of footwork. The genre can sound like an impenetrable thicket of rhythms if the thing you lock onto first is the most distinctive thing about footwork: the coiling spasms of super-dry snares. Lock into the floaty synth pads and the vocals, however, and footwork comes on as strangely mellow. In this respect, footwork can then be heard as an extrapolation of elements of Nineties G-funk. An earlier Hyperdub sound — the dayglo wonky of Joker — had mined G-funk for its absurdist pitch-bent synths. What footwork takes is some vocal styling (the rap that is so often subject to its stuttering repetitions), but also a certain mood. G-funk differentiated itself from standard gangsta posturing by the way it dissolved the hard ego of the rapper into clouds of Chronic. Beneath the busyness of capitalist realism — and its demands that we never stop selling ourselves — was another mode of being, where time diffused slowly as exhaled smoke. Beyond the phallic machismo, there was a different libidinal economy, defined by a superficially paradoxical combination of deep yearning and a desire to remain absolutely in the sunlight-saturated moment, liberated from the urgencies of business. This is all the more poignant because a gangster’s work is never done, his enemies don’t sleep, and chilled-out bliss could be terminated at any moment by gunfire. To the G-funk celebration of smoking, Rashad adds other affective tones: the lost-in-the-moment exhilaration of the raver, and R&B’s wistful regrets/lascivious moaning. The overall result is, in terms of mood and affect, oddly reminiscent of cool-era jazz — there is the same ambivalence, the same evocation of a harsh yet alluring urban environment, the same combination of sadness and confidence, the same articulation of longing and bliss.
Then there is the tic-talk of the voices themselves — the way they are made to stammer and circle around themselves. It’s as if there is a cross-contamination, a human-machine (psycho)pathology, the machines infecting the human voices with glitches, the humans passing on Freudian slips, parapraxes, to the machines. Rashad’s plaintive machinism reminds me of nothing so much as the hallucinatory intensity of the “I Love You” section of William Burroughs’ The Ticket That Exploded:
On my knees I hoped you’d love me too. I would run till I feel the thrill of long ago. Now my inspiration but it won’t last and we’ll be just a photograph. I’ve forgotten you then? I can’t sleep, Blue Eyes, if I don’t have you. Do I love her? I love you I love you many splendored thing. Can’t even eat. Jelly on my mind back home. ‘Twas good bye deep in the true love. We’ll never meet again, darling, in my fashion.[3]
Burroughs’ early cut-up and fold-in texts, with their analysis and decoding of emotional manipulation via media and their understanding of pornography as a control apparatus, now read like extraordinarily prophetic anticipations of the present moment. As with Burroughs, there is a double pathos in Rashad’s work. First of all, there is a pathos at the level of the affects in the voices themselves; and the way that the voices are orphaned from their supposed origins means that there is an overwhelming sadness even if the feeling expressed is ostensibly joyful. It’s the same kind of depersonalised sadness we might feel if we happened upon lost photographs of an unknown person’s holiday, long ago. Then there is another pathos that arises from the way that the voices are made to repeat and stutter; the sadness of recognising a speaking animal (ourselves) in the grip of automatisms, repetitions, drives. Rashad articulates the impasses of our twenty-first-century condition with a precision and a compassion that few others can match. More importantly, he suggests that — against all the odds — we might still be able to dance our way out of the time-traps and identity prisons we are locked in.
start your nonsense! on eMMplekz and dolly dolly
There are still all kinds of possibilities for combining voice and sound in new ways. Rap was the last major form to popularise a use of the voice that was not singing, but the field is wide open, as these two new albums from eMMplekz and Dolly Dolly prove.
The first temptation with these records is to hear them as “spoken word” — with the musicality subordinated to a voice that is literary, conversational, comedic. However, what makes these two albums so unique is the way that musicality here infests and inflects the voice, the way that the sound refuses to stay (in the) background. Both albums take much of their inspiration from the very English tradition of Nonsense, which includes Edward Lear, Lewis Carroll, Monty Python, and more recently, Chris Morris. It was on account of Carroll that André Breton reputedly said that the English had no need of surrealism. Here, eMMplekz and Dolly Dolly proffer different versions of twenty-first-century English sonic surrealism.
With eMMplekz, a collaboration between Ekoplekz and Mordant Music’s Baron Mordant, the precursors that first come to mind are certain moments in post-punk — Cabaret Voltaire’s “Photophobia”, Throbbing Gristle, the Fall — yet eMMplekz don’t sound quite like any of these. From the title of Your Crate Has Changed on in, the Baron’s punconscious wordplay has a very contemporary focus.
If Drake and Kanye West expose the sadness and madness deep within the cyber-pleasuredome — the sound of depressed superstars as hypercommodities — then eMMplekz observe the malaises and pathologies of capitalist cyberspace from outside the digital matrix. Instead of the seamlessly slick, depthless pixellation to which always-on digitality has habituated us, Ekoplekz’s analog electronics seethe and hiss, gathering and dispersing like steam and mist. These synthesizer sketches function like impressionist sound paintings of what Ken Hollings has called the “digital regime”, and it’s as if, like users coming down from a psychotropic, we are finally seeing it for what it is.
“I’ve got to take this…” Baron Mordant has a schizoanalytic ear for how the digital regime reveals itself through the phrases it induces us to casually utter. Doesn’t this phrase — so often repeated, so little thought about — capture all too accurately our fatalism in respect of communicative capitalism? “I’ve got to take this” — I’ve got to let it, accept it, I can’t escape, there’s nothing I can do… There’s no way out, there’s no release from the frenzied inertia of all those cyberspatial urgencies, these alerts. “Tethered to my hotspot, tethered to my hotspot…” Constant anxiety about staying connected, constant worry about holding onto the equipment that allows us to stay connected. “Can you watch my laptop?” We’re all sick of this now… we’re all sick because of this now… “Sorry for your Lossy…” What is all this digital compression costing us, and when do we ever get to count the cost? (The first thing we do in the morning is grope for our smartphones — straight from sleep into the somnambulance of capitalist cyberspace. “Unsubscribe from Soviet time” — maybe we did that too soon, and now it’s business o’clock, forever…)
Your Crate Has Changed is like an English take on Franco “Bifo” Berardi’s Precarious Rhapsody: Semiocapitalism and the Pathologies of Post-Alpha Generation. Berardi persuasively argues that the interlock between precarious work and capitalist communications technology has produced a population whose nervous systems are overloaded with stimuli. Mordant gives voice to weary old digital migrants whose middle-aged flesh is too saggy and grey to be made-over — people deprived of security, forced to keep on hustling even though they are too old for the game, bone-weary. No rest for the precarious, no chance to tune into anything except the imperatives of business. “Invoices in my head… invoices in my head…”
Invoices in my head, and too much spam and random cyber-noise to hear anything else. But I don’t think there’s been anyone since Mark E. Smith at his telepathic peak in the late Seventies/early Eighties who has managed to tune into the rogue frequencies of England’s schizo-babble as effectively as the Baron does here. Mordant finds all the clandestine signals hidden in jingles and classified ads. He channels the voices of the lonely, the desperate, all the weirdos and the saddoes; ourselves, perhaps, but the secret selves we keep stuffed behind our Facebook walls. Yet there are still avenues of escape — on a couple of tracks, an infant’s babbling offers an alternative Nonsense to capital’s infantilised huckster-speak.
A surface joviality — a different kind of humour, much less mordant — separates Dolly Dolly from eMMplekz. Yet it’s the slippages of tone and genre, from light pastiche to intimations of mortality, the sliding of persona from gone-to-seed raconteur to charity shop mystic, from short story-teller to preening bard, that make Antimacasser such an odd jewel of a record, and Dolly so singular a performer.
The opening track, “Wattle and Daub” — a collaboration with Position Normal — is more than worth the admission price alone. Over a lysergically smeary detuned piano (or maybe guitar), Dolly Dolly dolefully declaims a Nonsense-Shakespearean state of the nation address. “England my England… the cold mist of your fibrous trolleys stifles the sun… half-strangled uncles stuffed with crisps… your sky full of plump chintz cushions…” It’s like Tony Hancock’s melancholia has been dream-conflated with his mockery of thespian and playwright pretensions. Yet the Nonsense is disarming: “Wattle and Daub” gives us nothing less than a psychedelic-surrealist portrait of a country deprived of psychedelia and surrealism. A world without surprise, an entirely domesticated universe, banality as cosmology: “Let’s colonise the other planets, fill them with bitter and dry roasted peanuts, pigeons and oven chips.” The dead world of middle-aged Britain’s living rooms; the cheery veneer of advertising’s ever-smiling, glowing-faced families turned inside out. “I’m sick of being a man”, moans the character who narrates the closing track. Aren’t we all? But Antimacasser finds all sorts of disused or temporarily abandoned doorways into other worlds, all kinds of rabbit holes in which we can escape from being a sad human animal. Old New English Library paperbacks become occult manuals, full of esoteric philosophy. It’s still possible to transform ourselves, to transport ourselves, and Dolly Dolly shows us how.
review: sleaford mods’ divide and exit and chubbed up: the singles collection
The East Midlands accent, lacking urban glamour, lilting lyricism or rustic romanticism, is one of the most unloved in the UK. It is heard so rarely in popular media that it isn’t recognised enough even to be disdained. I must confess that I have a dog in this fight. I grew up in the East Midlands, and when I left university I was described by a sympathetic lecturer as having a “speech and accent problem”. The accent gradually disappeared, as I learned to suppress the lazy Leicestershire consonants and articulate my speech in something closer to so-called received pronunciation — an achievement loaded with ambivalence and shame.
Sleaford Mods’ Jason Williamson makes no such accommodation to metropolitan manners, and he’s disgusted at those who speak in fake accents, whether they’re imitating someone from East London or “Lou Reeds, G.G. Allin…” The appeal to the local in politics and culture is usually smug and reactionary; a petit-bourgeois ruse to acquire more cultural and actual capital by overpricing the artisanal and the organic (Williamson is wise to this scam too, blasting at “expensive coffee shops full of local art/Fuck off”). But the politics of locality operate differently when it comes to accent. The English bourgeoisie speak in more or less the same accent wherever they come from. The insistence on retaining a regional accent is therefore a challenge to the machineries of class subordination — a refusal to accept being marked as inferior.
Williamson was born in Grantham, Lincolnshire — Sleaford is about twenty miles away — and was involved in the music scene for years, following a familiar provincial trajectory: not making it, but always being lured back at the very point he was about to give up. He was in and out of local groups, followed the dream to San Francisco and London for a while, and ended up back home when it didn’t come off. He tried to go out on his own, but he couldn’t find anything new, until, bored and frustrated in a recording studio, he started ranting over a metal track. He had found his voice, literally. He was inspired by the Wu-Tang Clan, but he didn’t so much repeat their sound as their methodology, forcing listeners to adjust to his accent, idiolect and references. This risked bathos — the East Midlands ain’t New York, and Sleaford Mods would come off as just another comic turn if it weren’t for Williamson’s incendiary intensity. (Which isn’t to deny the mordantly acidic wit that runs through his lines: “Chumbawamba weren’t political?/They were just crap”, isn’t just funny but critically astute.)
Listen to the singles collection, Chubbed Up, next to Divide and Exit, and it’s clear not much has changed in the duo’s sound. The variation is provided by Williamson’s words; the music by Andrew Fearn always fits an (unfussy) formula: pugilistic post-punk bass; functional but unprepossessing beats; occasional cheap keyboard riffs and listless wafts of guitar. It’s digitally manipulated, but conspicuously unpolished — the software is used not to micromanage the sounds but to capture them in a purgatorial loop.
The name Sleaford Mods sounds like vintage graffiti, or something you’d have sewn onto a Union Jack at an England football match three decades ago. On the face of it, they couldn’t be any less mod. Where is the style and the cool in this relentless outpouring of profanity and discontent? But mod was a complex phenomenon, as much about the failure to achieve the glamour of black America as it was about the aspiration towards possessing it. The mods might have loved Miles and Motown but when they made music it sounded like the Who and the Jam — rock born with a plastic spoon in its mouth, stuck in a monochromatic England skulking in the shadows cast by the USA’s Pop Art consumer dreams. The mods worked in office jobs, in semi-skilled occupations and in department stores, longing for a luxury far above their station. But their ambitions weren’t to climb the social ladder of bourgeois respectability — they prefigured instead a world in which style exploded far beyond the narrow calculations of business, and everyday life could become a work of art. As Dick Hebdige wrote in his essay “The Meaning of Mod”: “Every mod was existing in a ghost world of gangsterism, luxurious clubs and beautiful women, even if the reality only amounted to a draughty parka anorak, a beaten up Vespa, and fish and chips out of a greasy bag.” With Sleaford Mods, the chips and the grease are all that’s left. Factories have closed and trade unions have been subdued. Art schools and the media have rebourgeoisified. University courses have been opened up, but the real graduate jobs are reserved for the same old suspects. The only time you are likely to hear a working-class accent on television is in a poverty porn documentary.
This is Sleaford Mods’ world, but they refuse the place assigned to them by well-meaning metropolitan liberals and by unscrupulous Tories. They won’t play the part of a dumb feckless prole or white, working-class racist (Williamson loathes St George’s flag white van men as much as their Tory overlords). They won’t knuckle down and gratefully accept zero-hours contract jobs, or be content to “rot away in the aisles of Co-Op”, as the single “Jolly Fucker” had it.
If anything, Divide and Exit feels more claustrophobic than its predecessor, Austerity Dogs, with even the tiny dreamy spaces that once opened up on tracks such as “Donkey” eliminated by Williamson’s relentless excremental flow. Excremental is the right word: piss and shit course through Williamson’s rhymes, as if all the psychic and physical effluent abjected by Cameron’s Britain can no longer be contained, and it’s bursting upwards, exploding through all the deodorised digital commercial propaganda, the thin pretences that we’re all in this together and everything’s going to be all right.
What overflows in Williamson’s pottymouth is a seething disaffection incubated on the dole or in dead-end jobs and further stoked up by the shop-soiled fantasies of escape pushed by an ailing music business. An early single was called “Jobseeker”: “So Mr Williamson — what have you done to find gainful employment since your last signing on date?/Fuck all!” A fantasy exchange no doubt: here, as often in Sleaford Mods, Williamson gives vent to a voice that would otherwise stay locked in his head. Discontent is everywhere in the UK now but for the most part it’s privatised: blunted by alcohol and anti-depressants, or directed into impotent comments box spite and empty social media outrage: “All you Zombies, tweet tweet tweet”.
If Williamson’s anger often seems intransitive — his fuck offs are sheer explosions of exasperation, directed at no one in particular, or at everyone — it’s underscored by a class consciousness painfully aware that there is nothing which could transform disaffection into political action. “Aren’t we all just/Pissing in the flames?” Cameron and the Tories are obviously despised — there’s a particularly memorable nightmare image of the “Prime Minister’s face hanging in the clouds/Like Gary Oldman’s Dracula” — but who can stop them? “Liveable shit/You put up with it”. This is both a taunt directed at the audience and an acknowledgement of Williamson’s own capitulation in doing what’s necessary to survive.
It isn’t always the role of political music to come up with solutions. But nothing could be more urgent than the questions that Sleaford Mods pose: who will make contact with the anger and frustration that Williamson articulates? Who can convert this bad affect into a new political project?
test dept: where leftist idealism and popular modernism collide
There’s something very timely about the return of Test Dept. Their installation DS30 (2014), the accompanying film and the book Total State Machine (2015) — a comprehensive history and critical study of the band — have arrived just in time for the deep crisis of neoliberalism in the UK.
Test Dept were always more than a musical group. They are better understood as a popular modernist collective that had the production of sound at its centre, but which also made visuals, projections and films. Test Dept were formed in London in 1981 by Jonathan Toby Burdon, Graham Cunnington, Angus Farquhar, Paul Hines and Paul Jamrozy. They began as a second-wave industrial act, following on from a first wave led by Throbbing Gristle and Cabaret Voltaire. With their use of found metal objects and their performances in spaces of labour and logistics (disused factories, transport hubs), Test Dept offered what seemed, on the face of it, to be a very literal take on the “industrial”. Via their involvement in a number of UK struggles — including the miners’ strike (1984–85) and the anti-Poll Tax movement (1988–91) — Test Dept also became intensely invested in the politics of the industrial and the post-industrial.
Test Dept’s signature sound is intensely percussive, a convulsive dance music that took its inspiration from Soviet constructivism, but which became something like the British equivalent of the politicised US hip-hop group Public Enemy. The records are sonic mosaics, pulsing with panic, the sampled voices of Tory MPs countered by defiant statements by left-wing militants. One of Test Dept’s most powerful tracks — “Statement” from the 1986 album The Unacceptable Face of Freedom — features miner Alan Sutcliffe giving a moving account of police brutality during the strike. The track is a work of emotional engineering, a collectivist response to the manipulation of affect and desire through advertising, branding and political propaganda. Sutcliffe went on to tour with the group: one example of the way in which struggles produced not only new alliances but new social spaces, in which art-making ceased to be a matter for specialists of a certain age.
For any British, left-wing person, remembering the mid-1980s is liable to provoke a sadness that is visceral, choking, wrenching. I still can’t recall without weeping the day when the miners returned to work in 1985 after a year on strike. What I have called capitalist realism — the deeply embedded belief that there is no alternative to capitalism — was definitively established in the UK during that period, in Margaret Thatcher’s second term in government. For a significant proportion of the population, the 1982 Falklands War had transformed Thatcher from a figure of loathing into a glorious war leader. This renewed popularity, together with the formation of the Social Democratic Party by Labour Party defectors, allowed the Tories to achieve a landslide victory in the 1983 general election. It proved to be a traumatic defeat for the British left in general, and for the Labour Party in particular. Labour began its long march towards Blairism and its eventual complete capitulation to neoliberalism and corporate tyranny. Meanwhile, the crushing of the miners’ strike, and the wave of privatisations that the Tories unleashed, created the conditions for the neoliberal Britain that is only now falling apart, thirty years later.
In retrospect, it can look as if the whole of the 1980s was a series of defeats for the left. One value of Total State Machine is to remind us that it didn’t feel that way at the time. Rather, like John Akomfrah’s video installation The Unfinished Conversation (2013), the Total State Machine book invokes a forgotten 1980s, in which style culture was synchronised with the rise of an anti-authoritarian left that confidently laid claim to a new modernity, set to dispense with capital, patriarchy and racism as so many historical relics; a 1980s in which radical chic and designer socialism weren’t dirty words but real possibilities.
Total State Machine includes a section of Cynthia Rose’s 1991 book Design After Dark. Inspired by a Test Dept performance, Rose argues that young Britons would
succeed in staging a dancefloor revolution. It will not be the Komsomol-style overthrow dreamt of by Red Wedge, the ill-fated attempt by a collective of musicians — led by Billy Bragg, Paul Weller and Jimmy Somerville — to spearhead a campaign to defeat the Tories in the 1987 General Election. Instead, it will come about through grass-roots changes — successive waves of guerrilla sounds, guerrilla design, guerrilla entertainments. The new design dynamic will be an impulse born out of celebration, rising out of leisure enacted as an event. And it will change young people’s perception about what entities like design and communication should do.[2]
Sadly, it didn’t work out that way. Rose was absolutely right that most of the innovative energy in British music culture would come from dance music, which was about to enjoy its most fecund period ever. But the atmosphere around rave, jungle and garage tended towards the apolitical, the libertarian or the capitalist. The alliance of the left with the new technologies, energies, infrastructures and forms of desire that Rose saw emerging was to be very short-lived.
The comparison with Red Wedge is instructive here. Part of the problem with Red Wedge was that, despite taking its name from a poster designed by El Lissitzky (Beat the Whites with the Red Wedge, 1919), its music represented a retreat from modernist experimentalism. Bragg’s blokeish neo-folk, the hamfisted jazz-funk-pop Weller made with the Style Council, the Communards’ strangely depressing party music: none of this was capable of articulating a future. It was all bogged down in the worst kind of 1980s gloss.
Test Dept were one of the last examples of what has been called post-punk, but really they are part of a longer trajectory of art pop/pop art going back to the 1950s. The conditions for this popular modernism were subject to sustained attack in the mid-Eighties, and they have never recovered. The Tories began to dismantle the infrastructure of social security, higher-education maintenance grants, squatting and art schools that had given working-class people access to the resources of so-called high culture and time to produce their own sound, fiction and art.
But the neoliberal capitalism that drove this assault on culture is now heading for disaster — in Greece, in Spain, in Scotland and, finally, in England. Far from being some static monument to a bygone era, Total State Machine is an invaluable archive, an inventory of strategies, gestures and techniques that can now be repotentiated by others ready to begin where the Test Dept of the 1980s left off. Rose’s prophecies of a new design dynamic can yet come true. Popular modernism isn’t dead: it has merely had a thirty-year hiatus.
no romance without finance
Jennifer M. Silva’s Coming Up Short: Working-Class Adulthood in an Age of Uncertainty is a heartbreaking study of the corrosive effects of the neoliberal environment on intimacy. Silva’s book focuses on young people specifically — it is based on a hundred interviews she undertook with young working-class men and women in two American cities in Massachusetts and Virginia. Her findings are disturbing. Over and over again, Silva finds her young subjects exhibiting a “hardened” self — a form of subjectivity that prides itself on its independence from others. For Silva, this hardened subject is the consequence of this generation being abandoned, institutionally and existentially. In an environment dominated by unrelenting competition and insecurity, it is neither possible to trust others nor to project any sort of long-term future. Naturally, these two problems feed into one another, in one of the many vicious spirals which neoliberal culture has specialised in innovating. The inability to imagine a secure future makes it very difficult to engage in any sort of long-term commitment. Rather than seeing a partner as someone who might share the stresses imposed by a harshly competitive social field, many of the working-class individuals to whom Silva spoke instead saw relationships as an additional source of stress. In particular, many of the heterosexual women she interviewed regarded relationships with men as too risky a proposition. In conditions where they could not depend on much outside themselves, the independence they were forced to develop was both a culturally-validated achievement and a hard-won survival strategy which they were reluctant to relinquish.
“In a world of rapid change and tenuous loyalties”, Silva argues, “the language and institution of therapy — and the self-transformation it promises — has exploded in American culture.”[2] A therapeutic narrative of heroic self-transformation is the only story that makes sense in a world in which institutions can no longer be relied upon to support or nurture individuals:
In social movements like feminism, self-awareness, or naming one’s problems, was the first step to radical collective awareness. For this generation, it is the only step, completely detached from any kind of solidarity; while they struggle with similar, and structurally rooted, problems, there is no sense of “we”. The possibility of collective politicisation through naming one’s suffering is easily subsumed within these larger structures of domination because others who struggle are not seen as fellow sufferers but as objects of scorn.[3]
The spreading of therapeutic narratives was one way in which neoliberalism contained and privatised the molecular revolution that consciousness-raising was bringing about. Where consciousness-raising pointed to impersonal and collective structures — structures that capitalist and patriarchal ideology obscures — neoliberalism sees only individuals, choices and personal responsibility. Yet consciousness-raising practices weren’t only at odds with capitalist ideology; they also marked a decisive break with Marxism-Leninism. Gone were the revolutionary eschatology and the militaristic machismo which made revolution the preserve of an avant-garde. Instead, consciousness-raising made revolutionary activity potentially available to anyone. As soon as two or more people gather together, they can start to collectivise the stress that capitalism ordinarily privatises. Personal shame becomes dissolved as its structural causes are collectively identified.
Socialist-feminism converted Lukács’s theory of class consciousness into the practice of consciousness-raising. Since consciousness-raising has been used by all kinds of subjugated groups, it would perhaps be better to talk now of subjugated group consciousness rather than (just) class consciousness. But it is worth noting in passing that neoliberalism has sought to eradicate the very concept of class, producing a situation memorably described by Wendy Brown, in which there is “class resentment without class consciousness or class analysis”. This erasure of class has distorted everything, and allowed many struggles to be rhetorically captured by bourgeois liberalism.
Subjugated group consciousness is first of all a consciousness of the (cultural, political, existential) machineries which produce subjugation — the machineries which normalise the dominant group and create a sense of inferiority in the subjugated. But, secondly, it is also a consciousness of the potency of the subjugated group — a potency that depends upon this very raised state of consciousness. However, it is important to be clear that the aim is not to remain in a state of subjugation. As Nancy C. M. Hartsock explains, “the point is to develop an account of the world that treats our perspectives not as subjugated, insurrectionary, or disruptive knowledges, but as potentially constitutive of a different world”.[4]
To have one’s consciousness raised is not merely to become aware of facts of which one was previously ignorant: it is instead to have one’s whole relationship to the world shifted. The consciousness in question is not a consciousness of an already-existing state of affairs. Rather, consciousness-raising is productive. It creates a new subject — a we that is both the agent of struggle and what is struggled for. At the same time, consciousness-raising intervenes in the “object”, the world itself, which is now no longer apprehended as some static opacity, the nature of which is already decided, but as something that can be transformed. This transformation requires knowledge; it will not come about through spontaneity, voluntarism, the experiencing of ruptural events, or by virtue of marginality alone. Hence Hartsock’s concept of standpoint epistemology, which maintains — following Lukács and Marx — that subjugated groups potentially have access to knowledge of the whole social field that the dominant group lacks. Members of subjugated groups do not, however, automatically possess this knowledge as of right — it can only be accessed once group consciousness is developed. According to Hartsock, “the vision available to the oppressed group must be struggled for and represents an achievement which requires both science to see beyond the surface of the social relations in which all are forced to participate, and the education which can only grow from struggle to change those relations.”
One way of seeing Jennifer M. Silva’s book is as an account of radically deflated consciousness. Crucial to this is Silva’s restoration of the concept of class as a frame shaping the experiences of those who feature in her study. Class is what is typically missing from her interviewees’ “therapeutic” accounts of themselves. Exactly as Wendy Brown says, many of Silva’s subjects tend to exhibit (an unconscious and disavowed) class resentment without class consciousness.
Reading Silva’s descriptions of women wary of giving up their independence to men they perceive as feckless wasters, I was reminded of two R&B hits from 1999: “No Scrubs” by TLC and “Bills Bills Bills” by Destiny’s Child. Both these songs see financially independent women upbraiding (presumably unemployed) men for their shiftlessness. It is easy to attack such tracks for seemingly peddling neoliberal ideology. Yet I think it far more productive to hear these songs in the same way that we attend to the accounts in Silva’s book. These are examples of consciousness deflated, which have important lessons to communicate to anyone seeking to dismantle capitalist realism.
It is still often assumed that politics is somehow “inside” cultural products, irrespective of their context and their use. Sometimes, agit-prop style culture can of course be politically transformative. But even the most reactionary cultural expression can contribute to a transformative project if it is sensitively attended to. It is possible to see the work of the late Stuart Hall in this light: as an attempt to bring to leftist politics the messages that culture was trying to impart to it. If this project was something of a tragic failure, it was a consequence, not of the shortcomings in Hall’s approach, but of the intransigence of the old left, its deafness to the desires and anxieties being expressed in culture. Ever since Hall fell under the spell of Miles Davis in the 1950s, he dreamed of somehow commensurating the libidinal modernity he encountered in popular music with the progressive political project of the organised left. Yet the authoritarian left was unable to tune into this ambition, allowing itself to be outflanked by a new right which soon claimed modernisation for itself, and consigned the left to the past.
To understand this failure from another angle, let’s consider for a moment the work of the late music and cultural critic Ellen Willis. In her 1979 essay, “The Family: Love It Or Leave It”[5], Willis observed that the counterculture’s desire to replace the family with a system of collective child-rearing would have entailed “a social and psychic revolution of almost inconceivable magnitude”. It’s very difficult, in our deflated times, to re-create the counterculture’s confidence that such a “social and psychic revolution” could not only happen, but was already in the process of unfolding. Like many of her generation, Willis’s life was shaped by first being swept up by these hopes, then seeing them gradually wither as the forces of reaction regained control of history. There’s probably no better account of the Sixties counterculture’s retreat from Promethean ambition into self-destruction, resignation and pragmatism than Willis’s collection of essays Beginning To See The Light.[6] As Willis makes clear in her introduction to the collection, she frequently found herself at odds with what she experienced as the authoritarianism and statism of mainstream socialism. While the music that she listened to spoke of freedom, socialism seemed to be about centralisation and state control. The counterculture’s politics were anti-capitalist, Willis argues, but this did not entail a straightforward rejection of everything produced in the capitalist field. Certainly, pleasure and individualism were important to what Willis characterises as her “quarrel with the left”, yet the desire to do away with the family could not be construed in these terms alone; it was inevitably also a matter of new and unprecedented forms of collective (but non-statist) organisation. Willis’s “polemic against standard leftist notions about advanced capitalism” rejected as at best only half-true the ideas “that the consumer economy makes us slave to commodities, that the function of the mass media is to manipulate our fantasies, so we will equate fulfilment with buying the system’s commodities”. Culture — and music culture in particular — was a terrain of struggle rather than a dominion of capital. The relationship between aesthetic forms and politics was unstable and inchoate — culture didn’t just “express” already-existing political positions, it also anticipated a politics-to-come (which was also, too often, a politics that never actually arrived).
Yet there was also an immanent transformative immediacy in the music of the counterculture. It reinforced the feelings of despair, disaffection and rage that bourgeois culture ordinarily makes us distrust. As such, music functioned as a form of consciousness-raising, in which a mass audience could not only experience its feelings being validated, it could locate the origins of those feelings in oppressive structures. Moreover, the ingestion of hallucinogens by growing numbers of the population, and the emergence of a psychedelic imaginary that touched even those who had never used acid, made for a widespread perception that social reality was provisional, plastic, subject to transformation by collective desire.
If Beginning to See the Light is a painful — and painfully honest — account of consciousness deflation, then the same story is narrated within music culture itself. Peter Shapiro has shown how early Seventies soul and funk music — the O’Jays’ “Back Stabbers”, the Undisputed Truth’s “Smiling Faces Sometimes”, Sly Stone’s “You Caught Me Smiling” — “engaged in a remarkable conversation” about the newly minted Smiley yellow face image, “an imagistic minefield that played confidence games with centuries of caricatures, the beaming faces of the white establishment promising civil rights and integration [and] Nixon’s Dirty Tricks gang.” With Nixon on the rise and the Panthers subdued, songs like “Back Stabbers” caught a new mood of suspicion and recrimination. In his classic essay “The Myth of Staggerlee”, Greil Marcus argues that these songs — along with the rest of Sly and the Family Stone’s There’s A Riot Goin’ On and the Temptations’ “Papa Was A Rolling Stone” — were part of a bitter moment, when Sixties optimism had drained away to be replaced by paranoia and melancholy. Marcus writes, “when new roles break down and there is nothing with which to replace them, old roles, ghosts, come in to fill the vacuum”. The collectivity and the multiplicity that the Family Stone had embodied — radical democracy in vibrant action: a group made up of men and women, blacks and whites — gave way to a morose and dejected individualism. “The best pop music does not reflect events so much as it absorbs them”, Marcus wrote. “If the spirit of Sly’s early music combined the promises of Martin Luther King’s speeches and the fire of a big city riot, Riot represented the end of those events and the attempt to create a new music appropriate to the new realities.”
These “new realities” would eventually become nothing less than capitalist realism itself. Capitalist realism — in which current social relations are reified to the point that any shift in them becomes unimaginable — could only be fully consolidated once the Promethean-psychedelic imaginary was all but entirely subdued. But this would take a while. The Seventies weren’t only about countercultural retreat and defeat. In When the Lights Went Out: Britain in the Seventies, Andy Beckett argues that a “liberal or left-wing melancholy about the Seventies has, in many ways, been the mirror image of the doomy right-wing view of the same period”. But, as Beckett argues, this “fails to acknowledge that for many politicised Britons, the decade was not the hangover after the Sixties; it was the point when the great Sixties party actually started”. The successful Miners’ Strike of 1972 saw an alliance between the striking miners and students that echoed similar convergences in Paris 1968, with the miners using the University of Essex’s Colchester campus as their East Anglian base. The Seventies also saw the growth in Britain of gay, anti-racist, feminist and Green movements. In many ways, it was the unprecedented success of the left and the counterculture in the 1970s that forced capital to respond with neoliberalism. This was initially played out in Chile, after Pinochet’s CIA-backed coup had violently overthrown Salvador Allende’s democratic socialist government, transforming the country — via a regime of repression and torture — into the first neoliberal laboratory.
The Seventies that Andy Beckett celebrates in the British context found expression in the US in the disco genre. Disco was a music that grew out of the convergence of a number of subjugated groups. It was a music made by and for gays, black people and women, and — like most postwar popular music — it was overwhelmingly produced by the working class. Chic’s Nile Rodgers — surely the most important producer and sonic conceptualist of the late Seventies and early Eighties — had been a member of the Black Panthers as a teenager. Disco provided the template for the successive waves of dance music in the Eighties and Nineties, including house, techno, rave and garage. In her 1991 book Design After Dark, Cynthia Rose prophesied a “dancefloor revolution” that would
come about through grass-roots changes — successive waves of guerrilla sounds, guerrilla design, guerrilla entertainments. The new design dynamic will be an impulse born out of celebration, rising out of leisure enacted as an event. And it will change young people’s perception about what entities like design and communication should do.[7]
Yet Rose understandably failed to anticipate the extent to which the new energies, infrastructures and forms of desire she identified would be appropriated by a neoliberal culture which would lay claim to freedom and pleasure, while associating the left with a grey puritan statism. Once again, the left missed an opportunity, failing to successfully align itself with the collective euphoria of dancefloor culture. Thus the “good times” on the dancefloor became fleeting escapes from a capitalism that was increasingly dominating all areas of life, culture and the psyche.
This super-domination came out in the mordant yet playful “realism” of Gwen Guthrie’s 1986 R&B hit, “Ain’t Nothing Goin’ On But The Rent”, one of the first popular musical signs of the emergence of the new hardened subject that Silva analyses so well. At a time of rising unemployment, Guthrie sang, “You’ve got to have a j.o.b. if you want to be with me/no romance without finance”. The subjectivity performed in Guthrie’s song is in many ways the female counterpart to the gangster rap persona that was emerging when the single was released. Both reject intimacy and tenderness. In gangster rap there is a hyperbolic performance of invulnerability — a performance that can only appear bitterly ironic, when we consider the fact that even some of the most wealthy and successful gangster rappers (such as Tupac Shakur and Biggie Smalls) would end up being shot dead. By contrast, and despite its surface bravado, “Ain’t Nothing Goin’ On But The Rent” is a song about the need for security — “fly girl like me/needs security” — in conditions of radical uncertainty. This wasn’t some celebration of Reaganomics. On the contrary, Guthrie’s song drew out the way in which Reaganomics was corroding the conditions for intimacy — a message that was much more emotionally charged and politically resonant than most of the protest songs of the time. Similarly, the formula “no romance without finance” need not be construed as merely some reactionary concession to capitalist realism. Rather, it can be heard as a rejection of the ideological sentimentality that separates out social reproduction from paid work. Anticipating much of twenty-first-century popular music, “Ain’t Nothing Goin’ On But The Rent” is the sound of the loneliness that happens when consciousness is deflated, and the conditions for raising it are absent. But with the new movements that are rising in the US after Ferguson, with the movements in Europe that have produced Podemos and Syriza, there is every reason to believe that those conditions are returning. It is beginning to look as if, instead of being the end of history, capitalist realism was a thirty-year hiatus. The processes that began in the Sixties can now be resumed. Consciousness is being raised again.
PART FOUR: FOR NOW, OUR DESIRE IS NAMELESS: POLITICAL WRITINGS
don’t vote, don’t encourage them
There was a time when elections at least seemed to mean something. I still recall, viscerally, the hollow, bitter sense of total existential defeat the day after Foot’s tragically bound-for-disaster hard left succumbed to the storm troopers of SF Kapital under Thatcher, and I, only fifteen years old, contemplated “Five More Years” of Tory rule. I didn’t hear it at the time, but the song that always brings that feeling, that moment, is Mark Stewart’s “Liberty City”: “I’ll give a wave to the management mercenaries… Don’t their clean clothes look so pretty/Try to awaken then from the comforts of slavery…”
There are still those who would like to pretend that a Tory administration would be so much worse than New Labour, so that deigning to vote for anyone else would be an “indulgence”. Choosing “the least worst” is not only making this particular choice; it is also choosing a system which forces you to accept the least worst as the best you can hope for. Naturally, the defenders of the dictatorship of the elite pretend — perhaps they even deceive themselves — that the particular slew of lies, compromise and smarm they are hawking is “only temporary”; that, at some unspecified time in the future, things will improve if only we support the “progressive” wing of the status quo. But Hobson’s choice is no choice, and the delusion of progressivism is not a psychological quirk, it is the structural delusion upon which liberal democracy is based.
Johann Hari tries to make the case for reluctantly voting New Labour today, on the grounds that the Tories are the only realistic alternative and they are manifestly worse than New Labour. But just what is the threat that Howard’s Tories pose? Will they suspend habeas corpus? Can’t, Toneeeeee’s already done it. Will they shamelessly and shamefully play to the rightwing gallery on immigration? Well, yes, but that’s only what the Joker Hysterical Face is already doing. (It wasn’t the war that made me lose any vestigial sentimental attachment to New Labour, it was their disgusting and despicable pandering to the right on immigration.)
Let’s dispense with this idea, once and for all, that New Labour has “improved” anything. New Labour is the worst of all worlds: Thatcherist managerialism without the Thatcherite attack on vested interests. In the pre-Thatcher 1970s, it took six carworkers to do the job of one; in the post-Thatcher Noughties, it takes six consultants to do the job of none (since the mission statement wasn’t worth writing in the first place). Same decadence, different beneficiaries. New Labour and its supporters scoff at the Tories’ idea that you could cut £35 billion in public spending and yet improve public services. As someone who works in public services, it strikes me as eminently plausible (not that I believe that the Tories would do it, or do it right, if they came to power, naturally). Cutting back on red tape, bureaucrats, paperwork would have two immediately positive effects: it would get rid of the managers and administrators whose wages are a disproportionate drain on the budget, and it would improve the performance of those who actually do the jobs, simply by dint of the fact that they wouldn’t have to deal with nannying memos and those who send them all the time.
Blair isn’t just contingently a liar, he is, like the new breed of career politician he heads, a professional liar. As a lawyer turned politician, it’s no surprise that Blair treats reality as a distraction from PR. He has been complicit in producing a situation in which there is no more at stake in parliamentary democracy than “beating the other side”, as in a “debate” at the Oxford Union. His I-am-innately-good moral righteousness is as much a testament to his public school and Oxbridge education as anything else: you see, glinting in the eyes, the unwavering certainty of the truly imbecilic. Blair likes to see himself as a conviction politician, but apart from his imperialist intransigence (itself a symptom of his belief in his own innate superiority), what else IS he actually committed to? It’s telling that the only thing he was prepared to defy public opinion on was the war.
Blair’s slogan “education, education, education” is the sickest joke of all (and not only because he has presided over the dumbest front bench in recorded history, another testament to the wonder of Oxbridge). Maybe he has “pumped more money” into education, but that is useless if the extra funds are going on quangos, incompetent administrators and facile “initiatives” that were doomed to fail and pointless even if they succeeded.
The “Third Way” “solution” to Further Education is a typical Blairite catastrophe. Colleges are now funded per student, with the result that students now treat themselves as “consumers” — i.e. the canny ones quickly realise that even the most abusive or violent behaviour is unlikely to result in their being removed from the college, since it means a significant cut in the college’s revenue. Students with behavioural problems shouldn’t simply be turned away, but neither can they be allowed to continue attending college as if nothing has happened. That is a dereliction of duty towards the student, and towards the other students, whose education and learning environment is damaged while such behaviour is left unchecked. But “Third Way” funding means that the only result will be institutional cynicism. Imposing “targets” and assigning funds on the basis of meeting them — what the economist calls “reform”, i.e. ideology dressed up as realism — will only ever lead to a situation in which bureaucrats and the bureaucratically-minded prosper. The way to improve education, and all other public services, is to accept the obvious truth (though such truth is contrary to ideology): most people working in these services are not, in fact, venal, are not motivated solely by what is in the interests of “them and their famileeee”. So it would be better to hand more control back over to them; by all means intervene if it is going wrong, but don’t assume that things work better if they are run by bureaucrats (the whole of reality is a counter-example to this ludicrous thesis).
I admit that, emotionally and unthinkingly, I will find myself supporting the “left” parties when the results come in tomorrow night. Yes, I want to see Galloway give Oona King a kicking, yes I would love to see Letwin lose his seat. But only in exactly the same way that I want to see X contestant beat Y contestant in Big Brother; it really is only sentimentality to pretend that this spectacle has much consequence. This will always be the case in liberal democracy at the best of times, but especially so in a country which has an electoral system so fundamentally corrupt and unjust. Hari is right that, in the Eighties, 56% of the electorate voted for left parties, but because the vote was split between Labour and the Lib Dems, the Tories were allowed to maintain their reign of terror. But that is an argument for urgent reform of the electoral system, not for voting New Labour.
As I.T. rightly argues, the “people died for the vote” line is utterly facile. Soldiers in the Wehrmacht died for the glories of the Fatherland — does that mean I should become a Nazi? Catholics burned for their belief in transubstantiation: should I then repent and go to Mass on Sunday? Plus, I think I’m on fairly safe ground, really, with the conjecture that no one, but no one, died for the opportunity to “choose” between Blair and Howard.
october 6, 1979: capitalism and bipolar disorder
Realism has nothing to do with the Real. On the contrary, the Real is what realism has continually to suppress.
Capitalist realism, like socialist realism, is about “putting a human face” on and naturalising a set of political determinations. The komissars of Kapital like to pose as tough-minded pragmatists who tell unpalatable truths and who alone are capable of facing up to the harsh “realities” of the world. Yet Kapitalism — no less in its soon-to-take-over Chinese State version than in its soon-to-collapse American model — is based upon a slew of fantasies so credulous that they are almost charming. In a powerful piece in the Independent today,[2] Johann Hari parallels the militant complacency of the current ruling elite with the thinking of previous highly developed social groups, such as the Incas and the Mayans, which had “committed ecocide”. “What were Easter Islanders saying as they cut down the last tree on their island?” Hari quotes geographer Jared Diamond asking in his book Collapse: How Societies Choose to Fail or Survive. It is grim to reflect that the answers — “jobs not trees!” or “technology will solve our problems; never fear, we’ll find a substitute for wood” — are precisely the rationalisations that a thanatropic drive would produce in order to do its work. In the unconscious, Freud says, no one really believes they will die, and this is no doubt also true of civilisations, which, despite the melancholy monuments testifying to the demise of the Maya and Easter Island, are convinced that they are the exceptions, the ones which cannot perish.
It is easy to see what capitalist “realism” means when you consider Blair’s habitual response to appeals from the environmental lobby. Measures to rein in eco-catastrophe may well be desirable — even necessary — but they, Blair tells us with a heavy heart bursting his sleeve, are “politically impossible”. Here, then, is capitalist “realism”: the reduction to the realm of the “impossible” of any steps that will prevent the destitution of the human environment. For that is what “realism” amounts to: not a representation of the real, but a determination of what is politically possible. But what is politically possible is at odds with what is physically possible, so in a sense, it is the servomechanism-agents of Kapital, not their opponents, who “demand the impossible” now. Their fantasy of a sustainable Kapitalism carrying on, forever, without burning out the planet, is perfectly delirial.
Another insight into capitalist realism was provided last week by Marxist economist Christian Marazzi (Scuola Universitaria Professionale della Svizzera Italiana, Lugano, Switzerland), whose lecture “Finance, Attention and Affect” at Goldsmiths was an interrogation of the meaning — and the psychological, social and neuronic impact — of post-Fordism.[3] Christian dated the moment of the switch from Fordism to post-Fordism very precisely: 6 October 1979. It was on that date that the Federal Reserve increased interest rates by twenty points, preparing the way for the “supply-side economics” that would constitute the “economic reality” with which we are now so familiar. The rise in interest rates not only contained inflation, it made possible a new organisation of the means of production and distribution. The economy would no longer be organised by reference to production, but from the side of the point of sale. The “rigidity” of the Fordist production line gave way to a new “flexibility”, a word that will send chills of recognition down the spine of every worker today. This flexibility was defined by a deregulation of capital and labour, with the workforce being casualised (with an increasing number of workers employed on a temporary basis) and outsourced.
The new conditions both required and emerged from an increased cybernetisation of the working environment. The Fordist factory was crudely divided into blue- and white-collar work, with the different types of labour physically delimited by the structure of the building itself. Labouring in noisy environments, watched over by managers and supervisors, workers had access to language only in their breaks, in the toilet, at the end of the working day, or when they were engaged in sabotage, because communication interrupted production. But in post-Fordism, when the assembly line becomes a “flux of information”, people work by communicating. As Wiener taught, communication and control entail one another.
What Deleuze, after Burroughs and Foucault, called “the society of control” comes into its own in these conditions. Work and life become inseparable. As Christian observed, this is in part because labour is now to some degree linguistic, and it is impossible to leave language in the locker after work. Capital follows you when you dream. Time ceases to be linear, becomes chaotic, punctiform. As production and distribution are restructured, so are nervous systems. To function effectively as a component of “just in time production”, you must develop a capacity to respond to unforeseen events, you must learn to live in conditions of total instability, or “precarity”, as the ugly neologism has it. Periods of work alternate with periods of unemployment. Typically, you find yourself employed in a series of short-term jobs, unable to plan for the future.
The horrors of these new working patterns are clear, but it is imperative that the left renounces one of its most dangerous addictions, its nostalgia for Fordism. As Christian pointed out, the disintegration of stable working patterns was in part driven by the desires of workers — it was they who, quite rightly, did not wish to work in the same factory for forty years. In many ways, the left has never recovered from being wrong-footed by Kapital’s mobilisation and metabolisation of the desire for emancipation from the Fordist routine. Especially in the UK, the traditional representatives of the working class — union and labour leaders — found Fordism rather too congenial; its stability of antagonism gave them a guaranteed role. But this meant that it was easy for the advocates of post-Fordist Kapital to present themselves as the opponents of the status quo, bravely resisting an inertial organised labour “pointlessly” invested in fruitless ideological antagonism which served the ends of union leaders and politicians, but did little to advance the hopes of the class they purportedly represented. And so the stage was set for the neoliberal “end of history”, the “postideological” ideological justification for rampant supply-side economics. Antagonism is not now located externally, in the face-off between class blocs, but internally, in the psychology of the worker, who, qua worker, is interested in old-style class conflict, but, as someone with a pension fund, is also interested in maximising their investment. There is no longer an identifiable external enemy. The consequence is that, as Christian put it in a memorable image, post-Fordist workers are like the Old Testament Jews after they left the “house of slavery”: liberated from a bondage to which they have no wish to return but also abandoned, stranded in the desert, confused about the way forward.
The psychological conflict raging within individuals — they themselves are at war — cannot but have casualties. One hidden, or at least naturalised, consequence of the rise of post-Fordism is that the “invisible plague” of psychiatric disorders that has spread, silently and stealthily, since around 1750 (i.e. the very onset of industrial capitalism), has reached a new level of acuteness in the last two decades. This is one more dimension of the Real that capitalist realism is constitutively unable to process.
It is typical of New Labour that it should have committed itself, so early in its third term, to removing people from incapacity benefit, as if most people claiming the benefit were malingerers. In contrast with this assumption, it doesn’t seem unreasonable to infer that most of the people claiming incapacity benefit — and there are well in excess of two million of them — are casualties of Kapital. A significant proportion of claimants, for instance, are people psychologically trashed as a consequence of the capitalist realist insistence that mining was no longer economically viable (though, even considered in brute economic terms, once you factor in the cost to taxpayers of such benefits, the arguments about “viability” seem rather less than convincing). Many have simply buckled under the terrifyingly unstable conditions of post-Fordism.
The current ruling ontology rules out any possibility of a social causation of mental illness. The chemico-biologisation of mental illness is of course strictly commensurate with its de-politicisation. Considering mental illness as an individual chemico-biological problem has enormous benefits for capitalism: first, it reinforces capital’s drive towards atomistic individualisation (you are sick because of your brain chemistry), and second, it provides an enormously lucrative market in which multinational “psycho-mafias” can peddle their dodgy drugs (we can cure you with our SSRIs). It goes without saying that all mental illnesses are neurologically instantiated, but this says nothing about their causation. If it is true, for instance, that depression is constituted by low serotonin levels, what still needs to be explained is why particular individuals have low levels of serotonin.
The increase in bipolar disorder is a particularly significant development. In the discussion after Christian’s lecture, I asked him about the relationship between this form of mental illness and capitalism as a system. It is clear that capitalism, with its ceaseless boom and bust cycles, is itself, fundamentally and irreducibly, bipolar. Capitalism is characterised by a lurching between hyped-up mania (the irrational exuberance of “bubble thinking”) and depressive come-down. (The term “economic depression” is no accident). To a degree unprecedented in any other social system (and capitalism is very precisely NOT a social “structure” in the way that the despotic state or the primitive socius are), capitalism both feeds on and reproduces the moods of populations. Without delirium and confidence, capital could not function. As it happened, Christian confirmed that he had in fact been working with people who had been “psychologically smashed” by capitalism, many of whom, it turned out, had in fact developed bipolar disorder. It could hardly be denied that there is an isomorphic relationship between the social and individual disorders of capitalism.
How could madness not result when we are invited to consider it “realistic” for America to consume $600 billion a year more than it produces? (As opposed, so we are told, to Europe’s “unrealistic” social welfare programmes.) Make no mistake, the realists are insane, which more than ever reveals the force of the slogan, “the Real is the impossible, but the impossible which happens”. Ecological catastrophe and mental illness are present in capitalism’s wrap-around simulation as warps, unassimilable discontinuities, that which cannot be but which, nevertheless, cannot be extirpated. Perhaps these negative Reals — these dark shadows which allow us to see Kapital’s striplit mall of the mind for what it actually is — have their complement in a positive Real, an event completely inconceivable in the current situation, but which will break in and re-define everything.
what if they had a protest and everyone came
What kind of protest is it that everyone agrees with?
If you weren’t already suspicious of the dull unanimity that coalesced on Saturday [Live 8],[2] reflect on the fact that the Russian show only happened because Putin didn’t want to be the only G8 leader whose country did not have a Live 8 gig. That fact alone reveals that the relationship between the current ruling elite and their ostensible opponents in the entertainment biz goes far beyond complicity.
Live 8 rests on two “libidinal fallacies”.
The first is obvious: it ignores the systemic and abstract nature of the geopolitical situation. It really isn’t the case that “eight men in a room” can “change history” simply by an act of will. Beyond the sentimental bluster, everyone knows that, but Live 8 depends upon a fantasy that there are two types of subject who need to be enlightened: the Subject Who Does Not Know (and whose “awareness” is to be raised) and the Subject Who Knows But Who Doesn’t Care. But who are these people? Who, exactly, needs to be “made aware” of the fact that Africa is desperately poor? And does anyone, even those who buy into the cheap off-the-shelf caricature of Bush as a dumb chimp, really think that he, personally, deliberately chooses to inflict starvation on African children? More to the point, does anyone really think that, on the level of personal morality, Bush is any different from the billionaire pop stars so histrionically raising their fists against him and wagging their fingers at us? That is to say: if there is some sort of moral dividing line, would you really want to place Bush on one side and Elton John and $ Bill Gates on the other?
It is not that Live 8 is a “degraded” form of protest. On the contrary, it is in Live 8 that the logic of the protest is revealed in its purest form. The protest impulse of the Sixties posited a Malevolent Father, the harbinger of a Reality Principle that (supposedly) cruelly and arbitrarily denies the “right” to total enjoyment. This Father has unlimited access to resources, but he selfishly — and senselessly — hoards them. Yet it is not capitalism but protest itself which depends upon this figuration of the Father. It goes without saying that the psychological origins of this imagery lie in the earliest phases of infancy. The hippies’ bucolic imagery and “dirty protest” — filth as a rejection of adult grooming — both originate in the “unlimited demands” of the infant. A consequence of the infant’s belief in the Father’s omnipotence is the conviction that all suffering could be eliminated if only the Father wished it. (In terms of Live 8: if only those 8 men yield to our demands, all poverty could be eliminated forever!) The demand for total enjoyment is actually pretty indiscriminate: the protest could just as easily be against war (bummer maaaan) or against being charged for going into a festival (hey, breadheadzzzzzzz, don’t be heaveeeee…)
Incidentally, one of the successes of the latest global elite — the Social Democrats — has been their avoidance of identification with the figure of the hoarding Father, even though the “reality” they impose on the young is substantially harsher than the “reality” they protested against in the Sixties. In this sense, Bush is a godsend for Blair, since Blair can pose as the “really realistic” representative of Social Democratic moderation “winning concessions” from the obscene excesses of Bush, the Junkyard King of Amerikapital’s hideous fusion of id and superego. (The reference to the Birthday Party is not idle here. Oddly, their Junkyard strikes me as an uncannily prescient psychoanalysis both of Bushite Amerika and the role that it plays in everyone else’s fantasies, “Big-Jesus-Oil-King down in Texas drives great holy tanks of Gold/screams from heaven’s Graveyard/American heads will roll in Texas/roll like daddy’s meat…”)
This brings us to the second fallacy. What is being disavowed in the abjection of evil and ignorance onto fantasmatic Others is our own complicity in planetary networks of oppression. What needs to be kept in mind is BOTH that capitalism is a hyper-abstract impersonal structure AND that it would be nothing without our co-operation. As I will never tire of insisting, the most Gothic description of capital is also the most literal. Capital is an abstract parasite, an insatiable vampire and zombie-maker; but the living flesh it converts into dead labour is ours, and the zombies it makes are us. Determinists of both a neoliberal and anti-humanist bent (believe it or not, it is not unheard of for such positions to coincide within the same person, proving that Marx wasn’t wrong about the essentially contradictory nature of capitalist ideology) merely echo teleo-Marxism at its most eschatological when they insist that the meat (or human) components of the capital machine are of no consequence since the total triumph of capital is historically inevitable.
The question of what capital wants from us requires answers at a number of levels: economic, psychoanalytic, and perhaps most pressingly, theological. In any case, it is clear that, for the moment at least, capital cannot get along without us. It remains the case, however, that we can get along without it. The parasite needs its “mere conscious linkages”, but we do not need the parasite. In addition to anything else, to ignore the crucial functioning of the meat in the machine is poor cybernetics. The denial of human agency is an SF fantasy, albeit one that is everywhere realising itself.
But to reclaim that agency means first of all accepting our insertion at the level of desire in the remorseless meat-grinder of capital. Capital is not something imposed upon us by Bush; it is we who are hooked on the “garbage in honey’s sack”, unable to kick the habit of returning to the Big Jesus Trashcan for another hit of feel-good junk.
It also means raising the price — libidinal, personal, monetary — of agency. The repeated claim from onstage multi-millionaires that the audience were going to “change history” simply by turning up and tuning in cheapens agency in every sense. Participating in a narcissistic, self-righteous spectacle is not “doing something”. Tony Parsons, of all people, made the very good point in the Mirror today that the generation of the Thirties and Forties did not expect Crosby and Sinatra to change the world — but, as he says, many of them had either risked or given up their lives to change things.
Withdrawal from the capital matrix entails an unplugging that will seem painful to nervous systems commensurated to the Reality-Pleasure Principle. Partly it means giving up the reassuring comforter of the Bad Father Figure and facing the fact that the G8 leaders are not capable of legislating away all planetary misery, but are “old men at the crossroads”, capital’s meat puppets not its masters. There is a sense in which it simply is the case that the political elite are our servants; the miserable service they provide for us is to launder our libidos, to obligingly re-present for us our disavowed desires as if they had nothing to do with us. If anyone is in charge in Kapital it is Oedipus Rex, i.e. us. (“I yam the King!” as Cave caterwauled on “Junkyard”. Yes: the junkie as monarch, that’s capitalist sovereignty.) The political “reality” that Bush and the others will no doubt blame their failure to act upon is not just an ideological smokescreen. It is the reality constituted by the desires of that selfsame Live 8 crowd who, when push comes to shove, will not pay extra taxes, will not give up cheap flights or car use, will not make a stand against inequity and stupidity at work if it means compromising their interests and those of their famileeeee and yet who expect global crises to be magically solved by eight stooges in a room.
The great benefit of Lacanianism is to reject both the party of the Infant (“you want new masters, and you shall have your wish” as Lacan told the student protestors of the Sixties) and the party of the Father (the empiricomongers who try to sell the Symbolic as the only Real). There must indeed be a demand for the Impossible, but an Impossible which does not correspond with the definition provided by either party. It is not a question of total enjoyment, but of the not-all, a sober psychosis, lessness…
defeating the hydra
In Marvel’s Nick Fury, Agent of S.H.I.E.L.D. comics, the nefarious S.P.E.C.T.R.E.-like international crime and terror network was called H.Y.D.R.A. Its slogan was “cut off a limb and two more shall take its place”. In Saturday’s Times, Paul Wilkinson, Chairman of the Centre for the Study of Terrorism and Political Violence, described the “decentralised network” of al-Qaeda as a “true hydra”. But the lesson of the hydra myth — that to use force against certain types of enemy is not only ineffective, it is counter-productive — is one that the leaders of the War on Terror have yet to learn.
It is the absurd War on Terror itself that has fed the al-Qaeda hydra and put British citizens on the frontline. The issue here is not simply a causal one — the War on Terror has made life unsafer in the West — but a conceptual one — the very notion of a War on Terror has meant that Western populations are reclassified as active combatants in a war not only to the death, but beyond death, an infinite, excitatory cycle of violence begetting violence.
Despite what the increasingly hysterical Pro-Bombing “Left” (PBL) maintain, the causal argument is won. (A testament to this is the way in which the PBL refuse even to have the argument. As one, they have wagged their finger at anyone who has pointed out the obvious causal chain linking US and British foreign policy with Thursday’s events, tut-tutting about the unseemliness of “politicising” the atrocity “even before the bodies are buried”, as if contempt for neo-imperialist Shock and Awe somehow equated to lack of respect for the victims of the attacks in London, as if their own columns were disinterested and neutral, and as if solemn moralising rather than political analysis were what is called for.) The claim that the bombing of Iraq has been a recruiting sergeant for terrorism is uncontroversial. A Foreign Office and Home Office dossier cited in the Sunday Times today states what any intelligent observer already knows:
It seems that a particularly strong cause of disillusionment among Muslims, including young Muslims, is a perceived “double standard” in the foreign policy of western governments, in particular Britain and the US. The perception is that passive “oppression”, as demonstrated in British foreign policy, e.g. non-action on Kashmir and Chechnya, has given way to “active oppression”. The war on terror, and in Iraq and Afghanistan, are all seen by a section of British Muslims as having been acts against Islam.[2]
Even the Economist grants that some of al-Qaeda’s “large group of sympathisers” will have had “extra levels of motivation since the Iraq war”. (It adds: “George Bush has sometimes claimed that a silver lining to the cloud his forces are struggling through in Iraq is that at least the West’s enemies are being fought there rather than at home. The attacks in London are a reminder that that view is as wrong as it is glib.”[3])
But the reclassification of the struggle with al-Qaeda as “war” is another factor that promotes, inspires and legitimates terrorism, a factor perhaps no less significant than the misadventures in Iraq and Afghanistan. For example: it used to be the case that the British government refused to accept that it was “at war” with the IRA; it was the IRA who made that claim. The unwillingness to concede that Britain was engaged in war partly had the effect of making it possible to claim both that the IRA were terrorists (i.e. BY DEFINITION not a group with whom one could be at war) and that any attack on the civilian population was an outrage visited on innocents. But if indeed we ARE at war (as the oxy/moronic War on Terror would have us believe), and if what “we” are fighting for is “our values”, and “simply getting on with our lives” is an expression of those “values” — as, since Thursday, we have endlessly been told it is — then it would follow that we are all indeed warriors co-opted into the War on Terror. As Simon Jenkins put it (also in the Sunday Times), “it is Blair who gave terrorism the status of war. He can hardly complain when the enemy treats it as such”.
Johann Hari observed — surely not approvingly? — that the bombings on Thursday were received in London almost as if they were a natural disaster. Much of the media here has insisted, rather, that the bombings be treated as a SUPERNATURAL disaster, the act of a transcendent Evil that cannot and furthermore must not be explained. Both Blair and Bush find it expedient and congenial to use a theological language to describe a threat that would be better considered in more worldly terms. That language is dangerous for two reasons: first, because it contributes to the sublimation of the al-Qaeda threat, transforming a diffuse network into a supernatural force, and second, because it renders all analysis of the threat al-Qaeda actually poses all the more difficult.
According to an emerging orthodoxy in certain sections of the British media, just about any attempt to offer economic, political or sociological explanation for al-Qaeda’s emergence is tantamount to an expression of sympathy for its aims and methods. As Savonarola has pointed out, the PBL and other reactionaries attempted in the immediate aftermath of Thursday to make the very word “political” a slander as they desperately cast about trying to establish a period of non-reflection in which “politics” and thought could be suspended — a period, that is to say, in which their politics and their non-thinking could be imposed as the default response.
The most facile and stupid example of this type of argument might have been Nick Cohen’s piece in the Observer today,[4] rightly excoriated by Lenin[5] (I say “might” because the amount of shrill stupidity, sentimental nonsense and emotional pornography churned out by the hacks over the last few days has reached new levels of stupefaction, as the miserable reality of central London’s rapacious Hobbesian inferno, where folk will beat you to death rather than let you get into a Tube ten seconds before them, has been magically transformed by the bombs and media fairy dust into the very essence of an underdog England in which it is WWII forever: to the sound of choruses of “maybe it’s because I’m Londahner” ringing out from the ghosts of the music halls, journos have shamelessly done themselves up as pearly kings and queens, taking on the role of celebrants of a Fantasy London which is as convincing as Dick Van Dyke’s accent in Mary Poppins.) The “agalma”, the special treasure, of this London resides in the status of “heroic victim” that a disaster such as this re-confirms. A dangerous logic takes hold: we’re under attack, we must be Good.
The supernaturalisation of al-Qaeda is crucial to this strategy. If we are the Good, it can only be the senselessly Evil, the irrationally jealous, who would want to attack us. (This mode of bewildered self-aggrandising is as crucial to a certain version of American identity as spam-eating-make-do-and-mend-what-you-complaining-about-that-severed-leg-for dour fortitude is crucial to Blitz Englishness.) Needless to say, the positing of an ethnic subject — We, the Good — whose innate virtue is reconfirmed by its being attacked is constitutive of both the al-Qaeda and the post-9/11 US mindset. A military asymmetry is doubled by a fantasmatic symmetry. Each is the other’s Satan.
To talk of al-Qaeda in theological (rather than in political, social or economic) terms is to adopt their mode of discourse in an inverted form. It is to return to a pre-Feuerbachian, pre-sociological perspective in which all the lessons of the nineteenth- and twentieth-century studies of the social psychology of religion — undertaken by figures as diverse as Durkheim, Marx, Weber, Nietzsche and Freud — are forgotten. If a particular strain of religion is to be understood as, in Cohen’s words, “an autonomous psychopathic force” rather than as a social, economic and psychological phenomenon with complex causes, then all hope of reasoned analysis is a priori ruled out. Unreason is abjected onto the enemy (even as it is evinced in one’s own not even minimally coherent ravings), thus legitimating the idea that “the only option” is military force.
The floating of the pseudo-concept of “Islamofascism” has been central here. There are any number of reasons to consider the idea that there is such a thing as Islamofascism a nonsense. Here are two. First of all, fascism has always been associated with nationalism, but, like global capital, Islamism has no respect for nationality; the first loyalty of the Islamist is to the global Umma. Secondly, fascism is about the State — Islamism has no model of the State, as could be seen in Afghanistan under the Taliban.
The only sense one can make of “fascism” as used by the PBL is that it names anything that is really, really bad (that well-defined category) or it involves the curtailment of liberties. The brand of Islamism al-Qaeda favours would certainly curtail liberties, but not necessarily the same ones that fascism would curtail, or for the same reasons.
Rather than engaging in nebulous negative sublimation — “Behold, Satan” — it would better behove the opponents of Islamist Terrorism to consider more carefully what is specific about it. As John Stevens noted over the weekend, the typical al-Qaeda terrorist is unlikely to have been parachuted in from an Afghan village. They are much more likely to have lived in the West, either as residents or as nationals. Their affiliation with al-Qaeda will, we can speculate, almost certainly serve the function of resolving a tension in themselves. Al-Qaeda recruit from schools and colleges because they are astute enough to recognise that male adolescence is a time of boiling confusion that craves easy certainties. It cannot be that difficult for a fervent Jihadi to convince impressionable young men adrift in the miserable haze of Babylonic capitalism that it is not al-Qaeda but their enemies who are really Evil.
After all, it is not hard to construct a convincing story that the success of the West has been achieved at the expense of Muslims. The Sunday Times reports that in Britain “Muslims are three times more likely to be unemployed than the population as a whole; 52% of them are economically inactive (the highest of any faith group) and 16% have never worked or are long-term unemployed. This is blamed on a lack of education: 43% of Muslims have no qualifications.” But it is not just the poor themselves who flock to al-Qaeda; it is also those burning with a sense of injustice on behalf of the poor.
In this context, it is worth remembering Giuliani’s jaw-dropping proclamation (to which Savonarola has been assiduous in drawing our attention): “People who live in freedom always prevail over people who live in oppression.” So speak the Masters, the Winners… Who speaks for the oppressed then? The rise of Islamism must be correlated with the demise of the left. If it has become the default repository for Muslim rage against injustice then that is partly due to the US, which, as is well-known, funded Islamist Jihadis in a bid to defeat Communism. Since only something like Communism could absorb and re-direct the energies that are fuelling al-Qaeda, I look forward to the day when the US will fund Islamic Communism, and the circle will be complete.
the face of terrorism without a face
So Tony Blair is the leader who has brought suicide bombing to Britain.
Any remaining doubt about the link between 7/7 and the Iraq bombing and occupation was dissipated today when a friend of one of the suspects, Mohammed Sadique Khan, spoke to — of all things — the Evening Standard. “The friend […] said Khan, Tanweer and Hussain grew up together and ‘often talked about their anger at their Muslim brothers and sisters being unfairly treated in Iraq by the US.’”
No surprises there. And no surprises, at least not for k-punk readers, that the bombers were British. That, at least, somewhat undermined the racist agendas of European and US “Experts” who blamed the atrocity on Britain’s supposedly insufficiently authoritarian immigration and asylum policies, barely concealing their disgust at multi-ethnic “Londonistan”, a stance that echoes Mark Steyn’s Islamophobic revulsion at “Eurabia”. The BNP in Barking found that their predictable attempts to extract political capital from the bombings — a leaflet with a photograph of the trashed number 30 bus over a caption saying, “Maybe now it’s time to listen to the BNP” — also fell foul of the revelation that the bombers came from Leeds, not the Middle East. Naturally, that news brings with it the possibilities for other kinds of exploitation by racists. It is a grotesque understatement to say that the next few months will not be easy for Muslims in Britain. Emollient words about “true Islam” will be as ineffective as they are misleading. There is no true Islam. Islam, like all other religions, is a riot of contradictions, a tissue of interpretations. The words of the Prophet give as much comfort to zealots as to pacifists.
David Davis said last week that modern terrorism is “terrorism without a face”. Suddenly, however, the terrorists have a face — even though it is not the one that many expected, or wanted. The photographs of the perpetrators and the photographs of the victims — who could tell them apart? There is no tell-tale “demonic stain” on the faces of the killers. They aren’t the austere, obsessive “foreigners” that the popular imagination had conjured. They wore trainers and tracksuits, they were religious, sure, but no one thought they were fanatics. They weren’t even socially dysfunctional geeks. By all accounts, they were popular, played cricket. Nor was there any obvious lack or deprivation in their lives.
The obvious questions seem to be “how”, “why”? Yet the same questions do not seem the obvious ones to ask when we see photographs of similar young men who happen to be in the US or British forces, men who have participated in the killing of very many more civilians.
The Blairite objection to terrorism cannot be its means, since he, too, considers the killing of a certain number of civilians an acceptable sacrifice for the greater Good. (One of the problems this kind of utilitarian calculus has always faced is that there is no obvious point at which to stop counting the consequences. But, as we’ve already established, surely Thursday must count amongst the consequences of the Iraq misadventure.) It is the ends, then, in which the difference must reside, not the means. Blair is supernaturally confident that he is on the side of the angels, that he is pursuing the Good, whereas his enemies are Evil. The problem is that they think exactly the same way.
He tells us that we are in a war. But to many Muslims — not “mad mullahs”, but, amongst others, young men from “ordinary” backgrounds — it is as obvious as it is to Blair what the right side, the only side, to be on is. It is the side of the poor and the oppressed, not the side of the hyperprivileged and the massively well-armed. The rage, the righteous sense of injustice that led those four to give their lives and take the lives of others — and please, do not describe what they did as “cowardly”; “brutal” by all means, but not “cowardly”, and certainly nowhere near as cowardly as the Powell doctrine of bombing from a great height — that anger needs to be channelled by other forces, forces which don’t counter oppression with repression, which don’t transform rage into outrage.
UPDATE: Breakfast TV, BBC1. A group of young Muslims from Leeds — not “fanatics” by any means — tell the reporter (who has to concede that they are articulate and measured) that Iraq is the major factor in switching young men onto extremism in Britain. They make it clear that they are appalled by the events of last Thursday, condemn them without reservation, but nevertheless are angered by the patent double standards of the British media. The fifty people who died last week — whose deaths they in no way trivialised — seem to count much more than the thousands who die in Iraq. (It makes me wonder what would happen if the media indulged in what Simon Jenkins called “grief pornography” for Iraqis: if there were back stories and photographs for all of them, would the public mood change?) In the studio, Irshad Manji, author of The Trouble with Islam Today, tries to demur, falling back on the standard line that 9/11 preceded Iraq. True enough, but there had never been suicide bombing in Britain until last week. Manji makes some good points: in a piece the other day (I think in the Standard?), she broke ranks with the sentimental consensus about “true Islam”, arguing that there needs to be an Islamic Reformation, with the acceptance within the religion that certain passages of the Koran can be wrong. But the call for Islamic auto-critique must go alongside a recognition that the “Crusader” policies of the US and the UK feed an aggrieved militancy that will make that kind of Reformation much less likely.
conspicuous force and verminisation
The paradoxical War on Terror is based on a kind of willed stupidity; the willed stupidity of wishful thinking. Only the logic of dreamwork can suture “War” with “Terror” in this way, since terrorists were, by classical definition, those without “legitimate authority” to wage war. However, it has been horribly evident for some while that a new, frighteningly facile, definition of Terrorism has come into play. What makes Terrorists terrorists is not their supposed lack of legitimate authority but their Inherent Evil. We are ontologically Good; Good by our very nature, no matter what we do. We belong to an “alliance of moderation” against the Axis of Evil. So when “we” “accidentally” level an apartment block full of children with our moderate bombs, we do not cease to be moderate. The difference between They, the Evil, and We, the Good is, of course, intent; the Terrorists deliberately target civilians. This is their only aim, because they are Evil. Although we kill vastly more civilians, we do not intend it, so we remain Good.
For the libidinal roots of this wishful thinking, we have to look beyond the foibles of individuals to the political unconscious of the hyper-militarised state. It is geared to deal with threats if they come from other armed states, so it pretends — deceives itself, and then attempts to deceive us — that this is in accord with the actual geopolitical situation. Condi’s crocodile tears notwithstanding, the US, needless to say, is in no position to condemn Israel’s air strikes, since the Israeli bombings follow the War on Terror script to the letter. The conflict with Hezbollah turns into a destruction of Lebanese people and infrastructure, just as the struggle with al-Qaeda became a war on Afghanistan and Iraq. For the hyper-militarised state, asymmetry can only be thought of as an advantage: we have more and better weaponry than them, therefore we must win.
The stupidity here is evident, and multi-levelled. First of all, it involves a literal occlusion and suppression of intelligence. Terrorism is a problem to be met with brute force rather than with intelligence. Successfully defeating Terrorist groups is a long-term business, dirty, but above all, stealthy, invisible. But the War on Terror is inherently and inescapably spectacular; it arises from the demands of the post-9/11 military-industrial-entertainment complex: it is not enough for the state to do something, it has to be seen doing something. The template here is Gulf War 1, which, as both Baudrillard and Virilio knew, could not be understood outside logics of mediatisation. Gulf War 1 was conceived of as a kind of re-shooting of Vietnam, with better technology, and on a videogame desert terrain in which carpet bombing would be industrially effective. This is the kind of asymmetry that the military-industrial-entertainment complex likes: no casualties (on our side).
The bringing to bear of what, following Veblen, we might call conspicuous force presupposes a second stupidity: the verminisation of the enemy. Before Gulf War 1 had even happened, Virilio saw the logic of verminisation rehearsed in James Cameron’s Aliens, wherein the “machinic actors do battle in a Manichean combat in which the enemy is no longer an adversary, a fellow creature one must respect in spite of everything; rather, it is an unnameable being that it is more appropriate to exterminate than to examine or analyse.” In Aliens, Virilio ominously notes, attacks on the “family [form] the basis of […] neocolonial intervention”. The teeming, Lovecraftian abominations which can breed much faster than we can are to be dealt with by machines whose “awesome appearance is part of [their] military effectiveness”. Shock and awe.
Aliens was the moment in which a new mode of the military-industrial-entertainment complex became visible. Virilio argued that Aliens’ privileging of military hardware “could only lead in the end to the extinction of the talking film, its complete replacement by film trailers for hardened militarists”. In fact, the talking film has been replaced by the shoot-em-up videogame whose picnoleptic delirium is flat with the prosecution of the Sega-Sony-CNN war. “Realists” who attacked Baudrillard and Virilio for their insistence upon the fact that war is now constitutively mediatised missed the point that hyperrealisation is precisely what permits the production of very real deaths on a mass scale.
Verminisation not only transforms the enemy into a subhuman swarm that cannot be reasoned with, only destroyed; it also makes “us” into victims of its repulsive, invasive agency. As Virilio perspicaciously observed, Aliens itself operated “a bit like a Terrorist attack. Women and children are slaughtered in order to create an irreversible situation, an irremediable hatred. The presence of the little victim has no theatrical value other than to dispose us to accept the madness of the massacres…”
While “we” have “families” who are being senselessly killed, vermin have neither memory nor motive; they act unreflexively, autonomically. Their extermination is a practical problem; it is simply a matter of finding their nests and using the right kind of weapon. Applying this thinking to Hezbollah or any other group is appalling racism, naturally, but also astonishingly poor strategy, implying no understanding of Terrorism whatsoever. Destroy all the infrastructure, kill all the operatives: but you will have only created more images of atrocity; indestructible and infinitely replayable repositories of affect, which, by demanding response and producing (a usually entirely justified) recrimination, act as the best intensifiers and amplifiers of Terror.
my card: my life: comments on the amex red campaign
The current American Express Red advertising campaign[2] cries out for the kind of intricate semiotic dissection Roland Barthes pioneered in Mythologies. The ad — which shows happy, smiling supermodel Gisele embracing happy, smiling African Maasai warrior, Keseme — is a succinct emblem of the current ruling ideology.
The image, with its evocation of ideas of culture and nature, consumerism and debt, independence and dependence, fairly drips with polysemic resonances. There is enough here to keep semiologists busy for years.
But the central opposition — “My Card” versus “My Life” — says more than it intends. The First World is metonymically represented by a plastic card, and it is left to the Third World to symbolise all the “natural” vitality that unliving capital has eliminated from Western culture. The Western woman equals (artificial, cosmetic) culture; the African man equals living nature. Indeed, when we click on the “My Life” button we see the stereotypically described “proud and fiercely independent […] Maasai tribes of East Kenya” suborned into the role of embodying “the dignity, courage and breathtaking beauty of Africa”, their culture quickly flattened back into nature.
Slavoj Žižek has argued that what he calls “liberal communism” — as exemplified by the charitable gifts made by super-successful capitalists such as Bill Gates and George Soros — is now the dominant form of capitalist ideology. “According to liberal communist ethics”, Žižek argues,
the ruthless pursuit of profit is counteracted by charity: charity is part of the game, a humanitarian mask hiding the underlying economic exploitation. Developed countries are constantly “helping” undeveloped ones (with aid, credits, etc.), and so avoiding the key issue: their complicity in and responsibility for the miserable situation of the Third World.[3]
This is the real meaning of the embrace between Gisele and Keseme — under global capitalism, the relationship between First and Third Worlds can never be a symmetrical synergy in which both partners win. It will always be a system of structural inequality in which one side is destined to lose.
But Product Red marks a move on from Žižek’s liberal communism. Liberal communism is really just old-style philanthropy, in which exploitation is atoned for by subsequent acts of charity. With Red, by contrast, the act of consumption is presented to us as already and immediately benevolent. At the Product Red launch in January, Bono,[4] Red’s most high-profile advocate, made a point of differentiating the new approach from philanthropy. “Philanthropy is like hippy music, holding hands”, Bono claimed. “Red is more like punk rock, hip-hop, this should feel like hard commerce.” (It is unclear what inspired Bono’s invocation of punk rock — perhaps he was thinking of The Great Rock ‘n’ Roll Swindle — but his reference to hip-hop might be the most savage indictment of the genre yet.)
We confront here the curious mixture of brutal cynicism and dewy-eyed piety that is so characteristic of late-capitalist culture. The billboard version of the American Express ad tells us that “This card is designed to eliminate Aids in Africa”. Even when we dismiss this as obvious nonsense — the most credulous consumer cannot but be aware that the card was designed to increase the profits of American Express — the ideological blackmail still holds: how can anything which assists in the struggle against Aids in Africa possibly be wrong?
We’ve already touched upon one reason: campaigns such as this occlude and mystify the systemic character of the relationship between Western capital and the Third World. The picturesque image of a “traditional” Maasai warrior beguiles us into forgetting the way in which Western institutions profit from Third World debt. It also photoshops out capital’s attempt, in Žižek’s words, to “export the (necessary) dark side of production — disciplined, hierarchical labour, ecological pollution — to ‘non-smart’ Third World locations”.
Another, related, reason is that Product Red promises to eliminate politics as such. If the invisible hand of the credit card user can ameliorate the problem of Aids in Africa, there is no need for a political response at all — what John Hayes of American Express calls “conscientious commerce” will be sufficient. In this way, Product Red goes beyond using a Maasai tribesman to advertise American Express, and uses him to sell neoliberal ideology itself.
the great bullingdon club swindle
We’re all in this together.
Capitalist realism everywhere… On television yesterday morning, the relentless message coming from pundits and vox pops — even from most of those who reject the particular form that the cuts have taken — was that “something had to be done”. The Great Bullingdon Club Swindle is larceny and deception on such a grand scale that one almost has to admire its breathtaking audacity. The Bullingdon Club has pushed Doublethink to new limits with its mantric repetition of the ludicrous claim that it was New Labour policy, rather than the bank bailouts, that was responsible for the massive deficit. The strategy seems to be to employ the illocutionary power of repetition — if they keep saying it, then it will have been true. The Bullingdon boys are working a mass hypnosis trick, forcing through shock doctrine measures while the population are still in a kind of post-crash trance. But where, previously, neoliberals had used the crises in other political systems (state socialism, social democracy) as an opportunity to helicopter in their “reforms”, on this occasion they are using a crisis brought about by neoliberal policy itself to try to electro-shock the neoliberal programme back into life. I heard one buffoon on television saying that “we’ve been in denial for the last ten years”. If there’s denial, it’s happened in the last two years, and on the part of the neoliberals and their friends in the business elite, who — after demanding at gunpoint unprecedented sums of public money — are now brazenly continuing to peddle the story that they are the friend of the taxpayer and that it is welfare claimants, not them, who are the scroungers who have brought the country to the “brink of bankruptcy”. In what must surely be the most astonishing bait and switch in British parliamentary history, the victims of neoliberal policy — public services and the poor — are now being asked (or rather forced) to pay for the manifest and total failure of that policy. As John Gray argues in the LRB[2], it’s no surprise that Orange Bookers like the “wolf-eyed replicant” Nick Clegg — as China Miéville[3] memorably described him — are happy to impose on the country the same neoliberal programme that they have imposed on their own party. Even so, has there ever been a party that has so comprehensively and so quickly squandered the good will of those who voted for them as have the Lib Dems? Cuddly Vince Cable’s grinning excuse for the backtracking on student fees was a masterclass in capitalist realism, as he practically said, “Well, that’s what happens when you get into power — you give up your principles.” (Cable is increasingly looking like a villain from a John Grisham flick, the avuncular éminence grise whose charm lures you into the firm before he is revealed to be a sinister embezzling fraudster.)
For months now, we’ve been sold the story that public services are “bloated”. There’s no doubt that New Labour mismanaged public services and wasted money — on managers, on market Stalinist control procedures imported from business, and on GPs’ ludicrously overinflated salaries. But the narrative of an overfunded public sector produces cognitive dissonance for those of us who have actually been delivering frontline public services in the last ten years, where we’ve been expected to do more work for less money and with fewer resources. If those were the good times, you can only feel a shudder of dread anticipating what it will be like when things are bad. Incidentally, if you’ve wondered why there have been so few posts here in the last month or so, it’s because I’ve been trying to piece together a living as a visiting (i.e. casualised) lecturer, working in institutions that are strained to breaking point by neoliberal “reforms”. Cuts will mean more casualisation, in those institutions that will be able to survive at all.
But the most breathtaking aspect of the Bullingdon swindle is the “we’re all in this together” slogan, rightly described by Seumas Milne[4] as “preposterous”. What we’re seeing now is the Terminator of Capital with its neoliberal-managerialist mask wrecked, and the Big Society (Victoriana 2.0) ruse not convincing anyone. The doughy, fat-of-the-land face of privilege now shows itself openly, exuding the emollient manner of noblesse oblige, but without any sense of obligation. What survives is pure ideological reflex, the decorticated Terminator blindly blasting at its usual targets: public services, welfare, the arts. It’s folk economic faux-wisdom (“if a household overspends, we know that we have to give up things we’d rather keep”) that is providing the smokescreen for this ideological assault. Myths and deliberately cultivated misapprehensions abound: judging from all the rhetoric, you’d think that education and the arts were drains on the economy, rather than the highly successful “businesses” that they in fact function as.
Nevertheless, it’s crucial that we recognise that this is a time of opportunity for the left. Laurie Penny[5] is right that the Labour Party does not have the answers at the moment. Yet the Labour Party’s current lack of an agenda can be seen as a good thing, for two reasons. Firstly, at least this means that Labour has lost the managerialist neoliberal agenda that defined it for the last fifteen years. The de-New Labourisation process will take a while, but it will be expedited much more quickly with Ed Miliband as leader than it ever would have been with David at the helm. (Notice how David — whom the media were presenting as a great lost leader, a kind of world-historic statesman, on the grounds, presumably, that Hillary Clinton took a fancy to him — is already a forgotten man. In the media’s soap narrative, David’s leaving frontbench politics was an open wound from which the Labour Party would take years to recover — that doesn’t quite seem to be the case.) Secondly, the fact that the post-Blair and Brown Labour Party is now a cored-out shell means that it is a space which could, at least plausibly, be filled by new ideas and strategies. For the first time in fifteen years, the future of the Labour Party is not fixed. It’s worth remembering at this point that the failures of the Labour Party, its succumbing to capitalist realism, are not just the consequence of the internal logic of the party. It was extra-Parliamentary forces that gave rise to the Labour Party in the first place; it was the defeat of those forces that drove the Labour Party into its craven placating of business in the New Labour era. If Labour is to be anything more than a zombie party once again, it will be new forms of extra-parliamentary organisation that revivify it.
For that reason, this is definitely not the time to recline into the leftist version of capitalist realism, the defeatist counterpart to the Bullingdon Club’s bullishness. Now is the time to organise and agitate. The cuts can provide a galvanising focus for an anti-capitalist campaign that can succeed. Protests in these conditions won’t have the hubristic impotence of anti-capitalist “feelgood feelbad” carnivals and kettles. This is shaping up to be a bitter struggle, but there are specific, determinate and winnable goals that can be achieved here: it isn’t a question of taking a peashooter to the juggernaut of capital.
The UK, the first capitalist country, is the world capital of apathy, diffidence and reflexive impotence. But it is also a country that periodically explodes into rage. Beneath today’s ideological trance, beneath the capitalist realist hopelessness, an anger simmers here that it is our task to focus and coordinate. Public displays of rage can play an enormously significant role in shifting the symbolic terrain that is currently governed by capitalist realism. I know there are some who see parallels between now and the initial phases of the first Thatcher government. But Thatcher had a number of factors on her side which the Bullingdon boys don’t.
Firstly, Thatcherism was part of a wider global restructuring of capitalism — the objective tide of history was on its side. But global capital has not yet found a solution to the problems that led to the banking crisis.
Secondly, this shift from Fordism to post-Fordism allowed Thatcher to offer inducements that can’t be repeated: cheap shares from formerly nationalised companies, the sale of council houses. The nationalised companies have long since been sold off, and their private counterparts have in most cases failed to deliver the promised increases in consumer satisfaction — although they have certainly delivered massive profits to those who do hold shares in them. Now all we can look forward to are spiralling energy bills and higher train fares. There are no council houses to sell — indeed, the coalition is planning to effectively end what is left of social housing in this country for good, by forcing up council tenants’ rents, and limiting tenancies to five years.
Thirdly, there was of course the Falklands — but, since the forces are already stretched threadbare, where are the resources for such a neocolonialist intervention now, and would jingoism function in the same way in 2010 that it did in 1982?
Fourthly, there was Thatcher herself — a divisive but charismatic politician, who could plausibly present herself as struggling against vested interests, not only on the left, but also in the British establishment. The current Tory government has none of these advantages, and the neoliberal right in general has lost control of the future, much as it refuses to acknowledge this. In the Standard, Anne McElvoy recently described Ed Miliband as “an unreconstructed social democrat”. From what position does McElvoy think she is speaking here? Like much of the mainstream media, which is contriving to carry on as if 2008 didn’t happen, McElvoy is desperately clinging to the myth of a political “centre ground” that no longer has any legitimacy. After the bank bailouts, the neoliberal settlement is just as dead as social democracy.
The “we’re all in this together” slogan may turn out to be a phrase that comes to haunt the Tories in the way that “Labour isn’t working” dogged Labour for a generation. Classlessness might have seemed plausible for a moment when fronted by John Major, who didn’t go to university, or by Tony Blair, the poster boy for (leftist) post-political administration. But that moment has long passed, and cuts of this kind being forced through by a cabinet of aristocrats and millionaires make brutally apparent a class antagonism that the New Labour government obfuscated. Whenever the ruling class tells us that “we’re all on the same side”, it is a sure sign that we can hurt them. Similarly, the current media phobia about unions is an indication of the power that they have at this time. History is starting again, which means that nothing is fixed and there are no guarantees. Right-wing victory is only inevitable if we think that it is.
the privatisation of stress
Ivor Southwood tells the story of how, at a time when he was living in a condition of underemployment — relying on short-term contracts given to him at the last minute by employment agencies — he one morning made the mistake of going to the supermarket.[2] When he returned home he found that an agency had left him a message offering him work for the day. But when he called the agency he was told that the vacancy was already filled — and upbraided for his slackness. As he comments, “ten minutes is a luxury the day-labourer cannot afford”. Such labourers are expected to be waiting outside the metaphorical factory gates with their boots on, every morning without fail. In such conditions
daily life becomes precarious. Planning ahead becomes difficult, routines are impossible to establish. Work, of whatever sort, might begin or end anywhere at a moment’s notice, and the burden is always on the worker to create the next opportunity and to surf between roles. The individual must exist in a state of constant readiness. Predictable income, savings, the fixed category of “occupation”: all belong to another historical world.[3]
It is hardly surprising that people who live in such conditions — where their hours and pay can always be increased or decreased, and their terms of employment are extremely tenuous — should experience anxiety, depression and hopelessness. And it may at first seem remarkable that so many workers have been persuaded to accept such deteriorating conditions as “natural”, and to look inward — into their brain chemistry or into their personal history — for the sources of any stress they may be feeling. But in the ideological field that Southwood describes from the inside, this privatisation of stress has become just one more taken-for-granted dimension of a seemingly depoliticised world. “Capitalist realism” is the term I have used to describe this ideological field; and the privatisation of stress has played a crucial role in its emergence.
Capitalist realism refers to the widespread belief that there is no alternative to capitalism — though “belief” is perhaps a misleading term, given that its logic is externalised in the institutional practices of workplaces and the media, as well as residing in the heads of individuals. In his discussions of ideology, Althusser cites Pascal’s doctrine: “Kneel down, move your lips in prayer, and you will believe”: psychological beliefs follow from “going through the motions” of complying with official languages and behaviours. This means that, however much individuals or groups may have disdained or ironised the language of competition, entrepreneurialism and consumerism that has been installed in UK institutions since the 1980s, our widespread ritualistic compliance with this terminology has served to naturalise the dominance of capital and help to neutralise any opposition to it.
We can quickly grasp the form that capitalist realism now takes by reflecting on the shift in the meaning of the famous Thatcher doctrine that “there is no alternative”. When Thatcher initially made this notorious claim, the emphasis was on preference: neoliberal capitalism was the best possible system; the alternatives were undesirable. Now, the claim carries an ontological weight — capitalism is not just the best possible system, it is the only possible system; alternatives are hazy, spectral, barely conceivable. Since 1989, capitalism’s success in routing its opponents has led to it coming close to achieving the ultimate goal of ideology — invisibility. In the global North at least, capitalism proposes itself as the only possible reality, and therefore it seldom “appears” as such at all. Atilio Boron argues that capitalism has been shifted to a “discreet position behind the political scene, rendered invisible as the structural foundation of contemporary society”, and cites Bertolt Brecht’s observation that “capitalism is a gentleman who doesn’t like to be called by his name”.[4]
The Depressing Realism of New Labour
We would expect the Thatcherite (and post-Thatcherite) right to propagate the idea that there is no alternative to the neoliberal programme. But the victory of capitalist realism was only secured in the UK when the Labour Party capitulated to this view, and accepted, as the price of power, that “business interests, narrowly conceived, would henceforth be allowed to organise the shape and direction of the entire culture”.[5] But perhaps it would be more accurate to record that, rather than simply capitulating to Thatcherite capitalist realism, it was the Labour Party itself that first introduced capitalist realism to the UK political mainstream, when James Callaghan gave his notorious 1976 speech to the Labour conference in Blackpool:
For too long, perhaps ever since the war, we [have] postponed facing up to fundamental choices and fundamental changes in our economy […] We’ve been living on borrowed time […] The cosy world we were told would go on forever, where full employment could be guaranteed by a stroke of the chancellor’s pen — that cosy world is gone…
However, it is unlikely that Callaghan foresaw the extent to which the Labour Party would come to engage in the politics of “corporate appeasement”, or the extent to which the cosy world for which he was performing the last rites would be replaced by the generalised insecurity described by Ivor Southwood.
The Labour Party’s acquiescence in capitalist realism cannot of course be construed as a simple error: it was a consequence of the disintegration of the left’s old power base in the face of the post-Fordist restructuring of capitalism. The features of this — globalisation; the displacement of manufacturing by computerisation; the casualisation of labour; the intensification of consumer culture — are now so familiar that they, too, have receded into a taken-for-granted background. This is what constitutes the background for the ostensibly post-political and uncontestable “reality” that capitalist realism relies upon. The warnings made by Stuart Hall and the others writing in Marxism Today at the end of the 1980s turned out to be absolutely correct: the left would face obsolescence if it remained complacently attached to the assumptions of the disappearing Fordist world and failed to hegemonise the new world of post-Fordism.[6] But the New Labour project, far from being an attempt to achieve this new hegemony, was based precisely on conceding the impossibility of a leftist hegemonisation of post-Fordism: all that could be hoped for was a mitigated version of the neoliberal settlement.
In Italy, autonomists such as Berardi and Negri also recognised the need to face up to the destruction of the world within which the left had been formed, and to adapt to the conditions of post-Fordism, though in rather a different manner. Writing in the 1980s, in a series of letters that were recently published in English, Negri characterises the painful transition from revolutionary hopes to defeat by a triumphalist neoliberalism:
We have to live and suffer the defeat of truth, of our truth. We have to destroy its representation, its continuity, its memory. All subterfuges for avoiding the recognition that reality has changed, and with it truth, have to be rejected. The very blood in our veins had been replaced.[7]
We are currently living with the effects of the left’s failure to rise to the challenge that Negri identified. And it doesn’t seem a stretch to conjecture that many elements of the left have succumbed to a collective form of clinical depression, with symptoms of withdrawal, impaired motivation and the inability to act.
One difference between sadness and depression is that, while sadness apprehends itself as a contingent and temporary state of affairs, depression presents itself as necessary and interminable: the glacial surfaces of the depressive’s world extend to every conceivable horizon. In the depths of the condition, the depressive does not experience his or her melancholia as pathological or indeed abnormal: the conviction of depression that agency is useless, that beneath the appearance of virtue lies only venality, strikes sufferers as a truth which they have reached but others are too deluded to grasp. There is clearly a relationship between the seeming “realism” of the depressive, with its radically lowered expectations, and capitalist realism.
This depression was not experienced collectively: on the contrary, it precisely took the form of the decomposition of collectivity in new modes of atomisation. Denied the stable forms of employment that they had been trained to expect, deprived of the solidarity formerly provided by trade unions, workers found themselves forced into competition with one another on an ideological terrain in which such competition was naturalised. Some workers never recovered from the traumatic shock of seeing the Fordist-social-democratic world suddenly removed: a fact it’s worth remembering at a time when the Conservative-Liberal Democrat coalition government is hounding claimants off incapacity benefit. Such a move is the culmination of the process of privatising stress that began in the UK in the 1980s.
The Stresses of Post-Fordism
If the shift from Fordism to post-Fordism had its psychic casualties, then post-Fordism has innovated whole new modes of stress. Instead of the elimination of bureaucratic red tape promised by neoliberal ideologues, the combination of new technology and managerialism has massively increased the administrative stress placed on workers, who are now required to be their own auditors (which by no means frees them of the attentions of external auditors of many kinds). Work, no matter how casual, now routinely entails the performance of meta-work: the completion of log books, the detailing of aims and objectives, the engagement in so-called “continuing professional development”. Writing of academic labour, the blogger Savonarola describes how systems of permanent and ubiquitous measurement engender a constant state of anxiety:
One of the more pervasive phenomena in the current cod-neoliberal academic dispensation is CV inflation: as available jobs dwindle down to Kafkian levels of postponement and implausibility, the miserable Träger of academic capital are obliged not just to overfulfil the plan, but to record […] every single one of their productive acts. The only sins are sins of omission […] In this sense, the passage from […] periodic and measured measurement […] to permanent and ubiquitous measurement cannot but result in a kind of Stakhanovism of immaterial labour, which like its Stalinist forebear exceeds all rationales of instrumentality, and cannot but generate a permanent undercurrent of debilitating anxiety (since there is no standard, no amount of work will ever make you safe).[8]
It would be naïve to imagine that this “permanent undercurrent of debilitating anxiety” is an accidental side-effect of the imposition of these self-surveillance mechanisms, which manifestly fail to achieve their official objectives. None other than Philip Blond has argued that “the market solution generates a huge and costly bureaucracy of accountants, examiners, inspectors, assessors and auditors, all concerned with assuring quality and asserting control that hinder innovation and experiment and lock in high cost”.[9] This acknowledgement is welcome, but it is important to reject the idea that the apparent “failures” of managerialism are “honest mistakes” of a system which sincerely aims for greater efficiency. Managerialist initiatives served very well their real if covert aims, which were to further weaken the power of labour and undermine worker autonomy as part of a project to restore wealth and power to the hyper-privileged.
Relentless monitoring is closely linked to precarity. And, as Tobias van Veen argues, precarious work places “an ironic yet devastating” demand on the labourer. On the one hand, work never ends: the worker is always expected to be available, with no claims to a private life. On the other hand, the precariat are completely expendable, even when they have sacrificed all autonomy to keep their jobs.[10] The tendency today is for practically all forms of work to become precarious. As Franco Berardi puts it, “Capital no longer recruits people, but buys packets of time, separated from their interchangeable and occasional bearers”.[11] Such “packets of time” are not conceived of as having a connection to a person w