“Shall we blast or shall we build?”
So ran the final line in an old CND anthem written in 1958 by science fiction writer John Brunner and much sung on anti-bomb marches. Here’s more of it; the message, in many ways, remains relevant.
Time is short; we must be speedy
We can see the hungry filled
House the homeless, help the needy
Shall we blast, or shall we build?
At the outset of the 21st century a number of writings were published arguing that in the century ahead there would be a stark choice. If we dealt with pressing issues such as climate change, the environment, the oceans, the bees and butterflies, and harnessed the positive power of new technologies, then the 21st century could be the turning point towards a wonderfully utopian human life on Earth. If we didn’t, then we’re fucked. A good example of the genre was the very readable The Meaning of the Twenty-First Century by James Martin, published by the extremely worthy Eden Project. (I see you can now get a copy for 1p on Amazon…)
More recently there has been a small trickle of works presenting a similar death-row or rebirth scenario from a radical left viewpoint. Nick Srnicek and Alex Williams’ Inventing the Future (Verso, 2015) is a good example. They argue that the technology now exists, or could easily be developed, to do away with most of life’s drudgery, thus enabling a move to a postcapitalist future without work. The estimable Paul Mason certainly agrees with the postcapitalist formulation, as is well outlined in his PostCapitalism: A Guide to Our Future (Penguin, 2016).
Either-or scenarios are notoriously rigid, black and white with no grey. For the most part we humans muddle through. If we get an idealistic bee in our bonnet and enact it on a mass scale it usually leads to social, human and even economic disaster. On the other hand, a lack of idealism produces a cloying, nihilistic mess – which is not an unfair description of what we have now in many Western countries. Also, of course, a lack of idealism nearly always benefits the status quo which, in turn and by definition, tends to the right rather than the left.
However, for all that either-or discussions of the future, dystopia or utopia, can be hard to negotiate and make credible, there is, I believe, a special urgency to them right now in an era which is witnessing the death of the expert, almost total digital mediatization and the disruption of many political, social and even personal certainties. These are, of course, all of a piece and could potentially usher in an atrophy of meaningful democracy, a situation in which the most democratic ideas actually lead to their own withering away. I call it democracide.
Germany, perhaps still the most noted example, committed democracide in 1932–34, when democratic elections played by the rules enabled Hitler to take office despite his having lost the 1932 presidential election to Paul von Hindenburg. Hitler, of course, then abolished the position of President and merged it into that of Führer. And the rest really is history.
I cite Hitler simply to give an extreme example of democracide. Despite scare postings on social media, our situation is not directly comparable with the 1930s, although it is true that there are parallels that give cause for thought: governments targeting the most vulnerable, the rise of a working-class right wing given voice and legitimacy by populist denouncements of the political class, nationalism… Historically, however, and in the cause of verbal accuracy, we should be wary about labelling all this “fascist”. Until a right-wing military haunts the streets, until even just the semblance of democracy is abolished, until the role of Prime Minister is replaced by an authoritarian dictator, until there is a political philosophy that favours certain races and classes and denigrates others (and so on), we should probably hold off on the F word. Ghastly though fascism is, it is by no means the only repressive socio-political scenario of the right; its enormous disadvantages in the 21st century are twofold. First, our collective historical experience of the 20th century means we can “see it coming”, although it is true that the lessons of the past can remain unlearned no matter how tuned in to them we might appear to be. Second, and more importantly, it demands an actual abolition of the last vestiges of democracy, incomplete and increasingly imperfect though it may be. In a breathtaking paradox, it is the very groundswell of what passes for democracy (let’s just say Trump, Brexit) that prevents it from being taken over by obvious dictators.
In any case, if we are searching for the real dangers, the destabilisers of today, scraping the historical barrel for faint traces of fascism isn’t really where it’s at. Trump, and for that matter Farage, is a symptom, not a cause. The cause is elsewhere, much deeper and more elusive. It lies somewhere between the Internet, total digital mediatization and the death of the expert and/or truth.
Maybe we can start with social media – interestingly and maybe ironically enough, because if you read this blog it will either be on its own WordPress website or on Facebook. Once a thinking person with no voice and no outlet (because I’m not on the press and media gravy train), I find that online technology now enables me to write this, and you to read it. So far so good – the enabling, democratising power of digital media – or so we are often exhorted to believe.
There’s another side to this, though: it is often vulgarly said that opinions are like arseholes – everyone has them. Or to put it differently, there is no online filter that lets you know whether my particular opinions are worth reading or not, as opposed to the many thousands that are now available to you. In fact there’s an enormous load of drivel out there, and some very unpleasant drivel at that. (Of course, I have to concede that to some people my opinions too will be drivel.) The cultural and practical difference between this online present and the recent past is very simple. Off-the-wall opinions (much, much more off-the-wall than mine) were generally accessed in fairly remote places where most people didn’t go, notably, for example, in those wonderful, pokey little alternative bookshops that were found in most cities and gave you a thrill of subversion just to walk through the door. These tended to be “anarchist” or left-wing bookshops, the right wing not usually being quite as articulate and up front. But there were right-wing pamphlets issued at one time by the likes of the BNP, the National Front or – further back – the British Union of Fascists or the Mosleyite Union Movement. They were distributed on the streets, at meetings and surreptitiously in schools. In comparison to online culture these obscure little enclaves – left or right – remained outside the everyday experience of the majority. By contrast an opinion expressed on Facebook, Twitter or whatever, as we are all well aware, can go round the world as soon as it is posted. It can stimulate threads in which opinions are challenged, explained or amplified. The role of the “experts”, therefore, is vastly reduced and compromised. We all pitch in.
All this, of course, is familiar, but it needs to be acknowledged that one of the root causes of the death of the expert (probably the most significant cause of all) is the way digital technology has had the effect of legitimising all opinions, all views no matter how odd, iconoclastic, dangerous, idealistic, nasty, informed, ignorant, peaceful, warlike, wise or unbelievably stupid. In this situation it is easy for the likes of Trump or Farage or Le Pen to peddle a dangerous dope – the fearlessly stated, supposedly hitherto unexpressed views and feelings of the voiceless ones, the common if a little conservative folk who only want to make a living and do the right thing in the face of interference and corruption on the part of “them” – the political class or the liberal politically correct left. This constitutes another paradox. The views of this newly empowered democratic populectorate, expressed via its votes for demagogues, short-circuit and backfire into a kind of non-democracy. Trump spent millions convincing his supporters that he was just an ordinary guy like them. Jimmy Carter in 1976–7 spent the tiniest fraction of Trump’s (or Clinton’s) amount of money. His campaign didn’t have money, not on that scale. I’m not arguing for a return to the good old days when presidents only had a relatively small budget. I am saying that the situation is now totally out of hand, entirely a media (and very much a digital media) event, with no integrity – and yet rumbling away in Trump’s camp was the notion of the ordinary guy. The absurdity is staggering.
The death of the expert, however, is not confined to politics. You even find it in the very places where you might expect to find actual real experts – in universities and colleges. Although universities are always keen to trumpet the research profiles of their staff (and thus show how many experts they employ and therefore get more money), there has increasingly been a disconnect between this and what happens at the coalface, as it were – teaching undergraduates in particular. Making sure that students’ tasks fall easily within their grasp, underplaying levels of expertise, “democratising” knowledge, making the acquisition of knowledge and educational experience subservient to student retention and employment prospects – all this puts the experts in a strange position. They are supposed to know their stuff but they’re not supposed to show it in case it scares someone off.
Many teachers now find that their students check their facts on Google and even come back with alternatives. Is it entirely churlish to resent this? After all, facts is facts and we’re all entitled to know them and get them right. Yes, perhaps, but what is to be made of the piano student who checks everything their teacher tells them and comes back saying there’s another way of doing it, to the point where there’s no way to proceed? Or, from my personal experience, what of the piano student who watches YouTube performances of the piece he is learning, thus making it harder for me to encourage a search for his own interpretation? For that matter, what do we make of conversations which flow nicely until one of the company gets on the phone or their laptop to check the facts? I once went to a folk singing session and contributed a song, only for one of those present to look it up on their phone and tell me I’d missed out a verse. I really, honestly did want to take their phone and stamp on it.
All this, of course, illustrates the self-evident fact that our lives are now digimediatized pretty well constantly. Half the people you see in the street, and most on railway stations, are on their phones. The mobile phone is a novelty, not a necessity, although few now can see it that way. The idea that we should be available twenty-four hours a day means that basically we now live in public, in a kind of public mediasphere, in which privacy, private thoughts and the inner life are increasingly buried deeper. On Facebook we mourn the deaths of the famous, a kind of cyber-grief that has very little substance other than nostalgia. We vent outrage at what our political opponents are doing and procure a load of “likes”, but what effect does this have on any actual real-world situation? Probably very little…
The situation is even more complex, more existential than this. The cyberworld is not a place; it is not a family; it is not a palpable reality. It is a conceptual space, a space constructed in the mind rather than anywhere else in our lives, although it may lead to real live events. It is compulsive. On a radio programme recently young people were asked how many hours per day they spent online. Their answer was that it would be easier to work out how many hours they were not online. I spend too much time online. You probably do. There are gamers who play for up to sixty hours non-stop and wear diapers so they won’t have to get up. The online world has an addictive quality, and like all addictions it masks a spiritual desire for completion that too often ends in emptiness.
The new world, the postcapitalist world without work argued for by Srnicek and Williams, let us recall, is a world in which drudgery is taken over by technology. The idea of getting rid of boring, mind-numbing work is appealing and takes good aim against the work ethic promoted by capitalism under the guise of being good for us. It raises a problem familiar to readers of science fiction: what happens when the technology goes down? I mean all of it, not just the bit that accomplishes a particular task that we’ve been happy to scrap. An increasing or total dependence on technology and machinery, digital or otherwise (but mostly digital), must have the human being as its failsafe device. In other words, we would be ill-advised to forget the “old way”. In editing music or speech it can be the case that old-school studio engineers who were brought up editing magnetic tape with a razor blade and splicing tape have a particular sensitivity which bleeds through into their work with computer programmes. So the first thing is not to lose the basic skill. Reliance on the machine is a form of personal irresponsibility.
But further: the real dilemma is between the human spirit and technocracy. Those who stake much on the cyber world, on new technologies, on their amazing abilities and potentials might see this as a false, or old-fashioned, dichotomy. Surely the human spirit can now develop in hitherto unimagined ways, they might say, by embracing the extraordinary new world that the new technologies are now beckoning us towards. This may be so, but note that the word I used to counterpose with the human spirit is “technocracy”, not “technology” – to which I shall return in a moment. If the first step towards dealing with a problem is to develop an awareness of it, then there needs to be a developing discourse of critique of the digimediatized world. Those (like me) who have critical perspectives need to find forums, and be offered platforms, in which to express such critiques. Pandora’s box is, self-evidently, opened and its contents unleashed. This refers not only to technology and what it can do but also to what we ourselves become in the digimediatized world we now live in. What happens to our humanness; what happens to this labouring, myth-making animal that we are? And this also refers to politics, social formations, attitudes to knowledge, experience, relationships of all kinds and our general sense of location in life. It also takes in the enormous area of debate and exchange on the subject of democracide mentioned earlier. The most sacred of sacred cows, democracy itself, surely needs to be put in the alchemical vessel of these debates.
Alongside an awareness and ongoing discussion of the problematics, there are other paths to be explored. The digimediatized world, precisely because it is so pervasive, needs a counterculture that is a real counterculture, one which genuinely runs counter to the precepts and values of the new scenario. There are two ways of doing this. One is to accept and embrace digitechnologies and use them for countercultural purposes. There are, for example, music clubs up and down the country (Plymouth’s Cafe Concrete, for example), record labels, and genuinely experimental approaches to music that subvert or oppose mainstream and majority music. Music is the example I know best, but this happens in all other spheres. To have any genuinely countercultural impact it needs to keep itself as clear of academia, commerce and prescriptive funding schemes as possible. The other countercultural path is that of lo-tech or no-tech. Live, non-mediatized events are a crucial element in a current counterculture.
The rationale for these general strategies echoes that proposed by Theodore Roszak in The Making of a Counter Culture (Anchor/Doubleday, 1969). His description of the technocratic society was of one that justifies itself by appeal to technical experts whose authority is based on scientific forms of knowledge, “and beyond the authority of science there is no appeal”. This led him to make three key observations relating to the notion that our human needs are purely technical in character. First, they can be analysed formally by specialists and translated into “social and economic programmes, personnel management procedures, merchandise, and mechanical gadgetry” (my italics). Second, this formal analysis of our needs is now pretty complete – so no improvement needed there. Third, all the necessary experts happen to be those currently in charge or “on the official payroll of the corporate structure”. In these days when the rhetoric of cultural democracy can be pervasive, it is salutary to note that Roszak’s ideas are even more valid than they were in 1969, and this is why supporting and developing a genuine counterculture – or countercultures plural (as opposed to one that has the system’s ways at heart but finds opposition fashionable or image-worthy) – is urgent.
Roszak understood the 1960s counterculture as a response to technocracy. When the era is placed under the microscope this view can hardly be disputed. The 1960s saw a revival of fin de siècle ideas involving mysticism, orientalism and the occult, as well as a folk revival with a second wind. It harboured some romanticism and an anti-Enlightenment rhetoric – Blake rather than Newton. In other words, there was an urgent concern with spiritual values, the making of the individual, what Illich called “tools for conviviality”, and a great faith in the arts, especially music, in the non-tangible world of experience – the politics of experience, as R.D. Laing called it. Or the Jimi Hendrix Experience…
A development of spirituality is the last, most necessary but complex response I shall list in relation to the digimediatized world and its complex social, political and personal dimensions. I do not advocate an imitation of the 1960s, and most certainly not a revival of that fey, imperialist, self-centred world of New Ageism of the post-’60s. Spirit is not necessarily mysticism. It simply refers to the unseen, to the un-manifest but demonstrably existential. It does not need to have anything to do with gonzo-Buddhism, new age bodywork practices, runes, the Tarot, the smell of sandalwood, the practice of Tantra or the thoughts of wise old gurus from the Himalayas, although some of these may have something to offer. The main point here is that there can be no such thing as cyber spirituality. Whatever form it takes (music and the arts, meditation, relationships of all kinds, scientific research, teaching and learning…), confrontation with the inner life, the sense of self, purpose and identity necessarily takes place live and alone – despite whatever help might be available from mentors and sympathetic witnesses. And live means away from the computer screen – and leave your phone at home.
A Buddhist speaker I heard as long ago as the 1980s discussed technology. He claimed that technology is the new world. We must embrace technology. We must understand what we can do with it. We must also – he said – practise Buddhism so that we are always in charge of the technology rather than vice versa. We do not, of course, all have to practise Buddhism, but in essence the man was right. The development of new material possibilities must be accompanied by a corresponding development of the inner life. Not long ago I sat in a local pub with four friends, all of whom were jabbing text messages onto their phones. I rest my case…