‘But Miss, why are you working on reception?’

I am sat at a secondary school year 10 careers event, representing Higher Education. It is called ‘Meet the Professionals’. I also happen to work at this school as a part-time receptionist. As a representative of careers in higher education the pupils naturally ask, ‘but Miss, why are you working on the school reception?’

The answer to this innocent question is impossible to elucidate in a speed-dating-style set-up for 15-year-olds. Staying true to the aims of the careers event, pupils stump me with: ‘what is your favourite part of your job?’ How to explain, to schoolchildren whose outlets for creative and flexible thinking have been whittled away by curriculum changes and exam criteria, that the best part of my job is being free – and paid – to just think about history?

I can’t explain… because at that time, a month ago, I was a receptionist, a soon-to-be-Dr PhD student, a witness to the first round of strikes over pension cuts in universities, and an institution-less academic aghast at persistent evidence of privilege and inequality in recruitment and experience across academia. It would be unrealistic, deceptive and irresponsible of me to wistfully extemporize on the value of mere thinking, the life of the mind, amid such cynical and introspective times. I returned to my post behind reception feeling dejected.

I am aware that the one truly liberating aspect of my educational experience has been its transformative effect on my opportunities: like many of these school pupils now, my social status limited my chances but my education mobilised me. As far as possible, I wanted to encourage their highest aspirations. Yet my immediately precarious existence, in which a delay has been unleashed on my housing decisions, my health and the hazy chance of a family and a future, prevents me from encouraging them to follow suit. ‘But Miss, why are you working on the school reception?’ Because I have never achieved any step in my higher education without compromising myself financially, physically and psychologically; because though my start in life was nowhere near as bad as some children’s, it was dramatically worse than that of most people who succeed in academia, and I still pay for it.

The day before the event, I sheepishly tried to pull out, aware it would be incongruous to pose as a ‘professional’ when I didn’t even know if I had a future in this sector. Shouted down in the nicest possible way by the teacher who organised it, I was told that of all people to give advice to these pupils, I was ‘real’ and my example in other words was realistic: accessible. It was, of course, flattering to think I could set an example to anyone.

However, I am distinctly uncomfortable with the notion that the mere fact of my ‘disadvantaged’ background is the reason I should be proudest of my achievements: to succeed in academia required a certain amount of adaptation to this selective, intellectual world. It is embarrassing to me that in order to disrupt the course of my origins I had to subvert, deny and disassociate from them. Only in that way could I appeal to an intellectual audience; indeed, I often notice that when I slip up and betray my roots I incur a bit more suspicion, a little less benefit of the doubt from colleagues in some venues. In effect, I have conspired to maintain the very aura around academia that sets it apart from the real world and fortifies its exclusivity. Not only is the job of an academic hard to explain, but there is a sense of protectionism in the way many shore themselves up against the outside world. And for this reason, universities are often either completely out of touch with, or even disrespectful towards, undergraduates (amongst others).

This strike has exposed the multitude of real-world issues that concern all academics and HE professionals. It has also galvanized a cross-sector appreciation of entrenched privilege. At the same time, it has revealed a distinct lack of clarity about what academics do and why it is important – in particular, the job of thinking. The creep of monetized education – the explicit link between degree and employment – speaks to many commentators of an era in which creative and critical thinking is being exterminated. Is this because non-academics, especially prospective students, are less and less likely to care about or value the mere act of thought? No. Is it because contributions from a more diverse range of ‘accessible’, more honest intellectuals would be necessary in order to popularise the value of thinking? Maybe. In effect, demystifying the work of higher education could help to bulwark the sector against political attack – the trade-off being that many an ivory tower would come tumbling down in the process.

Perhaps writing this post is just a bid for catharsis… or dignity, but it only takes trying to describe my work to a room full of teenagers, whose potential financial and psychological trauma I want no hand in, to realise that until the work of thinking makes sense to them, it won’t make sense to me.

Nuclear exceptionalism: a collective blind spot?

In the early Cold War (roughly, the 1950s), nuclear exceptionalism reached such a peak that a new era of military security took hold. Rather than creating safety by defending territories aggressively – that is, through pre-emptive or retaliatory military attacks – the superpowers (and that includes the UK) somehow settled for living on the cusp of both scenarios. The nuclear bomb, perceived as exceptionally destructive, became a pre-emptive and a retaliatory weapon at the same time, and therefore, in a sense, cancelled itself out as a real weapon of war.

This did not actually mean that strategists, militarists and members of the general public agreed that nuclear weapons were exceptionally bad; indeed, it is well known that many continued to view nuclear bombs as a reasonable part of the weapons arsenal. But it did mean that the main players in global security – the likes of the Soviet Union, the United States, Great Britain and China – somehow muddled along under the regulatory principle of deterrence. Deterrence was not a new concept: it just means that when someone can fight you back in the way you attacked them, you decide not to do it.

But the attack aspect of the scenario was so awful to people’s imaginations that governments and strategists decided it would be best not to test how far the public was willing to go. A collective moral conscience and some limited recognition of government accountability to citizens (really, genuinely wanting a little bit of stability, you know, after those two world wars) sustained the Cold War status quo.[1] It made sense to continue spending money on weapons development, sustain military technologies, and train armies to use new weapons systems, because it all totted up to, peace.

More on this point below.

A thermonuclear explosion entails the same effects as a conventional bomb, but per gram of the bomb’s weight the strength of the explosion is far higher. That means one bomb covers a very, very large area at once, and its temperature is so high that it has a far more damaging, levelling effect on that area (and the people in it). What does make the nuclear weapon exceptional is its radioactivity. This, popularised in various memorable, fictional formats, causes many varieties of immediate sickness and long-term health and environmental problems.

No wonder the world was deterred from using them, we might think.

But not so.

Suddenly it was logical to own massive stockpiles of risk-ridden weapons in the name of not using them. But under cover – indeed, in files we may not even be able to read yet – nuclear skirmishes and nuclear battles occurred rather more often than this regulatory principle, deterrence, would suggest. In the North Atlantic, submarines carrying nuclear warheads had fights, literally bashing into each other under Arctic ice; what could have happened? And what of the many accidents, some published and publicised, in which a finger might have pressed ‘the button’?

Owning and developing nuclear weapons is not as simple as courting ‘nuclear suicide’, as many pundits and politicians like to put it. That is simply too easy and, quite frankly, too offensive (to people who know the meaning of the word) a way of putting it.

You see, we fixate rather a lot on the dangers of nuclear weapons, the horrific consequences of nuclear war and the inherent risk of stationing them around the world, none of which is in question. But in the meantime, the likes of napalm, mustard gas, sarin, biological warfare, drone strikes and other extraordinarily large non-nuclear bombs have all been used in conflicts since 1945: attacks as bad as the kind you could see through nuclear weapons have occurred – perhaps not on as great a scale – but they have happened and continue to happen.

If, as is currently the case, people experience anxiety and outrage at the nuclear pomposities being exchanged between Trump and Kim Jong-un, then it is also worth remembering that this is not an aberration – it is a public production of a military tale that rumbles below the surface. And there is no better way to prove this point than to consider what a war between those two countries would actually look like:

… it would use a high proportion of conventional weapons – no doubt some of the weapons I have listed above: all the tanks, bombs, guns, artillery, men and women employed in the various wars that hit the headlines all the time. There might be nuclear weapons too, of various sizes and various effects, used strategically from land, sea and air. In effect, the imagined deterrent effect of nuclear Armageddon is wearing off because, in the lifetime of these weapons, some people have maintained a belief in their unexceptionality, and some have increased the exceptional power of ‘conventional’ weaponry.

The distinctly fragile accord to allow nuclear weapons to ‘cancel each other out’ has eroded with the rise of new nuclear states and the many challenges to the authority and power of the Cold War superpowers. At the same time, the weapons that maintained wars have got better, bigger, and been tacitly endorsed by public silence.

My thesis highlights that civilians in Britain incubated – to an extent, though not fully – a belief that, with a security system based on a nuclear deterrent, war could be avoided. This belief relied on an impression of the exceptionality of nuclear weapons. This was natural. In the 1950s, war had proven devastating enough; it was distinctly unwanted, unjustified and not in anyone’s everyday interests to go to war again after the Second World War. People needed space to get on with life, and if emphasising and vocalising the exceptionality of nuclear weapons could provide that – then fine – but any attack was an unwanted attack on the postwar home front.

Now where are we? Somehow still deeply alert to the horror of nuclear weapons, appalled that statesmen would suggest their deployment, yet somehow not as aware of, or bothered by, the deployment of conventional military arsenals. I am not telling you to become a pacifist, go out on the streets and protest all wars. But I am suggesting that, without critiquing the hyperbole and rhetoric used to explain and report attacks, armaments and war, we lose all sense of proportion, place and time.

Kim Jong-un doesn’t think nuclear weapons are exceptional: they are part of a plan that has been in place in North Korea since 1945, to return the country to a whole and force reprisals on new and old enemies. To recognise this is a step towards recognising the power that nuclear weapons continue to have over the aberrations and silences with which global governments surround the purpose and actuality of real wars.

[1] The nuclear imagination extended far and wide; for some reading see: Matthew Grant and Benjamin Ziemann (eds.), Understanding the Imaginary War: Culture, Thought and Nuclear Conflict, 1945–90 (Manchester: Manchester University Press, 2016); Jonathan Hogg, British Nuclear Culture: Official and Unofficial Narratives in the Long 20th Century (London: Bloomsbury Academic, 2016); Joseph Masco, The Nuclear Borderlands: The Manhattan Project in Post-Cold War New Mexico (Princeton, NJ: Princeton University Press, 2006).