Richard Lenehan: The Need to Atone

In 1929, the G’psgolox totem pole was taken, without consent, from a First Nations community in Canada to Stockholm’s Museum of Ethnography. The settlers who took it did not understand, or did not care to understand, the artefact’s socio-cultural importance to that community. A totem pole is carved from wood to commemorate death: as the wood rots and becomes one with the earth, so too do the souls of the deceased. In its ignorance, the Museum preserved the totem pole indoors in storage, thereby “trapping” the souls it commemorated by not allowing it to rot. Only when a replica was supplied in 1991 was the totem pole finally repatriated, allowing the community to heal in the knowledge that its dead were finally at peace.

This incident illustrates how significant cultural property is to communities, and why we need to address the colonial history of such artefacts in our museums.    Taking a totem pole from its community was akin to stealing a gravestone from this country – an action that we would see as clearly wrong.   Hearing about this made me think about cultural artefacts we have “collected” from other countries, and this essay will argue that these should be repatriated.  It is clear that these artefacts have stories to tell.  We should consider who has the right to keep these objects, and to tell their stories.

Our museums are filled with spoils from our imperial and colonial past. Worse, these objects tend to be displayed in ways intended to vindicate our ancestors’ actions in returning from overseas with the cultural property of others, and to tell the objects’ stories from the collector’s point of view rather than in their cultural context. This is wrong. These items would be enriched if seen in the context of their place of origin. I am not arguing that we have inherited guilt for our forebears’ looting. I am, however, arguing that we have inherited responsibility for their actions, and that it is up to us to make things right.

Standard arguments in support of not repatriating artefacts include that they should be displayed in central western locations where they are accessible to the largest number of people, that they will be better looked after in our museums, that they contribute to our knowledge and understanding, and that they may never have been found if it were not for the “collectors”.

Museums are curated to elicit a particular emotional and intellectual response to the objects they display.  Their curators are, however, conditioned to view history from their privileged perspective.   It can therefore be argued that the true historical and cultural context, and the importance of looted artefacts, not only cannot be appreciated here, but is also denied to their rightful owners.

In my opinion, another important reason for returning artefacts is that taking them without permission was stealing. The stripping of relief sculptures from the Parthenon by Lord Elgin in the early 1800s is an example of this. At that time, the Ottomans occupying Greece gave him permission to take small artefacts from the building, but not to interfere with its “walls or works”. Removal of what became known as “the Elgin Marbles” was in contravention of his permit, which was, in any event, issued by those without cultural rights to the site. This can only be described as theft. A modern-day analogy would be if the United Nations, which had temporary charge of parts of Glasgow during COP26, had allowed delegates to take home historic Glasgow artefacts as souvenirs. There is no doubt that this would have caused an outcry, and a justifiable demand for their immediate return.

This theft was compounded by the mistreatment of the Marbles under British care.  During their time in the British Museum, the Marbles were cleaned with a metal wire brush to make them look whiter, thereby destroying a lot of fine detail, such as muscles and sinews.  It is therefore hypocritical to suggest that they are better protected here.  In fact, the artefacts would have been better left in situ.  Indeed, at the time they were stolen, accurate casts of the Marbles had already been made, meaning that replicas could have been enjoyed in Britain, with the originals remaining in place to be viewed in their historical and cultural context. This is another situation that should be addressed by repatriation and apology.

There are also clear moral arguments for the return of artefacts.  There was an element of control in taking them from a territory in the first place – it was symbolic of taking control of the territory itself too.  These artefacts are not now easily accessible to the peoples from whom they were taken, and for whom they have cultural significance. 

Moreover, there are clear economic arguments for the return of artefacts. Items of historical interest frequently come from less developed countries. There is a real possibility that returned artefacts could form the basis of a tourist trade. An analogy can be drawn with how much Scotland has benefited from cultural tourism in recent years; it would be unjust if other nations could not benefit from their own cultural heritage because symbols of that heritage had been misappropriated.

In the wake of recent consciousness-raising events such as the Black Lives Matter campaign, I believe that the fact that artefacts serve as reminders of past oppression is also important when coming to a decision on this point. The shackles and yokes used on slaves in the 1800s in the southern United States of America are reminders of the atrocious acts committed, and of the complete lack of freedom of the people stolen from Africa. We acknowledge that cultural appropriation is wrong, and that dominant cultures should not appropriate from minority cultures. This should be as true in relation to artefacts as it is in relation to behaviours, rituals or attire.

Museums need to review their acquisitions, and to ask critically whether they need to reframe the context in which they are seen.  They should also be asking whether the items belong with them, or whether they rightfully belong elsewhere.  If they belong elsewhere, then they need to start the process of repatriation, apology and healing.  This last year has shown us that people are questioning this country’s imperial and colonial past, and wanting to make some reparation.  To date this has taken the form of the removal of statues and monuments, but the return of looted artefacts to their communities seems like the logical next step to explore.


Juliet McKay: Black and White Films are Superior to Films in Colour

“The first knee jerk reaction of my kids is that they don’t want to see a black and white movie… 10 minutes into the picture, they don’t know whether it’s black and white or in colour.” (Steven Spielberg)

For many, black and white (B&W) films belong firmly in the past. This is understandable; 1961 was the last year in which the majority of films released were B&W. Despite this, two B&W films still grace IMDb’s list of the top ten greatest films as voted by users: Sidney Lumet’s “12 Angry Men”, made in 1957 even as colour was becoming more commonplace, and Spielberg’s “Schindler’s List”, whose monochrome in 1993 was a very clear, conscious, stylistic decision. This suggests that there is still an audience capable of appreciating black and white films as some of the best movies ever made. Yet, inexplicably, many younger viewers refuse to watch anything in B&W, some of my friends and Spielberg’s own children included. I personally much prefer the look and feel of B&W and believe monochrome to be far superior to colour for aesthetic, historic and genre-related reasons.

Nowadays colour is often assumed to be the more interesting and realistic option; however, popularity seldom equals greatness. B&W provides a simple, beautiful quality that colour is unable to replicate or replace. Over time, B&W has been overtaken by colour and now remains a rare artistic choice. Since most of the content I consume daily is in colour, I pause when I see something in monochrome, because it allows me to dive into a whole other reality. Films aren’t real. We use them as an escape to another world, not simply a reflection of our own, and B&W enhances that experience. We live in a world full of colour; why would you want to watch something so familiar? B&W can be utilised as a tool to embrace the distinction between the real world and the fictional place the medium transports us to. Frank Darabont, celebrated director of “The Shawshank Redemption” (1994), believes that this unique view of the world “is what makes black and white so very cool.”

Remarkably, B&W films are also able to achieve the very opposite and make a film feel even more real. As director and screenwriter Samuel Fuller put it, “Life is in colour, but black and white is more realistic.” This can be done by giving a film a serious, gritty documentary tone – “La Haine” (1995) – or by making it feel authentic to its time period – “The Elephant Man” (1980).

B&W can place a movie in a specific time period by creating a link to the past; “Ida” (2013) succeeds beautifully in establishing its setting as bleak, post-war Poland. It can also be used to pay homage to certain genres or film techniques. Noah Baumbach chose to shoot “Frances Ha” (2012) in monochrome to mimic the French New Wave movement of the late fifties and sixties, whose films were usually B&W, used low-budget, simple techniques and rejected typical film conventions. I love that B&W is still being used to pay tribute to some of the most influential periods of cinema, for which it is often the perfect choice.

Classic Hollywood, a time rightfully referred to as ‘The Golden Age’, catapulted stars such as Humphrey Bogart and Katharine Hepburn to icon status and was a hugely influential era of cinema. The grayscale glitz and glamour of this era in cinema history is, I believe, unmatched. B&W is an integral and iconic feature of films made in this period. Classics like “Casablanca” (1942) were colourised and rereleased in a failed attempt by Ted Turner of Turner Classic Movies to attract new viewers – he even threatened to do the same to “Citizen Kane” (1941) – proving only that films intentionally shot in black and white should be left that way. The Golden Age of Hollywood was an important time that revolutionised many aspects of the film industry, and these films remain essential watches. Monochrome is perfectly suited to this era because so many of its popular themes are enhanced by the lack of colour and the contrast between black and white: paranoia, suspense, morally ambiguous characters, good versus evil and an often-cynical view of the world.

Furthermore, film noir, one of this period’s most iconic genres as well as my personal favourite, would not exist without B&W. Monochrome enhances every aspect of films such as “Double Indemnity” (1944) and “The Big Heat” (1953), from their dark atmospheres to the figures that emerge from the shadows, cigarette in one hand and pistol clutched in the other. “The Man Who Wasn’t There”, the Coen Brothers’ 2001 film, mimics the style of film noir through its use of B&W. Other neo-noirs filmed in colour, for example “L.A. Confidential” (1997), use popular film noir tropes, yet along with the loss of B&W, the essential noir atmosphere and look is also lost. In that movie, when audience and protagonist are introduced to Kim Basinger’s femme fatale, she is dressed head to toe in black and white, paying homage to the genre’s inspiration and suggesting the director would have preferred it to be monochrome. Guillermo del Toro has a star-studded neo-noir coming out next January in colour. Although I am looking forward to this, would it be better in B&W? Obviously, the answer is yes.

Black and white films should not become a thing of the past. They have captivated audiences for over one hundred years and I hope that they continue to do so for another hundred. I would love to see more films make this stylistic choice in modern cinema, but I also think it’s very important to continue to watch the classics. Glorious Technicolor was a revelation when the world was first introduced to it, but now films in colour just feel too ordinary. Even some of my favourite films in colour are ones made by directors like Alfred Hitchcock, who started in black and white and continued to use it when colour became available, only switching to colour when it played a significant role in the storytelling. Through perfecting the craft of making films without colour, he shows that you can tell a story flawlessly without it. However, an article in Variety recently predicted that the cinematography category at the 2022 Oscars may be dominated by B&W films such as ‘Belfast’ and ‘The Tragedy of Macbeth’, showing that monochrome might be making a well-deserved comeback. While some may still disagree, for me, colour has never emerged from the gigantic shadow cast by black and white cinema.

And cut!


Bibliography:
Sidney Lumet: Interviews
“Steven Spielberg on the Importance of Studying Classic Films” – AFI

Gemma White: Why Vinyl Is Better Than Spotify

What comes to mind when you think of a record? For some, it could be the signature crackly sound; for others, old ’60s music playing on a dusty shelf. Perhaps you or your parents may have owned some? Maybe you’ve walked past some niche record shop with rows of untouched vinyl? Or, if you are part of the younger generation, you may recognise them from the single “You Spin Me Round”, which has been re-recorded by many different artists. Many people don’t understand the fuss around vinyl records, as technology has advanced so much since their heyday, so why do so many people, to this day, still use them? In this essay I will explain how vinyl is actually better than online digital streaming.

To start with the most obvious point: the quality of the sound. It’s hard to argue that vinyl doesn’t have better sound quality than digital streaming; it’s simply a fact. Some believe that listening to a song through the vinyl medium is the best way to hear that song. Of course, this is affected by the quality of the record player itself, but for the most part, they are correct. Due to the way vinyl records are created – they are made up of small grooves onto which the needle is lowered and spun – every part of the song’s analogue sound-waves is captured in the grooves. This makes vinyl the only truly lossless format of music. Digital equipment, by contrast, cannot read analogue sound-waves directly: it has to translate the waves into a digital signal and back again into sound-waves, and some information is lost or changed in the process, denying the listener the true sound. As a personal example, I remember playing a record for my brother and seeing his reaction to a song he had only previously heard digitally. He was taken aback by how clearly you could hear every instrument, and by how much smoother the vocals were. Then he asked me, “Why does it not sound crackly?” That crackly sound, which many people actually prefer when listening to music on vinyl, occurs when dust and dirt accumulate in the grooves, causing the needle to jump and produce the noise.

Another reason why there is a buzz around vinyl is not to do with the music itself, but with the experience of buying the records. When you walk into a record shop you can expect to find a few old men looking at classic rock or jazz and possibly some hippie art students flipping through 60s psychedelic pop, but you are guaranteed to fall in love with the atmosphere. Spending hours flipping through rows of old and new records simply cannot be compared to staring at a screen to select what song to listen to. The rush of dopamine you get when you find an album you like among hundreds of mediocre ones, going out with friends and spending a day looking at music, bringing a parent along and watching their face light up when they find something they “haven’t heard since they were your age” – these are just a few of the great parts of going record shopping. Of course, if you are not into the whole social aspect of going out to buy a record, then you can find virtually any record online begging to be part of your collection.

The main reason so many people love vinyl records, myself included, is that they are a physical representation of the music. They can last decades while remaining in relatively good condition. This means that vinyl tends to be an investment for many people, and that second-hand records are also very popular. With digital music, there is, of course, no physical representation of what you are listening to. You cannot buy music that someone has already listened to online, but when you buy a used record, you are physically passing music from one person to another. Because they are physical, records can make great gifts; I have bought many people a vinyl record as a present, as it is an easy option and always goes down well. Not to mention, the connection you build with the music while gently placing it onto the turntable, lowering the needle, and eventually flipping the record over is far superior to simply clicking a button to play a song online. When I first got my record player, my mum looked out a box of her old collection and passed it on to me. Thankfully we share a similar music taste, so to my delight I found many albums I enjoyed that were still in good condition, such as ‘Tango in the Night’ by Fleetwood Mac and Paul Simon’s ‘Graceland’ – and of course no vinyl collection is truly complete without ‘Blue Monday’ by New Order. Not only did I enjoy listening to these; it was also the connection I felt while listening to the same vinyls my mum would have at my age that simply could not be replicated if I played them digitally.

If a problem you face while listening to music is figuring out what to play next, then you are not alone. When listening on streaming services such as Spotify, the endless options available can feel daunting, and often you spend more time looking for something to play than actually listening to the music. This is where I feel the saying “less is more” applies: many people nowadays don’t fully listen to an album, preferring to jump between artists. This is harder to do with vinyl, as the format forces you to listen to the majority of the songs on the album. That can be good for expanding your music taste by letting you hear more from the same artist. It also relieves you of the pressure of “What should I play next?”, as another song plays automatically after the last one finishes. This way of listening can help you appreciate the effort some artists put into their work, as the arrangement of the songs can play a crucial part in making the music flow well together. Actually sitting down and engrossing yourself in what you’re currently playing is a much different experience from the casual way of playing something through Spotify.

However, many people argue against the use of vinyl. One viewpoint is that records are very fragile and can be easily ruined: why would you want to spend money on something that could be rendered worthless so easily? While this is true to an extent, I believe the argument doesn’t hold enough weight to deter vinyl lovers. Vinyls do need to be stored correctly to be kept in good condition: keeping covers on them, keeping them upright, making sure dust doesn’t collect in the grooves, and the list goes on. Then, while you are listening to them, you should be careful not to make any movement that could cause the needle to jump and create a scratch, as that will lead to the record skipping and becoming unplayable. Just as you will find book lovers who scoff at the idea of downloading a novel on a Kindle because it doesn’t give the same experience as flipping the pages, the same principle can be applied to vinyls. Ultimately, you cannot recreate the same experience with technology. Taking all of this into account, the fragility of vinyls adds to their value and makes you appreciate them more.

In conclusion, I believe vinyl is better than digital streaming, such as Spotify. You can find practically any album or song you like in vinyl format, meaning it is an option open to anyone who really enjoys music. Furthermore, the physical aspect of records helps create a deeper connection between the listener and the artist, and the casualness of digital music has, in some ways, watered down the potential impact music can have on people.


Joseph Green: Time to Knock Down Our Dark Past

The purpose of a statue is to honour greatness. Yet, Britain is peppered with statues to those who have harmed people, such as slave traders and colonialists. Events in the summer of 2020 sharpened the focus as the world reeled in the aftermath of George Floyd’s death. This was a symbolic catalyst. Since then, an incredible seventy UK statues, dedicated to slave traders, colonialists and racists, have been removed. But, far too many still remain. Dismantling such symbols of oppression is, in my opinion, entirely justified. Why on earth would we glorify those who wronged and harmed people?

Statues celebrate the glorious, so why keep the inglorious on display? Statues usually commemorate the honourable. Surely then, it is contradictory to keep those commemorating dishonourable slave traders. Who wants to immortalise those who traded in human misery? Until June 2020, Bristol city centre was dominated by the towering bronze figure of the 17th-century slave trader Edward Colston. From the 16th century to the 19th century, an estimated 10 to 20 million slaves were taken from Africa. Forced from their homes and families, they were transported to the Americas to work on plantations. Undoubtedly, this is one of the most horrific stains on our humanity. Why then do we continue to accept the presence of statues to these ogres? And big names are among them: there is the famous explorer, and murderous slave trader, Sir Francis Drake; then there is Henry Dundas, the Scottish politician who delayed the abolition of the slave trade for fifteen years after it should have been eradicated in 1792, ultimately leaving 630,000 slaves to wait more than a decade for their freedom. After all, in other contexts and in other places, statues of the shameful have been toppled. Take the tearing down of a monument to Saddam Hussein in 2003, when the Iraqi strongman Kadhim Sharif al-Jabouri took a sledgehammer to the statue of the shamed dictator known as the ‘Butcher of Baghdad’. Obviously, he understood the contradiction of a statue celebrating a disgraceful man.

Moreover, since the UK is more multicultural than ever before, many are offended by the continued presence of statues celebrating colonialists. With changing attitudes, a large chunk of society now sees the British Empire as pernicious; yet, statues glorifying colonialists remain. Modern-day Britain is struggling with racial tensions, much of which springs from colonialism. These tensions are heightened by the myriad colonialist statues that still stand. Take the statue of Cecil Rhodes. Standing proudly outside Oriel College, Oxford, Rhodes is a controversial figure. Today, many view him as the 19th-century poster boy for everything that is disgusting about Empire. He epitomises white supremacy, colonialism and unalloyed racism. In 1895, his British South Africa Company established the southern African territory of Rhodesia, now Zimbabwe, as a British colony. In 2015, a protest group called Rhodes Must Fall started at the University of Cape Town, South Africa, which also has a Rhodes statue. The movement insisted that it was not targeting Rhodes himself; rather, its argument was that continuing to display the statue prominently legitimises the colonialism he stood for. Surely, that is indisputable. His will leaves no doubt of this: in it, he admits that his “… true aim and object whereof shall be for the extension of British rule throughout the world…”. Surely, leaving Rhodes’ statue standing outside one of the most prestigious UK universities suggests that those in power still harbour visions of racial superiority.

Undoubtedly, many of our inherited statues are no longer compatible with today’s progressive values and so should be removed. They should be replaced with structures that are truly representative of contemporary Britain. According to the 2011 Census of England and Wales, out of a population of over 56.1 million people, 14% identified their ethnicity as non-White. That is nearly 8 million people. Yet, of the 950 UK statues standing today, a mere 16 are of black people. This is wrong. We need statues that represent who we are in today’s society. We need statues that represent how we want the rest of the world to view us – and surely that is not as a country where being white, male and privileged is taken as representative of the population as a whole. Therefore, it should be celebrated that in September 2021 a public statue was raised in Cardiff to Betty Campbell. Notably, she was not male, or white, or posh. During the 1970s, she became the first black, working-class woman to reach the position of headteacher in a Welsh school. Just as notably, her statue was erected as a result of a public vote. Her school, Mount Stuart Primary in Butetown, Cardiff, was an example of “…best practice in equality and multicultural education throughout the UK”. The Welsh people who voted to commemorate her in a statue are therefore sending out a vision of themselves as inclusive. And she is not the only person to have done good for their community. There are many people who could better represent our society. Marcus Rashford is a good candidate. There is already a mural to him in Manchester, which states underneath: ‘Take pride in knowing that your struggle will play the biggest part in your purpose’. In the summer of 2020, Rashford campaigned successfully for the continuation of free school meal provision for underprivileged children. Despite his wealth and fame, he exemplifies social conscience. Certainly, this is the image of the UK that should be proudly beamed out – not that of a disgusting, colonial past.

However, the British Government does not wish to see such statues dismantled, as it believes that they are part of our history. The Prime Minister, Boris Johnson, has stated that, “To tear [statues] down, is to lie about our history”. In fact, the Government is so concerned that it has brought in new laws to protect statues. These will ensure that historic memorials are ‘retained and explained’. Ministers believe it is better to keep statues and place a plaque nearby to explain the actions – good and bad – of the person honoured.

If the UK Government believes statues to murderous slave traders must be preserved, why did Spain, Germany, Ukraine and Georgia, amongst others, tear down statues of equally murderous men like Franco and Stalin? For example, in 2007, Spain’s Historical Memory Law demanded “the removal of all Franco-era symbols from streets and buildings”. In 2010, a statue of Stalin was removed from Gori, Georgia. Did these nations not care about history as much as Boris Johnson does? More likely, they wished to signal how much they disapproved of what these men did. The UK Government’s failure to recognise that the continued presence of statues like that of Edward Colston was offensive suggests that it does not wholly disapprove of how Britain’s wealth was built off the backs of enslaved people. Ben Luke, editor of the Art Newspaper, agrees that, “Statues are not history; often they are impediments to truth because they are erected to glorify the powerful as a fig leaf for their flaws and iniquities.” Edward Colston was a powerful man who had many such flaws and iniquities, most prominently the enslavement of human beings. What is his statue if not a glorification of the slave trade?

Ultimately, no matter how greatly a city, or country, benefited, in the past, from evildoers’ contributions, this is nullified by the fact that they made that contribution at the cost of human lives. Statues to such individuals are an eyesore. They misrepresent what Britain wants to be today. Instead, we must strive to be what Robert Louis Stevenson described as an inclusive, non-exploitative community of, ‘multifarious, incongruous, and independent denizens’. And the statues erected must reflect this.


Jane Eadie: Twice Upon a Time in Hollywood

British singer Adele is set to star in a remake of Jonathan Demme’s classic psychological thriller The Silence of the Lambs. No, of course she’s not: if she were, you would not have been able to avoid the adverts in magazines and on the sides of buses for the last three months. But if she had been cast in such a role, would you have been surprised? You could see it happening, couldn’t you? Any film studio on the planet would love to make this scenario a reality but why should they jump at the chance to have Adele as the headline star in such a reboot? Is it because she has a proven track record as an actor? Is it because such a talented singer is likely to be able to mine their emotions to turn out a brilliant acting performance? Or might it just be because, with 60 million worldwide album sales to her name and a voracious fanbase, anything she is associated with is a sure-fire hit? 

The fact of the matter is that, regardless of how much of a dud your script might be, or how appalling an actor Adele ends up being, the chances are your movie will attract a huge audience and earn loads of pounds before the penny drops. It’s a scenario we see all too often: David Baddiel, comedian turned children’s author; James Corden, British comedy actor turned US chat show host; Madonna, pop megastar turned Golden Raspberry award-winning worst actress; or even Rylan Clark, talent show wannabe turned surprisingly credible presenter.

My point is not whether these conversions are a critical success or failure (chances are they’ll at least make money), nor is it a criticism of these people themselves (who wouldn’t seize the opportunity if offered?). It’s that for every lazy decision to overextend and exploit the already famous, to bank on the bankable, there is the likelihood that a truly original, unique and as yet unheard voice gets stifled. 

There’s a laziness too in the assumption by Hollywood producers that the likes of Leonardo DiCaprio’s yet-to-be-born son would grow up to become a world-renowned actor. But it’s the same lazy assumption that fuelled the careers of Dakota Johnson, Jaden and Willow Smith, Lily-Rose Depp and many others who, through no fault of their own, got their foot in the door. Would Lottie Moss ever have graced a catwalk if she had a different surname? Or Miley Cyrus have got a record deal? Of course, it’s not just in the entertainment business that nepotism flourishes, but it’s somehow a bigger injustice, where there is such huge fame and money at stake, that someone should get their lucky break on the strength of their surname rather than their sheer talent and drive. Then again, does it not stand to reason that children who are brought up surrounded by famous parents, steeped in the world of show business, would be attracted by the allure of the same career? And what’s to say that when a famous parent passes their surname to their son or daughter, they don’t also pass down some of the genes that made them stars in the first place? Who’s to say that Kaia Gerber wouldn’t have made it as a supermodel even if she hadn’t been the daughter of Cindy Crawford? And then there’s the fact that some famous offspring take a very different path from their parent: it seems difficult to believe that Stella McCartney could credit her ability to design a handbag to the fact that her dad was one of the Beatles.

But my point here is that I’m not blaming the children or the parents, or denying that the former might be brilliant talents in their own right. My issue is with those in the entertainment industry who always seem too willing to default to the easy option – the lazy option – of trying to get ever more mileage from a limited pool rather than go to the effort of casting the net that bit further and seeing what treasures lie in uncharted waters.

It’s the same laziness that seems to prevail when it comes to the actual product: be it an album, a musical or a film. Whether it’s a reboot, a remake, a sequel or a translation of a foreign film, how often do we see valuable funding and studio production time given over to seemingly endless rehashes of previously successful books, films and music, leaving little room for nurturing newer talent with fresher ideas? A successful movie franchise like James Bond or Star Wars is one thing, but at least there is a vague attempt to switch up the storylines each time. But does the world really need another adaptation of Little Women? After two BBC versions in the 1950s and 1970s and two animated series in the 1980s, as well as film versions in 1917, 1918, 1933, 1949 and 1978, there was arguably a case for a slightly more contemporary version. Having had that as relatively recently as 1994, however, why was it felt necessary to churn out yet another mini-series in 2017, followed by a seventh film adaptation in 2019? Let’s face it, the story was set in the early 1860s and it hasn’t really changed!

With the recent release of the latest instalment of the current Spider-Man franchise, featuring Tom Holland’s incarnation of the friendly neighbourhood superhero, we start to wonder how long it will be before he is ditched in favour of another series reboot featuring an even fresher face, as Andrew Garfield and Tobey Maguire were before him. Since 2002, we’ve had three iterations of a series of three movies telling essentially the same clearly money-spinning story to three successive audiences. That’s not to say that there’s no appetite for this type of stuff – I speak as someone who’s seen all nine and counting! – but it’s so obviously driven by money over original creativity, and by the laziness of Hollywood producers turning out batch after batch of a winning formula rather than experimenting with some new ingredients.

If that’s not lazy enough, did the producers of the 2021 reboot of the six-season 2007–2012 phenomenon that was Gossip Girl even get out of bed to decide that a remake was a good idea? With the original cast still young enough to play themselves and the still-teenage audience getting a strong sense of déjà vu, how long will it be before we see a series being remade before the original version has even finished its run?

If there can be any legitimate justification for this lazy approach to producing works of entertainment, it’s that audiences feel comfortable with names and faces, characters, scenarios and even plots they’ve grown familiar with. Just about everyone on the planet must’ve tuned in, whether by accident or design, to an episode of Friends that they’ve already seen, but that hasn’t stopped them watching to the end. Perhaps the laziness of the producers, agents and promoters is fuelled by the fact that they’ve recognised that audiences are lazy too!

Ultimately though, the pursuit of art and entertainment relies on new faces, original ideas and unique talents. Classical music would never have moved on without Mozart. Art would never have moved on without Picasso. Bob Dylan moved the dial, not his son, Jakob. It was the original Star Wars in 1977 that really pushed the boundaries, rather than the concluding chapter in 2019. It’s the original raw talent that needs to be sought out and given a break. That’s where the creative and entertainment industries ought to be channelling their not inconsiderable energy and resources. Although there’s a cosy satisfaction to be had in reading a novel written by a familiar name, watching a TV series with a well-known actor or seeing a film adaptation of a much-loved classic, it’s time to wake up and realise that the truly thrilling and rewarding is only to be found in encountering a piece of art, literature, film or music that is utterly groundbreaking. Whether it’s the talent spotters that discover, the agents and producers that nurture or the audiences that consume, it would be refreshing to see a bit more effort. Let’s stop being lazy and clinging to what we know already. Let’s embrace the new.


Louise McFadden: Unhappily Ever After: The Harmful Effects of Traditional Fairy Tales on Children

Once upon a time there lived a little girl who was captivated by fairy tales. At bedtime, she listened carefully to her mother’s voice reading the stories aloud, and gazed at the colourful illustrations which brought them to life. Every night, disturbing thoughts of wicked stepmothers, children abandoned in forests and wolves devouring grannies swirled around her young, innocent mind. Such cruelty and brutality are common themes in traditional fairy tales, leaving many children terrified and anxious. Considering this, as well as the sexism, lack of diversity and questionable morals displayed, is it any wonder that little Louise grew up and felt the need to write an essay condemning these damaging and outdated stories?

Murder, kidnapping, mutilation and cannibalism: these are just some of the atrocities that make traditional fairy tales inappropriate for children. According to the historian and mythographer Professor Dame Marina Warner, as well as the fairy tale expert Professor Jack Zipes, many stories were not originally intended for children, but for adults. This includes the popular Brothers Grimm stories of the 1800s, such as ‘Little Red Riding Hood’, ‘Snow White’ and ‘Cinderella’. The earliest adult versions of ‘Cinderella’ contain gruesome details – the ugly stepsisters amputate their own toes to fit into the glass slipper, and later their eyes are pecked out by birds. Originally, in ‘Snow White’, the wicked stepmother is made to dance in red-hot iron shoes until death. Lovely. While some stories have been rewritten over the years in an attempt to make them more child-friendly, a disturbing amount of death, brutality and abuse remains. For example, do you think a story about abandoned children being lured into a cannibal’s house sounds appropriate for a four-year-old? It sounds like the plot of a horror movie. According to Reader’s Digest, ‘Hansel and Gretel’ was one of the nine most popular fairy tales in 2021. What makes this worse is that, like many other fairy tales, it was based on horrendous true events. Many real-life children were abandoned, some even eaten, during the Great Famine of 1314 to 1322. Many parents don’t know the origins of these stories. If they did, perhaps they’d think twice about sharing them with their children. Some, however, do realise the anxiety caused by the cruelty and gore. A OnePoll study in Britain in 2018 revealed that a third of parents said their kids cried at Little Red Riding Hood being eaten by the wolf, and over a quarter change the stories they read to their children. It goes without saying that parents shouldn’t have to adjust the barbarity in their children’s stories – there should be no barbarity to begin with.

As well as the wicked violence of the stories, the endemic sexism also has a corrosive effect on children. Hundreds of years ago, fairy tales were intended to teach boys and girls their roles. According to Liz Grauerholz, former Professor of Sociology at Purdue University, and Lori Baker-Sperry, Professor of Women’s Studies at Western Illinois University, in their study of Grimm’s fairy tales titled ‘The Pervasiveness and Persistence of the Feminine Beauty Ideal in Children’s Fairy Tales’ (2003), young women were to be “domesticated, respectable, and attractive to a marriage partner”. Why are we still indoctrinating children with outdated gender roles in 2022? Princesses in traditional fairy tales typically do housework all day, lack ambition and have zero independence. They have very shiny hair, though. In fact, the disturbing emphasis on feminine beauty is highlighted by the well-known quote from ‘Snow White’: “Mirror, mirror on the wall, who is the fairest of them all?” Grauerholz and Baker-Sperry’s study states that 94% of Grimm’s fairy tales mention beauty or ugliness. Pressuring young girls to meet impossible beauty standards is unethical and brainwashes them to believe that their appearance is their most important trait. It is not. Seriously, what sort of message are we sending our daughters? That they should sit looking pretty, waiting for a man to save them? Four of the most famous traditional fairy tales follow the recipe of the passive princess waiting to be rescued by the powerful prince – ‘Cinderella’, ‘Snow White’, ‘Rapunzel’ and ‘Sleeping Beauty’. Girls can be so much more than this. They can provide for themselves. They can be the heroines of their own stories. Should parents really be creating a situation where young girls idolise princesses like the Little Mermaid, who sacrificed her voice for a man? As for boys, toxic masculinity is encouraged.
Princes in the aforementioned fairy tales tend to have very little characterisation other than being the tough, heroic rescuers and protectors of women. We must stop teaching boys to be strong all the time and to show no weakness, emotion or vulnerability. It’s unfair to place these restrictions and expectations on anyone, let alone a child.

Traditional fairy tales also lack diversity. If I asked you to imagine some characters from fairy tales, you would most likely picture young, white, able-bodied princesses with clear skin and twenty-inch waists. Princes tall and muscly, witches old and wrinkly. Where is the representation for children of colour, disabled children and the LGBT community? There’s no excuse not to include characters that these children could relate to. It’s extremely important to have racial diversity in children’s stories, both so that children of colour feel included and represented and to prevent racism developing from a young age. Additionally, there is no body diversity. All characters (except villains, because everyone knows that a character’s goodness is directly related to their physical attractiveness) are thin and good-looking. One exception is Hans Christian Andersen’s ‘The Ugly Duckling’ … who eventually turns out alright on the basis that he becomes a beautiful swan. Consequently, some children struggle with low self-esteem that continues throughout adulthood. What harm would having some more inclusive stories with diverse characters possibly do to our children? Apart from making them happier and more empathetic?

The tales are crammed with bad morals and messages – poisoned apples of their own – corrupting children’s minds and giving them a twisted perception of good and evil. Every detail from the stories plants a seed in their heads. For instance, stealing and greed are condoned in ‘Goldilocks and the Three Bears’ and ‘Jack and the Beanstalk’. When you see the Prince harmlessly kissing Sleeping Beauty to wake her from the evil fairy’s spell, your six-year-old sees that it’s okay to kiss people when they are asleep. Is it really true love’s kiss or is it sexual assault? Another example of an insidious message is in ‘Beauty and the Beast’. While there is some debate over whether Belle suffers from Stockholm Syndrome (a psychological condition causing hostages to develop positive feelings towards their captor), the story is nonetheless problematic. Belle changes the Beast, teaching him kindness and eventually transforming him back into a prince. Stop teaching young girls that it’s their responsibility to fix men who abuse them. In addition, distorted messages about romantic relationships create unrealistic expectations for children. Fairy tale couples are usually adolescent, implying that love is found easily and quickly, and is only for young people. This can lead to anxiety and depression even when the child is grown up, still looking for “the one”. Moreover, the fact that many tales end with a magnificent wedding insinuates that marriage is the ultimate prize and sign of success. This isn’t true. Love and success can come in many forms and it’s important to teach our kids different happy endings.

I’m aware of the argument that fairy tales improve children’s imaginations. However, they often simply can’t tell the difference between magic and reality. Over fifty American youngsters who kissed frogs hoping for a real-life prince to appear (after watching the Disney film ‘The Princess and the Frog’) certainly didn’t gain a better imagination – they gained salmonella poisoning. And to those who argue that the stories are entertaining – no one is saying that children shouldn’t be told stories, just that there are more suitable ones which could aid childhood development.

Traditional fairy tales do more harm than good. Perhaps if we replace these traumatic stories with ones that are enjoyable, while also being more up-to-date, ethical, inspiring and inclusive, we can all live happily ever after.




Zoe McGinley: Should Chocolate Be Kept in the Fridge or the Cupboard?

It’s hard to find someone who doesn’t like chocolate: we are a race of chocolate connoisseurs. There is no argument that the feel-good chemicals released by its consumption play a massive part in how so many of us find chocolate so delightfully irresistible. But the real debate is not about which satisfies the palate more, a Snickers or a Mars Bar, or even how each of us prefers to eat our Creme Egg. The much less documented but highly contested argument which has been splitting opinion between families and friend groups is… should chocolate be eaten straight from the fridge or not? Of course it should! There are simply no words in the English language that can fully describe the euphoric sensations of a cold Cadbury’s Marvellous Creations sweetly and tantalisingly caressing the taste buds.

Chocolate is a renowned and popular household treat today but, surprisingly, many people aren’t completely familiar with its full history. It is thought that chocolate dates back to the Olmecs of Latin America around 4,000 years ago, who picked the fruit (pods) of cocoa trees, dried and roasted the beans and then used them to create a chocolatey liquid. There is further evidence that, centuries later, the Mayans created a warm ‘brew’ of ground cocoa seeds, chillies, water and cornmeal, which they named ‘xocolatl’. By the 15th century, the Aztecs believed that chocolate was a gift from the god Quetzalcoatl and, realising its widespread demand and use as an aphrodisiac, used the cocoa beans as currency.

Of course, over time things like sugar and honey were used to sweeten the bitter taste of chocolate, which ultimately led to the birth of a new method where the cocoa butter was squeezed from the beans to make a powder, which was mixed with liquid and then poured into moulds. Thus, chocolate evolved from a tangy and presumably unpleasant drink into the sweet, deliciously indulgent confectionery we know and love today, through the added genius of master chocolatiers.

When the Swiss chocolatiers Daniel Peter and Henri Nestlé added a little milk powder into their cocoa mixture, this opened the floodgates for companies like Cadbury’s, who had absolutely mastered the art of chocolate making by producing, in my somewhat connoisseur opinion, the best milk chocolate on the planet. Of course, others may contest that opinion, but that’s not the issue I want to debate here – the real argument is whether chocolate tastes better straight from the fridge. Yes, we all purchase our daily or weekly (ok, sometimes monthly) indulgent supply straight off a room-temperature shop shelf, but I think that there is simply no better way to eat chocolate than straight from the fridge! Some agree, some disagree, and some just don’t want to admit that they agree. I fully understand that taste is subjective and this is all just a matter of opinion; however, there is in fact scientific evidence to back up this delicious preference. An article from 2012 by Chemistry Matters sets out the reasons why chocolate does indeed taste better from the fridge. This is all to do with polymorphism: the ability of a solid to exist in more than one crystal structure. These structures are called polymorphs. It’s all a bit too technical to explain in scientific detail but, essentially, the ingredients in chocolate have numerous properties that behave differently at different temperatures. OK, you must be thinking: what does this have to do with why we should store chocolate in the fridge? Well, in a nutshell (a Fruit n Nutshell), some polymorphs are too bland and too brittle on their own to act as chocolate, and some other properties can change if the bar is left at room temperature, creating a distinct change in taste; storing chocolate in the fridge, however, keeps its crystal structure stable, preventing the polymorphs from changing as they would while sitting in a cupboard.
Basically, when chocolate is stored in a fridge it is of course colder, which adds an additional level of flavour to release tantalisingly over the taste buds as it melts in the mouth.

This whole debate has proven to be somewhat contentious, with hugely divided opinion over the issue, not least within my own household. Yes, there are some ‘non-fridger’ members of my family who are brave enough to risk my wrath by having the nerve to remove our chocolate stash from the fridge, insisting that it should indeed be enjoyed at room temperature. As a more heated debate ensued, we all agreed that the only way to settle the argument was to find some official conclusion from the big confectionery companies – they’re the experts, right? Wrong! In reply to a recent online blog which asked readers whether chocolate should be kept in the fridge or pantry, Cadbury’s themselves waded into the matter to state: “Chocolate should always be stored in a slightly cool, dry, dark place such as a cupboard or pantry at temperatures less than 21C to ensure the quality isn’t compromised”. So who do we trust – those who spend years at university to become scientists, or those who work in the factories watching the machines do the chocolate making?

But what about melted chocolate? Well, that argument I understand: there’s nothing better than the experience of coming home to make a cup of hot chocolate after a long winter’s day, or the texture of biting into a perfectly melted chocolate cookie straight from the oven. My question is: who would want a room-temperature chocolate bar melting into their hands on a hot summer’s day?

Who are these “experts” to tell us the “correct” way to eat our chocolate when, really, it all comes down to preference? Should we believe Cadbury’s claim to know the perfect chocolate storage conditions for ultimate flavour when, in reference to their Creme Egg, they have devoted a whole advertising slogan to consumer choice, asking ‘How do you eat yours?’ It’s also a safe assumption that the Aztecs would have believed their chocolate drink to be not merely a gift from one god but the ultimate gift from all the gods, had they only had access to a fridge!

So now, I encourage you, stick your favourite chocolate bar in the fridge and tell me I’m wrong.


Niamh Graham: Is a University Degree a Requirement for Career Success?

Is it the end of the world if you don’t go to university after school? Most people’s immediate answer to this question will be, ‘Yes, of course you need to go to university if you want to succeed in life and get a good job.’ In fact, this is not true: you don’t need a university degree. There are other ways to go about getting your dream job. Many people who have become successful have never even set foot in a university; many more dropped out, having not lasted long enough to get their degree. This essay will explore the reasons why not going to university may be better than wasting four more years of your life stuck in a classroom.

One of the main problems for people thinking of attending university is whether or not they can afford it and whether the cost is really worth it. To answer the question – spoiler! – it’s probably not. With maintenance loans and tuition fees to pay, graduates are finding themselves in thousands of pounds of debt before they have even applied for their first job. In 2021, students graduating from English universities had incurred an average student loan debt of over £45k, compared to almost £28k in Wales, over £24k in Northern Ireland and just over £15k in Scotland. So, you really need to ask yourself: is the money you’re willing to spend going to be worth it? Even after taking the financial risk, there is still no guarantee that you will get a good, well-paying job. In fact, only 59% of those who qualified from Higher Education went on to full-time employment. If the job you think you want to do does not require a university degree and further education, the solution is simple: don’t go. It’s not worth the time, the money or the stress.

Speaking of stress, a Uni Health study found that 80% of those studying in Higher Education reported symptoms of stress or anxiety, while NUS surveys found that nine in ten students experienced stress. Would you want to spend an extra four years (minimum) doing more assignments and exams when it’s not entirely necessary? I wouldn’t. Taking work home is a fundamental part of university life. You are never finished. You always have something you should be doing instead of relaxing, taking a break or seeing friends and family. This means that, in those moments when you’re not working towards your degree, you feel like you should be.

Nowadays, the likelihood of your degree course leading to your desired career is diminishing. The job prospects for graduates are declining at quite a significant rate. Average student satisfaction rates (which take into account factors like support from the university, quality of teaching and tutoring, course structure and, crucially, career prospects after graduating) have fallen consistently over the last few years. Last year, the government released sets of data about the career prospects of a degree, broken down by subject or institution of study. While some courses have great earning potential, the data showed that a large number don’t lead to the well-paid employment that is the main reason the majority of people choose to go to university in the first place. This is leading to an increasing number of people realising that they don’t need a degree to secure the jobs and careers they want.

Lastly, it is a well-known fact that some of the wealthiest and most influential entrepreneurs in the world dropped out of college or university. Steve Jobs, Bill Gates and Mark Zuckerberg all left college before they could collect their diplomas. Lesson: you are still able to get a well-paying job without a degree. Here are some of the highest-paid jobs in the UK that you can get without going to university: air traffic controller, digital marketer, SEO expert, white hat hacker, firefighter, offshore energy worker, game developer, translator, police constable and entrepreneur. All of these jobs still pay a handsome amount of money and you can start them straight out of school. Your level of education does not need to define your career or your success. Just because you’ve got a degree doesn’t automatically mean that you are entitled to a higher salary: you have to earn respect in the workplace by showing what you can actually do and, of course, in some cases you learn much more on the job.

But I do also understand why some people choose to go to university. It gives you time to explore different career options and experience a taste of the different courses available if you haven’t decided what you want to do with the rest of your life. Going to university also gives you the chance to learn and obtain some very valuable life skills that you can take with you after you leave. Many of the people who go to university leave it blessed with long-lasting relationships with the people they met while they were there. The academic aspect is a big part of attending, but it also gives you the chance to bond and connect with people who are like-minded and who enjoy the same interests that you do. And yes, there are of course a number of professions where you are required to have certain degrees before starting the job.

In today’s world, there are so many more options and career routes available to ambitious individuals who are willing to roll up their sleeves and work hard. In fact, many of the professions that traditionally require a degree are now reassessing their requirements and routes to qualification. The key to success is having a focused approach to what you want to do and finding out as much as you can about that career. Speak to people who already do the job, and be prepared to be flexible, to adapt to circumstances and to take advantage of opportunities when they present themselves. More often than not, these characteristics make for a much more employable candidate than one with a certain combination of letters after their name.


Finbar McGinn: Coronavirus and the Framing of War

“We are engaged in a war against the disease which we have to win.” – Boris Johnson, 17 March 2020

Anyone who has been paying attention to the news in the past year has no doubt been battered over the head with an excess of militaristic images and jargon, from Donald Trump declaring himself a ‘War Time President’ to the images conjured up in the mind of ‘NHS frontline soldiers’ battling heroically against the ‘invisible enemy’, and countless other expressions used endlessly. This incessant use of militaristic language and imagery by the government and the BBC has prompted a chain reaction of artists and public figures declaring NHS workers ‘saints’, and even one image depicting them as blue-suited, mask-wearing angels with big fluffy rainbow wings and glimmering halos. Anyone visiting from a year ago would be slightly baffled by the present canonisation of all NHS workers and the deification of the NHS, so what has prompted this shift in language and this drastic new appreciation of the NHS and those involved in the ‘fight’ against coronavirus?

Simply put, it is a popular method that governments across the world use to strengthen public belief in government policy, conveying a sense of urgency and emergency to the public through the language and metaphors of war. Through the power of rhetoric and propaganda, the public are led to believe that civil liberties must be curbed in order to ensure security or, in our case, health security; this process is also referred to as securitisation[1]. However, it is not by sheer rhetoric and propaganda alone that the government enacts its policy; it also employs ritual. This is seen in how the framing of ‘war’ to the British public is reinforced by the war-like rituals the public participate in, like clapping for essential workers at 8pm weekly and pinning rainbows on windows across the U.K. to show solidarity and support for the ‘frontline’ workers. And of course, it wouldn’t be a true war without the essential wartime speech from the Queen, in which she even went so far as to reference the classic WWII song ‘We’ll Meet Again’, written for soldiers leaving their families, drawing a tenuous analogy between their sacrifice and our own in accepting lockdown.

However, this is not historically the first or only time this linguistic trick has been pulled; in fact, it has been quite popular outside the U.K. Like the U.K., the U.S. too has had its fair share of attempts to ‘wage war’ on different issues; the ‘War on Terror’ and the ‘War on Drugs’ both come to mind. The U.S. government’s attempts to use the language of war to strengthen public support in its varied political struggles against drugs, crime and terrorism seem to have failed miserably. Public support for the United States government’s crackdowns on these issues is at an all-time low, and in the case of the ‘War on Drugs’ the government seems to be effectively reversing its policy through the gradual relaxation of rules surrounding softer drugs across a third of the U.S., with some states like Oregon even going so far as to decriminalise all hard drugs. Despite its later failure, the initial effectiveness of this policy was quite astounding. Take for example the so-called ‘War on Terror’, which caused a significant and permanent expansion of airport security and resulted in what many people now deem excessive curbs to civil liberties. To see one of the ways in which the government used the framing of ‘war’ to its advantage, you need look no further than the now infamous Patriot Act. The ‘Patriot Act’ was passed shortly after 9/11 by the Bush administration in an attempt to crack down on terror by introducing extended legal privileges on wiretapping, enhanced surveillance and further losses of civil liberties. The importance of language is emphasised by the government’s decision to use the name ‘Patriot Act’, which obviously suggests that the bill was passed out of sheer patriotic goodwill. The war effort against terror became patriotic, and sceptics were deemed unpatriotic deserters.
This is a great case study of how the language of war is used to enable government policy, but it also shows one of the ways in which this method can be dangerous, as it permanently reduced the civil liberties of Americans and empowered government surveillance of private citizens.

The question then arises: could the war against Covid usher in a 9/11 of health security, and of security in general – a major health crisis that allows a government to implement sweeping curbs on civil liberties? Such an exploitation of a crisis occurred under Viktor Orban’s quasi-fascistic government in Hungary, where ‘Orban seized wide-ranging emergency powers and the ability to rule by decree’, according to the Conversation. This clearly shows how governments often use issues of ‘national security’ and the framing of war to expand their power. The results of military rhetoric can also be seen in Donald Trump declaring his campaign against the ‘foreign virus’ from China, and even in Xi Jinping himself calling for a ‘people’s war’ against the virus. What unites these two men in their choice of language is their use of ‘the war against the virus’ for political gain. Trump declaring the virus to be ‘foreign’ and from China simultaneously allows him to take a jab at the rising Chinese Communist Party and to further raise fear about immigrant populations within the U.S. Xi Jinping’s government, meanwhile, has made equally outrageous claims that the virus originates from the U.S. to further hatred towards America and the Western world and, predictably, fixes its language around the ideology of Communism with talk of a ‘people’s war’ against the virus. This highlights one of the central problems of ‘waging war against coronavirus’: that governments can use the language of war nefariously to gain and expand political power by any means necessary. President Trump’s framing of the enemy as foreign and from China unfortunately resulted in a sharp rise in anti-Chinese attacks across America, showcasing blatantly the potential harm of war rhetoric.

More consequential examples occurred across Europe, however, when nations collectively decided that the appropriate response to ‘the threat of the invisible enemy’ was to impose exceptional measures such as lockdown and other general restrictions. Thus, issues of freedom of movement and decisions to open shops became matters of national security, decided and policed by a newly unrestricted government – a situation unthinkable a year ago. Even at the end of lockdown, as the public desperately cries out for freedom by any means, the government seeks to maintain the securitisation of basic civil liberties through vaccine passports and even facial recognition, potentially preventing your vaccine-sceptical uncle from ever entering a pub again in his life. Once again, the process of securitisation, and the government’s use of the language of war to facilitate it, highlights the power of ‘war’ rhetoric in producing a less questioning public acceptance of difficult but important government decisions. The ability to make going to the pub or attending a public event an uncontroversial matter of health security truly speaks to the supreme power of rhetoric and propaganda.

Because of the media’s and government’s use of ‘war’ rhetoric, and the subsequent securitisation of civil liberties, my generation has never known a world without barriers at Christmas markets, machine-gun-wielding police at airports and mass government surveillance of private citizens – and it now seems that our children will never know a world without vaccine passports at pubs and facial recognition at football games.

[1] This is not to make a statement on whether or not it is justified in each instance to curb civil liberties in the name of security.

Helen Findlater: Let’s Fix This!

It is 2012, and in a clean, clinical room in Denmark, Angelea smokes crack cocaine to ease the chronic pain in her left leg – the result of a serious car accident.  She brings her drugs to the smoking room, where they are tested for purity under a microscope.  Constantly supervised by nurses, Angelea feels safe, dignified and respected.  Most importantly, she is given access to further resources to help her; she has greater control over her future.

Mention the subject of drug addiction and most people think criminals.  Me?  I think victims: people with a medical condition that needs proper care.  Until we accept this definition the problem will only get worse.  So, how can we make it better?  How can we fix this?  One possibility, already having dramatic results on the continent, is fix-rooms, properly known as consumption rooms.  Fix-rooms are safe spaces where users can take illegal narcotics under supervision.  Fix-rooms already exist in Denmark, Switzerland, Holland and Canada.  Fix-rooms could help fix problems here in the UK.

The facility where we met Angelea earlier is called Skyen and it accommodates between 500 and 700 drug intakes per day.  This project has quite literally changed the way of life for over 5000 drug addicts in Denmark.  I would love to see similar projects running in the UK and I hope to convince you of the benefits of fix-rooms for the good of all.

Fix-rooms are safe and hygienic spaces for victims of drug addiction.  In the UK, in litter-strewn back streets and grubby hostels, addicts share drugs and needles.  The use of a fix-room gives drug addicts a haven, free from disease and infection.  By providing clean facilities and clean equipment (e.g. syringes), fix-rooms reduce injecting-risk behaviour (syringe sharing), ultimately reducing the risk of HIV transmission and fatal overdoses.

The UK now has the worst drug mortality rate in Europe: in 2017 Denmark recorded 237 overdose deaths whereas the UK recorded 3,256 – an unacceptable and avoidable loss of 3,019 lives.  Scotland holds the unenviable prize of first place for the highest drug mortality rate in Europe – a scandal of epic proportions – and the fact that our UK neighbours, England and Wales, share third place is no consolation.  We are clearly getting our approach to drugs wrong in the UK.

Fix-rooms would be a step in the right direction for us, since there has never been a recorded death in any of the 78 fix-rooms that exist on the continent!  They employ highly trained medical staff who care for the needs and the safety of the victims of drug addiction.  If something goes wrong, they are there to administer antidotes and immediately resuscitate patients.  Surely in Scotland, with its harsher climate and notoriously poorer diet (both of which contribute to our poor health), there is an even greater need for facilities like these to help reduce our drug deaths?

Many would argue that fix-rooms encourage illegal drug use, but this is nonsensical: no one who is not already a drug user would appear at the door of what is effectively a clinic in order to become one!  Views like that are symptomatic of the failures in drug policy that fix-rooms would go a long way towards repairing!  If we stopped criminalising addicts and increased their access to health and social care services then we might just start to get things fixed.

According to a survey conducted by the International Network of Drug Consumption Rooms, social workers account for 78% of the professional groups represented in fix-room teams.  A Canadian cohort study showed that use of a Vancouver fix-room was associated with increased rates of referral to addiction care centres and increased uptake of detoxification treatments.  Fix-rooms don’t take away the significance of addiction aid; they support, promote and provide care.

Wouldn’t you like to walk into the city centre or a park without worrying about discarded syringes?  Introducing fix-rooms significantly reduces public drug use, discarded syringes and the wider societal impact.  Before Skyen opened, as many as 10,000 syringes were found on the streets of Vesterbro; within a year of its opening, that number had fallen to 1,000.  Not only would our streets be safer for everyone, but we would also significantly reduce the pressure on our emergency services.  There would be fewer calls to the police regarding public drug use, and fewer ambulance call-outs related to overdoses.  Fix-rooms have proven that they can significantly reduce the financial and social burden that drug addiction places on society.

To addicts, fix-rooms are a godsend; however, many in power believe they aren’t of any use, despite the clear evidence to the contrary.  The Home Office has dismissed the positive prospects of fix-rooms, parroted the old lies about them becoming a focus ‘of crimes’, and is intent on continuing its plans for more treatment facilities and more focus on disrupting drug supplies – the much-fabled war on drugs that has failed time and time again!  Its words are also quite hollow, since repeated cuts to treatment budgets contributed to a 26% rise in drug-related deaths in England between 2013 and 2016.  Steve Rolles, a senior policy analyst at the Transform Drug Policy Foundation, which campaigns for the legalisation and Government regulation of drugs, said: “The idea that eradication or a drug free society can be achieved through enforcement is clearly ridiculous.”  The harsh reality is that the government is blind to the real problems of addicts and is determined to criminalise and demonise them rather than help them combat their condition.  Short-sighted government policies that continue to criminalise drug addicts and condemn them to suffer in the crippling conditions associated with dependence mean that we will never solve the problem.  We need to change the focus from criminal to care.

By accepting the need for health services to take the lead on drug addiction and by funding fix-rooms, we could dramatically reduce the number of fatal overdoses and discarded syringes, and reduce the risk of HIV among vulnerable and desperate people in need of our support.  We could decrease the number of drug-related emergency call-outs and increase the number of addicts referred to treatment facilities.  I accept that there is no magic-bullet solution to fix this, but fix-rooms are a positive step in the right direction and they would, most certainly, dramatically reduce drug-related crime and drug-related deaths . . . and surely that’s worth fixing!



Nina Snedden: Tyler, The Creator – Flower Boy or Goblin?

Flashing the words ‘Flower Boy’ on the screens behind him, the artist Tyler, the Creator appears determined to embody this title. Dressed in a pair of yellow shorts, a blue printed shirt and a neon pink cap, he seems to be blooming, his goofy allure evident from the boldness of his attire. A certain warmth radiates from the strength of his presence: zany, eccentric and unpredictable. Lounging across his vibrant, dream-like stage set, Tyler offers refuge from the band of drugged-up, monotonous mumble rappers who headlined Longitude 2018. The screens go black. A cluster of rainbow lights pulsates before an idyllic scene appears: a light blue sky flecked with the palest of candy-pink clouds, an assortment of large and assertive trees, and him. A single flower.

Hardly the archetypal criminal… yet in the summer of 2015, whilst attempting to enter the UK for a run of festival performances – despite having been in the country just seven weeks earlier – Tyler was turned away at the border and banned from Britain for three to five years by then-Home Secretary Theresa May. Government documents specifically cite lyrics from five songs – ‘Tron Cat’, ‘Blow’, ‘VCR’, ‘Sarah’ and ‘French’ – from Tyler’s first two projects, and explain that he was banned under the terms of Home Office policy on ‘behaviours unacceptable in the UK’ – a set of guidelines formed in 2005 to try to prevent suspected terrorists from entering Britain. Tyler is said to have been banned for ‘unacceptable behaviour by making statements that foster hatred, which might lead to inter community violence in the UK’, with his albums B******, in 2009, and Goblin, in 2011, labelled in the documentation justifying the ban as ‘based on the premise of adopting a mentally unstable alter ego who describes violent physical abuse, rape and murder in graphic terms which appears to glamorise this behaviour’ and as seeming to encourage ‘violence and intolerance of homosexuality’. This wasn’t the first time Tyler had had trouble entering a country. In 2014, he was banned from New Zealand for posing a ‘threat to the public order and the public interest’, and in early 2015 he became the subject of a large public campaign by the Australian feminist group Collective Shout, who referenced early song lyrics in an effort to ban him from entering the country, derailing Tyler’s Australian tour. Is there any truth to the claims of the supposed ‘threat’ which Tyler poses? How can two such contrasting images of the same artist co-exist?

In an interview with The Guardian in September 2015, Tyler himself admitted that much of the work in question was written when he was ‘super-young’ and ‘no one was listening’. It is undoubtedly true that Goblin, and perhaps even more so B****** (Tyler’s first mixtape), upon first listen appear a nauseating stream of gore and horror, created for the sole purpose of shocking the audience. Songs like ‘Sarah’, ‘French’ and ‘VCR/Wheels’ – diabolically twisted and loaded with graphic violent references and homophobic slurs – still, even ten years after their release, don’t sit quite right with me. However, it is important to note that these two projects form part of a trilogy, and its third instalment, Wolf, is the key to understanding the early releases. On Wolf, a far more mature Tyler, ever the ‘walking paradox’, grapples with deeply rooted psychological problems over smooth, dreamy, simple beats. On ‘Answer’, Tyler appears more vulnerable than ever before, addressing his estranged father and bragging about all he’s achieved without him, whilst still praying that, if he ever calls, his father answers. Tyler also explores the loss of his grandmother, rapping on ‘Cowboy’, ‘ain’t been this sick since brain cancer ate my granny up’, before battling issues with fame and wealth on ‘Colossus’ and ‘Cowboy’, where he raps, ‘You’d think all this money would make a happy me, but I’m ‘bout as lonely as crackers that supermodels eat.’ On the penultimate track of the album, ‘Lone’, the storylines of B******, Goblin and Wolf finally come together in a therapy session, with alter ego Dr TC asking, ‘So, what’s going on, Wolf? Talk to me, man…what’s on your mind?’ It then becomes clear that the graphic, violent images of Tyler’s earlier projects originate from a mentally unstable mind talking to a therapist through the medium of alter egos.
In the video for ‘Sam (is dead)’, we see Tyler shooting himself three times, leaving three dead Tylers on the floor and representing the death of his alter egos Ace, Tron Cat and Wolf Haley; the track title suggests Tyler has already killed the alter ego Sam. In this way, Tyler’s complex concept album Wolf explains the inner turmoil which prompted the creation of such dark alter egos on B****** and Goblin, transforming Tyler from villainous brute to misunderstood misfit, whilst ‘Sam (is dead)’ shows Tyler maturing and killing off his dark thoughts to make way for his brighter later albums, Cherry Bomb and Flower Boy, on which he eventually transcends his darkness to emerge into the light by coming out as gay. It is clear that this beautiful, intricately constructed exploration of the complexities of the human condition was lost upon Theresa May and many other detached listeners, as Tyler seems to reflect on ‘Glitter’, a track from his most recent album, which ends: ‘we didn’t get your message, either because you were not speaking or because of a bad connection.’

This sort of investigation into our humanity is a commonplace of literature and film, recurrent throughout history, so why is it that when the same topic is approached by a rapper it is immediately attacked? Although not a traditional medium, rap is still a means of expression and art, communicating to a whole new generation; an art form judged by Theresa May purely upon presumption and ignorance. Rap is a genre with a long history of positive influence – from the anti-drug message broadcast to millions of youths on ‘Say No Go’ by De La Soul, to the reality of inner-city poverty and crime revealed in ‘The Message’ by Grandmaster Flash – and an even greater potential for influencing the youth of today. Yet it has long been cloaked in the negative guise of testosterone-fuelled bombast by those who do not listen to, do not understand, or do not wish to understand the sentiments expressed in the music. If Tyler’s concepts had been expressed through the medium of opera, traditionally perceived to be a far more ‘intellectual’ form, would he have been attacked with such fervour? Would he have been attacked at all? ‘The Rape of Lucretia’, an opera by Benjamin Britten in which the voice of the rapist Sextus Tarquinius is adopted, was not only not banned but was in fact met with praise from critics. Surely this proves the deeply unjust and snobbish mistreatment of Tyler, and more broadly of rap as an art form. Art should be provocative and controversial. It is a means of pushing boundaries and re-defining societal norms. Why should this responsibility be reserved solely for orthodox mediums? Tyler himself queried, ‘Why don’t they ban authors? Writers who write these mystery books about people getting raped and sabotaged and murdered and brainwashed – why don’t they ban them?’ The Marquis de Sade’s books, notorious for their misogyny, sadism and gruesome details, are still widely available to consumers.
Yet Tyler was detained for a piece of art, a dissection of human nature. It is undoubtedly wrong to restrain an artist’s expression in this manner. Tyler himself reflects this, stating ‘Now freedom of art and speech are at hand.’ In our current political climate, surely there are larger threats to British peace than a young artist’s means of self-expression, discovery and acceptance?

There is a particular irony in the fact that it should have been Theresa May who made this ‘moral judgement’ on behalf of the country. This is a woman who, since becoming Prime Minister, has bowed to the will of Donald Trump, proclaiming her faith in her ‘special relationship’ with a man who actively facilitates hate. If May’s desire to protect LGBTQ rights is so strong, why does she prance about with Trump, whose transgender military ban does anything but offer support for the community? The implications of Tyler’s supposed homophobia appear even more comical following his own coming-out, made explicit on his recent album Flower Boy. Yet, even prior to this, the accusations were largely nonsensical, clearly coming from a place of blatant ignorance: OFWGKTA, the hip-hop collective founded by Tyler himself in 2007, boasts notable LGBTQ alumni in Frank Ocean and Syd, with whom Tyler has repeatedly collaborated closely and whom he undoubtedly regards as close friends. The profound hypocrisy of Theresa May’s stance becomes clear given that her own record on LGBTQ issues is far from spotless. In 2010, May’s first act as Home Secretary was to ensure that public bodies did not have to actively try to reduce inequality, whilst just last year May hosted the Ugandan MP Jovah Kamateeka, who hopes to pass an anti-homosexuality law in Uganda that would introduce life-long imprisonment for gay and lesbian couples. Tyler – on the basis of deliberately provocative acts of rebellion and artistic expression from his teenage years, which, unlike those of most teens, were lived under the microscope of the media – has been identified, targeted and morphed by May into a scapegoat for societal evils which he does not, and never has, represented. May’s eagerness to seize the opportunity to vilify a young black gay artist, who is in fact blooming into an ironic gay icon for this generation, may be evidence of her ongoing, innate discomfort with the LGBTQ community.

May’s chequered past with LGBTQ issues – voting in 1998 against reducing the age of consent for homosexual acts from eighteen to sixteen to bring equality with the law on heterosexual acts, voting against a Bill allowing gay couples to adopt in 2002, and remaining absent from four votes on the Gender Recognition Bill in 2004, before finally voting to introduce Civil Partnerships for LGBT couples that same year – suggests the ban was a means of disguising her past disapproval of homosexuality. Given the drastic evolution of May’s own stance, her decision to deprive an artist, who carries the possibility of a massive positive influence upon the youth of today, of the opportunity to share his own evolution with the public is baffling. Was this evolution simply a convenient mask which May wore to fit in with David Cameron’s more ‘inclusive’ brand of Conservatism? Was her ban an act of good will, or merely a quest for a tangible villain? May’s actions seem likely to have been a means of ‘proving’ her progressive thinking on LGBTQ issues to the world by banning someone who seemed to be attacking the community – an act which she undertook without bothering to take into account the whole truth behind Tyler’s body of work, and which, ironically, ended in attacking a member of the LGBTQ community.

Was the decision to ban Tyler from the UK ultimately a reflection of an ultra-sensitive, overly prescriptive society, in which influential people keen to be seen to be doing the ‘right thing’ act on knee-jerk reactions and superficial interpretations rather than really listening to what ‘provocative’ artists are trying to say? Tyler conveys this himself, explaining, ‘It’s like the world is scared of everything. I feel like everyone is so sensitive to everything, and if they don’t like something it’s like: Oh my God, I don’t like the colour yellow – let’s get yellow banned from every country, let’s sign a petition – let’s start a hashtag to make sure this colour is never seen, because I don’t like it and I don’t understand it.’ And this is what Tyler wants to do – paint the world yellow, inspire and excite fans. From the nauseating darkness of his Goblin days to the brightness and optimism of Flower Boy, his evolution is a potent one, reflecting the reality of the vagaries of life and the struggle to accept one’s sexuality. Who would’ve thought that the obscenity-filled works of B****** and Goblin would plant the seeds for Flower Boy to grow? Whether it be telling ‘black kids they can be who they are’ on ‘Where This Flower Blooms’ or supporting the Black Lives Matter movement on ‘Foreword’, Tyler truly has bloomed into a role model for his fans.



Thomas Gillen: Reduce, Reuse, Recycle: How the people alone can’t stop Climate Change

Another doom-and-gloom headline flashes across your computer screen. The fifth Horseman of the Apocalypse, Climate Change, has trotted into town, cutting down the polar bears, scorching Greece and sunny Siberia, purging the ice sheets and pillaging the coastline, and only one word is left in its wake – you. Global warming is one of the greatest challenges of the 21st century, threatening the delicate balance of entire weather systems and more as the average global temperature rises – yet it is constantly spun into the individual’s problem, deflecting the public eye from the politicians and corporations obfuscating the issue in the courts for their own selfish agendas. I feel that the corporatocracy of today is the harbinger of a bleak tomorrow in the face of a worldwide crisis.

The paragons of anti-intellectualism and downright scientific denialism among those able to effect change – the elected – are no small sign of this pervading problem in politics. With very few scientists entering political professions, parliaments are ruled by people poorly informed on crucial climate legislation and basic science: when Scott Pruitt, the current administrator of the US Environmental Protection Agency in one of the world’s major carbon-emitting countries, actively assists the repeal of important legislation in the crusade against global warming, the environment is not in good hands. I personally feel that the lamentable lack of scientific representation in government circles is hindering the ability of key countries to act against man-made climate change, and the public’s ability to make waves on these issues wanes because of it.

Not every government is so apathetic towards the world’s plight, but even the engaged ones pursue debatable practices. Nuclear power is a developing, and very promising, energy industry that is regularly demonised by some in the political sphere. The energy output of six grams of uranium-235 is roughly equivalent to that of a metric tonne of coal – and yet all you hear is Fukushima, Chernobyl! The European Union (EU) was a leading proponent of the 2015 Paris Climate Agreement, yet key members remain sceptical as the world’s hourglass runs ever drier – Germany’s reputation for efficiency is hardly enhanced by the fact that its renewables and nuclear industries barely cover more than its fossil fuel usage, with no clear plan for phasing out fossil fuels in the near future. For every green, glowing France there is a soot-covered Argentina, and with greenhouse gases flooding from the energy sector I think the nuclear fears being stirred by some political leaders are disingenuous and could have far-reaching consequences.

Renewables, such as hydropower, fare somewhat better, with a cleaner past than the alternatives, but even they are fraught with trouble. Scotland is practically a world leader in wind energy (‘Scotland is home to the biggest renewable energy resource in Europe. We will set ambitious renewable energy targets and government funding will support low carbon technologies, energy storage and transport alternatives’), yet the UK recently announced a 56% cut to funding in that sector of the energy industry at a time when renewables are still in dire need of help – which once again reflects a running theme in the climate discourse: the sacrifice of progress in favour of short-term economic benefit.

There is, however, a price to all of these potential benefits. The start-up costs of these industries are high and not to be dismissed, with potentially billions – trillions, by some estimates – of pounds having to be invested in low-carbon methods to make any sort of worthwhile waves. Professor Gordon A. Hughes in Edinburgh painted the ever-so-cheery picture of £16 of energy by today’s standards going for £38.50 and more, and that is not even the tip of the iceberg when it comes to funding the ‘cheap’ alternatives – and while both renewables and nuclear are relatively cheap to run once they are set up, they still have their own issues. Nuclear is potentially vulnerable to exploitation by terrorist organisations in both the first and third worlds, with Al-Qaeda allegedly having obtained schematics for various nuclear facilities – the fallout of a dirty bomb alone is a high risk to innocent lives. There is a caveat here: the nuclear industry recognises this risk and has made preparations for such scenarios, involving military intelligence and more. And fossil fuels, while cheap in the short term, carry much larger costs. The environmental disasters to come, from floods to heat waves to harsh winters, will cause far more damage than our worst nightmares – trillions of pounds of property losses, wars over what little scraps of oil can be gathered from depleted sources, and, the greatest loss of all, life. When the dust settles, any cost now is going to seem like nothing.

Politicians, however, are not the only ones responsible – they are more a pawn of the greater culprit. The corporate impact on the environment is not to be underestimated: with 71% of all greenhouse gas emissions coming from just 100 companies, including the likes of ExxonMobil and Shell, the regular adage of ‘drive less’ and ‘eat less meat’ loses its potency. The unfortunate truth of the matter is that a coordinated effort to phase out staples of society like meat is far down the road, if it comes at all, but the responsibility of these companies to reduce their emissions is still there – and while Big Macs remain in high demand, poor infrastructure and a lack of subsidisation in these industries will continue to fester like a tumour, putting profits above improvement. Personally, I’d rather not die to cow farts.

The constant shifting of blame in the climate debate sets a terrifying precedent, and it is not being addressed by the top brass with nearly enough force. The public’s responsibility to combat climate change is real, but the complete lack of a unified vision and focus across the world is a much scarier thought. The Earth will always find a way to continue turning, and another extinct species – humanity – isn’t going to stop it.


Rachael Eadie: Give it a Rap!

Rap music is everywhere: in the entertainment we consume, as background music in the shops and restaurants in which we go about our daily lives, and even in advertising for mainstream brands like Pepsi or Gap. It has become a global phenomenon, one of the most popular and lucrative music genres in the world, creating worldwide superstars and legions of adoring fans. Surely a force for good? Well, yes – if your idea of positivity is explicit language, the glorification of gang violence, the perpetuation of racial stereotypes, misogyny, drugs and a fixation on money and materialism. Are these values we really want to encourage? If rap catered only to a minority taste this wouldn’t be such a big deal, but since it is now the most popular music genre in the United States, part of the mainstream in western culture and rapidly increasing in popularity around the world, isn’t it time for some types of rap music to change their tune?

It wasn’t always this way.  I struggle to understand how something so poetic in origin – rooted in the storytelling culture of Africa, and used so successfully by early artists such as Grandmaster Flash as a vehicle for highlighting issues of injustice, oppression and poverty – has to such a large extent become corrupted in its values, hijacked by the corporates and turned into a global money-making machine.  Nowadays the mere mention of the words “rap music” conjures up too many negative images.

The objectification of women is a huge issue in some types of rap music, particularly the hardcore and “gangsta” sub-genres (which also happen to be the most lucrative ones).  To my mind, the lyrics and the visual representation of women in these rappers’ videos are more often than not offensive.  What kind of example is this setting for young women today?  How many rap videos portray a strong, independent, intelligent woman asserting her authority over men?  Instead all we ever see is a succession of submissive, scantily clad women portrayed as sex objects.  If that’s all you’re exposed to when you’re young, you’ll start to think that it’s normal.  In the twenty-first century we are surely beyond the point where the sort of goal women set for themselves is to see who can be the most “bootylicious”.  Particularly in the wake of the recent Harvey Weinstein scandal, glamourising the exploitation of women can only undermine the message of the #MeToo movement.  There’s enough misogyny around already: the last thing we need is it being constantly blasted in our ears and shoved in our faces.

I also don’t get how, at a time when we are encouraging tolerance in so many other areas, many rap artists seem to get away with expressing sentiments and using words like ‘hoe’ and ‘n***a’ which, in any other context, would be considered racist, sexist or offensive to the point of being totally unacceptable.

Another area where some rap music creates controversy is the manner in which its lyrics glorify violence and glamourise criminal activity.  Think of all the rap songs that latch onto the same depressingly recurring themes of scoring drug deals, knife crime, drive-by shootings and aspiring to be the next big gang leader.  As Eazy-E raps in his song “Boyz-n-the-Hood”: “Little did he know I had a loaded twelve gauge/One sucker dead LA Times front page”.  For some artists this does in fact represent the reality of their lives, as a few have found out to their ultimate cost – the East Coast/West Coast gang rivalry, for example, which claimed the lives of the rappers Notorious B.I.G. and Tupac Shakur.  The irony, though, is that other rappers, Drake being one example, will happily create a “gangsta” alter ego for themselves for the purposes of commercial success when in fact they come from backgrounds a million miles removed from the deprived neighbourhoods of South Central LA.  What angers me is that this is not only misleading but irresponsible.  Many people idolise these artists and see them as role models, thinking that sort of lifestyle is something to aspire to and imitating their behaviour in the belief that it’s the cool thing to do.

So many artists in this genre seem to obsess about appearances and materialism, as if name-dropping designer brands, high-end luxury goods and top-of-the-range sports cars gives them some sort of kudos.  Maybe if more rap were less about getting the latest Rolex and more about getting a decent set of values it would set a better example for its audience.  (But then, Kanye West didn’t get to be a billionaire by promoting the values of modesty, selflessness and caring for others: he got there by promoting his music and his trainer brand, Yeezy.)  Yet this unhealthy fixation on designer “bling” can only serve to emphasise the gulf between rap’s megastars and their audiences, many of whom can’t afford even to dream about the luxury Caribbean holidays and endless bling enjoyed by those they idolise.  As Chuck D, leader of the group Public Enemy and one of the most prominent voices in politically and socially conscious rap music, cleverly observed: it is hardly the stuff of Robin Hood that the route for many of today’s rap stars to achieving success and funding their own lavish lifestyles seems to be to exploit their own fan base, much of which lives in relative poverty.

It would be an oversimplification to suggest that all rappers subscribe to the language of crime, violence and misogyny. Yes, there are the socially conscious rappers who denounce violence, whose messages are inspirational and who seek to challenge, instead of perpetuate, the stereotypes. There are those voices promoting a message of love, peace and understanding rather than one of hate, tension and intolerance, but they are at risk of being drowned out. If rap is to return to its historical roots as a force for good for its ever-growing audience, it’s time to give more airtime to the likes of Frank Ocean and Stormzy and to call time on “gangsta” rap and its negative influences.


Madalena Loughlin-Gomes: The Real Death Cure?

Transhumanism: a not-so-new new-wave global movement describing itself as ‘a class of philosophies of life that seek the continuation of the evolution of intelligent life beyond its currently human form, by means of science and technology.’ This belief, which first made headlines in the 1990s, has steadily gained support ever since, and while I was initially highly sceptical, there is no doubt that those pushing the theory are on to something commercially and, just possibly, philosophically too. This is a world that I regarded as belonging to some distant future with flying cars and teleportation, but a quick Google search revealed whole businesses, bitcoin economies and ways of life revolving around the belief that humans really can – and ought to – live forever. Yet, does the vast scale of this movement make it morally correct? What justification is there for obtaining immortality other than selfishness?

What first caught my eye from the plethora of transhumanist organisations was a cryopreservation institute: the ALCOR Life Extension Foundation. Just 40 years after its foundation and with fewer than a dozen full-time employees, ALCOR made over $2 million in revenue in 2017 alone, making it the leading cryonics institute globally, and its CEO, the transhumanist activist Max More, a millionaire. So, what exactly is this multi-million-dollar enterprise? Cryopreservation is essentially the preservation in liquid nitrogen of people who would otherwise die due to the limitations of today’s medicine. I was quickly dragged down the rabbit hole of ALCOR’s online world of sci-fi-like inventions and possibilities. There were webpages covering everything from case studies and cryopreservation demonstrations to an FAQ section for ‘bio-luddites’ (non-believers in the transhumanist world).

I remember tentatively clicking on the ‘cryopreservation process’ page and being surprised to find out that it involved no freezing whatsoever, rather the replacement of blood with a solution to stop cells from bursting at sub-zero temperatures. I was more disturbed, however, by the discovery that when critically ill patients are close to de-animation (i.e. death – transhumanists only refer to death in quotation marks and with a high degree of scepticism, as death loses its omnipotent connotations if you believe in immortality), there’s a ‘standby team’ near them at all times, complete with bags for when the patient’s blood is sucked from their body, and an ice-bath to plunge them into minutes after ‘legal death’. These various tools would surely be a harrowing sight for patients who know that their literal lifeblood will be drained from their veins seconds after their heart stops beating (after all, no one dies in front of the crematorium, or in a morgue). Nevertheless, over 3600 people from all over the world have paid up to $220,000 for a lifetime membership to ALCOR for cryopreservation. The possibility of ‘resurrection’ must be an alluring concept to those with terminal illnesses, or even those who simply have enough money for membership. However, the more I thought about cryopreservation, the more questions I had. The essence of this moral dilemma boils down to one thing: a battle of science and ethics. The ever-evolving argument between ‘Can we do it?’ and ‘Should we do it?’

Trying to make sense of the ethical implications of cryopreservation is enough to make anyone’s head spin, because almost every part of it is completely hypothetical. However, if we theorise that ‘reanimation’ is possible, what are the real-life implications for the patients? No expert in the world can accurately envision how waking up 200 years in the future alone, or perhaps even surrounded by their own descendants, will affect someone’s mental health. Will their memories remain intact? If not, will they really be the same person? Is it not from our memories that our sense of self – our individuality – stems? What are their human rights? What if society has evolved so much that their level of intelligence isn’t high enough to play any real part in society? The unfortunate and frustrating truth is that no one knows, but it seems that many cryopreservation believers have accepted this. Dr Ralph Merkle (ALCOR member and Director since the 1980s) stated in a video interview quite simply that ‘Cryonics is an experiment. So far, the control group isn’t doing too well.’ A little morbid, yes, but still a solid point that lends the enterprise some scientific respectability. As for my many questions regarding what would actually happen to the patients if they are revived, I was left with no answers. It seems that transhumanists are still fighting to prove the effectiveness of the cryopreservation process but have not yet put much thought into what will happen if it actually works. However, poor mental health and unemployment aren’t the only problems that resurrecting people 200 years in the future may cause.

In a world where massive population expansion is leading to completely unsustainable levels of pollution and global warming, is it really ethical to store vast numbers of people who could eventually be introduced to what will likely be an even more over-populated world than the one we already live in? We are all too accustomed to the shocking statistics of over-population: over half of our forests and wetlands have vanished in the past century, all due to the population more than doubling in only four decades. If humanity is to be nine billion strong by 2038, what about 200 years from now? We may even have colonised the galaxy by then (yet another problem for the ALCOR patients – have the FAQ experts thought about how bodies floating in liquid nitrogen will fare in zero gravity?). Perhaps they will get their own planet, a sort of time-warp or possibly even a museum of Earth 200 years before they were reanimated. Whichever way it’s looked at, reanimation will surely only worsen our ongoing disaster: if even a third of our population is cryopreserved as standard by then, the projected figures for future populations will be wrong by a couple of billion.

I can understand why the future of these patients and cryonics in general remains unclear. However, there remains one question that still keeps me up at night: what happens to the reanimated when they die again? Will cryopreservation be seen as the new burial? Or will we all eventually be an omnipotent consciousness, wired into a hard drive by that point? In fact, transhumanists have dubbed this merging of human and technological intelligence the slightly ominous and Matrix-esque ‘Singularity’. The most likely option would be that if cryopreservation is successful once, it will be used again, thereby continuing the cycle of consciousness. It is at this point that cryopreservation loses all appeal for me. Who wants to be truly immortal? Real immortality isn’t even fathomable to most people, yet there are some who actively seek it, and believe it will happen in their lifetime. These are the immortalists: another worldwide network that’s just as real and, perhaps, even more mind-bending than the transhumanist movement. The anthropologist Abou Farman stated that ‘Paradoxically, Immortalists believe that given the development of scientific knowledge, humans can enjoy life after death, yet it is precisely their attachment to life in this world that leads them to this faith’. There isn’t a way to ponder the ethics of cryonics without spiralling into all sorts of life-questioning dilemmas, but if the scientific basis for cryonics is divided and uncertain, what else could we turn to for guidance when navigating the murky waters where philosophy and science collide?

For many, guidance on the morality of cryopreservation stems from their religion, although in our largely secular society ethics and religion increasingly part ways. A starting point for those who follow Christianity, for example, would be that humans should not actively seek to extend their life past what it naturally should be on our finite Earth; they should accept that death is part of life, and that they are destined for peace with God in Heaven. There were many such comments in the anonymous ALCOR FAQ, with one particularly memorable reply being ‘flying is unnatural for humans, but there’s no moral opposition to planes!’. Granted, this logic was a little rough around the edges, but I could genuinely see where they were coming from. But if reanimation becomes the norm, or the Singularity is achieved, what happens to God? How will new people be born? Surely computers can’t just programme a new consciousness? Will Heaven just stop receiving souls? Are there souls in the Singularity? However, it seemed that the transhumanists were busy answering the hundreds of other bio-luddites’ queries, as I unfortunately got no response when I posed my questions to the members of the FAQ page, not even a witty comeback.

To conclude, the world of transhumanism and cryopreservation is a web of moral, ethical, philosophical, scientific, and religious dilemmas. Unfortunately, my original aim of deciding whether cryonics was morally correct or not was lost somewhere between the fifth article on the transhumanist argument as to why cryogenically reanimated cyborgs should be given citizenship rights, and my third email to the Cryonics Institute regarding my confusion about absolutely everything. Whether or not I will choose to become a member of the ALCOR community, and float around in liquid nitrogen for a few centuries in a tank full of strangers – both bodies and heads – for a chance at reanimation remains to be seen, but one thing is for sure: I’ve got a lot to think about, and many websites to scour before plunging into the ice-bath.



1. Anon., ‘What is transhumanism?’: (accessed April 2019)

2. ALCOR Life Extension Foundation (information): (accessed March 2019)

3. Cryonics Institute: information on membership, statistics, processes and case studies: (accessed March 2019)

4. Dr Merkle’s video interview for the Humanist Community in Silicon Valley: (accessed April 2019)

5. Asad, Talal, ‘Thinking about the secular body, pain and liberal politics’, Cultural Anthropology, Vol. 26, No. 4 (November 2011), pp. 657-675, for the American Anthropological Association: (accessed April 2019)

6. U.S. Transhumanist Party, ‘Transhuman Bill of Rights’: (accessed April 2019)

Land of Hope and Glory? – Nina Snedden

Trauma. Torture. Torment. All of which should be synonymous with the turmoil imposed upon millions by the British Empire. Yet, on the horizon of a post-Brexit Britain, a sickening sense of national superiority seems to have emerged from the dewy shades of the British Empire, once extolled by many as the ‘empire on which the sun never sets.’

An underlying nostalgia for the imperial dominance that the British Empire once brought, a sense of chauvinistic pride surrounding it, and faith in the supposed stability it secured (despite its harrowing treatment of countries such as India, Yemen and South Africa) are detectable within Britain today. A large section of the British public seems trapped in a web of blind glorification through denial or blatant ignorance. Despite the shocking accounts of imperialist atrocities now widely available to the British public, many Brits seem, even with the knowledge of these events, to harbour a once-dormant sense of pride in the Empire’s past assertion of power and dominance over other countries. In recent years, this mentality seems to have erupted once again, fuelled by the jingoistic sentiments of xenophobic politicians and recent events in Britain. A YouGov survey shows that 59% of the British public are proud of the Empire, only 19% are ashamed, whilst 23% don’t know. These results imply a sense of amnesia throughout a large section of Britain regarding British imperialist abominations. During the Boer Wars, Britain was responsible for the death of 10% of the entire Boer population in one year alone, including 22,000 children; yet a large percentage of the British population remains deluded by the miasma that obscures our nation’s understanding of our own history. How can this be?

Many empire fetishists argue that colonies profited and prospered under the red, white and blue of the gaudily coloured Union Jack parasol. Niall Ferguson, author of Empire: How Britain Made the Modern World, falls under this category, writing: “… no organisation in history has done more to promote the free movement of goods, capital and labour than the British Empire in the nineteenth and early twentieth centuries. And no organisation has done more to impose Western norms of law, order and governance around the world”. Yet Britain in fact kept its colonies and subjects in the shade, confining them to the dark shadows of exploitation. The Empire’s indisputable intention was to plunder countries of their natural resources and labourers, with an utter disregard for the suffering of those living under its rule. Ashley Jackson, Professor of Imperial and Military History at King’s College London, comments, “The basis of empire is that you rule other people, you deny them independence, you exploit their labour and resources, and a lot of the ‘good things’ were often incidental and secondary.”

At the recent ‘Unite the Right’ rally in Charlottesville, Virginia, where counter-protestors met white nationalists, a 32-year-old woman and two Virginia State Patrol troopers were killed and 19 people were injured. This points to a rise in white supremacist activity and a reluctance to condemn America’s history of slavery. Britain has heavily criticised the US and Trump, with Theresa May commenting that there was “no equivalence between those who propound fascist views and those who oppose them”, and stating that “It is important for all those in positions of responsibility to condemn far-right views wherever we hear them.” Yet bizarrely, Britain itself still seems to glorify its imperial past, shrouded as it is in impropriety, immorality and iniquity. The discussion of it is often carefully orchestrated so as to imply that colonies largely prospered under British rule. Statues erected in areas of Britain dedicated to such tyrants as Cecil Rhodes and Edward Colston go some way to prove that the fetishisation of British imperialism is still rife within a large section of British society today. Having made his fortune in the mining industry, Cecil Rhodes became focused upon the annexation of present-day Zimbabwe. Rhodes succeeded in creating the eponymously named ‘Rhodesia’ in an attempt to assert the British as ‘the first race in the world.’ Rhodes can be held accountable for essentially engineering the system of ‘apartheid’ in South Africa by separating the Africans working in his mines from the rest of civilisation, as well as seizing vast tracts of indigenous land and prompting the outbreak of the Second Boer War, which resulted in the deaths of 25,000 Afrikaners. In the context of Britain today, Rhodes would be widely regarded as a white supremacist, a racist and a criminal. Why is it that his statue adorns Oriel College, Oxford?

The Brexit vote in June 2016 further points to an underlying nostalgia for British imperial dominance and a hope to reassert Britain as a ‘world power’. The historian Margaret MacMillan said, ‘They’re talking about the glorious Elizabethan Age; they’re talking about that time that Britain ruled the world. It’s a fake sort of nostalgia because of course it doesn’t take into account the complexities [of the situation].’ This desire to return to an age of power and influence requires the renewal of trading relationships with past British colonies. In a speech in July 2017 Theresa May referred to ‘building new relationships’ and reaching ‘trade agreements’ with ‘old friends’. May’s reference to past British colonies as ‘old friends’ goes far to prove the extent of Britain’s delusion surrounding its nefarious imperial past. During the Bengal famine of 1943, many Indians perished under the hand of ‘the war hero’ Churchill, regarded by history as an honourable British leader. Yet it is unlikely that the 3 million Indians who died during this period would view Britain with the same bizarrely fond affection. Nor the 3 million victims torn from their homes in the colonies and enslaved between 1562 and 1807. And certainly not the Adenese, who were stripped of their clothing, sexually exploited, and forced into refrigerated cells in the torture camps opened during the Aden Emergency of the 1960s. Foreign Secretary Boris Johnson commented, “We used to run the biggest empire the world has ever seen, with a much smaller domestic population and a relatively tiny civil service… Are we really unable to do trade deals?” However, British colonies including India, having suffered under the violent rule of the British Empire for decades, have now economically, democratically and morally surpassed Britain. The rotting corpse of the Empire cannot and should not be resuscitated.

Entrenched supremacy, racism and discrimination remain palpable within the British mindset today. The undeniably jingoistic Last Night of the Proms is another clear example of the underlying nostalgia for imperial dominance that still exists in a large faction of British society. Songs such as ‘Rule Britannia’ and ‘Land of Hope and Glory’ extol the “virtues” of British imperialism. John Drummond, who ran the Proms for the BBC during the 1980s and 1990s, referred to being ‘moved from tolerant enjoyment to almost physical revulsion’ in response to the BBC’s glaring disregard for those who suffered under the tyranny of the Empire. Many argue that tradition calls for these songs to be played. However, it was not until 1905 that ‘Rule Britannia’ became a fixed song in the event, and not until as late as 1953 that ‘Land of Hope and Glory’ became a permanent fixture, and a particular source of frustration for anti-imperialists. Lyrics such as ‘Rule, Britannia! Britannia, rule the waves!’, as well as ‘Britons never, never, never shall be slaves’, imply a specific disregard for the millions of civilians enslaved and murdered under British rule. The undulating Union Jack flags hark back to a false memory of when Britons appeared to ‘rule the waves.’

Unless we collectively address and condemn our imperialist past as a nation, statues condoning slavery and torture will continue to stand, songs will continue to be sung glorifying an empire responsible for the deaths of millions, and Britain, although it will likely never return to the worldwide stature and superiority it once supposedly possessed, may continue to allow racism, violence and pain to be the foundations upon which power is built. We cannot allow Britain to regress in such a way.

Planned Obsolescence: Weapon of Mass Discarding or Catalyst for Progress? – Hannah Berry

Emitting a dim yellow glow in a fire station in Livermore, California, the Centennial Light has burned for a record-breaking 115 years since it was first turned on in 1901. Fast forward an entire century, and light bulbs are burning out and being replaced within months. If a light bulb designed in the 19th century can last for over one hundred years, why, in the late 20th and early 21st century, have light bulbs tended to last no more than a few months? The answer is planned obsolescence, a by-product of modern capitalism.

Frequent changes in design, society’s views on fashion and trends, the focus on ‘replace over repair’ and an astronomical use of non-durable materials are the largest contributors to planned obsolescence: a policy of producing consumer goods that rapidly become obsolete and so require replacing. Although believed by some economists to be a social necessity for driving technological advancement and innovation, planned obsolescence is unsustainable. Such a policy fuels society’s damaging consumerist culture and wasteful attitudes, leading to high manufacturing demands, the production of waste, natural resource depletion and damaging repercussions for consumers.

One of the most obvious injustices of planned obsolescence is the heavy burden it places on consumers. With the assistance of media, advertising and design changes, manufacturers frequently introduce new fashions, shaping consumers’ perceptions of which styles are deemed fashionable or trendy and persuading them that they must have these products. Fashion of any sort is a classic example of ‘perceived’ obsolescence: consumers are manipulated into believing that a seasonal fashion or certain clothing is no longer in style, so it must be replaced by new garments. The result is an ever-growing number of items wasted, at a high financial cost to the consumer.

This lifestyle has tremendous financial costs for consumers. Often equipment that needs repair becomes obsolete because the price of repair is comparable to, or higher than, the price of replacing the item altogether, or because the service or parts are no longer available, leaving the consumer with no choice but to replace a now dysfunctional item. For example, major corporations such as Apple and Samsung now design their smartphones so that the battery is sealed inside and difficult to replace, making the item functionally obsolete once the battery degrades. Other examples include software or design updates which make older versions incompatible with the new advancement, rendering the previous version functionally obsolete and forcing the consumer to invest in the new updates.

Over the past few decades, the expected lifespan of products has drastically diminished, so that most consumers today purchase products with the expectation that they will need to be replaced within a couple of years. In an attempt to boost the economy after the World Wars, the retailing analyst Victor Lebow articulated the solution that has become the norm for the whole system: “Our enormously productive economy… demands that we make consumption our way of life, that we convert the buying and use of goods into rituals, that we seek our spiritual satisfaction, our ego satisfaction, in consumption… we need things consumed, burned up, replaced, discarded at an ever-accelerating pace.” (Lebow, 1955)

From its first documented case in the 1920s, through its adaptation, popularisation and acceptance over the decades, consumers have become acclimatised to the practices of planned obsolescence. Planned obsolescence should not be normalised by society; doing so means turning a blind eye to ethically questionable practices and the destruction of the environment.

An even more serious concern, due to consumerist attitudes and our acceptance of the practice of planned obsolescence throughout society, is that the overall demand for the manufacturing of these products is rapidly increasing, and with it the demand for the Earth’s finite resources. Studies from the United Nations Environment Programme (UNEP) found that global extraction of materials has tripled since 1970, and not once in the last 40 years has materials extraction declined, even during times of recession and economic crisis. In the past three decades alone, one third of the planet’s natural resources have been consumed. We are cutting, mining, hauling and trashing the place so fast that we are undermining the planet’s very capacity to support human life adequately. By continuing to intentionally limit the useful lifespan of a product, making it unfashionable or no longer functional, manufacturers are driving unsustainable attitudes and practices and depleting the planet of its precious, finite resources.

Consumers often view planned obsolescence as a cynical plot by manufacturers and corporations to boost sales and profits while the consumer and the environment pay the price. Those in support of the strategy, however, believe it to be the catalyst and driving force for progress and technological advancement. When a new technology is developed, many previous inventions become obsolete. This can bring about truly innovative products, like the advancement from horse-and-carriage transportation to automobiles, or from the typewriter to the computer. However, far too often, planned obsolescence is justified by a slightly sharper camera phone, or slightly more memory, or a new operating system that confuses as much as it simplifies. Do we really need these things?

Plastic water bottles, cutlery, plates, cups, razors and bags, littering the countryside and the streets or dumped in landfill: today, we live in a ‘throw-away society’, a culture of over-consumption and the excessive production of short-lived or disposable products. Planned obsolescence is the leading cause of our wasteful consumer habits, and the constant manufacturing of these unnecessary products contributes greatly to pollution, which affects the water we drink and the air we breathe. The United States Environmental Protection Agency (USEPA) found that only 1% of the products we buy are still in use as little as six months after their date of sale. In other words, 99% of our consumption is trashed within six months. The products themselves end up in landfills, taking up precious space that is often at a premium. According to the UNEP, e-waste, or discarded electronic appliances such as smartphones, computers and televisions, is one of the fastest-growing sources of waste. On average a person keeps a smartphone for 18 months; whether the battery fails, screens or buttons break, or the operating system can no longer be upgraded, the immediate solution owners turn to is not the repair of the current device but the purchase of a brand-new one advertised as ‘better than ever before’.

The disposal of waste releases harmful toxins into the air, the surrounding soil and the groundwater. A large majority of this waste ends up in landfills full of hazardous materials, often in the world’s poorer countries, including Bolivia, Ghana and South Sudan. Jim Puckett, co-founder of BAN, an organisation for environmental health and justice, visited Ghana and saw teenagers and young adults working in the landfills, exposed to hazardous substances and burning discarded electronics, releasing toxic fumes into the air. The accelerating production of so much waste due to planned obsolescence has a great impact on the environment, contributing to waste pollution and endangering human life, not only in the countries that produce this waste but also in the developing nations that receive it.

If environmental and climate challenges are to be tackled, then the wasteful production and consumption patterns driven by planned obsolescence are not a sustainable way to stave off an economic crisis. Investing in more durable items and taking steps to minimise our participation in a consumer-focused society is the way out of a disposable and wasteful culture. Only truly innovative products, those which provide significant positive advances for society, should light the path to a sustainable future.

The unsustainable practice of planned obsolescence, through continual replacement rather than repair and the manufacturing of non-durable products, results in masses of waste, pollution, loss of biodiversity, the rapid depletion of Earth’s precious resources and high financial costs for consumers. These challenges must be tackled if we are to move towards a sustainable future, and that can only be achieved by rendering planned obsolescence obsolete.



Andrews, J. (2013). Planned Obsolescence. Retrieved October 29, 2016, from My Homemade Life:

Bloch, M. (2010, September). Planned Obsolescence and the Environment. Retrieved November 13, 2016, from Green Living Tips:

Brundage, J. (2016, July). Planned Obsolescence and Resource Extraction. Retrieved November 10, 2016, from Voices For Mother Earth:

Chan, A. (2013, May). Planned Obsolescence. Retrieved November 13, 2016, from THURJ: The Harvard Undergraduate Research Journal:

Dans, P. (2012, November). Economics of Obsolescence at the Expense of Consumers. Retrieved November 16, 2016, from Le Mauricien:

Hadhazy, A. (2016, June). Here’s the Truth about the Planned Obsolescence of Tech. Retrieved November 13, 2016, from BBC:

Hindle, T. (2009, May). Planned Obsolescence. Retrieved November 15, 2016, from The Economist:

Lebow, V. (1955). Price Competition in 1955. Journal of Retailing, 3. Retrieved December 9, 2016, from

Mastro, A. D. (2012). Planned Obsolescence: The Good and the Bad. Retrieved November 14, 2016, from PERC: Property and Environment Research Center:

Quiet Environmentalist. (n.d.). Is the Earth Doomed Due to Planned Obsolescence? Retrieved November 15, 2016, from Quiet Environmentalist:

Renner, M. (2016, September). Why Your Stuff Turns to Junk and Cooks the Planet. Retrieved November 13, 2016, from World Watch:

Smith, L. (n.d.). The Disposable Society. Retrieved October 29, 2016, from Investopedia:

Vince, G. (2012). BBC. Retrieved October 29, 2016, from BBC:

Wong, C. (2012, October). Planned Obsolescence: Buying into Consumerism. Retrieved November 13, 2016, from Economics Students Society of Australia:

WWF. (n.d.). Waste Disposal. Retrieved November 16, 2016, from WWF: World Wildlife Fund:

“Uno cappuccino, per favore!” – Honor McWilliams

It’s breakfast: a busy day awaits. You have to remember to send this to so-and-so; you need to tell that to some other so-and-so. Calls need to be answered, tasks must be completed and you have to transition from activity to activity without the slightest hesitation. Your head begins to swirl as you whisk through the never-ending list of ‘to-dos’ and you seriously question whether or not you will make it to lunch without collapsing.

But then salvation comes. You see it gradually emerge in the distance, sleek and round. A distinct aroma fills the air and somehow your racing mind comes to a halt. As it draws closer, you are mesmerised by the soft, white shroud. You reach out to greet your saviour, at last filled with warmth, hope, serenity.

“So, who’s having the cappuccino?”

Perfectly sized yet indulgent, simple but sophisticated, the cappuccino remains a cherished emblem of post-war Italy and her so-called ‘Dolce Vita’ or ‘Sweet Life’ in cafés and restaurants throughout the globe.

At least it should. Just last year, Starbucks announced that they were beginning to phase out cappuccinos in certain branches, replacing them with the more contemporary and stylish flat white. Staff complained that cappuccinos were too onerous to make in comparison with the quicker, simpler preparation of the flat white, given that both taste virtually the same. But since when was coffee solely about taste? Surely a drink as classic as the cappuccino should symbolise something greater than just momentary pleasure?

Peter Thomson, owner of the Coffee Hunter blog, openly gushed over the popularity of the trendy flat whites, claiming they represent a “new wave of independent, hipster-style craft coffee.” When asked about the consequent cappuccino apocalypse, he struggled to hide the disdain in his voice when he said, “The cappuccino is a relic of when the whole world aspired to drink coffee Italian style.”

Aspired?! Who says we don’t continue to dream of discovering a modest little café which serves the most ‘bellissimo’ cappuccino, tucked away down a side street in the Eternal City or elsewhere in Italy? Perhaps cappuccinos are indeed remnants of simpler times, yet this does not mean that they should be forgotten or neglected by your local barista and left to collect dust. They should be preserved.

The humble origins of this frothy masterpiece date back to as early as the 16th century. The Capuchin Friars, an order of the Franciscans, were widely celebrated for their incredible service to the underprivileged and destitute while adopting a lifestyle of poverty themselves. The mere image of the Friars indicated this devotion to simplicity, opting to wear brown robes with long pointed hoods. It was from this distinctive hood, known as “cappuccio” in Italian, that the Capuchins were named. Little did these modest monks know that they were to serve as the unique inspiration for possibly the most elegant hot drink in history four centuries later.

Picture 1930s Italy. Amidst the economic turmoil following the 1929 Wall Street Crash and the oppressive fascist dictatorship of Benito Mussolini, society sought some form of escape. A small symbol of hope. At this very time, a mixture of coffee and milk topped with whipped cream and chocolate sprinkles began to emerge in Trieste. It became fashionable and the trend soon flourished throughout Italy. Many began to remark that the unusual light brown colour of the mixture resembled the habits of the Capuchin Friars: the early cappuccino was born.

However, the coming years were no less chaotic for the Italian people – the horrors of further global conflict shattered national morale, citizens witnessed the tumultuous destruction of Mussolini’s government and the economy was in a perilous state. Just when Italy seemed to be peering into a dark abyss, a miracle occurred. The Italian Economic Miracle of 1950-60, to be precise. Not only did the economy and society undergo momentous recovery, but Italian culture began to evolve with the improving times. Italians were now smiling at the sun.

This, though, was not the only miracle that occurred during this period. One equally significant cannot be ignored: ‘The Age of Crema.’ This mass development of highly sophisticated coffee machines, capable of preparing pristine coffee to utter perfection, revolutionised Italy. This single event ingrained the techniques and prestige of Italian coffee making all over the world, defining their culture and country.

Yet it was the humble cappuccino which captured the essence of this revolution. These machines were specially designed to heat and steam milk, refining the original cappuccinos into the modern concoction we drink today. One half made of aromatic double espresso, the other of hot milk, completed by steamed milk foam with a light dusting of chocolate. As Italian morale was rebuilt at the core of society’s new ‘Dolce Vita’, so too was the cappuccino.

Soon after its spectacular debut in Italy, the allure of cappuccinos spread throughout Western society. Europe, Australia and America all caught onto this trend in rapid succession by the early 1990s, and it was this coffee craze that fuelled the rise of shops such as Starbucks. While these were not meant to replicate traditional, family-run Italian cafés, they served to bring the flavour of Italy to a diversity of cultures.

Of course, as their success became unprecedented, such American companies began to see the incredible profit to be gained from drinks like cappuccinos. They quickly began to neglect the precision and care Italian craftsmen dedicated to the cappuccino’s evolution. Thought turned to hastiness. Perfection became sloppiness. A symbol of new life was now no more than a poorly made drink. They cared little for the ‘Dolce Vita’ it epitomised.

But this casual disregard of past heritage and ‘relics’ has become increasingly common in our modern society. In our desperation to constantly evolve and move forward, we forget to find value in looking back, fearing we will become cemented in the past. It seems absolutely necessary to cling on to the most stylish and trendy thing of the moment, yet we feel no remorse once we desire to toss it aside upon the discovery of something new. The flat white may be ‘in’ right now, but in a few years’ time this too will be shoved from the shelves like the cappuccino.

We must keep advancing, yes, but not to the detriment of everything that has brought us to where we stand today. We must learn to do one simple thing from time to time – pause.

When I have a cappuccino with my breakfast (never after 11am – that would be sacrilegious to Italians!) I’m able to stop for a while. Pondering over the brim of my coffee cup, I begin to feel more revived. After all, the cappuccino emerged in the wake of the very revival of the Italian people. I realise that although I’m anxious to get on with my day and complete the endless tasks that I have, I should make time to appreciate my past and present. From there I can gradually evaluate the future. It is vital that we find some way to feel relaxed or comforted, and for me it’s by having a cappuccino.

The cappuccino first defined the sophistication of coffee, and it always will. Starbucks can change their menu as much as they like, but they can never re-write history. The cappuccino represents renewal, hope and happiness. It is embedded in Italian culture and cuisine. It may not be as new as the flat white and other such trendy coffees, but it possesses a timeless style that can’t be poured away down the kitchen sink, no matter how much Starbucks may try.

So have your frappuccinos, toffee lattes or caffè mochas. Pompously order your deconstructed coffees, skinny cortados and soy gibraltars. Rave about your pumpkin-spice lattes, caramel macchiatos and flat whites. I’ll stick to my cappuccino, per favore.