Richard Lenehan: The Need to Atone

In 1929, the G’psgolox totem pole was taken, without consent, from a First Nations community in Canada to Stockholm’s Museum of Ethnography.  The settlers who took it did not understand, or did not care to understand, this artefact’s socio-cultural importance to that community.  A totem pole is carved from wood to commemorate death: as the wood rots and becomes one with the earth, so too do the souls of the deceased.  In its ignorance, the Museum preserved the totem pole indoors in storage, thereby “trapping” the souls that it commemorated by not allowing it to rot.  Only decades later, after a replica had been supplied, was the totem pole finally repatriated, allowing the community to heal in the knowledge that its dead were finally at peace.

This incident illustrates how significant cultural property is to communities, and why we need to address the colonial history of such artefacts in our museums.    Taking a totem pole from its community was akin to stealing a gravestone from this country – an action that we would see as clearly wrong.   Hearing about this made me think about cultural artefacts we have “collected” from other countries, and this essay will argue that these should be repatriated.  It is clear that these artefacts have stories to tell.  We should consider who has the right to keep these objects, and to tell their stories.

Our museums are filled with spoils from our imperial and colonial past.  Not only that, these objects tend to be displayed in ways intended to vindicate the actions of our ancestors in returning from overseas with the cultural property of others, and to tell the stories of these objects from the collector’s point of view, rather than in a cultural context.  This is wrong.  These items would be enriched if seen in the context of the place of their origin.  I am not arguing that we have inherited guilt for looting by our forebears.  I am however arguing that we have inherited responsibility for their actions, and that it is up to us to make things right.

Standard arguments in support of not repatriating artefacts include that they should be displayed in central western locations where they are accessible to the largest number of people, that they will be better looked after in our museums, that they contribute to our knowledge and understanding, and that they may never have been found if it were not for the “collectors”.

Museums are curated to elicit a particular emotional and intellectual response to the objects they display.  Their curators are, however, conditioned to view history from their privileged perspective.  It can therefore be argued that the true historical and cultural context and importance of looted artefacts not only cannot be appreciated here, but is also denied to the artefacts’ rightful owners.

In my opinion, another important reason for returning artefacts is that taking them without permission was stealing.  The stripping of relief sculptures from the Parthenon by Lord Elgin in the early 1800s is an example of this.  At that time, the Ottomans occupying Greece gave him permission to take small artefacts from the building, but not to interfere with its “walls or works”.  Removal of what became known as “the Elgin Marbles” was in contravention of his permit, which was, in any event, issued by those without cultural rights to the site.  This can only be described as theft.  A modern-day analogy would be if the United Nations, which had temporary charge of parts of Glasgow during COP26, had allowed delegates to take home historical Glasgow artefacts as souvenirs.  There is no doubt that this would have caused an outcry, and a justifiable demand for their immediate return.

This theft was compounded by the mistreatment of the Marbles under British care.  During their time in the British Museum, the Marbles were cleaned with a metal wire brush to make them look whiter, thereby destroying a lot of fine detail, such as muscles and sinews.  It is therefore hypocritical to suggest that they are better protected here.  In fact, the artefacts would have been better left in situ.  Indeed, at the time they were stolen, accurate casts of the Marbles had already been made, meaning that replicas could have been enjoyed in Britain, with the originals remaining in place to be viewed in their historical and cultural context. This is another situation that should be addressed by repatriation and apology.

There are also clear moral arguments for the return of artefacts.  There was an element of control in taking them from a territory in the first place – it was symbolic of taking control of the territory itself too.  These artefacts are not now easily accessible to the peoples from whom they were taken, and for whom they have cultural significance. 

Moreover, there are clear economic arguments for the return of artefacts.  Items of historical interest frequently come from less developed countries.  There is a real possibility that returned artefacts could form the basis of a tourist trade.  You can draw analogies with how Scotland has benefited so much from cultural tourism in recent years, and it would be unjust if other nations could not benefit from their cultural heritage due to the misappropriation of symbols of that heritage.

In the wake of recent consciousness-raising events such as the Black Lives Matter campaign, I believe that the fact that artefacts serve as reminders of past oppression is also important when coming to a decision on this point.  The shackles and yokes used on slaves in the southern United States of America in the 1800s are reminders of the atrocious acts committed, and the complete lack of freedom of people stolen from their homelands.  We acknowledge that cultural appropriation is wrong, and that dominant cultures should not appropriate from minority cultures.  This should be as true in relation to artefacts as it is in relation to behaviours, rituals or attire.

Museums need to review their acquisitions, and to ask critically whether they need to reframe the context in which they are seen.  They should also be asking whether the items belong with them, or whether they rightfully belong elsewhere.  If they belong elsewhere, then they need to start the process of repatriation, apology and healing.  This last year has shown us that people are questioning this country’s imperial and colonial past, and wanting to make some reparation.  To date this has taken the form of the removal of statues and monuments, but the return of looted artefacts to their communities seems like the logical next step to explore.

Bibliography:

https://projects.seattletimes.com/2018/artifacts-of-injustice/
https://traffickingculture.org/encyclopedia/case-studies/gpsgolox-totem-…

Juliet McKay: Black and White Films are Superior to Films in Colour

“The first knee jerk reaction of my kids is that they don’t want to see a black and white movie… 10 minutes into the picture, they don’t know whether it’s black and white or in colour.” (Steven Spielberg)

For many, black and white (B&W) films belong firmly in the past. This is understandable; 1961 was the last year in which the majority of films released were B&W. Despite this, two B&W films still grace IMDb’s list of the top ten greatest films as voted by users: one from 1957, when colour was already becoming commonplace, and the other from 1993, when shooting in B&W was a very clear, conscious, stylistic decision. These are Sidney Lumet’s “12 Angry Men” and Spielberg’s “Schindler’s List”. This suggests that there is still an audience capable of appreciating black and white films as some of the best movies ever made. Yet, inexplicably, many younger viewers refuse to watch anything in B&W, some of my friends and Spielberg’s own children included. I personally much prefer the look and feel of B&W and believe monochrome to be far superior to movies shot in colour for aesthetic, historic and genre-related reasons.

Nowadays colour is often assumed to be the more interesting and realistic option; however, popularity seldom equals greatness. B&W provides a simple, beautiful quality that colour is unable to replicate or replace. Over time, B&W has been overtaken by colour and now remains a rare artistic choice. Since most of the content I consume daily is in colour, I pause when I see something in monochrome because it allows me to dive into a whole other reality. Films aren’t real. We use them as an escape to another world, not simply a reflection of our own, and B&W enhances the experience. We live in a world full of colour; why would you want to watch something so familiar? B&W can be utilised as a tool to embrace the distinction between the real world and the fictional place the medium transports us to. Frank Darabont, celebrated director of “The Shawshank Redemption” (1994), believes that this unique view of the world “is what makes black and white so very cool.”

Remarkably, B&W films are also able to achieve the very opposite and make a film feel even more real. Director and screenwriter Samuel Fuller said, “Life is in colour, but black and white is more realistic.” This can be done by giving a film a serious, gritty documentary tone – “La Haine” (1995) – or by making it feel authentic to the time period – “The Elephant Man” (1980).

B&W can place a movie in a specific time period by creating a link to the past; “Ida” (2013) succeeds beautifully in establishing its setting as bleak, post-war Poland. It can also be used to pay homage to certain genres or film techniques. Noah Baumbach chose to shoot his film “Frances Ha” (2012) in monochrome to mimic the French New Wave movement of the late fifties and sixties, whose films were usually B&W, used low-budget, simple techniques and rejected typical film conventions. I really love that B&W is still being used to pay tribute to some of the most influential periods of cinema and is often the perfect choice.

Classic Hollywood, a time rightfully referred to as ‘The Golden Age’, catapulted stars such as Humphrey Bogart and Katharine Hepburn to icon status and was a hugely influential era of cinema. The grayscale glitz and glamour of this era in cinema history is, I believe, unmatched. B&W is an integral and iconic feature of films made in this period. Classics like “Casablanca” (1942) and “Citizen Kane” (1941) were colourised and rereleased by Ted Turner of Turner Classic Movies in a failed attempt to attract viewers, proving only that films intentionally shot in black and white should be left that way. The Golden Age of Hollywood was an important time that revolutionised many aspects of the film industry, and these films remain essential viewing. Monochrome is perfectly suited to this era because so many of its popular themes are enhanced by the lack of colour and the contrast between black and white: paranoia, suspense, morally ambiguous characters, good versus evil and an often-cynical view of the world.

Furthermore, film noir, one of this period’s most iconic genres as well as my personal favourite, would not exist without B&W. The monochrome enhances every aspect of these films, which include “Double Indemnity” (1944) and “The Big Heat” (1953), from their dark atmospheres to the figures that emerge from the shadows, cigarette in one hand and pistol clutched in the other. “The Man Who Wasn’t There”, the Coen Brothers’ 2001 film, mimics the style of film noir through its use of B&W. Other neo-noirs, filmed in colour, for example “LA Confidential” (1997), use popular film noir tropes yet, along with the loss of B&W, the essential noir atmosphere and look is also lost. In that movie, when audience and protagonist are introduced to Kim Basinger’s femme fatale, she is dressed head to toe in black and white, paying homage to the genre’s inspiration and suggesting the director would have preferred it to be monochrome. Guillermo del Toro has a star-studded neo-noir coming out next January in colour. Although I am looking forward to it, would it be better in B&W? Obviously, the answer is yes.

Black and white films should not become a thing of the past. They have captivated audiences for over one hundred years and I hope that they continue to do so for another hundred. I would love to see more films make this stylistic choice in modern cinema, but I also think it’s very important to continue to watch the classics. Glorious technicolour was a revelation when the world was first introduced to it, but now films in colour just feel too ordinary. Even some of my favourite films in colour are ones made by directors like Alfred Hitchcock, who started in black and white and continued to use it when colour became available, only choosing colour if it was to play a significant role in the storytelling. Through perfecting the craft of making films without colour, he shows that you can tell a story flawlessly without it. Indeed, an article in Variety recently predicted that the cinematography category at the 2022 Oscars may be dominated by B&W, including films such as ‘Belfast’ and ‘The Tragedy of Macbeth’, showing that monochrome might be making a well-deserved comeback. While some may still disagree, for me, colour has never emerged from the gigantic shadow cast by black and white cinema.

And cut!

Bibliography:

https://www.rogerebert.com/interviews/casablanca-gets-colorized-but-dont-play-it-again-ted

Sidney Lumet: Interviews by Sidney Lumet

https://www.infoplease.com/culture-entertainment/film/movies-and-film-aesthetics-black-and-white-and-color

Steven Spielberg on the Importance of Studying Classic Films – AFI

https://variety.com/2021/artisans/awards/female-cinematographers-could-dominate-oscars-1235104934/

Zoe McGinley: Should Chocolate be kept in the Fridge or the Cupboard?

It’s hard to find someone who doesn’t like chocolate: we are a race of chocolate connoisseurs. There is no argument that the feel-good chemicals released from its consumption play a massive part in how so many of us find chocolate so delightfully irresistible. But the real debate is not about whether a Snickers or a Mars Bar satisfies the palate more, or even about how each of us prefers to eat our Creme Egg. The much less documented but highly contested argument which has been splitting opinion between families and friend groups is… should chocolate be eaten straight from the fridge or not? Of course it should! There are simply no words in the English language that can fully describe the euphoric sensations of a cold Cadbury’s Marvellous Creations sweetly and tantalisingly caressing the taste buds.

Chocolate is a renowned and popular household treat today but, surprisingly, many people aren’t completely familiar with its full history. It is thought that chocolate dates back to the Olmecs of Latin America around 4000 years ago, who picked the fruit (pods) of cocoa trees, dried and roasted the beans and then used them to create a chocolatey liquid. There is further evidence, centuries later, of the Mayans creating a warm ‘brew’ of ground cocoa seeds, chillies, water and cornmeal, which they named ‘xocolatl’. By the 15th century, the Aztecs believed that chocolate was a gift from the god Quetzalcoatl and, realising its widespread demand and use as an aphrodisiac, used cocoa beans as currency.

Of course, over time things like sugar and honey were used to sweeten the bitter taste of chocolate, which ultimately led to the birth of a new method where the cocoa butter was squeezed from the beans to make a powder which was mixed with liquid and then poured into moulds. Thus, through the added genius of master chocolatiers, chocolate evolved from a tangy and presumably unpleasant drink into the sweet, deliciously indulgent confectionery we know and love today.

When Swiss chocolatiers Daniel Peter and Henri Nestlé added a little milk powder into their cocoa mixture, this opened the floodgates for companies like Cadbury’s, who have absolutely mastered the art of chocolate making by producing, in my somewhat connoisseur opinion, the best milk chocolate on the planet. Of course, others may contest that opinion but that’s not the issue I want to debate here – the real argument is whether chocolate tastes better straight from the fridge. Yes, we all purchase our daily or weekly (ok, sometimes monthly) indulgent supply straight off a room-temperature shop shelf, but I think that there is simply no better way to eat chocolate than straight from the fridge! Some agree, some disagree, and some just don’t want to admit that they agree. I fully understand that taste is subjective and this is all just a matter of opinion; however, there is in fact scientific evidence to back up this delicious preference.

An article from 2012 by Chemistry Matters states the reasons why chocolate does indeed taste better from the fridge. This is all to do with polymorphism, the ability of a solid to exist in more than one crystal structure. These structures are called polymorphs. It’s all a bit too technical to explain in scientific detail but, essentially, the ingredients in chocolate have numerous properties that react differently at different temperatures. Ok, you must be thinking: what does this have to do with why we should store chocolate in the fridge? Well, in a nutshell (a Fruit n Nutshell), some polymorphs are too bland and too brittle on their own to act as chocolate, and other properties can change if left at room temperature, creating a distinct change in taste. By keeping chocolate in the fridge, you hold its crystallisation steady and prevent the polymorphs from changing as they would while sitting in a cupboard at room temperature. Basically, when chocolate is stored in a fridge it is of course colder, which adds an additional level of flavour to release tantalisingly over the taste buds as it melts in the mouth.

This whole debate has proven to be somewhat contentious, with hugely divided opinion over the issue, not least within my own household. Yes, there are some ‘non-fridger’ members of my family who are brave enough to risk my wrath by having the nerve to remove our chocolate stash from the fridge, claiming that it should indeed be enjoyed at room temperature. As a more heated debate ensued, we all agreed that the only way to settle the argument was to find some official conclusion from the big confectionery companies as they’re the experts, right? Wrong! In reply to a recent online blog which asked readers whether chocolate should be kept in the fridge or pantry, Cadbury’s themselves waded into the matter to state: “Chocolate should always be stored in a slightly cool, dry, dark place such as a cupboard or pantry at temperatures less than 21C to ensure the quality isn’t compromised”. So who do we trust – those who spend years in university to become scientists or those who work in the factories watching the machines do the chocolate making?

But what about melted chocolate? Well, that argument I understand: there’s nothing better than the experience of coming home to make a cup of hot chocolate after a long winter’s day, or the texture of biting into a perfectly melted chocolate cookie straight from the oven. My question is, who would want a room-temperature chocolate bar melting into their hands on a hot summer’s day?

Who are these “experts” to tell us the “correct” way to eat our chocolate when really, it all comes down to preference? Should we trust the claim from Cadbury’s that they know the perfect chocolate storage conditions for ultimate flavour when, in reference to their Creme Egg, they have devoted a whole advertising slogan to consumer choice, asking ‘how do you eat yours’? It’s also a safe assumption that the Aztecs would not have believed their chocolate drink to have come from just one god, but rather to be the ultimate gift from all the gods, had they only had access to a fridge!

So now, I encourage you, stick your favourite chocolate bar in the fridge and tell me I’m wrong.

Bibliography:

https://www.history.com/topics/ancient-americas/history-of-chocolate

https://danthechemist.wordpress.com/2013/02/12/why-refrigerated-chocolate-tastes-better/

https://www.independent.co.uk/life-style/food-and-drink/cadbury-chocolate-bar-fridge-pantry-cold-how-to-a9526636.html

Niamh Graham: Is a University Degree a Requirement for Career Success?

Is it the end of the world if you don’t go to university after school? Most people’s immediate answer to this question will be, ‘Yes, of course you need to go to university if you want to succeed in life and get a good job.’ In fact, this is not true: you don’t need a university degree. There are other ways to go about getting your dream job. In fact, many people that have become successful have never even set foot in a university; many more dropped out, having not lasted long enough to get their degree. This essay will explore the reasons why not going to university may be better than wasting four more years of your life stuck in a classroom. 

One of the main problems for people thinking of attending university is whether or not they can afford it and whether the cost is really worth it. To answer the question – spoiler! – it’s probably not. With maintenance loans and tuition fees to pay, graduates are finding themselves in thousands of pounds of debt before they have even applied for their first job. In 2021, students graduating from English universities will have incurred an average student loan debt of over £45k, compared to almost £28k in Wales, over £24k in Northern Ireland and just over £15k in Scotland. So, you really need to ask yourself: is the money you’re willing to spend going to be worth it? Even after taking that financial risk, there is still no guarantee that you will get a good, well-paying job. In fact, only 59% of those who graduated from Higher Education went on to full-time employment. If the job you think you want to do does not require a university degree and further education, the solution is simple: don’t go. It’s not worth the time, the money or the stress.

Speaking of stress, a Uni Health study found that 80% of those studying in Higher Education reported symptoms of stress or anxiety, while NUS surveys found that nine in ten students experienced stress. Would you want to spend an extra four years (minimum) doing more assignments and exams when it’s not entirely necessary? I wouldn’t. Taking work home is a fundamental part of university life. You are never finished. You always have something you should be doing instead of relaxing, taking a break or seeing friends and family. The result is that, in those moments when you’re not working towards your degree, you feel like you should be.

Nowadays, the likelihood of getting your desired career from the course you took at university is diminishing. The job prospects for graduates are decreasing at quite a significant rate. Average student satisfaction rates (which take into account factors like support from the university, quality of teaching and tutoring, course structure and, crucially, career prospects after graduating) have fallen consistently over the last few years. Last year, the government released sets of data about the career prospects attached to a degree, broken down by subject or institution of study. While some courses have great earning potential, the data showed that a large number of courses don’t lead to well-paid employment afterwards, even though that is why the majority of people choose to go to university in the first place. This is leading to an increasing number of people realising that they don’t need a degree to secure the jobs and careers they want.

Lastly, it is a well-known fact that some of the wealthiest and most influential entrepreneurs in the world dropped out of college and university. Steve Jobs, Bill Gates and Mark Zuckerberg are some people who left college before they could collect their diplomas. Lesson: you are still able to get a well-paying job without a degree. Here are some of the highest paid jobs in the UK that you can get without going to university: air traffic controller, digital marketing, SEO expert, white hat hacker, firefighter, offshore energy jobs, game developer, translator, police constable and entrepreneur. All of these jobs still pay a handsome amount of money and you can start them straight out of school. Your level of education does not need to define your career or your success. Just because you’ve got a degree doesn’t automatically mean that you are entitled to a higher salary: you have to earn respect in the workplace by showing what you can actually do and, of course, in some cases you learn much more on the job.

But I do also understand why some people choose to go to university. It gives you time to explore different career options and experience a taste of the different courses available if you haven’t decided what you want to do with the rest of your life. Going to university also gives you the chance to learn and obtain some very valuable life skills that you can take with you after you leave. Many of the people who go to university leave it blessed with long-lasting relationships with the people they met while they were there. The academic aspect is a big part of attending but it also gives you the chance to bond and connect with people who are likeminded and who enjoy the same interests that you do. And yes, there are of course a number of professions where you are required to have certain degrees before starting on the job.

In today’s world, there are so many more options and career routes that are available to ambitious individuals who are willing to roll up their sleeves and work hard. In fact, many of the professions that traditionally require a degree are now reassessing their requirements and route to qualification. The key to success is about having a focused approach to what you want to do and finding out as much as you can about that career. Speak to people who already do the job and be prepared to be flexible and to have the ability to adapt to circumstances and take advantage of opportunities when they present themselves. More often than not, these characteristics make for a much more employable candidate than one who has a certain combination of letters after their name.

Bibliography

https://www.topuniversities.com/student-info/student-finance/how-much-does-it-cost-study-uk

https://www.hesa.ac.uk/news/18-06-2020/sb257-higher-education-graduate-outcomes-statistics/activities

https://www.theguardian.com/education/2019/may/31/why-are-students-at-university-so-stressed

https://www.justit.co.uk/insight/4-reasons-why-less-people-are-going-to-university/

https://unihealth.uk.com/is-stress-at-university-always-bad/

https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/924353/The_impact_of_undergraduate_degrees_on_early-career_earnings.pdf

https://www.futurefit.co.uk/blog/jobs-without-a-degree/

https://www.cnbc.com/2018/08/16/15-companies-that-no-longer-require-employees-to-have-a-college-degree.html

https://www.statista.com/statistics/376423/uk-student-loan-debt/

https://unihealth.uk.com/

Helen Findlater: Let’s Fix This!

It is 2012, and in a clean, clinical room in Denmark, Angelea smokes crack cocaine to ease the chronic pain in her left leg – the result of a serious car accident.  She brings her drugs to the smoking room; they are tested for purity under a microscope.  Constantly supervised by nurses, Angelea feels safe, dignified and respected.  Most importantly, she is given further resources to help her; she has greater control over her future.

Mention the subject of drug addiction and most people think criminals.  Me?  I think victims: people with a medical condition that needs to be properly cared for.  Until we accept this definition the problem will only get worse.  So, how can we make it better?  How can we fix this?  One possibility, already having dramatic results on the continent, is fix-rooms, properly known as drug consumption rooms.  Fix-rooms are safe spaces where users can take illegal narcotics under supervision.  Fix-rooms already exist in Denmark, Switzerland, Holland and Canada.  Fix-rooms could help fix problems here in the UK.

The facility where we met Angelea earlier is called Skyen and it accommodates between 500 and 700 drug intakes per day.  This project has quite literally changed the way of life for over 5000 drug addicts in Denmark.  I would love to see similar projects running in the UK and I hope to convince you of the benefits of fix-rooms for the good of all.

Fix-rooms are safe and hygienic spaces for victims of drug addiction.  In the UK, in litter-strewn back streets and grubby hostels, addicts share drugs and needles.  The use of a fix-room gives drug addicts a haven, free from disease and infection.  By providing clean facilities and clean equipment (e.g. syringes), fix-rooms reduce injecting-risk behaviour (syringe sharing), ultimately reducing the risk of HIV transmission and fatal overdoses.

The UK now has the worst drug mortality rate in Europe: in 2017 Denmark recorded 237 overdose deaths whereas the UK recorded 3,256 – an unacceptable and avoidable loss of 3019 lives.  Scotland holds the unenviable prize of first place for the highest drug mortality rate in Europe – that’s a scandal of epic proportions and the fact that our UK neighbours, England and Wales, share third place is no consolation.  We are clearly getting our approach to drugs wrong in the UK.

Fix-rooms would be a step in the right direction for us, since there has never been a recorded death in any of the 78 fix-rooms that exist on the continent!  They employ highly trained medical staff who care for the needs and the safety of the victims of drug addiction.  If something goes wrong, they are there to administer antidotes and immediately resuscitate patients.  Surely in Scotland, with its harsher climate and notoriously poorer diet (which contribute to our poor health), there is an even greater need for facilities like these to help reduce our drug deaths?

Many would argue that fix-rooms encourage illegal drug use but this is nonsensical since no one (except a drug user) would appear at the door of what is effectively a clinic seeking to become a drug user!  Views like that are symptomatic of the failures in drug policy that fix-rooms would go a long way to repairing!  If we stopped criminalising addicts and increased their access to health and social care services then we might just start to get things fixed.

According to a survey conducted by the International Network of Drug Consumption Rooms, 78% of the professional groups represented in fix-room teams are social workers.  A Canadian cohort study showed that the use of a Vancouver fix-room was associated with increased rates of referral to addiction care centres and increased uptake of detoxification treatments.  Fix-rooms don’t diminish the importance of addiction treatment; they support, promote and provide care.

Wouldn’t you like to walk into the city centre or a park without worrying about discarded syringes?  Introducing fix-rooms significantly reduces public drug use, discarded syringes and the wider societal impact.  Before Skyen opened, as many as 10,000 syringes were found on the streets of Vesterbro – this fell to 1,000 within a year of its opening.  Not only would our streets be safer for everyone, but we would also significantly reduce the pressure on our emergency services.  There would be fewer calls to the police regarding public drug use, and fewer ambulance call-outs related to overdoses.  Fix-rooms have proven that their use can significantly reduce the financial and social burden on society associated with drug addiction.

To addicts, fix-rooms are a godsend; however, many in power believe they aren’t of any use despite the clear evidence to the contrary.  The Home Office has dismissed the positive prospects of fix-rooms, parroted the old lies about them becoming a focus ‘of crimes’, and is intent on continuing its plans for more treatment facilities and more focus on disrupting drug supplies – the much-fabled war on drugs that has failed time and time again!  Its words are also quite hollow, since it has repeatedly cut treatment budgets, causing a 26% rise in drug-related deaths in England (2013-2016).  Steve Rolles, a senior policy analyst at the Transform Drug Policy Foundation, which campaigns for the legalisation and Government regulation of drugs, said: “The idea that eradication or a drug free society can be achieved through enforcement is clearly ridiculous.”  The harsh reality is that the government is blind to the real problems of addicts and is determined to criminalise and demonise them rather than assist them in combating their conditions.  Short-sighted government policies that continue to criminalise drug addicts and condemn them to suffer in the crippling conditions associated with dependence mean that we will never solve the problem.  We need to change the focus from criminal to care.

By accepting the need for health services to take the lead on drug addiction and by funding fix-rooms, we could dramatically reduce the number of fatal overdoses and discarded syringes, and reduce the risk of HIV among vulnerable and desperate people in need of our support.  We could decrease the number of drug-related emergency call-outs and increase the number of addicts referred to treatment facilities.  I accept that there is no magic-bullet solution to fix this, but fix-rooms are a positive step in the right direction and they would, most certainly, dramatically reduce drug-related crime and drug-related deaths . . . and surely that’s worth fixing!

Bibliography

https://www.bbc.co.uk/news/magazine-38531307

How ‘fixing rooms’ are saving the lives of drug addicts – www.theguardian.com

Why ‘fix rooms’ might be an answer to Scotland’s drug… – news.stv.tv

UK government rejecting ‘fix rooms’ in Glasgow ‘stands in the…’ – www.dailyrecord.co.uk

http://www.emcdda.europa.eu/system/files/publications/2734/POD_Drug%20consumption%20rooms.pdf

http://www.emcdda.europa.eu/countries/drug-reports/2019/spain_en

https://www.independent.co.uk/news/uk/politics/dark-web-darknet-dark-net-war-drugs-futile-uk-largest-online-drugs-market-europe-silk-road-fbi-cannabis-cocaine-heroin-a7183141.html

Thomas Gillen: Reduce, Reuse, Recycle: How the people alone can’t stop Climate Change

Another doom and gloom headline flashes across your computer screen. The fifth Horseman of the Apocalypse, Climate Change, has trotted into town, cutting down the polar bears, scorching Greece and sunny Siberia, purging the ice sheets and pillaging the coastline, and only one word is left in its wake – you. Global Warming is one of the greatest questions of the 21st Century, threatening the delicate balance of entire weather systems and more as the average global temperature rises – and is constantly spun into the individual’s problem, throwing the public eye away from the politicians and corporations obfuscating the issue in the courts for their own selfish agenda. I feel that the corporatocracy of today is the harbinger of a bleak tomorrow in the face of a worldwide crisis.

The paragons of anti-intellectualism and downright scientific denialism among those able to effect change – the elected – are no small sign of this pervading problem in politics. With very few scientists going into political professions, parliaments are ruled by those who are poorly informed on crucial climate legislation and basic science. When Scott Pruitt, the current USA Environmental Protection Agency administrator in a major carbon emissions centre, is actively assisting the repeal of important legislation in the crusade against global warming, the environment is not in good hands. I personally feel the lamentable lack of scientific representation in government circles is hindering the ability of key countries to act against man-made climate change, and the public’s ability to make waves on these issues wanes because of it.

Not every government is so apathetic towards the world’s plight, but even so, they still engage in debatable practices. Nuclear power is a developing, and very promising, energy industry that is regularly demonised by some in the political sphere. The energy output of 6 grams of uranium-235 is roughly equivalent to that of a metric tonne of coal – and all you hear is Fukushima, Chernobyl! The European Union (EU) is a leading proponent of the 2015 Paris Climate Agreement, yet key members are still sceptical as the world’s hourglass runs ever drier – Germany’s reputation for efficiency is not reflected in an energy mix where renewables and nuclear barely cover more than its fossil fuel usage, and there is no clear plan for phasing out fossil fuels in the near future. For every green, glowing France, there is a soot-covered Argentina, and with greenhouse gases flooding from the energy sector, I think the nuclear fears being stirred by some political leaders are disingenuous and could have far-reaching consequences.

Renewables, such as hydropower, fare somewhat better, with a cleaner past than other alternatives, but even they are fraught with trouble. Scotland is practically a world leader in wind energy (‘Scotland is home to the biggest renewable energy resource in Europe. We will set ambitious renewable energy targets and government funding will support low carbon technologies, energy storage and transport alternatives’), yet the UK recently announced a 56% cut to funding in that sector of the energy industry when renewables are still in dire need of help – which once again reflects a running theme in the climate discourse: the flouting of progress in favour of short-term economic benefit.

There is, however, a price to all of these potential benefits. The start-up costs of these industries are high and not to be dismissed, with potentially billions – trillions, by some estimates – of pounds having to be invested in low-carbon methods to make any sort of worthwhile waves. Professor Gordon A. Hughes in Edinburgh painted the ever-so-cheery picture of £16 of energy at today’s prices going for £38.50 and more, and that is not even the tip of the iceberg when it comes to funding the ‘cheap’ alternatives. While both renewables and nuclear are relatively cheap to run once they are set up, they still have their own issues. Nuclear is potentially vulnerable to exploitation by terrorist organisations in both the first and third worlds, with Al-Qaeda allegedly having schematics for various nuclear facilities – the fallout of a dirty bomb alone is a high risk to innocent lives. There is a catch to all of that, though: the nuclear industry recognises this risk and has made preparations for this scenario, involving military intelligence and more. And fossil fuels, while cheap in the short term, have much larger costs. All of the environmental disasters, from tsunamis to heat waves to harsh winters, will cause much more damage than our worst nightmares – trillions of pounds of property losses, wars over what little scraps of oil can be gathered from depleted sources, and that is not even considering the greatest loss of all – life. When the dust settles, any cost now is going to seem like nothing.

Politicians, however, are not the only ones responsible – they are more a pawn of the greater culprit. The corporate impact on the environment is not to be understated: with 71% of all greenhouse gas emissions coming from 100 companies, including the likes of ExxonMobil and Shell, the regular adage of ‘drive less’ and ‘eat less meat’ loses its potency. The unfortunate truth of the matter is that a coordinated effort to phase out staples of society like meat is far down the road, if it comes at all, but the responsibility of these companies to reduce their emissions is still there – and while Big Macs are still in high demand, poor infrastructure and a lack of subsidisation in these industries is going to continue to fester like a tumour, putting profits above improvement. Personally, I’d rather not die to cow farts.

The constant shifting of blame in the climate debate sets a terrifying precedent, and it is not being addressed by the top brass with nearly enough force. The public’s responsibility to combat climate change should not be understated, but the complete lack of a unified vision and focus across the world is a much scarier thought. The Earth will always find a way to continue turning, and another extinct species – humanity – isn’t going to stop it.

Bibliography:

https://www.theguardian.com/sustainable-business/2017/jul/10/100-fossil-fuel-companies-investors-responsible-71-global-emissions-cdp-study-climate-change

http://www.lse.ac.uk/GranthamInstitute/research/

  https://www.ucsusa.org/global-warming/science-and-impacts/science/each-countrys-share-of-co2.html  

https://sciencing.com/about-6134607-nuclear-energy-vs–fossil-fuel.html

https://www.theguardian.com/business/2018/jan/16/uk-green-energy-investment-plunges-after-policy-changes  

https://www.nature.com/articles/d41586-017-07510-3  

https://www.ft.com/content/6c9a53f4-8597-11e7-8bb1-5ba57d47eff7  

https://greens.scot/policy/energy  

Rachael Eadie: Give it a Rap!

Rap music is everywhere: in the entertainment we consume, as background music in the shops and restaurants in which we go about our daily lives, and even in advertising for mainstream brands like Pepsi or Gap. It has become a global phenomenon, one of the most popular and lucrative music genres in the world, creating worldwide superstars and legions of adoring fans. Surely a force for good? Well yes, if your idea of positivity is explicit language, glorification of gang violence, the perpetuation of racial stereotypes, misogyny, drugs and a fixation on money and materialism. Are these values we really want to encourage? If rap only catered to a minority taste this wouldn’t be such a big deal, but since it is now the most popular music genre in the United States, part of the mainstream in western culture and rapidly increasing in popularity around the world, isn’t it time for some types of rap music to change their tune?

It wasn’t always this way. I struggle to understand how something so poetic in origin, rooted in the storytelling culture of Africa and used so successfully by early artists such as Grandmaster Flash as a vehicle for highlighting issues of injustice, oppression and poverty, has to such a large extent become so corrupted in its values, hijacked by the corporates and turned into a global money-making machine. Nowadays the mere mention of the words “rap music” conjures up too many negative images.

The objectification of women is a huge issue in some types of rap music, particularly the hardcore and “gangsta” sub-genres (which also happen to be the most lucrative ones). To my mind, the lyrics and the visual representation of women in these rappers’ videos are more often than not offensive. What kind of example is this setting for young women today? How many rap videos portray a strong, independent, intelligent woman asserting her authority over men? Instead all we ever see is a succession of submissive, scantily clad women portrayed as sex objects. If that’s all you’re exposed to when you’re young, you’ll start to think that it’s normal. In the twenty-first century we are surely beyond the point where the goal women set for themselves is to see who can be the most “bootylicious”. Particularly in the wake of the recent Harvey Weinstein scandal, glamourising the exploitation of women can only undermine the message of the #MeToo movement. There’s enough misogyny around already: the last thing we need is it being constantly blasted in our ears and shoved in our faces.

I also don’t get how, at a time when we are encouraging tolerance in so many other areas, many rap artists seem to get away with expressing sentiments and using words like ‘hoe’ and ‘n***a’ which, in any other context, would be considered racist, sexist or offensive to the point of being totally unacceptable.

Another area where some rap music creates controversy is the manner in which the lyrics glorify violence and glamourise criminal activity. Think of all the rap songs that latch onto the same depressingly recurring themes of scoring drug deals, knife crime, drive-by shootings and aspiring to be the next big gang leader. As Eazy-E raps in his song “Boyz-N-The-Hood”: “Little did he know I had a loaded twelve gauge/One sucker dead LA Times front page”. For some artists this does in fact represent the reality of their lives, as a few have found out to their ultimate cost, e.g. the east/west coast gang rivalry which claimed the lives of the rappers Notorious B.I.G. and Tupac Shakur. The irony, though, is that other rappers, Drake being one example, will happily create a “gangsta” alter ego for themselves for the purposes of commercial success when in fact they come from backgrounds a million miles removed from the deprived neighbourhoods of South Central LA. What angers me is that this is not only misleading but irresponsible. Many people idolise these artists and see them as role models, thinking that sort of lifestyle is something to aspire to and imitating their behaviour in the belief that it’s the cool thing to do.

So many artists in this genre seem to obsess about appearances and materialism, as if name-dropping designer brands, high-end luxury goods and top-of-the-range sports cars gives them some sort of kudos. Maybe if more rap was less about getting the latest Rolex and more about getting a decent set of values, it would set a better example for its audience. (But then, Kanye West didn’t get to be a billionaire by promoting the values of modesty, selflessness and caring for others: he got to be a billionaire by promoting his music and his trainer brand, Yeezy.) Yet this unhealthy fixation on designer “bling” can only serve to emphasise the gulf between rap’s megastars and their audiences, many of whom can’t afford to dream about the luxury Caribbean holidays and endless bling enjoyed by those they idolise. As Chuck D, leader of Public Enemy and one of the most prominent voices in politically and socially conscious rap music, has cleverly observed, it is hardly the stuff of Robin Hood that the route for many of today’s rap stars to achieving success and funding their own lavish lifestyles seems to be to exploit their own fan base, much of which lives in relative poverty.

It would be an oversimplification to suggest that all rappers subscribe to the language of crime, violence and misogyny. Yes, there are socially conscious rappers who denounce violence, whose messages are inspirational and who seek to challenge, instead of perpetuate, the stereotypes. There are voices promoting a message of love, peace and understanding rather than one of hate, tension and intolerance, but they are at risk of being drowned out. If rap is to return to its historical roots as a force for good for its ever-growing audience, it’s time to give more airtime to the likes of Frank Ocean and Stormzy and to call time on “gangsta” rap and its negative influences.

Bibliography: websites 

thecrimson.com

lyrics.com

impactofrapmusiconyouths.weebly.com
