
The Scholars' Journal has been printed on carbon balanced paper, offsetting 0.036 tonnes of CO2 through Verified Carbon Standard (VCS) reduction projects and thereby helping to mitigate climate change.
Universities are now specifically studying you – the students who have written in this journal and your peers – because you are (mostly!) Generation 'Alpha', a new generation soon to start applying to universities. Born between c.2009-10 and 2024, Gen Alphas are the largest single generation ever to exist: there are about two billion of you worldwide. You have grown up in an environment that is synonymous with technology: the year Gen Alpha started (2010) was the year the iPad was launched, Instagram debuted and 'app' was the word of the year. You have lived through the first modern global pandemic, and your worldview has forever been shaped by it. You have a heightened environmental awareness, in part because you also have the longest predicted life span of any generation so far, with some of you living into the 22nd century. You already have incredible brand and cultural influence as well as social media power, and, before this decade is out, as some of your generation start at university, you will have an economic footprint that is predicted to exceed $5.5 trillion.
What will you do with that power and possibility? I think the answers can be seen clearly in the topics you have chosen to write about in this journal and which clearly matter to you: from enhancing our medical quality of life to protecting our environment; from understanding the way our brains work and influence our behaviour to how we present ourselves to the world; from how we should engage with others to what sense we can make of the geopolitics that swirl around us. There is no shortage of hope for the future in reading your thoughts, arguments, concerns and ideas, which bear witness to your fantastic curiosity about yourselves and the world around you.
The question for institutions such as mine is how we can help you on your journey by delivering the kinds of experiences, skills and knowledge you want and need in a world in which AI seems to be able to do almost anything. And for me, above all, that is about how we can help you to experience being part of a global community of learners and practitioners, a melting pot of cultures, and to develop what is often termed 'global competency'. In an age when the world seems increasingly crazy, and in which we can all easily lose heart as to our individual power to effect change, the ability to engage successfully with people from a myriad of different backgrounds, cultures and experiences – with respect, openness and genuine interest – will be fundamental in empowering you to work together both to respond to problems and to make a difference.
As you take your next steps in your journey, with curiosity, determination, ambition, resilience and enquiry as Cokethorpe rightly encourages you to do, give some thought also to this. 2025 sees the end of Generation Alpha and the beginning of Generation Beta: those born between 2025 and 2039. And you will be fundamental in shaping the world they grow into and the people they become!
Professor Michael Scott
Professor of Classics and Ancient History, Pro-Vice-Chancellor (International)
The University of Warwick, UK
@profmscott
You do not have to go far to feel the crazy world Professor Scott refers to. Open your email: there is the keyboard-warrior's rant. Scroll through your socials: you are engulfed by those needing to be affirmed. Turn on your TV: it is a diet of disasters. It seems we are addicted to the easy answer, feel we deserve to be morally outraged, and claim an entitlement to be right. Can respect, openness and interest invigorate our curiosity about the uncertain and the unknown, and help us respond to problems so we can make a difference?
Yes: but a curiosity worth its name is the sum of many parts. Curiosity may lead us to enquire, but it requires sustained intellectual rigour to nurture it. Ambition is energising, but without determination in the face of failure, it may come to little. Should you be a scholar now, aspiring to be one in the future, or looking back on your past achievements, we hope this journal demonstrates the liberating wisdom of Immanuel Kant: ‘Dare to be curious'.
Miss Hutchinson and Mr Elkin-Jones Heads of Scholars
The Legal Drinking Age: Time for a Change?
Henry O'Brien - Lower Sixth
Is Free Will an Illusion?
Will Chandler - Second Form
How Close are we to a Dystopian Future?
Immy Harris - Fourth Form
Is Farming Undervalued by Society in the UK?
Florence Nixey - Second Form
What are the Effects of Conspiracy Theories?
Katy Stiger - Fourth Form
Communism: the Worst or the Best 'ism'?
Joe Norman - Lower Sixth
Should We Limit Freedom of Speech?
Will Hansen - Lower Sixth
Appendix
Additional references
Eva Graves, Fourth Form, Gascoigne House
Allen Scholar - Scholar Supervisor: Sam Farr, Upper Sixth, Swift House
Editor: Miss Hutchinson
One in 37 people worldwide will be diagnosed with Parkinson's in their lifetime. Fifty people are diagnosed every day in the UK alone. At least 901 people die every day. Parkinson's Disease is a neurodegenerative disease that presents itself through various symptoms. Neurology has made significant strides in understanding and treating movement disorders, particularly Parkinson's Disease. However, alongside this research comes an understanding that many different brain disorders fall under the umbrella term the general public knows as 'Parkinson's Disease'. Presenting through a broad spectrum of symptoms, these disorders (known as Parkinsonian diseases or disorders) include 'primary' conditions, such as Dementia with Lewy bodies, Progressive Supranuclear Palsy, and familial Parkinsonian disorders such as Huntington's Disease. There is also a 'secondary' subcategory, including drug-induced and toxin-induced Parkinsonism. This paper will focus on the 'primary' diseases – specifically Parkinson's Disease, Progressive Supranuclear Palsy, Dementia with Lewy bodies and familial Parkinsonian disorders. While most share common features such as tremors, rigidity and bradykinesia (impairment of voluntary motor control and slow movements or freezing), the underlying causes and progression of each can vary markedly. Sometimes, treating one symptom can even exacerbate others. This complexity means there is no single approach to diagnosis or treatment: the wishes of different families and patients may be driven by a whole host of personalities, priorities and symptoms.
Common similarities between Parkinsonian Diseases
Parkinsonian diseases exhibit overlapping symptoms, especially because they are all neurodegenerative diseases. This means they are chronic conditions that damage and destroy areas of the nervous system – chiefly the brain – over time. Their symptoms can be split into five categories: motor symptoms, cognitive symptoms, behavioural symptoms, sleep disturbances and autonomic dysfunction.
Motor symptoms refer to symptoms affecting movement and balance. Bradykinesia, the slowness of movement, is one of the three main motor symptoms associated
with Parkinson’s Disease, but can also be common in all Parkinsonian disorders. Along with bradykinesia, rigidity (muscle stiffness) and tremors are the other main motor symptoms of Parkinsonian diseases; tremors can however vary in presentation. Postural instability, or balance issues, are also very common amongst patients with Parkinsonian diseases.
Cognitive symptoms are caused by mild cognitive impairments, which are problems with a person's ability to think, learn, remember, use judgement and make decisions (Anon, n.d.). These impairments then lead to progressive cognitive decline: the gradual loss of thinking abilities such as learning and remembering.
The patient's experience: My grandmother's journey towards getting an exact diagnosis

In March 2020, my grandmother started having hallucinations. This was probably her first significant Parkinson's symptom. However, she did not tell anyone about them until around a year later.
Having struggled with her blood pressure throughout adulthood, she had some high blood pressure scares that meant she had to go to the Emergency Room (or Accident and Emergency, as it is commonly still called in the UK) a few times. Eventually, in spring 2021, she ended up in an Intensive Care Unit (ICU), not because of high blood pressure, but instead due to dangerously low sodium causing neck pain. We believe the low sodium was caused by a diuretic she was put on due to the high blood pressure. During the ICU stay, after severe hallucinations, she was diagnosed with hospital delirium. Looking back, doctors and family alike now acknowledge this was likely to have been early Parkinsonian symptoms.
My grandmother’s General Practitioner (GP) recommended going to a speech therapist after her repeated struggles to both think of and then form words. Part of the therapy included a cognitive test known as a MoCA test, in which she got an unexpectedly low score, indicating severe cognitive impairments and issues with executive function. Her therapist then recommended visiting a neurologist.
The subsequent visit to the neurology office in autumn 2022 included a battery of tests, at the end of which the doctors stated they were 90-95% sure she had Parkinson’s Disease but did not, or could not, specify which form of Parkinson’s. This highlights the challenges associated with Parkinsonian diseases, because there are very few concrete tests available during life. Everyone in our family sighed with relief when we had a diagnosis, but little did we know that what would follow would be an ongoing struggle with multiple carers, ideas from professionals and countless hours in numerous doctors’ and therapists’ offices.
She was almost immediately put onto carbidopa levodopa, one of the only available drugs used to treat primarily the motor symptoms of Parkinson’s Disease. Because dementia symptoms are also a large part of her disease, she was also put on a dementia medication called donepezil. The levodopa significantly improved her stiffness and allowed her to move more freely. After fairly positive results with the initial dose she was prescribed, the doctors decided to up her dosage to see if they could gain a little more function. However, at this point the negative side
effects, such as falling asleep even as she was putting food into her mouth and extremely wobbly legs, far outweighed any positives gained.
In February 2023, my grandmother started talking strangely (making nonsense sounds and stringing words together that had no connection) and having vision problems. Because these symptoms seemed as though they could be indicative of a stroke, she was taken to hospital again. After doing some tests and noticing difficulties with eye movement, the hospital neurologist suggested that my grandmother may have PSP. Following this hospital stay, she was determined to be too compromised to live on her own and began splitting her time between two of my aunts' houses.
However, when the idea of PSP was followed up with her neurologist, he was sceptical that this was the right diagnosis, and equally sceptical that it mattered one way or another as he didn’t believe changing the diagnosis would change the treatment plan. Despite this, my grandmother was sent to a neuro-optometrist to confirm, and they agreed that it was unlikely to be PSP. Regardless of these medical opinions, she has continued to exhibit some of the hallmark symptoms of PSP. This is significant because the life span of someone with PSP is shorter than most other Parkinsonian diseases. If families are made aware that PSP is a possibility, it will not be as shocking when they find their loved one suddenly at the end of their life.
Later, in the summer of 2024, she was put on another motor-symptom medication called amantadine, originally used to treat influenza but now often used for Parkinson's patients. Despite improving her physical state significantly, the drug caused such severe confusion, dementia and other cognitive impairments that she was taken off it within a week.
In autumn 2024, my grandmother had an online appointment with a dementia specialist. After some discussion, the specialist thought it probable that she has Dementia with Lewy bodies (DLB) and recommended getting a DaTscan, which would identify Lewy bodies but would not differentiate between whether they were causing DLB or PD. Our family was sceptical it was a good idea because of her negative history with hospitals and scans, and her neurologist was again not sure it would make a big difference to her treatment plan. However, the decision became less pressing when she started having falls and her general care needs reached a level where my aunts could no longer take care of her on their own.
Freya Vincent, Third Form, Queen Anne House
Allen Scholar - Scholar Supervisor: Ella Hogeboom, Fifth Form, Swift House
Editor: Mr Elkin-Jones
As humanity rapidly develops its societies, our advances in technology come at an ever-brisker pace: artificial intelligence, quantum computing and smart clothing, to name but three. Accompanying these is the biomechanical process of genetic enhancement and eukaryotic modification. Recently, though, as people have educated themselves about what genetic modification actually entails, the two sides of the debate – one in favour of genetic modification and the other against – have become clearer and simpler: either 'we should do it, because it helps people have food', as with the development of Golden Rice (a lifesaving, highly nutritious plant that reduces sight loss in impoverished communities); or 'we should not do it, because a gone-wrong experiment could wreck the environment', so outweighing the previous argument.
As always with humanity, the question is not if, but when.
Since the days when humanity looked up to the sky and longed to fly, or looked further and set its sights on claiming the Moon, we have been defying nature's boundaries. With this comes the responsibility to do so ethically, so as not to disrupt the delicate natural balance of the world – a responsibility at which, considering the climate crisis as only one example, our track record has so far been chequered, to say the least. When it comes to biomechanical engineering, a significant risk is that we will never truly know the impact a new species has on its environment until it is released. The basic laws of science give us an insight into the short-term effects; the long-term effects, however, are far harder to predict, with the range of risks growing as the species evolves.
Though many people claim to know what genetic modification is, most envision a Captain America-like transformation: step into a chamber, inject the super-serum, and out pops a perfect specimen of the species you are modifying. In reality, it is far more labour-intensive and time-consuming, given the trial and error needed to make a single organism successful. By crossbreeding plants and animals, our Stone Age ancestors realised they could boost the amount of food they produced (BBC Bitesize, n.d.). While, like our ancestors, it is possible to change the traits of foods naturally through selective breeding, this process takes a long time, and breeders may not know which change created the desirable advantage. Modern-day science is allowing us to improve this process. In basic terms, genetic
times more beta-carotene per plant than specimens grown in greenhouses (LSU, printed 2005; uploaded online 2015). This is advantageous to the local people growing Golden Rice in their communities: they do not need to spend extra money on a greenhouse, or on recreating greenhouse conditions, yet they get higher levels of vitamin A for consumers who are deficient in it.
Although GMOs can clearly be beneficial to human health, the legality and ethics of the matter are challenged by many, including the campaign group Greenpeace. Greenpeace led campaigns against genetically modified foods, which prompted 107 Nobel laureates to write to the organisation in 2016 asking it to abandon its campaign (Achenbach, 2016). In 2024, moreover, the Filipino Court of Appeals issued a cease-and-desist order against the growing of Golden Rice in the country, citing a lack of scientific certainty regarding its health and environmental impact (Bautista, 2024).
Although its definite impact on health has not been fully explored yet, humans should be allowed to genetically modify other species. The potential environmental impact is great, that is true, but as the saying goes, 'with great power comes great responsibility', and given the chaos humans have already inflicted on our planet, I believe that if we take that responsibility seriously, we should be allowed to genetically modify organisms. We have already shown that we can modify living cells to adopt desired traits in a safe, ethical way that benefits humans and causes no detrimental harm to the planet (Golden Rice being one example). Opponents claim that the environmental impact is harmful because it disrupts the balance that nature intended.
Unfortunately for those who stand against the creation and consumption of GMOs, there has been no major disaster (if any at all) that paints GMOs in a negative light. That does not mean one could not occur, but we have rules and regulations in place to prevent it. The benefits and rewards to be reaped greatly outweigh the vanishingly small chance of a
catastrophe occurring and harming the population. No GMO has made it into the wider world and caused a large (or even small) negative impact on the environment, which suggests there is no need for hostility towards them. Caution is still advisable, and it is natural for us to feel wary of something we know rather little about, given the many ways the genetic modification process could go wrong. Accompanying this caution, though, is the human desire to be curious and to learn more about our planet. This is what makes us different from other animals: the drive that pushes us to dare to dream and to make the scientific breakthroughs that really change the world. Our ability to engineer beneficial foods that aid humanity without harming the planet should not be feared, but celebrated.
So far, GMOs and the Green Revolution (the twentieth-century agricultural project that used plant genetics, fertilisers and intelligent irrigation to bring food to the hungry masses (Spanne, 2021)) have helped an estimated one billion people evade starvation and secured the livelihoods and jobs of many more (Ventura, 2022).
Following the release of improved crops that are drought-resistant, or that do not require pesticides and fertilisers, farmers have no longer needed to buy products to aid plant growth. As a result, farmers have saved money and achieved larger profits than in previous years working with non-GMOs. On a global scale, this is estimated to have saved 83 trillion US dollars and to have spared 223 million people from hunger by giving them a reliable food source (English, 2021).
As of now, there are many regulations governing the genetic modification process. These have been drawn up by the UK Competent Authority (the Health and Safety Executive and the Secretary of State for the Department for Environment, Food and Rural Affairs acting in coalition) to form the Genetically Modified Organisms (Contained Use) Regulations, or GMO(CU). These require a risk assessment to be carried out on all GMOs and GMMs (Genetically Modified
Ollie Black, Second Form, Feilden House
Scholar Supervisor: Finn Van Landeghem, Upper Sixth, Vanbrugh House
Editor: Miss Hutchinson
When considering the rise in diagnoses of neurodiversity, it is important to establish exactly what the term means. It is now used so broadly that there are many different definitions, some more technical than others. For example, the Oxford English Dictionary defines it as 'the range of differences in individual brain function and behavioural traits, regarded as part of normal variation in the human population' (OED Online, 2024). If we were to take this definition of 'normal variation', you could argue that a diagnosis may not be needed or useful at all. However, Harvard Medical School defines it as 'the idea that people experience and interact with the world around them in many different ways; there is no one "right" way of thinking, learning, and behaving, and differences are not viewed as deficits' (Baumer and Frueh, 2021). What this definition does so well is acknowledge that although there may be significant differences in how people's brains work, this is not a deficit: if you do have a diagnosis, there is nothing wrong with you; your brain just works slightly differently from what society statistically considers 'normal'.
Using this definition as a broad umbrella term covering ADHD, dyslexia and autism, we will examine to what extent there has been an increase in diagnoses. The figures reveal a startling rise: an article published on 19 August 2021 reported a 787% increase in recorded incidence of autism diagnoses between 1998 and 2018 (ACAMH). According to The Guardian, '80 years ago, autism was thought to affect one in 2,500 children. That has gradually increased and now one in 36 children are believed to have autism spectrum disorder (ASD)'. Around one in 57 (1.76%) children in the UK is on the autistic spectrum, significantly higher than previously reported, according to a study of more than seven million children carried out by researchers from the University of Cambridge's Department of Psychiatry, in collaboration with researchers from Newcastle University and Maastricht University.
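To make the quoted figures concrete, percentage changes of this kind can be checked with a couple of lines of arithmetic. A minimal sketch in Python (the conversion of the 'one in N' figures into rates, and the variable names, are illustrative, not from the cited sources):

```python
# Prevalence figures quoted above, expressed as rates per child.
old_prevalence = 1 / 2500   # 'one in 2,500 children', c. 80 years ago
new_prevalence = 1 / 36     # 'one in 36 children' today

# Percentage increase = (new - old) / old * 100
increase_pct = (new_prevalence - old_prevalence) / old_prevalence * 100
print(round(increase_pct))  # 6844 -- prevalence is roughly 69x higher

# The 787% figure for 1998-2018 reads the same way: a 787% rise means
# recorded incidence ended the period at 8.87x its starting level.
final_multiple = 1 + 787 / 100
print(final_multiple)  # 8.87
```

The point of the sketch is only that a "787% increase" describes growth relative to the starting level, not an absolute count.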
There have been a number of different factors suggested to explain this increase. For example, Mrs Rushton, a Learning Support Teacher at Cokethorpe School, suggests that one of the leading factors is 'a greater acceptance of neurodiversity' (Rushton, 2024) throughout society;
whereas health writer Julia Ries believes that air pollution may play a role in the increase (Ries, 2023).
This essay will attempt to evaluate three of the most discussed driving factors behind this increase: access to and reliability of testing; a growing awareness and understanding of neurodiversity; and the changing physical environment. Although all three factors are undoubtedly important, this essay will demonstrate that the most significant of the three is the change in physical environment, as it is the only factor that is arguably creating new cases rather than diagnosing pre-existing ones.
Testing for neurodiversity is the best it has ever been, and more accessible than ever. We can see this in the rising demand for learning support teachers to help teach neurodiverse students: Education, Health and Care (EHC) plans increased by 9.9% in 2021 alone, with a 23% rise in initial requests for an EHC plan since 2020 (Protocol Education). According to the British Dyslexia Association (BDA), 'both specialist teachers and psychologists can diagnose dyslexia'. Inevitably, this improved access to testing and earlier diagnosis or intervention will have a significant effect on the number of diagnoses. However, the BBC claims that only a tenth of neurodivergent students receive a diagnosis, and access to testing remains an issue, especially for less privileged students (BBC, 2019).
Some people are finding that, if their child is not in private education, it is quite hard to get tested or be referred for testing. Some children go through their entire school life needing a diagnosis but unable to be tested or diagnosed because of a lack of resources in their schools. In addition, larger schools can struggle to notice when a child is exhibiting signs of neurodivergence: 'Out of 8.7 million school children in England, the report estimated about 870,000 of them have dyslexia but fewer than 150,000 were diagnosed, according to Department for Education figures' (BBC, 2019). This is not just down to accessibility but
Bing Brown, Fourth Form, Queen Anne House
Scholar Supervisor: Alex May, Fifth Form, Vanbrugh House
Editor: Miss Hutchinson
How do you think the most intricate melodies of Beethoven and Liszt relate to the first scale you will learn on an instrument? They all rely heavily on the language of maths. Although they seem far apart, the worlds of music and maths have always had a strong bond, from the Ancient Greeks, some of the best minds in human history, to Taylor Swift's hit song Shake It Off. All of them use mathematical patterns, whether they are aware of it or not. Numbers, sequences and symmetry work together to create the complete sound we find so beautiful and which all people can appreciate, whether it be Chopin's Nocturne in E flat major or, more likely, Shake It Off. However, some people believe that music and maths come from completely different worlds. In this essay I will explore how exactly maths influences and defines the music we all listen to.
It is said that one day Pythagoras – the man behind one of the most famous equations in the world, a² + b² = c² – was walking past some blacksmiths hammering away on their anvils. He heard hammers A and B strike together and noticed that this produced a pleasing sound, or in musical terms consonance. But when hammers B and C struck together, they sounded not so pleasant (dissonance). Hammers A and C produced another pleasing sound, and hammers A and D produced a song-like sound (Waldron, 2024). Pythagoras went to investigate and discovered that the hammers weighed 12, 9, 8 and 6 kilos respectively. A and D were in a ratio of 2:1, the ratio of an octave, in which the frequency is doubled or halved: an octave above a 440 Hz sound is 880 Hz, and an octave below is 220 Hz. B and C weighed 9 and 8 kg, and their relationships with D were 3:2, a perfect fifth, and 4:3, a perfect fourth (Guthrie and Fideler, 1987). The dissonant pairing was a 9:8 ratio, which is a major second. Nature has a hidden order, and it extends to every corner of the world. Esperanza Spalding once said that 'solving an equation in maths is like mastering a piece of music. Nothing gives the same satisfaction as the end of a piece or equation, but you can also make the same mistakes, you can hit a wrong note and create dissonance or do a wrong calculation. They both are unnoticeable until it is too late, but when they are right the outcome is a feeling like no other' (Ancient Maths and Music).
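The whole-number ratios in the hammer story translate directly into frequencies. A minimal sketch in Python (the 440 Hz reference pitch and the helper name are illustrative choices, not from the legend):

```python
# Frequencies of Pythagoras' intervals, each a simple whole-number
# ratio applied to a base frequency.
RATIOS = {
    "octave": (2, 1),         # hammers A and D (12 : 6)
    "perfect fifth": (3, 2),  # hammers B and D (9 : 6)
    "perfect fourth": (4, 3), # hammers C and D (8 : 6)
    "major second": (9, 8),   # hammers B and C -- the dissonant pair
}

def interval_frequency(base_hz: float, interval: str) -> float:
    """Return the frequency of `interval` above a note at `base_hz`."""
    num, den = RATIOS[interval]
    return base_hz * num / den

print(interval_frequency(440, "octave"))         # 880.0
print(interval_frequency(440, "perfect fifth"))  # 660.0
```

Running it on a 440 Hz A reproduces the doubling rule for the octave quoted above; the fifth and fourth fall proportionally in between.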
You have all most likely heard the big names in music: Debussy, Beethoven, Liszt. While they may not have been deliberately using maths to make music, maths is there all the same. The most important examples, particularly in classical music, are the golden ratio and the Fibonacci sequence. For those of you who do not know what the golden ratio is, imagine two lines, A and B, where A is longer. The golden ratio occurs when A/B = (A + B)/A; its value is approximately 1.618, and it appears everywhere in nature, in the spirals on some animals and in humans, and, most importantly here, in music (Schielack, 1987). If a piece of music is ten minutes long, it would be safe to assume that the climax of the piece falls about six minutes in. This creates a very natural, flowing melody, and putting the emotional climax of the
piece at that position makes it incredibly pleasing to the ear. Debussy, one of the most celebrated composers in history, uses the golden ratio, apparently quite unknowingly, in his most recognisable pieces (Howat, 2009). This further illustrates how clever these composers were; to have the intuition to place everything so perfectly never ceases to amaze. Closely related to the golden ratio is the Fibonacci sequence, which matters to the way the structure of a melody is arranged. For example, a musical phrase lasting three beats could be followed by a phrase of five beats and then a phrase of eight beats. This gives the piece a natural movement and groups things nicely. The sequence is also prevalent in time signatures: a bar of 5/8 is typically grouped as three quavers plus two, groupings built from numbers in the sequence.
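The golden-section climax and the link between the Fibonacci sequence and the golden ratio can both be shown in a few lines. A sketch (the ten-minute piece is an arbitrary example, as above):

```python
import math

# The golden ratio phi satisfies A/B = (A + B)/A; its value is (1 + sqrt(5)) / 2.
PHI = (1 + math.sqrt(5)) / 2  # ~1.618

def golden_climax(duration_minutes: float) -> float:
    """Position of the golden-section point in a piece of the given length."""
    return duration_minutes / PHI

# A ten-minute piece peaks a little after the six-minute mark:
print(round(golden_climax(10), 2))  # 6.18

# Ratios of consecutive Fibonacci numbers converge on phi:
fib = [1, 1]
while len(fib) < 12:
    fib.append(fib[-1] + fib[-2])
print(fib[:8])            # [1, 1, 2, 3, 5, 8, 13, 21]
print(fib[-1] / fib[-2])  # ~1.618
```

This convergence is why the two ideas keep appearing together: phrase lengths drawn from the Fibonacci sequence naturally approximate golden-ratio proportions.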
Jazz, similarly, is all about patterns. It is some of the most complex music to play, because of the intricate patterns and the sheer mental ability needed to improvise on the spot. A big element of jazz is syncopation, which you can think of in terms of fractions of the beat. Think of a standard 4/4 melody: it goes 1-2-3-4. Simple, right? To syncopate a rhythm, however, you have
to time everything on the 'off-beat'. Staying in 4/4, the melody is often split into crotchets (one beat), minims (two beats) or quavers (half a beat). Usually you would emphasise the first, second, third and fourth beats. Syncopation requires you to emphasise something different: not beat one, but perhaps the 'and' halfway through beat one, or, if you are feeling confident, a quarter of the way through it (Brenna Yan Tin, 2024). A dotted crotchet starting on beat one carries into beat two, after which you emphasise a short quaver; this unsettles the listener's perception of the pulse, making the music all the more interesting. Another extremely mathematical side of jazz is improvisation, a key part of which is the jazz scales. There are fifteen jazz scales, but for the purposes of this article I will focus on the main three: the major scale, the Dorian mode and the Mixolydian mode. The major scale, although not a particularly complex idea, is the basis of all harmony. It is built from semitones (such as F to F sharp) and tones (such as F to G). To construct a major scale, choose a starting note and go up in the pattern tone, tone, semitone, tone, tone, tone, semitone. It works because rearranging these steps in any other way disrupts the natural balance of harmony and tension: the placement of the semitones keeps the scale stable, so it does not sound discordant or less pleasing to the ear than it should (B.W, 2024). The second is the Dorian mode, which has an identical structure to the major scale except that it starts a tone higher; in C major, the Dorian scale would start on D. The main difference is that instead of a major third it contains a minor third, which makes it technically minor; unlike the natural minor scale, however, it has a major sixth.
This gives it a very identifiable jazzy feel; in terms of intervals it goes tone, semitone, tone, tone, tone, semitone, tone. It is an extremely important scale for jazz improvisers, and without it everything would sound more pedestrian and boring (From Subject to Style, 1986). Finally, the Mixolydian mode is built on the fifth note of the major scale; going back to C major, the Mixolydian scale would start on G. Unlike the Dorian mode it keeps the major third, meaning that as a whole it is major. However, it contains a flattened seventh, which gives it a bluesy, unfinished feel: the flattened seventh sits exactly ten semitones above the tonic, ever so slightly dissonant against it, giving quite a 1920s film-introduction feeling (Van der Merwe, 1989). In intervals it is tone, tone, semitone, tone, tone, semitone, tone. Again, the placement of the semitones makes the scale lighter than the minor scale but stops short of the resolution of the major scale (Mel Bay, 1991).
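The three interval patterns described above can be turned into note names mechanically, by walking steps of one or two semitones around the twelve-note chromatic circle. A minimal sketch (note spelling uses sharps only, for simplicity):

```python
# T = tone (2 semitones), S = semitone (1 semitone).
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

PATTERNS = {
    "major":      [2, 2, 1, 2, 2, 2, 1],  # T T S T T T S
    "dorian":     [2, 1, 2, 2, 2, 1, 2],  # T S T T T S T
    "mixolydian": [2, 2, 1, 2, 2, 1, 2],  # T T S T T S T
}

def build_scale(root: str, mode: str) -> list[str]:
    """Return the eight notes of `mode` starting on `root`."""
    i = NOTES.index(root)
    scale = [root]
    for step in PATTERNS[mode]:
        i = (i + step) % 12       # wrap around the chromatic circle
        scale.append(NOTES[i])
    return scale

print(build_scale("C", "major"))      # ['C', 'D', 'E', 'F', 'G', 'A', 'B', 'C']
print(build_scale("D", "dorian"))     # ['D', 'E', 'F', 'G', 'A', 'B', 'C', 'D']
print(build_scale("G", "mixolydian")) # ['G', 'A', 'B', 'C', 'D', 'E', 'F', 'G']
```

D Dorian and G Mixolydian come out using exactly the white notes of C major, which is the point made above: the modes are the major-scale pattern rotated to start on a different degree.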
However, some may argue that there are people who are incredibly musical and yet cannot do any maths. While this may be true, there is an explanation: different parts of the brain are needed for each. While music does contain lots of maths, it also requires an emotional nuance that does not call on mathematical skills. Similarly, maths requires abstract reasoning and logic, which are not necessarily present in music. Musicians tend to have a more developed auditory cortex and very good fine motor skills (MIC, 2014). Mathematicians, on the other hand, rely on the part of the brain responsible for logic and reasoning, the frontal lobe (Dehaene, 2011). Whatever the musician's maths ability may or may not be, one thing is
clear: music is something that requires years and years of dedication, and without that the musician will never be truly magnificent, however much innate talent they possess. This leads some people to give up, either because they are not seeing enough progress or because they simply do not enjoy it; others just prefer maths and so dedicate more time to it. Maths also requires more of a formal education from a teacher who really knows what they are doing; in music, teaching is also incredibly important, but musicality is innate and can be developed.
This article started with the Ancient Greeks and ended in the present day: that's 500 BC to 2025, some 2,525 years. The Greeks were the first to truly understand this complex relationship, and they laid the basis for all later discoveries, like the Fibonacci sequence. It is through them that we are able to hear the wonderful music we have today. Equally, Debussy and other famous composers were pioneers without even knowing it; their music, which perfectly embodies maths they may never have studied, further cements my point about the connection. And finally, jazz: the most complex form of music, demanding an amazing amount of quick thinking. The scales and the way players improvise are all maths, and being able to do that on the spot really does require a very clever individual who knows exactly which patterns to play at exactly the right time. Pythagoras once said, 'There is geometry in the humming of strings, there is music in the spacing of spheres.' This perfectly demonstrates my point: the two co-exist, and without maths there would be no music.
References:
Berle, A (1991) Mel Bay's Encyclopaedia of Scales, Modes, and Melodic Patterns. Pacific, Missouri: Mel Bay Publications. Accessed 3 December 2024.
Dehaene, S (2011) The Number Sense: How the Mind Creates Mathematics. Oxford: Oxford University Press. Accessed 5 December 2024.
Guthrie, K (1987) The Pythagorean Sourcebook and Library: An Anthology of Ancient Writings Which Relate to Pythagoras and Pythagorean Philosophy. Grand Rapids, Michigan. Accessed 2 December 2024.
MIC (2014) Science Shows How Musicians' Brains Are Different from Everybody Else's. Available at: https://www.mic.com/articles/96150/science-shows-how-musicians-brains-are-different-from-everybody-elses. Accessed 5 December 2024.
NOVA (2015) Ancient Maths and Music. PBS Learning. Available at: https://www.pbslearningmedia.org/resource/nvmm-math-mathmusic/ancient-math-music/. Accessed 2 December 2024.
Howat, R (2009) Debussy in Proportion: A Musical Analysis. Cambridge: Cambridge University Press. Accessed 3 December 2024.
Schielack, VP (1987) The Fibonacci Sequence and the Golden Ratio. Reston, Virginia: National Council of Teachers of Mathematics. Accessed 3 December 2024.
Taruskin, R (1986) From Subject to Style: Stravinsky and the Painters. Berkeley: University of California Press. Accessed 3 December 2024.
Van der Merwe, P (1989) Origins of Popular Style: The Antecedents of Twentieth-Century Popular Music. Oxford: Oxford University Press. Accessed 5 December 2024.
Wilson, B (2024) Lesson on How to Construct a Major Scale. Taught 20 May 2024.
Waldron, A (2024) Lesson on Pythagoras's Philosophy. Taught 14 October 2024.
Yan Tin, B (2024) Lesson on How to Play Syncopated Rhythm. Taught 12 March 2024.
Images:
Dolo Iglesias (2017) Person playing piano. Available at: https://unsplash.com/photos/person-playing-piano-FjElUqGfbAw
and rewards can be triggered by many different things, whether it is winning a football match, eating a longed-for chocolate bar, or scrolling through TikTok. But how many of you know that these feelings are ultimately caused by a chemical called dopamine: 'a biological chemical that gets released in neurotransmitters when there is an electrical impulse which allows the dopamine to bind to specific receptors on a receiving neuron' (NeuroLaunch, 2024a)? This fascinating reaction impacts our lives both directly and indirectly every single day.
To explore how dopamine impacts our lives, we shall delve into how our attention can be affected by dopamine levels, what 'fake' and 'real' dopamine are, how dopamine affects our cognitive control, and an experiment I conducted to see whether a 'human hard reset' is possible.
To begin with, we have to know about the other biological chemicals in our brains. There are five different chemicals at work in the human brain, the two most significant being dopamine and serotonin. These chemicals are remarkably similar but have massive impacts on our well-being.
Serotonin affects our moods: a deficiency of serotonin can lead to anxiety, depression and numerous other mental illnesses, which around one in eight people currently suffer from. Dopamine, meanwhile, affects our motivation: when there is a great imbalance of dopamine, we notice a lack of concentration and reduced pleasure in normally enjoyable activities (Simply Psychology, 2021).
Completing a task sounds simple: you just need to work and stay focused on the job. However, people's attention spans have been in decline. Researchers measured the attention span of the average person in 2004, 2012 and 2017. In 2004 the average attention span was two and a half minutes, a good amount of time, but by 2012 that number had halved to one and a quarter minutes. By 2017 it had reduced further, to 47 seconds (Mark, 2023).
Many factors may be making it harder for us to concentrate, and people hold different opinions about the cause. In this section of the article, we will explore one possible reason for this significant drop.
Smartphones play a significant part in our everyday lives, and most people who own one now use some form of social media, which does much to distract us from our tasks. Eight hours of sleep a day is needed for your health; any time lost from those eight hours is called sleep debt. If you only get seven hours of sleep a day, your sleep debt starts accumulating and your focus reduces. What researchers discovered was that people with sleep debt spent more time on lightweight activities like scrolling endlessly through social media.
When we see content that we enjoy watching, dopamine is released to make us feel good, but the brain does not stop there. According to Pauline (2024), 'The dopamine which gets released creates a feedback loop which reinforces behaviours leading to positive experiences, encouraging us to continue'. These positive experiences can last for minutes, making us want more.
Here is where the satisfaction of social media differs from that of a football match. Winning a football match is a wonderful achievement that an individual may have worked hard for; however, one is not guaranteed to win, or even draw, the match. Social media, though, wins the game more or less every time. When Facebook was first created, all feeds were in chronological order, making it harder to find what one liked. In 2009 Facebook changed how it displayed feeds, using a new, improved algorithm to show users more of what they liked and so keep them hooked (Social Media Today, 2015). This algorithm increases the amount of dopamine released, because the individual is always seeing what they want to see.
There are two ways in which dopamine is acquired: 'fake' dopamine and 'real' dopamine. Fake dopamine is obtained by doing the easiest things, where you do not really have to work for it, like drinking alcohol, taking drugs, playing video games and watching inappropriate content. These activities are simple and easy; they do not require much effort and are right at our fingertips. Matson asserts that '[fake] dopamine is ruining people's potential and creating lives filled with regret instead of a life of passion and purpose' (Matson, 2022). Real dopamine, however, is achieved through hard work, determination and even setting goals for the future. It requires us to think and stimulates our brains, keeping us mentally active.
The first time I heard about 'fake' and 'real' dopamine, I assumed that the only difference must be the way you obtain it. But there is actually a noteworthy difference. There are two places where dopamine is received in the brain: the dopamine desire circuit (where 'fake' dopamine is received) and the dopamine control circuit (where 'real' dopamine is received) (Matson, 2022). Acquiring 'fake' dopamine from, let us say, eating chocolate slightly deflates your dopamine control circuit. There is nothing to be alarmed about in itself; what happens is that your brain starts to want more of this easily achievable dopamine. This explains why we lack motivation after receiving 'fake' dopamine: our brains want more of the 'fake' dopamine that is so easy to access.
The term 'cognitive' refers to our feelings, emotions and thoughts. NeuroLaunch state that obtaining too much 'fake' dopamine has the potential to impair our cognitive state; different sources of dopamine influence our behaviour when setting and achieving goals. If our satisfaction is based on 'fake' dopamine, we will think along those lines: a gamer who plays many video games will come to think in ways somewhat shaped by the games they play. Eliminating 'fake' dopamine entirely is impossible, and most likely not even good for one's mental health, but reducing it in certain cases is better for the individual, just like setting ourselves a screen-time limit or eating healthily.
We are now going to look further into how reducing 'fake' dopamine has the potential to improve our lifestyles; this is something I have tried myself, to improve my own way of living.
I conducted an experiment at home seeing whether, if we as a family cut back on our ‘fake’ dopamine, it would improve our motivation, cognitive control and attention.
The rules could not be more straightforward:
1. No caffeine/alcohol
2. Only healthy food
3. No digital entertainment
4. 30 minutes meditation a day.
This experiment was conducted on my family and me and lasted for one week. Before the rules came into play, we had one week of normality in which we measured our motivation levels and attention to tasks. This was measured by asking a few questions each morning and evening, both before and while the rules were in play.
Before the experiment began, I made my predictions: that, on average, our attention spans would increase, the temptation for digital entertainment would decrease, and the urge for caffeine would significantly decrease for those who drink coffee.
For the first few days, I must admit, it felt a little unusual, but it was not difficult. As the week progressed, it became more normal and incorporated itself into my lifestyle.
Many companies and universities have studied whether meditation can actually improve our attention spans, and in our experiment it was time to see whether this would prove to be so. The Association for Psychological Science reported that, over a long period, task performance improved in their experimental subjects. This was not the only source: Kurtzman (2019) showed that a 'digital meditation program significantly improved attention and memory in healthy young adults' within six weeks.
Some of the results from the experiment were that:
1. There was no longer a craving for sweet treats and caffeine
2. Afterwards everyone felt more relaxed than before
3. Some people felt more focused and motivated after the experiment.
These are the intriguing results from the experiment showing that doing a little can change a lot.
Personally, I believe the most interesting result was that the subjects no longer needed caffeine after the experiment. Caffeine is a chemical which stimulates the brain and metabolism, making you more alert. At the beginning of the experiment we could see that the people who drank caffeine every day tired slightly more quickly than before, but as the experiment progressed the tables turned. For a few weeks after finishing, my father didn't drink any caffeine because he simply didn't need it any more.
The reality is that dopamine is a very important chemical, vital for the survival of humankind. Depending on how you receive your dopamine, whether 'fake' or 'real', there are potential side effects, both mental and physical.
References:
Please see Appendix on page 62.
Images: Bennett, T (2018) iPhone X beside MacBook. Available at: https://unsplash.com/photos/iphone-x-beside-macbook-OwvRB-M3GwE
Editor: Mr Elkin-Jones
It is as universal as it is complex: pain is one of the most profound feelings, often felt beyond words. Although universal and affecting millions worldwide, it is unique to each person: it can vary from the sting of a paper cut to the debilitating agony of chronic illness. Yet despite being universal, the phenomenon of pain has been debated intensely, dividing philosophers, neuroscientists and medical practitioners alike. Could pain be just a trick of the mind? Or is it an important biological response with a critical evolutionary purpose? Either way, it plays a critical role in human life, alerting an individual to possible danger. This essay will present a description of the unavoidable experience of pain and arguments for and against it as an illusion. Concepts from biology and psychology will highlight the complexity of pain and its significance in the human experience.
To begin this discussion on the paradox of pain, it is essential to understand the physiological mechanisms that construct this sensation. Beneath the smooth unblemished surface of our skin, which appears simple to the naked eye, is a complex web of interconnected roots and channels that holds a network of nociceptors. These nociceptors can be activated by a variety of prompts: mechanical (knife cut), thermal (kettle burn) and chemical (acid). These then send signals through the peripheral nervous system to the spinal cord using nerve fibres. This introduces the ‘first pain’ – the quick reflex that draws you away from the danger. While this occurs, the signals are ultimately passed to the brain. Here, the signals are processed and interpreted, resulting in the perception of pain, known as ‘second pain.’ The result of the pain pathways and intricate network of signals can have variations in effect, from sensitivity and response to the differing pain thresholds between individuals (Murphy, J. 2023).
Neuroscientists have characterised pain into two categories: acute pain (which serves as protection) and chronic pain (which is more complex and not yet fully understood). Chronic pain differs from acute pain in that it persists after an injury has healed, and it may change aspects of your central nervous system that influence your tolerance of pain sensations. These ramifications highlight that, while the experience of pain is real and necessary in everyone's lives, it consists of layers that cannot be explained solely by biological factors.
The sensation of pain is not entirely determined by biological processes; psychological factors have been shown to play a significant role as well. This is demonstrated through the mind-body connection, which illustrates how someone's emotional state can impact their experience of pain. For instance, individuals suffering from anxiety or depression often report heightened pain sensitivity, suggesting that psychological well-being is intricately linked to pain perception.
Lorimer Moseley, a clinical and research physiotherapist and Professor of Clinical Neurosciences at the University of South Australia, has dedicated the majority of his research to the brain's role in chronic pain. His theory posits that pain perception is mediated by cognitive and
emotional factors, which serve as a lens through which the illusory aspect of pain can be examined. In his view, pain is influenced by an individual's expectations and past experiences. For example, a person may feel a greater level of pain when anticipating a negative incident (Psychoneuro, 2016).
For example: imagine you are in a park enjoying a sunny day; you accidentally graze your knee on the rough grass after tripping. Your body's somatic sensory receptors send signals through nerve fibres to your spinal cord and then to your brain, indicating that something has happened to your knee. Normally, this would just result in a mild sensation of discomfort.
Now, let us change the context: Earlier that week, you had a bad fall while playing football, and you remember feeling intense pain in your knee. This memory is recent and still current in your mind. When grazing your knee this time, the sensation is amplified because your brain immediately remembers your previous injury. The nociceptors in your knee heighten your perception of pain, sending messages to your brain. The negative past experience has caused your brain to emphasise the potential threat.
Even though the impact from the graze is minor and does not cause any real injury, your brain interprets the experience through the lens of your past pain. Therefore, instead of feeling a simple discomfort, you experience sharp pain in your knee. This situation showcases how your brain's memory and context can influence your pain perception. This proves that pain is often more about what your brain believes is happening than the actual state of your body (Psychoneuro, 2016). This theory suggests that the context and cognitive framing of pain can alter one’s perception, raising questions about the extent to which pain is 'real' versus constructed by our minds.
One of the most compelling pieces of evidence that pain perception may be altered by psychological processes is the placebo effect: the phenomenon whereby individuals experience physical improvements after receiving a treatment that has no therapeutic value, solely because they believe it might work. Research has shown that placebos decrease pain levels across diverse conditions, from headaches to chronic pain syndromes.
The mechanisms behind the placebo effect are complex neurobiological responses. Many clinical trials have shown that placebos can evoke genuine physiological responses that reduce pain, further suggesting that the experience of pain can be shaped by one's expectations and beliefs: placebos can trigger the release of endorphins, along with changes in brain activity patterns related to pain perception. This exemplifies the idea that psychological factors intertwine with physiological sensations, showing that pain can be formed by cognitive processes regardless of whether the underlying injury remains unchanged (LeWine, H. 2017).
Distraction is another key factor that can influence an individual's perception of pain, providing further evidence that pain may not always align with physical reality. There are various forms of distraction that can change pain, be they cognitive, visual or auditory. Clinical trials have shown that techniques such as going for a walk, listening to music or doing puzzles shift focus away from pain sensations and decrease their intensity.
The Gate Control Theory, proposed by Ronald Melzack and Patrick Wall in 1965, offers a perfect example of how distraction influences pain. It is a nuanced understanding of how pain is perceived in the body (Physiopedia, 2024). The theory suggests that the spinal cord holds a 'gate' that can either allow or refuse pain signals from reaching the brain. There are two distinct types of activity that play a role: C fibres that send pain signals, and A-beta fibres that carry non-painful sensory information. When a painful stimulus occurs, C fibres are activated and open the gate, allowing pain signals to be processed. However, the simultaneous stimulation of A-beta fibres can close the gate. This limits the feeling of pain (Wendt, T. 2022).
The 'Proof’ of Pain in Action: an Albanian Anecdote
Last summer, my friend Sarah and I went hiking in Albania. About halfway up the trail, she tripped over a rock and fell, scraping her knee painfully. As expected, she winced and clutched her knee. Then something surprising happened. Instead of standing there in pain, she laughed it off and at once started to massage her knee while joking about her clumsiness.
To our astonishment, within moments, she was back on her feet and ready to continue the hike. I asked how she was feeling, bracing for a complaint about pain. 'Surprisingly good!' she said. 'I guess rubbing it helped!'
Reflecting on that moment, I realised it was a perfect illustration of Gate Control Theory in action. By rubbing her knee, Sarah activated the larger nerve fibres that blocked some of the pain signals from reaching her brain. This not only distracted her from the initial discomfort but also empowered her to move past the minor injury.
Other psychological factors such as attention and emotional state can also affect this gating mechanism. Sarah’s experience illustrates that pain is a complex experience shaped by both physiological and psychological components.
The current crisis of opioid use presents a critical reason for these discussions of pain management. Opioid drugs were the most prescribed dependency-forming medications in England in 2021/22, with 39.6 million items at a cost of £307 million (Stanford Health Care, 2021). With the rise in opioid prescriptions to manage both acute and chronic pain, concerns have developed regarding over-reliance on pharmaceutical interventions. This is increasingly alarming as addiction and other negative outcomes continue to rise. The crisis emphasises the importance of an understanding of pain that goes further than masking symptoms with drugs.
Exploring alternative pain management strategies is essential to combat this global issue. Given the evidence that pain can be influenced by psychological and cognitive factors, non-pharmaceutical approaches, including cognitive-behavioural therapy, mindfulness and lifestyle modifications, offer promising avenues for reducing pain while minimising the risks associated with opioids. Researchers have created tools that align with this pain management theory. For example, Buzzy, developed by Pain Care Labs, is a tool that uses a dual approach of high-frequency vibration and cold therapy, following the Gate Control Theory (Pain Relief Research, 2023). It works by stimulating the large nerve fibres into 'closing the gate'. The product not only offers a safe, effective method of pain relief but also reduces the need for opioids. Pain Care Labs is conducting trials to assess Buzzy's effectiveness in managing various forms of pain, including low back pain, chronic pain and post-surgical recovery; a main focus of these studies is how Buzzy can enhance patient comfort during needle procedures such as injections or blood draws. Developing technology, along with continued research into the Gate Control Theory, can open new doors to drug-free pain management in the future. Furthermore, acknowledging that pain is not merely a physical sensation but a complex process of interacting factors allows healthcare professionals to develop more comprehensive treatment plans that address both the physiological and the psychological background of pain.
In conclusion, examining pain as an illusion uncovers a multi-dimensional phenomenon that goes beyond a simple biological explanation. Psychological factors and cognitive processes - as well as the context of the experience itself - hugely affect one's perception of pain, yet they do not deny its existence or value. Pain plays a critical role in human life, alerting an individual to possible danger and instigating action for self-protection. Continuing research is uncovering more of the intricacies of pain perception, making it increasingly evident that a complete understanding of pain must consider both its biological and its psychological dimensions. It is becoming urgent to build management strategies that address pain in its entirety.
References: Please see Appendix on page 62.
stability. Overall, forming NATO was a big promise to stick together for security. It showed a belief that peace could be maintained when countries unite and support each other. With a formal agreement, the member states pledged to help each other in times of trouble (Apps, P, 2024). They made it clear that they wouldn't leave any country to face threats on its own.
What is NATO's purpose in the modern era?
Collective defence is one of NATO's top priorities and of the utmost importance to the alliance. It means that if one country is attacked, all the member countries are considered under attack. This idea comes from Article 5 of the NATO treaty (NATO Online, 2019), and it makes potential attackers think twice. Even though the Soviet Union is no longer a threat, we face new challenges such as cyber-attacks, terrorism and different types of warfare, and NATO has changed how it deals with these new issues to keep everyone safe.
Firstly, we look at crisis management. NATO doesn’t just sit back when things go wrong. It steps in to help during wars or urgent situations. For example, NATO has supported areas like the Balkans and Afghanistan. The goal is to help fix conflict zones and restore normality. By doing this, NATO shows its commitment to maintaining order and preventing problems from spreading. Cooperative security is also important.
NATO works with non-member countries too, such as South Korea and Australia (Haglund, D.G. 2019). They join in drills and share key information to stay safe. It's not just about protecting NATO countries' borders; it's about stopping problems before they start. By working with nations in the Middle East and North Africa, NATO can tackle challenges that cross borders, helping to make the world safer for everyone.
Cyber defence is another key focus for NATO. This issue is huge in our modern world. With so much happening online, protecting against cyber threats is essential. Cyber-attacks can harm economies and major systems. NATO has started to take this seriously. They’ve created plans and set up a Cyber Defence Centre to help member countries find and respond to cyber-attacks. They also do regular training to prepare everyone for potential threats.
Counter-terrorism is another large focus for NATO. Terrorism is still a real danger (GOV.UK, 2024). NATO is dedicated to fighting it. They work on sharing information, tightening border security, and training forces to respond. By addressing the roots of terrorism, NATO aims to keep its members safe and support the global fight against this ongoing issue. Human security is also gaining importance for NATO. This idea focuses on people’s rights, equality, and protecting civilians in conflict zones. NATO wants to make sure its actions reflect these values. By prioritising human security, they hope to create lasting peace and stability in troubled areas.
Finally, NATO knows it needs to keep up with the times. The world is always changing, and so are the threats we face. To stay effective, NATO is constantly updating its strategies. This includes investing in new technologies and ensuring that member armed forces can work together. By staying current, NATO keeps its position as a key player in global security.
Is NATO still as important as it was 50 years ago?
Fifty years ago, in 1974, NATO was deeply involved in the Cold War. This was a time of strong tension between the United States and the Soviet Union. NATO’s main job was to stop Soviet attacks in Europe. The threat was real, and countries built up their military forces to prepare for any possible aggression.
After the Cold War ended and the Soviet Union collapsed in 1991, many wondered if NATO was still needed. Without a clear enemy, the alliance struggled to find its purpose. But NATO quickly shifted its focus. It started working more on crisis management and cooperative security. NATO got involved in conflicts in the Balkans, Afghanistan, and Libya. This showed that NATO could handle a variety of issues and was not just focused on country-to-country conflicts.
Today, we can see NATO differently. One key reason for its continued relevance is the rise of Russia. Russia’s actions, like taking Crimea in 2014 and invading Ukraine, have raised alarms about security in Europe. These actions make NATO important again as a way to deter Russian aggression, much like it did decades ago. The expansion of NATO to include Eastern European countries shows its commitment to collective defence.
Another important role for NATO now is dealing with non-traditional threats. Problems like cyber warfare, terrorism, and hybrid threats are more common today (GOV.UK. 2024). NATO has worked hard to develop strategies to tackle these issues. For example, they set up
the Cyber Defence Centre of Excellence and are actively fighting terror globally. This shows NATO's commitment to facing today's security problems. NATO is also all about teamwork: it partners with countries outside the alliance and with other organisations to keep the world safer, showing that security isn't just a national issue; it's something we need to work on together. By cooperating with countries in the Middle East, North Africa and Asia, NATO is helping to build a more stable environment worldwide. However, NATO has some challenges it needs to address.
There are disagreements among member countries, especially regarding how to share defence costs. Some European allies are under pressure to boost their military spending. Additionally, rising isolationist views and nationalism in some member states (notably the USA) threaten the core idea of collective defence. Despite these hurdles, NATO remains a key player in global security. It shows the ability to adapt and tackle new threats. Its focus on teamwork and the commitment to keeping peace is why NATO is still relevant today.
Conclusion
NATO is increasingly important today. It has been around for a long time and has changed with the shifting geopolitical climate. As Russia grows stronger again, the alliance's role in keeping countries safe becomes ever more crucial. However, every member has its own internal problems to deal with, which can cause struggles over sharing costs or differing priorities. Despite this, the members have demonstrated that they can come together to tackle security issues and manage crises. As long as there are threats to democracy, NATO will keep working to keep world power in balance: its job is to help keep the peace and to show that its members stand united. This mission matters just as much today as it did fifty years ago. Plus, NATO is all about protecting democratic values and human rights, a commitment that makes it a key part of the global safety plan. In a world that keeps changing and getting more complicated, NATO's role becomes even more necessary; it plays a big part in keeping things stable for everyone. In conclusion, NATO isn't just a relic of the past: it's here to stay and will keep adapting to face whatever comes next.
References:
Apps, P (2024) Deterring Armageddon: A Biography of NATO. Wildfire.
GOV.UK (2024) Defending Britain from a more dangerous world. [online] Available at: https://www.gov.uk/government/speeches/defending-britain-from-a-more-dangerous-world.
Haglund, D G (2019) NATO | Founders, Members, & History. In: Encyclopædia Britannica [online] Available at: https://www.britannica.com/topic/North-Atlantic-Treaty-Organization.
Minogue, K R (2000) Politics: A Very Short Introduction. Oxford; New York: Oxford University Press.
NATO (2018) NATO. [online] NATO. Available at: https://www.nato.int/.
BBC News (2019) Ukraine War | Latest News & Updates. World News - BBC News. [online] Available at: https://www.bbcnews.com/world/Ukraine.
Images:
Eignatik17 (2021) Map. Available at: https://pixabay.com/photos/map-voyage-cartography-travel-6513914/
Pexels (2017) Cargo plane. Available at: https://pixabay.com/photos/air-force-cargo-plane-aircraft-2178863/
Tumisi (2018) Cyber security hacker. Available at: https://pixabay.com/photos/cyber-security-hacker-security-3194286/
Every Christmas, families across the United Kingdom take time out of busy Christmas schedules to enjoy a theatrical experience that comes around once a year: pantomime – a stage show full of brilliant tunes, audience participation, a touch of slapstick, elaborate costumes and sets, and a script filled with hilarious jokes and satirical comments.
From where does this tradition originate? How far back does it date? Are the pantomimes that we see today a continuation of the tradition that was intended? This essay will endeavour to answer these questions.
Between the sixteenth and eighteenth centuries we can trace the beginnings of what we know today as pantomime, then known in Italian theatre as Commedia dell'arte. Alongside Theatre Haus (2021), Broadbent (1901) states that it is famed for its four different types of masked characters. First there is the Zanni, perhaps the most iconic, essentially the clown of the show. Most recognised is the Arlecchino (or Harlequin), known for wearing a fool's cap and a diamond-patterned costume, and for having a mischievous nature. Next, the Vecchi – a miserable old man, greedy and possessive of money, property, and women. Then the Innamorati – a pair of young, upper-class lovers, naïve and with much to learn about the world. Finally, Il Capitano. This character was arrogant and self-obsessed, often bragging about military skill and expertise, and taking any opportunity to prove it.
Nicoll (1987) asserts that Commedia dell’arte was originally performed outdoors on a platform in the Italian Piazza (Town Square) before travelling across Europe, sometimes reaching as far as Moscow.
Older even than Commedia dell'arte is the Atellan Farce, a genre originating in the Roman era and characterised by the disastrous, disorganised and absurd, to which modern-day pantomime can also be traced back.
a society change over time in the same way that pantomime itself has changed.
Mayer (2003) tells us that by the early eighteen-hundreds the performance – now called pantomime because it was an 'exaggerated mime' containing no dialogue due to theatre licensing issues – relied upon stories mainly adapted from European fairy tales, just as they are today. The titles were generally dual-named, most commonly with 'Harlequin' as the first name, since the Harlequin was the most important character; the second name would be that of the other principal character. This would give the audience an idea of what to expect from the show.
As the eighteen-hundreds progressed, children began to go to the theatre a lot more during Christmas and other holidays, a tradition that continues today. They loved to watch the madness of the ‘Harlequinade’ (a part of the pantomime that included spectacular special effects, a chase, and a lot of slapstick comedy) unravel. The main premise was that the Harlequin and his lover were running away from the lover’s father, whose pursuit was slowed down by his servant clown and a 'bumbling policeman'.
By 1837, towards the end of the 'Harlequinade' era, pantomime faced a decline in popularity but was still fighting to stay alive. After 1843, when licensing changed to allow theatres to perform plays with dialogue, the importance of the silent 'Harlequinade' began to decrease. As is stated by an artefact in the Victoria & Albert Museum, the importance of the fairy-tale element of the pantomime increased thanks to two writers, James Planché and Henry James Byron, whose use of puns and humorous wordplay has become a convention of modern-day pantomime. In the late eighteen-seventies, Augustus Harris, manager of Drury Lane, produced and co-wrote a series of extremely popular pantomimes that focused mainly on the spectacle of the show, rather than anything else.
By the end of the nineteenth century the ‘Harlequinade’ became just an epilogue to the pantomime involving a display of dancing and acrobatics. It lingered for a few decades but eventually fizzled out from the repertoire of pantomime. The last ‘Harlequinade’ was performed at the Lyceum Theatre in 1939.
Moving to the present day, the pantomime tradition is thriving. Between November 2024 to January 2025, the Oxford Playhouse staged Sleeping Beauty and I was
Lottie Graves, Third Form, Gascoigne House Scholar Supervisor: Ellie Lunn, Fifth Form, Swift House Editor: Miss Hutchinson
Imagine walking into a room full of strangers and seamlessly blending in – not just in appearance, but with your voice, your accent, the very essence of your speech, as it subtly shifts to mirror those around you. There are people who can do this, or at least films and books would have us believe so. This raises questions about what factors play into such a change: everything from the purely linguistic (the way the words leave your mouth) right down to the psychological elements of social dynamics that result from transitioning between communities. Is there a pattern to this transformation, or is it unique to each individual? Consider your accent as an elastic band – as your environment changes, this band stretches and changes too. This also suggests the possibility of an accent stretching so far that it 'deforms' and never returns to its original shape, or perhaps even snapping, in the way an elastic band might. By comparing circumstances in which an accent change may occur, and considering the multiple aspects that may cause one, we hope to uncover the forces that mould our accents to be more like those of the people around us. Can our accents really be as elastic as an elastic band?
How is an accent first formed?
In order to understand how an accent may change, it is important to know how it came about in the first place. The process of developing an accent starts almost as soon as a child leaves the womb. We are immediately exposed to the speech of many different people, each with a slightly different accent, and, as we learn to speak, an accent develops as well. Babies are soon able to differentiate between various inflections and intonations in their parents' or caretakers' voices. For example, they may laugh when their father says something that they think sounds funny, or cry when their mother raises her voice, because, in order to comprehend what is going on around them, they are focused on every little thing, from the shape of a speaker's mouth to the rising and falling of their voice (The Speech and Language Department in collaboration with the Child and Family Information Group, 2016). As an expert in early childhood learning, Patricia Kuhl (2015) stated that an infant can perceive the full set of 800 or so speech sounds at birth.
These sounds, or phonemes, can come together to make any word from any language. The brain of a young child is constantly working on articulating and distinguishing all these different sounds at first, but as children become accustomed to the specific phonetic habits of their native language, their brains start to favour those sounds. This means that, over time, they will lose the ease with which they perceive sounds that are not part of the linguistic environment they are exposed to. For instance, a child growing up learning to speak English in the UK will become more familiar and comfortable with the specific cadences of that dialect and modulate an accent accordingly. As a result, this child might not be able to pronounce a harsher 'ar' sound in the same way a child living in the USA would, simply because their brain has decided that information is not necessary for communication. Consequently, they will not acquire the specific set of motor patterns for that accent. This often makes it difficult to transfer to a new accent, although linguistics and phonetics specialist Dr Nicole Whitworth (2021) makes it clear that it is not impossible.
Once infants reach around six months old, the sounds they make are much more purposeful and controlled. They might even begin saying things that we think we recognise, or that sound similar to the words being said to or around them, such as 'ma' or 'ba'. If the child likes how this sounds, they will repeat it. This stage is often referred to as 'babbling' – or what you may know as 'babytalk' – and is part of the process called language acquisition (Crystal, 2010). Despite it starting at such a young age, children do not complete the language acquisition stage until they are around six years old (Campbell and Wales, 1970). During this period, the immediate social environment the child is living in is of even more importance. The more the infant is exposed to a certain type of speech, the more they will imitate it. This is also why, even within small geographical areas, accents can be very different, with local communities developing their own distinctive linguistic features and speaking habits.
Nature versus Nurture
When considering how an accent might adapt according to a person's environment, it is important to regard the influences of both nature and nurture during this transformation. Essentially, nature refers to the inherited and hereditary influences on an accent's adaptation, whereas nurture encompasses the environmental and social causes: our upbringing, surrounding culture and relationships (Cherry, 2022).
The influence of genetic predispositions on the reformation of an accent generally comes from the individual's anatomy. Certain phonetic capabilities are inborn, determined by physiological attributes such as the shape and size of the vocal cords, mouth, and nasal passages (Dengler, 2019). In this biological predisposition lies the very foundation of how a sound is articulated and perceived. For example, some may find that imitating different accents comes more naturally due to a heightened auditory discrimination ability, the majority of which is determined by genetics. Furthermore, up to 60% of personality traits, such as openness to new experiences or social adaptability, are inherited genetically and may contribute to an accent change (National Library of Medicine, 2022). Individuals with such traits can more easily adopt a new accent, or change their pronunciation, in order to correspond to the standards of speech of the new society they are living in. This can be illustrated by the fact that children from multilingual families can often easily change their accent depending on the social situation they find themselves in throughout their lives.
Throughout their earlier years, children are more likely to pick up new accents if they are exposed to them frequently, partly because they are still in their language acquisition stage and thus have a more malleable accent. With continued exposure to other speech patterns, the pronunciations children use may still be shaped and reshaped as they grow. Nevertheless, the degree to which an accent is able to stretch and change ultimately diminishes with age. Though it can be debated whether this falls under the nurture aspect of this argument, as it is predominantly due to the stage of talking they are in, it is probably more accurate to categorise it as nature.
On the other hand, the evolution of an accent is greatly influenced by environmental causes and social interactions. To begin with, there are many psychological factors that affect a person’s attitude and willingness towards changing their accent. Primarily, this would be a desire to belong or impress someone in the new community, especially if they have received unwanted comments regarding their previous accent or it is associated with a negative stereotype. To be excluded from a society, or feel as if that is the case due to your accent is difficult, and the likelihood
receptive to new sounds. As one grows older, the ability to adapt an accent diminishes but is never fully eradicated. And it is through social factors – the desire to fit in, or to avoid negative stereotypes, for example – that the reasons behind why and how our accents change really become apparent.
This, therefore, places the elasticity of any particular person's accent in their own hands, as the product of a compendium of variables from nature to nurture. That intricate yet delicious recipe of nature, nurture and all the other complexities that make us who we are ensures our accents will forever offer up the beauty, richness, and diversity within human communication.
References:
Bard, S (no date) Linguists Hear an Accent Begin, Scientific American. Available at: https://www.scientificamerican.com/podcast/episode/linguists-hear-an-accent-begin/. Accessed: 1 September 2024
Bryson, B (1991) Mother tongue. Penguin Books.
Campbell, R and Wales, R (1970) The Study of Language Acquisition. Penguin Books.
Crystal, D (2010) A Little Book of Language. Yale University Press.
Cherry, K (2022) The Nature vs. Nurture Debate, Verywell Mind. Dotdash Media, Inc. Available at: https://www.verywellmind.com/what-is-nature-versus-nurture-2795392. Accessed: 29 January 2025
Dengler, R (2019) The shape of your mouth affects how you talk and gets amplified across generations, Discover Magazine. Available at: https://www.discovermagazine.com/planet-earth/the-shape-of-your-mouth-affects-how-you-talk-and-gets-amplified-across Accessed: 4 January 2025
Harrington, J et al (2019) Phonetic change in an Antarctic winter, The Journal of the Acoustical Society of America, 146(5), pp. 3327–3332. Available at: https://doi.org/10.1121/1.5130709. Accessed: 29 January 2025
Kuhl, P K (2015) How Babies Learn Language, Scientific American. Available at: https://www.scientificamerican.com/article/how-babies-learn-language/. Accessed: 24 November 2024
Language Acquisition Part 5 (no date) web.mnstate.edu. Available at: https://web.mnstate.edu/houtsli/tesl551/LangAcq/page5.htm. Accessed: 5 January 2025
Loeffler, K (n.d.) When do babies start talking?, Children's Health. Available at: https://www.childrens.com/health-wellness/when-do-babies-start-talking Accessed: 29 January 2025
McWhorter, J (2016) What's a Language, Anyway?, The Atlantic. Available at: https://www.theatlantic.com/international/archive/2016/01/difference-between-language-dialect/424704/. Accessed
National Library of Medicine (2022) Is temperament determined by genetics?: MedlinePlus genetics, medlineplus.gov. Available at: https://medlineplus.gov/genetics/understanding/traits/temperament/ Accessed: 4 January 2025
The Speech and Language Department in collaboration with the Child and Family Information Group (2016) Speech and language development from birth to 12 months, GOSH Hospital site. Available at: https://www.gosh.nhs.uk/conditions-and-treatments/procedures-and-treatments/speech-and-language-development-birth-12-months/. Accessed: 3 January 2025
Uniyal, P (2023) How FOMO might lurk behind teen social media anxiety, 6 October. Available at: https://www.hindustantimes.com/lifestyle/health/how-fomo-might-lurk-behind-teen-social-media-anxiety-101696598418140.html. Accessed: 6 January 2025
Whitworth, N (2021) Carnegie Education 'Where do accents come from?' Available at: https://www.leedsbeckett.ac.uk/blogs/carnegie-education/2021/06/where-do-accents-come-from/. Accessed: 24 November 2024
Images:
Jerjer (2024) Colourful array of international flags. Available at: https://www.pexels.com/photo/colorful-array-of-international-flags-29303947/
Jhusemannde (2015) Rubber band multicoloured. Available at: https://pixabay.com/photos/rubber-band-multicoloured-red-green-890168/
PublicDomainPictures (2013) Happy motherhood. Available at: http://pixabay.com/photos/mother-baby-happy-motherhood-84628/
Molly Sheer,
Clarke Scholar - Scholar Supervisor: Emmeline Black, Fifth Form, Feilden House Editor: Miss Hutchinson
Music has changed dramatically in the last 125 years – from Edward Elgar's rousing Land of Hope and Glory (1901), an expression of Britain's greatness, through to Charli XCX's Apple (2024), which created a highly trending following as well as popular choreography throughout 2024, with the artist winning recognition at the 67th Annual Grammy Awards. From the groundbreaking jazz of the 1920s to Rock Around The Clock (1955), the first rock and roll song to top the charts, right on through to the first video to appear on MTV in 1981 (Rahsheeda, 2013), music continues to evolve and change.
In the modern day, music seems to attract the influential and 'trending' side of the internet. Whilst those songs tend to grab the headlines, this article will look at less well-known musical developments in this period. No less important, these developments also hint at the ongoing and evolving importance of music.
On 28 July 1914, the First World War broke out. Lasting a remarkable four years, three months and two weeks, it fundamentally changed the course of music in the twentieth century. Three eras of music had ended – Romanticism, Symbolism and Expressionism – and the First World War marked the end of what has been referred to as the 'long nineteenth century' in music history. Songs such as the upbeat It's a Long Way to Tipperary saw soldiers
marching off to war. Vera Lynn’s We’ll Meet Again, released in 1939 at the start of Second World War, saw a very different change in mood. Music and war have always been intrinsically linked, from bugle calls to soldiers’ songs.
Music has always had the power to embody diverse cultures. With some 3,814 distinct cultures recorded by anthropologists (although this is thought to be a major underestimate), music has the potential to represent people in a way like no other. Cultures can be represented by their own specific types of music, art forms, languages, and literature, but in the Twentieth Century we increasingly saw a fusion of styles.
Take, for example, Argentina, where some music is of sub-Saharan African origin; styles such as flamenco and contradanza are among the products, and in 2004 the dance competition Strictly Come Dancing brought this music to a modern-day audience. In addition, America, where a considerable number of different styles of music originated – pop, country, jazz, blues, rock and roll, soul – continues to celebrate the fusing of styles: Beyoncé's Texas Hold 'Em, for example, incorporates country. It also continues to welcome outsiders, such as The Rolling Stones, who have been active for more than 60 years and who initially found inspiration in American blues.
People have immensely varied lives, and subsequently choose a specific genre or style of music to listen to. This choice can be influenced by a number of things, chiefly personality traits, with people typically gravitating towards music that is compatible with their culture, personality, age and mood. In addition, sometimes we want music that expresses our emotions or current state.
For example, significant life events such as weddings or funerals offer opportunities to express emotions where individuals may otherwise struggle to articulate how they feel, whether showing their love for another person or humming the tune of a song treasured by someone who is no longer present.
In the early twentieth century, brides traditionally walked down the aisle to Wagner's Bridal Chorus (composed in the mid-nineteenth century). A recent study analysing 2,000 wedding-themed playlists on Spotify (totalling 49,091 songs) found I Wanna Dance with Somebody by Whitney Houston to be the most popular wedding song of the twenty-first century.
Where once funerals were traditionally sombre affairs, families now take solace in music which represents their thoughts and feelings more closely. Abide with Me by Henry Francis Lyte (written in 1847) can often be heard alongside music by Adele, Oasis and Eric Clapton, to name a few.
The Healing Properties of Music
Music can be an effective remedy for stress and anxiety – both of which are discussed far more openly in the twenty-first century than ever before. However, stress and anxiety are not the only things music can help with. Research suggests that diagnoses of autism and neurodiversity have almost doubled in the past ten years, and thankfully music has a role to play here too.
Music has been scientifically shown to help people's mental health. It can improve cognitive performance, reduce stress, encourage healthier eating habits and help your memory. The number of young people whose social health is suffering is increasing fast. There is a sudden surge in young people with online addictions who hide themselves away from the real world, which can result in overstimulation and depression, and can even slowly draw them away from close bonds.
Music is something that can help us get away from screens. Playing an instrument, listening to someone else play, or even just putting on the radio makes for a mentally healthier lifestyle. Music can also influence one's concentration: what you listen to can potentially shape your day-to-day lifestyle, and vice versa. Some research suggests that the popular Ariana Grande song Breathin can help with anxiety attacks, telling you to just breathe. She is joined by many other artists – the Beatles, Florence + the Machine, Rihanna, Shawn Mendes, Demi Lovato, Ed Sheeran, Anne-Marie, Avril Lavigne and Coldplay – whose music helps with the management and challenges of living with anxiety and associated disorders.
The developments of the last 125 years have made music far more accessible to ordinary people. Incorporating many different styles, music can reach every different class and grouping of people. This trend is only likely to continue in the years ahead.
The biggest change, at least in the short term, is likely to centre on artificial intelligence as a way of creating music. We have already seen rapid progress in this area. As far back as 2016, researchers at Sony were working on AI-generated music and produced what was described as the first AI-composed pop song, Daddy's Car. Where this will lead us, only time will tell.
References:
Please see Appendix on page 62.
Images:
Mikhail Nilov (2021) Man in White Shirt Playing Acoustic Guitar. Free Stock Photo. [online] Pexels. Available at: https://www.pexels.com/photo/man-in-white-shirt-playing-acoustic-guitar-6932592/ [Accessed 26 Jan. 2025].
Ylanite Koppens (2018) Music Notes. Free Stock Photo. [online] Pexels. Available at: https://www.pexels.com/photo/music-notes-934067/.
to define, since some linguists may view two dialects as two separate languages, while others might argue that they are two entirely independent, albeit close, languages. Politics can entangle matters too, when one country may 'appropriate' the language of a neighbouring country with the intention of enlarging its own size and status. However, to differentiate between a language and a dialect in a simpler way, imagine two people, person A and person B, who each know only their own language. If person A and person B speak the same language in different dialects, they will understand each other. However, if person A and person B need a translator in order to understand each other, then they are speaking two separate languages. Broadly, then, a dialect identifies what region of a country you are from, while a language identifies which country you are from.
Language isolates are those which do not have a connection with the 'rest of the language branch or the complete family' (Nikolic, 2021). Our world is hugely diverse, with around 7,400 languages in use today, most of which are grouped into language families. The concept of families is based on the assumption that dialects can evolve into individual but related languages over time. This can be described in terms of a language tree, where a proto-language such as 'Germanic' forms the main 'trunk', which then divides into language 'branches' such as West, East and North Germanic, showing the languages that have relatively recently separated from the proto-language. Over time the tree becomes denser as new dialects develop from the branches. Some branches will fall off after a while – it is estimated that around 30 percent of all languages will decline and become extinct by the end of the century – while other branches may form a new language family of their own. However, efforts are still being made to revive some of these languages for everyday use.
In contrast, there are around 200 language isolates; examples include Armenian, Albanian and Greek. Although they have been identified as part of the Indo-European language family, these three languages form separate branches with no connection to other parts of the Indo-European family. There are also language isolates for which no connection can be established with any language family at all. These include the Basque language, used today in part of the Basque Country within Spain and France, the Burushaski language of northern Pakistan, the language of the Ainu people in Japan, and the ancient Sumerian language. Despite the great efforts of many scholars, no connections have been found between these languages and any other living language. On the other hand, some connections have been proposed with extinct languages: for example, some linguists believe that the extinct Aquitanian language was either an ancestor or a relative of the Basque language. Language shapes our identity because any English speaker can travel to English-speaking countries and communicate with the people there, whereas speakers of a language isolate, such as the roughly 900,000 speakers of Basque, can only communicate in their mother tongue with people in that area.
It can be argued that it is not just our first language that identifies us. Identity can also come from the clothes we wear, our physical characteristics, the things we love to do in life, our religion, age, gender and nationality. When you travel abroad, your passport is a form of identification, and if you live in another country for more than three months, you are often required by law to have a visa or some other permit that identifies you. It is also important to consider what happens when you move to another country and need to learn a second language. For example, in order to become a US citizen, most applicants must demonstrate proficiency in reading, writing, and speaking English by passing a complicated set of tests. Living in another country and learning an additional language is different for everyone, depending on the complexity of the language and on individual experiences. It may be that your family moved to another country and learnt a new language, or that you are bilingual, and this too shapes your identity.
In conclusion, this essay has discussed key areas of language and identity in order to understand the concept and has argued that language is one of the most important considerations for identity. This is because it is a way of communicating and expressing ourselves. It helps us establish new relationships and allows us to co-operate with other people. Our language speaks louder than the clothes we wear, our hobbies or physical features and is fundamental to our identity.
References:
Byram, M (2006) Languages and Identities: Preliminary Study, Languages of Education. [online] Available at: https://rm.coe.int/preliminary-study-languages-and-identities-intergovernmental-conferenc/16805c5d4a.
Clarke, D (2022) How Many Countries Speak English (All The Continents Listed). [online] Anderson Institute. Available at: https://www.andersoninstitute.com/how-many-countries-speak-english/
Nikolic, Z and Books, C (2021) The Atlas of Unusual Languages: An exploration of language, people and geography. HarperCollins UK.
Images:
GDJ (2023) Fingerprint. Available at: https://pixabay.com/vectors/fingerprint-psychology-brain-mind-7900100/
Roc Sun (2021) Traditional dance. Available at: https://unsplash.com/photos/NZcCDKK5DYw
Scholar Supervisor: Sam Weldon, Fifth Form, Harcourt House
Editor: Miss Hutchinson
Martin van Creveld is a Dutch-born Israeli military historian and strategist of the late twentieth and early twenty-first centuries, with a strong outlook on the vitality of technological prowess, and the author of over twenty books on the subject. I have always agreed with van Creveld's views on technology; however, technological advancement and its effects are a widely debated topic. This article will discuss the consequences and outcomes of technological superiority in conflicts of the twentieth and twenty-first centuries in order to understand the role that technology plays in warfare.
Looking to the past, we can see some clear-cut examples of the product of technological supremacy. At the beginning of the twentieth century came a war that changed everything that was known about warfare. The First World War was a clear example of the influence of technological primacy and the first extensive war fought in three domains: bloodshed on land, fire on the seas and spies in the sky. In all three, the technology had come a long way since the last major war. On land, equipment such as the slow, inconvenient muskets and bayonets was replaced with more efficient rifles and machine guns. Artillery was modernised and used on a larger scale, its new explosive power replacing the bygone era of cavalry. Alongside it came artillery's close relative, the howitzer, a splice of cannon and mortar which fired a 15-inch shell with an effective range of seven thousand metres (Ray, 2019). On the oceans, the navy had rationalised, switching from weak and brittle wooden hulls and futile cannon balls to industrial iron hulls and destructive turrets which, like artillery, had moved on to firing shells. The old ship designs were replaced with the new central focus of the navy, the dreadnought: a colossal 160-metre-long steel ship with a crew of over 800, armed with twelve-inch guns and powered by steam turbines giving it an impressive speed of 21 knots (Britannica, 2019). Finally, in the sky, the First World War showcased the first aerial combat, with technology that had not been seen before. The two main categories of aircraft were fighters and Zeppelin bombers. When war was declared Germany had ten Zeppelin bombers – enormous airships inflated with hydrogen, with a smaller deck below to operate from. However, due to their size and vulnerable design, three were lost in the first month alone, and they were considered impractical before later being scrapped (Guilmartin, 2023). On the other hand, the fighter was far more practical. The original design was undefended and
could only support the pilot, so its purpose was directed towards reconnaissance. However, by the time the war had erupted, fighters were equipped with sizeable and potent engines, allowing for the addition of fixed forward-firing machine guns (MacIsaac, 2019). These technologies would be vital to the success of the victor and would play a massive part in all aspects of the war.
Swift-firing artillery and machine-gun fire, merged with deep and intricate trenches, led to an almost impenetrable defence. This was most evident on the western front, where a stalemate had emerged, causing a deadlock that seemed insurmountable. However, technology provided the answer: the tank. The first tanks were fundamentally heavily armoured vehicles equipped with high-calibre machine guns. They were slow but not easily pierced, and they could effectively breach the German lines that were once considered impenetrable, which was crucial in greatly shortening the duration of the war (Ogorkiewicz, 2019). Tanks were first used on 15 September 1916 by the British at the Somme. They granted the cover needed for the infantry to cross the slaughter fields of no-man's-land and rush the German trenches that had stood so menacingly for so long (Ogorkiewicz, 2019). This was one of, if not the, most pivotal elements in Germany's inevitable downfall.
However, this was only the dawn of the tank era, which steers us to the next major conflict: The Second World War. It remains the single bloodiest and most ruthless war to this date; nonetheless, it was the war in which technological evolution was thrust to the forefront. In The First World War, tanks were predominantly utilised by the Entente. Despite that, Nazi Germany recognised the vast potential within the tank's philosophy. They possessed a vehicle with the capacity to breach defending forces, but they then considered: what if this machine of war were swift too? They believed that the effects of a fast-moving, highly
durable unit which contained the required firepower and vigour to simply force a breakthrough would be more than enough to tip the balance in any upcoming war. So, in 1934, they launched the construction of the Panzer, which would eventually bear fruit when war broke out (Ogorkiewicz, 2019).
'The technology in use helps condition tactics, strategy, organization, logistics, intelligence, command, control, and communication'
Martin van Creveld
Due to the growing importance of armour, there was a demand for equipment that could rival this new reinforced threat, consequently leading to the birth of close air support, or CAS. Skimming along the battlefield surface, novel high-speed aircraft could deliver the explosive capability required to destroy their targets. When war was declared on 3 September 1939, these two technologies merged to create the infamous Blitzkrieg. This, in English, means 'lightning war': using speed and superior firepower to effectively paralyse the Allied armies before subsequently overrunning them, often generating the encirclements which would shortly become the Nazis' trademark move. Following the fall of the Polish government and the occupation of Denmark and Norway, Germany turned its eyes to France. With Germany holding the technological trump card, the Allies didn't stand a chance: 'In just over six weeks, German armed forces overran Belgium and the Netherlands, drove the British Expeditionary Force from the Continent, captured Paris, and forced the surrender of the French government' (Hart, 2022). Due to this unmatched technological might, they were able to do the impossible in record time.
As the war progressed, the Allies adopted and refined this new doctrine of warfare. Increasingly, advancements in adjacent fields of technology (for example, the deciphering of the Enigma code and the development of radar) would eventually allow the Allies to claim technological superiority and therefore turn the tide against the Axis. However, even following the triumph over Nazi Germany, in the east there still loomed a dwindling power, the Empire of Japan, that was willing to fight to its final breath. To conclude the war, the Allies understood that they would have to embark on a brutal island-hopping campaign that would claim countless lives; yet again, technology produced the answer: the atomic bomb.
The famed physicist Dr J. Robert Oppenheimer had been experimenting in the field of atomic research and was tasked with creating a new breed of weapon; this effort was known as the Manhattan Project. It was successful, culminating in the creation of the atomic bomb in 1945. The bomb was unleashed upon the Japanese cities of Hiroshima and Nagasaki, leading to the swift surrender of the Japanese Emperor and saving countless lives (Britannica, 2024). These new technologies substantially altered the events of the most extensive war ever fought.
Realising the revolutionary nature of the technology that had been unleashed by the United States, the other major powers of the post-war era raced to fashion atomic weapons of their own. Only four years after the end of The Second World War, the Soviet Union detonated its first nuclear bomb, with the United Kingdom, France and China soon following suit (Zeidan, 2024). This soon led to a balance and stalemate known as mutually assured destruction, or MAD: the principle that if a superpower committed a
nuclear offence against another superpower, both would be completely destroyed beyond the point of recovery (Britannica, 2019). For the duration of the Cold War this remained true, and it would go on to mould the face of conflict for the next 40 years, with superpowers never engaging directly, clashing only through proxy wars in countries such as Korea and Vietnam. Today, the Russo-Ukrainian conflict marks the closest the world has come to direct confrontation between major powers since the end of the Cold War in 1991. The United States is annually dispatching billions of dollars' worth of equipment to support the Ukrainian effort: 'Ukraine has become far and away the top recipient of US foreign aid. This marks the first time that a European country has held the top spot since the Harry S Truman administration directed vast sums into rebuilding the continent through the Marshall Plan after World War II' (Merrow, 2024). This takes us to the vast and plentiful impacts and effects of technology in the modern day.
Of course, Ukraine is not the only war ongoing in the present day. However, it is the optimal example of the significance of technology in the modern era, and therefore it will be the primary focus of this period. The war in Ukraine has exhibited a variety of diverse weapons, among which four leading categories have emerged: artillery and conventional weaponry; missiles and air defences; drones and aerial expansion; and the implementation of artificial intelligence (AI) and cyberattacks. Firstly, artillery is vital for all combat: whether it is safeguarding the defence or striking a crushing counter-offensive, artillery is at the heart of it all. Heavy artillery batteries are positioned behind lines of infantry and unleash thousands of rounds per day, accounting for about 80 percent of the casualties on both sides (Boot, 2024). Missiles have been operated primarily by the Russian military for the duration of the war, with frequent air strikes on both military and civilian targets; for example, Russia has been consistently targeting Ukrainian power plants. As a consequence, Ukraine and its supporters have increased the number of anti-air defences stationed around key facilities and areas to attempt to nullify these strikes. As a prime illustration of technological progress decisively swaying the balance, Russia has escalated the war by firing a novel hypersonic missile which travels at such speed that it is impossible to counteract with Ukraine's existing defences, rendering its systems futile. Ukraine has recently retaliated by launching American and British advanced missiles into Russian soil; nevertheless, this will only justify the utilisation of hypersonic missiles and ultimately hurt its cause (Beale and Walker, 2024). Both these technologies have been around for decades, yet they are still shaping the identity of conflict.
Modern drones have been an essential part of both participants' armed forces, overseeing a plethora of important tasks and missions. Their models range from small, unarmed camera drones to larger, more powerful military-grade classes. Their superior ability to loiter has enabled them to perform regular operations with pinpoint efficiency, from supplying ground forces to reconnaissance of enemy operations, and even to delivering explosives.
Henry O'Brien, Lower Sixth, Harcourt House Editor: Miss Hutchinson
Drinking: it can be a double-edged sword. On one hand, it can be something to enjoy socially, which helps people have a good time with friends or family thanks to its ability to ease most drinkers into a relaxed state. However, drinking can also have incredibly detrimental effects on both mental and physical wellbeing: it is addictive and leads to people losing control of themselves if they have consumed too much. This is particularly the case in young adults or teenagers, as they do not know, or are not experienced enough to realise, the amount they are consuming. So, what is the solution? We could increase the legal drinking age from eighteen to 21, as it is in the US. Not only would this give the young minds of teenagers physically longer to mature, hence not imposing as much damage on their brains, but it would also allow them to become more mentally aware of the effects alcohol has on them and how much they should drink. This could reduce the burden drinking places on the NHS. It could also lead to fewer drink-driving incidents and less alcohol-related crime. There could, however, be a backlash from the public and accusations of 'babying' young people too much. Moreover, unless there are stricter punishments for underage drinking, this change will not accomplish much, as young people will still drink despite knowing the new rules. Therefore, stricter controls need to be placed on vendors of alcohol, and young people need to be educated about the consequences of drinking too much, too young. Changing the legal drinking age would send out the right message – drinking is a very adult activity with adult consequences.
The Problem – and a Possible Solution?
A case study carried out by drinkingaware.co.uk showed that in 2021, 42 percent of girls and 39 percent of boys aged eleven to fifteen reported having had an alcoholic drink. This indicates that underage drinking is a problem which needs to be fixed – not only because of its long-lasting mental effects but also its physical ones.
The Effect Alcohol has on Mental and Physical Wellbeing
Did you know that alcohol affects a young person's brain differently to the adult brain? Exposure to alcohol while the brain is still developing can lead to physical issues such as a weakened immune system, allowing young teenagers to get sick more frequently, as highlighted by Alcohol Think Again (2023) in Impact of Alcohol on the Young and Teenage Brain.
Drinking can lead to long-term problems including difficulty with learning, planning and memory. Other mental effects include persistent changes in mood, including anxiety and irritability. Furthermore, Alcohol Change UK notes that young teens who drink alcohol have a higher likelihood of developing depression and other underlying health problems, often linked to stress. Many people turn to alcohol as a quick fix to reduce stress. Alcohol affects the chemicals in your brain, slowing down (depressing) how your brain and central nervous system function. It affects the part of your brain that controls inhibition (the process of restraining your impulses or certain behaviours because of factors such as your morals or lack of confidence). This is why, after a drink or two, you may feel less anxious and more confident, or 'lose your inhibitions'.
In the short term, you may feel more relaxed. But, in reality, having a drink is often used as a distraction from dealing with what's causing you stress or anxiety. This demonstrates that alcohol can have both permanent and catastrophic effects on someone's life, proving that, although an eighteen-year-old may legally be an adult, eighteen is not old enough to handle the effects alcohol has on the human body. Hence, the legal drinking age needs to be changed.
However, one main argument against increasing the legal drinking age from eighteen to 21 is that there would be a public backlash. Some may complain a rise is unfair and unjust – preventing sensible people from having just a few drinks on a night out. As previously said, alcohol can be enjoyable to consume as long as people are aware of how much they can safely drink.
There is perhaps a perception that controlling what people do to enjoy themselves, in their own time and with their own money, is a move towards a 'nanny state'. Consequently, the public might not agree with this rule change and could protest against the government for enforcing it. This could weaken the argument for changing the legal drinking age: although the change would benefit many people across the country, the backlash the government could receive may be too strong.
On the other hand, another reason why the drinking age should be increased to 21 is the number and nature of crimes committed by teenagers under the influence of alcohol. People under the influence are far more likely to take part in crimes involving violence and driving. For instance, a survey by forbes.com found that 4.6% of teenagers admitted they had driven after drinking within the past 30 days, and fourteen percent of teenagers freely admitted that they had ridden in a car with a driver who had been drinking. Not only does this show the lack of maturity in teenagers, but it also shows they do not know the risks associated with driving after consuming alcohol. Driving whilst under the influence of alcohol is a crime. In 2021, 13,284 people were killed in drunk-driving crashes, an increase of fourteen percent over the previous year. This statistic shows that drink-driving is one of the main causes of deaths each year and that alcohol consumption should be kept under control.
Furthermore, increasing the legal drinking age could help reduce binge drinking in the UK. Binge drinking is defined as exceeding eight units of alcohol on the heaviest drinking day for men, and six units for women. The Department of Health states that both men and women should drink no more than fourteen units of alcohol a week. A World Health Organisation report found that teenage girls in the UK are some of the worst binge-drinkers in Europe: England, Wales and Scotland took three of the top six places in a drinking league table of 36 European nations for adolescent girls.
Increasing the legal drinking age would make it harder for these girls to obtain alcohol and, in time, make it less socially acceptable for teenagers to drink to excess.
Be that as it may, increasing the drinking age could also have a dramatic impact on the economy. Many drink manufacturers and distributors, not to mention pubs and clubs, might oppose the rise, as it would directly affect their businesses: fewer people drinking means less alcohol sold. The drinks market makes up a significant proportion of UK GDP, and alcohol, being a demerit good, is heavily taxed and so contributes substantially to government revenue. Increasing the legal drinking age would affect that figure in a time of economic hardship, especially as the UK economy is only just getting back on its feet and economic growth is picking up once again.
Furthermore, bars and restaurants, and indeed the hospitality industry generally, are still reeling from the Covid pandemic. This measure could mean that more pubs, clubs and businesses close, as a decrease in customers would cut their profits.
However, although the change in the legal drinking age would be negative for the economy, the benefits that it would have for the NHS cannot be overlooked. The already over-stressed and under-funded NHS has to deal with the results of drinking, be they mental or physical, on a regular basis. If the drinking age were increased, the number of cases the NHS had to deal with would hopefully dwindle to a lower level.
We have all seen 'binge buses' in our city centres – why are paramedics and doctors having to spend weekends trawling the streets to rescue drunks when the money spent to enable them to do so could be used to treat cancer patients or provide exceptional palliative care? This corroborates that excess drinking should be taken more seriously, and that alcoholic drinks themselves should be controlled to a greater extent, as it is simply not good enough to endanger other people who are truly in need of medical attention but are not being helped because someone couldn't handle their drink.
The burden on the NHS has become so notable that people are starting to question our current laws. A survey carried out by Priory, the mental healthcare and addiction specialists, showed that half of adults support raising the UK's legal drinking age to 21 to help reduce the burden that underage drinking places on the NHS and society at large.
In fact, the Priory poll found that 56% of adults think the UK legal drinking age should be 21, the same as it is in the US. A third (34%) disagreed. Support for raising the drinking age to 21 rose to 65% among those surveyed who lived in London.
We can clearly see that there are others who share the same opinion, exhibiting that it is simply not good enough to put hard-working NHS staff under constant stress when it can easily be avoided.
Another argument could be that, even if the drinking age is increased, it may not stop underage drinking. A lot of young people tend to drink in private, with friends, or in semi-public places.
To prevent this, we need to ensure there are stiff penalties and fines in place for those who provide drinks to under-age customers – both pubs and clubs who serve customers directly and commercial organisations such as supermarkets who sell alcohol to young people. There should also be penalties for drinking in a public or semi-public place.
More than anything, we need to try to change perceptions – it should not be socially acceptable to see groups of teenagers out drinking at the weekend – or indeed at any time.
The legal drinking age needs to be changed. Eighteen years old is not old enough for a person to be legally allowed to buy alcohol. The effect alcohol has on teenagers is frightening, and the impact it has on our wider society is huge: on the NHS, on crime, and on families affected by alcohol use. Not only does alcohol affect teenagers mentally and physically, but its long-lasting effects can change a person's life for good, so much so that they may never recover. Increasing the legal drinking age to 21 would not only give people time to develop physically without alcohol affecting their brains, but also help them mature so they make better decisions, benefitting not only themselves but the people around them. This is why the drinking age needs to be changed to better our society.
References:
Please see Appendix on page 63.
Images: Gomes, G (2021) Clear drinking glass with beer. Available at: https://unsplash.com/ photos/clear-drinking-glass-with-beer-Qy2KMPRV3X4
Clarke Scholar - Scholar Supervisor: George Keates, Upper Sixth, Vanbrugh House
Editor: Mr Elkin-Jones
For centuries, scholars and philosophers have debated the philosophical question that I have chosen to explore in this article: to what extent is free will an illusion? I will investigate Determinism, the belief system which holds that we have no choice over our actions, along with its three different types. In stark contrast, I will also explore the idea that we have complete control over our decisions (free will), sharing arguments and reasons and explaining the topic from many different standpoints, including a biological one.
Determinism as a whole is the philosophical view that all actions in the universe are inevitable. However, there are different degrees to which people hold this view. Some take a philosophical standpoint and believe that all of our actions are predetermined. According to Oxford Languages, determinism is 'the doctrine that all events, including human action, are ultimately determined by causes regarded as external to the will' (Oxford Languages, 2025). Some philosophers have taken determinism to imply that 'individual human beings have no free will and cannot be held morally responsible for their actions' (Oxford Languages, 2025). This suggests that free will could well be an illusion and is therefore worthy of further exploration. Nevertheless, some people believe only that certain choices can be affected. An example of this, from a geographical standpoint, could be that natural disasters or climate can affect an individual's choices, such as by posing increased risks to their well-being or the risk of losing agriculture to harsh climates.
In contrast, free will is the ability to make your choices independently, free from predetermination. It allows individuals to act according to their personal desires, beliefs, and reasoning, shaping their unique paths in life. This concept is central to moral responsibility, as it assumes that people have control over their decisions and actions. For example, choosing a career path is a clear example of free will. A person might decide to become a teacher because of their passion for helping others, even if social pressure or financial gain suggests pursuing a different field. This choice reflects their inner values and demonstrates the power of free will in shaping individual lives (McLeod S, 2023).
There are three main types of Determinism: Biological, Environmental, and Psychic.
Environmental, or External, Determinism covers outside influences such as parents, the media, or school. For example, it posits that if your parents are aggressive, it is most likely that you will be aggressive too. This also means that behaviour should, in theory, be predictable. From a geographical standpoint, Environmental Determinism holds that climate and natural disasters shape how individuals and groups behave. For example, regions with extreme climates often pose a threat to agriculture and trade, such as the Sahara Desert, where settlement is significantly impacted and shaped by these factors. Natural disasters also pose a threat to societies by causing destruction, which can lead to migration (McLeod S, 2023).
The second type of Determinism is Biological, also known as Internal, Determinism. This perspective focuses on the genetic inheritance inside each person; Biological Determinists believe environmental factors have no influence over a person. Biological determinism is the theory that biological factors, such as genetics and physiology, fundamentally shape human behaviour and social structures. This perspective often suggests that social differences, including class, race, and gender roles, are rooted in inherent biological traits (Fiveable, 2025). For example, research has implicated a particular gene (IGF2r) in intelligence, and many more genes contribute to human 'superpowers'; this is the biological evidence that some people are genetically advantaged, which can help predict some of their choices in life (Ramsey, 2017).
The third type is Psychic Determinism, which holds that all events, including mental processes, are determined by former events. Psychic determinism emphasises that previous experiences, especially those from early childhood, greatly influence current and future mental processes. Traumatic events, conflicts, and unresolved psychological issues can affect decision-making and behaviour throughout an individual's life (Psychology.tips, 2024). Psychic Determinism also links to psychoanalysis, founded by Sigmund Freud: a system of psychological theory and therapy that aims to treat mental conditions by investigating the interaction of conscious and unconscious elements in the mind, bringing repressed fears and conflicts into the conscious mind through techniques such as dream interpretation and free association (Oxford Languages, 2025). Psychic determinism is related to the principal concept of determinism, specifically in terms of human actions: therapists who subscribe to the belief in psychic determinism assume that human actions and decisions are predetermined and not necessarily under our own control (Marcopelle, 2021).
Immy Harris, Fourth Form, Gascoigne House
Scholar Supervisor: Simran Panesar, Upper Sixth, Queen Anne House
Editor: Mr Elkin-Jones
In a world increasingly dominated by rapid technological advancements, escalating environmental crises and growing social conflicts, the prospect of a dystopian future no longer seems confined to the realms of fiction. From the rise of government control to the growth of artificial intelligence, signs of a potential dark future are emerging all around us. This article will explore the indicators and trends that suggest we may be on the brink of a dystopian reality, examining the factors that are driving us closer to a world where personal freedoms are eroded and societal norms are upended. As we rapidly approach the crossroads of technology outdoing the human race, many people are starting to wonder: are we headed for a somewhat dystopian future? How close are we?
A dystopian future is a vision of society where conditions are marked by oppression, suffering and a loss of personal freedoms, often resulting from unchecked political power, technological advancements or environmental collapse. In such a future, individuals may live under totalitarian regimes that control every aspect of life, from thought and speech to personal relationships and movement – much like North Korea. Surveillance is pervasive, and dissent is punished harshly, creating a climate of fear and compliance. Resources are often scarce and social inequality is rampant, with a small group of those in power holding most of the wealth while the majority struggle to survive. Technological advancements, instead of benefiting human society, are often used to control or manipulate individuals, aggravating social divides or stripping away autonomy. In this bleak vision of the future, progress grinds to a halt, and the world feels like a place where hope is hard to come by, leaving people trapped in a cycle of dehumanisation, fear and exploitation.
Modern examples of dystopian features can be found in various aspects of today's society, where technology, government control and social inequalities create a sense of oppression and fear. One example is the increasing use of mass surveillance in countries like China, where the government monitors citizens through facial recognition, social credit scores and tracking technologies, raising concerns about privacy and personal freedoms.
In the corporate world, tech giants like Google, Facebook and Amazon control vast amounts of personal data, often using algorithms to manipulate behaviour and shape public opinion, creating a reality where individuals are constantly monitored. Economic inequality also contributes to a modern dystopia, as wealth becomes increasingly concentrated in the hands of a few, leaving vast populations struggling with poverty, unemployment and a lack of access to basic resources. Climate change and environmental destruction are further adding to the dystopian narrative, as rising sea levels, extreme weather events and resource deprivation threaten the planet's future.
These examples illustrate how the combination of technology, surveillance, economic disparity, and environmental collapse can create a world where personal freedoms are restricted and the future feels uncertain and bleak.
North Korea demonstrates many characteristics of a dystopian society, marked by its extreme totalitarian regime that controls nearly every aspect of life. The ruling family, the Kims, have an iron grip on power, and pervasive surveillance creates an environment of fear and oppression: citizens must constantly adhere to the state's ideology or face severe punishment. North Korea is almost completely cut off from the outside world, and its inhabitants are subjected to relentless propaganda to maintain control of the country. Even basic freedoms, such as speech, movement and even thought, are tightly restricted, with severe threats keeping the country in order. People remain in a cycle of intense control, deprivation and suffering: hallmarks of any dystopian world.
Can this be Prevented?
Preventing a dystopian future requires a collective, multifaceted effort to address the root causes of inequality, environmental degradation and unchecked technological control. One key step is to promote social and economic equality by ensuring access to education, healthcare and fair wages for all, so that no one is left behind or oppressed by the systems in place. Additionally, combating climate change through sustainable practices, renewable energy and strong environmental policies is essential to prevent the planet from collapse. As technology continues to advance, it's crucial to implement ethical regulations to safeguard privacy and ensure that AI and automation benefit humanity rather than exploit it.
By focusing on these areas, and making conscious, collective choices, we can create a future that prioritises freedom, justice and sustainability, steering us away from the dark vision of a dystopian world.
Dystopian books often conclude with a stark reminder of the consequences of unchecked power. While some end with a glimmer of hope, suggesting the possibility of resistance or change, many emphasise the grim realities of living in oppressive systems. A modern example of this is The Hunger Games by Suzanne Collins.
The Hunger Games is a dystopian novel set in a distant future world called Panem, where the country is divided into twelve districts and ruled by the wealthy Capitol. Each year, the Capitol forces one boy and one girl from each district to enter an arena and fight to the death until there is one lone victor. As the games progress, the protagonist, Katniss Everdeen, becomes a symbol of resistance, sparking a rebellion against the Capitol. The novel explores themes of survival, sacrifice and the effects of power and control.
Many people see dystopia as what we are told in films and books, but the reality is that dystopia isn't just people overthrowing their government. The word dystopia originally comes from the Greek roots δυσ- ('bad') and τόπος ('place'), meaning 'bad place'. The idea of dystopia has been around for centuries and contrasts with the idea of 'utopia', a place or situation that is deemed perfect. Dystopia takes the opposite stance: a dystopian society is defined by the levels of suffering and injustice taking place within it.
Most of us have engaged with books that demonstrate this (The Hunger Games, 1984, The Maze Runner). All of these examples have unifying factors like fear, loss of individuality
and government control. When we think of these examples, we can hardly relate to them. They seem impossible and absurd; we could never live in the characters' shoes. But have you ever thought about the possibility that these books could be a warning? They are a demonstration of how our society might crumble if we carry on living the way we are.
In many ways, as a global population, we are living in a sort of dystopia, shaped by the erosion of privacy, growing surveillance and the increasing control of technology over our lives. With mass data collection, personal privacy feels almost non-existent, leaving many people vulnerable to manipulation and exploitation. Social media and algorithms create echo chambers, distorting reality and deepening political divisions, while misinformation spreads rapidly. Economic inequality is widening, no matter how much we try to deny it, and millions are struggling to survive due to a lack of resources. Environmental degradation continues unchecked, constantly threatening our planet's future, yet political inaction leaves the situation unresolved. All of these factors contribute to a sense of powerlessness and disillusionment, where the systems that are meant to serve us seem increasingly designed to aid the globe's downfall.
The impact of technology on society often veers into dystopian territory, as it demonstrates both the benefits and dangers of modern life. While technology holds
the power to connect people, enhance productivity and improve quality of life in many respects, it also creates new forms of control and inequality. Surveillance technologies, from facial recognition to data tracking, have created a world where privacy is increasingly compromised and personal freedoms are at risk, much as in North Korea. Social media algorithms manipulate our behaviour and thoughts, promoting social division and spreading misinformation, whilst AI threatens mass unemployment, leaving millions to struggle in a world that no longer needs their work. As these systems become more advanced, the gap between powerful and powerless citizens widens, and social life becomes more fragmented and dehumanised. In this way technology, instead of enhancing us as a society, may lead to a dystopia where individuals are reduced to mere cogs in a vast, impersonal machine.
Is there a way we can stop a future like this from evolving? There are many arguments for and against, and many ways a disaster might be prevented. It is almost impossible to predict what our society will become, and I personally think the only way to truly know is to add up all of the current factors. But even then, we can still only wait and see what the future holds and how we work together as a community to prevent this possibility.
The question of whether we are headed for a dystopian future is complex, and depends on how we navigate the challenges ahead.
Florence Nixey, Second Form, Gascoigne House
Scholar Supervisor: Nancy Christensen, Upper Sixth, Swift House
Editor: Mr Elkin-Jones
Farming began around 10,000 BC, making it the oldest 'proper' profession in the world, and yet not enough people seem to know anything about it in our current society. All around the world there are many different types of farms – from alligator farms in the USA to guinea pig farms in Peru. This article will explore some of the countless problems that farmers in the UK face every day: what the 'Red Tractor' scheme is, the impacts of the new Labour government's policies, and the constant and continuing threat of bluetongue disease. It will uncover the obstacles and many hours of labour that farmers face to make sure we have food on our plates.
In 2024, tuberculosis and bluetongue cases rose. Bluetongue is an infection of an animal's whole body. Symptoms in cattle include not eating, abortion and fever, whereas in sheep they include red skin, lameness and breathing problems (Gov.UK, 2024). Bluetongue is spread by midges, which have increased after the recent humid summer, thus causing a rise in cases. The average number of bluetongue cases in cattle alone is 139 per year; however, between 26 August 2024 and 19 November 2024 alone there were 164 reported cases (NFU, 2024). This represents a 471.94% rise relative to the average for an equivalent period (NFU, 2024). Tuberculosis, or TB, is an infection of the lungs which can be passed through a herd by coughing and sneezing; it cannot be treated in cattle, and it would be very serious if you ate beef that came from a cow with TB. Depending on where the farmers live and how many times a farm has had TB, the farm will have to have yearly vet visits, or visits once every three to six months, to check whether their animals have TB. However, it costs a lot of money to get a vet out to do this. If the farm has an animal with
TB, it goes into isolation, and the farm cannot sell or buy in livestock until it has two clear tests in a row, taken 60 days apart. Another common problem in livestock is mastitis, which can be the downfall of many family dairy farms. Mastitis is caused by bacteria getting into the udder, making the immune system fight back; the resulting swelling is a normal reaction, but it stops milk from coming out. This leaves the farmer short of milk, and they can lose the cow, resulting in a big loss of money (Kawsar, 2020).
When shoppers go to the supermarket in the UK, they might notice a little sticker which says Red Tractor Assured. Red Tractor is a farm assurance scheme to which around one third of farmers belong. Members are required to follow strict rules to sell their food to British supermarkets: for example, only spraying crops at certain times with certain chemicals, or giving animals only certain feeds and no medicines to help them grow (Red Tractor, 2023). Farmers have to pay for Red Tractor. I interviewed a farmer who said that they had two different Red Tractor memberships to pay for, as they farm both beef and arable, and
pay £400 a year for 850 acres of land (Anon, 2024). Yet over 184 other countries have no rules as tight as the UK's, and food produced by British farmers is mixed in with food from countries that have not had to follow these specific rules, then sold in the UK.
Arguably, a new threat to the farming world is the new Labour government and its budget, which could have a big impact on small family farms. Earnings depend on the type of farm – livestock farms earn less than arable farms – but overall a farmer's salary (after expenses) is around £17,300–£39,000, and it is likely to change annually with weather and illness in livestock. Farm property can be – and often is – very expensive, and some farmers will now have to pay a large sum in inheritance tax, charged at 20% of the property's value. The farmer I interviewed said that they had about £4,000,000 worth of property, meaning an inheritance tax bill of £800,000. They do not have this money, as their income has to cover not only rent, food, fuel (for tractors, machinery and cars), water and heating but also things like fertiliser, animal feed (like salt licks with minerals) and expensive vet bills. Inheriting farmland through the generations is how a family farm works, but many farms won't be able to afford it. This may mean selling the farm on, and the problem is that it is unlikely to be bought by another farmer, as has been much discussed in the press of late (Chu, 2024). It is more likely to be bought by a housing developer, or by a rich person or company trying to offset their carbon footprint through the farm's trees and planting.
Living on a farm means not knowing what is coming tomorrow. There are uncertainties, such as a family farm going three generations without any tuberculosis and then getting one positive animal, alongside some repeating certainties, such as beautiful views of the natural world, feeling connected to nature and having a sense of doing something worthwhile for humanity and the planet.
But recently many farmers have had more tough times than good times. They have seen cattle deaths and flooded fields, with crops (and incomes for the year) lost, never to be recovered. For the non-farmer, it is important to imagine how this feels and the cost it has upon one's sense of self-worth. Imagine if you, the layman, lost someone close to you, or saw your newly renovated garden flooded, destroyed and waterlogged, and then lost some of your already low income. One third of UK farmers may be suffering from depression (BBC, 2021). This is not helped by the fact that many farmers live in a bubble and work long hours
each day. The farmer I interviewed worked different hours depending on the season, from an eight-hour day in winter to fourteen hours at harvest. Their social life is limited: they don't go out to see many people, apart from going shooting in the winter, about once a week.
Yet being a farmer has more pluses than just shooting and spectacular views; there are the births of animals and good crop harvests. There is also physical health: farmers tend to have particularly good physical health because they get the exercise of walking everywhere. They tend to know a lot about food and so eat more healthily than much of the population, thanks to their knowledge of eating seasonally – which month is good for which crop (Custard, B, 2022).
How many people look at the back of the food packet they pick up in a supermarket, read the ingredients and see how much real food went into the product, as opposed to chemicals made in a lab? The UK's 109,900 farmers – around 0.02% of the farmers in the world – are trying to feed a consistently expanding number of people. They are also the farmers who must adhere to the strictest rules of any country in Europe, after Belgium (David, 2023). Many farmers stay in the business partly because they do not know much else, but mostly because they feel intimately and existentially connected to the natural world. Farming requires skill and passion: it is essential to maintain our food security, sustain our natural world, and support our sense of cultural identity. Farming is a lifestyle, not just a job, and UK society should value it for all these reasons.
References:
Anon (2024) Bluetongue: news, information and guidance for livestock keepers. [online] GOV.UK. Available at: https://www.gov.uk/government/collections/bluetongue-information-and-guidance-for-livestock-keepers. Accessed 08/12/2024
BBC News (2021) One third of UK farmers could be depressed - survey. BBC News. [online] 14 Oct. Available at: https://www.bbc.co.uk/news/science-environment-58911758.
Chu, B (2024) Farming tax row - BBC Verify on which figures are more reliable and why. BBC News. [online] 22 Nov. Available at: https://www.bbc.co.uk/news/articles/c789yggdxn3o. Accessed 30/01/2025
Custard, B (2022) Guide to the farming calendar: a year in the life of a British farmer [online] www.countryfile.com. Available at: https://www.countryfile.com/wildlife/farming-calendar-a-year-in-the-life-of-a-british-farmer. Accessed 30/01/2025
David (2023) 12 Biggest Farming Countries in Europe [2024 Stats] - Agrolearner.com. [online] agrolearner.com. Available at: https://agrolearner.com/biggest-farming-countries-in-europe/. Accessed 08/12/2024
DEFRA (2014) Bovine TB: how to spot and report the disease. [online] GOV.UK. Available at: https://www.gov.uk/guidance/bovine-tb. Accessed 08/12/24
Kawsar, I (2020) Mastitis in Cows: Causes, Types, Treatment, Prevention and Control [online] The Vet Expert. Available at: https://www.thevetexpert.com/mastitis-in-cows-causes-types-treatment-prevention-and-control/. Accessed 03/12/24
NFU (2021) Bluetongue – the latest info and updates from the NFU. [online] www.nfuonline.com. Available at: https://www.nfuonline.com/updates-and-information/bluetongue-essential-information/. Accessed 03/12/24
Red Tractor (2023) Red Tractor Assured Food Standards. [online] Red Tractor. Available at: https://redtractor.org.uk/. Accessed 08/12/2024
Images:
Alexas_Fotos (2018) Cow Allgäu Beef - Free photo on Pixabay. [online] Pixabay.com. Available at: https://pixabay.com/photos/cow-allg%C3%A4u-beef-pasture-animal-3089207/ [Accessed 23 Jan. 2025].
Lightfoot, A (2020) Flock of sheep Southland, New Zealand. Available at: https://unsplash.com/photos/herd-of-sheep-on-green-grass-field-during-daytime-Pj6fYNRzRT0
he is most likely to blame for thousands of deaths (BMJ, 2011). Before his false claims, measles had almost been eradicated, yet he is the cause of outbreaks of the disease in Europe (BMJ, 2024) and the United States (Time, 2018). Through scandals like these, trust in science is undermined, and these conspiracy theories cause thousands of unnecessary deaths.
Conspiracy theories suggesting that the US government was carrying out secret mind control experiments were widely circulated in the 20th century (Oxford Academic, 2018). In rare cases, conspiracy theories can be hugely beneficial, especially when they turn out to be true. One shocking example is the theory that the US government was attempting to use mind control (Reader's Digest, 2023). For more than a decade, thousands of Americans were unknowingly used as test subjects for psychological warfare (The Week, 2017). These experiments were known as MK-Ultra. During the Cold War, the CIA was convinced that the Soviet Union had developed ways to control people's minds (ATI, 2022). So, from 1953, the CIA carried out a large program of experiments, ranging from electrocution and abuse to giving test subjects large quantities of various drugs, including LSD, all in an attempt to control the minds of the subjects. Tragically, the program left some unwitting participants with permanent psychological damage. The CIA conducted the project away from the public eye, and most of the documents were destroyed in the 1970s when the program came to an end. This program is certainly a contender for one of the largest government programs and cover-ups in American history (ATI, 2022). In 1973, after the Watergate scandal – a series of interlocking political scandals involving US President Nixon (Britannica, 2018) – the CIA director at the time attempted to destroy all MK-Ultra files, to make sure that MK-Ultra would not be discovered. All the files were destroyed except for a small number of misfiled documents. Two years later, in an attempt to disprove any conspiracies linked to the CIA, the President at the time called for an investigation into CIA activities.
However, instead of having its desired effect, this investigation helped to reveal the 8,000 remaining documents about MK-Ultra. So the conspiracy theories of the time regarding the CIA helped greatly in revealing the truth about MK-Ultra, and the horrors that came with it.
Conclusion
In conclusion, conspiracy theories have all sorts of effects. Some do good, such as helping to reveal the horrors of MK-Ultra. But some are truly terrible, causing unnecessary deaths and grief, such as those spread by QAnon. On balance, from the reading I have done, it is clear that the impact of conspiracy theories is overwhelmingly negative. More important than the conspiracies themselves is how we, as a society, choose to deal with them. No good can come from having blind faith in either conspiracy theorists or those in power. Instead, we must use sound judgement and a critical eye to ensure we get to the true heart of the matter. While there are of course positive and negative effects, the negative effects are far too significant to ignore. Yet it is still important to keep a relatively open
mind, and not to brush off all ideas, because sometimes conspiracy theories are proven true (Reader's Digest, 2023). Not all of them are true, though, so it is important not to accept them all – and certainly not to go to the extreme lengths that some have on the strength of a belief with little evidence behind it.
Brotherton, R, French, C C and Pickering, A D (2013) Measuring Belief in Conspiracy Theories: the Generic Conspiracist Beliefs Scale. Frontiers in Psychology, [online] 4(279). doi:https://doi.org/10.3389/fpsyg.2013.00279.
Cahn, L (2023) 12 Conspiracy Theories That Actually Turned Out to Be True. [online] Reader's Digest. Available at: https://www.rd.com/list/conspiracy-theories-that-turned-out-to-be-true/ [Accessed 8 Dec. 2024].
Douglas, K (2015) The Negative Social Impact of Conspiracy Theories. NYTimes.com. [online] Available at: https://www.nytimes.com/roomfordebate/2015/01/04/are-conspiracy-theories-all-bad-17/the-negative-social-impact-of-conspiracy-theories [Accessed 4 Dec. 2024].
Elliot, J (2021) Man Killed His Kids over QAnon 'serpent DNA' conspiracy, U.S. Officials Say. Global News. [online] 12 Aug. Available at: https://globalnews.ca/news/8106455/qanon-surfer-murder-children-serpent-dna-monsters/ [Accessed 4 Dec. 2024].
Ewbank, A (2017) Why Wartime England Thought Carrots Could Give You Night Vision [online] Atlas Obscura. Available at: https://www.atlasobscura.com/articles/carrots-eyesight-world-war-ii-propaganda-england.
Friedman, E (2023) The Carrot Myth That Bizarrely Started from WWII Propaganda [online] The Daily Meal. Available at: https://www.thedailymeal.com/1264816/carrot-myth-see-dark-ww2-propaganda/ [Accessed 4 Dec. 2024].
Fry, W (2021) Santa Barbara surfer dad 'enlightened by QAnon' to kill his kids, feds say. [online] Los Angeles Times. Available at: https://www.latimes.com/california/story/2021-08-11/coleman-charged-mexico-killings-qanon [Accessed 4 Dec. 2024].
Gagnon, A (2023) The Impact of the Wakefield Studies on MMR Vaccine Acceptance. Crossing Borders, 5(2).
Godlee, F, Smith, J and Marcovitch, H (2011) Wakefield's Article Linking MMR Vaccine and Autism Was Fraudulent. BMJ, [online] 342(7788), pp.c7452–c7452. doi:https://doi.org/10.1136/bmj.c7452.
Jolley, D, Marques, M D and Cookson, D (2022) Shining a Spotlight on the Dangerous Consequences of Conspiracy Theories. Current Opinion in Psychology, [online] 47, p.101363. doi:https://doi.org/10.1016/j.copsyc.2022.101363.
MSN (2024) Who Or What Was 4chan And What Did They Do? [online] Msn.com. Available at: https://www.msn.com/en-in/entertainment/bollywood/who-or-what-was-4chan-and-what-did-they-do/ar-BB1l5MRl [Accessed 4 Dec. 2024].
Olmstead, K S (2018) Conspiracy Theories in U.S. History. Conspiracy Theories and the People Who Believe Them [online] pp.285–297. doi:https://doi.org/10.1093/oso/9780190844073.003.0019.
Perlstein, R (2018) Watergate Scandal. In: Encyclopedia Britannica. [online] Available at: https://www.britannica.com/event/Watergate-Scandal [Accessed 8 Dec. 2024].
Quick, J and Larson, H (2018) The vaccine-autism Myth Started 20 Years ago. Here's Why It Still Endures Today. [online] Time. Available at: https://time.com/5175704/andrew-wakefield-vaccine-autism/.
Rutherford, A (2018) MMR Scandal – Impact on Vaccination Rates. [online] BBC. Available at: https://www.bbc.co.uk/programmes/w3cswjk9 [Accessed 8 Dec. 2024].
Serena, K (2022) How the CIA Tried to Harness the Power of Mind Control — Using Massive Amounts of LSD. [online] All That's Interesting. Available at: https://allthatsinteresting.com/mk-ultra [Accessed 8 Dec. 2024].
The Week Staff (2017) MKUltra: inside the CIA's Cold War Mind Control Experiments [online] The Week. Available at: https://theweek.com/86961/mkultra-inside-the-cias-cold-war-mind-control-experiments [Accessed 8 Dec. 2024].
Walker, J (2013) A Brief History of Conspiracy Theories. [online] Theweek.com. Available at: https://theweek.com/articles/459843/brief-history-conspiracy-theories [Accessed 4 Dec. 2024].
Wendling, M (2020) QAnon: What Is It and Where Did It Come from? BBC News. [online] 20 Aug. Available at: https://www.bbc.co.uk/news/53498434 [Accessed 4 Dec. 2024].
Wilkinson, E (2024) Measles outbreaks: Investing in Patient Relationships through GP Continuity Will Be Key to Boosting MMR Confidence. The BMJ, 384, pp.q221–q221. doi:https://doi.org/10.1136/bmj.q221.
Wikipedia Contributors (2019) Conspiracy Theory. [online] Wikipedia. Available at: https://en.wikipedia.org/wiki/Conspiracy_theory [Accessed 4 Dec. 2024].
Wright-Mendoza, J (2018) The Man Who Invented Modern Infection Control. [online] JSTOR Daily. Available at: https://daily.jstor.org/the-man-who-invented-moderninfection-control/ [Accessed 4 Dec. 2024].
Images:
Reis, R (2020) Carrots. Available at: https://unsplash.com/photos/orange-carrots-on-human-hand-ZgDHMMd72I8
Joe Norman, Lower Sixth, Feilden House
Editor: Miss Hutchinson
In the past, communism has been portrayed as intrinsically flawed: it has been responsible for millions of deaths and has rarely succeeded. Nevertheless, communism believes in something that too few strive towards.
Communism is a socio-political, philosophical and economic ideology. It seeks to establish a classless society in which the means of production are owned communally and each individual contributes and receives according to their abilities and needs. Karl Marx and Friedrich Engels – both German philosophers of the 1800s – set out communism in its first form in their 1848 pamphlet The Communist Manifesto. Their work highlighted the struggle between the bourgeoisie (the capitalist class) and the proletariat (the working class), arguing that the deep-seated inequalities of capitalism would ultimately lead to its downfall and the rise of a communist society. Marx and Engels envisioned a revolutionary movement that would overthrow capitalist systems, leading to a dictatorship of the proletariat as a transitional phase before achieving a stateless, classless society (Marx and Engels, 1848).
Throughout the nineteenth and twentieth centuries, various interpretations and adaptations of communism emerged, most notably in the Soviet Union under Vladimir Lenin and later Joseph Stalin, who implemented a form of state socialism that diverged from Marx's original ideas. The Bolshevik Revolution of 1917 marked a significant moment in history, as it saw the first communist government established; this government sought to eliminate private property and redistribute wealth. However, the implementation of communism in the Soviet Union met with a number of challenges, including economic difficulties and political repression, leading to widespread criticism of its effectiveness and morality.
Other countries, such as China under Mao Zedong, also adopted communism. The result was significant social and economic transformation, often accompanied by authoritarian regimes and human rights abuses. The Cold War period saw the division of the world into capitalist and communist blocs, with the United States and its allies opposing the spread of communism, leading to various conflicts and tensions. The fall of the Soviet Union in 1991 marked a turning point, leading to the decline of traditional communist movements in many parts of the world.
As of today, whilst classical communism as envisioned by Marx has largely waned, various socialist movements and ideologies continue to advocate for social ownership and egalitarian principles, reflecting the ongoing debate about the role of capitalism and the pursuit of social justice in contemporary society.
To conclude, communism is a left-wing ideology with a utilitarian focus that aims to provide opportunity for people from all walks of life.
Communism has attracted plenty of praise in its history. Most significant of all is its focus on equality and the elimination of class distinctions. By advocating communal ownership of the means of production, communism aims to eradicate the wealth gap that exists in capitalist societies. This principle was notably articulated by Karl Marx in The Communist Manifesto, where he argued that capitalism inherently leads to class struggles and social inequalities (Marx and Engels, 1848). In principle, by removing the capitalist structures through which individuals can accrue wealth and power at the expense of others, the population would have access to the same resources and therefore the same fighting chance. It is argued that this would lead to more equitable societies and to a decrease in marginalisation on the grounds of socio-economic status.
Communism also maintains a sharp emphasis on collective welfare and social responsibility. As aforementioned, communism promotes a structure in which the needs of the community are prioritised over individual profit; the intended impact is improved access to essential services such as healthcare, education and housing, all of which the state aims to provide for the population. For instance, during the early years of the Soviet Union, significant investments were made in education and healthcare, resulting in increased literacy rates and improved public health outcomes (Harrison, 1998). This spending is an example of positive top-down development: in theory, such investment from a small yet powerful government into the larger community would create a stable platform for growth. Ultimately, this casts communism in a positive light.
Furthermore, communism encourages a sense of solidarity among individuals. Fostering a community-oriented mindset and taking a collective approach to building society can strengthen social bonds and promote cooperation as people work together towards common goals. In many communist societies, this has manifested in various forms of community organisation and mutual aid, where citizens support one another in times of need. The emphasis on collective action can also lead to greater political engagement, as individuals become more involved in decision-making processes that affect their lives. For example, in the south-eastern Fujian province of China, 3,000 rural dwellings known as 'tulou' face abandonment and deterioration. A local, Lin Luson, believes in the preservation of his 'Hakka' heritage and therefore heads a non-profit organisation by the name of 'Mei He'. He works on several projects involving all-female community groups within the tulou, providing education and other opportunities, and aids them in leading 'restoration, maintenance and ultimately the renewed success' of the tulou (Ashbridge, 2018). This is prime evidence that communism conceives a strong sense of community and nurtures solidarity among individuals.
Additionally, communism has the potential to drive economic planning and development in a way that prioritises long-term societal goals over short-term profits. Central planning can allow for more coordinated efforts in addressing issues such as environmental sustainability and resource allocation. For example, in Cuba, the government has implemented various sustainable agricultural practices that prioritise food security and environmental health,
demonstrating how a planned economy can respond to pressing global challenges (Damián, 2000).
In conclusion, while communism has its flaws its core principles of equality, collective welfare, social solidarity and long-term planning present compelling arguments for its potential benefits.
I would like to start by addressing what is quite unavoidably the most defining negative characteristic of communism: its tendency to lead to authoritarian regimes. In practice, many communist governments have centralised power in the hands of a few leaders or a single party, undermining democratic processes and individual freedoms. Historical examples, such as the Soviet Union under Joseph Stalin, illustrate how communism can result in oppressive state control, where dissent is not tolerated and political opposition is often met with violence or imprisonment. This authoritarian nature can stifle creativity, innovation and personal expression, developing societies in which conformity is enforced and individual rights are severely restricted.
Another major criticism of communism is its economic inefficiency. Let’s turn our attention to the central planning model, a hallmark of communist economies, in which the government controls the means of production and runs the economy with the population's interests in mind. The central planning model often fails to respond effectively to the needs and wants of consumers. In fact, without the price signals provided by a free market, central planners
may struggle to allocate resources efficiently, which consequently leads to shortages of goods and services. For instance, in the Soviet Union, the emphasis on heavy industry often came at the expense of consumer goods, resulting in a lack of basic necessities for the population (Gaddis, 2005). This inefficiency can stifle economic growth and innovation, as seen in various communist states where the economy stagnated or regressed due to mismanagement and lack of competition.
Furthermore, communism has been associated with significant human rights violations. The pursuit of a classless society has often led to violent purges, forced labour camps and mass executions. Mao Zedong's 'Great Leap Forward' in China, for example, aimed to rapidly industrialise the country but resulted in widespread famine and the deaths of millions. Such failures highlight the dark side of communism, where the state prioritises ideological goals over the well-being of its citizens (Glenn K. and Jennifer L, 2018). The disregard for human life and dignity in the name of achieving a communist utopia raises ethical concerns about the morality of such an ideology.
Communism can also create a culture of dependency on the state. In a system where the government controls all means of production and distribution, individuals may become reliant on state provision for their livelihoods. This dependency can discourage personal initiative and entrepreneurship, especially because people may feel disincentivised from working hard or innovating when the fruits of their labour are redistributed by the state. The lack of personal responsibility can lead to a stagnation of society, where individuals do not strive for improvement or progress, ultimately hindering overall development.
Additionally, the international implications of communism can lead to geopolitical tensions and conflicts. The spread of communism has historically been met with resistance from capitalist countries, resulting in ideological wars and military confrontations. The Cold War is a prime example, where the ideological battle between communism and capitalism led to numerous proxy wars, such as those
in Vietnam and Korea. These conflicts not only resulted in significant loss of life but also strained international relations, creating a legacy of mistrust and division that survives in some regions today.
The environmental impact of communist regimes also deserves mention. In the pursuit of rapid industrialisation and economic growth, many communist governments have neglected environmental concerns and contributed to major environmental degradation. The focus on meeting production quotas often results in unsustainable practices, such as deforestation, pollution, and depletion of natural resources. For example, 'China’s carbon dioxide emissions in 2020 were more than that of the United States, the European Union, and India combined' (IER, 2022).
The environmental toll of such policies has had, and will continue to have, lasting consequences, affecting public health and the quality of life of future generations. Nevertheless, this is not to say that the same has not been witnessed in nations across the globe.
In conclusion, while communism may present an appealing vision of a classless society, the practical implementation of this ideology has often led to numerous negative outcomes, from authoritarianism and economic inefficiency to human rights violations and environmental degradation. The historical record of communism reveals a complex and often troubling legacy. Perhaps understanding the lessons of the past can help prevent the repetition of such mistakes and potentially even foster more just and equitable societies in the future.
Recently, I was lucky enough to attend a school history trip to China. On our travels we visited Beijing, the capital; Pingyao, a historical walled city filled with lively markets and culture; and finally Xi'an, the quieter former capital and neighbour to the Terracotta Warriors. Whilst, truthfully, it can be hard as a tourist to get a real understanding of the intricate workings and characteristics of a society, I would like to try to describe as best I can what I noticed and experienced during our visit.
China is very clearly a country with a rich history, fantastic traditions and a beautiful, vibrant yet tasteful culture. I believe that it has remained so lush and unique as a result of its technological, geographical and political separation from the Western world. Social media in China is disconnected from that of the West, and internet access is restricted. Whilst this may appear solely negative and an infringement on the people's liberty, there are clear upsides: China is protected from cultural diffusion from the Western world and has been able to preserve its non-physical heritage.
Whilst living conditions in China have improved over the last few decades, the gap between the impoverished and the wealthy remains extreme, and working conditions for the average worker are poor. People are expected to work long hours and take only short breaks for holidays. A prime example was our tour guide Tsun, a man who clearly worked incredibly hard and knew everything there was to know about the history of Beijing. Tsun rose every morning at 4.00am to cycle to our hotel, where he would join the coach. He would walk miles touring us round various attractions during the day and, once finished, cycle home, only to go back out a little later to collect his children from school. Tsun was a brilliant example of the demanding work culture that exists in China.
It is fair to say that I thoroughly enjoyed my time in China, especially experiencing a culture so different to our own, one protected from the homogenisation of the Western world. Yet it would be wrong of me to ignore the negative aspects of life in China, above all in its cities. China is a beautiful place, but not one without fault.
A Conclusion on Communism
Communism as an ideology evidently exists with the best of intentions, specifically in its aim to provide a stable livelihood and equal opportunities for everybody living under it. However, when put into practice, communism appears to have inherent flaws. Its tendency to funnel power into such a small group of people creates ideal conditions for the formation of dictatorships and the mismanagement of economies. Furthermore, the central planning model, a key characteristic, illustrates communism's stifling grip on innovation and competition. Overall, I believe that communism marks a step in the right direction, in that it prioritises the greater good, but in practice it is cruel and the opposite of incentivising. Communism does not work. What it can do, however, is drive us towards a society in which everybody is considered.
References:
Ashbridge, J (2018) Discovering Community-led Design in China. Available at: https://creativeconomy.britishcouncil.org/blog/18/09/10/community-led-design-china/. Date accessed: 20/11/24
Damián, F (2010) Cuba and the Politics of Passion: A Cuban Perspective on the Cuban Revolution. Available at: https://www.jstor.org/stable/10.7560/725195. Date accessed: 23/11/24
Gaddis, J L (2006) The Cold War: A New History. Penguin Press.
Institute for Energy Research, IER (2022) China's Carbon Dioxide Emissions Are More Than Twice Those of the US. Available at: https://www.instituteforenergyresearch.org/international-issues/chinas-carbon-dioxide-emissions-more-than-twice-those-of-the-u-s/. Date accessed: 07/12/24
Lenin, V (1918) The State and Revolution. Available at: https://www.marxists.org/archive/lenin/works/1917/staterev/. Date accessed: 29/11/24
Llewellyn, J and Kucha, G (2018) Alpha History: The Great Leap Forward. Available at: https://alphahistory.com/chineserevolution/great-leap-forward/. Date accessed: 07/12/24
Marx, K and Engels, F (1848) The Communist Manifesto. Available at: https://www.marxists.org/admin/books/manifesto/Manifesto.pdf. Date accessed: 24/11/24
Stalin, J (1952) Economic Problems of Socialism in the USSR. Available at: https://archive.cpgb-ml.org/download/publications/stalin_economic_problems_of_socialism_in_the_ussr.pdf. Date accessed: 24/11/24
Images:
Kiryl (2019) The great dictator. Available at: https://unsplash.com/photos/man-wearing-blue-shirt-photo-on-wall-wb3j8sV5scM
Studzinski, M (2022) A red button sitting on top of a pile of money. Available at: https://unsplash.com/photos/a-red-button-sitting-on-top-of-a-pile-of-money-XtFPP4RXuxg
Tang, L (2019) People at Forbidden City in China during daytime. Available at: https://unsplash.com/photos/people-at-forbidden-city-in-china-during-daytime-yBroAF1cN3I
William Hansen, Lower Sixth, Queen Anne House Strickland Scholar
Editor: Miss Hutchinson
Freedom of speech is a cornerstone of modern democratic societies, from the First Amendment of the US Constitution to Article Ten of the 1950 European Convention on Human Rights. However, the concept is intrinsically intertwined with the notion of limits. The origins of free speech are rooted in the pioneers of human thought and philosophy: the Ancient Greeks. Dating back to the fifth century BC, the Greek concept of 'parrhesia', meaning 'to speak candidly', has remained paramount throughout the two and a half thousand years that followed. In Ancient Greece, the ability to speak freely and openly about political matters was essential to the democratic society through which much human progress was made (Berti, E, 1978). In the United Kingdom today, 'the right to freedom of expression' stands as the latest reform to free speech, enshrined in Article 10 of the 1998 Human Rights Act. Despite this, the notion of limiting freedom of speech is also recognised in British law: section eighteen of the Public Order Act 1986 criminalises the use of threatening, insulting or abusive words or behaviour, or the display of written material which is threatening (Lawble, 2021). This limiting of a constitutional right is a topic of much debate, where the wish to express one's views freely clashes directly with attempts to prevent the spread of hate. The contentious question is this: is freedom of speech, a federal right in America and a human right in Europe, being excessively limited in the modern day? I believe the resolution to so ambiguous a concept is not to restrict information and expression further, but to invest in education about equality, reducing the spread of hate at its source rather than stifling debate over differing opinions.
The concept of limiting freedom of speech has been a matter of philosophical debate for centuries, not merely a topic of conversation in the era of social media. One philosopher who discussed the theory behind limiting freedom of speech in depth was John Stuart Mill, in the nineteenth century. Mill's thoughts stem from his belief, set out in his essay On Liberty, that there should be no attempt to 'control the expression of opinion'. He makes an interesting distinction: any expression which could be suppressed is either true, false or somewhat true. He believed that even if an expression were false or only somewhat true, its assertion would lead to debate and, when truth prevails in such a debate, greater understanding would be reached (Bell, M C, 2020). In Mill's mind, whatever the expression, it is of value in finding and maintaining the truth, and so should not be prevented. Mill did, however, assert that there should be a limit to freedom of expression, despite his wish for opinions to remain wholly without boundary. In On Liberty, Mill refers to the 'harm principle' as the cut-off for freedom of expression: he thought the prevention of harm to be the sole legitimate reason to restrict it. The limitation of Mill's principle, however, lies in what counts as harm. Mill acknowledges the ambiguity of the term but gives some clear examples: physical injury, loss of property and restriction of movement are unequivocal harms, and expression or speech that directly leads to them should be prevented. A further question when using harm as the sole test for restricting speech is justifiability. If someone suffers harm from, say, being turned down for a competitive job, this harm is, from Mill's perspective, justifiable, as it is for the betterment of society (Waldron, A, 2024).
This highlights the ambiguity of Mill's principle, which is heightened further in the 21st century, where a social media post can be perceived in so many different ways: as harmful, as offensive, or simply as an opinion.
Limiting Freedom of Speech in the Modern Day
Limiting freedom of speech that is threatening or abusive (as stated in section eighteen of the Public Order Act) is clearly a necessity, and something that has existed within society for as long as the concept itself. This concurs with Mill's views, where threatening behaviour and abuse fall under the umbrella of the harm principle. More recently, 'cancel culture', a term for removing someone's platform to speak, mostly online, because they have breached the limits of freedom of speech, has become a prominent method of limiting such expression. This method certainly has merits. For example, cancel culture forces people to be accountable for their actions, an extremely important notion given the ease with which people spread views on social media. It is effective when someone with a sizeable audience expresses violent views that do in fact incite violence (Thomas, Z, 2020). An example is the internet influencer Andrew Tate (who, according to The Independent in 2023, was better known among teenage boys than the then Prime Minister Rishi Sunak). Following numerous examples of misogyny and harm-inciting comments, and owing to his colossal audience, Andrew Tate was cancelled and banned from many social media platforms (Wilson, B, 2022). This is the intended impact of cancel culture and its positive use in limiting hate speech: someone who previously had a large platform from which to indoctrinate and preach has lost much of his power, and his hateful comments have thus been suppressed. However, social media increasingly permits people to express extreme opinions with little fear of consequence, and alongside this sits growing self-censorship. One survey asked Americans of different political views whether 'the political climate prevents them from saying things they believe'. The percentage who agreed in 2017 was 58%; this rose to 62% in 2020 (Ekins, E, 2022). That such a large percentage of the country feels unable to conduct discourse freely, even on political opinions they hold strongly, is a cause for concern. This decline in the exchange of ideas fundamentally harms a country's democracy, which thrives on diversity of opinion for its development.
This statistic is even more striking when considering the value freedom of speech holds in America. Since the First Amendment in 1791, freedom of speech has been central to American society as a pillar of the country's values, so why does such a large proportion of the country not feel able to exercise this right? Certainly, with the growth of cancel culture, many people fear for their employment. In the same study, 32% of people feared how their employer would react to some of their political
views, preventing them from voicing their own opinions. Such an infringement of a constitutional right suggests that cancel culture has perhaps gone too far in limiting freedom of speech. An example of excessive limiting dates from 2012, when Adam Smith, a married man with children, attended a peaceful protest outside the restaurant chain Chick-fil-A after its CEO spoke out against gay marriage. Smith filmed himself questioning the values of a worker who would associate with such homophobia, admitting afterwards that he had become rather enraged, and posted the video on YouTube without thinking fully about the consequences. Overnight, the video blew up, and emails carrying death and bomb threats were sent to the company he worked for. He was immediately discharged and left without an income, or an opportunity to apologise. Throughout the following year, Smith applied for jobs and was repeatedly declined once the video and the immense backlash were discovered. Smith was running out of money and admitted to feeling suicidal, unable to feed his
four children and wife. However extreme a scenario, this is an example of how punishing someone for their freedom of expression leads to further limiting of free speech, as others withdraw from speaking their minds for fear of backlash. The survey cited earlier makes this issue clear, and it is central to the question of limiting freedom of speech. There is a real problem when such a proportion of the population feels scared to voice its political views, but the resolution is not to stifle these conversations. Instead, wider education would promote political discussion and remove the fear of harming others or losing employment.
When discussing how, or whether, UK and United States law on freedom of speech should change, it is helpful to consider the strategies of other world powers, particularly countries with a political set-up very different from that of many Western nations. An example is China. Unlike the UK, China has incredibly strict laws on freedom of expression. One is the criminalisation of political dissent: anyone who questions the authority of the Chinese Communist Party faces criminal charges, leading to extreme self-censorship and a highly controlled public discourse. Moreover, China's radical attempts to control social media and the views expressed online involve the employment of 30,000 internet police to maintain an internet 'firewall'. This extreme limiting of free speech causes problems of its own. The right to expression fundamental to Western democracies is clearly absent, and the Chinese population has lost its agency to influence, or even choose, a leader. It seems too extreme a solution to the problem of hate speech, particularly as this excessive control stemmed from the irritation of Mao Zedong, former leader of the Chinese Communist Party, at the perceived failure of the 'Hundred Flowers Movement', a campaign launched in 1957 to allow China's intellectuals to criticise the government publicly. The silencing that followed, in which over half a million people were identified and purged in Mao's 'Anti-Rightist Campaign', effectively ended public discourse with the government. This has numerous consequences, one being an ill-informed society with little knowledge of its country's politics. Furthermore, freedom of speech between government and public allows people to check the power of government and so avoid dictatorship or exploitation.
Unfortunately, where this is restricted, governments can become too powerful and controlling, with no fear of political backlash. The positives are that extremist views are restricted and people express less hate towards one another on social media, perhaps leading to less harm arising from public discourse. On the other hand, because information is prevented from reaching the public and the power of government cannot be checked, the Chinese system of limiting freedom of speech appears too controlling, and its negatives outweigh its positives.
The current UK and US laws are ambiguous, and made more so by the struggle to identify the actual harm or impact of social media posts in the modern day. There are many questions over how breaches of these laws should be punished, and whether opinions themselves, as John Stuart Mill argued, should be punished at all. There are of course arguments in favour of stronger limitation: in cases such as China, strong censorship does prevent hate spreading online within society, yet such limitations carry inexcusable detriments, such as an ill-informed society that could fall into dictatorship. For the UK and US, freedom of speech will remain a philosophical debate, but in an age of such discourse and communication between those in power and the rest of society, the encouragement of freedom of expression appears necessary. The way to resolve hate and incited harm is investment in education about equality, not the restriction of freedom of opinion.
References:
Bell, M C (2020) John Stuart Mill's Harm Principle and Free Speech: Expanding the Notion of Harm. Utilitas, [online] 33(2), pp.1–18. doi:https://doi.org/10.1017/s0953820820000229.
Berti, E (1978) Ancient Greek Dialectic as Expression of Freedom of Thought and Speech. Journal of the History of Ideas, 39(3), p.347. doi:https://doi.org/10.2307/2709382.
Cunningham, A (2024) Politician's wife Lucy Connolly jailed for race hate post. BBC News. [online] 17 Oct. Available at: https://www.bbc.co.uk/news/articles/cp3wkzgpjxvo.
Ekins, E (2022) Poll: 62% of Americans Say They Have Political Views They're Afraid to Share. [online] Cato.org. Available at: https://www.cato.org/survey-reports/poll-62-americans-say-they-have-political-views-theyre-afraid-share.
Gibson, J L and Sutherland, J L (2023) Keeping Your Mouth Shut: Spiraling Self-Censorship in the United States. Political Science Quarterly, [online] 138(3), pp.361–376. doi:https://doi.org/10.1093/psquar/qqad037.
Lawble (2021) What is Freedom of Speech in the UK? | Lawble. [online] Lawble. Available at: https://www.lawble.co.uk/freedom-of-speech/.
Thomas, Z (2020) What is the cost of 'cancel culture'? BBC News. [online] 8 Oct. Available at: https://www.bbc.co.uk/news/business-54374824.
Waldron, A (2024) Unreasonable | On Liberty. [online] Unreasonable.blog. Available at: https://unreasonable.blog/liberty.html. Date accessed: 9/12/2024.
Wilson, B (2022) Andrew Tate's been banned from social media. But his harmful content still reaches young men. [online] CBC. Available at: https://www.cbc.ca/news/entertainment/andrew-tate-social-media-bans-1.6573978.
Citak, B (2024) A person holding a smart phone with social media on the screen. Available at: https://unsplash.com/photos/a-person-holding-a-smart-phone-with-social-media-on-the-screen-0cpyFsSUiSc
Malhotra, G (2021) Power to the People. Available at: https://unsplash.com/photos/grayscale-photo-of-woman-in-black-shirt-holding-flag-QTEk16LzWSI