

MIND’S EYE

“You learn to expect nothing. You get nothing. You start off in school and they take your soul away. They take your brains away. You’re not allowed to have an opinion that differs from theirs.”
Johnny Rotten, The Sex Pistols
IMAGE: The Sex Pistols performing in Amsterdam in January 1977.








Editorials
This year we’ve had a huge number of writers, but what is more impressive is the quality of their work. It is brilliant. I’ve had the pleasure of becoming colleagues with my peers, a privilege to be sure, learning about things I never knew they were interested in. To read an article that is clearly written and well researched is one thing, but to feel such eagerness in the words is another matter entirely: you can feel their passion, and passion is what created the magazine this year. The writers have created their own worlds on every page, and beautifully so.
My own delight in the project has energised me, and I’ve been reading every night to keep up with the knowledge that our contributors display so effortlessly in these pages, and I take great pride in what we have built together. I hope these powerful articles show that my generation is worthy of a future full of opportunities, and that what they have written will inspire others to follow their example.
Lily Robertson Editor
If you want to know what’s going on in the world, ask a teenager. In the following pages, two young writers lay into smartphones with a new voice of disgust, accusing the iPhone of wrecking social skills and damaging mental health. A third attacks social media, warning her peers not to compare themselves to influencers. Healthy hearts and minds are the chief concern of four other writers, who share expertise on sleep and on ADHD, which is real for both adults and children; one brave girl with severe concussion uses her experience to warn others about the damage it can cause. And one more sickness is under the lamp when our young editor writes about love. The health of other animals and the planet feature, too, when one writer laments the inhumane treatment of livestock in factory farms, and a second explores the link between the food industry and climate change: wiser eating can arrest global warming. A third unpacks the amazing virtues of an unsung foodstuff: seaweed. Several pieces muse on the mazes of the mind: a punchy study of normalcy bias, a brilliant unpacking of the iconic trolley problem, a tribute to Edward de Bono and his lateral thinking, and a fascinating look at whether serial killers are born or made.
Rome features twice, when we look into the city’s transition from Republic to Imperium, and when the late Pope Francis is given a less generous send-off than the one his followers enjoyed back in April. Ancient Greece also appears in a history of the Olympic Games and a poignant celebration of the Olympic spirit. Turbulent modern politics appear too: one writer compares the cost of UK defence with foreign aid, a tricky one, and elsewhere the absurd nightmare of trying to build houses is revealed. And we learn that on balance the world is better off than it’s ever been.
If that lot is not adequate food for thought, you can always tuck into pieces showing why hair is important to black people; how three brilliant scientists were robbed of recognition because they were women; why Welsh and Punk are cooler than ever; and why works of art can never be a private matter. Oh, and there’s a rather good history of submarines. Enjoy! Thanks to all of our contributors and the amazing Lily Robertson, who made this magazine happen.
Mr Anthony Lyons Editor
www.mindseyemagazine.com

“TO READ AN ARTICLE THAT IS CLEARLY WRITTEN AND WELL RESEARCHED IS ONE THING, BUT TO FEEL SUCH EAGERNESS IN THE WORDS IS ANOTHER MATTER ENTIRELY.


ONE WRITER LAMENTS THE INHUMANE TREATMENT OF LIVESTOCK IN FACTORY FARMS, AND A SECOND EXPLORES THE LINK BETWEEN A THOUGHTLESS FOOD INDUSTRY AND CLIMATE CHANGE.


AILEEN CHAN says the meat industry doesn’t need us to be cruel, just indifferent.
In 2017, survey results showed that 22% of the UK public did not know that bacon comes from pigs. While I doubt the legitimacy of this statistic, and I believe none of our bright readers is part of that percentage, I don’t think we question enough how the meat on our plate is produced, or wonder how the animal was treated before its death.
“I DON’T THINK WE QUESTION ENOUGH HOW THE MEAT ON OUR PLATE IS PRODUCED, OR WONDER HOW THE ANIMAL WAS TREATED BEFORE ITS DEATH.
This raises a bigger question: how have we become so indifferent to the animals we eat, particularly factory-farmed animals? Currently, more than 70% of the world’s meat comes from factory farms, an intensive system developed to meet the ever-growing demand for meat by producing as much meat as possible in the least amount of time, with little regard for the animals’ physical and mental wellbeing.
With the neat packaging of meat in supermarkets we often forget that it’s animals we are consuming. Take meat chickens, known in the industry as broilers. Over the course of a century, broilers have been selectively bred to feed excessively and grow quickly, reaching in just six weeks the 2.2kg at which they are slaughtered. If this growth rate were applied to human babies, they would reach an astonishing 272kg within two months. The chickens are unable to withstand the weight of their own bodies, which often causes lameness, and they are tightly packed
indoors, with chicken sheds housing 25,000 to 50,000 chickens each. Litter is often not replaced during the chickens’ stay in the shed, and becomes soaked in ammonia from the birds’ droppings, a corrosive chemical that causes ulcers and even blindness. These two conditions of the shed often act on each other: when the chickens are unable to withstand their own weight, they find it hard to move, if they can move at all in the overpacked space. They sit down to reduce pressure from their legs and ammonia in the litter burns their skin. With so many chickens in the shed, workers are unlikely to assess every single one of them, leaving many unattended until their death.
So how did we become so indifferent to the suffering of these factory-farmed animals? Well, the design of the meat packaging deliberately separates in our minds the flesh of an animal from the animal itself, making it harder for us to consider how the animals have been raised. Another reason is our lack of knowledge. For example, we use the names of animals to insult one another in our daily lives (especially in our early childhood) based on our vague perceptions: chicken = cowardly; pig = dirty; mule = stupid, etc. Contrary to popular belief, pigs are hygienic animals: when given enough space they clearly separate their feeding and defecation locations. And some think chickens rival dogs for intelligence. Such negative connotations and stereotypes may result in our lack of interest in animals and distort our beliefs.

“THE DESIGN OF THE MEAT PACKAGING DELIBERATELY SEPARATES IN OUR MINDS THE FLESH OF AN ANIMAL FROM THE ANIMAL ITSELF.
They sit down to reduce pressure from their legs and ammonia in the litter burns their skin.
Thankfully, our attitudes towards animals are changing for the better. Stricter laws protecting animals are being passed, and we are placing more and more emphasis on understanding the welfare of our animals to reduce suffering. This specific branch of science, Animal Welfare Science, combines behavioural studies with the animal’s physiology, aiming to identify and objectively quantify the wellbeing of an animal. For example, research has shown that domestic animals have self-motivated behaviours, actions that have become engrained in their genes as part of the evolution process.

Self-motivated behaviours, if not achieved, cause cortisol levels to rise. This shows that animals in these settings not only need food and shelter; they must also have an opportunity to act on these natural behaviours to ensure their well-being. For chickens, some of these behaviours include perching, dustbathing and exploring.

They must also have an opportunity to act on these natural behaviours to ensure well-being.
Although I have mentioned only meat chickens here, their treatment is typical of the environment in factory farms. Most if not all of the animal products you find in supermarkets are produced by animals kept in similar conditions. Animals are not the only victims of the factory-farming industry: factory farmers in the UK and in other countries are often in financial distress, and are left with no choice but to house their livestock in cruel conditions to make a living.

We, as customers, can help alleviate the amount of suffering in this industry by what we buy. Although meat from uncertified farms is often more affordable, we must recognise that our every contribution to these uncertified farms results in more animals suffering atrocious conditions.

We must recognise that our every contribution to these uncertified farms results in more animals suffering atrocious conditions.
While I don’t believe it’s necessary for us all to adopt a vegan diet, moderating our meat consumption and purchasing meat from certified farms will ensure that the animals we consume are at least treated with dignity and respect while they are alive. ¢


“WE, AS CUSTOMERS, CAN HELP ALLEVIATE THE AMOUNT OF SUFFERING IN THIS INDUSTRY BY WHAT WE BUY.
Crowning
TREASURE AMADI celebrates the boundless stylistic possibilities of black hair.
As in many cultures, hair holds a special significance for those with African or Caribbean heritage. However, unlike in many other cultures, hair to black people is more than just beauty – it is social capital and can symbolise power, wealth and spirituality.
Generally, black people have thicker, coarser and curlier hair than people of other races. Their hair texture tends to fall between 3c and 4c under the widely used Andre Walker Hair Typing system.
HAIR TO BLACK PEOPLE IS MORE THAN JUST BEAUTY – IT IS SOCIAL CAPITAL AND CAN SYMBOLISE POWER, WEALTH AND SPIRITUALITY.
Hair that is curlier absorbs less moisture and tight curls prevent the even distribution of oils secreted from the scalp, making it more brittle and prone to breakage. Ergo, black people, especially women, usually wear their hair in protective styles. Protective hairstyles include braids to which human or synthetic hair is often added to make the wearer’s hair look longer (called attachment), wigs, and weaving, which is braiding of natural hair flat down on the head.
An example of this is dreadlocks, dreads, or locs, a hairstyle that has been prevalent in cultures across the world for centuries. Dreadlocks either form naturally, through hair matting in strands, or they are intentionally twisted to form the style. Despite their further popularisation by rap and hip-hop artists, locking one’s hair has been a significant cultural and religious practice for thousands of years and across several continents.
In the past 100 years, dreads have also been commonly associated with Rastafarians, who lock their hair to symbolise the strength of their connection with God, a practice they see as commanded in the Bible. More recently, rap, hip-hop, and R’n’B artists wearing dreads have made them increasingly popular with younger people.
There are also traditional head coverings across Africa. Some of these include gele, elaborate head ties worn by women in the Yoruba culture in West Africa for special events, and doek, hair coverings worn by women in South Africa for everyday wear.




Getting hairstyles done is often a whole-day process and hair generally needs consistent upkeep.
Despite the advantages of protective styles, they can come at a great cost. Getting hairstyles done is often a whole-day process and hair generally needs consistent upkeep – braids typically get redone every month or two, and many people get their dreadlocks re-twisted to keep them looking ‘fresh’. What’s more, many neglect their natural hair, caring only about the style rather than the health of their hair.
On a practical level, protective styles help to prevent breakage. However, there are multiple reasons why one might choose to wear a protective style. Some find their hair too hard to manage in its natural state. In truth, Type 4 hair is easily knotted and can be painful to manage if not treated properly. Others fear that their natural hair is not presentable enough, or that having other hairstyles makes them more attractive.

Braids typically get redone every month or two, and many people get their dreadlocks re-twisted to keep them looking ‘fresh’.
There is a certain stigma about natural hair from within the black community and from other races. This stigma is especially felt by women because there is already so much value placed on appearance. Some black people feel like their hair should always be done so that they look ‘presentable’. Further, black women’s natural hair is regularly seen as exotic and unprofessional, whereas black men often get this reaction with styles like cornrows or dreads. In this regard, the effects of colonialism are still felt strongly within the black community. Many regard straight, long hair as the more desirable and feminine option for women, and short hair as the proper option for men, disregarding the beauty and cultural significance of traditional hairstyles or the afro.
In practice, many black women resort to wearing wigs, although it may seem unusual to some. This is because some see it as easier in many environments to assimilate with those of other races, rather than deal with unnecessary comments or a feeling of ‘otherness’.
“ON A PRACTICAL LEVEL, PROTECTIVE STYLES HELP TO PREVENT BREAKAGE.

It is unfortunately too common for black people to have their hair touched or for people to ask inappropriate questions about black people’s hair. Although it is good that people are curious about other cultures, black people are not innate political activists, and their hair is not a source of education to the unaware.
Despite these awkward situations, the downsides of how black hair is perceived remain second to its advantages. The beauty of black hair, in my opinion, comes in the choice it gives you. There is no one way to wear black hair: long or short, tight, loose or with no curls, braids or twists. Its versatility means that you can have hair cascading down your back one day, and the next you could have hair that ‘defies gravity’. It is an extension of yourself and a palette for personalities to shine in a way that is unique to black culture. Black hair needs to not only be celebrated but cared for by those who have it. ¢

Its versatility means that you can have hair cascading down your back one day, and the next you could have hair that ‘defies gravity’.


“IT IS AN EXTENSION OF YOURSELF AND A PALETTE FOR PERSONALITIES TO SHINE IN A WAY THAT IS UNIQUE TO BLACK CULTURE.


ULIANA EROSHINA celebrates the virtues of humble and under-rated seaweed.
When you think of seaweed, images of sushi rolls or slimy green strands washed up on the beach might come to mind. It may not seem like much more than an ocean nuisance or a culinary novelty, and European culture certainly doesn’t value seaweed very highly. Indeed, Virgil stated in the Aeneid, ‘nihil vilius alga’: there is nothing more worthless than seaweed. And Aristotle placed seaweed last in the Great Chain of Being.
But what if this humble marine plant could tackle some of the world’s most pressing challenges? From combating climate change and addressing food insecurity to cleaning up pollution and replacing plastics, seaweed emerges as a versatile and transformative resource. It’s a quiet yet powerful solution that could redefine sustainability and reshape our future.
“EDUCATING CONSUMERS ABOUT ITS BENEFITS, BOTH AS A FOOD SOURCE AND AS A MATERIAL FOR INDUSTRIAL USE, COULD DRIVE DEMAND AND SPUR INNOVATION.
By 2050, the world’s population is expected to reach nearly 10 billion, creating immense pressure on global food systems. Agriculture is already straining under the weight of climate change, soil degradation, and freshwater scarcity. Feeding billions of additional people in a sustainable way is one of humanity’s most significant challenges. Seaweed, however, offers a promising solution. As a nutrient-dense superfood, seaweed is rich in protein, vitamins, and minerals. Its nutritional profile rivals, and in some cases surpasses, traditional crops like spinach and kale. But what makes seaweed stand out is how efficiently it grows. Unlike terrestrial crops, seaweed does not require farmland, fresh water, or synthetic fertilizers. It thrives in the ocean, where it grows at astonishing rates, making it one of the most sustainable sources of nutrition on the planet.
Studies suggest that farming just 2% of the world’s ocean surface could produce enough protein to feed 12 billion people — well beyond the Earth’s projected population. Research from institutions like the United Nations and the University of Southern Denmark further supports this claim, highlighting seaweed’s potential to contribute significantly to global food security. By 2050, large-scale seaweed farming could supply over 10% of the world’s protein needs, alleviating the strain on land-based agriculture and reducing the environmental impact of food production. However, seaweed’s role in food production extends far beyond human consumption. It also holds transformative potential for livestock farming, addressing one of agriculture’s most significant environmental challenges.
Few people associate seaweed with cows, but it turns out that adding seaweed to cattle feed could have revolutionary benefits for the planet. Livestock, particularly cattle, are among the largest contributors to methane emissions, a potent greenhouse gas. Methane has over 25 times the warming potential of carbon dioxide, and livestock farming is responsible for approximately 14.5% of global greenhouse gas emissions. A specific species of seaweed, Asparagopsis taxiformis, has shown remarkable potential in reducing methane emissions when added to livestock diets. Studies demonstrate that feeding this seaweed to cows can reduce their methane output by up to 90%. This is a significant breakthrough in the fight against climate change. If just 10% of farmers worldwide incorporated this practice, it could result in a reduction of emissions equivalent to the annual carbon footprint of the entire aviation
industry. In addition to reducing emissions, seaweed-based feed additives could also improve livestock health and productivity. Healthier animals could mean more efficient meat and dairy production, further reducing the resources required for agriculture. This innovation underscores seaweed’s ability to address multiple challenges simultaneously, making it an invaluable tool in sustainable farming practices.
Seaweed doesn’t just help reduce emissions in agriculture; it plays an active role in fighting climate change. As a natural carbon sink, seaweed absorbs large quantities of CO2 from the atmosphere. In fact, seaweed farms can sequester up to 20 times more carbon per hectare than terrestrial forests, making them an incredibly efficient tool for carbon capture. A 2019 study estimated that large-scale seaweed farming could offset up to one gigaton of CO2 annually. To put that into perspective, this amount is roughly equivalent to the yearly emissions of Germany, one of the world’s largest industrial economies. By integrating seaweed farms into existing marine ecosystems, we can significantly enhance the planet’s ability to absorb carbon emissions.

Coastal communities around the world, particularly in nations like Indonesia and the Philippines, have already embraced seaweed farming as a sustainable source of income.
Still, seaweed’s environmental benefits don’t stop there. Seaweed also plays a crucial role in fighting ocean acidification and pollution. Excess nitrogen and phosphorus from agricultural runoff contribute to algae blooms and dead zones in marine ecosystems. Seaweed absorbs these excess nutrients, restoring balance to the ocean and preventing further degradation. This process not only improves water quality but also creates healthier habitats for marine life, fostering biodiversity. In addition, seaweed farms act as underwater forests, providing shelter and food for fish, crustaceans, and

other marine species. By transforming barren ocean floors into thriving ecosystems, seaweed farming supports both environmental and economic sustainability.
The potential of seaweed extends beyond environmental restoration — it also offers significant economic opportunities. Coastal communities around the world, particularly in nations like Indonesia and the Philippines, have already embraced seaweed farming as a sustainable source of income. In Indonesia alone, over 300,000 people are employed in seaweed farming, contributing to the livelihoods of countless families. Globally, the seaweed industry is valued at $16 billion, but analysts predict it could grow to $60 billion by 2030 with increased investment in research and infrastructure. As demand for sustainable products continues to rise, seaweed-based industries are poised for substantial growth.
One of the most exciting applications of seaweed is its potential to replace traditional plastics. Seaweed-based bioplastics are biodegradable, meaning they decompose naturally rather than persisting in the environment for centuries like conventional plastics. These bioplastics could replace single-use items such as food packaging, straws, and bags, significantly reducing plastic pollution. Studies suggest that seaweed bioplastics could lower the carbon footprint of plastic production by up to 80%. But seaweed’s versatility doesn’t end there. It can also be used to
produce biofuels, fertilizers, and pharmaceuticals. As a source of renewable energy, seaweed-based biofuels could help reduce reliance on fossil fuels. In agriculture, seaweed-derived fertilizers offer an eco-friendly alternative to synthetic chemicals, improving soil health without contributing to water pollution.
Despite its immense potential, seaweed is not yet widely used. Several barriers must be addressed to unlock its full capabilities. One of the primary challenges is the need for substantial investment in technology and infrastructure to scale up seaweed farming. While the benefits are clear, developing efficient farming methods and supply chains requires significant financial and logistical support. Another challenge is the lack of international regulations governing seaweed farming. Without clear guidelines, there is a risk of overharvesting or unsustainable practices that could harm marine ecosystems. Establishing global standards for sustainable farming will be essential as the industry expands. Public awareness is another critical factor. Many people still see seaweed as little more than a niche food or an environmental curiosity. Educating consumers about its benefits, both as a food source and as a material for industrial use, could drive demand and spur innovation. Promising advancements, such as automated harvesting systems and genetically engineered seaweed strains, offer hope for overcoming these challenges and making seaweed a mainstream resource.

“AS A SOURCE OF RENEWABLE ENERGY, SEAWEED-BASED BIOFUELS COULD HELP REDUCE RELIANCE ON FOSSIL FUELS.
Seaweed represents a practical and achievable solution to some of humanity’s most pressing challenges. Its ability to address climate change, improve food security, reduce pollution, and create sustainable economic opportunities makes it an invaluable resource for the future. However, realizing its potential will require coordinated action from governments, industries, and individuals. Investing in seaweed research and development, creating sustainable farming guidelines, and raising public awareness are crucial steps toward unlocking its transformative power. Whether as a protein-rich superfood, a tool for carbon capture, or a replacement for plastic, seaweed has the potential to reshape the way we interact with our planet.

So, the next time you see seaweed — whether on your plate or along the shore — remember its extraordinary potential. It is not just another marine plant; it is a solution that could help save the world. ¢
“SEAWEED REPRESENTS A PRACTICAL AND ACHIEVABLE SOLUTION TO SOME OF HUMANITY’S MOST PRESSING CHALLENGES.



CHARLOTTE MONTAGUE points out that casual abuse of terms used for diagnosed mental illness can cause discomfort for those with the condition.
“NO ONE IS ‘NORMAL’ BECAUSE THERE ARE SO MANY WAYS YOU CAN BE DIFFERENT FROM SOMEONE ELSE.
Mental illness mislabelling happens when people use mental illness terms provocatively and cause discomfort. For example, most people have heard careless observations such as ‘That’s so autistic!’ or ‘You’re such a psychopath!’ Although in the moment these are meant as a joke, they can have a massive impact on those struggling with the condition concerned.

Nowadays a lot of people are diagnosed with different mental illnesses, and hearing things like this tends to make them feel they can’t live a normal life, or that they can’t tell anyone because they will be mocked. For example, if you were diagnosed as a psychopath, the number of negative connotations behind that word could lead to you thinking you’re a horrible person just for being born with a mental illness.
PEOPLE WHO HAVE LESS UNDERSTANDING OF A PARTICULAR CONDITION TEND TO ISOLATE IT AND MAKE FUN OF IT.
People also use mental illness terms wrongly. You may think mixing them up is just an easy slip to make, but switching two really separate things like ADHD and autism, or sociopath and psychopath, just leads to deeper misunderstanding. I get that new terms can be hard for people sometimes and, yes, everyone does it accidentally, but if you try your best maybe there would be less discomfort.
Because mental illness has only properly been explored in recent years, many stereotypes with rude connotations are still going around, and people who have less understanding of a particular condition tend to isolate it and make fun of it. This leads to a lot of people who haven’t been educated properly posting about and making fun of these things for the younger generations to see.
Now you may be wondering how you can help and not be provocative. There are many things you can do, like raise awareness or tell someone that it’s wrong, but you may just not be comfortable doing any of that, so the biggest thing that you can do is stop: if you think what you say may be provocative, don’t say it.
No one is ‘normal’, because there are so many ways you can be different from someone else, so you can’t single out one type of person for being different in one way; by that logic, why shouldn’t everyone single you out for having a different hair colour or a different eye shape?

In the end the only thing you should remember is: ‘Would I like it if I were in their shoes?’ ¢
Although in the moment, these are meant as a joke, they can have a massive impact on those struggling with the condition concerned.

No Prizes
GEORGIA ROBERTS laments three brilliant female scientists being deprived of the honours they deserved.
Nettie Stevens, Carnegie Institution of Washington.
THEY WERE ALL DISREGARDED FOR THEIR DISCOVERIES SIMPLY BECAUSE THEY WERE WOMEN.
Here are three women who deserve more recognition: Rosalind Franklin, Nettie Stevens and Marthe Gautier. They were all disregarded for their discoveries simply because they were women.
Rosalind Franklin did not get credit for a discovery that led to a huge breakthrough in the understanding of DNA replication, the process by which genetic information is passed down generations, and which means genes can be copied identically so that new proteins are made to replace dead cells and allow growth. This was a really important biological concept that scientists had never understood before Rosalind Franklin came along.

In 1952, Franklin was studying the structure of DNA. Her postgraduate student, Raymond Gosling, took an X-ray diffraction photo of a DNA fibre that became a famous image called Photo 51. Although the image was taken by Gosling, it belonged to Franklin, because he was working under her supervision and was doing the intern work. Franklin was working with a man called Maurice Wilkins, who was working on a different theory of DNA structure. The image showed ‘an exceptionally clear diffraction pattern’, but it featured the ‘B form’ of DNA and Franklin at the time was more interested in the ‘A form’, so she put Photo 51 aside to focus on other work. A few weeks later she left King’s College.
STEVENS WAS COMPLETELY IGNORED AND PUSHED ASIDE. SHE LATER DIED OF BREAST CANCER BEFORE ACHIEVING THE RECOGNITION SHE DESERVED.
This is when Gosling showed Photo 51 to Wilkins, because he was now working under Wilkins. Instead of bringing the image’s importance to Franklin’s attention, Wilkins took it to a competing party: Watson and Crick, who at once knew how valuable this photo was and asked Wilkins to join them in their research. Combining Photo 51 with other research evidence, they wrote a paper about the structure of DNA, which was then nominated for a Nobel Prize. Franklin died of ovarian cancer a few years before the Nobel Prize was awarded, which meant it could not be awarded to her; instead the prize went to a group of men – Wilkins, Watson and Crick – who had used her data without her knowledge, and she got none of the credit.
Another biologist, Nettie Stevens, was one of the first scientists to find that the sex of an organism was determined by a configuration of chromosomes. She was working with mealworms in 1905, looking at the egg tissue and fertilisation process, when she discovered the small chromosome later to be known as the Y chromosome. When a sperm carrying the small chromosome fertilises an egg, the offspring becomes male; when the sperm carries the bigger chromosome, the X chromosome, the offspring becomes female. Stevens then saw that the chromosomes come in pairs: XY male and XX female. However, at the same time another party was coming to the discovery of the sex chromosomes as well. The difference was that this team was made up of men. Edmund Wilson was the head researcher of the competing group, and there was also a man called Thomas Morgan who had made opposing discoveries in the same biological field.

In the 1950s Gautier was the only scientist in Paris who knew how to create and grow human cell cultures.
Marthe Gautier by Rodolphe Escher.
Rosalind Franklin.
Stevens had made and published her discoveries before Wilson and Morgan, but due to gender bias at the time of the nominations, the Nobel Prize went to Morgan for the discovery, and at conferences Morgan and Wilson were asked to give talks. Stevens was completely ignored and pushed aside. She later died of breast cancer before achieving the recognition she deserved.
Finally, Marthe Gautier discovered the chromosomal abnormality behind Down Syndrome. In the 1950s Gautier was the only scientist in Paris who knew how to create and grow human cell cultures, the samples of human tissue that are used for research. She was working with Jerome Lejeune under her professor, Raymond Turpin, and set up her own lab for the work that led to the discovery of Trisomy 21, but Lejeune was to take the credit in the end for her discovery.

Patients with Trisomy 21 have three chromosomes in pair 21, instead of the usual two.
Turpin did come up with a theory of what caused Down Syndrome, but it was Gautier’s experiments and research that confirmed the hypothesis and made the scientific breakthrough. Trisomy 21 is the chromosomal defect that causes the condition, and chromosomes usually come in pairs. For humans there are 23 pairs, while other species have different numbers of chromosomes, but patients with Trisomy 21 have three chromosomes in pair 21, instead of the usual two.

“NETTIE STEVENS WAS ONE OF THE FIRST SCIENTISTS TO FIND THAT THE SEX OF AN ORGANISM WAS DETERMINED BY A CONFIGURATION OF CHROMOSOMES.

TURPIN DID COME UP WITH A THEORY OF WHAT CAUSED DOWN SYNDROME BUT IT WAS GAUTIER’S EXPERIMENTS AND RESEARCH THAT CONFIRMED THE HYPOTHESIS AND MADE THE SCIENTIFIC BREAKTHROUGH.
Marthe Gautier in her lab at Bicêtre in 1970.
Nettie Stevens’ microscope.
Trisomy 21.
Rosalind Franklin did not get credit for a discovery that led to a huge breakthrough in the understanding of DNA replication.


Lejeune travelled to America with her discoveries, despite having no connection to the work, since it was Gautier who had done everything, and there he displayed the research as his own and took all the credit for Gautier’s work.
Two of these three amazing women died from breast and ovarian cancer, areas of medicine disregarded by researchers and doctors because they affected women and not men. It is only recently, in the last 40 years, that the female body has been properly researched and taken seriously. For example, the disease endometriosis was previously disregarded, and women complaining about symptoms were accused of being oversensitive, so proper diagnosis and treatment were never given.

These three women are just a few of the many scientists who have had their work stolen and been given none of the credit for their discoveries simply because they were female. Even though the science community has now progressed hugely in terms of gender equality, women are still fighting for funding and resources for their research projects, still often have their findings doubted, and must have their male colleagues vouch for them and confirm their legitimacy. ¢
INSTEAD OF BRINGING PHOTO 51’S IMPORTANCE TO FRANKLIN’S ATTENTION, WILKINS TOOK IT TO A COMPETING PARTY.
The Rosalind Franklin Mars rover, named in her honour.
All It’s Welsh

MILLY CORDEAUX is learning Welsh but her friends are not impressed.
When I tell people I’m learning Welsh, their first response, instead of ‘Oh, cool!’ or ‘That’s really interesting!’, is ‘Why?’ But I don’t really have an answer for that. I spend a lot of time in Wales during the holidays, seeing those in my family who live there, and enjoying the weather, but I otherwise have no connection to Wales. The area of Wales I visit every year has one of the lowest concentrations of Welsh speakers in the country, and the only evidence of the language you can find is in signs directing you to the ‘ysgol’ (school), or ‘traeth’ (beach). Despite all of this, I decided to learn Welsh.
Welsh is an example of a Celtic language, with others including Irish, Scottish Gaelic, Manx (spoken in the Isle of Man), and Cornish. These languages developed in a different manner to languages like English and French, which explains why a lot of
their grammar and phonology is so unfamiliar and strange to non-speakers, such as the ‘ll’ sound in Welsh, which doesn’t even exist in English.
Celtic languages are all endangered to some degree, with Welsh categorized as ‘vulnerable’, and Manx, Irish, Scottish Gaelic, and Cornish labelled ‘definitely’ or ‘critically endangered’. This decline is due to the emphasis that was placed on using English in the early 20th Century, when some places like schools and workplaces were so English-heavy that speaking anything else was punished. Then, as English-language media like music and movies became more widespread across the UK and the world, Celtic languages took even more of a backseat and have been in decline ever since.
SOME PLACES LIKE SCHOOLS AND WORKPLACES WERE SO ENGLISH-HEAVY THAT SPEAKING ANYTHING ELSE WAS PUNISHED.
However, on the bright side, most of these languages have recently undergone incredible revitalization efforts that come in the form of compulsory education in schools, translations of English literature, and the creation of media in these languages, such as radio and TV shows.
For example, in Wales, Welsh is offered as a GCSE option, and there are Welsh-medium schools in which primary and secondary education is offered predominantly or entirely in Welsh. This aims to increase the number of younger speakers so that the number of overall speakers increases over time by raising a new generation of language users.
On a global level, there are thousands of endangered languages.


Estimates suggest that there are around 7,000 languages spoken globally, but around 3,200 of them are considered endangered, which is alarming. Most of these languages are indigenous and are spoken in South America, Central Africa and Oceania, and are falling out of use due to the increasing use of English and other lingua francas, such as Spanish or Portuguese in South America. With the loss of these languages comes a loss of culture and history that cannot be brought back due to a lack of records. This is why it is so important for languages to be protected and preserved.
THERE ARE WELSH-MEDIUM SCHOOLS IN WHICH PRIMARY AND SECONDARY EDUCATION IS OFFERED PREDOMINANTLY OR ENTIRELY IN WELSH.
A fascinating example of the preservation of indigenous languages is the work being done by Danielle Boyer, a Native American inventor who has created small robots that can teach children the endangered Ojibwe language, Anishinaabemowin, and other indigenous languages. This has helped to preserve these languages, and helped the younger generation of Native Americans to connect with their culture and feel proud of their heritage.

If more people learnt about endangered languages we might be able to preserve the linguistic heritage of our world.
This is why it’s important that endangered languages are recognised, recorded, and most importantly protected. Protecting these languages helps to keep cultures alive. In recent years, Celtic languages have seen a small increase in use thanks to these revitalisation efforts, and hopefully this should continue to rise in the future. Perhaps one day these Celtic languages and other endangered languages will become more widespread and continue to be passed down for years to come.
So, although my learning some rudimentary Welsh through Duolingo isn’t going to reverse decades of linguistic oppression, if more people learnt about endangered languages we might be able to preserve the linguistic heritage of our world for future generations. ¢
“IT’S IMPORTANT THAT ENDANGERED LANGUAGES ARE RECOGNISED, RECORDED, AND MOST IMPORTANTLY PROTECTED.



ELEANOR HOGG defines the climate crisis and suggests a simple way everyone can help.
We are in a climate crisis. That’s a fact. Something needs to be done. And we can do something.
Global warming is caused by greenhouse gases trapping the heat in the atmosphere. While the Earth would naturally warm very slowly over time, the rate of warming is faster than ever. It is averaging around 0.20 degrees Celsius per decade. That is not natural. That is a direct result of human behaviour. This climate crisis is our fault. We must stop it, or we will have no planet.
“DEFORESTATION IS DESTROYING THE TREES THAT GET RID OF THE CARBON DIOXIDE AND PROVIDE US WITH OXYGEN, AS WELL AS THE HOMES OF MILLIONS OF ANIMALS.
The main causes are the burning of fossil fuels, which releases carbon dioxide; landfill, where biodegradable materials release methane; and deforestation, which is destroying the trees that get rid of the carbon dioxide and provide us with oxygen, as well as the homes of millions of animals. And then there is agriculture.
Agriculture, particularly livestock farming, has an immense impact on our environment. The Food and Agriculture Organization of the United Nations (FAO) estimates that livestock production is responsible for 14.5% of global greenhouse gas emissions, while other organisations have estimated it could be as much as 51%. In Northern Ireland, for example, grazing animals account for 30% of all carbon emissions. Modern agriculture is wasteful and inefficient. Currently 800 million people suffer from malnutrition, yet cereal that could feed three times that number is being made into animal feed to make, well, not very much food. The Worldwatch Institute estimates that 70% of the world’s freshwater supply goes towards agriculture, and a third of that towards growing animal feed.
But we can make a change. I’m sure you have all heard of ‘Meat-free Monday’, and whilst a lot of people aren’t too keen on the notion, it’s probably because they don’t know the benefits. For example, if one person did meat-free Monday for one year they would save 789.25 bathtubs of water, 2.98 tennis courts’ worth of trees and the same amount of carbon emissions as it would take to drive from London to Edinburgh (434.36 miles).
The future, they say, is in the hands of the young. The average school has around 800 pupils, so multiply all those numbers by 800 and one UK school could save 631,402 bathtubs of water, 2,382 tennis courts’ worth of trees and 347,488 miles’ worth of carbon emissions. Those are some fairly massive numbers. The impact might be small, but any amount is progress, and progress is positive.
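For readers who like to check the arithmetic, here is a quick sketch in Python of how the per-person savings quoted above scale up to a school. The per-person figures and the 800-pupil school size are simply the rounded numbers given in this article, so the computed totals come out very slightly different from the printed ones.

```python
# A rough check of the Meat-free Monday scaling described above.
# The per-person figures are the rounded numbers quoted in the article,
# so the school-level totals differ slightly from those printed in the text.

PUPILS = 800  # the article's assumed size of an average school

per_person_per_year = {
    "bathtubs of water": 789.25,
    "tennis courts' worth of trees": 2.98,
    "miles' worth of carbon emissions": 434.36,  # the London-to-Edinburgh figure quoted above
}

for label, value in per_person_per_year.items():
    print(f"One school could save about {value * PUPILS:,.0f} {label} a year")
```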
If you won’t do this for the planet, do it for you. Doing meat-free Monday for one year could extend your life span by around 1.67 days, and there are loads of other health benefits. There is evidence that a more plant-based diet leads to reduced risk of diseases like cancer and heart disease, which would also save the NHS money and could lead to better pay for doctors and nurses, and improved healthcare throughout the UK. The future is on your plate: what will you choose? ¢




ERIC WONG says the raging fire of Punk was never pretty, but it was necessary.
When I was in Year 4 in prep school, I arrived early to Mr Sawyer’s art class, which was the last lesson before school ended. He probably wasn’t on my list of favourite teachers, but that day he was blasting out ‘Holidays in the Sun’ by the Sex Pistols, and that was my first experience of punk rock. That afternoon, I looked up the Sex Pistols and was fascinated by their name and album title: ‘Sex Pistols’ and ‘Never Mind the Bollocks’ were a cheeky, dangerous provocation that struck me as the coolest thing ever. I didn’t fully understand it, but I didn’t have to. For a while, I listened to the album on repeat, and it was totally different from any other music genre I had listened to up to that point.
“IT WAS AN ACT OF THEFT, AN ANARCHIC PROMETHEUS SNATCHING SOMETHING PRIMAL AND UNTAMED AND HURLING IT IN THE FACES OF AN INCREASINGLY NUMBED WORLD.
Punk to me was less a genre of music and more a state of mind. It was a cerebral sensuality that married the raw and the refined, the ugly and the exquisite. It was visceral, like a shot of adrenaline straight to the brain, but it was also philosophical, teeming with questions about power, art, and the role of the individual in a world gone mad. Somewhere in the grime of the Bowery or the damp, smoke-filled basements of London, fire was stolen. Not the warm kind that lulls you to sleep, but the dangerous, blistering kind that burns your skin and reminds you that you’re alive. Punk — raw, unpolished and unforgiving — was not just a movement; it was an act of theft, an anarchic Prometheus snatching something primal and untamed and hurling it in the faces of an increasingly numbed world. And like any great act of rebellion, it was about destruction as much as creation.
Lou Reed was the godfather of this theft: with his deadpan drawl and his unrelenting gaze on the sordid underbelly of human existence, he set the stage for what would become punk. The Velvet Underground’s White Light/White Heat was a sledgehammer to the gentle caress of flower-power idealism. Reed wasn’t interested in peace or love; he wanted you to taste the grit of New York City asphalt, to hear the screech of a subway car, to feel the sting of existential nausea. Reed didn’t just write songs; he painted bleak frescoes of addicts, drag queens, and lovers drenched in despair. His work said, ‘Here is the world as it is, not as you’d like it to be. Deal with it.’


The Sex Pistols performing in Amsterdam in January 1977. Photograph: Koen Suyk.
Patti Smith, on the other hand, was punk’s high priestess, conjuring visions from the subway platforms of New York, her voice a jagged howl of poetry and rage. She was Rimbaud in a leather jacket, dragging Romanticism into the CBGB era.
The Stooges — fronted by Iggy Pop, the feral god of punk — were the unholy bridge between rock and chaos. Their music didn’t just challenge conventions; it annihilated them. Listening to Raw Power was like watching a building collapse in slow motion, beautiful and terrifying all at once. The Germs, with their wild, self-destructive energy, took that chaos and made it art. Darby Crash sang like someone who knew he was on borrowed time, his voice a mix of defiance and doom.
Then there were the Ramones, who turned punk into a minimalist manifesto. Their songs were two-minute explosions of noise, each one stripped down to its bare essentials. The Dead Kennedys brought politics into the mix, their music a furious critique of the system, as scathing as it was smart. The Clash took that anger and gave it a global reach, marrying punk’s DIY ethos with reggae rhythms and protest lyrics that felt like rallying cries.
If punk had a philosophy, it was this: steal fire, not just for the sake of rebellion, but to illuminate the dark corners of the world.

It was an art form that rejected hierarchy and virtuosity in favour of authenticity and immediacy. Punk wasn’t pretty, but it was necessary. It called out the back-porch fantasies of the hippie generation and their pastoral escapism. It was a philosophy of meditation and honesty. In that way, punk was profoundly Promethean.

Like the mythical Titan who defied the gods to bring fire to humanity, punk was about taking something sacred — art, music, self-expression — and ripping it out of the hands of the elite. It was a celebration of imperfection, a rejection of the idea that greatness had to be polished or pretty. To me, Sonic Youth, perhaps more than any other band, embodied this ethos. They made music that felt like an abstract painting come to life — discordant, layered, endlessly intriguing. Jonathan Lethem once described their music as being from the perspective of the ‘permanent novice’, and that, I think, is the essence of punk. It’s about being a student of your own chaos, about finding beauty in the jagged edges of existence; it was sonically naïve and deliberately so.
Punk was also an antidote to the bloated, self-indulgent rock of the 1970s. While prog-rock bands were busy writing 20-minute opuses about hobbits and spaceships, punk bands were writing two-minute anthems about unemployment, police brutality, and existential dread. It wasn’t escapism; it was confrontation.
However, punk wasn’t just music. It was fashion, art, literature — a gesamtkunstwerk of rebellion. In fact, it wasn’t all that different from Romanticism. Sure, Byron probably wouldn’t have worn safety pins in his ears, but the spirit was the same — a brazen refusal to bow, a craving for something bigger, wilder, and untamed.
THE RAMONES TURNED PUNK INTO A MINIMALIST MANIFESTO. THEIR SONGS WERE TWO-MINUTE EXPLOSIONS OF NOISE.
WITH HIS DEADPAN DRAWL AND HIS UNRELENTING GAZE ON THE SORDID UNDERBELLY OF HUMAN EXISTENCE, LOU REED SET THE STAGE FOR WHAT WOULD BECOME PUNK.
Lou Reed. Photograph: Michael Ochs Archives.
The Ramones at the Orpheum, Boston, Massachusetts, 1979.
Like any great act of rebellion, it was about destruction as much as creation.


“THE CLASH TOOK THAT ANGER AND GAVE IT A GLOBAL REACH, MARRYING PUNK’S DIY ETHOS WITH REGGAE RHYTHMS AND PROTEST LYRICS THAT FELT LIKE RALLYING CRIES.

Punk gave me an open-minded attitude to interpretation and tolerance. It taught me that creativity is inherently rebellious.
Where the Romantics stood on cliffs staring into roaring seas, punk found its edge in city streets — the screech of subway brakes, the hum of flickering neon, the mess of it all. It wasn’t about pretty or polished; it was about feeling, raw and unfiltered, shoved in your face. Both movements worshipped chaos, embraced imperfection, and demanded something real, even if it wasn’t pretty. Punk’s riffs and verses… they’re just another way of staring into the abyss, and finding the guts to spit back.
For me, punk was an epiphany — a reminder that art doesn’t have to be technically perfect to be powerful, that noise can be as meaningful as melody, that sometimes the most important thing you can do is break something open and see what’s inside. Punk gave me an open-minded attitude to interpretation and tolerance. It taught me that creativity is inherently rebellious, a way of stealing fire from the gods and using it to light your own path.
Maybe that’s why I keep coming back to it — why the snarling chaos of ‘Never Mind the Bollocks’ still sends shivers down my spine, why Patti Smith’s voice still feels like a lifeline, why the Ramones’ relentless rhythm still makes my heart race. Punk, in all its messy, magnificent glory, is proof that sometimes the fire is worth the burn.¢
David Bowie, Iggy Pop and Lou Reed by Mick Rock/ Dalle APF.


AVA WANG explains normalcy bias, our tendency to ignore a crisis until it’s too late.
Imagine this. In the near future, you’re on holiday, let’s say in Hawaii. You’re lying on the bed in your hotel room, scrolling through videos on your phone – something light, maybe funny, something you enjoy. Then, out of the corner of your eye, you catch a glimpse of something strange. A wall of ocean is approaching. A tsunami. It’s less than a kilometre away, rolling toward the shore, toward you, steadily and silently. What do you do next? Do you grab your belongings, call your family, run to higher ground? Easy. Obvious answer, right? Too obvious to ask? But here’s the thing: it’s not.
Normalcy bias affects around 80% of people and has been described as ‘one of the most dangerous biases we have’. In fact, even in ordinary life, it takes your brain about ten seconds to fully process new information. In emergencies, those seconds can be fatal.
IN FILMS, WHEN DISASTER STRIKES PEOPLE SCREAM, SCRAMBLE, STAMPEDE. BUT IN REAL LIFE, MANY FREEZE OR, WORSE, CARRY ON AS IF NOTHING IS HAPPENING.
Take the tragedy of 9/11. Of the nearly 3,000 people who died that day, many, far too many, had no sense of urgency. Survivors recall calmly packing bags, putting on coats, chatting with colleagues while they filed out. Total normalcy after a plane had struck their building. Why weren’t they panicking? Why didn’t they run? It sounds irrational. It sounds stupid. But it’s something that could happen to you, too. This is due to a psychological trait known as normalcy bias.
Our brains are hardwired to interpret everything through the lens of what’s ‘normal’. When confronted with something that threatens our sense of normality, especially something unprecedented, our instinct is to cling to routine. To deny the danger. To pretend that everything will just go back to the way it was.
Popular media often gets this wrong. In films, when disaster strikes people scream, scramble, stampede. But in real life, many freeze or, worse, carry on as if nothing is happening. Emergency responders call this negative panic, a paralyzing inaction that’s just as lethal as chaotic panic, if not more so. At least panic tries to save itself. Normalcy bias does nothing.
Please note this isn’t the same as the freeze response (or fear bradycardia), a slowing of the heart triggered by fear that can help you think more clearly in life-or-death moments. Normalcy bias doesn’t make you pause and assess. It stops you from thinking at all.
So how do you overcome normalcy bias?
One word: education. Awareness is your best defence. By learning about threats and staying informed through news and current events, you give your brain the tools to recognize when something is not normal. Think of the early days of COVID-19. Many dismissed it, ignored precautions, convinced it wouldn’t affect them. That’s normalcy bias in action.
Back to the hotel. The wave is coming. You can see it. But still, you don’t move. Because that kind of thing only happens on the news, right? It would never happen to you. ¢
Drifting

ETHAN SPAREY explains why sleep is important and how you can get enough.
Sleep is far more than just a period of rest; it is a complex biological process essential for our physical, mental, and emotional well-being. For teenagers, sleep plays a critical role as our bodies and brains undergo rapid development.
Sleep is regulated by two main processes: the circadian rhythm and sleep-wake homeostasis. Often referred to as the body’s internal clock, the circadian rhythm is a 24-hour cycle that regulates when we feel awake and when we feel sleepy. It’s what causes jet lag and makes it easier to sleep when it’s night-time. It is influenced by external cues like light and temperature. For teenagers, this rhythm naturally shifts
later, meaning we tend to feel more alert in the evening and struggle to wake up early in the morning. This is why early school start times can be challenging for teenagers.
“THE ‘SLEEP-WAKE’ PROCESS TRACKS OUR NEED FOR SLEEP BASED ON HOW LONG WE’VE BEEN AWAKE.
The ‘sleep-wake’ process tracks our need for sleep based on how long we’ve been awake. The longer we stay awake, the stronger the drive to sleep becomes. This is why pulling an all-nighter to finish an essay or study for an exam can leave us feeling exhausted the next day. Caffeine is well known for disrupting this process, which is probably why so many of us feel we need it in the mornings.
Scientists have separated sleep itself into several stages based on your brain
activity during sleep, each with its own purpose:
Stage 1: The transition period between being awake and falling asleep. This stage is brief and easy to disrupt. It is usually what we fall into when sunbathing or sitting in a very boring Geography lesson.
Stage 2: Your heart rate and body temperature begin to drop as the body prepares for deep sleep, and occasional spikes in brain activity occur here.
Stage 3: Deep sleep is crucial for physical restoration, immune function, and growth. If you’re struggling with sleepiness in the morning, it’s probably because you have had little deep sleep, which doesn’t necessarily come with a long night’s sleep.
REM Sleep (Rapid Eye Movement): This is probably the most famous stage and is when most dreaming occurs. REM sleep is needed for things like memory, learning, and emotional regulation.
For teenagers, it is increasingly important to get a proper night’s rest. Disrupted sleep can impair our ability to retain information (unhelpful for exams), regulate emotions, and even perform physically in sport.
Generally, most teenagers struggle with sleep for a few main reasons. The circadian rhythm shifts later during the teenage years, making it harder to fall asleep early. Obviously, the blue light emitted by screens suppresses melatonin production, delaying sleep onset. Online activities can also be mentally stimulating, making it harder to wind down. Additionally, drinking caffeinated beverages can interfere with sleep if consumed too late in the day.

So how can we improve our sleep? Understanding the science of sleep is the key to getting a good night’s rest. First, respect your circadian rhythm. Try to align your sleep schedule with your natural rhythm as much as possible. If early mornings are unavoidable either for work or school, aim to gradually adjust your bedtime earlier to ensure you’re getting enough sleep even if your friends or family are late-night party animals. Second, and maybe a more obvious one, limit blue light exposure. Reduce your screen time at least an hour before bed. If you must use devices, enable blue light filters or wear blue light-blocking glasses to minimize disruption of melatonin production.
“DISRUPTED SLEEP CAN IMPAIR OUR ABILITY TO RETAIN INFORMATION (UNHELPFUL FOR EXAMS), REGULATE EMOTIONS, AND EVEN PERFORM PHYSICALLY IN SPORT.
I’ve found activities like meditation or stretching right before bed calms my mind and prepares me for sleep. Also, as a coffee-lover myself, it was quite hard to limit my caffeine intake, especially in the afternoon and evening. My rule is no caffeine six hours before bed. Last, it’s incredibly important to remain consistent. Going to bed and waking up at the same time every day, even on weekends, helps regulate your circadian rhythm and improves overall sleep quality.
Over 70% of teenagers lack sufficient sleep, which affects not just their health but also their learning. By improving our sleep schedule, reducing our screen time and preparing ourselves for bed beforehand we can improve our rest. Prioritizing sleep is essential for thriving in today’s demanding world. ¢

Going to bed and waking up at the same time every day, even on weekends, helps regulate your circadian rhythm.
“ACTIVITIES LIKE MEDITATION OR STRETCHING RIGHT BEFORE BED CALM MY MIND AND PREPARE ME FOR SLEEP.



COCO URSELL BEER, who recently suffered severe concussion herself, explores the condition’s complex character.
A rugby player being stretchered off at Twickenham. A boxer taking a knock-out blow in the ring. Friends say these are the images they see when they hear the word ‘concussion’. However, concussions don’t just affect sports professionals. Most happen to ordinary people, going about their daily lives: a blow to the head when falling off a bike or tripping over an uneven pavement. I personally experienced a severe concussion last year, and I have discovered that the mundane reasons they occur can hide the complexity of the journey that follows.
“A WHACK ON THE HEAD CAN LEAD TO A WHOLE HOST OF SYMPTOMS, FROM POUNDING HEADACHES AND DIZZINESS TO PROBLEMS WITH CONCENTRATION AND PROCESSING INFORMATION.
What is a concussion, anyway?
Concussion results from what doctors call a traumatic brain injury, or ‘TBI’ for short. Neuroscientist Naznin Virji-Babul from the University of British Columbia describes the mechanics: ‘When there’s a blow to the head, or any part of the upper body, the brain actually slams into the front part of the skull, and then kind of vibrates backwards. So both the front part of the brain and the back part of the brain are actually slammed against the skull.’ There’s something almost cartoon-like in this researcher’s description, but when her explanation continues, it’s clear the consequences are anything but funny: ‘On top of that there’s a rotational component of movement’ that leads to what she describes as ‘micro-tearing’ within the brain’s white matter.
Our white matter is made up of a large network of nerve fibres that allow the exchange of information and communication between various areas within the brain. The micro-tearing triggers what I’ve heard described as a huge neurometabolic cascade and shifts in the brain’s ion balance. In turn, this creates an energy crisis and changes the way the brain functions. In simple terms, this means the brain ends up trying to do two things at once, working like crazy to repair itself while keeping all its other vital functions going. When concussions happen to children and young people, whose brains are still developing, you need to add into the mix a third element: the brain is still trying to develop.
No wonder, then, that a whack on the head can lead to a whole host of symptoms, from pounding headaches and dizziness to problems with concentration and processing information. Additionally, because one part of our brain, the prefrontal cortex, which helps to control emotions, is especially vulnerable to strikes to the head, other common consequences of concussion include anxiety, sleep problems and issues with mood.


Women, incidentally, are more susceptible to head injuries than men and tend to take longer to recover.

There is no such thing as a standard concussion. There’s no ‘one size fits all’ response to a hit on the head. Take Lewis Freeth. He was a medical student halfway through his studies at Newcastle University when he experienced two concussions a few years back. The first was in a road traffic accident in South Africa, while he was travelling during a summer break. He was knocked unconscious, taken to hospital and kept in overnight. During the following month he continued with his trip, didn’t feel any overt symptoms and, by the time he got back to his studies in October, was fine. Nine months later, he was knocked for six as a fullback in a rugby tackle. Freeth recalled that this time he didn’t lose consciousness, could remember everything, and yet the effects and symptoms were much worse: ‘There’s an unpredictable nature to concussion,’ he says. ‘There’s no correlation between how hard you are hit and the effect on your brain.’ Nor on the symptoms that follow.
There are countless challenges for health teams, since everyone responds differently to a blow on the head.
There are countless challenges for health teams, since everyone responds differently to a blow on the head. Women, incidentally, are more susceptible to head injuries than men and tend to take longer to recover; concussion can, at worst, be missed altogether. You could literally be sent home from hospital with a paracetamol. Or, because research in head injuries is moving so fast, it’s possible that, even if a correct diagnosis is made, differing advice may be given for recovery.
While one hospital team might discharge you and tell you to lie down in a dark room for a few days, another could give you a pamphlet advising you to have someone with you at all times. As my godfather pointed out: ‘Not a lot of help if you’re seeing stars and can’t read it, darling.’ Or you may be referred on for extra investigations. If this happens, as Freeth stresses, it’s possible even then to fall between specialities. ‘When we’re thinking about the brain within medicine, you’ve got neurosurgery, neurology and psychiatry. And I think to make our jobs as doctors easier, we like to pretend they’re three largely separate fields. And I think the lines between them are so much more blurred.’ I’d agree with
this. A brain surgeon decides if you need surgery. If you don’t, you’re passed on to a neurologist who, depending on their own experience, could have varying views on what to do next, such as to rest or not to rest. If you are lucky, like I was, you will get referred on again, to a specialist vestibular rehabilitation physiotherapist, who can help with symptoms of balance, dizziness and even retraining the brain to read and write.
And what about the emotional element to brain recovery?
For many people, input from a specialist psychologist or psychiatrist, to help deal with the emotional turmoil caused by their symptoms, is crucial to recovery. Such support can help you make sense of mood swings, cope with a total inability to absorb information you once found easy, or deal with well-meaning comments from people around you like ‘Hey, you look so much better’ when

inside you feel like screaming, ‘Really? My mind’s a fog, I can’t remember your name and I just want to sleep!’
Input from a specialist psychologist or psychiatrist to help you deal with the emotional turmoil caused by your symptoms is crucial to recovery.
So, what’s the answer? In my view, a better understanding of all the above: a thorough awareness of the causes of concussion, the wide-ranging and individual differences in symptoms it can trigger, and the highly personal requirements and time necessary – from weeks to many months – for recovery.
Why do I feel this? It’s probably a question best answered by Lewis Freeth who, after qualifying as a doctor, is now training to be a psychiatrist. ‘Your brain is who you are. It’s very, very vulnerable and when it’s damaged your whole experience of life is altered.’ I agree, and would say that if you’ve got to the end of this piece and feel you’d like to know more, have a listen to the ABC All in the Mind podcast, The Confusion Around Concussions, especially if you are in a position to make a difference. After listening, you may want to. ¢

“FREETH RECALLED THAT THIS TIME HE DIDN’T LOSE CONSCIOUSNESS, COULD REMEMBER EVERYTHING, AND YET THE EFFECTS AND SYMPTOMS WERE MUCH WORSE.


LYANNA FIKURY warns us about the threat posed to normal life by ‘smart’ phones.
Ever since the advanced device we call a ‘smartphone’ was invented, our world has drastically changed. It has forever affected newer generations in ways never imagined, providing easy access to amenities such as the internet, maps, and the ability to message anyone from anywhere.
But smartphones have also changed the way we communicate, and not for the better. It’s true that anyone is accessible from anywhere, and phones have saved lives. But they’ve also ruined lives, and they could potentially ruin yours too. They prevent people from reaching their true potential, because they are rapidly altering how humans have communicated for thousands of years. The problem is only getting worse. The convenience of texting reduces face-to-face interactions, making it difficult to understand social cues such as tone of voice, facial expressions, and body language. Most importantly, deep conversations require effort to convey emotion, whereas with a single tap an emoji can now replace real expression. This reduction in face-to-face interaction increases depression and social anxiety, especially among young people. Studies show that individuals with problematic smartphone usage are at a higher risk of experiencing depression compared to those with moderate use.
“PRETENDING TO BE SOMEONE ELSE IS MUCH EASIER BEHIND A SCREEN THAN IN REAL LIFE, AND SOME PEOPLE TAKE ADVANTAGE OF THAT.
People are not always who they seem online. There are many platforms where anyone can talk anonymously: Discord, Snapchat, Reddit – you name it. Pretending to be someone else is much easier behind a screen than in real life, and some people take advantage of that. Harmful words can be sent with the tap of a button. In person, it takes real audacity to say something cruel, and the consequences are usually immediate. Cruel words shape people, whether they are the ones saying them or receiving them, especially online. Behind these screens are people who may no longer know how to make a simple human connection, and that shouldn’t happen to you.
Use phones and screens wisely. Use them to your advantage, not as a replacement for real social interaction. Don’t let a device dictate how you live, connect with others, or spend your time. And if something online seems too good to be true, it probably is. ¢

LUCA SAND reminds us that the Republican and Imperial eras of Ancient Rome were not separate, but linked by a gradual transfer of power to a handful of remarkable individuals.
When we approach Roman history, the Republican and Imperial eras are often separated completely, and treated as unrelated to one another. But this is not correct. The process that paved the way for Augustus to become princeps, first citizen, began around 100 BC and took a century.
This process involved the rise to power of certain individuals and a massively widening wealth gap in Rome. As a result of Rome’s staggering successes in the late 2nd Century BC, wealth flooded into the Republic faster than ever before, and over time it became concentrated in the hands of a few. In the early 2nd Century BC, the only way a person could enrich themselves was through serving the Roman State and advancing their political career, as the Roman general Scipio Africanus did, winning fame and riches in the Punic Wars against Hannibal and the Carthaginians. By the time of Caesar, the state of the Republic was such that he could win glory and riches in Gaul virtually without any intervention
by the Republic, and important figures like Pompey or Crassus could even raise their own privately funded legions.
CAESAR’S FAMOUS CROSSING OF THE RUBICON AND SUBSEQUENT MARCH ON ROME WAS NOT ORIGINAL: HE WAS MERELY IMITATING SULLA.
Caesar’s famous crossing of the Rubicon and subsequent march on Rome was not original: he was merely imitating Sulla, a Roman general of the 80s BC who had also made himself dictator. These acts put Rome on the path to becoming an Empire. It is widely known today how much the Romans hated kings, and how firmly opposed they were to the idea of kingship. It may therefore seem confusing and contradictory that the Romans of the early Empire so readily accepted the arrival of the Imperial Age. This can be explained through the events of the previous century, beginning with a famous general called Gaius Marius.
The most successful Roman general of his era, Marius rose to prominence through victory on the battlefield, making him an immensely popular figure. As a result, he won an unprecedented five consecutive


elections as consul from 104 – 100 BC (there were two consuls each year who jointly oversaw the Republic, and each stood for one year). This feat was unheard of, and his years in power gave him a level of influence in the Republic that had not been attained before.
The greatest result of his consulships was the military reforms he pushed through, known as the Marian Reforms. These transformed the Roman Army from its former organisation to a state more recognisable to us, and similar to that of the Imperial period. Cohort units consisting of 480 men each were introduced, and the armament of the legions was standardised. Most importantly, each legion was recruited together from a certain region, so that rather than a mix of soldiers from all over Italy, legions began to form their own identities and loyalties.
Marius also broke from the tradition of only allowing property-owning men to join the army. Those in the lowest census class, the proletarii, had a lesser concern for the health of the Republic because they owned no land. Thus, they tended to develop a greater attachment to individual generals and would fight for them in the hope of being rewarded with war booty and property once they retired.

They tended to develop a greater attachment to individual generals and would fight for them in the hope of being rewarded with war booty.
As a result of these reforms, Romans fighting other Romans (something unheard of in the previous century) would become increasingly common: Caesar’s legions, for example, exemplified their lack of loyalty to the Roman State when they marched on the city in 49 BC, having developed a strong connection to Caesar personally from the Gallic Wars.
The 90s BC saw extreme social unrest in the Republic and increasing political instability. The Italian peoples revolted against the Romans in 91 BC, and chaos ensued across the Italian peninsula during the Social War of 91–87 BC. Sulla, a general who helped Rome achieve success in that war, was a staunch enemy of the

likes of Marius, the populares, who desired restriction of the Senate’s powers. In 88 BC he took his army to march on Rome and dispose of his political enemies, restoring the Senate’s authority. After briefly introducing a new constitution and installing new consuls, Sulla was forced to depart for the East, where the king of Pontus, Mithridates, was invading Roman territory in Asia Minor (modern Turkey). Once again chaos enveloped Rome, and the consuls fought with each other. Upon Sulla’s return from the East in 82 BC, he marched on Rome a second time, disposing of his enemies. Although Sulla behaved with extreme brutality, which was deplored by later Romans, he genuinely strove for the restoration of the Republic, introducing measures designed to stop others from replicating his own actions.
Sulla’s legacy in Rome was great: his use of force in the seizure of power set the standard for the future. Caesar was greatly influenced by him and learned from some of his mistakes. Where Sulla had settled his veteran soldiers in Italy through expropriation of land, which would cause considerable strife and violence in the future, Caesar sent his abroad. This served both to remove the potential threat of having trained soldiers close to Rome and to spread Roman power to areas further afield. Where Sulla had brutally killed his political enemies when declaring himself dictator in the name of the Senate, Caesar preferred a merciful approach. Although this allowed his enemies to remain strong,
Caesar also claimed to represent the people: unlike Sulla, his first acts as dictator were related to easing the debt crisis.


it nonetheless earned him far greater popularity than Sulla ever had. Caesar also claimed to represent the people: unlike Sulla, his first acts as dictator were related to easing the debt crisis and the cancellation of rent payments for 49 BC; his reforms of the Senate and other measures relating to politics came after.
The third important character of the late Republic represented everything wrong with it, and was a figure who prepared the way for the Empire. Pompey had been a subordinate of Sulla, but following the latter’s death in 78 BC he inadvertently helped undo the reforms aimed at strengthening the Republic. Because he was not as personally adept at the complicated politics of Rome as Caesar, Pompey made his fortune abroad through wars and governorships in Spain and the eastern portion of the Republican Empire. Pompey exposed the greatest problem facing the Republic at that stage: it was simply far too large to be ruled efficiently by the Senate. People like Pompey could gain vast wealth by campaigning far from Rome in the East, with minimal government intervention. Caesar did the same in Gaul, creating what was essentially a private domain, with only the name of the Republic. During the Punic Wars against Carthage in the 3rd and 2nd Centuries BC, the consuls and other generals were tightly

controlled by the Senate, control made easy by the relatively short distances. The contrast to Pompey’s campaigning in the East during the 60s BC was stark: he signed treaties, directed campaigns and carved out new territories at will, with the Senate only consulted to
The transition from democratic to monarchical ideals was not immediate, nor did it begin with Augustus or even Caesar.
confirm his arrangements upon his return. This decentralisation allowed individual ‘warlords’ like him to essentially create private empires, as those rulers and governors whom Pompey established were loyal to him first, and the Republic second.
The transition from democratic to monarchical ideals was not immediate, nor did it begin with Augustus or even Caesar. Cicero, the renowned orator and prominent politician of the time, envisaged Pompey taking a role as a moderator and protector of the Republic (suggesting his superior status), and even called him princeps, the term later used by Emperor Augustus.
Pompey was eventually defeated by Caesar in a civil war following a falling out, but the scale of the forces at the decisive Battle of Pharsalus (48 BC) was staggering: Caesar commanded over 22,000 men, Pompey around 50,000, and each had considerable fleets. Many of these soldiers from both sides belonged to


CICERO, THE RENOWNED ORATOR AND PROMINENT POLITICIAN OF THE TIME, ENVISAGED POMPEY TAKING A ROLE AS A MODERATOR AND PROTECTOR OF THE REPUBLIC.
the generals individually, fighting in their name with little recognition of Republican authority.
A series of civil wars from 50 BC eventually culminated in the rise of the Empire and of Augustus – the fifth major figure, who amassed more personal wealth and power than anyone before him, and who is thought to have been one of the richest men in history, with a modern net worth estimated at $4.6 trillion. It is no coincidence that the figures mentioned here – Marius, Sulla, Pompey, Caesar and finally Augustus – are addressed both chronologically and in ascending order of wealth.
And there we have the gradual process by which the authority of the Republic was transferred to individuals. And the process was certainly not peaceful. ¢
HANNIBAL CROSSING THE ALPS WITH ELEPHANTS DURING THE CARTHAGINIAN INVASION OF ITALY.
THE MOST SUCCESSFUL ROMAN GENERAL OF HIS ERA, MARIUS ROSE TO PROMINENCE THROUGH VICTORY ON THE BATTLEFIELD.

ONIMISI SALAMI maps the development of the Olympic Games and tells a story that captures the spirit of a tradition going back 3000 years.
The Olympic Games are the biggest sporting event in the world, but they have gone through several changes to remain the most popular international event ever created. The first Olympic Games were held nearly 3000 years ago. Although these technically weren’t the same as the modern games, we all know they were inspired by this period. The first games were held in Olympia, Greece, in 776 BC. This is interesting because, although most things from such a distant time rarely continued, the Olympics still go on today, expressing the longevity, interest and entertainment offered by sport.
THE GAMES WERE MAINLY A RELIGIOUS FESTIVAL TO PAY RESPECTS TO ZEUS, THE KING OF THE GODS, AND THERE WERE NO MEDALS FOR WINNERS.
The events held in the ancient Olympic Games were foot races, jumping, discus throwing, javelin throwing and wrestling, then the pentathlon, which combined the five, and then boxing. The pankration was a combination of boxing and wrestling. As you would expect, the Olympics have undergone changes in the last hundred years or so, since some elements were discriminatory and simply primitive; for example, the games were mainly a religious festival to pay respects to Zeus, the King of the Gods, and there were no medals for winners, just a wreath of leaves, a red wool ribbon tied around the head, a palm branch, elevation to the status of demigod and a hero’s welcome in your home city.
Women were not allowed to participate, and no clothes were worn, as competitors wanted Zeus to acknowledge their physical power and to intimidate the competition. People would often die in events like the pankration, because killing wasn’t against the rules and would be counted as a victory. This form of Olympics carried on until around AD 400, when it was abolished by the Roman Emperor Theodosius because the games were a tribute to Zeus, which opposed Christianity, and this resulted in the abandonment of Olympia.
Over 1000 years later, a man named Baron Pierre de Coubertin formed the


NO CLOTHES WERE WORN AS COMPETITORS WANTED ZEUS TO ACKNOWLEDGE THEIR PHYSICAL POWER AND INTIMIDATE THE COMPETITION.

The events were archery, athletics, swimming, darts, table tennis, wheelchair basketball and wheelchair fencing.
International Olympic Committee (IOC) in 1894. This led to the first modern Olympic Games in 1896, held in Athens, Greece. The new Olympics featured new events because the world had advanced: 43 events across athletics, cycling, swimming, gymnastics, weightlifting, wrestling, fencing, shooting and tennis. Along with new events there were new regulations. In the 1900 Games, held in Paris, women competed for the first time, but out of 997 athletes only 22 were women, just 2.2% of the athlete body. The first winner was Hélène de Pourtalès who, representing Switzerland, became the first woman to compete and win, as a member of the winning team in the sailing event.


In the 1900 Games, held in Paris, women competed in the games for the first time, but out of 997 athletes only 22 women were able to participate.
The IOC kept developing and building the Olympics until, following a decision taken in 1921, it backed the first Olympic winter sports week, held in Chamonix in 1924. The event was a huge success, with Norway winning the most medals. In 1925 the IOC amended its charter to create a separate Winter Olympics to be held every four years. The summer and winter games were held in the same year until 1992.
In 1948, in the county of Buckinghamshire, England, the Stoke Mandeville Games, which later became the Paralympics, were held on the same day as the opening of the London Olympics. This event was organized by Dr Ludwig Guttmann, a neurologist who understood the importance of sport for people with physical disabilities. The games were held annually at Stoke Mandeville hospital and then moved to Rome in 1960. The events were archery, athletics, swimming, darts, table tennis, wheelchair basketball and wheelchair fencing. This featured 400 athletes from

23 countries, and ever since then the event has been held once every four years.
The Olympics were never just a sports event. They became something that will continue for centuries to come because they transform people’s lives by making winners heroes to their people. Before the 2008 Beijing Olympics, the weightlifter Matthias Steiner, who had competed for Austria and failed to win a medal at the previous Games, promised his wife, Susann, that he would win this time. But in 2007 she died in a car crash, and this became all the motivation Steiner needed to win gold. Although he struggled to keep up with the competition, in his final attempt at the clean and jerk he needed to lift 10 kilograms more than his failed previous attempt. When he won his gold medal he showed the world a photograph of his late wife, and cried. This is the spirit of the Olympic Games. ¢

The Olympics were never just a sports event. They became something that will continue for centuries to come.


“THIS EVENT WAS ORGANIZED BY DR LUDWIG GUTTMANN, A NEUROLOGIST WHO UNDERSTOOD THE IMPORTANCE OF SPORT FOR PEOPLE WITH PHYSICAL DISABILITIES.
In Our Defence

BEA HATTON asks if the UK needed to increase its defence budget, even at the expense of foreign aid.
On 25 February 2025, Sir Keir Starmer, British Prime Minister, announced an increase in UK defence spending to 2.5% of GDP, which means £13.4 billion more being spent on defence. As a result, the UK’s foreign aid budget was reduced from 0.5% of GDP to 0.3%. Was this switch of funding from charity to weaponry really necessary?
Foreign aid is any type of help that one country voluntarily gives another, which can take the form of gifts, grants or loans. Defence spending is money spent by a government to provide its military with weapons, equipment and soldiers. The big question is whether one of these is more important than the other.
Foreign aid is important because it helps those most in need. Official Development Assistance is aimed at ending poverty and focuses on helping the least-developed countries. Defence spending is important because it helps to improve national security, deterrence and international relations, while also building a stronger crisis response. In addition, any increase in a country’s defence budget ‘stimulates the economy’ by creating more jobs and driving the development of civilian business sectors.
“THE TIMING SUGGESTS AN URGENT DESIRE THAT TRUMP PERCEIVE STARMER AS A STRONG LEADER WILLING TO USE UK RESOURCES, RATHER THAN RELYING ON THE US.
The UK’s increased defence spending is a response to the ongoing war between Russia and Ukraine and the changing relationship between the United States and Europe over this war. Starmer announced the increase just before he travelled to the US for a meeting with Trump, and the timing suggests an urgent desire that Trump perceive Starmer as a strong leader willing to use UK resources, rather than relying on the US. Starmer wanted to seem stronger and more reliable after Trump said he

Foreign aid is arguably more cost effective at enhancing national security because it addresses the root problems.
wanted Europe to step up in Ukraine. Since the 2022 Russian invasion of Ukraine, the US had spent about $70 billion on arms for Ukraine, compared with around $145 billion in financial, military, humanitarian and refugee assistance from the whole EU: the US is a single country, while the EU is a collective of 27 states. But it is also important that Europe becomes less reliant on the US and builds up its own defences because, since Trump re-entered the White House in January 2025, he has spoken about the idea of the United States leaving NATO, which would be cataclysmic because the US is a global superpower and NATO’s largest contributor. There is also the ongoing war in Gaza, and growing instability across the world, and the UK’s military needs to be strong and independent.
Nevertheless, foreign aid is arguably more cost-effective at enhancing national security because it addresses the root problems in less developed countries, such as poverty and lack of education, and therefore helps to prevent the outbreak of conflicts in the first place. Foreign aid also helps prevent refugee crises by rebuilding those countries from which people are fleeing.
But one must also consider other countries’ foreign aid policies when setting our own expenditure. In 2023 the United States spent $68 billion on foreign aid, but since Trump is committed to cutting foreign aid, the UK needs to help countries that are having US aid pulled. The China–Africa Development Fund, China’s aid vehicle for Africa, is another important factor. Between 2000 and 2023, China loaned Africa $182.28 billion, with most of the money going towards Africa’s energy, transport and ICT sectors – another reason why the UK should not reduce its foreign aid: China is increasing Africa’s debt while deepening its power over the continent, a long-term threat to Western powers.
The UK needs to help countries who are having US aid pulled.
So, Starmer had a sound motive for increasing defence spending at a time when the world order is crumbling: the policy was rationally put in place for the good of the country, even though he had no desire to reduce foreign aid.
“STARMER WANTED TO SEEM STRONGER AND MORE RELIABLE AFTER TRUMP SAID HE WANTED EUROPE TO STEP UP IN UKRAINE.


Attention

GRACE WANG dispels some of the harmful myths about ADHD.
ADHD is everywhere, but many of us still don’t know what it is. Attention Deficit Hyperactivity Disorder (ADHD) is a neurodevelopmental condition affecting about 5–10% of children and 6.7% of adults globally. Characterized by symptoms of persistent inattentiveness and/or hyperactivity and impulsiveness, ADHD can affect many aspects of people’s lives, such as academic and social performance. Although it is a daily reality for millions of people around the world, there are still lots of people who do not comprehend how hard it is for someone with ADHD to get through the day.
“THERE ARE STILL LOTS OF PEOPLE WHO DO NOT COMPREHEND HOW HARD IT IS FOR PEOPLE WITH ADHD TO GET THROUGH THE DAY.
ADHD tends to evoke images of distraction or buzzing around, but scientifically it is categorized into three types: inattentive, hyperactive and combined. People with the inattentive type tend to have short attention spans; they are easily distracted, unable to stick with tedious tasks and constantly changing activities. They can appear forgetful and dreamy, and their minds may periodically wander to things irrelevant to the conversation, or they may zone out. Meanwhile, people with the hyperactive type fidget constantly, talk excessively, and act without thinking or make impulsive decisions. Those with the combined type experience both, which makes life even more chaotic and overwhelming. Beyond all those visible features, inner impairments such as emotional dysregulation can cause sudden mood shifts. Some people with ADHD also have time management difficulties because they see organizing and prioritising assignments as challenging tasks. Moreover, little or no sense of danger can put their physical health at risk. The intensity of symptoms can vary with age and gender, so an early diagnosis can help.
Although the exact cause of ADHD has not been pinned down, there is strong evidence that it is mostly a condition of brain structure and development, brain function and connectivity, and heredity – complex, but not the result of a single faulty gene. Based on recent research, ADHD is largely congenital, with a heritability of over 80%. People whose parents or siblings have been diagnosed with ADHD therefore have a higher chance of developing it themselves, and only about 20% of cases arise from non-inherited factors. Babies born before the 37th week of pregnancy or with a low birth weight have a higher likelihood of developing ADHD. Additionally, psychosocial adversity – including low parental education and peer victimization that cause severe psychological stress – or, very rarely, a mutation in the prenatal period can trigger ADHD.
Structurally, using imaging and measuring techniques such as CT scans and magnetic resonance imaging, scientists have found that people with ADHD tend to have a reduced volume in certain brain areas, such as the prefrontal cortex, the posterior inferior vermis of the cerebellum and the basal ganglia. These are structures in the brain that are mainly responsible for
focusing, coordinating movements, cognition and regulating emotions. This finding implies that ADHD is not simply a difference in behavioural preference. Instead, the apparent laziness, forgetfulness and hyperactivity that an individual exhibits are more likely due to differences in brain structure.
Finally, in the functional aspect, ADHD is characterized by neurotransmitter dysregulation, especially of dopamine and norepinephrine (involved in maintaining alertness and focus). These two neurotransmitters work together to modulate activity in the prefrontal cortex and other brain regions responsible for self-control and attention. An imbalance in these neurotransmitters disturbs the balance of excitation and inhibition in neural circuits, leading to ADHD characteristics. A deficiency in dopamine makes it harder to delay gratification, manage impulses and execute tasks; a deficiency in norepinephrine heightens emotional reactivity and distractibility. Together, these factors show how ADHD arises from brain structure, function and underlying biology.

Once the person is diagnosed with ADHD, they will be treated with medical interventions, such as stimulants like methylphenidate and amphetamine-based drugs.
ADHD is usually noticeable before the age of six because the symptoms can be easily recognized. However, it is more often diagnosed in boys than girls. This is because girls tend to have symptoms of inattentiveness only, which is less obvious because they are unlikely to show disruptive behaviours. Researchers at Cambridge, England, and Oulu, Finland, followed 49 adolescents diagnosed with ADHD at age 16 and examined their brain structure and memory function in young adulthood (between 20 and 24 years old), compared to a control group of 34 young adults. The results show that the group diagnosed in adolescence

had reduced brain volume as adults, due to reduced grey matter within the caudate nucleus – the region, deep in the brain, that integrates information across different parts of the brain and supports cognitive functions, including memory – which leads to poor memory function. Therefore, it is important to diagnose early and intervene with effective treatments to relieve the impairment.
Diagnosis of ADHD involves a comprehensive evaluation, including screening, a clinical interview and a diagnostic assessment. People who feel they may have ADHD can take part in a free screening carried out by a professional psychology team, who will ask about their symptoms and how long they have lasted, as well as their family history of ADHD. If the results indicate ADHD, they will be advised to go for a clinical interview. Before the psychologists arrange the assessment, they will ask someone who has known the person well since childhood (because ADHD is congenital) to fill out a report form for further understanding. When the person sees a GP, the GP will suggest a period of observation lasting around 10 weeks to see whether their symptoms have lifted, stayed the same or got worse. If necessary, the GP will conduct cognitive tests to measure attention and executive functioning. To ensure an accurate diagnosis, other conditions like anxiety, depression, learning disabilities or sleep disorders are ruled out. While ADHD is mostly diagnosed in childhood, adults can also be evaluated based on current symptoms and childhood history.
There is strong evidence that it is mostly a brain structural and developmental, brain functional and connective, and hereditary condition.



Once a person is diagnosed with ADHD, they can be treated with medical interventions, such as stimulants like methylphenidate and amphetamine-based drugs (Adderall, Vyvanse), to improve focus and impulse control by increasing dopamine and norepinephrine. Non-stimulant options, such as atomoxetine or certain antidepressants, are alternatives for those who cannot tolerate stimulants. Adding psychotherapy, particularly Cognitive Behavioral Therapy (CBT), helps address negative thought patterns and improve problem-solving. Educational support, such as extra tutoring and extended test time, promotes academic success. In addition, regular exercise, balanced nutrition and sufficient sleep help regulate energy and mood, while mindfulness techniques, such as meditation and yoga, improve focus and reduce stress.
ADHD is a complex neurodevelopmental condition caused by a combination of genetic, neurological and environmental factors. It affects brain structure, function and neurotransmitter regulation, leading to challenges with attention, impulsivity and hyperactivity. Once ADHD is diagnosed, treatment requires a combined approach, including medical intervention alongside lifestyle and support strategies. With a better understanding of ADHD, we should show more understanding and embrace the people who live with it. ¢
Mates
LILY ROBERTSON, our editor, has no idea what love may be, but is quite sure what a soulmate may not be when it is what it isn’t.
Since the beginning of time, or whenever we began to immortalize ourselves in history, people have longed for love. It is my belief that when someone wants something so badly, but it simply does not exist, they create it. We did not have an afterlife, so we made Heaven. We could not blame ourselves for our own wrongdoings, so we conjured the Devil. Men could not accept that it is women who invent life, and so we had to have a creator, God. Of course these are pessimistically phrased examples, but my point stands that people need an explanation. There must be meaning to understand and hold on to; otherwise it is embarrassingly easy to just let go. This general theory is applicable to the legend of soulmates.
“THIS MYTH THAT WE FASHIONED SYMBOLIZES OUR HOPE THAT LOVE CAN MAKE US STRONGER, AND IT IS WHAT WOULD HEAL AN ANCIENT UNREST INSIDE.
In Ancient Greek mythology, humans were originally androgynous. They had four arms, four legs and a single head with two faces: happy, hopeful and therefore evoking fear in the mind of Zeus, who worried about their power. The god of the sky split the creatures in two, leaving the broken beings to search throughout their lives for their lover. The people were depressed in their solitude.
In Plato’s ‘Symposium’, Aristophanes says, “Each one longed for its other half, and so they would throw their arms about each other, weaving themselves together, wanting to grow together. Love is born into every human being; it calls back the halves of our original nature together; it tries to make one out of two and heal the wound of human nature. Each of us, then, is a ‘matching half’ of a human whole…and each of us is always seeking the half that matches him.” I suppose this myth that we fashioned symbolizes our hope that love can make us stronger, and it is what would heal an ancient unrest inside.
Another popular take is the Red String Theory, deeply rooted in East Asian folklore, specifically Japanese and Chinese culture. It suggests that everyone has an invisible red thread tied to their pinky finger, and that the string connects two people who are destined for each other. Regardless of time, place or circumstance, they will meet and eventually be together. It may knot or tangle or stretch, but it never breaks. I think this is the epitome of optimism in expectation. I don’t know about whoever’s


reading this, but I can certainly relate to the sentiment of feeling drawn to a person no matter what. I love the concept of willingness despite hardship. The comfort of the notion that there is someone always connected to you, no matter what is going on in your life, is undeniably charming.
There are countless other examples. ‘Zivug’ is the Jewish equivalent of the concept. It suggests that there are divine pairings of two souls, often described as the other half of a person’s being. In fiction, the notable instances are endless. I must say, I always wished for a Diana Barry to my Anne Shirley, or a Morticia to my Gomez, or a WALL-E to my EVE. I loved that two robots who hardly spoke formed a connection so pure it was undeniable even to a human. I feel it so deeply inside of me that there is something indescribably beautiful in loving, despite its flaws and challenges. There must be trials to love, like all other forms of happiness, otherwise one cannot understand or appreciate it fully. Still, it would not be a proper review unless I went over the other side of the argument.
Soulmates are widely criticized by the same people who hate Disney movies. What I mean to say is that those who favour reality and believe in trusting only what they can see right in front of them are fair to point a finger. The idea of predestined matchmaking

and lovers crossed in the stars risks setting unrealistic standards and potentially undermining perfectly healthy relationships. Is not feeling an immediate cosmic connection to your Tinder date reason enough to make a break for the exit? Is life worth living if you haven’t found the person to live for? The flaming feminist and antisocial individual inside of me stands up and says that love shouldn’t have to complete you, nor should it be needed to heal something inside of you. You are more than enough all on your own, and growing and evolving shouldn’t have to be achieved only with the held hand of a lover. I

think loneliness is an illness which we have assumed can only be cured by romance. I have many a friend who has come to me distraught, unable to leave a relationship that they know is broken, because they would rather endure the turmoil of picking up broken glass than the heartbreak of being alone. I understand the fear and frustration, of course, and I would never preach to someone that it will all end up alright because their soulmate is out there waiting for them. It is only that love is a feeling so sought after that it is becoming harder and harder to truly attain. The purity of the spirit of soulmates is a lovely thing to believe in, truly, but it is not something to be dependent on. There is something deeper in loving that I am not sure I can sum up.
So let’s circle back to Plato! While Aristophanes’ myth in ‘Symposium’ is often cited as the origin of the soulmate theory, Plato himself was not the biggest believer. His own views leaned more towards love being a power that can lead you towards great wisdom and eternal happiness. ‘Phaedrus’ was another brilliant work of his, and my favourite quote from it would be, ‘Love is a serious mental disease.’ I can attest that he has a clear point. He points out that love is almost a kind of madness, but it ultimately brings the soul into truth, and so love must be worth it.
In Plato’s ‘Symposium’, Aristophanes said, ‘Each one longed for its other half, and so they would throw their arms about each other, weaving themselves together, wanting to grow together.’


His philosophy was essentially that soulmates are a lesser interpretation of true loving. It is not just about finding the other person and being with them, but being in love can help your soul to remember its nature, and ascend towards beauty and trueness. So, rather than searching for a missing half, love is about finding yourself and who you have the potential to be. It is important to note that soulmates are not people who are predetermined and made from birth. You will grow in life and be many different people in versions of the same body. Soulmates, if they exist, will be forged and built over time. But that is enough of Plato.
There will be times in life when you’ll meet someone, and it will become apparent that the two of you work. It might be an immediate gut feeling, or it may take years for you both to grow into the right people, whether you are lovers, friends or family, or as a fact that transcends labels entirely. You just make sense. It will become clear that this, this is what life is about. Not the green on the bills but the green in their laughter. The wealth that you have acquired from earning this person’s love. I don’t believe in soulmates; quite frankly I think they’re stupid even after writing this article, because I choose to love people on purpose. But it matters very little what you call these connections you have with people. If a person makes you better so much so that you feel beyond the physical world then there is nothing to do but love that person and enjoy the world as it becomes yours. Soulmates or not, you mean something to one another, and that is the purest, most valuable form of currency you could ever have. Choose to love. ¢



LINA WANG warns us not to take the false appearance of social media seriously.
Social media distorts the perception of beauty. Every day, millions of young people, especially young women, wake up and check social media before getting out of bed. They look in the mirror and wonder if they are ‘enough’. They over-analyse their face and compare it to the flawless faces they see online.
We live in a world now where social media controls trends. When beauty is equated with self-worth, those trends set an unrealistic beauty standard that makes girls feel the need to change – to apply more makeup and photoshop their photos just to be seen as normal, acceptable.
Don’t let this constant need for approval shape who you are, because in reality beauty really is not defined by perfection. It’s about uniqueness, individuality and embracing your flaws, because those imperfections are what truly make someone stand out.
“IF YOU KEEP COMPARING YOURSELF TO INFLUENCERS AND CELEBRITIES WHO HAVE ACCESS TO THE BEST OF EVERYTHING, YOU’RE COMPARING YOURSELF TO AN EDITED DREAM.
But in truth, it really doesn’t matter. You don’t have time to live for other people’s opinions. What truly matters is how you see yourself.
Social media creates a meticulously curated world where perfection seems effortless. But in fact it’s all an illusion. Almost every selfie and photo you see online is edited, and deep down everyone knows it’s not real, yet millions of girls still feel pressured to live up to these standards. Despite knowing that these images are altered, the desire to meet social media’s beauty standards is hard to shake. It’s not just about looking a certain way, but about feeling accepted, fitting in, keeping pace with everybody else in this endless stream of trends. You’re riding the wave of trends but forgetting who you really are underneath.
Think about the people you look up to the most. Is it purely because they have perfect faces? Hopefully not. It’s because they have personality and confidence and humour. Confidence is magnetic. When you stop caring about your appearance and how you look to others, people respect you more. No one actually notices if your hair looks greasy that day or if your nose looks big. They care about how you carry yourself, about how you portray yourself, and the moment when you stop obsessing is the moment you start attracting.
If you keep comparing yourself to influencers and celebrities who have access to the best of everything, you’re comparing yourself to an edited dream. Most of those people spend hours posing and editing. It’s their job, after all, and you have better things to do than waste your 86,400 seconds every day obsessing over how you look. Why try to keep up with something that isn’t even real? The girl with the perfect life? She’s probably just as insecure as you. So stop. Stop trying to meet a beauty standard that isn’t real. Stop editing your face just to feel good enough. Looks fade. Character stays. Focus on growth, not appearance. ¢

Orders

Where the last Pope is concerned, TOBY NIEMAN does not think a eulogy is justified.
Eulogies are full of praise. It is, after all, what the word means: a eulogy is a noun from the Classical Greek εὐλογία, meaning praise. This is because it is considered ‘rude’ and ‘poor form’ to speak ill of the dead, which means that dead men are looked on with kinder eyes. Sometimes these eyes are not deserved. So maybe this is not a eulogy. Maybe this would better be described as an ‘Alítheia’, from the Classical Greek ἀλήθεια, meaning ‘Truth’ or ‘Disclosure’. Here begins the Alítheia of Pope Francis, Bishop of Rome.
“POPE FRANCIS SPOKE ABOUT WOMEN’S ‘UNIQUE GIFTS’ AND CALLED FOR THEM TO HAVE A ‘GREATER ROLE’ IN THE CHURCH.
Pope Francis was born Jorge Mario Bergoglio in 1936 in Buenos Aires, one of five children to an Italian immigrant accountant and a housewife also of Italian descent. His father’s family fled the fascist Italian government led by Mussolini.
Jorge Mario Bergoglio studied chemistry and worked briefly as a food technician in a laboratory before discovering a calling to the priesthood. He joined the Society of Jesus, beginning his training as a Jesuit, and was ordained in 1969. He was the Jesuit Provincial of Argentina during Argentina’s ‘Dirty War’. While he is credited with sheltering some people from the military regime, many said he failed to defend his fellow Jesuits vocally when they were abducted and tortured, and he was sidelined within the order until the 1990s, when he began serving in public roles within the Church again.
He became Archbishop of Buenos Aires in 1998 and was made a cardinal by Pope John
Paul II in 2001. As Archbishop, he chose to forgo the traditional palace for a modest apartment, cooked his own meals, and used public transport. This humility earned him the respect of many within and outside the Church. He became known for championing the poor, opposing consumerism, and distancing himself from Vatican elitism. In 2013 after the resignation of Pope Benedict XVI, Jorge Mario Bergoglio was elected Pope. He took the name Francis after the saint of radical poverty and peace, and quickly became known for his simplicity, symbolic gestures, and mediafriendly style. He washed the feet of prisoners, lived in the Vatican guesthouse rather than the Apostolic Palace, and spoke out regularly on climate change, migration, and economic inequality. But symbolic gestures are not the same as institutional reform.
Pope Francis spoke about women’s ‘unique gifts’ and called for them to have a ‘greater role’ in the Church. Under his leadership, however, little progress was made. He maintained that the ‘door is closed’ on the question of ordaining women. Women are permitted to consult and may have symbolic roles but remain excluded from any real authority in the Vatican’s hierarchy.
Pope Francis is described by many as a progressive, especially when compared to his predecessors, although that is not a high bar. In 2013 he famously said, ‘Who am I to judge?’ regarding LGBT people. While he supposedly encouraged respect for LGBTQ children, he also suggested that those same (apparently respected) queer children
should be given psychiatric treatment (aka the pseudoscientific and harmful practice of conversion therapy). Officially, the Vatican withdrew these statements, telling the French news agency AFP that the comments ‘were removed so as not to distort the Pope’s train of thought’, and made clear that Pope Francis did not say homosexuality was a disease but instead wanted to look at such things from a psychiatric angle.
To this point I understand those who will still excuse these actions performed by a man bound to his faith. I can hardly blame the leader of an organisation that understands its founding doctrine to dictate that gay people are sinners and women have different roles within the church for believing that gay people are sinners and women should have different roles in the church.
Where it becomes more difficult to defend Pope Francis through scripture is in his handling of the church’s sexual abuse scandals.
One notable case in which Pope Francis not only failed to reform and improve the Church but even reversed what little action the previous papacy had taken is that of Mauro Inzoli. Inzoli was an Italian priest who, after being found guilty by an internal tribunal, was removed from his clerical position in 2012 by Pope Benedict. In 2014 Pope Francis reinstated Inzoli as a priest, though he was asked to lead a life of ‘prayer and humble discretion’ and to stay away from children.
In 2016 Inzoli was found guilty on eight counts of sexual abuse, no thanks to the Vatican who, in 2015, refused to hand over the results of the internal investigation to the Italian authorities. Pope Francis claimed to have made a mistake due to his ‘not understand[ing] these things well’ and claiming that he had ‘learned from this’, a lesson that had apparently not sunk in by 2018 with Bishop Juan Barros of Chile, whose accusers he initially dismissed as slanderers; or by 2019 with Bishop Gustavo Zanchetta whom Pope Francis appointed to a Vatican position despite allegations of abuse and fraud of which

he was later found guilty; or in 2022 with Cardinal Jean-Pierre Ricard, who publicly admitted to abusing a 14-year-old girl and yet (while suspended from acting as a priest outside of his diocese for a minimum of five years) retains his status as cardinal. There are, sadly, no doubt more cases which are never re-examined or turned over to the proper authorities.
To his credit, Pope Francis did introduce the Pontifical Commission for the Protection of Minors, and in 2019 he did remove the expectation of secrecy for documents regarding sexual abuse. But he did not introduce rules requiring the Church to turn over documents pre-emptively after finding evidence of a member of the clergy’s sexual abuse, and he seemed to believe the best people to deal with these situations were not the courts in the country in which the crime was committed but a micronation in Europe. Carlo Alberto Capella was a Vatican diplomat to the US when, in 2017, he was recalled to the Vatican by Pope Francis after US officials revealed that Capella was under investigation for
the possession and distribution of child sexual abuse material. Because his diplomatic immunity protected him from prosecution in the US, Canada (which alleged he had distributed such material while on a trip to the country in 2016) issued a warrant for his arrest. Instead, he was tried in the Vatican and in 2018 sentenced to five years in prison (compared to the likely 7+ years if prosecuted under Canadian law). In 2021 he was allowed day release within the Vatican.
It is my opinion that victims of these crimes are under no obligation to forgive a) those who committed these abuses and b) those who enabled (through either negligence or malintent) their abusers. So frankly I don’t care if he admitted to making mistakes: any other individual who commits a crime through negligence is punished, or at the very least held accountable. When he had the power
When he had the power to act decisively and compassionately to defend the victims of members of the Church he instead often chose to delay and silence these scandals.


to act decisively and compassionately to defend the victims of members of the Church, he instead often chose to delay and silence these scandals. His response to the global sexual abuse crisis in the Church has been full of missed opportunities and mixed messages and, possibly most horrifically, a failure to centre the hurt of the victims, focusing instead on the hurt of the Church and her image. There seems to be a theme running through Pope Francis’ papacy where action is taken only if there is (or is expected to be) a public outcry.
I understand that Pope Francis inherited a house full of rot. I appreciate his following of the Bible by living in humble conditions, giving hugely to the poor, forgoing the large palace in favour of simple living quarters and how (humble in death as in life) his funeral was understated (at least by papal standards). But humility in housing and living does not cleanse him. A man should not just be measured by how he lives but by what he does. Mercy without justice is not compassion; it is complicity. I believe he meant well. Maybe he thought reform could come without shocking the entire institution. But in my opinion the Catholic Church needed (needs) rebuilding and what it got instead was a priest well trained in PR. This review of his career is not canonisation or condemnation. It is clarity. ¢

SURIYA RAMYEAD takes a look at the most famous ethical thought experiment in the whole of Philosophy.
Philosophy isn’t just for old libraries and dusty textbooks; it has a habit of creeping into everyday life in unexpected ways. Take the Trolley Problem, for example. This famous thought experiment, first introduced by Philippa Foot in 1967, puts us face-to-face with an ethical dilemma. You’re standing next to a runaway trolley that’s about to hit five workers on the tracks. You can pull a lever to divert it onto another track, where it will kill one person instead. What do you do?
THE BEAUTY AND FRUSTRATION OF THE TROLLEY PROBLEM IS THAT THERE’S NO EASY ANSWER. HOWEVER, VERSIONS OF THIS DILEMMA PLAY OUT ALL THE TIME IN THE REAL WORLD.
At a glance, it may seem like a purely hypothetical scenario. However, the Trolley Problem has endured because it concerns something universal: the tension between principles and outcomes, between what feels right and what does the most good. While the scenario itself may seem like a puzzle for philosophers, the idea can be found in everything from emerging technologies to real-life moral controversies.
At its core, the Trolley Problem forces us to confront two competing moral frameworks. On one side is utilitarianism, which argues that the right action is the
one that produces the greatest overall good, or, as Jeremy Bentham put it, ‘the greatest amount of happiness for the greatest number of people’. On the other side is deontological ethics (deon means ‘duty’ in Greek), which says that some actions, like intentionally killing someone, are always wrong, no matter the outcome. The problem lies at the core of a timeless question: should morality focus on the consequences of an action, or should it be about following certain universal rules, no matter what?
The beauty and frustration of the Trolley Problem is that there's no easy answer. However, versions of this dilemma play out all the time in the real world. Politicians, doctors and even engineers often find themselves having to make decisions that weigh individual lives against the greater good.
One striking real-world parallel to the Trolley Problem comes from the recent controversial case of Luigi Mangione, a young American who shocked the world by killing the CEO of a major US health insurance company, claiming that the man's policies were directly responsible


for the deaths of thousands. While the motives behind the killing have not been publicly articulated, law enforcement sources suggest that his actions were driven by a deep resentment of health insurance companies, which Mangione referred to as 'parasitic'. The CEO's company is said to have denied people access to affordable, life-saving care, leaving many to die unnecessarily. By eliminating the CEO, some argue, Mangione was saving lives.
But the public reaction was deeply divided. Some saw Mangione as a hero, someone who did something awful to achieve a greater good, a definitively utilitarian calculation. Others were horrified, arguing that such a cold-blooded killing is murder, no matter the rational justification.
This case turns the abstract Trolley Problem into something uncomfortably real. Was Mangione pulling the lever to save the majority at the expense of the few? Or was he crossing a moral line that should never be crossed? The debate around his actions reveals just how messy real-life ethics can get.

Imagine an unavoidable accident where the car has to ‘choose’ between hitting a group of pedestrians or swerving into a barrier.
The Trolley Problem isn't just a philosophical curiosity or a rare, extreme case like Mangione's. It's also a blueprint for the kinds of decisions being made in the field of artificial intelligence (AI). Let's use the example of self-driving cars. While we may not realise it, these vehicles are often faced with scenarios eerily similar to the Trolley Problem. Imagine an unavoidable accident where the car has to 'choose' between hitting a group of pedestrians or swerving into a barrier, risking the lives of its passengers. What should it do?
These aren't hypothetical questions. They're being debated by engineers and ethicists right now. Designing the algorithms for these cars means deciding, in advance, whose lives are prioritised in split-second decisions. Should the car aim to minimise total casualties? This would be classified as a utilitarian approach. Or should it prioritise protecting its passengers at all costs?
Complicating matters further is the question of accountability. If a self-driving car 'chooses' to kill one person to save five, who's responsible? The programmer? The company that made the car? The government that approved its use? The Trolley Problem might have started as a philosophical experiment, but in the modern world it's a matter of life and death.

The same moral trade-offs crop up in public policy, where decisions often affect millions of lives. Consider the allocation of scarce resources during a crisis. During the COVID-19 pandemic, for instance, governments had to decide how to distribute limited vaccines. Should they prioritise frontline workers and the elderly, who were most at risk, or should they focus on younger, healthier populations to ensure long-term societal stability? These are Trolley Problems on a massive scale. No matter what choice leaders make, there are trade-offs, and lives will inevitably be lost.
Climate change presents another version of this dilemma. To avert catastrophic consequences in the future, we may need to take drastic actions now, actions that could disrupt economies and cause hardship for millions. Do we prioritise long-term survival over short-term comfort?
Luigi Mangione killed the CEO of a major US health insurance company, claiming his policies were directly responsible for the deaths of thousands.


These questions force us to grapple with the same utilitarian-versus-deontological tensions that underpin the Trolley Problem.
The Trolley Problem has stuck around for decades because it forces us to think deeply about morality in a way that’s both uncomfortable and necessary. Real life doesn’t come with neat answers or perfect outcomes. Whether you’re a policymaker, a tech developer, or someone grappling with a personal moral choice, you’ll often find yourself in situations where the right path isn’t clear.
Luigi Mangione’s story serves as a reminder of the stakes involved in these decisions. While some see him as a utilitarian hero who sacrificed his personal morality for the greater good, others view his actions as an unforgivable violation of ethical principles. His case reminds us that even when we think we’re doing the right thing the consequences of our actions can be unpredictable and deeply divisive.

From programming AI to crafting public policy, the dilemmas posed by the Trolley Problem are everywhere. And while we may never find a perfect solution, engaging with these questions makes us more thoughtful and, hopefully, more prepared to navigate the complexities of modern life.
The lever is always there, waiting to be pulled. ¢


ZARA BRETT asks whether a serial killer is born or made.
Everyone knows that serial killers exist, but we don't typically know much about how they come to commit their crimes. It is necessary to mention that no single thing causes someone to become a serial killer; it is a mixture of unique factors. The FBI's report on serial murder says: 'Since it is not possible to identify all of the factors that influence normal human behaviour, it similarly is not possible to identify all of the factors that influence an individual to become a serial murderer.' With this in mind, it is interesting to look at real cases and see the supposed meaning and motivation behind the murders.
“
STUDIES SUGGEST THAT BRAIN ABNORMALITIES MAY CONTRIBUTE TO THE FORMATION OF SERIAL KILLERS.
There is much speculation about whether serial murderers are simply born, through genetics and DNA, or whether it is something that can be developed within a lifetime. The FBI defines serial murder as this: ‘The unlawful killing of two or more victims by the same offender(s), in separate events.’ When we consider both biological and social aspects, should we lean more to the nature or nurture side of the argument?
Many convicted serial killers have been diagnosed with a mental illness, such as antisocial personality disorder, paranoid schizophrenia,
bipolar disorder, and more. This suggests a connection between mental disorders, which often have a genetic component, and serial murder. There are many examples of this, to name a few: David Berkowitz, who was known as 'Son of Sam', killed six people in the 1970s, claiming that his neighbour's dog told him to do it. He was diagnosed with paranoid schizophrenia. Aileen Wuornos confessed to seven murders in Florida, and she was diagnosed with antisocial personality disorder. Charles Manson, leader of the 'Manson Family' cult and mastermind behind the 1969 murders at the home of Sharon Tate, was diagnosed with antisocial personality disorder. But having one of these disorders definitely does not guarantee homicidal tendencies, and there is always a mixture of circumstances that is different for each individual.
Studies suggest that brain abnormalities, a biological factor in the development of violence, may contribute to the formation of serial killers. Brain scans of convicted murderers show underdevelopment in certain areas of the brain, mostly those involved in emotional processing. The University of Chicago conducted a study looking at the brains of over 800 prisoners, and the researchers found that 'individuals who had committed or attempted homicide had reduced gray matter when compared to those involved in other offenses.' The research also explains that gray matter has more cells and neurons, and therefore a higher ability to process information, for example the emotional information needed to feel empathy for others. 'Those reductions were especially apparent in regions of the brain associated with emotional processing, behavioural control and social cognition,' said the psychologist Marine Wang in 2019. Her study illustrates the link between the act of homicide and a biological deprivation of certain brain functions.

Many convicted serial killers have been diagnosed with a mental illness, such as antisocial personality disorder, paranoid schizophrenia, bipolar disorder, and more.
Another way that many serial killers are characterized is through their family and upbringing: environmental factors, things an individual is not born with. It is typical for convicted murderers to have had a difficult childhood, which has influenced how they have developed, and therefore their emotional processing and their feelings about other people. In his theory of attachment, John Bowlby coined the term 'internal working model': the idea that our first relationship with our primary caregiver (the first person we attach to as babies, typically the mother) becomes a template for future relationships. This means that if, as a child, an individual has an abusive



“
A STUDY LOOKING AT THE BRAINS OF OVER 800 PRISONERS PROVED ‘INDIVIDUALS WHO HAD COMMITTED OR ATTEMPTED HOMICIDE HAD REDUCED GRAY MATTER COMPARED TO THOSE INVOLVED IN OTHER OFFENSES.’
The FBI’s report on serial murder states that ‘neglect and abuse in childhood have been shown to contribute to an increased risk of future violence.’


It is a very typical debate, whether violent people are born or made, nature vs nurture. But with newer research, it is becoming more apparent that it is a mixture of both.
or manipulative relationship with their primary caregiver, it can affect the stability of future relationships. The FBI’s report on serial murder states that ‘neglect and abuse in childhood have been shown to contribute to an increased risk of future violence.’ This portrays the way that a person’s environment and social surroundings can affect how violent tendencies develop. Similarly to this, social rejection and alienation can also influence violent behaviour. Wang again: ‘…the development of social coping mechanisms begins early in life and continues to progress as children learn to interact, negotiate, and compromise with their peers. In some individuals the failure to develop adequate coping mechanisms results in violent behaviour.’ This introduces the concept of violence as a way to cope with social rejection.
It is a very typical debate, whether violent people are born or made: nature vs nurture. But with newer research, it is becoming more apparent to experts that it is a mixture of both. There is no generic template for a serial killer, and what pushes people to the point of serial murder is different for each individual. The research into brain abnormality, mental disorders and child abuse suggests that serial killers most commonly develop from a mixture of psychological and environmental roots. It is not limited to any specific characteristic, so it is not possible to determine whether it is specifically nature or nurture. Serial killers fascinate people all over the world, and with the dramatization of their exploits through television, we need to be careful about how they are portrayed, because it is a sensitive subject. However, understanding contributing factors is crucial for identifying risks and preventing future violence. ¢

ESME CLARK thinks things are better than we think.
When we think of lower-income countries, we picture poverty, instability, and dependence on foreign aid, but I will try to challenge this perspective. What if I told you that most Western employees, large multinational corporations and financial institutions are still trying to operate according to a deeply rooted, outdated and distorted fact base?
Before you read this article, answer these:
1. Worldwide, 30-year-old men have spent 10 years in school, on average. How many years have women of the same age spent in school?
a) 9 years
b) 6 years
c) 3 years
2. How many people in the world have some access to electricity?
a) 20%
b) 50%
c) 80%
3. How many of the world's one-year-olds have been vaccinated against some disease?
a) Less than 25%
b) 55%
c) More than 85%
4. Global climate experts believe that over the next 100 years the average temperature will:
a) Get warmer?
b) Remain the same?
c) Get colder?
The Destiny Instinct is the idea that ‘things are as they are for ineluctable, inescapable reasons: they have always been this way and will never change.’ If we’re not careful, this instinct could blind us to the revolutionary transformations in societies happening around us. For example, menstrual pad factories. Did you know that every pregnancy results in around two years of lost menstruation? The falling fertility rate across the globe should be a significant indication that change needs to happen in the menstrual pad industry to supply the new consumers. However, they are failing to see this new opportunity. This is where the destiny instinct kicks in. Factories continue to provide for the 300 million women on Level 4 and fail to notice the other 2 billion women on Levels 2 and 3 searching for a reliable and



straightforward pad that will last them a day at work without a change: a market the factories miss because of outdated stereotypes.
The media is also a main contributor. Physiologically, people are more drawn to the negative stories in the news, because of our emotional and physical response to them. 'We are evolutionarily wired to screen for and anticipate danger, which is why keeping our fingers on the pulse of bad news may trick us into feeling more prepared,' says Cecille Ahrens, clinical director of Transcend Therapy. Journalists are aware of this and use it to their advantage: to them, more clicks means more business. Therefore, they will make things up, exaggerate stories and use our traumas against us to trigger this reaction. But how would we stay up to date with no news, you may ask? I believe we can still watch the news, but we need to limit ourselves to a couple of times a day and take part in activities that lift our mood afterwards.
I encourage you to visit gapminder.org, a website founded by Ola Rosling, Anna Rosling Rönnlund, and Hans Rosling. Their mission is to fight devastating ignorance with a fact-based worldview everyone can understand. They also wrote the book FACTFULNESS.
One of the biggest misconceptions is that people in lower-income countries can't afford our goods and services. As you can see in image 1, where the dotted line marks the end of extreme poverty, 49% of people were living in extreme poverty in 1966; by 2023 that figure had plummeted to only 9.7%. As the image also shows, there are five billion potential consumers out there on Levels 2 and 3 wanting to buy our goods and services, who can easily be missed if we go around thinking of them as 'poor'.
How many of the world's one-year-old children today have been vaccinated against some disease? Getting a vaccine from the factory to a child is an extremely hard job, because the vaccine needs to be kept cool the entire way; the infrastructure required is comparable to setting up a new factory. The answer is 88%! Yet when a group of 71 well-dressed top bankers gathered to answer this question, 85% of them responded that less than 25% of children had been vaccinated. This proves that these major investors have missed out on such a significant shift in development, which is a catastrophe!
As you can see in image 2, there are 5 billion potential customers from Levels 2 and 3 who are ready to pay. This is a key indicator that they are failing at their jobs and missing out on some

Image 2
Image 3

A group of 71 well-dressed top bankers responded that they thought less than 25% of children had been vaccinated.
PHYSIOLOGICALLY, PEOPLE ARE MORE DRAWN TO THE NEGATIVE STORIES IN THE NEWS.
of the most significant and profitable investment opportunities in the fastestgrowing part of the world! This is one of the many examples proving we must educate ourselves before it’s too late.
At the beginning of this article, I asked you four questions about gender differences in education, access to electricity, vaccinations and climate change. Here are the answers: 1. a) 9 years; 2. c) 80%; 3. c) More than 85%; 4. a) Get warmer.

I am sure that you were surprised by the first three results. These questions come from the Gapminder Misconception Study 2017, in which online panels of the general public in fourteen rich countries were asked twelve basic fact questions. The results are shown in Image 3. The reference to chimps is that a chimp picking answers at random would get one in three right, about four out of twelve. The average human score, however, was a disappointing 2.2 out of 12 answers correct.

I included the last question in this article because I wanted to show you what a success story looks like. Almost everyone selected the right answer: the climate will get warmer. I believe we can replicate this success for the other questions.
The world is changing rapidly, so we can no longer rely on our outdated assumptions.
The world is changing rapidly, so we can no longer rely on our outdated assumptions. But change is possible, as we saw with the climate change question. We must share the good news and challenge people’s understandings and biases of the world.
So, what can you do? Explore resources like gapminder.org and dollarstreet.org, or read FACTFULNESS by Hans Rosling to expand your perspective. Share what you’ve learned with your friends and family. And be careful with the media. They are a business not a fact-based mission. Most importantly, recognise that the world is changing faster than expected. The future belongs to those who stay informed, so don’t miss your chance to get educated! ¢

CHIMPS, CHOOSING AT RANDOM FROM THE THREE OPTIONS, WILL PICK THE CORRECT ANSWER 1 IN 3 TIMES.
Viewing Private
EMILY DEANE complains that great works of art are often stowed away by private collectors who deny the public a chance to see parts of their cultural heritage.
Imagine a world where all artwork is inaccessible to the public, where it is hoarded by private art collectors and only seen by a select few. Should the cultural heritage tied to artwork amassed over generations be permitted to be treated as private trophies, or does it belong under the public eye?
Since ancient civilizations wealthy patrons have hoarded private collections of artwork as a symbol of their status, to be seen only by a chosen minority. Examples of this relationship begin with Pericles, a Greek politician (born 495 BC), and Phidias, a master sculptor, in the reconstruction of the Acropolis, an ancient citadel in Athens. This sort of arrangement continued until the Renaissance period, when the attitude toward private collections shifted, resulting in the establishment of public museums and galleries in the 18th and 19th Centuries. We can attribute this evolution to a growing recognition of art as a form of cultural heritage, belonging not just to individuals but to society as a whole.
“
SINCE ANCIENT CIVILIZATIONS WEALTHY PATRONS HAVE HOARDED PRIVATE COLLECTIONS OF ARTWORK AS A SYMBOL OF THEIR STATUS.
Private art collection allows for artwork to be better maintained for the most part, since most art collectors have more spare financial resources to invest in the preservation and restoration of artworks than some underfunded public institutions. But private collections inherently limit public access, which not only restricts both cultural education and enjoyment of the art, but also creates inequality in accessing culturally important works. Damien Hirst, who owns a gallery in Vauxhall, says most artwork is 'stored away in boxes where no-one can see it', which prompts the question whether the preservation of art is worth it if only a few can enjoy it.
Although the impact of art is fundamentally tied to its ability to be experienced by viewers, the continuation and preservation of cultural heritage, even if it is only seen by select individuals or scholars, is arguably more significant.
Private art collectors stimulate the art market by driving up prices, which provides support for artists because private collectors tend to purchase art from both well-known and emerging artists, which fosters innovation and originality due to


TO ADDRESS THE DIVIDE BETWEEN PRIVATE COLLECTORS AND PUBLIC INSTITUTIONS, MULTIPLE SOLUTIONS HAVE BEEN PROPOSED IN THE UK.
their diverse interests. Public museums, on the other hand, often lack the funds to invest in emerging artists, only purchasing well known and established artworks.
From a legal standpoint, the right to own property (including artwork) is fundamental in many societies such as the US, where it is outlined in the Fifth Amendment of the Constitution, so it can be argued that restricting private art collection would infringe upon a citizen's individual liberties. In the US there is legislation in place that allows the artist to retain some moral rights even after the artwork has been sold, an example being the Visual Artists Rights Act (VARA) of 1990, whose clauses include the right of integrity and the right to prevent destruction. While this is not comprehensive in its protection of the artist, and by extension the artwork, it allows the protection of cultural heritage while also respecting the right to own property.

To address the divide between private collectors and public institutions, multiple solutions have been proposed in the UK in the past, such as tax incentives for public display of private collections, mandatory lending programmes for significant works and digital access initiatives. These bridge the divide well, I think, as they allow for the continuation of private art collection, while also providing the public with access to a tangible cultural heritage.

Some objects, such as gold and diamonds, are intended as a repository of money and therefore do well stored in vaults. But artwork is intended to be seen, to generate and communicate emotion, ideas and thoughts, to teach and display shared history and to build bridges between people in different cultures and times. Artwork should therefore always be accessible to the public, yet vast amounts of it are kept purely for private enjoyment: in 2013 a warehouse complex called 'Geneva Freeport' held about 1.2 million works of art valued at $100 billion, including 1,000 works by Pablo Picasso.
Artwork is intended to be seen, to generate and communicate emotion, ideas and thoughts.
The monopolisation of cultural heritage for no purpose other than as a store of wealth and prestige defeats, in my opinion, the purpose of the artwork itself, since art is meant to be seen, as the artist intended. I believe that while private art collections and collectors are beneficial and irreplaceable for both the preservation and restoration of artworks and the stimulation of the art market, the greatest artworks should always be made available to the public, through either public museums or private-public partnerships. ¢

“
THE GREATEST ARTWORKS SHOULD ALWAYS BE MADE AVAILABLE TO THE PUBLIC.


Home
RUPERT BRETT charts the crazy economic ups and downs when government wants the building industry to house the population.
Between 1947 and 1955 the Conservative and Labour governments implemented planning laws that cemented the notion that all councils had the power to block any development under their jurisdiction. The aim? To centralise planning control, ensure coordinated development and manage growth. This was introduced during a housing boom, when councils built affordable houses to counteract post-war shortages. However, while it regulated construction and protected green space, it gave little to no incentive for councils to grant permission for private development. This provided the foundations for the modern-day housing crisis.
The path to planning permission today shares much of its policy with this bill. However, there are a few key differences that have furthered the problem: in the 1980s the building of affordable housing fundamentally changed. In an effort to reduce public sector borrowing the government capped how much a council was able to borrow for capital expenditure. The councils in turn could not obtain the funds they previously could, which limited investment in local property markets; this led to less construction of affordable homes.
“THE RICH GET RICH AND THE POOR GET POORER. THIS IS TRUE, NOT JUST FOR THOSE WHO ARE ADULTS, BUT THOSE WHO ARE YOUNG, THOSE WHO ARE ATTEMPTING TO BEGIN THEIR JOURNEY ON THE PROPERTY LADDER.
The 1980s also introduced the Right to Buy scheme, which allowed people living in social housing to buy their home from the council at a discount. This meant that the supply of affordable housing began to decrease. The combination of these two factors meant that the stock of affordable housing was not being maintained. To combat this major issue the Section 106 Agreement was introduced in the Town and Country Planning Act of 1990. This allowed councils to negotiate with developers to include affordable housing within their developments, a practice that has become more common as council funds dwindle. With councils no longer building social housing, the burden of building affordable housing fell solely on the developers.
So, the government built the social housing, and the developers built
private housing. On paper, the new system was a good idea: the developers were efficient in terms of speed and cost, and the government would be able to use the funds for other purposes. However, in practice, it limited the supply of affordable homes. This is because the developers receive a lower total revenue than they could have achieved with the same site through building private housing. In turn, the developers have less incentive to develop a site, and thus less housing is built – of all kinds. This is because some land that would have been appropriate to develop beforehand will now not produce the same level of profit for the developer (cost-benefit analysis), which could mean that the site is not developed. The previous system functioned more efficiently because the government was not seeking a profit, and this therefore meant they could produce the volume of affordable housing needed to fill the demand on condition that material costs were not excessively high. This new system means that there is less housing built during a poor economic environment due to lack of private investment. This is the opposite of what the market needs: if people are making less money, the demand for more

affordable housing rises. But in this circumstance fewer are available, so the general price of housing rises, making the effect even worse.
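To make the developer's cost-benefit calculation concrete, here is a minimal sketch with entirely invented figures (the prices, costs and the 30% quota are assumptions for illustration, not real policy numbers): once an affordable-housing quota pushes a marginal site's expected profit below the developer's required return, the site is simply not built, so total supply falls.

```python
# Illustrative only: invented figures showing why a marginal site may go unbuilt
# once a share of the homes must be sold at a below-market "affordable" price.

def site_profit(homes, build_cost, market_price, affordable_price, affordable_share):
    affordable = int(homes * affordable_share)   # homes sold at the discounted price
    private = homes - affordable                 # homes sold at full market price
    revenue = private * market_price + affordable * affordable_price
    return revenue - homes * build_cost

required_return = 2_000_000  # minimum profit this developer will accept (assumption)

without_quota = site_profit(100, 180_000, 220_000, 220_000, 0.0)
with_quota    = site_profit(100, 180_000, 220_000, 140_000, 0.3)

for label, profit in [("no quota", without_quota), ("30% affordable quota", with_quota)]:
    viable = profit >= required_return
    print(f"{label}: profit £{profit:,}, site developed: {viable}")
# no quota: profit £4,000,000, site developed: True
# 30% affordable quota: profit £1,600,000, site developed: False
```

With these made-up numbers the quota does not just reduce the affordable share of a scheme; it stops the whole scheme, which is the mechanism described above.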
Obtaining planning permission for a housing development in the UK is a very long-winded process. For a building to be built on a plot of land planning permission must be

sought. This is a lengthy process: you must survey the land for any special designations, such as radon levels and flood risks, and assess the likelihood of the council granting permission by researching recent grants. With local opinion weighing heavily on a development, delays are common: people in smaller towns or villages are often strongly against any kind of development. This could be based on the assumption that it may inconvenience them, be that through noise pollution, crowding of public spaces or even a reduction in the aesthetic value of their surrounding area. A complaint from one person in the local community can put a spanner in the works of any development site. If the application is rejected, the developers must collect the same data, fill out the same forms and produce the same environmental reports, all to reapply to the same council and community that rejected them in the first place. Taking the stance that all developments are bad developments in your local area is a narrow-minded approach, and it has created this country-wide issue. We are faced with a problem of our own creation.
Obtaining planning permission for a housing development in the UK is a very longwinded process.
The burden of building affordable housing fell solely on the developers.


A broken system that prevents the development of new affordable housing is a system that was always headed towards failure. Generally, this stunts development in the residential property market. Developers are unable to build the housing required to fill the demand gap in the country, so affordable housing begins to become scarcer than ever before. It’s clearly demonstrated in the statistics: in 1970 the annual supply of new homes built in the UK was over 400,000. In 2000 this figure was below 200,000 – a 50% decrease in annual supply.
This is reflected in house prices: the average house price of £4,163 in 1970 became £84,620 in the year 2000. The rich get rich and the poor get poorer. This is true, not just for those who are adults, but those who are young, those who are attempting to begin their journey on the property ladder. If this trend continues it may be nigh on impossible to purchase a first home, meaning that the wealth will continue to shift upwards through the older generations, to those who have been climbing the property ladder for many years.

One of the causes of this trend is the reduction in social rented homes: properties sold off under the Right to Buy scheme have not been replaced in the social housing stock.
“
IF THIS TREND CONTINUES IT MAY BE NIGH ON IMPOSSIBLE TO PURCHASE A FIRST HOME.
This is a basic supply-and-demand issue: with little supply and high demand, prices will rise. And that's a fact. This all leads to one effect: increased economic inequality.

In 1970 the annual supply of new homes built in the UK was over 400,000. In 2000 this figure was below 200,000 – a 50% decrease in annual supply.
And yet, in the UK, planning law and national planning policy do not require a certain percentage of new housing to be classed as 'affordable'; it can only be included in the local plan by smaller local councils if they identify the need. This is a problem: one local council may not identify a need for more affordable housing while its neighbouring council might, yet that neighbouring council may not have the capacity to meet the needs of the general population. Adding another layer to the problem, getting sites into the Local Plan


“
GETTING SITES INTO THE LOCAL PLAN IS ALSO A LONG-WINDED PROCESS: COUNCILS TAKE 3 OR 4 YEARS TO COMPILE A PLAN.
In the UK, planning law and national planning policy do not require a certain percentage of new housing to be classed as ‘affordable’.


While affordable housing becomes more scarce, more people are forced to rent, and then rents start to climb.
is also a long-winded process: councils take three or four years to compile a plan, and by that time they will already have fallen short of their allocated building targets. The time spent on the plan could also mean that there is a different government in office, with different policies, targets and ambitions, which could lead to another review of the Local Plan, and all this before the application even enters the approval process.
I believe we need to take a more national approach to the building of affordable housing in the UK, so that people and their communities think not only of their own area but of the whole country. It's an issue that is not easily resolved, because it is in human nature to defend one's immediate interests, even when this has a self-harming effect in the longer term. One more thing: while affordable housing becomes more scarce, more people are forced to rent, and then rents start to climb. So it goes. ¢

BAHEERATH RAVIKUMAR explores the creative business of lateral thinking and shows the crucial part it plays in human progress.
I first encountered 'Lateral Thinking' when my history teacher recommended Lateral Thinking by the great Edward de Bono. According to de Bono, heavyweights like Einstein and Darwin were lateral thinkers, and he describes lateral thinking as closely related to insight, creativity, and humour, which all share a common foundation. De Bono contrasts this kind of thinking with vertical thinking, which is what most institutions promote. Vertical thinking focuses on refining or proving established patterns, while lateral thinking breaks and restructures those patterns. The two are complementary: vertical thinking is selective, while lateral thinking is generative.
De Bono, who passed away in 2021, was a Maltese physician and psychologist widely regarded as a leading authority on thinking. A Rhodes Scholar at Oxford, he developed the concept of lateral thinking based on his studies of the brain and human behaviour. His first book, Lateral Thinking, introduced techniques that sparked innovation and transformed problem-solving in education, business, and beyond. To better understand this concept, let’s look at a famous riddle often used to illustrate lateral thinking: Puzzle: A man lives on the tenth floor
LATERAL THINKING INTRODUCED TECHNIQUES THAT SPARKED INNOVATION AND TRANSFORMED PROBLEM-SOLVING.
of a building. Every day, he takes the elevator down to the ground floor and leaves for work. However, when he returns, he only takes the elevator to the seventh floor and then walks up the remaining three floors. Why does he do this? Take a moment to try and solve it before reading the solution. Solution: The man is short. He can reach the ground-floor button but not the tenth-floor button. When he comes back, he can only reach the seventh-floor button, so he walks the remaining three floors.

Once you know the answer, it seems simple, right? This is the essence of lateral thinking. It often involves breaking free from obvious assumptions and discovering unconventional solutions. And this is why generalists often thrive in the modern world. The information revolution has made so many resources available that innovations of the 20th and 21st Centuries often came from those with experience across multiple fields. They take concepts from one area and apply them in another, achieving extraordinary breakthroughs.
De Bono famously remarked: ‘What happened was, 2,400 years ago, the Greek Gang of Three – Aristotle, Plato, and Socrates – started to think using analysis, judgment, and knowledge. At the same time, church authorities, who ran schools and universities, wanted logic to prove the heretics wrong. As a result, design and perceptual thinking were never developed.’ This wasn’t anyone’s fault; it’s simply how the system evolved. But still, I remain hopeful that creative thinking will continue to gain traction, for the betterment of humanity. It’s already being recognized in schools.
This nicely segues into a little tale of how lateral thinking was applied in one of the most iconic examples, especially for many from Gen Z. Gunpei Yokoi was a Japanese man who didn’t do well in his electronics exams and found himself with a mediocre job as a machine maintenance worker at a playing card company in Kyoto while his peers moved onto major companies in Tokyo. His job was at a little company called Nintendo. However, Yokoi saw something his peers didn’t. Despite the readily available technology, specialists tended to overlook it due to their narrow focus. Yokoi, with his broader perspective,
began combining existing technologies in ways that no one else could see. This lateral thinking led to the creation of his Magnum Opus, the Game Boy. The technology was already available but Yokoi simply combined it in a new, innovative way.
Now, let's shift gears a little and look at how lateral thinking can influence creative problem-solving in Computer Science. A great example comes from Steve Jobs, who, in 2008, unveiled my personal favourite MacBook, the Air. He started his presentation with the corny line, 'There's something in the air', but that wasn't the highlight. He meticulously showed how previous laptops were bulky, how their specs were less impressive, and how they couldn't compete with the sleek, cutting-edge technology he was about to unveil. Then, to everyone's shock and laughter, he pulled the MacBook Air out of a manila envelope. The moment stunned the tech world and probably remains one of the greatest marketing stunts in computing history.
“THIS LATERAL THINKING LED TO THE CREATION OF HIS MAGNUM OPUS, THE GAME BOY.
De Bono contrasts this kind of thinking with vertical thinking, which is what most institutions promote.

“HIS ABILITY TO THINK OUTSIDE THE BOX MADE HIM A MARKETING GENIUS.

This example demonstrates that lateral thinking isn't just about generating new ideas; it's about applying creative thinking in unexpected ways. Jobs didn't just present the Air like any other product; he challenged his audience's expectations. A manila envelope, typically used for thin papers, contrasted with the ultra-thin laptop inside. His creative marketing not only blew the audience away but helped define Apple as a sleek, modern, luxurious brand. His ability to think outside the box made him a marketing genius, and his methods still stand unrivalled.
Programmers and computer engineers, too, must be creative in solving problems, often finding unconventional solutions to complex challenges. So, lateral thinking is a valuable asset. If we want to think better, we need to use it in ways that were previously unforeseen.
To write this article, I had to think outside the box to come up with a topic to write about, so I thought why not explore the method of thinking outside the box itself? ¢
JOBS DIDN’T JUST PRESENT THE AIR LIKE ANY OTHER PRODUCT—HE CHALLENGED HIS AUDIENCE’S EXPECTATIONS.


“
PROGRAMMERS AND COMPUTER ENGINEERS, TOO, MUST BE CREATIVE IN SOLVING PROBLEMS.

“ACCORDING TO DE BONO, HEAVYWEIGHTS LIKE EINSTEIN AND DARWIN WERE LATERAL THINKERS, AND HE DESCRIBES LATERAL THINKING AS CLOSELY RELATED TO INSIGHT, CREATIVITY, AND HUMOUR.

ALBERT RICE argues that smart phones are doing an awful lot of damage.
Phones have been around for more than a century, starting off as primitive communication devices, but they have evolved into complex machines that are provoking debate all around the world.
Some argue that today’s smart phones are good for society. Phones allow us to have lifelike interactions with people who are not physically present, navigate much more easily and quickly, track valuables and help parents keep an eye on their children remotely. Phones are also great for education: we all know about Duolingo, YouTube and audiobooks. But knowledge can be acquired just as easily from a live expert, and people who make YouTube videos often haven’t much further knowledge to offer but have just read up and regurgitated material in video
form. One video cannot possibly cover all the nuances of a topic, and one of the great advantages of having someone explain something in real life is that you can ask questions. Even if something is perfectly well explained, one will always have questions. Almost everything I have learnt has been achieved without a phone.
Some also say that mobile phone games help sharpen critical thinking, reaction speed, spatial awareness, coordination, and teamwork. While this may be true, there are better ways to achieve these skills, such as by playing football, a game of chess, pool, rugby, or table-tennis. 6ft 5in tall may be tall, but not in a world where everyone else is 7ft 10.
“
THE BEFORE-SHOT IS A PERKY, ENERGETIC AND TALKATIVE PERSON, AND THE AFTER-SHOT IS A ZOMBIE.
I often find myself asking friends if they want a game of pool, or a trip into town, or whatever, and the answer I get is always ‘nah, sorry’ because said friend is locked in on insta reels or gaming or something. When I got to my present age, I expected people would hang out more and do stuff together, but I think it’s a bit sad and pathetic that screens are the reason this only ever happens when social events are organised by someone older. When my dad and uncle were teenagers, this was one hundred percent not the case. People were having real fun,
making memories, playing footie or table-tennis, or getting up to ‘no good’. It’s embarrassing and upsetting that people are throwing their youth away. People should be up and about, being with people, learning, maturing, improving their social skills, gaining perspective on life, but instead phones are gobbling up people’s brains, inducing social anxiety and awkwardness in young people who are still developing and maturing. Put it this way: every adult you love, respect, admire for being intelligent, have fun with, and look up to for being wise, grew up without phones.
Phones also negatively affect mental wellbeing. Psychologist Nigel Barber says that ‘virtually every domain of clinical psychology finds that smartphones threaten our psychological wellbeing. Findings include increased anxiety, depression, suicide, bullying, eating disorders, sexual violence, and loneliness, together with learning problems and deficits in creativity.’ Sometimes, things like anxiety aren’t obvious, so it’s easy to say Barber is wrong, but one noticeable thing is social anxiety. When someone suffers from social anxiety, you just know. Numerous studies show that phones cause social anxiety, and for those of you who are pernickety pedants, fine. Look up ‘The Relationship between Social Anxiety, Smartphone Use, Dispositional Trust, and Problematic Smartphone Use: A Moderated Mediation Model’. That will give you proof.

“VIRTUALLY EVERY DOMAIN OF CLINICAL PSYCHOLOGY FINDS THAT SMARTPHONES THREATEN OUR PSYCHOLOGICAL WELL-BEING.

People were having real fun, making memories, playing footie or table-tennis, or getting up to ‘no good’.
I cannot speak for others, but I, and certainly my parents and loved ones, have recently noticed that young people are far more socially anxious than they used to be. And I notice that everyone in my life who suffers from social anxiety spends a lot of time on their phone, and everyone in my life who is socially confident spends less time. In fact, I can actually remember what friends were like without screens: the before-shot is a perky, energetic and talkative person, and the after-shot is a zombie. And if this doesn’t convince you, just apply logic: if someone spends more time on their phone than they spend socializing they will get less practice at the art of socializing and will be less skilled.
Smart phones are stunting the emotional and social growth of our generation. Smart phones are the root of an epidemic that could easily be solved, but only if everyone took a moment to look up from the screen. ¢
“SMART PHONES ARE STUNTING THE EMOTIONAL AND SOCIAL GROWTH OF OUR GENERATION.

MIKHAIL URAKOV, whose grandfather commanded a whole fleet of Russian submarines twenty years ago, muses on the unique character of these odd vehicles.
'Why is it that I do not describe my method for remaining underwater and how long I can remain there without coming up for air? I do not wish to publish this because of the evil nature of men, who might use it to murder on the seabed.'
Leonardo da Vinci
Have you ever imagined yourself roaming the vast ocean in search of the enemy? You feel like a steel predator that is silent, deadly, and unseen. You are a submarine, an invention that redefined sea battles forever. Today’s nuclear giants have come a long way from their humble beginnings.
“
ONE OF THE EARLIEST DESIGNS FOR A SUBMARINE WAS CREATED BY LEONARDO DA VINCI. HOWEVER, HIS CONCEPT WAS NEVER IMPLEMENTED.
Let's take a moment to define what a submarine truly is. Submarines are a class of ships capable of autonomous operation both underwater and on the ocean's surface (German submarines in particular are known as U-boats). Submarines can carry armaments or perform specialised missions, such as research, rescue, or even deep-sea repair. Their uncrewed counterparts, unmanned underwater vehicles (UUVs), are robotic and remotely operated.
The idea of building ships that would allow humans to move not only on water but also beneath it appeared in antiquity. One of the earliest designs for a submarine was created by Leonardo da Vinci. However, his concept was never implemented, and he chose not to publish it, foreseeing that such an invention could be used for 'murder on the seabed'. The primary function of his submarine, like that of many other early designs, was to bring the sailor close enough to an enemy ship, typically while it was docked, to inflict damage manually.
While early concepts remained theoretical or experimental, the dawn of the 20th Century saw submarines formally enter warfare. The first recorded use of submarines as part of an official navy occurred in 1905, in the Russo-Japanese War. Although these early combat submarines saw little success, they provided valuable data. The experience helped shape the key concepts of future submarine warfare and highlighted the main weaknesses of these vessels.
The first successful war for these ‘silent predators’ was World War I. Germany lacked sufficient naval power to challenge British and French fleets directly, so they adopted


U-boats (short for Unterseeboote, or ‘undersea boats’) as their new naval warfare strategy. The submarines focused their attacks on merchant ships, which cut off the vital Atlantic supply routes of the Allies. The U-boat effectiveness reached its peak during the First Battle of the Atlantic, when they dealt significant damage to Allied ships. The new underwater threat emerged as an invisible force, which produced unpredictable terror at sea.
On 22 September 1914, U-9, under Otto Weddigen's command, sank three British cruisers (HMS Hogue, HMS Aboukir, and HMS Cressy) in a ninety-minute period. This successful operation forced the Royal Navy to re-evaluate its submarine defence measures, starting a new period in naval combat.
By the end of WW1, submarines had sunk 160 battle vessels, ranging from battleships to destroyers, along with countless merchant ships; the total tonnage lost reached 19 million registered tons. The German U-boat campaign brought Britain extremely close to defeat, forcing the world to recognise the importance of submarines. Although many modifications were made after WW1, submarines at the dawn of WW2

remained primarily submersible ships, rather than true underwater predators. This meant that they could dive only for short periods of time, usually for attack or to evade the enemy, before needing to resurface to recharge their batteries. Consequently, a lot of their time was spent surfaced, making them more vulnerable to enemy aircraft and patrols. At night, German U-boats often attacked while surfaced, sometimes even using deck guns to engage targets, like merchant ships. These tactics, though bold, highlighted the technological limitations of the era.
The new underwater threat emerged as an invisible force, which produced unpredictable terror at sea.
A key episode in the use of submarines during WW2 was during The Second Battle of the Atlantic (1939-1945). The actions of Admiral Karl Dönitz’s ‘wolfpacks’ (groups of U-boats attacking convoys) put the Atlantic shipping routes under severe threat. The most successful and widely produced model was the Type VII U-boat. Out of 1050 ordered, 703 were built in various modifications. From 1944, these U-boats were being equipped with snorkels (air intake tubes), which allowed diesel engines to operate when the submarine was submerged. This allowed them to reduce their exposure to enemy patrols.

By the end of the war, Germany had developed and begun building its first Type XXI U-boats. These were the first submarines designed primarily for submerged warfare, featuring streamlined hulls and enhanced battery capacity that allowed them to remain submerged for extended periods of time. With a maximum depth of 330m, low noise levels, and increased autonomy, they influenced post-war submarine engineering worldwide.
Over the course of World War 2, German U-boats sank around 4430 merchant vessels, with an estimated 22.1 million registered tonnes of lost supplies; 395 warships were sunk, including 75 submarines, evidence of the high intensity of undersea warfare that occurred during the war.
Over the course of World War 2, German U-boats sank around 4430 merchant vessels, with an estimated 22.1 million registered tonnes of lost supplies.


The post-war period brought rapid advancements in submarine engineering because of the escalating Cold War tensions between the USSR and the USA. The US Navy achieved a historic milestone in July 1953 when the diesel-electric USS Tunny launched a cruise missile, paving the way for the development of submarine-based strategic weapons.
The introduction of nuclear propulsion, together with oxygen generators that extract breathable oxygen from seawater, resulted in the creation of the first true underwater vessels, able to operate independently for extended periods without needing to return to base. These submarines also gained better hunting capabilities: because they no longer needed to surface to recharge batteries, they became much harder to detect.

Between 1959 and 1960, the US and Soviet navies introduced their first strategic missile submarines, known as SSBNs, which carried nuclear ballistic missiles. These submarines served as essential elements of nuclear deterrence during the Cold War because they provided second-strike capability, which helped maintain the equilibrium of power. ¢

EXPLOSION OF A DEPTH CHARGE LAUNCHED FROM U.S. COAST GUARD CUTTER SPENCER. A GERMAN SUBMARINE U-175 WAS SUNK AND PREVENTED FROM BREAKING INTO CENTER OF A LARGE NORTH AMERICAN CONVOY. APRIL 1943

DURING THE SECOND BATTLE OF THE ATLANTIC (1939-1945), THE ACTIONS OF ADMIRAL KARL DÖNITZ'S 'WOLFPACKS' (GROUPS OF U-BOATS ATTACKING CONVOYS) PUT THE ATLANTIC SHIPPING ROUTES UNDER SEVERE THREAT.