ONE HUNDRED YEARS OF TELEVISION

Reflections on the development and impact of this influential medium

Preface

Ms C E Payne (CR, Classics Department)

This year’s themed publication is titled One Hundred Years of Television, and our academic scholars have been busy delving into all that this remarkable milestone entails. 2025 marks one hundred years since the first television image was transmitted by Scottish inventor John Logie Baird. The first television picture was a greyscale image featuring the head of a ventriloquist’s dummy nicknamed ‘Stooky Bill’ (featured on our cover). Later, Baird fetched office worker William Edward Taynton to demonstrate how a human face would look in the same picture: Taynton became the first person to be televised. Baird could scarcely have imagined that from this modest beginning would grow a medium that permeates almost every aspect of modern life.

From its mechanical origins, television has evolved into a sprawling network of content delivery: from wall-mounted Smart TVs to handheld phones, VR headsets to gaming consoles. Today, television is a fixture of everyday life, whether it’s morning news, afternoon antiques, or the nightly scroll through a streaming platform’s offerings. It educates, informs, entertains, provokes, distracts and occasionally transforms us. To those of us fortunate enough to have it, television is not just a luxury but a backdrop to our lives, and has become a shared reference point which shaped – and continues to shape – generations.

The 21st century is frequently dubbed ‘The Golden Age of Television’, and with good reason. The technological leaps in digital broadcasting and streaming platforms have revolutionised how we consume content. Critically acclaimed series such as The Wire, Breaking Bad, and Game of Thrones have demonstrated that television can now rival, and in many cases surpass, film in terms of narrative depth, production quality, and cultural impact. With the introduction of streaming services in particular, binge-watching has become a cultural phenomenon, encouraging the public to spend even more time consuming media than before. In response, programming has proliferated across every conceivable genre: whether it is comedy, drama, documentaries, thrillers, science fiction or more, there is some form of television for everyone.

In light of this century of innovation, the breadth of topics our students cover reflects the incredible diversity of television itself. Whilst some have chosen to delve into its early days, with investigations into the pioneering work of John Logie Baird and the intense rivalries that played out in the so-called Colour War, others have examined the influence of television on public perception and culture, with articles covering television’s role in encouraging historical bias and shaping attitudes through Russian propaganda. Other responses consider the commercial side of the industry, with discussions ranging from the use of subliminal messaging in advertising to the evolving models of how television is financed. In addition, some pupils have chosen to explore more contemporary concerns, ranging from the neurological impact of consuming short-form content on platforms such as TikTok to the medical accuracy of the show Grey’s Anatomy. Together, these articles demonstrate that the influence of television, in all its evolving forms, remains as compelling and far-reaching today as ever.

This publication would not be possible without the tireless work of Jackie Jordan, who determinedly worked through injury to ensure that all submissions were of the highest quality. Without her editorial precision and attention to detail, this publication could not have been brought together so beautifully. Thanks must also go to C A F Moule for organising and leading the academic scholars programme, and for providing pupils with the structure and inspiration to produce work of such quality and scope.

Like the medium of television itself, this publication embraces variety: a vivid tapestry of analysis, curiosity, and insight. Each contribution reflects both an area of interest and broader engagement with the world around us. The scholars have approached their chosen topics with both intellectual rigour and personal flair, revealing that television is not just a passive form of entertainment, but can be a cultural mirror, a commercial machine, a social commentary and a psychological force. Whether our pupils have examined the past, reflected on the present, or speculated about the future, each author invites us to consider how this invention continues to shape our lives, and what the next hundred years might hold. We are delighted to share their work with you.

John Logie Baird and the Origins of Television

Ned B-S (Sh)

In the last 100 years television has become a huge part of our lives. Roughly 5.4 billion people watch television worldwide – around 68% of the world’s 8-billion-strong population. Usage varies from country to country but, in the UK in 2023, the average person watched around 4.5 hours of television and video content per day. Yet this pillar of our lifestyles was invented less than 100 years ago. The first television was made in 1926 by John Logie Baird, and it wasn’t until 1936 that the BBC began regular transmissions. So where did television come from?

John Logie Baird was born on the 13th of August 1888 in the Scottish county of Dunbartonshire, to John Baird and Jessie Inglis. His father was a Church of Scotland minister, but his son did not take up his faith. Logie Baird went to boarding school at Larchfield Academy before attending the Glasgow and West of Scotland Technical College, which gave him his first taste of engineering in a variety of apprentice jobs. To finish his education, he went to Glasgow University, where he studied engineering. Unfortunately, his education was cut short by the First World War, which started during his first year, in 1914.

After leaving university, Baird applied to join the army but was rejected, having been declared unfit for active service. When he was young, Baird had suffered a near-fatal illness which left him with a ‘weak constitution’, meaning he was never at full health and was always susceptible to illness. Instead, he was sent to work at the Clyde Valley Electrical Power Company, which was supporting the war effort as a munitions factory.

After the war, John Logie Baird had many different business ideas, few of which succeeded. For example, he moved to Trinidad in 1919 and, seeing the vast array of fruit available, attempted to set up a jam factory, which soon failed. His most successful idea was a water-absorbent sock, which he marketed and sold. In 1923, his weak constitution struck again and Baird moved to Hastings. Increasingly, he turned his thoughts towards becoming an inventor.

When John Logie Baird was born, the camera had been around for more than 70 years, and in 1923 it was 107 years old. Joseph Nicéphore Niépce had invented his heliograph in 1816. This was a camera which used a light-sensitive substance called bitumen. The bitumen would be dissolved in lavender oil before being spread in a thin layer on a polished metal plate. This plate would then be placed inside a dark box with a small hole facing outwards, the bitumen side of the plate facing the hole. This device is known as a pinhole camera or camera obscura: it would project the image upside down, but clearly, onto the plate. Niépce would leave this contraption on his windowsill for days before taking the plate out. On exposure to light, the bitumen would harden. This meant that Niépce could dissolve and wash off the unhardened bitumen, leaving an impression of the scene on the plate.

[Images: one of the first pictures ever taken, an impression of Niépce’s courtyard; a pinhole camera; a picture of Daguerre taken with his camera]

Twenty-three years later, Louis Daguerre created a camera which was much closer to something we might see today. He used treated silver to leave an impression of the image in a very similar way to Niépce. In his later models, even though the process of preparing the chemicals on the plate was much more complex, pictures could be taken in under a minute. So, by the time Baird was born, the public were used to seeing photos.

John Logie Baird had also grown up with film. In late 1895, when Baird was six, the Lumière brothers had hosted the first public screening of a moving picture. They used their cinématographe to show a film of the workers leaving the Lumière factory. It lasted just under a minute and was watched by an audience of 30. These early films have since become iconic, for example the Arrival of a Train at La Ciotat Station and The Sprinkler Sprinkled, which both lasted around a minute. The cinématographe worked using photographic film. By using shutters, the film would rotate very quickly in front of a light source with no perceivable separation between images. The Lumières based their device on the kinetograph, an early motion-picture machine which was very heavy and impractical; they were trying to make a smaller, more portable version of it. They used a handle which could be turned to move the film in front of the camera. In later versions, the Lumières incorporated a spherical glass vessel filled with water, which could be placed in front of the light. The water inside the glass would refract the light inwards, so that at a certain distance the light would converge on the same spot and be very concentrated. This meant that the film could be placed at that distance and exposed to a much higher amount of light, making the image clearer and more defined.

It is estimated that there were already 10,000 movie theatres in 1910. Films were becoming available to the wider public and were no longer seen as an experiment. They were already estimated to be one of the most popular forms of entertainment despite the fact that the films were silent and black and white. As they became longer, the rich began to see them as more than just a cheap novelty because the films could now tell stories. Some actors and directors were becoming household names and were gaining recognition from the wider public.

There were sceptics who were beginning to raise questions - some thought that films were rotting children’s brains and wasting their time. Others were more worried about theatre fires from the flammable nitrate photographic film, which could catch fire if the light from the projector made it too hot. Other arguments were that films could damage your eyes, and that actors were social rejects who did not deserve the publicity. In spite of this, the film industry was booming, and filmmakers were charging more for the longer, higher-quality movies, further inflating the worth of the industry.

[Images: the cinématographe built by the Lumières; a diagram of the cinématographe made by the Lumières]

In 1923, with the film industry still booming, Baird was in ill health. He decided to put his wider business interests aside and focus on developing his legacy. The media and other inventors were beginning to discuss the idea of a machine that could turn transmissions into pictures: the television. The television could not use photographic film or light-reactive chemicals, which meant that the inventors were going to have to innovate. John Logie Baird had very little funding or resources. To make the first television, which he called the televisor, he used what he had at hand, including a hatbox, bicycle light lenses and a tea chest. The televisor eventually incorporated many designs by other inventors. Baird used Paul Nipkow’s disks to make his transmitter and receiver. These disks, when spun, would direct the light onto a light-sensitive cell as a strip of light. The disks had 30 holes, meaning that there were 30 strips on Baird’s first television. The light-sensitive cell would turn the light strips into electrical signals which could then be transmitted. At the receiving end, the electrical signals would be amplified and fed into a neon gas discharge lamp, whose brightness varied depending on how much electricity ran through it. The light from the gas lamp could then be focused onto another Nipkow disk rotating at the same speed as the first. This one would also have 30 holes and would shine the light onto and through a screen. This process worked at five images per second and gave a black-and-white image in 30 very slightly separated vertical lines. The system was exhibited for the first time on the 26th of January 1926 to a few select members of the Royal Institution. The screening took place in Baird’s new lab in Soho, London. The observers initially watched a moving dummy (called Stooky Bill) being filmed downstairs. Because this was going so well, an office boy was then added to the set.
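
To make the scanning idea concrete, here is a minimal sketch in Python of how a 30-line Nipkow-disk system turns a picture into a single stream of brightness values and back again. The frame size, sample count and names are illustrative assumptions, not Baird’s actual engineering.

import numpy as np

LINES = 30             # holes in the disk -> 30 scan lines per frame
SAMPLES = 30           # brightness readings taken along each line (assumed)
FRAMES_PER_SECOND = 5  # rate of Baird's first demonstration

def scan(frame):
    # Transmitter: the spinning disk exposes one strip at a time, and the
    # light-sensitive cell converts its brightness into an electrical signal.
    return frame.reshape(-1)  # one long sequence of samples

def display(signal):
    # Receiver: a synchronised disk and a variable-brightness neon lamp
    # paint the strips back onto the screen in the same order.
    return signal.reshape(LINES, SAMPLES)

frame = np.random.rand(LINES, SAMPLES)            # stand-in for Stooky Bill
assert np.allclose(display(scan(frame)), frame)   # the image survives the round trip

# 30 lines x 30 samples x 5 frames = 4,500 samples per second: in this toy
# model, a signal slow enough to fit down an ordinary audio-grade channel.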

Following the success of Baird’s first television, he founded his business, Baird Television Development Company Ltd. By 1927, Baird had improved his system enough to allow it to show 12.5 images per second. To keep up with the quickly moving market, Baird kept developing his idea. By 1928, he had attempted to make a colour television. This used three Nipkow disks and two gas discharge lamps: to the neon lamp, which was red, he added a lamp full of mercury vapour and helium for green and blue. The same year, Baird demonstrated the first stereoscopic (3D) television. 1930 saw the first combined sound and picture television transmissions. In 1931, Baird televised the first outside broadcast for the BBC - the Derby. In 1936, the BBC began regular broadcasts but, by 1937, Baird’s television was abandoned for the Marconi-EMI system. By the time the Second World War started, there were roughly 20,000 television sets in Britain. Baird’s final invention was the Telechrome. This device was intended to show pictures in colour and 3D. The system worked by firing electrons at cyan and red phosphors, creating beams that could be directed at a screen, and could produce a wider variety of colours than any other television of its time. The process of integrating the Telechrome was still underway when Baird died of a stroke in 1946, which may have been linked to his weak constitution.

In the past 100 years, the television has continued to evolve from the televisor, yet its heritage still lies with Baird. He is remembered through various memorials, including the television awards (the Logie Awards) in Australia and a plaque in Helensburgh, his home town in Scotland. He certainly wouldn’t recognise the enormous flat-screen televisions that adorn our walls today!

[Images: the first demonstration of television by Baird (standing); Baird’s Telechrome]

All about the ‘Colour War’

Aun P (Re)

Key words

FCC:

Short for ‘Federal Communications Commission.’ It is an independent agency of the United States government that holds authority over most areas of the electronics and communications industry.

RCA:

Short for ‘Radio Corporation of America.’ In the early 1920s, RCA became the dominant electronics and communication firm in the US. However, in 1986, GE – General Electric – acquired RCA and sold or liquidated most of its assets.

CBS:

Short for ‘Columbia Broadcasting System.’ It was once the largest radio network in the US and is now one of the ‘Big Three’ American broadcast television networks. At present, it no longer directly owns or operates radio news stations.

NTSC:

The first American standard for analogue television. It was developed and published in 1941. In 1953, the ‘second NTSC’ standard was adopted and became one of the three major formats for colour television.

Background

After World War I, a worldwide economic boom not only increased television’s manufacturing capability but also provided a huge amount of excess funds, which allowed the industry to invest in a new medium. In around 1939, the United States had only about 10 experimental TV stations nationwide, each of which was merely testing broadcasting range and reception. Being in the experimental stage meant they were prohibited from broadcasting commercial messages; therefore, there was no solid funding for the project. The whole TV industry was waiting for the FCC (Federal Communications Commission) to settle on a television standard and regulations. Since there was no television standard yet, every company wanted to get its own system adopted, aiming for the profits to be made from licensing that technology. The war for colour TV involved many parties, but I will describe the situation from the perspectives of two corporate giants:

Radio Corporation of America – RCA, headed by David Sarnoff

Columbia Broadcasting System – CBS, headed by William S. Paley

Colour War

At first, in 1940, RCA’s (Radio Corporation of America) black-and-white standard for television was being considered by the FCC for adoption. But on August 29th, CBS (Columbia Broadcasting System) unexpectedly announced that they had been developing a sequential colour system – created by Dr Peter C. Goldmark - which used a standard monochrome CRT display behind a synchronised rotating colour wheel. They claimed that if the wheel were spun fast enough, the primary colours would blend to form a colour picture.

On the other side, RCA was also working on a colour system, but it was barely in the experimental stage. Ultimately, the pressure on the FCC to act was too great. So, in June of 1941, the FCC compromised by announcing their black-and-white NTSC television standard. Unfortunately, despite CBS’s big campaign for their new colour system, it did not meet this standard.

CBS lost the first round

World War II kept everybody preoccupied and there was not much progress during the war years. In December 1946, CBS was still trying to convince the FCC to adopt their colour system. With Charles Denny, the new FCC chairperson, expressing staunch support for CBS’s system, it seemed like everything was going well for CBS. However, in March 1947, the FCC announced that CBS’s colour system was still premature.

CBS lost the second round

In mid-1947, a problem occurred. People who lived between stations on the same channel, or on adjacent channels, could not watch their television normally: their screens would show a scrambled display, because signals from different stations were interfering with each other. To address this problem, on September 30, 1948, the FCC halted all new TV station licenses.

Then, in 1949, RCA displayed their newly modelled television set, composed of three vacuum tubes, each projecting an image onto frosted glass (the full set weighed close to a ton). The FCC suggested that RCA’s system be dropped immediately, without even evaluating it in the field. This was a public embarrassment for RCA. On May 26, 1950, the FCC ruled in favour of the CBS mechanical colour system, approving it on September 1, 1950.

CBS took the win in the third round

To recover from this failure, RCA’s David Sarnoff threw all his effort into improving the shadow-mask concept, while also leading others in the industry to formally establish the “second NTSC.”

On June 25, 1951, CBS commenced colour broadcasting with Premiere on a five-city network. Three months later, the first CBS sequential colour television went on sale: the CBS-Columbia Model 12CC2. Unfortunately, with only two hundred sets shipped and one hundred sold, the public largely ignored this overpriced ($500) television. At the time, TVs were selling in the millions each year; against that, CBS sold only a hundred, which shows just how badly CBS’s television failed to attract customers.

CBS lost the fourth round

Charles E. Wilson, head of the Defense Production Administration, issued an order on November 20, 1951, instructing CBS to abandon production and development of colour television because it might lead to a shortage of vital electronic components needed on the war front. That was considered the unofficial end of CBS’s sequential mechanical colour system. Two years later, CBS officially revoked their colour standard. In contrast, on July 22, 1953, the second NTSC committee submitted their new colour standard to the FCC. This time RCA was leading the way with an electronic colour system that separated luminance and chrominance, and a CRT screen with red, green and blue phosphors.

In December of 1953, the new NTSC Colour standard was unanimously adopted.

RCA took the final win

Black-and-white TVs continued to outsell colour TVs for yet another decade after that; it was only in 1972 that colour sets started taking over. The ‘Colour War’ in the United States was not the only thing that promoted the development of colour systems. In 1956, the first alternative colour system was developed in France by Henri de France; it would later become SECAM. SECAM was a politically motivated colour system, made to counter the Americans and protect the French television industry. SECAM uses the same principle of separating luminance from chrominance as NTSC; however, SECAM fixed a phase problem that NTSC had. NTSC transmissions could suffer phase distortions along the way, which could make the colours go slightly off – e.g. turning green to blue. In 1959 Walter Bruch, working at Telefunken in Germany, developed a hybrid of NTSC and SECAM. His system was called PAL, and it was not compatible with NTSC or SECAM. It took Europe about 20 years after World War II to adopt the German system. In 1967, the UK implemented the PAL system on BBC2, the second and more high-end channel of the BBC. Then, on November 15, 1969, BBC1 also started broadcasting in PAL.
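
To make the phase problem concrete, here is a minimal Python sketch of the idea behind PAL’s fix. The colour (hue) is carried in the phase of a subcarrier; PAL flips the quadrature component on alternate lines, so a constant phase error largely cancels when two adjacent lines are averaged. The angles below are illustrative assumptions, not a real PAL decoder.

import cmath, math

hue = cmath.exp(1j * math.radians(103))   # transmitted colour, encoded as a phase
error = cmath.exp(1j * math.radians(20))  # constant phase distortion in the channel

# NTSC-style reception: every line carries the same phase, so the error
# shifts the hue directly (green can drift towards blue, as noted above).
ntsc = hue * error

# PAL-style reception: alternate lines are sent phase-flipped; the receiver
# un-flips them, turning a +20 degree error into -20, and averages the pair.
line_a = hue * error
line_b = hue.conjugate() * error          # the flipped line, as transmitted
pal = (line_a + line_b.conjugate()) / 2   # un-flip and average

print(round(math.degrees(cmath.phase(ntsc))))  # 123 -> hue visibly wrong
print(round(math.degrees(cmath.phase(pal))))   # 103 -> hue preserved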

Conclusion

Although both NTSC and PAL have now been abandoned in favour of digital broadcasting, the Colour War nevertheless paved the way for how we make television and video today. If you are interested in more information about the history of television around the world, you can try exploring the EarlyTelevision.org website.

Bibliography: EarlyTelevision.org by Bob Cooper; ‘The Color War’ by Marshall Jon Fisher (Inventionandtech.com)

How and why the world switched from analogue to digital television

Digital television was first researched as early as the 1960s but, because it required bandwidths of around 200 megabits per second for SD (Standard Definition) and 1 gigabit per second for HD (High Definition), it only became a viable possibility around the 1990s. Digital signals are preferred for a variety of reasons, primarily because analogue signals are susceptible to interference. Analogue signals are continuous, smooth waves whose amplitude and frequency dictate the brightness of the pixels, and so they can be degraded by distance or by obstacles in the transmission path; digital signals, by contrast, are binary (values of zero and one), with each on-or-off value known as a bit. Since a digital system must discern between only two values, as opposed to a continuous range of values with analogue, the television presents the user with less distorted images and audio.
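
To see where bandwidth figures of that order come from, here is a rough back-of-the-envelope calculation in Python. The resolutions, frame rates and bit depths below are illustrative assumptions; real formats vary.

# Uncompressed bitrate = width x height x frames per second x bits per pixel
def bitrate_mbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1_000_000

print(bitrate_mbps(720, 576, 25, 16))    # SD: ~166 Mbit/s, the order of the ~200 quoted above
print(bitrate_mbps(1920, 1080, 50, 10))  # HD: ~1,037 Mbit/s, roughly the 1 Gbit/s quoted above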

Digital systems also have more advanced error-correction techniques, such as Reed-Solomon coding, turbo codes and Viterbi decoding, all of which can be employed by television systems to “fill in the gaps” of information lost in transmission. An additional reason why digital signals are clearer and less prone to distortion is more advanced modulation technology. Analogue systems use FM (Frequency Modulation) and AM (Amplitude Modulation), encoding information in a single aspect of the wave, whereas digital systems use QAM (Quadrature Amplitude Modulation) and OFDM (Orthogonal Frequency Division Multiplexing), which encode information in multiple parts of the signal, spreading the data over a range of frequencies. This makes the signals more robust against interference: even if one part of the signal is corrupted, the other parts can be preserved and used to restore the missing data with the aforementioned error-correction techniques.
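
The codes named above are too involved to show briefly, but the underlying idea – adding structured redundancy so the receiver can repair corrupted bits – can be illustrated in Python with the much simpler Hamming(7,4) code. This is a toy stand-in, not the scheme any broadcast standard actually uses.

def encode(d):  # 4 data bits -> 7 transmitted bits
    p1 = d[0] ^ d[1] ^ d[3]  # parity over bit positions 1, 3, 5, 7
    p2 = d[0] ^ d[2] ^ d[3]  # parity over bit positions 2, 3, 6, 7
    p3 = d[1] ^ d[2] ^ d[3]  # parity over bit positions 4, 5, 6, 7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def decode(r):  # 7 received bits -> 4 corrected data bits
    s1 = r[0] ^ r[2] ^ r[4] ^ r[6]
    s2 = r[1] ^ r[2] ^ r[5] ^ r[6]
    s3 = r[3] ^ r[4] ^ r[5] ^ r[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # position (1-7) of the bad bit; 0 = clean
    if syndrome:
        r[syndrome - 1] ^= 1         # flip the corrupted bit back
    return [r[2], r[4], r[5], r[6]]

word = encode([1, 0, 1, 1])
word[4] ^= 1                          # interference flips one bit in transit
assert decode(word) == [1, 0, 1, 1]   # the receiver repairs it exactly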

However, to make the transition to this superior technology, a great deal of research had to be done in fields such as digital signal processing; compression algorithms to lower the file sizes of the video and audio involved; telecommunications and networking, such as expanding broadband networks and developing satellite communications; and broadcasting standards such as ATSC (Advanced Television Systems Committee) in the United States, ISDB (Integrated Services Digital Broadcasting) in Japan and DVB (Digital Video Broadcasting) in Europe. These groups helped to establish and define the technical standards of digital television, such as aspect ratio, resolution, frame rates, audio formats and more. To do this, further scientific research was conducted on visual perception, to find optimal resolution levels (e.g. 1080p, 1440p/2K and 4K) for a realistic and immersive experience.

Digital television also adopted flat-panel displays such as LCD, LED and OLED as its primary form of display, as opposed to analogue television’s CRT (Cathode Ray Tube). Scientists also had to carefully plan the use of the electromagnetic spectrum to make sure that television did not interfere with other forms of communication, developing new antenna technology to ensure the reliability of the new signals. During the transition, the social and economic impacts also had to be investigated, as well as informing the public and ensuring that digital converters were available so that older analogue televisions could use the new signals. DRM (Digital Rights Management) and HDCP (High-bandwidth Digital Content Protection) had to be set up to ensure that intellectual property was protected, piracy was prevented and content was securely transmitted.

Additionally, stronger encryption systems had to be put in place to prevent unauthorised access to premium content such as pay-per-view and subscription-based channels. Software such as EPGs (Electronic Programme Guides) also had to be developed to be user-friendly, along with smart TV interfaces and ports of software previously only available on computers, such as games and applications. Multi-platform distribution such as streaming and VOD (Video on Demand) was developed to transmit television services via the internet, with platforms such as Netflix and YouTube becoming increasingly popular with the rise of digital television. HbbTV (Hybrid broadcast broadband Television) was developed so that traditional television broadcasts and smart TV services could work harmoniously in the same television system.

The map illustration above depicts stages of the digital television transition: red indicating “Transition completed; all analogue signals terminated”, orange “Transition partially completed; some analogue signals terminated”, yellow “Transition in progress; broadcasting both analogue and digital signals”, green “Transition has not been planned or started, or is in early stages”, and grey indicating that no information is available.

Today, many countries have fully switched over from analogue to digital, with Luxembourg being the first country to complete its terrestrial switchover on September 1st 2006, and Albania being the most recent, completing the switchover on December 29th 2020. The United Kingdom began the switchover on the 17th of October 2007 and completed the switchover on the 24th of October 2012.

In conclusion, the switchover from analogue to digital television, despite its intense research and technological requirements, has already shown itself to be an incredible advancement for society and a pivotal moment in the history of television. It required research in a variety of scientific fields, including data transmission, coding and algorithms, error-correction techniques, and telecommunications and networking; new methods of signal processing; analysis of new resolutions and display types to yield realistic results; protection of televised content; new software giving users a choice between broadcast and on-demand content, and allowing both within the same system; satellite development; and the organising, planning and regulating of all forms of the new technology, as well as informing the public and making sure that the technology was available to people regardless of their existing system. As TV continues to develop with new technologies, allowing for even more content and clearer images and audio, the research that has been going on for decades, and been theorised for even longer, will surely continue to be enhanced and improved in the future.

3D TV – Why did it fail?

Ina Q (Re)

3-Dimensional Television was so popular in the 2010s that some may even call 2010 ‘the year of 3D TV’. 3D TV sounds like a modern concept – however, the idea of stereoscopic films has been around since the mid-19th century. So why, after so many decades of development, has it failed to maintain its popularity?

A brief history of 3D film and television

The story of stereoscopic film begins with the invention of the stereoscope. In 1832, Charles Wheatstone developed the idea of the stereoscope – a device in which two photos of the same subject, taken from slightly different angles, are viewed together, creating an impression of three-dimensional depth. The stereoscope was the starting point for 3D technology; later, in the late 1890s, British film pioneer William Friese-Greene filed a patent for the making of a 3D film. This patent described two videos projected side by side, with the viewer looking through a stereoscope to combine the two images into one and achieve the ‘three-dimensional’ effect. Although this plan was never carried out, many more 3D films were released in the following decades.
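
The depth impression rests on binocular disparity: the nearer an object, the more its position shifts between the two views. Here is a minimal Python sketch of the pinhole-camera geometry, with all numbers purely illustrative:

# depth = focal length x baseline / disparity
def depth_m(focal_mm, baseline_m, disparity_mm):
    # focal_mm: focal length; baseline_m: separation of the two viewpoints;
    # disparity_mm: horizontal shift of the object between the two images.
    return focal_mm * baseline_m / disparity_mm

print(depth_m(50, 0.065, 0.5))  # 6.5  -> a small shift reads as a distant object
print(depth_m(50, 0.065, 5.0))  # 0.65 -> a large shift reads as a near object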

In December 1922, Laurens Hammond premiered the Teleview system – the first alternating-frame 3D system ever shown to the public. The left and right frames, viewed through the glasses that came with it, were projected alternately. Later in the 20th century, 3D filming evolved from stereoscopes to films shown with two projectors. This, however, could cause minor viewing issues, such as the two projected images drifting out of alignment, jumbling the final picture.

Modern 3D televisions work similarly – from the fancy, futuristic glasses to the two alternating images that converge into a 3D one. In 1981, Matsushita Electric (now Panasonic) launched their 3D televisions into stores, and the technology was quickly adapted in preparation for the first stereoscopic 3D video game, SEGA’s SubRoc-3D. However, it was only after decades of development that 3D technology gained real popularity and interest from the public. This was marked by the release of Avatar in 2009, shot in 3D as well as the traditional format, which became the highest-grossing film of all time.

The rise of 3D films and TV sets

After Avatar’s surge in popularity, more major manufacturers began producing 3D technology, along with special 3D glasses. At the Consumer Electronics Show in 2011, a vast range of 3D TVs was unveiled by numerous tech giants. The novel technology on show included Mitsubishi’s 92-inch model from its new ‘theatre-sized 3D Home Cinema TV’ lineup, a prototype of Toshiba’s ‘Glasses-Free 4K 3D TV’, and new 3D TV series announced by Samsung and LG. Soon, 3D TVs were the new craze, and their sales had never been better - according to DisplaySearch, 3D television shipments totalled a staggering 41.45 million units in 2012.

The fall of 3D TV

Unfortunately, this craze was short-lived, and many buyers soon discovered (possibly too late) that the television sets they had bought had several drawbacks. The one that particularly dissuaded customers was the hassle and expense of setting up a 3D television. A BBC article by Dan Whitworth, covering the first 3D TVs sold in the UK from April 2010, says: “The price of the first TV released is £1,799 - and there are lots of other bits of kit needed to get the right set-up. A pair of the 3D glasses this system uses cost £150, a 3D Blu-ray DVD player is around £350, and a compatible HDMI cable is £50.” These TVs required expensive new 3D gear the public did not already have, as listed in the article. This was a nuisance to most of the public, and some were unwilling to spend such a vast sum on equipment they were only going to use with an equally expensive 3D TV.

Furthermore, two different kinds of 3D television were sold – passive and active – each requiring a specific pair of glasses compatible only with its own brand and type. This was both confusing and vexing for customers, as it meant buying multiple pairs of glasses to view as a group, which meant more cost. Moreover, the range of Blu-ray movies available to buy and watch was extremely limited, and many had poor-quality 3D and low frame rates.

In addition to the mounting frustrations, buyers of stereoscopic 3D glasses also discovered potential side effects of wearing them, such as dizziness, headache and nausea. This was because of the flickering caused by the shutters of the 3D glasses opening and closing, despite some TVs labelling themselves ‘flicker-free’. Warnings were even occasionally issued on consumer sets, advising pregnant and elderly viewers to avoid watching 3D movies. Many also complained online that the glasses were heavy, making them even more off-putting to new customers.

Now that the public was discovering that 3D TV sets were not only pricey and cumbersome but potentially even harmful, sales began to decline. In 2013, sales plummeted and suddenly nobody was buying 3D television sets. TV manufacturers caught on and quickly discontinued their 3D lines. By 2016 only a few premium TV models supported 3D movies, but even that did not last long. The last 3D TVs were sold by LG and Panasonic in 2016, and no more were ever produced. Although 3D in cinemas remained intact throughout the late 2010s (in my experience), the number of 3D movies shown dropped from 102 in 2011 to 33 in 2017 – a significant fall. 3D home TVs died as quickly as they came to life. However, there is still a sliver of hope for 3D television. The popularity of 3D films flickered on and off throughout its lifetime – could the 2016 decline of 3D TV be just another one of those flickers? Only time will tell.

The mysterious television

Jon S (Sh)

Televisions: they’ve become a normal aspect of life. But how much do you actually know about TVs? They show images, you can watch the football on them. But how do they work? Why are they the shape they are? These are just some of the questions that almost no one knows the answer to, and I hope I can provide some explanation.

Why are TV screens rectangular? It’s the kind of thing you don’t ever think about, isn’t it? Of course, it is not just TVs: other visual representations of the world, people and places – movies, photographs, paintings – are rectangular too, as are most windows and mirrors.

Some reasons are:

• TVs are shaped like books, windows, or paintings because we’re used to those shapes. A rectangle feels “normal” to look at.

• TV companies might have stuck with rectangles to make their designs consistent across brands so people wouldn’t get confused.

• Other shapes might have caused issues, like blurry images at the edges or more difficult manufacturing, so rectangles were the easiest solution.

• People got used to rectangular screens over time. Even if other shapes were possible, companies might have avoided them because viewers prefer what they already know.

Another thing most people never think about is how TVs actually work. TV technology has evolved massively, starting with mechanical TVs, advancing to cathode-ray tube (CRT) TVs, and leading to today’s modern displays. But whilst you may have an idea of how modern TVs work, do you know how mechanical and CRT TVs worked?

Mechanical TVs relied on a Nipkow disk (invented by Paul Nipkow), a spinning disk with holes arranged in a spiral pattern that scanned images line by line. Light passed through the holes and was captured by a photodetector, which converted it into an electrical signal. At the receiver, a synchronised Nipkow disk recreated the image by modulating light intensity. These TVs were limited by low resolution, dim images and mechanical inefficiency, but they were a vital first step in TV history.

CRT TVs, dominant from the 1930s to the 2000s, used a cathode-ray tube in which an electron gun emitted a beam of electrons directed onto a phosphorescent screen. Deflection coils guided the beam in a raster pattern, with its intensity controlling the brightness of each point. Phosphors emitted light when struck, forming the image. Colour CRT TVs used red, green and blue phosphors, with a shadow mask ensuring correct alignment. While CRTs improved image quality and reliability compared with mechanical TVs, they were bulky, heavy and consumed significant power. Interestingly, curved screens were briefly introduced in 2013, but their popularity declined by 2017 due to increased glare and difficult wall mounting.
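
As a small illustration of the raster pattern mentioned above, this Python sketch generates the order in which the beam visits each point: the horizontal deflection sweeps left to right along a line, then the vertical deflection steps down to the next. The screen dimensions are illustrative, not a real TV standard.

WIDTH, HEIGHT = 4, 3  # illustrative screen size in points

def raster(width, height):
    for y in range(height):      # vertical deflection steps down line by line
        for x in range(width):   # horizontal deflection sweeps across each line
            yield (x, y)

print(list(raster(WIDTH, HEIGHT)))
# [(0, 0), (1, 0), (2, 0), (3, 0), (0, 1), ...] -- at each point, the beam
# current sets how brightly the phosphor glows.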

Modern TVs, including LED, OLED and QLED technologies, revolutionised the viewing experience. LED TVs are LCD displays with LED backlights, in which liquid crystals modulate light to form images, enhanced by colour filters. OLED TVs use pixels as individual light sources, offering perfect blacks and stunning contrast. QLED TVs enhance LED technology with quantum dots, providing brighter and more saturated colours.

Next, another thing people don’t consider: the sound that comes out of TVs. It comes from speakers, right? But speakers have evolved along with the TV. Here’s a brief timeline of how sound in TVs has evolved.

1920s: TVs began as silent devices in 1927 – just a picture. The technology to synchronise audio and video had yet to be developed.

1930s and 1940s: By the 1930s, mono sound was introduced, transmitted via amplitude modulation (AM). In the 1940s, frequency modulation (FM) improved audio clarity, allowing audio and visuals to be properly matched.

1950s: Black-and-white TVs established mono sound as standard. Built-in speakers provided basic audio, and the NTSC standard ensured smooth audio-video synchronisation.

1960s: Stereo experiments began, hinting at richer soundscapes. Though mono sound prevailed, external speaker connections allowed some viewers to enhance their experience.

1970s: The 1970s introduced dual-speaker TVs, making stereo sound more accessible. External outputs enabled connection to home stereo systems for improved audio.

1980s: NICAM (Near Instantaneous Companded Audio Multiplex) technology debuted in Europe, offering high-quality stereo. Dolby Pro Logic brought multi-channel surround sound, boosting home theatre popularity.

2000s: Flat-panel TVs sacrificed audio quality for sleek designs. HDMI enabled uncompressed digital sound, while soundbars and surround systems compensated for smaller speakers.

2010 – Present Day: Dolby Atmos and DTS:X brought 3D sound to Smart TVs, which also integrated streaming services. Advanced algorithms and multi-directional speakers optimised sound for modern setups.

So, at its core, the invention of television was driven by human curiosity and the desire to send moving images across vast distances. Early pioneers experimented with electrical signals, blending science with creativity to develop a medium that could bring communication and entertainment to life in a completely new way. Fast forward to today, and TVs have evolved into sleek, high-definition devices that are as much about design as they are about technology. From their bulky, boxy beginnings with tiny, flickering screens to the ultra-thin, wall-mounted panels of today, the shape of TVs has been reimagined time and again. Alongside their visual advancements, the sound has undergone a similar revolution. Early models offered tinny, basic audio, while modern TVs deliver immersive surround sound experiences that make you feel like you’re in the middle of the action.

Television has become more than a way to access information or enjoy entertainment—it’s a device that seamlessly blends form and function. From the hum of the earliest black-and-white sets to the crystal-clear images and vibrant soundscapes of today, TVs have transformed how we experience the world. They remain a testament to human ingenuity, shaping not just our living rooms but also our understanding of culture and connection.

The life and impact of Sir David Attenborough

Henry W (U6)

Sir David Attenborough, renowned for his recognisable voice and unparalleled contributions to the broadcasting of Natural History, is an iconic figure whose life has been dedicated to uncovering the wonders of the natural world. Attenborough’s journey from an inquisitive boy to a world-renowned environmentalist and nature broadcaster is a story that greatly interests me, and one that I have thoroughly enjoyed researching and writing about.

David Attenborough was the second of three brothers and was raised in an educationally stimulating and thriving household. His father, Frederick Attenborough, was the principal of University College, Leicester. He was a keen academic, running the scholars’ programme at the College and often embarking on research projects himself. There is no doubt that his father’s intellectual prowess influenced David’s early interest in learning. As a child, Attenborough displayed a fascination with the natural world: he collected fossils, explored the countryside, loved reading books on Natural History and took part in local wildlife expeditions.

Attenborough attended Wyggeston Grammar School, where he excelled in the Sciences. He went on to study Geology and Zoology at Cambridge, where he was consistently top of his class. Following his degree, he served two years in the Royal Navy, during which he travelled and explored new wildlife, deepening his passion for the natural world.

In 1952, Attenborough joined the BBC as a trainee producer, initially assigned to work on factual programmes. However, his talent and passion were soon recognised, and he quickly rose through the system. In 1954, he launched the show Zoo Quest, in which David and his team travelled to exotic countries to capture animals and bring them back to a zoo in the UK (which was legal at the time). They also filmed animals and local people, exploring their customs. The programme was immediately successful: it was groundbreaking for its time and captivated audiences, establishing Attenborough as a pioneer of wildlife documentaries.

Throughout the early 1960s, Attenborough continued to develop programmes that explored the beauty of the natural world. He was promoted to Controller of BBC Two in 1965, and during his tenure the channel was extremely successful, airing acclaimed shows such as Civilisation, The Ascent of Man and Monty Python’s Flying Circus. Despite this success, Attenborough wanted to return to the field to write and produce more television series, and in 1972 he resigned from his position to return to filmmaking and broadcasting.

Attenborough’s work after this point reached monumental heights. His series of documentaries began with Life on Earth in 1979 and continued with The Living Planet and The Trials of Life, to name a few. Each programme was meticulously researched and filmed, and Attenborough’s iconic, passionate narration brought both information and entertainment to audiences. These series set new standards and are widely regarded as among the best nature documentaries of all time.

During Attenborough’s time creating his documentaries, his awareness of the threats facing the natural world deepened. In the latter part of his career, he shifted from being solely a presenter of Natural History to becoming an influential advocate for environmental conservation. Recent documentaries such as The Blue Planet and Planet Earth not only showcased breathtaking visuals and spectacular animals, but also highlighted the threats facing ecosystems and the dangers posed by human activity.

In recent years, Attenborough’s message has become more urgent. Works such as Climate Change – The Facts and A Life on Our Planet are a call to action for our generation and future generations to act on the issues facing our planet. In these films, Attenborough outlines the dramatic changes he has witnessed in the natural world, emphasising the critical need for sustainable practices to mitigate climate change and preserve biodiversity.

Despite his global fame, David Attenborough has led a relatively private life. He married Jane Elizabeth Ebsworth Oriel in 1950 and they had two children, Robert and Susan. Jane passed away in 1997, which deeply affected Attenborough. Nevertheless, his work remained consistently brilliant and a source of inspiration for many.

Sir David Attenborough’s career, spanning over 70 years, leaves an immense legacy. His documentaries have generated hundreds of millions of views and caused countless people to think sustainably and consciously about the state of our world. His work is inspiring and enjoyable for all generations. Attenborough once said that ‘No one will protect what they don’t care about, and no one will care about what they have never experienced’. Attenborough’s work has given people the experience and means to care, so that we can now hopefully act.

How accurate is the TV series Grey’s Anatomy?

Helena C (L6)

Grey’s Anatomy is a popular medical drama TV series that intertwines the world of healthcare with your typical Hollywood drama. The show premiered in 2005 and recently released its 21st season in late 2024 despite the star of the show, Meredith Grey herself, leaving the show in 2023. The show has sparked many young people’s interest in the medical field and has become an influential platform, raising awareness of the world of healthcare. Grey’s Anatomy’s ability to strongly influence society derives from the fact that Shonda Rhimes, head writer and executive producer of the show, has been able to incorporate real-world issues into the storyline, covering a large breadth of topics, and therefore reaching a wider audience.

Whilst it is such a popular show, suspicion has been raised about its accuracy and how realistic it actually is. It’s obvious that a real functioning hospital wouldn’t burst into song and conduct a full-blown musical as they did in the controversial episode “Songs Beneath the Song” (season 7, episode 18); however, how accurate is the medical aspect? Are the medical procedures, diagnoses and protocols all just a load of fictitious, fancy-sounding words that the scriptwriters threw in to give a false sense of reality?

Grey’s Anatomy has often faced criticism throughout its time under the spotlight for the degree to which the featured content is accurate. There almost always seems to be a rare, unsolvable case with a 0.001% survival rate every few episodes that somehow, more often than not, ends up a success. Shonda Rhimes, alongside the show’s writers, consults with real-life physicians, so why are there still inaccuracies? Those who watch the show but are also in the medical field themselves have stated that the terms and procedures are real and are used and practised in medicine, but are often oversimplified or exaggerated. The operating rooms are presented as extremely fast-paced and high-pressure, when in reality surgery is a slow process that requires delicate precision and a calm environment.

The show gives off an overall sense that the information discussed and the terminology used is accurate; however, to a trained medical eye, there are subtle details that seem almost impossible to overlook. Dr. Kailey Remien, a physician from Ohio who recently discussed the inaccuracies of Grey’s Anatomy in an article, stated that she often noticed the inappropriate and improper use of certain instruments. She says that it drives her crazy when actors put their stethoscopes on backwards and still discover a heart murmur! The meticulous details that doctors have had ingrained in them since medical school are second nature, because they know the consequences if steps are missed; in the show, however, those little details never seem to have any disastrous effect on the outcome.

The doctors on the show are portrayed as impulsive and always willing to bend the rules to get the preferable outcome. Protocol is often broken, which jeopardises patient safety, and decisions made in high-stakes crises seem reckless and spontaneous. In real settings, this would be heavily frowned upon and often illegal due to the strict guidelines and ethics that doctors must follow. The punishments that the doctors in Grey’s Anatomy face are, needless to say, lenient. For example, when Izzie Stevens (an intern) cut a patient’s LVAD wire out of love for him, she was able to retain not only her license to practice but also her place in the residency program. In reality, this would most likely have landed the physician in jail, as well as having them stripped of their privilege to practice medicine for the rest of their life.

Grey’s Anatomy also inaccurately depicts the hierarchy and the relationship between the different seniorities of doctors, or ‘the surgical food chain,’ as described in the show. Interns normally don’t address or contact the attending, let alone the chief of surgery, as often as the interns in the show do, and this is because, under normal, realistic circumstances, the senior residents act as the bridge between them.

Lastly, something Grey’s Anatomy is infamous for is the extent to which all characters get romantically entangled. It is a frequent question asked of physicians who discuss the show, and the responses are all very similar. Hospitals are incubators, exposed to any and maybe every type of bacteria, so any self-respecting, educated physician would not involve themselves in a sex scandal in the on-call rooms. Additionally, the bottom line is that doctors realistically wouldn’t have the time for such activities when on call. There is always someone to help or something to do because, after all, hospitals are places for people in need and seeking help.

Conversely, some may argue that Grey’s Anatomy is somewhat accurate. Those who criticise the show often forget that it is a TV show: to a certain extent, everything needs to be dramatised and exaggerated to attract an audience, and at the end of the day that is the show’s goal. An inaccuracy that has been brought up by critics is how the show seems to ‘conceal’ the administrative aspects of the job. This may be true, and critics make a valid point that a lot of the job does involve large amounts of administrative work; however, accurate as it may be, no one wants to watch a show that is mainly doctors sitting at a laptop or filling in charts.

Some residents have actually commended the writers of the show, saying that they were able to accurately show the reality of working in medicine. The seasons in which MAGIC (the acronym used for the five interns: Meredith Grey, Alex Karev, George O’Malley, Izzie Stevens and Cristina Yang) were interns portrayed brutal, long working days, which is the reality. The writers didn’t sugarcoat it to give a false sense of a perfect fantasy world; instead, it is slightly dramatised so that viewers aren’t led to believe it’s an easy job. Residents have said the show encapsulates the emotions of being an intern perfectly: the overwhelming feeling that you are holding someone’s life in your hands without exactly knowing what you are doing. It is a terrifying feeling that the show has been able to subtly portray and get across to viewers.

What viewers often find shocking is that many of the cases featured on the show were based on real-world people and events! For example, it may be hard to believe that the episode where a man walks into the hospital with hands resembling tree branches was anywhere close to real, but it depicts an actual disease called epidermodysplasia verruciformis, in which warts grow on the body that look a lot like branches or tree bark. Seems more believable now, doesn’t it?

So, in conclusion, just how accurate is Grey’s Anatomy? To the normal eye, Grey’s Anatomy is your typical drama series with crisis after crisis, scandal after scandal, but to a trained medical eye, it is riddled with inaccuracies. From the frequent romantic involvement between characters to the improper use of medical equipment, Grey’s Anatomy does not seem to rank high on the list of most realistic medical shows. However, what people have seemed to lose sight of is the fact that this is Hollywood! Writers most likely consciously left out information to not bore the audience and to make time for other, more engaging storylines. Without the drama and exaggeration, there is no show, and Grey’s Anatomy would not be what it is today: a medical inspiration, an encapsulation of the medical field, and, of course, a prime-time American television medical drama.

How do medical dramas affect doctor and patient expectations in real life situations?

Matilda B (Re)

City Hospital, which aired in 1951, is often considered to be the first televised medical drama. Since then, hundreds of medical dramas have been produced in Western countries alone, with varying degrees of success. Even after all this time, this specific genre of television is still hugely popular among a wide range of people. A few famous medical dramas include House, Scrubs, ER and Casualty, with the most notable being Grey’s Anatomy, a series which first aired on 27th March 2005 and is still going 19 years later. The series is the eighth-highest-grossing series of all time [1], and, according to ABC’s president Ben Sherwood, roughly 200,000 people watch the pilot episode on Netflix each month [2]. However, what makes this genre so popular, and does it affect real-life healthcare expectations in a positive or negative way?

On the surface, medical dramas do not seem all that appealing. They feature a location and topic which to most people seem mundane and relatively familiar, and upsetting, gory scenes do not sound particularly great for daytime watching. Despite this, the huge number of medical dramas, and the equally large number of seasons these shows run for, prove they remain very popular.

One reason for this is that they are largely relatable. The shows feature a high level of suspense and drama while remaining tangible to viewers, which in a way makes them all the more tense, since the situations are plausible to those watching. Scientifically, our brains find it hard to look away from disaster. Dr David Henderson, a psychiatrist, explains that ‘witnessing violence and destruction, whether it is in a novel, a movie, on TV or a real life scene playing out in front of us in real time, gives us the opportunity to confront our fears of death, pain, despair, degradation and annihilation while still feeling some level of safety […] We watch because we are allowed to ask ourselves ultimate questions with an intensity of emotion that is uncoupled from the true reality of the disaster: “If I was in that situation, what would I do? How would I respond? Would I be the hero or the villain? Could I endure the pain? Would I have the strength to recover?” We play out the different scenarios in our head because it helps us to reconcile that which is uncontrollable with our need to remain in control’ [3]. Medical dramas feature a huge number of disaster-like situations, thus keeping viewers engaged for long periods of time.

While being unarguably popular, medical dramas do not always affect patients and doctors in positive ways. The so-called ‘Grey’s Anatomy Effect’ is defined by the National Library of Medicine as unrealistic expectations, like those caused by misleading and unrealistically optimistic medical stories, which often lead to worse health outcomes; as a result, mortality rates increase [4]. Shows such as Grey’s Anatomy and ER do strive for accuracy by having physician consultants; however, inaccuracies are still common. CPR, a vital part of first aid, has been found to be performed highly inadequately on TV, while resuscitation succeeds far more often than in real life [5].

As well as this, a 2018 study compared 269 Grey’s Anatomy episodes with 4,812 patients from the National Trauma Data Bank’s National Program Sample6. The results showed that, in Grey’s Anatomy, mortality after injury was notably higher than in real life, with 22% of patients dying in the show compared to just 7% in reality. This makes viewers more anxious about hospital visits, and makes doctors appear far less competent. The study also found that, after arrival at the ER, a massive 71% of TV patients were taken directly to the operating room, compared with just 25% in the NTDB sample. This can cause frustration in real-life patients when they are not taken to the operating room as quickly as they believe they should be.

It is fair to say that these factors add massively to the drama and watchability of the series. However, because so many people get health information from television (a survey of geriatric patients, for example, found that 42% of older adults named television as their primary source of health information7), inconsistencies such as these can have a negative effect on both the patients and the medical professionals involved. This is supported by a study by Dr Brian L. Quick, a professor of communication and medicine, which found that viewing Grey’s Anatomy had negative effects on patient satisfaction8.

Conversely, medical dramas can raise awareness among people who would otherwise have incredibly limited medical knowledge. One study found that 17% of viewers were inspired to speak to their doctors about an issue they had seen on Grey’s Anatomy9. The show can also help to discourage prejudices caused by ignorance of medical facts. This is demonstrated in an episode of Grey’s Anatomy in which a young, pregnant, HIV-positive woman asks for an abortion, before learning that with proper treatment she has a 98 percent chance of delivering a baby who is HIV-free. A study surveyed a random group of viewers before the episode, a week after it aired, and again six weeks later. One of the questions asked was ‘is it irresponsible for a woman who knows she is HIV positive to have a baby?’. Before watching the show, 61 percent answered yes; a week after watching, only 34 percent said it was irresponsible; and six weeks after the show aired, 47 percent of viewers said it was irresponsible9. Lack of knowledge leads to unfair judgement and prejudice, and shows such as Grey’s Anatomy have helped to tackle this issue in the past, proving that they can have positive effects on viewers.

In conclusion, medical dramas are a genre which has been popular for more than seven decades, and which continues to be widely and frequently watched to this day. It is important for viewers to remember that these shows are not documentaries, and to bear this in mind when medical ‘facts’ are mentioned. At the same time, viewers should make sure they talk to doctors or other medical professionals about any issues raised within a series which concern them. As a species we love drama, so I am sure shows such as these will remain popular well into the future.

1 The Billion-Dollar Shows: The TV Shows That Made the Most Money, Brand Vision (brandvm.com)

2 The Grey’s Anatomy Effect: Unraveling the…, Duke Medical Ethics Journal (dukemedicalethicsjournal.com)

3 The Science Behind Why We Can’t Look Away From Tragedy (nbcnews.com)

4 Beyond the Drama: The Grey’s Anatomy Effect and Medical Media Misrepresentation, The Monarch (amhsnews.org)

5 Cardiopulmonary resuscitation on television: are we miseducating the public?, Postgraduate Medical Journal (oup.com)

6 Grey’s Anatomy effect: television portrayal of patients with trauma may cultivate unrealistic patient and family expectations after injury, Trauma Surgery & Acute Care Open (bmj.com)

7 The Grey’s Anatomy Effect: Unraveling the…, Duke Medical Ethics Journal (dukemedicalethicsjournal.com)

8 The Effects of Viewing Grey’s Anatomy on Perceptions of Doctors and Patient Satisfaction (researchgate.net)

9 Grey’s Anatomy Raises Health Awareness, CBS News

What similarities can be drawn from dystopian films and real-life environments?

Tallulah B (Re)

The Hunger Games is a series of dystopian films based on the trilogy of novels by author Suzanne Collins. The story is based around the annual games that take place in a civilisation called Panem. The Hunger Games are essentially a fight to the death between 24 people, two randomly selected from each of 12 Districts, leaving one survivor who is showered in riches. This is all overseen by the Capitol, Panem’s equivalent of a government. The District system is a harsh scale from wealthy to poor, with each District responsible for a particular resource, starting at District 1 and ending at District 12.

In an interview with Scholastic Media, Suzanne Collins mentioned that her inspiration for the storyline came from channel surfing one night, flipping between a reality TV programme and footage from the war in Iraq. She said ‘the lines began to blur in this very unsettling way, and I thought of the story’. Many dystopian novels rely on unrealistic or imaginary scenarios to fit the genre. The fact that her idea stemmed from a disconcerting view of reality, however, shows how the storyline of The Hunger Games is primarily focused on the similarities between less fortunate, corrupted countries and a more fortunate country’s idea of a nightmare.

Overall, the dystopian series follows three main themes: the District system and economic divide; the exploitation of resources and labour; and control and manipulation of the media. The events of The Hunger Games are strongly shaped by the segregation between Districts and the unjust living conditions that follow. In District 1, where the most common work is producing luxury goods, the population is stereotypically wealthy and very comfortable. The same holds roughly true down to Districts 3 and 4, home to the technology and fishing industries. Lower down the scale, however, lie poverty and struggle: District 11, agriculture, and District 12, coal mining. These areas endure markedly unjust conditions, and the famine and homelessness are easy to spot in the films.

Conditions like these are equally destructive in the current world, directly mirroring circumstances that help the rich get richer while leaving the lower classes to struggle. A widely shared photograph shows the border between the Paraisópolis favela and the affluent district of Morumbi in São Paulo, Brazil. The contrast in wealth is immediately obvious, yet nothing is done about it. At the moment nobody feels compelled to change anything; however, a further similarity between fact and fiction is that, down the line, a large-scale revolution can erupt. The Russian Revolution started as a result of dissatisfaction among peasants, workers and soldiers towards Tsar Nicholas II, and ended with them murdering their Tsar, almost parallel to the eventual revolution in the Games.

Likewise, exploitation of labour and resources is a common theme throughout the series. Whilst District 11 supplies the rest of the population with food, its people barely have enough for themselves. There is a general scarcity of resources in The Hunger Games, and even though most are earned by hard-working labourers, food is passed up to the wealthier Districts. It is easy to compare this to hard-working labourers elsewhere, for example in Botswana. Botswana has the biggest yield of diamonds in Africa, and yet its economy still suffers, because the foreign-owned, multi-million-dollar extraction companies managing the sites take the bulk of the profit and get away with paying workers a minimum wage. This is yet another parallel between fictional dystopias and real environments.

Many countries around the world are familiar with surveillance, as it is a reliable way to prevent crimes such as theft. However, some governments have harnessed cameras with the intent of suppressing freedom of expression. In North Korea, for instance, cameras and CCTV appear to be standard practice across the country. Moreover, it has been noted that even private conversations can be recorded and examined in a way that completely defies any notion of free speech. Yoshihiro Makino, a contributor to the book Inside Pyongyang, wrote: ‘seemingly, every aspect of a person’s existence in North Korea is monitored. There is a general sense that it is dangerous to engage in any serious conversation about sensitive topics’. This alone shows that, with bad intent, even a surveillance system designed to prevent crime can become a terrifying tool. In Panem, the country in which The Hunger Games takes place, the Capitol uses cameras in a similar way. What this leads to is an authoritarian government and a complete lack of free speech. The constant surveillance further confirms the dystopian environment in which the civilians of Panem are living.

Furthermore, a similarity between dystopian-like governments worldwide and the Capitol is the use of propaganda and media manipulation. Propaganda is now, and always has been, an alarmingly influential method of promoting a point of view. Displaying posters with information, true or not, or spreading rumours via radio broadcasts tends to get a point across and bolster particular theories and opinions. Likewise, something as simple as modern click-bait is a prime example of media manipulation. The Capitol, however, manipulates its media in stronger forms. Every year the Games are televised to every District, supposedly to share the ‘entertainment’. Although it may seem innocent, the Capitol subtly benefits from this opportunity to show its population the authority that it holds. Having the ability to make the 24 subjects compete in the first place, and being able to force the Districts to watch the Games play out, shows just how much power it possesses. The annual Games were initially created as a form of punishment, to remind civilians of the cost of rebellion and to prevent any from ever happening again. Overall, there are a surprising number of similarities between a work of fiction depicting disorder and government systems intent on keeping order.

Televised football and why it’s become a monopoly

Will S (L6)

Televised football has undergone an exceptional journey, going from a simple broadcast in the ’30s to a worldwide spectacle that generates billions annually. Alongside this remarkable growth, however, the industry has become more and more monopolised, with only a few corporations dominating television rights and hiking up prices for the average consumer. The monopolisation of the televised football industry is a worrying, topical issue, shaped by modernisation and by football as a whole over the course of the past century.

The first football game to be broadcast on TV, in 1937, was a practice match between Arsenal’s first team and reserves, televised by the BBC from Highbury stadium in London. Although a momentous occasion in hindsight, the match reached a very limited audience. However, the stage was set for more to come: in 1938, the first competitive match was televised, the FA Cup Final between Huddersfield Town and Preston North End. The progression of televised football was then disrupted by World War II, yet once that had passed, by the mid ’50s, football on TV was back up and running. At first, many clubs feared that televising games would reduce match attendance, which led to an extremely cautious approach, with only certain matches being shown on TV.

In 1960, ITV took the leap of signing the first contract with the Football League, worth £150,000 at the time. This was followed by the BBC’s creation of Match of the Day in 1964. The programme featured weekly highlights which quickly attracted thousands of fans, turning it into a cultural phenomenon. However, it wasn’t until the 1970 World Cup in Mexico that televised football truly changed. Following the arrival of colour broadcasting in the UK in the late 1960s, this was the first tournament to be broadcast in colour, providing a live viewing experience that captured the vibrancy of matches in a way never seen before. During this decade, the practice of paying leagues and clubs for broadcasting rights emerged, marking the start of commercial agreements between television and football, albeit for sums that were very modest compared to what was to come.

The true turning point for televised football came in the ’80s. The introduction of cable and satellite TV, paired with worldwide economic growth, opened new opportunities for companies. In England in particular, stadium attendance was declining due to a number of issues, including the Heysel Stadium disaster of 1985 and the Hillsborough disaster of 1989, as well as a general decline in the game’s reputation caused by hooliganism. Teams struggled financially as a result, and television was seen as a lifeline, offering clubs new revenue through broadcasting deals. It was such financial motives that led the top clubs to break away and create the English Premier League in 1992, which allowed them to negotiate their own broadcasting rights. Through this, Sky Sports acquired exclusive rights in a massive £304 million deal, which brought never-before-seen multi-camera angles, expert analysis and even pre-match entertainment. Seeing this, other leagues and competitions across Europe followed suit, negotiating new, lucrative deals. By the year 2000, Sky was paying £1.1 billion for a Premier League deal alone.

As the industry expanded, only a few key companies could keep up with the rising price of rights. Sky Sports came to dominate English football, while other markets around the world were monopolised by giants such as ESPN, and BT Sport later bought into the UK market. The shift towards a monopolised industry has a few causes and many consequences for fans, clubs and the sport itself.

For one, the price of broadcasting deals has begun to reach astronomical levels. In 2015, Sky and BT Sport between them paid £5.1 billion for just three years of Premier League rights. Such costs are passed on to customers, who face rising subscription fees. Moreover, the monopolised industry has restricted access for fans across the world: matches that were originally available on free channels such as the BBC or ITV are now locked behind paywalls.

Another consequence is that broadcasters have begun prioritising international markets, since overseas fans cannot attend matches live and therefore rely on televised football. Premier League clubs generate significant revenue from overseas broadcasting deals, yet this focus neglects the domestic audience, who must pay more for the service. Lastly, the revenue generated from TV deals has created financial disparity among teams, making football less competitive and more ‘pay-to-win’. This is largely because bigger clubs are paid more for the rights to air their games, which simply allows the gap to grow.

The future of televised football is uncertain in many respects. The monopolisation of the industry has driven prices ever higher for the average consumer, and there are fears that the average football fan may be priced out of watching weekly games. Fans have voiced frustration, and many have turned to piracy: illegal streams across the internet draw millions of viewers worldwide. To combat this, many teams have started their own TV channels that stream their games directly to fans at lower prices; however, this carries risks that could further disrupt the market.

Overall, the history of televised football is an interesting and important story, describing the rise of football from a simple pastime to a global commercial enterprise. While TV has grown the game exponentially, it has also concentrated the market in the hands of a few broadcasters, marginalising fans and smaller clubs. This raises serious questions about the balance between commercial interests and the accessibility and integrity of the game. The sport needs to remain inclusive to all fans across the world, and it is struggling to do so at this very moment. The future of televised football, and of the sport as a whole, remains very uncertain.

Will an English T20 competition ever be worth more than the Indian Premier League?

The Indian Premier League (IPL) is an Indian cricket competition that was founded in 2008 and has since grown to command television rights worth over $6.2bn for a five-year period. While there are many similar competitions across the world, a number of factors make it unlikely that any of them will ever come close to rights of such value, such as India’s vast population and the way that franchise ownership works.

Before answering this question, we have to understand how TV rights work and how their value is estimated. A TV right is the right of a broadcasting corporation to show an event live, or at a later date in the form of highlights or replays. These can be extremely valuable (the NFL has deals worth over $100bn for the next 11 years1), and their value is determined by a variety of factors. These include the number of customers likely to watch a given event, the amount they are willing to pay in subscriptions, and the amount advertisers will pay to advertise during the event.

The fact that over 1 billion people live in India means that while TV companies pay many times more for these rights, they are also broadcasting to many more paying customers, so Star India (the TV company broadcasting the IPL) is likely to recoup the cost through a far larger subscriber base. This can be demonstrated by the fact that the TV rights for the Hundred (England’s short-form competition) are worth £251m a year, compared to £1.026bn for the IPL. In India, cricket has an audience of 612m people2, compared to only 13m in the UK3, meaning that broadcasters in India, even though they are paying roughly four times more for rights, are reaching over forty-seven times more customers. The population of India, compared to any other cricketing nation, is so large that broadcasting companies like Star India will always be willing to pay more for an Indian competition than its competitors.
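
The arithmetic behind this comparison can be laid out in a few lines. The Python snippet below is a purely illustrative back-of-envelope sketch using the rounded figures quoted above; it treats ‘audience’ as potential viewers rather than paying subscribers, so the per-viewer costs are indicative only.

```python
# Back-of-envelope comparison of rights cost per potential viewer,
# using the rounded figures quoted in this article (illustrative only).

ipl_rights_gbp = 1.026e9      # IPL TV rights value, per year
hundred_rights_gbp = 251e6    # The Hundred TV rights value, per year

india_audience = 612e6        # cricket audience in India
uk_audience = 13e6            # cricket audience in the UK

print(f"Rights cost ratio (IPL / Hundred): {ipl_rights_gbp / hundred_rights_gbp:.1f}x")  # ~4.1x
print(f"Audience ratio (India / UK): {india_audience / uk_audience:.1f}x")               # ~47.1x
print(f"IPL cost per potential viewer: £{ipl_rights_gbp / india_audience:.2f}")          # ~£1.68
print(f"Hundred cost per potential viewer: £{hundred_rights_gbp / uk_audience:.2f}")     # ~£19.31
```

On these figures, each potential viewer costs an Indian broadcaster under £2 a year, against nearly £20 in the UK, which is why Indian rights can keep climbing while still representing better value per viewer.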

Additionally, the way that franchise cricket has developed means that the owners of IPL franchises have also ended up buying teams in South Africa, the USA, the UAE and the Caribbean, and these teams all share players with their linked franchises, especially the original teams in India. These competitions will therefore never outgrow the IPL, because most, if not all, of the best players in these other leagues will also play in the IPL. Many, though, will play in the IPL and not in other leagues, for reasons such as the extravagant amounts the IPL pays them, which leave no need to play smaller leagues for the rest of the year. Furthermore, the profits from these sister franchises will either go back to the owners or be pumped straight back into the IPL teams, meaning they can pay higher wages and therefore attract better players.

Some have also begun to wonder whether the same is about to happen in England, as the owners of the Delhi Capitals (an IPL franchise) have just made a landmark purchase of Hampshire CCC for £120m, and the Hundred franchises have just been put up for sale. If many of these are bought by IPL owners, it seems likely that the Hundred will simply become another secondary league. While this would provide financial stability and maybe even some better players (currently the Indian cricket board blocks Indian players from playing in overseas leagues), it would reduce to zero the possibility of the competition becoming the most valuable in the world: as soon as there appeared to be any risk of it overtaking the IPL, the owners would return their focus to India to make sure the IPL stays the premier cricketing competition in the world.

On the other hand, there are a few reasons why the Hundred could attract large TV deals. Firstly, it is held in August, within a window (spanning June, July and August) in which no other major T20 competitions take place, meaning more people will want to watch it. Secondly, a Hundred game is 40 balls shorter than a regular T20 fixture, making it even more accessible to new spectators and less of a commitment to watch regularly. This appeals to TV companies, which are always looking to broaden their audiences. The Hundred also has the chance to attract large new sponsors: it is not only a popular, widely viewed competition but also fresh and new, and sponsors may want to be associated with that.

To conclude, for the foreseeable future the IPL will remain the premier T20 competition, thanks to the high quality of its matches and the vast population of its audience. However, if the Hundred continues to grow and is handled correctly, it could solidify its spot at number two and perhaps one day challenge the IPL, although currently the league is a long way from even thinking about this.

1 https://www.grandprix247.com/2024/08/01/top-sports-with-the-most-expensive-broadcast-rights/#

2 https://www.business-standard.com/cricket/news/india-s-sport-audience-base-678-mn-2-cricketers-are-most-liked-report-124032000232_1.html

3 https://www.ecb.co.uk/news/3334603/new-figures-show-health-of-cricket-in-england-and-wales

Cricket’s Television Revolution

Will J (Re)

Kerry Packer, a media magnate, never played cricket seriously, but few people have made a bigger impact on the game: he opened cricket up to thousands more spectators through the medium of television. In 2005, Packer and Sir Donald Bradman were named as Australian cricket’s most influential men of the past 100 years. In this article, I will examine how Packer helped transform the sport, and the fortunes of its players, by launching his own World Series competition.

Packer’s father Frank reportedly gambled a 10-shilling note he found on the street to earn his passage from Tasmania to mainland Australia, where he built the family’s media empire. Kerry inherited his father’s fondness for wagering, famously squandering more than £13 million in three days in Las Vegas, although he usually won. He never played first-class cricket, but in the 1970s he started taking a keen interest in the sport’s revenue-earning potential.

In 1977, Packer offered what he considered a lucrative AUS$1.5 million to the Australian Cricket Board (ACB) for the rights to cover Test matches and domestic cricket on his Channel Nine TV network. The board turned him down and agreed a deal with rival broadcaster ABC for significantly less money. Packer, who was not given the opportunity to negotiate, was left fuming and announced that he would set up his own TV-led competition.

After the ACB snub, Packer sought the advice of former Australia captain Richie Benaud about running his own competition, which would be called World Series Cricket. Although Test cricket would still feature, he felt limited-overs games, which had previously only been played domestically in England, were the way ahead. Packer knew that television ratings were highest in the evening and was convinced that day-night matches would capture them, along with new viewers around the world. Three teams were involved in 15 so-called Super Tests - Australia, the West Indies and a World XI - while the 49 one-day matches also featured another team of Australians and a Cavaliers XI made up of players who could not get into the main teams. Many one-day games were played under floodlights, with a white ball used and coloured clothing worn. With traditional venues like the MCG and Brisbane’s Gabba out of the question, games often took place at football grounds. All of this was designed to drive television audiences.

Packer’s first task was to persuade the England captain of the time, Tony Greig, to recruit some of the biggest names in the sport. South African-born Greig was looking for a way to move to Australia after his playing career ended, and the businessman’s offer was too good to resist. Greig hosted a meeting at a London hotel which led to legends like the West Indians Viv Richards, Clive Lloyd and Michael Holding, and the Pakistanis Imran Khan, Majid Khan and Mushtaq Mohammad, signing up. England were not as badly hit by the rebellion, but six established Test players - Greig, John Snow, Dennis Amiss, Derek Underwood, Alan Knott and Bob Woolmer - played for Packer. The Australia team contained big names like Greg and Ian Chappell, and Dennis Lillee, then regarded as the premier fast bowler in the world.

Cricketers then earned nowhere near what today’s players pick up: salaries were limited and off-field endorsements were rare. Unlike now, cricket boards around the world did not offer long-term central contracts, so financial security was lacking. A number of leading international players had become disillusioned, and Packer’s money provided a tempting antidote. For the South Africans, who were in international isolation because of apartheid, it was a welcome opportunity to pit themselves against high-quality opposition on a regular basis.

However, many chose not to take up the big sums on offer because they feared the repercussions: counties said they would only employ Packer players if they were available to play for England all year. Much of the media in England and Australia was hostile, and words like ‘traitors’ and ‘circus’ were frequently used to describe the players and Packer’s televised revolution. The ECB’s forerunner, the Test and County Cricket Board (TCCB), had been unaware of what Packer was plotting and was taken by surprise. It reacted angrily: Greig lost the England captaincy and, along with the game’s world governing body (then called the International Cricket Conference), the TCCB warned players they would be banned if they took part. That ruling was overturned in court, but the TCCB averted an exodus of players by offering longer-term contracts with only slight increases in pay.

Despite the eye-catching names on display, the quality of the cricket on offer was not always the highest. However, there were some superb performances from Viv and Barry Richards, Greg and Ian Chappell, and Dennis Lillee as the World XI and the West Indies beat Australia 2-1 in the first season of Tests. The World XI also triumphed in the one-day matches, with 20,000 watching the final on TV. The second season featured day-night Tests, and again the Australian team failed to win either series, the World XI victorious by a 3-1 margin and the West Indies earning a 1-1 draw. The biggest winner was Packer, who had shown the cricketing authorities that he could go it alone and make it work, using his knowledge of television to design a winning formula.

The World Series Cricket rebellion lasted only 17 months, ending with a compromise between Packer and the authorities. Packer had his TV rights, cricket was transformed into a professional game, and the way was paved for players and administrators to pick up the riches they do today. Some participants paid the ultimate price in terms of their international careers and were never forgiven by their countries, but their sacrifice and Packer’s gambling instinct made an indelible impact on the sport and left a lasting legacy. The most watched cricket match on television is the 2011 ICC Cricket World Cup final between India and Sri Lanka, which reached a global audience of 558 million viewers. Players today make mouth-watering amounts of money, funded in part by lucrative television broadcast rights (Virat Kohli has a monthly salary of $690,000), and all of this stems from Kerry Packer’s impact and legacy on the game of cricket.

The Truth: ‘alternative facts’ and the bubble

Martha F (L6)

What is the truth? In contemporary society, this concept has become increasingly grey, politicised and controversial, as baseless conspiracy theories and ‘alternative facts’ are too often perceived as valid. As technology has become portable, instantaneous and ubiquitous, there has been a marked shift away from terrestrial television and print media towards online social media news, driven by Facebook, X (formerly Twitter) and TikTok. As of September 2024, the internet had overtaken television as the UK’s leading news source, with 71% of British adults getting their news online rather than from TV.

Why is this a problem? Put simply, online algorithms have been shown to create filter bubbles and echo chambers that restrict the utility of open news sources, because they merely reflect pre-existing beliefs and act to reinforce them. The internet has evolved from a communal metropolis into a series of digital islands upon which society has fractured, adrift and unwilling to debate alternative opinions. There needs to be a force that holds to the truth, rather than leaving it hidden amongst the shadows of conspiracy and fiction.

The polarisation of opinion has demonstrated the necessity of national media sources such as the BBC to speak truth without bias and debunk conspiracy. Television serves to prevent information cascades, in which prolonged exposure to misinformation spirals beyond our control until we lose any sense of what is true. From the first recognised newspaper, in Venice in 1566, print media has played a critical role in society: holding power to account, educating the masses and giving the citizenry a voice. Likewise, television news outlets are a force for public good, a voice for the masses that serves a purpose far beyond the immediate needs of individuals. While television remains popular, it has been undermined by a disproportionate increase in the power of online media, which fosters polarisation and blinds many to the nuances of everyday debate.

The media bubble has formed partly due to structural changes in journalism: online news outlets are gaining workers at nearly twice the rate at which legacy media institutions are losing them. Put more simply, online media companies have greater reach and their content is cheaper to create. Social media’s impact on television has been both a blessing and a curse. On the one hand, it has empowered people, enabling easy access to unprecedented quantities of information and making it easier to share opinions. On the other, it has decimated national terrestrial TV and print organisations and become a breeding ground for fake news that challenges the credibility of journalism in contemporary society.

This fundamental shift has proven problematic, as social media is highly unreliable and largely unregulated. For example, during the 2016 presidential election, fake news on social media channels received more views than stories broadcast on reliable media channels, having a profound effect on people’s opinions and ultimately on the result of an era-defining election. Additionally, as television audiences have fallen, there has been a reduction in the quality and diversity of the journalism that sits at the heart of an educated society, despite the enormous surge in the volume of stories. This is partly due to the facile reporting on social media ‘news’ sites such as BuzzFeed, which contain little original journalism. Malign actors can also manipulate stories to suit preferred narratives or, more significantly, the course of global geopolitics.

The fall of television and the rise of the new digital empires has created profound issues for society, such as filter bubbles and their resultant echo chambers. During the reign of television, most Western TV stations were separated from the state and valued for their independence, offering reliable, diverse and verifiable news. Social media news, however, is a very different animal. Choices are driven by computer-generated algorithms, which can mean that the media consumed is unvaried and tailored to pre-existing beliefs. Customised content algorithms and following patterns push pieces that support this distorted vision and reinforce pre-existing bias. The bias is perpetuated by a degree of self-administration, as platforms present people only with comfortable alternatives rather than more challenging ideas. Interestingly, brain scans have shown that engaging with beliefs that oppose your own causes genuine psychological discomfort, through the activation of the brain’s anterior insula and its emotional response mechanisms. Therefore, to avoid distress, many choose to remain ignorant. While conflict may be uncomfortable, ignorance produces dangerously warped realities, meaning it can be beneficial to step outside your comfort zone.

Furthermore, the filter bubble has only grown in size and influence. These ensnaring bubbles, first theorised by Eli Pariser, are a universal problem. They develop through both unconscious self-censorship and digital manipulation outside our control, meaning we are all susceptible. Filter bubbles are also deliberately perpetuated by complex algorithms, created by tech giants to boost corporate profit, that forcibly blind you to the truth and distort our understanding of society’s commonly held opinions. Given that an estimated 61% of millennials use Facebook as their main source of news, this leads to a drastic overestimation of how widely our views are shared. This was horrifyingly illustrated by the Republican reaction to the result of the 2020 American election, which culminated in the 6th January storming of the Capitol in Washington, fuelled by the misconception that the majority of the country supported their misguided beliefs.

It could be argued that the benefits of social media (connection, mobilisation and the sharing of information) are of vast societal value, and that the rare cases of obstructive filter bubbles are overblown. Moreover, while algorithms do dictate what information is presented to the viewer, they do not prevent access to other ideas, suggesting they are not as restrictive as is widely believed. Despite that, the issue of filter bubbles remains one of serious note: algorithm-driven platforms continue to grow in importance, and prolonged interaction serves to further narrow recommendations, leading to ever greater insularity, as evidenced by studies of sites such as YouTube.
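
The narrowing mechanism described here can be illustrated with a toy sketch. The Python snippet below is not any real platform’s algorithm; it is a minimal, hypothetical simulation of a feedback loop in which clicks make a topic more likely to be recommended again, which in turn attracts more clicks.

```python
import random

# Toy model of a recommender feedback loop (purely illustrative):
# every click on a topic raises that topic's weight, so the feed
# gradually narrows towards the user's slight initial leaning.

TOPICS = ["politics", "sport", "science", "music", "cooking"]
weights = {topic: 1.0 for topic in TOPICS}  # start with a balanced feed
preferred = "politics"                      # the user's pre-existing leaning

random.seed(42)
for _ in range(1000):
    # Show a topic in proportion to its current weight.
    topics, w = zip(*weights.items())
    shown = random.choices(topics, weights=w)[0]
    # The user clicks their preferred topic a little more often...
    p_click = 0.6 if shown == preferred else 0.3
    # ...and every click makes that topic more likely to appear again.
    if random.random() < p_click:
        weights[shown] *= 1.05

share = weights[preferred] / sum(weights.values())
print(f"Share of feed now devoted to '{preferred}': {share:.0%}")
```

Even with only a small difference in click rates, the compounding weight updates leave the preferred topic dominating the feed, which is the essence of the insularity the studies above describe.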

It is far too easy to tumble down the rabbit hole of selective news, but there are ways in which this can be combated. The technology that still offers the most effective antidote to the bubble is one that held sway for nearly a century: television. Interacting with this more reliable news source offers greater assurances that the information is verifiably researched. Moreover, it is open to the glare of full public scrutiny.

Contemporary society is stranded in a minefield of truth versus rumour, subject to a battle of opposing forces that is tearing at the heart of democratic norms. In recent years there has been a surge in fake news that blurs the border between fact and fiction, with many important global events steered by blatant lies. The rise of the internet has enabled the malicious spreading of fake news, such as ‘PigGate’ and former Prime Minister David Cameron, or ‘PizzaGate’ and presidential candidate Hillary Clinton; even now, many believe such lies to be true.

These same practices allowed the spread of disinformation about life after Brexit. The Leave campaign adopted an American social media approach that manipulated emotion rather than hard facts, promoting debunked promises of extra billions for the NHS, freedom from strangulating EU regulation and a free-market utopia. All dissenting facts were dismissed as ‘project fear’, powered through Facebook chat groups and aggressively manipulative online marketing techniques.

However, the most significant impact of fake news was its undeniable role in the 2016 US presidential election. During the campaign, 115 pro-Trump fake news stories clocked up 30,000,000 views among members of the public, leading many researchers to believe that without ‘fake news’, Clinton rather than Trump would have become President. Moreover, during the Trump administration, the concept of ‘alternative facts’ was coined by Kellyanne Conway, who claimed that Trump’s Press Secretary Sean Spicer was not spreading misinformation but merely telling an alternative truth, sparking journalistic outrage at the idea that opposing facts could simultaneously be correct.

It is widely believed, by both professionals and the public, that the solution to the increasing spread of fake news lies with national television services such as the BBC, and that television’s fight against misinformation is one that will preserve democracy. In the UK, polls show that television is our most trusted news source, a pattern repeated across the globe: a survey of 16 African nations revealed that three quarters of respondents trusted national news outlets such as TV over private sources. The miracle that is the BBC is crucial to global journalism, enabling economically disadvantaged nations to have a global voice.

Television is a solution to many of the problems that cloud truth, serving as a corrective tool to combat fake news by providing access to reliable news sources. Through transparent, open-source journalism, television wields the sword that can slay malicious misinformation, thereby rebuilding public trust and reuniting communities torn apart by polarisation. This shows the overwhelming potential of public broadcasters to combat fake news, alternative facts and the bubble, to uncover important universal truths.

How did the use of cinema contribute to the war effort in World War II?

The Second World War was a time of suffering across the world. Sons, fathers and husbands were called up to fight and potentially lose their lives, leaving bereaved families at home. Depression rates soared and morale was low, both on the Front Line and at home. People sought a way out of their problems, a distraction, and for millions cinema became one. Cinema became a prevalent source of entertainment in many countries, particularly the United States (because of Hollywood), Britain and Germany. Governments also used cinema as a way to broadcast information about what was happening on the Front Line and in other countries.

Boosting morale was one of the main uses of cinema during World War II, with movies second only to radio as the top entertainment source of the time. In Britain, once war had been declared on Germany, all public entertainment closed, including around 4,000 cinemas. However, this order was reversed after only two weeks and people began to go to the cinema again. Weekly cinema attendances grew from 20 million in 1939 to 30 million in 1944, in a country of only 48 million people. This shows the impact that cinema had on people’s lives, with attendances each week equivalent to well over half the population, even in a time of devastation.

In the United States, morale was conveyed through movies that showed a sense of group effort among soldiers, and the sacrifice of those on the Front Line. Examples of the movies Hollywood put out to boost morale include So Proudly We Hail!, which told the stories of American nurses trapped in the Philippines who tried to continue with their normal lives, and The Story of G.I. Joe, about an American correspondent writing about the war in Europe and Africa and encountering the brave men risking their lives for the cause. Film audiences at home sympathised deeply with these movies, creating a more empathetic country ready to support the soldiers and the war effort. Movies also began to include actors from diverse ethnic backgrounds, which had rarely been the case before, to broaden audiences and so widen support.

Entertainment wasn’t the only reason for keeping cinemas open: movies were an effective way to broadcast information. With 30 million people attending cinemas each week in Britain alone, the Government could put out informative films and newsreels, which often educated the public on the progress of the war and each country’s war efforts. In the United States, the Federal Government set up organisations including the Office of War Information (OWI), which supervised the war movies Hollywood was distributing to make sure the war was portrayed accurately and educationally. The OWI was unhappy with the movies being released around the time of the agency’s founding, believing that they wildly exaggerated the Nazi and Japanese use of spies and did not correctly portray what the Allies were fighting for. Supporting the OWI’s claims, a study of the movies Hollywood made in 1942 found that nearly two thirds were spy films, comedies or musicals, and that these films presented a distorted version of the war.

Before the United States declared war on Germany, Hollywood had been suspected of campaigning for American involvement in the war through the movies it put out between 1939 and 1941. A Senate subcommittee began an investigation into whether Hollywood was inserting pro-British messages into its films, leading to Hollywood being charged with attempting ‘to drug the reason of the American people’ with just over 20 films. Yet even while desperately attempting to get America to join the war, Hollywood was conscious of offending foreign audiences: at the Nazis’ request, Jewish and non-Aryan employees at the studios’ offices in Germany were fired.

In Germany, the Nazis had been using cinema to their advantage. They recognised the power of cinema as early as the 1920s, when German writers published articles and essays with titles such as Spellbound by Movies: The Global Dominance of the Cinema by Hans Buchner and The Film as a Political Instrument by Hans Traub. The goals of Nazi film policy were to push pro-Nazi and anti-Jewish propaganda. Films featured Nazi party organisations such as the Hitler Youth, in a movie called Hitlerjunge Quex, alongside anti-Semitic films such as Jew Suss. Nazi propaganda movies accounted for up to a sixth of all national film production, with the rest being entertainment films.

In 1933, through the Film Credit Bank, the Nazis gained full financial control over all films, and from 1934 all film scripts had to be approved by a Nazi film advisor. However, German films could not meet screen quotas abroad, so exports fell significantly: earnings from the international box office, which had made up 40% of all German film earnings in the silent era, dropped to only 11% in 1934-1935. From 1938 the Nazis insisted that all films be followed by a newsreel, allowing them to keep distributing information about the war. However, film production was limited by petrol and potassium nitrate shortages, and after the war the Soviets took over film production in East Germany, while UFA (Universum Film AG), the main company behind German film production, was liquidated by the British and French in 1949.

In conclusion, cinema contributed to the war effort by distributing information and boosting morale across countries, with Hollywood allegedly hinting that America should join the war and then keeping the country strong through the movies it produced, while the Nazis used the power of cinema to promote their party and ideology. Cinema continued to boom for a while after the war, until attendances declined rapidly back towards pre-1939 levels. The Second World War nevertheless dramatically altered the trends and styles seen in cinema and proved the power it could hold.

The role of television in homogenising spoken Arabic

Faith S (L6)

With over 370 million people currently speaking Arabic across 25 countries, it is not surprising that over the past couple of millennia the language has evolved and diversified. With the rise of Arabic television in the last 60 years, and Egypt’s cultural domination, the Egyptian dialect has emerged as a lingua franca across many Arabic-speaking nations. Although not officially recognised by any state as a language in itself, the Masri dialect has distinct differences from the formal Modern Standard Arabic that is more widely taught and used in formal settings. However, as the Arabic-speaking world moves on, television is becoming a less important part of daily life, giving way to many other forms of entertainment, promoting dialects other than Masri and even fostering new, modern forms of the language. So, will the influence of Egypt’s television be enough for the country to hold its place as what many see as the cultural capital of the Arabic-speaking world, or will the new generations bring international linguistic change?

As televisions started to become common household items in Arabic-speaking countries during the 1960s, Egypt’s film industry had already been flourishing for a number of years, leading to its many television hits and its recognisable language. The first Arab national motion picture organisation, Studio Misr, was founded in Cairo in 1935, and the years that followed brought ‘the golden age for Hollywood on the Nile’, with young and hopeful actors heading to Cairo, where they picked up the Masri dialect. By the time televisions were widely in use, Egypt had built a name for itself through this flourishing industry, which brought many of the first Arabic-language shows and films to people’s screens, a novel and hugely popular idea. With this entertainment being produced in Egypt, the Arabic spoken in sitcoms was, of course, typically Masri. Televisions allowed people to watch this entertainment from home, or in many cases from a local street-corner café, and thanks to Egypt’s earlier success in producing films, it unsurprisingly hooked many audiences on television programmes.

Perhaps Egypt’s most successful genre in the world of television was the musalsalat, the equivalent of a melodramatic fictional mini-series such as a soap opera. Aided by the wealthy film industry, these were given huge budgets, using lavish sets and costumes and bringing in cultural idols such as film stars and pop singers. Often loosely based on a historical event or on relatable, everyday life, musalsalats became incredibly popular during the annual period with a particular need for night-time entertainment: Ramadan. Broadcasters realised that the wait for Iftar after work, or the period between Iftar and Suhoor during the night, could be an ideal time to entertain people. The idea has now become so widely employed that special Ramadan musalsalats, with even more lavish production, are created to be shown in the evenings during Ramadan.

More recently, the country that comes closest to competing with Egypt’s television is Lebanon, with adaptations of Western shows including Arabs Got Talent and Arab Idol proving popular Beirut-based entertainment, a trend which has also rubbed off on the UAE with Million’s Poet, a talent-show-style poetry competition. However, with so much Arabic-language television historically produced in Egypt, it is not surprising that the Masri dialect became the lingua franca.

But what does the future of Arabic-language television look like? As in much of the rest of the world, easy access to multiple social media platforms means that sitting around a television to watch a show may be a dying activity. For the Arabic language, this means more exposure to other dialects through the ease of posting and watching shorter clips. Instead of watching an hour of television, the younger generation can now scroll through dozens of short videos in a matter of minutes, each potentially representing a different dialect from across the region. To keep up with trends, people are also using generational slang in a hybrid form of Arabic, weaving pop references from other languages into their everyday speech, including keeping the English terms for many brand names and consumer goods, such as jeans, despite Arabic equivalents existing; something that can be witnessed in the evolution of languages across the globe.

These hybrid forms of Arabic are also taking off in the worlds of music and texting. The term ‘code-switching’ describes moving between two or more languages, as when a singer’s lyrics shift from one to another. For example, the Algerian songwriter Khaled uses a blend of Arabic and French in his song Aicha, reflecting his bilingual upbringing and the colonial history of his homeland. For texting, Franco-Arabic has emerged: a mix of Latin letters and numerals used to mimic the Arabic script in an abbreviated form, with the numeral 3, for instance, commonly standing in for the Arabic letter ‘ayn. Even modern literature engages with these linguistic changes as authors try to stay relevant, creating a new and more international form of Arabic than generations before.

So, we can see that television, especially that produced in Egypt, has had a significant impact over the past 60 years on the homogenisation of spoken Arabic, making Masri a recognisable and commonly referenced dialect across the Arabic-speaking world. However, despite the continued strength of the Egyptian television industry, its influence seems likely to decline in favour of social media, which provides wider exposure both to other Arabic dialects and to international references. The digital era may therefore be ushering in new and potentially more varied linguistic influences that will undoubtedly shape spoken Arabic across the region in the years to come.

The greatest televised event in history

Reuben D (Sh)

John Logie Baird could never have imagined that, just 44 years after he invented his original television, made up of just two Nipkow disks covered in glass lenses and running at five pictures per second (piteous when compared with even modern gaming computers, which can run at over five hundred frames per second), the technology would have progressed so rapidly that humans would be able to broadcast live to over 650 million people from nearly 400,000 kilometres away, on the surface of the Moon. The journey from Baird’s early mechanical system to the lunar broadcasts involved a series of pre-eminent innovations:

• Electronic Television: The development of electronic television systems, which replaced mechanical scanning with cathode ray tubes, significantly improved image quality and resolution.

• Colour Television: The introduction of colour television in the 1950s brought a new dimension to the viewing experience, further enhancing the realism and appeal of televised content.

• Satellite Technology: The advent of satellite communication enabled the transmission of signals over vast distances, making global broadcasts, and even broadcasts from hundreds of thousands of miles away in outer space, possible.

Just over a decade after the beginning of the Cold War, Sputnik 1, a Soviet artificial satellite, was launched, prompting the beginning of what would become the most colossal international race of our epoch: the race to be the first country to put a man on the Moon. In his famous 1962 speech at Rice Stadium in Texas, John F. Kennedy promised to put a man on the Moon before the decade was out. This iconic speech appealed to the patriotism of the American public and began a mad, nation-backed frenzy to reach the Moon by 1970.

Over the next seven years, ten missions were sent to space, each pushing the boundaries of exploration to a new extreme. Apollo 8 and Apollo 10 were the first missions to explore the possibility of broadcasting from space, with Apollo 10 making the first live colour television transmissions from space. With Apollo 11 set to be the first mission to land men on the Moon, NASA was under astronomical pressure from the White House to provide some sort of transmission from the lunar surface, to boost the morale of American citizens of every class and to defy the grip of the Soviet communist regime.

At NASA headquarters in Washington D.C., there was a fierce debate as to whether a television camera should be carried in Apollo 11’s Lunar Module, because it would add almost 4 kilograms at a point when the engineers were so worried about the craft being overweight that they were weighing individual screws. Eventually, after many internal arguments between key personnel at NASA, a modified black-and-white Westinghouse camera with a 16mm lens was approved, and the deployment of the television camera to transmit signals to Earth became one of the additional flight objectives.

As Neil Armstrong eased himself out of the Lunar Module, he pulled open a storage assembly attached to the module’s lower stage. Within it, surrounded by gold-coloured insulation blankets, was the black-and-white 16mm Westinghouse television camera. To ensure it was able to record images of the mission, the small camera was specially equipped to deal with the high contrast between light and shade on the Moon. The image and sound signals were transmitted via a lightweight antenna on the top of the lander. The umbrella-like antenna was lined with 38 miles of fine gold-plated wire, thinner than a human hair, to reflect the signal 250,000 miles back to Earth. In the cabin, Buzz Aldrin closed a circuit breaker, and black-and-white television pictures of Armstrong’s pale, translucent form were beamed back to Earth. The images were grainy and indistinct, but they represented a stunning breakthrough in television broadcasting.

Stan Lebar, the project manager for Westinghouse’s Apollo television cameras, with the monochrome lunar surface camera (on the right) and the original camera from which it was derived, July 1969

After successfully landing on the Moon, Neil Armstrong and Buzz Aldrin were supposed to sleep for a few hours; running on adrenalin, they couldn’t wait, and requested to exit the Lunar Module ahead of schedule. This change of plan meant that the Honeysuckle Creek Tracking Station in Australia was in the perfect position to receive the first few minutes of the transmission. Goldstone Observatory in California was also receiving the signal, but its picture was grainy and hard to make out, so at the last second NASA switched from Goldstone to Honeysuckle for the broadcast to the world. The Parkes radio telescope in Australia also began relaying the signal once it had line-of-sight contact with Apollo 11.

This incredible milestone in human history broke records around the world, with over 650 million people watching globally. The records set by the mission include the first man on the Moon and the first geological samples from the Moon: the crew brought back over 21.6 kilograms of material, including rocks, lunar regolith (fine-grained soil) and material from over 13 centimetres below the surface. It also prompted the first all-night live television broadcast by the British Broadcasting Corporation.

In conclusion, the Apollo 11 mission to the Moon was an astounding feat of human willpower and motivation that will be remembered for millennia as a pillar on which the human race will forevermore stand. It made staggering progress in many fields, including long-range broadcasting and the miniaturisation of cameras. The historic broadcast was not surpassed in viewing numbers by any single event until Prince Charles and Lady Diana’s wedding at St. Paul’s Cathedral in 1981, which was watched by nearly 740 million people in 74 countries. Tragically, only 16 years later, an audience calculated to exceed 2.5 billion worldwide watched Princess Diana’s funeral.

Photo of the high-quality SSTV image before the scan conversion

Television and Spaceflight

Rhea S (Re)

From launches to landings, every step of a spaceflight is tracked and, often, broadcast to the public. Tens of millions of people watched during the great Space Race of the 1960s, and although general interest has waned, many rocket launches still make the news, and a livestream of spacecraft activity, on Earth and sometimes in space, is available almost every day. So how has this deep coverage affected spaceflight and the people involved?

A good place to start is the beginning: the famous technological and political contest between the Soviet Union and the USA, the Space Race of the 1960s, which produced some of the greatest feats of pioneering in history. The broadcasting of these events, or the lack of it, was also a great political tool. Yuri Gagarin's historic first venture into space in 1961 was not broadcast until he had arrived safely back on Earth, ensuring that, if the mission went wrong, the Soviet Union's reputation and the patriotic feelings stirred up by the Space Race would not be tarnished.

Eight years later, the USA finally pulled ahead of the Soviet Union and managed a first of its own, possibly the most famous first of all time: the landing of humans on the Moon. Nearly 650 million people watched on TV as Neil Armstrong stepped down from the Lunar Module and onto the surface of the Moon. This was the most watched event in television history when it happened, and it has stayed in the top spot for the USA ever since, with an American audience of 125-150 million.

The television fame was not reserved solely for the duration of the missions, however: the American media held interviews with the astronauts-to-be, and both nations staged great 'victory tours', with televised visits to major cities across the globe. Part of the selection process for both countries was how well the chosen astronauts would represent their country.

Yuri Gagarin was chosen over his fellow trainee astronaut Gherman Titov simply because he would look good for Russia on TV: he had a friendly smile and personality, and he was 'properly Russian', his parents having worked on a farm. Some also thought that the name Gherman sounded a little too close to 'German' so soon after the Second World War.

The seven astronauts selected to be the USA's first candidates for spaceflight were also all background-checked, and this was never more obvious than at the press conference that introduced these men to the world. The very first question asked by a reporter was what their wives and children thought about their career shift from military test pilot to trainee astronaut. Other questions covered their home addresses, their smoking habits and their church attendance. The reporters had been given press kits detailing the astronauts' previous careers, ages, siblings and parents, wives and children, and religion. What all this was trying to prove was that these astronauts were quintessentially American and could represent America well, both as the first man in space and in the inevitable worldwide fame afterwards: tours of dozens of countries, covered by every form of media, with millions turning out to see the first man in space and, later, the first men on the Moon.

However, it does not always go as well as these incredible events. One of the great tragedies of televised spaceflight was the Challenger STS-51L explosion in 1986. As the flight on which the 'Teacher in Space' programme launched and failed, this mission had much greater media coverage than the average Shuttle launch, especially among children. Schools across the USA and even internationally showed a live feed of the launch, and the teacher selected, Christa McAuliffe, was meant to teach two lessons from space, filmed on board the Shuttle. When the Shuttle exploded a minute after lift-off, it deeply affected many children nationwide. Indeed, one eighteen-year-old said, 'When the sky is that certain blue of the day of the launch, I always think of Challenger.' Astronaut recruitment understandably went down after Challenger, even though the original intent had been to inspire a new generation of space enthusiasts.

Sometimes, we do not know whether a flight will be successful or not. This happens on every flight as the craft re-enters the Earth's atmosphere: all communication with the crew is lost, and the capsule disappears from all tracking systems for around four minutes on a typical mission. Almost as if the flight were a drama written for entertainment, at this crucial stage, just as it is about to be revealed whether the mission has been a success, all contact is lost. If there is an issue with the craft's heat shield, the ground crews will only know when the crew fail to appear out the other side and it starts raining debris, as in the Columbia STS-107 mission, when damage sustained during the launch phase went undetected for the duration of the 16-day mission and led to the deaths of all seven astronauts on board.

Modern televised spaceflight is very different from the early broadcasts. The use of the International Space Station means that astronauts can stay in space for much longer, even months at a time. This means they have time for activities not prioritised by early astronauts, such as filming themselves aboard the Space Station. Many aspects of astronauts' daily life have been captured on camera, from how they exercise to how they wash. They even make some fun films: a recent one for the 2024 Olympics showed what different sports would look like in zero gravity, such as discus and weightlifting, both much easier on board the Space Station than on Earth!

Television has been a major part of spaceflight since the beginning, and it has played a huge role in giving the general public an enthusiasm for space exploration, allowing them to follow the progress of pioneering through its ups and downs. In my opinion, spaceflight would never have taken off in such a major way if not for television, and space activity rises and wanes with public interest in a way that would not be possible without televised spaceflight.

How has TV encouraged historical bias?

History is the ultimate plot: informative, gripping, romantic, bloody. It offers unending angles for epic tales, a gold mine for any storyteller. Humans yearn for both knowledge and entertainment, and both can be explored in the realms of the past. However, while the perspectives, opinions and explanations in history are multidimensional, subjective and even undefinable, TV is not. Ultimately, this subjective nature of all past events means that the opinion of the historian supplying the information to the director will inevitably be his or her interpretation, which will likely be further refined by the director into something worth watching. Here lies the problem. When watching historical TV, the individual is inevitably susceptible to the director's presentation, and thus one individual's opinion is propagated onto many others. Fundamentally, this is an issue for the widespread perception of the past, as displaying historical events or people in such a widely accessible form encourages the average viewer to project modern ideas onto times past. This further misconstrues history and affects the way we analyse and learn from it, thereby limiting its capability to liberate and challenge us academically. Potentially, historical television can become a device for spreading misinformation and prejudice rather than the entertaining, fascinating and engrossing tale we all expect and want it to be.

What is the best way to encourage a particular perspective? Simply to avoid acknowledging or propagating opposing ideas. TV, and historical TV in particular, poses a great danger to those who strive for a balanced perspective on history. It is undeniable that the complexity of history, which is widely expected to cover all dimensions of the past (romantic, economic, international, etc.), would be impossible to condense accurately and effectively on TV. However, diverse TV coverage is essential when it comes to reducing prejudice between people. The omission of the suffering and sacrifice of unfavoured groups of any kind is an insult to them and to the position they occupy in history, encouraging a false idea of progress and what it actually means. In addition to this historical imbalance for minorities who go unrecognised, TV also plays the catastrophically destructive role of encouraging inter-ethnic, international and interracial division. TV often reinforces in the mind of the viewer past events which have divided groups of people; this creates a sense of otherness between parties which is deeply unproductive moving forward and should be acknowledged and mitigated.

An example which dominates modern-day politics and prejudices, dating back millennia, is the tenuous relationship between Middle Eastern Muslims, specifically Arabs, and the West at large. The ideological differences which have arisen are due fundamentally to the history between the two sides, itself rooted in the different religions traditional to the West and the Middle East, combined with the politics and violence of the recent century from both sides. This is reinforced in the media, for example in the 2015 film American Sniper, which portrays the service of Chris Kyle fighting for the US Army in Iraq. Kyle killed 160 people in Iraq, making him the deadliest sniper in US history. In the film, Kyle uses derogatory slurs about the Iraqi fighters, referring to them as 'savages'. The film has been criticised for glorifying violence as well as encouraging anti-Islamic tendencies, and such coverage of this behaviour is particularly destructive given the audience to which this type of film appeals. Indeed, it subtly appeals to an inner sense of nationalism and in many ways normalises and inspires such behaviour in impressionable individuals who may already be harbouring these feelings. The film received six Oscar nominations, as well as achieving record box office sales for a January release in North America. Its popularity increased its visibility and therefore further encouraged the misconstructions of Islam created by viewer interpretation. Hundreds of anti-Islamic threats and criticisms were posted and spread on Twitter and Facebook, reaching Muslims worldwide, as well as anyone else who might read and, as a result, potentially sympathise with them; the American-Arab Anti-Discrimination Committee (ADC) has acknowledged and criticised these threats. Not only does this increase division between the nations in question but, by extension, between people of the ethnicities in question, when both should be looking to reconcile and progress in unison. The critical opinion of Islam in the West also taints our view of history, preventing academic level-headedness and encouraging the pursuit of negative media ecosystems which match the bias previously instilled. This creates a challenging dilemma: it is indisputable that films such as American Sniper are educational, and it is equally important not to sugar-coat the truth, but how can TV prevent discriminatory and biased accounts of events, and so prevent prejudice in viewers, when the events themselves are inherently biased? I believe the answer lies in interpretation. Viewers must learn to suspend their disbelief and appreciate the fiction of historical TV so as not to taint the reality of the event.

Can a simple documentary or series begin to grapple with the complexity of historical events within the confines of a couple of episodes? Perhaps not. When creating a film, the director must address the two factors which motivate films and define their success: entertainment and education. These two factors can often clash, as the historically accurate account is frequently both too complex for the screen and less exciting than the dramatised version a director might look to create. This often means the past is misconstrued for those who are interested in it, because ultimately the film industry is a business and, like all businesses, must make money. Therefore education is often sacrificed in the name of entertainment, an idea which I think epitomises the self-defeating side of human nature. Notably, in historical TV, history is made to occupy a non-historical rhetoric. Inherently, the very notion of an 'end', which is essential in TV, is inaccurate to apply to any historical story, as the past is a continuum. This in itself stimulates bias, as the viewer is able to satisfy their emotions and walk away at the end of the film or TV show believing that there was a 'happy ending'. However, in order to endure and make a profit, the primary goal of TV must be to entertain, as that is what has been proven to attract viewers. Through this, the nuanced meaning of history is lost, which provokes inaccuracy, leading to misunderstanding and bias.

Jill Godmilow, an acclaimed documentary maker, says, "To survive and to take public space and attention, history must borrow all kinds of structural and strategic devices from fiction to achieve 'satisfying form'". For example, in the Netflix series The Crown, education and accuracy are forgone to maximise drama and subsequent popularity, with bitter consequences for those who feature. This is a particularly notable example, as the criticism faced by the royal figures has been far more acute because the line between truth and fiction has been blurred. Viewers often believe that they are watching a documentary and are therefore far more credulous about its content. As a result, figures who are scrutinised and portrayed in a disparaging light, such as King Charles, whose infidelity and perceived inability to empathise with Diana's struggles are emphasised, have been subjected to continued media ridicule, partly due to the show's portrayal. He is often satirised as the "unfaithful prince" or the "villain" of the tragic Diana narrative. For example, Charles has been the subject of jokes in comedic TV shows and social media memes, with some making light of his perceived role in the collapse of the marriage, often painting him as emotionally stunted or apathetic. This has a disastrous impact on his functioning role as an active king, which destabilises the monarchy as a whole. Equally, our view of historical events regarding his reign and the years before it is tainted. As I alluded to previously, this would be far less problematic if the lines between fact and fiction, and between TV series and documentaries, were more distinct, and had the director chosen education over entertainment; but would anyone have watched it if they had?

Fundamentally, the director will have priorities and a resounding message which they want to convey. This message is determined by two factors: what appeals to them, and what they think will appeal to their audience; hardly groundbreaking, as ultimately they have to make money somewhere. With historical TV, however, another perspective must be considered: that of the historian. Similarly, they too will have an opinion on the topic. Very quickly, therefore, the outlook of the TV in question is greatly restricted by its creators. Consequently, the director and historian must learn to respect each other's expertise. Nevertheless, this reliance on opinions and perspectives makes the final product deeply subjective, and the 'truth' of the real history fallible.

An excellent example of the director and researcher projecting their opinions, and what they believe their audience will sympathise with, is the BBC's news broadcast on the Bengal famine. On 21st July 2020, the BBC's News at Ten included a series of reports analysing Britain's colonial legacy worldwide. The report covered the Bengal famine of 1943 and Winston Churchill, and it seemingly implicated Churchill personally in the famine, rather than the policies of Britain, excessively vilifying him while a host of parties held responsibility. The BBC later admitted that the report contradicted its editorial duty, which obliges it to report with impartiality. This illustrates that even factual reporting can be biased, in this case to avoid accusations of bigotry and of marginalising minorities. In this example, Churchill is vilified indirectly for his notoriously bigoted and xenophobic views: the report invites criticism of him by an audience who project their modern sensibilities onto the information provided. This underscores an underlying issue which accompanies the creation of all forms of historical television: it encourages the inevitable projection of modern ideas onto an event of the past. This creates a naturally warped view of the time or event in question and therefore encourages bias. A question accompanies this idea when it is distilled: can a human be blamed for the attitude which they adopted, and the subsequent deeds enacted, if that was a societal norm, encouraged and expected? Regardless of this controversial notion, however, the episode demonstrates how susceptible directors are to public opinion, even if this means misconstruing the news. This is particularly alarming as it was news reporting, something in which the audience are accustomed to placing wholehearted trust.

In conclusion, this is not a call to abolish and revolutionise all historical television, but to raise awareness and encourage vigilance about historical bias. Humans are putting ever more faith in electronic media, yet it has never been more important to remain equitable to all those you encounter, both for personal morality and the welfare of others, and to guard yourself against being vilified by backlash. It is important to remember that historical TV can be as fictional as anything else televised, and the suspension of disbelief is still recommended in order to maintain an open mind untainted by subtle prejudices. It is to be used as a device for exploring alternative perspectives, not a truth on which to base your opinion. In this way history can remain useful for future productivity and learning.

How Russian television shapes perceptions through propaganda

Esme L (Sh)

The Russian state has been using propaganda for decades. The Soviet Union used it in its prime, and modern-day Russia is following in its footsteps with a much stronger version of its forebears' techniques. Television is still the main source of news in Russia, unlike in other countries where the internet has now taken over. The average Russian watches around four hours of television per day, meaning that every day around 82 million viewers are fed the story the Kremlin wants them to hear. Propaganda reflects a vision of the world; it amplifies truths and constructs illusions so that they resonate with people's desires and fears. Russia wants to paint an image of an advanced, peace-loving country trying desperately to fend off attacks from other, evil nations who have lost their way, and through careful manipulation, censorship and scripting, the message is being relayed further and further across the country and into others.

Public support for Putin and his war in Ukraine is high. Putin has been, and still is, a genuinely popular leader, and on every channel, every day, 24/7, the story the state wants the people to hear is being played at full volume. On Channel One (the oldest and most influential channel) they might be discussing 'Ukrainian Nazis'. Shows and re-enactments feature handsome soldiers relaxing in camps, with beautiful young ladies tending to their wounds and reading out letters of support from home, while the evil, fascist, godless, terrorist, broken Ukrainians, not in their right minds, puppets of the Americans, brainwashed by the West, are bravely held back by 'our' heroes. Even the weather reporters help boost morale about the 'special military operation' by including Ukrainian cities as part of Russian territory.

Foreign affairs also take up a surprising amount of airtime, portraying the West as wicked: stories of peoples trying to overcome their oppressors, and the anti-American motif of Russian news reappearing in false rumours about U.S. intelligence and its twisted ideals. All the while, dotted in with the news, Putin's own face will appear: the great leader meeting the minister of health, discussing new houses for the soldiers returning from the special military operation, and replacing old Western technology with new, improved versions from the East. The familiar face relieves the tension – at least somebody is trying to do something about our broken world.

Propaganda has long been used to shape the memory of the masses. In Russia this has been used to cover up some of the more atrocious acts of the past and to glorify some of the more triumphant events. For example, Eisenstein made a film celebrating the October Revolution, which marked the inception of the first communist government. The film showed huge bloodshed and military prowess in the storming of the Winter Palace, but the reality was vastly different, with no bloodshed at all and a very peaceful 'storming'. Russia is not the only country to change history either: Japan, for example, has rewritten many aspects of how it acted in World War II, and the Nazi film director Leni Riefenstahl added new depths to propaganda through her films. Nor is Russia the only country using television as a form of propaganda in the modern day: China has CCTV – China Central Television – to promote the successes of Xi Jinping, North Korea uses television to showcase the 'heroism' of the people and government, and Iran portrays the West as corrupt while championing its own state. This shows how relevant this piece of technology is to the modern-day version of propaganda.

However, television is not the only source of news in Russia. Newspapers do offer other opinions, although they mainly follow the story television tells. Online, X (Twitter), Instagram and Facebook are all blocked and there are restrictions on reporting; most independent outlets are also blocked. VPNs have yet to be outlawed by the state, so most restrictions on independent media can be bypassed, but many families are either too scared to do so or so caught up in the web of lies that they don't know what or whom to believe. There are some independent outlets, such as Meduza and Mediazona, which now have to operate from other countries after being labelled foreign agents. Reporting anything other than Russian ideology is dangerous: for example, reporting 'false information' on the war, or 'special military operation', can lead to a 15-year jail sentence. Criticising government policies can lead to ten years, as can insulting state symbols like the flag, the anthem and the president.

This propaganda is not contained within Russia: both Russia and China are spending between 6 and 8 billion dollars on global media activities, and more countries are now coming under their influence. This investment has started to pay off. In Lebanon, the radio frequency that used to be controlled by BBC Arabic – until budget cuts disbanded the service in January 2023 – is now run by Sputnik Radio, an Arabic-language Russian radio station. It is also available in Syria, with the content mainly produced in Moscow by the state-owned RT Arabic. Cleverly, Russia has started to take advantage of post-colonial resentment towards Europe and has also targeted Africa and much of Latin America: like RT Arabic, there are also RT en Español and Sputnik Brazil. Niger has replaced its old ties with France with a new partnership with Russia.

Using these channels, Russia is trying to lessen support for Ukraine, promoting the idea that the war is just a fight against Western oppression and turning people against America, and there is evidence that it is working. In 2022, 26 African countries did not support a UN resolution that blamed Russia for the war in Ukraine, and many young Arabs believe that NATO and the US are to blame for the war. Even in the US and Europe itself, more evidence of Russian propaganda is emerging: using social media platforms, fake videos, news sites and AI, Russia is slowly trying to turn people against their own leaders and their decisions. Voters not only in America but throughout Europe have been targeted, their minds infiltrated with doubt, specifically about support for Ukraine.

In conclusion, Russian propaganda has greatly evolved from its Soviet origins. When the Soviet Union was in charge, its ideology prevented certain strings from being pulled: its anti-religious stance stopped superstition being exploited, and the only form of art allowed was socialist realism. Now, with these rules and others like them taken down, the government has more varied techniques of propaganda and can resonate with more and more people. The story the Kremlin wants everybody to hear is being told, and other countries are starting to listen. Russia has huge sway in the East, and due to its presence in the minds of the West it is beginning to have more power there as well. China is a huge supporter of Russia, as are Iran, Syria and Belarus; even India considers Russia a friend. Other countries are being encouraged to join the new Beijing-Moscow bloc and its ideologies. This creates a combined force fighting against the West, and it is slowly becoming more powerful.

Throughout the world, propaganda is being used to influence and sway elections, decisions, even wars. In Russia itself there are over 6,700 channels controlling public perceptions and monitoring the information given. There is also Russian state-owned television in around 40 other countries, so while television might take second place to radio and other media outside Russia, it is still a very relevant form of propaganda for the Kremlin, particularly for those who don't use social media. The media is currently not focusing enough on the threat this misinformation poses to our lives. This leaves us wondering what costs will have to be paid before the Russian people, and everybody else, discover the extent of the propaganda and the truth behind the lies.

Could Logie Baird have anticipated the take-over of television by advertising?

Television, which the late broadcaster David Frost once said 'permits you to be entertained in your living room by people you wouldn't have in your home', has successfully spread across the entire modern world. From its conception in 1926 by the Scottish innovator and electrical engineer John Logie Baird, through developments including the change to HD and UHD, television has spread widely, occupying a space in around 1.7 billion homes globally. From the late 1950s, advertising has gone hand in hand with the business of television. But, however important to television finance, has advertising had a positive or negative impact on the possible uses of television?

On the night of January 26th, 1926, in Soho, London, John Logie Baird successfully demonstrated the first ever working television. It operated using a spinning disc which broke images down into lines sent to a receiver, which could then recreate the image as a picture. The BBC (British Broadcasting Corporation), a popular radio broadcaster, was the first company willing, albeit reluctantly, to give Baird's 'problematic' television a chance, believing it could be the successor to radio. After a series of 'experimental broadcasts'¹ beginning in 1929, the BBC decided that Baird's mechanical system was approaching the limit of its potential, being able to scan only thirty lines, which made the picture barely visible. This led to the BBC's adoption of Marconi-EMI's electronic model, with regular broadcasts starting by 1936. At the time the BBC was the sole television broadcaster in Britain. Its daily programmes were financed by government spending, the BBC having been declared a public corporation in 1927², with the licence fee (a fee upon every household which owns a television) being put in place in 1946. Over the years, televisions have become far more technologically advanced, as shown in the change from black-and-white imaging to colour displays, and from mechanical television to cable and satellite TV. This change and innovation have evolved side by side with the development of a competitive industry, comprising over 480 programmes and 112 streaming services nationally, watched by around 27.4 million households daily in the United Kingdom alone.

In the present day, one could easily forget how television was viewed upon its conception, especially given the various uses which the modern TV now offers. Logie Baird's vision of the uses of television was fairly fixed and simple: that true television would merely transmit images with all the detail of the real world around it, truly encompassing reality. It was for others to see the wider potential of television. The head of the BBC at the time, Lord Reith, held his own views on its purpose, captured in his slogan 'inform, educate and entertain'. These views decided the direction in which the BBC would navigate its broadcasting, intent upon educating the masses through programmes, which Lord Reith saw as a 'public service'. As television increased in popularity, gaining viewers as well as commercial programmes, channels in need of money to finance programming were willing to sell time within their schedules to companies keen to market their goods to viewers. The idea of advertisement through television was first applied in Britain in 1955, when Gibbs SR Toothpaste aired its first commercial on the newly created ITV, initiating the role of advertising in British television.

Advertising quickly became a central part of television. Companies invested increasingly large amounts of money in advertising on television, reducing the amount spent on newspaper advertising. This shift occurred because television allowed companies to showcase ads before a captive audience, in between their favourite shows. Companies were able to make short clips capable of enticing the audience towards their product. This kickstarted competition between companies advertising on television, which drove them to create higher-quality adverts and increased what they were willing to pay for slots, generating an entire industry devoted to designing and filming more effective advertisements. Television advertising is effective because it combines engaging visuals with a clear message, whether through slogans or musical jingles, creating a multi-sensory experience. Ads can also be tailored to resonate with the specific demographics most likely to find interest in the product, through the time of day when the ad is aired and the type of programme in which it appears. Storytelling throughout the ad, as well as the emotional connection or nostalgia easily achieved on screen, makes an advertisement more compelling. Furthermore, ads can be positioned within films or series through product placement, where specific products are placed within the set of a film, easily identified by viewers, giving the brand a positive image.

Does advertising have a positive or negative effect on the experience of television? Advertising at its best provides the viewer with useful information, perhaps helpful in the future. Ads also give viewers choice when considering various products, providing product transparency, building trust, and giving consumers the opportunity to find their desired item. This can be seen through sites such as Go.Compare, an insurance comparison company well known mainly for its advertising prowess, which gives consumers knowledge of a site able to help them make an informed choice about insurance. As advertising makes people aware of numerous options and highlights unique selling points, competition between companies is increased, resulting in lower prices for consumers. For example, competition between energy companies helps to reduce fuel bills, as shown in the competition between Shell and Exxon Mobil in 2024, which brought unleaded fuel prices in the US down to around three dollars per gallon, the lowest point in the year.

Lastly, advertising can inform viewers about pharmaceutical drugs as well as health care and fitness. Advertising for pharmaceutical drugs can provide viewers with genuinely useful information. This can be seen in the widely advertised anti-obesity drug Wegovy, which was the subject of a large TV advertising campaign during 2024, boosting its annual sales by 26 per cent from 2023. Fitness brands such as Peloton, a fitness-tech company which makes stationary bikes, advertise widely, increasing viewers' awareness of the importance of fitness and encouraging a healthier lifestyle, to the benefit of both viewer and company. In its Christmas advertisement of 2020³, Peloton attempted to showcase the importance of daily exercise, made possible by its bikes throughout the working day, as well as the joy and addictive nature which sport can offer. This shows the unique possibility which television advertising offers: marketing for companies in front of a large viewer base, alongside information on health and wellbeing communicated across the screen, to the benefit of viewers.

However, not all advertisements are beneficial. Television advertising has the potential to manipulate the views and expectations of viewers, and to market products harmful to consumers. Gambling advertising, although reduced by Ofcom regulation under the Gambling Act of 2005⁴, is still able to air on television despite its potential to set unrealistic expectations of winning large sums. In 2020 the House of Lords committee on the Social and Economic Impact of the Gambling Industry reported that the industry spends over £1.5 billion per year on advertising⁵, mainly focused on television. This advertising can have negative effects on viewers, with numerous 'problem gamblers' reporting that gambling adverts created a greater urge to gamble, led them to spend more money than intended and encouraged them to believe that they could win. Additionally, in the past the advertising of cigarettes on TV resulted in the deaths of millions of people, with tobacco companies spending billions on advertising, particularly targeting the perceptions and attitudes of adolescents towards smoking. Advertisements are also usually inserted within programmes or films, which can disrupt the flow of the viewer's experience and arguably ruin the creative art of film-making. The only exception is sport, where live events are left undisturbed while the surrounding programming is broken up by a multitude of ads.

In conclusion, I believe that advertising on television scars the creativity of programme production, ruining the viewing experience. In recent years, however, television advertising revenue has dropped drastically, from £4.9 billion in 2022 to £3.9 billion in 2023. This can be accounted for by the dramatic increase in social media viewership, driven by ads from influencers. Unfortunately, most channels rely upon advertising revenue; however, I believe that strict regulation of advertisements can reduce their negative impacts and maintain channel credibility. Just as advertising on television replaced advertising in newspapers, so advertising on social media is now usurping that on television.

¹ BBC

² BBC

³ The Atlantic

⁴ Gambling Commission

⁵ The House of Commons Library

How are audio stimuli used in television marketing and advertising?

Max D T (Sh)

Advertising and marketing have always been a big part of television, from the first ever paid advert on July 1, 1941, in the United States to the present day, where advertisements for brands can be seen all over television and popular services such as YouTube and Disney Plus. Such adverts and services often come with a catchy jingle, such as the Netflix 'tudum' sound or the McDonald's 'ba da ba ba ba'. These sounds can create subconscious effects that are crucial for brand recognition and recollection.

One of the reasons jingles like these can be so effective is classical conditioning, discovered by Ivan Pavlov, who published his findings in 1897. He was researching the digestive system of dogs when he noticed that they would begin to salivate when they saw the white lab coat of the assistant who brought them food. He realised that the dogs would salivate when they saw the coat even if no food was present, and speculated that this was because the dogs expected food whenever they saw the coat. He then devised an experiment to test whether the dogs could be taught to salivate when exposed to a specific stimulus.

Different stimuli used in the experiment

Pavlov’s experiment relied on several different stimuli:

• Neutral stimulus: Pavlov used a metronome, which did not provoke any reaction from the dogs before the conditioning.

• Unconditioned stimulus: The food caused the dogs to salivate as their bodies were preparing for digestion, something that was not conditioned and caused an unconditioned response, which was salivation in this case.

• Conditioned stimulus: During the experiment, Pavlov played the sound of the metronome in combination with the presence of the food, until the food was removed, and the metronome became the conditioned stimulus. Whenever the metronome was played, the dogs would salivate as they had been conditioned to expect food when they heard the metronome, and salivation became a conditioned response.

• Pavlov also found that the conditioned stimulus would lose its effect if the dogs were repeatedly exposed to it without the unconditioned stimulus. This meant that the stimulus had to be used in moderation if its intended effect was to be preserved (a simple numerical sketch of this acquisition-and-extinction pattern follows this list).
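
To make the pattern concrete, here is a minimal sketch in Python of acquisition and extinction. The simple error-correction learning rule is an illustrative assumption on my part: Pavlov described the behaviour, not an equation, so treat this as a toy model rather than his method.

```python
# Toy model of classical conditioning: association strength rises while
# the metronome is paired with food, and falls again when it sounds alone.

def update(strength, food_present, rate=0.3):
    """Nudge association strength toward 1 when the metronome is paired
    with food, and back toward 0 when the metronome sounds alone."""
    target = 1.0 if food_present else 0.0
    return strength + rate * (target - strength)

strength = 0.0  # the metronome starts as a neutral stimulus

for _ in range(10):  # conditioning trials: metronome paired with food
    strength = update(strength, food_present=True)
print(f"After pairing with food: association strength = {strength:.2f}")

for _ in range(10):  # extinction trials: metronome alone, no food
    strength = update(strength, food_present=False)
print(f"After metronome alone:   association strength = {strength:.2f}")
```

Run as written, the association strength climbs to about 0.97 over ten paired trials and decays to about 0.03 over ten unpaired ones, mirroring the acquisition and extinction Pavlov observed.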

Classical conditioning’s use in advertisement

Classical conditioning is very useful in adverts and marketing as it can evoke an emotion such as excitement or happiness, thanks to sounds being associated with appealing visuals or services. This helps to make adverts memorable and engaging, making buyers more interested in products and services. In marketing, classical conditioning can be used to help build a brand and make the company memorable.

Netflix’s ‘tudum’ sound is effective because it is short and recognisable, and it makes the viewer want to watch and engage in their programme through conditioning. The sound helps build up the program and excites viewers. These sounds are so effective because they are used sparingly, meaning that the conditioning doesn’t become less effective. In a study taken in late June, USA, during a period of COVID-19, It was found that Netflix’s ‘tudum’ sound was less emotionally impactive due to overexposure to it as participants had watched a lot of Netflix during the pandemic. This is why they are often used at the end of the advert or upon opening the service, building up the anticipation of the viewer and creating the intended effect without reducing the effectiveness of the conditioning.

Auditory stimuli and how they evoke emotions

The sounds used in adverts are also carefully chosen with the intent of instilling specific emotions in the consumer. The sound's duration, pitch and volume are all taken into consideration. The aforementioned 'tudum' is short and low in pitch, making it sound important and building suspense, while also being instantly engaging and drawing the viewer in. Netflix has cleverly exploited its unique sonic branding, even naming its fan event TUDUM to help build an identity for the platform. In advertisements, catchy jingles can be employed to give the product or company recognition and make the advert, and by extension the product, more memorable. The concept of drawing attention with sound is also used in iPhones: the ding heard when a message comes through is designed to catch the owner's attention so that they respond to the message immediately.

Sound is a universal language, instantly recognisable across many cultures in the form of short, simple jingles or sounds, whereas with spoken language some of the subtle meaning carried by words' connotations is often lost in translation. Furthermore, studies show that more complex melodic patterns create catchier, more memorable tunes than simple flat tones, hence the use of jingles and short songs to build a brand. It also helps for jingles to be authentic and to contain the name of the brand they advertise, as this boosts memorability.

Adaptability is also crucial, as brands that never change their sonic logos can suffer as consumers become overexposed to them. The sounds must also be appropriate for the brand, like the sonic logo of the Alzheimer's Association, which is sombre but hopeful, appropriate for an organisation that helps patients with a terminal disease.

To conclude, audio stimuli are used in television advertising and marketing to evoke emotions in the consumer and to help build a sense of brand identity. Sound is important not just because of the way it conveys emotions, but also because it improves a brand's memorability. Audio brands also condition the consumer to react to them in a certain way through Pavlovian conditioning, with gradual exposure alongside an unconditioned stimulus building the effect of the conditioned stimulus over time. In modern advertising, it is becoming ever harder to make an advert impactful and memorable, highlighting the importance of audio branding in helping adverts stand out in the market.

Subliminal Advertising in TV

Sophia O (Sh)

How our subconscious can be influenced by commercial companies through adverts we don't realise we have seen.

On September 12th, 1957, a market researcher in New York called a press conference to announce the findings of his recent study. James Vicary astonished the assembled audience by claiming that when he flashed the slogans "Drink Coca-Cola" and "Eat popcorn" during a cinema screening, sales of Coca-Cola rose by 57.7% and sales of popcorn rose by 18.1%. This was called subliminal advertising. The word subliminal comes from the Latin sub and limen, literally meaning 'below the threshold', in this case the threshold of conscious awareness. While Vicary thought that imperceptible flashes could replace the barrage of unengaging adverts before a film, and in far more settings besides, the idea of having your thoughts and even actions influenced without your knowledge disturbed the public all over the globe. "Welcome", cried one American magazine, "to 1984". The scare caused the technique to be banned in many countries before commercial use could spread widely: it was banned in the UK in 1958 but, shockingly, never in the USA. The anxiety and disturbance that followed the 1957 panic owed much to the social and political climate of the time. In this article I will discuss whether subliminal advertising is actually effective, why the political climate of the time made the idea of subconscious influence so terrifying, and if and how it is used in modern TV and TV adverts.

Consider the flashed words "Drink Coca-Cola": seeing them can't make you thirsty. Nevertheless, if you are already thirsty, the instantaneous flash plants the idea of Coke in your mind, and the drink might then appeal more. Studies have more recently appeared questioning how effective subliminal advertising really is. Most researchers agree that Vicary's study lacked scientific evidence, but his idea was not entirely wrong. All agree, though, that the effects are highly context-dependent, depending, for example, on whether you are receptive to the idea (thirsty, hungry, engaged with the screen) or aware of the advertising (which renders the technique useless), among many other factors. In summary, subliminal advertising is perhaps not as effective as Vicary said it was, and to execute subliminal adverts successfully and commercially on TV and in cinemas would be devilishly tricky and fiercely controversial. The potential advantages, however, are undeniable: adverts could be instant, effective and cheaper, and cinemas and at-home television could be unimpeded by ads disrupting the viewing experience.

Why, though, did the climate of the 1950s and '60s mean that the idea of subliminal advertising was so quickly shut down? The political climate of the time was shaped by the Cold War, with an intense fear of propaganda and mind control. People feared hidden influences like subliminal advertising because they felt their thoughts were being manipulated, and the fear of Communist infiltration made Americans especially suspicious of covert propaganda, so that subliminal advertising felt like a threat to free thought.

Science fiction, from the television series The Twilight Zone to earlier sci-fi novels, often centred on themes of mind control and dystopia. George Orwell's most famous dystopia, 1984, reflected and amplified fears of total domination of the mind: themes like 'Big Brother', the stripping away of free thought and individuality, and the control of even subconscious thought resonated with the cultural anxieties of the time. Moreover, a fascination with psychology became apparent in literature: books like The Hidden Persuaders (1957) explored how advertisers used psychological techniques to influence consumers.

The rise of TV itself was also fairly recent, and people were just learning how it could shape opinions and popular culture. The idea that subliminal advertising could make TV unsafe for the freedom of their thoughts certainly contributed to the fear and panic of the time, which led to the technique being banned in many countries, including the UK, Australia and Canada. Surprisingly, it was never banned in America, where the panic was most concentrated. Nevertheless, FCC regulation, official discouragement and public hostility make it almost impossible to use subliminal adverts, or execute them properly, in America.

The scare has been recognised in more modern pop culture. The Simpsons released an episode in 2001 in which Lisa discovers that her favourite music video, with its catchy chorus "Yvan Eht Nioj", is recruiting for the Navy, using subliminal flashes of Uncle Sam and the words "Join the Navy" chanted backwards. Comedic scenes ensue in which Lisa sees her friend joining the Navy after watching the music video and confronts the man behind it. John Cleese partnered with Schweppes on a series of comedic adverts, in one of which he loudly dismisses the idea of subliminal advertising while flashes of Schweppes cans appear on his shoes and around the room, and a can sits in his pocket. At the end he boldly tells the viewer to "try pouring yourself a glass of the first nonalcoholic sparkling beverage that comes to mind".

Although the idea has been recognised in comedy and culture, is it used in modern TV? Clearly the 1950s was not a good time for subliminal advertising to be discovered, given the public hostility and the perhaps insufficient technology to execute it. If the concept were reintroduced nowadays, how would it be received? Perhaps, with AI's rapid development, subliminal influence would seem less of an ordeal, its consequences uninteresting in comparison with AI's. It could easily be implemented in today's television, but that takes us back to the question of how successful it is and whether it remains so context-dependent.

To conclude, subliminal advertising caused an enormous scare in the 1950s, but was it worth the panic? This type of flash advertising can be effective but is highly context-dependent, although with today's technology it could probably be executed to greater effect. Did this technique pose a threat to TV viewers and freedom of thought, or is it a viable advertising technique? I think the panic was out of proportion to the limited effectiveness of the adverts, inflamed by the politics of the period. Subliminal advertising should remain restricted and banned around the world, as it is not a justifiable technique, given its limited effect and the understandable controversy it causes.

How has the TV industry been financed?

Cosimo T (L6)

Why is this an important question?

Before streaming, subscriptions and TV licences, all you needed to access TV channels was an antenna on your TV that could receive signals. While convenient and cheap for consumers, this meant there was very little incentive to produce films or TV shows of the quality we see today, as opportunities for profit were very low, indeed nonexistent once you consider the costs of producing well-known, high-quality films. This gave rise to the question of how the television industry should be financed so that a profit could realistically be made.

Come the end of World War II, in 1946, the BBC had an interesting and unique way of bringing in revenue: the TV licence, an annual payment of £2 (around £110 in today's money) that gave households access to British television shows, news and radio from the BBC. As of 2023, the price of a BBC TV licence is the fourth highest in Europe, behind only Germany, Austria and Switzerland, the last of which has the most expensive TV licence at 333 euros. Only 65% of the BBC's revenue comes from TV licences; the other 35% comes from grants and royalties. Even so, the BBC's total income is just shy of £5.75 billion. This shows the effectiveness of the TV licence system in national television, where almost the entire country pays the licence fee.
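
As a quick check (my arithmetic, using the essay's own 65/35 split of the roughly £5.75 billion total), the breakdown works out to approximately:

\[
0.65 \times \text{£5.75 bn} \approx \text{£3.74 bn (licence fees)}, \qquad
0.35 \times \text{£5.75 bn} \approx \text{£2.01 bn (grants and royalties)}
\]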

What are the benefits of the TV licence system?

Since almost everybody in the UK has a TV licence, the BBC being so well known and relied upon as a source of information, the BBC is guaranteed revenue and thus doesn't need to worry about going bankrupt. The TV licence is also good for the consumer: while the licence fee money goes to the government in order to finance the BBC, it is illegal for those funds to be allocated elsewhere within the government budget, preventing the TV licence from essentially becoming just another tax and keeping the BBC's revenue within the BBC.

The TV licence system also removes the need for advertisements on the BBC, which not only benefits the consumer by removing annoying ads but also frees a national broadcaster from reliance on private companies, allowing the BBC to be unbiased and independent. These features of the TV licence all help to make the BBC what it is and has been for the last 80 years, and they are clearly efficient for a national television service.

What are the disadvantages of the TV licence system?

The main problem with the TV licence system is that it is very easy simply not to pay it. Every time you log onto the BBC it asks whether you own a TV licence, but it does not require any proof that you do. This seems like it would be a huge issue; however, it is widely blown out of proportion. The average Londoner believes that 28% of people evade the TV licence, and 1 in 5 Scottish people believe that 50% of people are incorrectly licensed for their TVs, yet a study has shown that only 6% of people using the BBC are incorrectly licensed, which shows that the problem is really not as great as it might seem. Not paying your TV licence can in fact lead to criminal prosecution and fines of up to £1,000.

In recent years, streaming has become a lot more popular. This is of course not funded by a TV licence but by a monthly subscription fee, which gives you access to the wide range of films that these streaming platforms carry.

What are the advantages and disadvantages of streaming and subscriptions?

The main thing streaming has going for it is its convenience: it offers a lot of films all in one place. This removes the need for lots of subscriptions, or for paying for lots of channels or a licence fee, as long as you choose a streaming service carefully. The absence of advertisements is another important reason why more and more people are choosing streaming. Most streaming services also have apps, which means that you can watch TV on a mobile device, enhancing the convenience factor.

Streaming has a very short list of disadvantages, but a very notable one that doesn't really affect other TV options is that it is completely dependent on your Wi-Fi speed. This can make it a poor or unreliable option for people with slow Wi-Fi, as constant buffering and weak picture quality can ruin any film. However, as time moves on and good Wi-Fi becomes cheaper and more widely available, this becomes less and less of an issue, making streaming a clearly effective method of bringing in revenue for TV companies.

Pay Per View

Pay per view is an incredibly popular method of bringing in revenue for televised sporting events, from boxing to football. It is often utilised by TV companies such as Sky to bring in extra revenue at very little expense to the company. Pay per view is excellent in that it gives consumers the option to tune into only the sporting events they want to see, without purchasing another streaming service just for sports. However, if not enough people want to watch an event, the TV company may make a loss, so it is important to televise only big events. This in turn means that fans of smaller teams can't watch their team play without paying for a subscription, making pay per view inefficient for them.

Advertisements

Advertisements are also a common method of financing TV, though they are often used in conjunction with another method. For example, Netflix offers a subscription that costs half as much as the standard subscription but comes with ads. This is a great system for TV companies, as it allows them to bring in significantly more revenue; the larger your viewership, the more you can charge for advertisements, and thus the more revenue can be brought in. However, a problem with advertising is that it is widely disliked by consumers, who may even avoid a TV service that runs too many ads. Also, if the company advertising is not highly regarded in the public eye, people may complain until the advertisement is removed, which loses money for the TV company.

Conclusion

It is important to consider how to finance the TV industry because, before all of these opportunities to make money existed, TV was not about entertainment as we know it today, and without these opportunities for profit TV would likely never have made it to where it is now. It is hard to pick a definitive best method: something like the TV licence works brilliantly for the BBC but would fare worse with a smaller service, and pay per view would likely not be a great system for watching shows, as it would be too expensive. The important thing is that these methods all manage to bring in profit so that TV can evolve and grow as an industry.

TikTok Brain: Three reasons to worry about the effect short-form videos have on our brains

In this essay I will be exploring the ways in which TikTok’s short-form videos are negatively affecting users’ brains, and why we should be worried about them. The effects I will be going through are attention span, focus, and addiction.

TikTok was the first major app and platform to popularise short-form videos. Short-form videos tend to be under 90 seconds in length and were introduced because this length was seen as optimal for user engagement. The format quickly spread to other apps, with Snapchat creating Spotlight and Instagram creating Reels. The reason other apps followed in TikTok's footsteps was the increasing popularity of these videos, with the average TikTok user spending 95 minutes a day on the app scrolling through them. Considering that an average TikTok video is roughly 35 seconds long, this means most users are watching over 160 videos every day, showing the alarming addictiveness of these new short-form videos. As many people failed to predict, short-form videos have drastic consequences for our brains and our ability to learn.
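
That figure checks out with quick arithmetic on the essay's own estimates (95 minutes per day, roughly 35 seconds per video):

\[
\frac{95 \times 60\ \text{seconds}}{35\ \text{seconds per video}} \approx 163\ \text{videos per day}
\]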

First, short-form videos across all apps are reducing the younger generation's attention spans. There are many reasons for this: for example, because the videos are shorter, the dopamine supplied by their content arrives faster and more frequently. As our brains and bodies get used to this supply, we come to rely on it not just when watching TikToks but in day-to-day tasks such as watching films and reading. In daily life I have noticed a decline in the number of books read by family and friends, with many people giving up midway through a book because they are not receiving the fast gratification (dopamine) they are used to from short-form video. Overall, this explains how TikTok and its short-form videos are shortening our attention spans unless we are constantly receiving dopamine.

Adding to this stunted attention span, TikTok is designed so that new content is continually available. Even when a video fails to deliver the instant gratification we are used to, we can swiftly swipe on to new content that will supply it. This accelerates the decline of our attention spans.

The reason we should be worried is that whole generations of TikTok users - over 25% of whom are aged 25 or below and therefore still have developing brains - will not only find it hard to take part in prolonged activities, but may also struggle to learn and to develop their knowledge because of this lack of interest and attention.

Secondly, and similar to the lack of attention explained above, TikTok's short-form videos are affecting users' focus. Users get used to the instant supply of dopamine from these videos and come to rely on it, yet the dopamine we receive from real-life experiences arrives more slowly. When performing a daily task, people therefore get bored and lose focus, because its slow release of dopamine is not constantly hooking them. For example, as films tend to be well over an hour long, they do not supply dopamine as fast as TikTok videos do, inevitably boring people whose brains are not used to receiving gratification so slowly.

The reasons we should be worried about this loss of focus are similar to those for attention: many users are young and, in a lot of cases, still studying at school or university, and they struggle to take in and retain information once they have lost focus on the task at hand.

The last effect I will look at is TikTok's short-form videos leading users to become addicted to their phones. TikTok causes this addiction because users become so reliant on the instant dopamine they receive that they constantly feel they need it; at its extreme this is called nomophobia, the fear of being without your phone. This addiction brings many problems. For example, people (specifically teenagers) who are addicted to TikTok are more likely to experience symptoms of anxiety, depression, and low self-esteem. All of these consequences are becoming increasingly common in today's society, showing the huge effect short-form videos have had on the world, and on younger generations in particular.

In conclusion, the world should be extremely worried about the growing impact of TikTok and its short-form videos, particularly on younger generations. The prefrontal cortex (the section of the brain that controls directed, prolonged attention) does not fully develop until around age 25, so users below this age are training their brains to get by without prolonged attention while that very region is still growing. This means they are not developing this part of their brain, causing long-term effects that may permanently reduce their capacity for directed attention. To stop this, TikTok and similar social media apps should clamp down on their age restrictions: children and teenagers below the age limits are not currently stopped and can easily access these apps, which is one of the main reasons they are so commonly affected by these videos.

Has the growth of streaming services had a positive or negative impact on people?

F (Re)

Streaming services have become increasingly popular in the last decade, with both positive and negative impacts on society. As television and streaming services are constantly being developed and changed, it is important to understand the impact they could have on us.

The positives

Streaming services have a number of benefits that have helped people in various ways. One example is more flexible viewing times, which mean people no longer miss out on their favourite shows. Cable television, or traditional broadcasting, has fixed times at which a specific show can be watched each day or week. With streaming platforms, people can watch shows whenever they like, so they do not have to miss out if they are unavailable at a specific time. They can catch up and still join in conversations about a particular show without feeling left out or excluded. These conversations frequently take place on social media, creating a community that makes people feel part of something: there will always be someone who shares their interest and someone to talk with, even if it is just about a television show.

Another benefit is that with multiple streaming platforms such as Netflix, Hulu, Disney+ and more, we can access more content and a larger variety of genres. This gives people choice and diversity in the options available to them. There is a clear benefit to such variety: people can pick what they would like to watch depending on how they are feeling, or coordinate with friends and enjoy something that may never have been an option on traditional television.

An additional advantage of streaming services is being able to access shows on multiple devices. This allows for more convenient viewing: people can watch on a smartphone, tablet or laptop wherever they are, so they need not feel excluded if their friends have already watched something. Global accessibility has also improved, making it easier to access television and films from different cultures and in different languages.

Streaming services also use algorithms to suggest what you should watch next based on your previous viewing habits. This is helpful when trying to find a show or movie that you know you will like, or that is similar to something you have already watched, and it makes streaming services more appealing and accessible than traditional TV.

The negatives

As helpful as streaming services are, they also have negative impacts on the people who use them, and it is important to address these when comparing traditional television and streaming. Flexible watching times have increased binge-watching: watching multiple episodes, or an entire season or more, of a show in a single sitting. Binge-watching increases screen time and can have negative effects on people's mental health. It also means that people at different points in a TV show are unable to talk about it together, creating less of a community among viewers because they cannot share the experience with each other.

Another negative is the sheer number of platforms. Variety can be a positive, but TV shows and movies are now spread across many different streaming platforms, each costing a significant amount per month, so subscribing to everything you need in order to watch all the content you want becomes incredibly expensive.

One of the benefits of streaming is that you can watch on multiple devices wherever you are, but there are negatives to this too. Watching TV in a room full of people is more social than watching alone on your own device, and I think streaming services give people an opportunity to be less social and interact less, which may ultimately harm people's social skills, especially for younger children.

Another negative effect is that, even though there are social media and online platforms for discussing a show or movie, people who are not watching together may find no one at the same point to relate to, or may have the story spoiled for them. There is also less chance for people to talk in real life about what they have all watched. With traditional TV, people can talk the next day about the episode they all saw at the same time, because there is no way to watch further episodes early. With streaming, people can watch as much as they want, so they are likely to be at different points in a show from everyone else.

Conclusion

Streaming services are beneficial in certain ways, such as flexible timings, the range of places you can watch, the algorithms that help you decide what to watch, and the variety of platforms providing entertainment. However, they also have a number of negative effects: an increase in binge-watching, the cost of subscribing to all these different platforms, the loss of the social time that traditional TV used to provide, and the fact that you no longer share the same experience of talking about a show because you are most likely at different points in it.

Whether for better or worse, streaming services are taking over and becoming ever more popular. As of June 2024, an estimated 83% of households in the United States had at least one streaming subscription, a huge increase from 52% in 2015. Streaming is becoming the new way to watch entertainment and, even with its negative impacts, it is quickly becoming the easiest way to watch TV shows and movies.

How the shift from broadcasting to streaming services has impacted society

This essay will focus on the positives and negatives of the shift from broadcast TV to streaming services like Netflix or Disney+. It will examine how streaming services have disrupted traditional TV, changed viewing habits and affected people's daily lives, for good and for bad.

The accessibility of streaming services is remarkable. You can watch television whenever you would like and do not need to wait, as it does not follow a fixed schedule the way broadcasters do. You can watch a whole series in one go (not that this is a good thing), and the power to 'binge-watch' creates an immersive experience. Disney and Netflix assess your viewing and personalise your suggested films (whilst using your data), which makes deciding what to watch quick and easy. Broadcasting companies have a set schedule and do not give you the power to choose a particular show on demand. Both streaming platforms and traditional television offer engaging educational content, with documentaries providing a window onto nature and journalism; documentaries are among the most trustworthy educational content that streaming services offer.

Although there are positives to streaming platforms, there are also many negatives. Viewing figures for broadcast shows and daily news have been in steady decline for the last twenty years, beginning around the time the streaming platforms were launched: Netflix in 2007, Hulu in 2008 and Disney+ only in 2019. Six million people watch the BBC, only 11% of the UK population, compared with the 68.8% who have a streaming subscription; on top of that, almost one in four houses in the UK (just over seventeen million) has a Netflix subscription. The biggest shows now go to streaming services first, and broadcasters are losing their notable and popular clients. As a result, broadcasting companies make less profit and lose money while streaming services gain profit and new subscriptions. The decline of traditional television has led to job losses in broadcasting and journalism as these industries generate less revenue, and that unemployment directly affects the economy.

A huge downfall for streaming services and social media is news integrity. With most teenagers getting their information about the world from social media, its reliability and integrity are questionable. Although social media does have an educational aspect, which is great, misinformation can go undetected, making it appear fact-checked and valid when more often than not it is not accurate at all. These private companies, unlike national broadcasters, lack evidence behind their statements, and they can spread unsupported false information around the world very quickly. It is better to view the news on the BBC or other notable companies that specialise in broadcasting. For a lot of people, news does not appear on their feeds at all, which reduces global awareness and allows misinformation to spread through society. People start to believe that what they watch on television is real. Television also has biases and often supports only one side of an argument, and films can give people unconscious biases and influence their view of society, whether for good or bad.

Because streaming services are so accessible, the ability to binge-watch increases and addiction to social media and television spirals. On a service like Netflix there is no block or resistance to watching a show, whereas with live television there are adverts and often only one episode released each week. Although this is seen as annoying, it is good for people's patience and gives them something to look forward to the following week; on Netflix, by contrast, the whole series is exposed to you at once, allowing people to watch all of it, or large quantities of it, at a time. Binge-watching and endless scrolling keep people up at night and can lead to sleep difficulties and insomnia, and a lack of sleep increases anxiety and depression; addiction to television and social media is common among teenagers. These factors result in a decline in people's mental health. Unhealthy binge-watching also supports a sedentary lifestyle, encouraging long hours without exercise, which leads to weight gain and further health problems if people overindulge in streaming services.

In our parents' and grandparents' generations there were only two or three channels, with a couple of shows on each. People were very active and had to be creative with their entertainment and activities, which is healthy. Nowadays live television covers sport and politics, which is beneficial for everyone: watching sport, for example, can inspire children and young people to pursue a career in sport or to stay healthy, and if students watch political coverage, such as debates about the war between Russia and Ukraine, they might develop an interest in politics or in the history behind the ongoing conflict.

Overall, the change from traditional television to streaming platforms has had an enormous impact on most of the world's population. Streaming platforms have made watching television extremely accessible, and the personalised viewing experience suggests the sort of things you would like to watch, compared with the more limited selection on traditional television. Streaming platforms provide an engaging, appealing experience, and the technology behind them is fascinating. However, their rise has reduced the number of people watching traditional television, such as the news. This causes unemployment, the closure of businesses, a loss of news integrity and the spread of false information. Traditional television has trusted companies to educate the world on politics, sport, history, global news and everyday events, and it also supports a healthier life: the ability to binge-watch is dangerous and can cause multiple health issues, including insomnia and obesity from a sedentary lifestyle, whereas a smaller number of options inspires creativity and outdoor activity. The shift from traditional television to streaming platforms like Netflix has been immense and has brought many challenges for the world.

Binge-watching TV: is it actually beneficial?

Lukas M (L6)

Binge-watching is defined as 'the practice of watching multiple episodes of a television programme in rapid succession', and this so-called practice is becoming ever more prevalent in the modern era with the rise of streaming platforms such as Netflix. These streaming interfaces now come in a myriad of forms and in copious amounts, exposing the general public to almost any form of content and inevitably making binge-watching easy to access and easy to fall into; hence its unsurprising growth. The long-standing claim that binge-watching is detrimental to your mental health has often been discussed, but is it actually true? In this article the sole aim is to unpick and evaluate this claim, looking at it from psychological, social and health perspectives.

The appeal of binge-watching is almost universal in the sense that it is so easy to do and allows for instant gratification. Previously, people had to wait a week or more between episodes, whereas in the modern era shows are queued up so episodes can be watched back-to-back. Such a setup also allows people to engage with the storyline on a deeper level and invest more in the show, making it a more immersive experience than waiting a week and inducing a stronger emotional response, which ultimately makes it a more appealing way of watching TV. There are numerous other reasons for its appeal, such as convenience, and the fact that shows serve as talking points with others, making people feel obliged to watch for the social benefits. Having looked at why binge-watching is so appealing, it is also easy to see why so many people are against the practice, not least because of the time it consumes.

Now, when diving into the actual evaluation of this custom, the psychological benefits must be assessed first. For starters, binge-watching can be a form of stress relief: fictional worlds offer a kind of escape, meaning people can detach from the anxieties of the world and immerse themselves in a new reality. Binge-watching has in some cases also been seen as a substitute for other addictions, since it releases dopamine in the reward centre of the brain, making it a healthier alternative for addicts. Building on this, the release of dopamine enhances people's moods, resulting in a more productive workforce and general population, which everyone can agree is beneficial for society. Continuing the emotional analysis, binge-watching could also be a form of emotional catharsis, where the release of various emotions while watching a show helps people to process their own pent-up feelings, linking back to the stress-relief theme. Nor does binge-watching affect only emotions: engaging the brain in complex plots, character development and narratives instigates analytical and critical thinking, which arguably improves day-to-day cognitive life.

However, whilst all this is true, there are compelling arguments that binge-watching TV has many psychological negatives. For starters, although there may be positive associations with addiction as highlighted above, some take the view that binge-watching can itself lead to addiction. The constant, instant gratification can make stopping awfully difficult, and this lack of self-regulation could develop into an unhealthy relationship with other addictive tendencies and substances; a radical view might even argue that binge-watching is a gateway to worse and unhealthier addictions. Furthermore, as mentioned above, bingeing TV can be a great way to escape real-world problems, but at what point does escapism become avoidance? This can also contribute to procrastination and anxiety. Finally, sitting around watching television for hours may make people less productive, as it can disrupt sleep and bring on fatigue.

In a social setting, the practice has both benefits and drawbacks. Looking at the benefits first, as noted earlier in this article, TV as a talking point for shared interests can allow friendships and relationships to thrive and act as a strong foundation on which they can be built. Furthermore, if engagement with a show extends beyond watching to online forums and fan communities, people can discuss, interact and meet others who share their interests. In that sense, binge-watching can be very social. If it is managed poorly, however, it can lead to isolation, where people no longer want to leave their homes and engage in social activities. There are also many health impacts, such as a sedentary lifestyle: sitting around without enough exercise increases the risk of obesity, heart disease and generally poor physical health, let alone mental health. It also encourages eye strain, fatigue and poor eating habits, which in turn lead to obesity and illness.

So, to summarise, binge-watching is not as detrimental a practice as it may seem, but only in moderation. If the content chosen is educational, profound and inspiring, and if exercise is incorporated into the day and you are genuinely engaging with what you watch, the benefits are considerable. If these binge-watching habits turn poor, however, far more dire consequences will follow. The fundamental takeaway is that balance is key.

Marlborough College, Wiltshire SN8 1PA www.marlboroughcollege.org
