
Postdigital Intimacies for Online Safety
Introduction to the Online Safety Bill
The Online Safety Bill (OSB) is a piece of UK legislation, implemented by the communications regulator Ofcom, that aims to protect children and adults online by imposing a duty of care on social media services and search services, with fines of £18 million or 10% of a technology provider’s global annual turnover for breaching such obligations1. The aims of the OSB date back to the publication of the 2017 Internet Safety Strategy green paper, followed by the 2019 Online Harms white paper, which identified that “appalling and horrifying illegal content and activity remains prevalent on an unacceptable scale”2. After a period of consultation, in 2020 the Government gave a full response and committed to regulation. The OSB was originally the responsibility of the Department for Digital, Culture, Media and Sport (DCMS), together with the Home Office. Since 2019, the Secretary of State for DCMS has changed several times, impacted by wider political changes and uncertainty, and in 2023 responsibility for the OSB was transferred to the newly-created Department for Science, Innovation and Technology (DSIT). As we write this, the OSB is progressing through the House of Lords, after which it will reach the final stages and be given Royal Assent. Ofcom will shortly thereafter take responsibility for drafting risk assessment, transparency reporting, enforcement guidelines, and other elements that allow the OSB to come into force, with these frameworks for reporting and enforcing the OSB to take shape between 2023 and the end of 2024. While we recognise that the OSB is likely to pass before this Report can have direct impact, we aim to have the gaps and opportunities we identify here formally recognised, and our recommendations taken up as part of the implementation process and subsequent legislation.
Wider implications and gaps
Over the last few years, regular news segments, television programmes, and newspaper articles have outlined the importance of the OSB for protecting vulnerable and at-risk online users, especially children, and thus have also raised awareness of several harms that people may experience through their engagements with online and digital technologies. Vital campaigns have lobbied for the legal recognition of specific harms. For example, teenager Molly Russell’s suicide in 2017 has been directly linked to self-harm content on the social media platforms Instagram and Pinterest, with the coroner in 2022 calling for better regulation, while her father criticised the pausing of the OSB in that same year as the Government sought to ensure ‘freedom of speech’. Others have noted trolling and hate speech as creating hostile environments online that require people have adequate legal protection, with the OSB shaping discussions from online racism in football, to the suicide of Love Island television presenter Caroline Flack following vicious online attacks, to the challenge of regulating hate speech propagated by figures like Andrew Tate and others like him. There is no doubt that the process of legislating the OSB has raised public consciousness around issues related to online harm, risk, and vulnerability.
1 The full current bill can be accessed here https://bills.parliament.uk/bills/3137.
2 Online Harms White Paper, p.12 https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/973939/Online_Harms_White_Paper_V2.pdf
Other important campaigns have taken place around ‘epilepsy trolling’, where flashing images are sent to people with epilepsy to induce a seizure. Led by 11-year-old Zach Eagling, the practice will now be covered by ‘Zach’s Law’, as part of the OSB. Meanwhile, another new offence currently written into the OSB is the sending of photographs or film of genitals to “cause alarm, distress or humiliation”3, colloquially known as cyber-flashing, recognition of which is the result of substantial lobbying campaigns by a number of women’s organisations, service providers, academics, and members of the public (see Section 4 of this report for a discussion and links to relevant groups/projects).
However, a number of issues remain. The OSB’s overwhelming remit is towards children and, while this is undoubtedly an important constituency deserving of protection, recent changes to the OSB fail to recognise how online harms occur across life-stages. For example, the removal of the category of ‘harms to adults’ (or ‘legal but harmful’) from the OSB has been widely criticised as an ideological move that will allow harmful content to persist. Yet, in 2021 research by Compassion in Politics, Clean Up the Internet, Fair Vote, and Glitch, 60% of the public believed freedom from abuse was more important than freedom of speech, and 69% wanted the OSB to contain protections for adults as well as children.
One significant issue has been the way the OSB elides a wide spectrum of online harms that are gender-related and target vulnerable adult populations, including women and girls, at-risk men, adults with neurodiversity and/or mental health conditions, and those with other combined characteristics such as nonconforming gender and sexuality, as well as racial, ableist, and classist marginalisations.
Violence against women and girls (VAWG) is not addressed and, although amendments have been tabled, the terms ‘woman’ or ‘women’ do not appear anywhere in the OSB. As a campaign currently being led by EE and Glitch reads: “262 pages. Zero mention of women and girls. The online safety bill must change,” while Baroness Nicky Morgan, member of the House of Lords, has stated that the OSB is “sorely lacking” in its provisions for women and girls. In her words, “with the recent wholesale removal of ‘legal but harmful’ content regarding adults in its latest iteration, women and girls have in fact lost protections committed to address misogynistic content” (emphasis added).
There are further concerns about the ability of the OSB to protect vulnerable groups, especially in our present context in which online hate is on the rise. For example, police data obtained by Leonard Cheshire and United Response identified a 52% increase in disability online hate crime in the year 2020-2021. Many groups have also noted an increase in incidents of such crime and abuse during the UK Covid-19 lockdowns.
Another significant issue with the OSB is its focus on ‘the online’, as opposed to the digital and the technological, and the way it is primarily concerned with content rather than with the systems, structures, and affordances of how technology is designed. The OSB will have minimal impact on the development of new technology. For example, there are only limited mentions of ‘health’ in the OSB, and none of these relate to the dramatic growth of industries like Feminine Technologies (FemTech), explored in Section 3 of this report, or the increasing use of digital tools and technologies to deliver healthcare, which remain largely unregulated despite the potential vulnerabilities to mental and physical health, as well as risks to data privacy and protection. Similarly, the focus on large technology companies such as Meta (i.e. Facebook, Instagram), TikTok, and Twitter means that communities of hate will still be able to exist in smaller, dedicated online spaces, and will continue to appeal to precarious groups and at-risk individuals. This was raised by Baroness Anna Healy in the House of Lords during the OSB’s second reading, where she observed that ‘involuntarily celibate’ or incel communities, which are discussed in Section 5 of this report, are “a threat to women but also to young men”, and noted that the largest of these communities online also contains large amounts of content related to suicide and self-harm.
This project and report
For us – as researchers of ‘postdigital intimacy’ – the landscapes we have outlined above are intricately connected to one another, and to the OSB as it reaches its final stages. Our research is collectively interested in the way digital culture, including the online, has become increasingly embedded in how we build relationships, interact, engage, and connect with each other. We work together on developing research that shows how what happens in digital, technological, and online cultures also shapes our health, wellbeing, and material worlds. Thus, the gaps we outline above cannot be understood separately; they require a cross-cutting analysis of online safety that recognises the multiple lenses and contexts through which someone could be harmed or made vulnerable by digital culture.
The aim of this project, therefore, has been to formulate a collective response to the OSB, working collaboratively across multiple sectors with a number of stakeholders. The project, funded by Research England through Coventry University’s Research Excellence Development Fund, develops discussion around four areas that we ourselves are currently researching, namely: intimate digital health tools and services marketed to those who identify as women; image-based and technologically-enabled abuse; boys, men, and “toxic” internet communities; and the benefits or otherwise of using online communities for mental health support. We recognise that further important areas need to be discussed, especially to foreground the gaps in the OSB around the harms of racism and transphobia, and how these interact with other combined characteristics. Where intersections within and between thematic areas emerged in discussion, we have highlighted them in the pages that follow. We have also proceeded in the hope that our response will see such gaps taken up within a wider remit of definitions of what online safety and online harms mean – including legislation that is separate to the OSB.
In the following section, we introduce our methods, which included stakeholder workshops with several groups and organisations. We then provide a discussion of each workshop, and the strengths, opportunities, concerns, and consequences of the OSB for the communities and groups that our stakeholders represent. We finish this report by outlining shared recommendations.