To Break the Standstill of Social Media Governance, We Need Industry Standards


PAPER 2023 TECHNOLOGY AND PUBLIC PURPOSE PROJECT
Technology and Public Purpose Project
Belfer Center for Science and International Affairs
Harvard Kennedy School
79 JFK Street
Cambridge, MA 02138
belfercenter.org/TAPP

Statements and views expressed in this report are solely those of the author(s) and do not imply endorsement by Harvard University, Harvard Kennedy School, or the Belfer Center for Science and International Affairs.

Copyright 2023, President and Fellows of Harvard College


About the Technology and Public Purpose Project (TAPP)

The arc of innovative progress has reached an inflection point. It is our responsibility to ensure it bends towards public good.

Technological change has brought immeasurable benefits to billions through improved health, productivity, and convenience. Yet as recent events have shown, unless we actively manage their risks to society, new technologies may also bring unforeseen destructive consequences.

Making technological change positive for all is the critical challenge of our time. We ourselves, not only the logic of discovery and market forces, must manage it. To create a future where technology serves humanity as a whole and where public purpose drives innovation, we need a new approach.

Founded by former U.S. Secretary of Defense Ash Carter, the TAPP Project works to ensure that emerging technologies are developed and managed in ways that serve the overall public good.

TAPP Project Principles:

• Technology’s advance is inevitable, and it often brings with it much progress for some. Yet, progress for all is not guaranteed. We have an obligation to foresee the dilemmas presented by emerging technology and to generate solutions to them.

• There is no silver bullet; effective solutions to technology-induced public dilemmas require a mix of government regulation and tech-sector self-governance. The right mix can only result from strong and trusted linkages between the tech sector and government.

• Ensuring a future where public purpose drives innovation requires the next generation of tech leaders to act; we must train and inspire them to implement sustainable solutions and carry the torch.

For more information, visit: www.belfercenter.org/TAPP


About the Initiative

The Democracy and Internet Governance Initiative (DIGI) is a special joint initiative between the Belfer Center for Science and International Affairs and the Shorenstein Center on Media, Politics and Public Policy.

DIGI aims to research and build solutions to mitigate the harms of digital platforms, with a particular focus on social media. As part of the Initiative, our team worked with a range of stakeholders across government, business, and civil society to address growing public concerns.

About the Author

Amritha Jayanti is the Associate Director of the Technology and Public Purpose (TAPP) Project at Harvard Kennedy School’s Belfer Center for Science and International Affairs. Before assuming this role, she served as a Research Associate, supporting both TAPP and the Belfer Center’s Director and former U.S. Secretary of Defense, Ash Carter. Her work focuses on emerging technology, international security, and public purpose.

Prior to joining the Belfer Center, Amritha was a visiting researcher at the University of Cambridge’s Centre for the Study of Existential Risk where she primarily researched the governance of artificial intelligence in Western military organizations. She has also worked at the Brookings Institution’s Center for Technology Innovation researching the application of artificial intelligence in various sectors, including defense and education.

Prior to her policy-oriented focus, Amritha served as the lead product manager at Clara Labs, a San Francisco-based, Sequoia-backed startup. She also served as the Executive Director of Interact, a San Francisco-based nonprofit focused on supporting young technologists interested in social impact. Additionally, she founded a nonprofit, Technica, which encourages gender diversity in computer science and STEM more broadly; she remains a member of the board.

Amritha received her degree from the University of Maryland, where she studied computer engineering, economics, and public policy.

Table of Contents

Key Points
The Current Strategy Is Not Working
The Case For Industry Standards
From Standards To Smart Regulation
Conclusion And Next Steps
Endnotes

The following analysis is part of Harvard Kennedy School’s Democracy and Internet Governance Initiative, which has focused primarily on improving the quality of our information ecosystem, countering online extremism and radicalization, and addressing harassment and diminishing press freedom online as part of its initial research.

Key Points

• The current strategy for digital platform governance is fragmented, ad hoc, and politicized; the United States has barely moved the needle on an organized governance strategy to improve consumer welfare and communal wellbeing online.

• Industry-wide voluntary standards setting offers a path forward to ensure we have expert-led processes, measurements, and best practices to elevate safety and quality on digital platforms, while paving the way for comprehensive U.S. federal rules and enforcement.

• If U.S.-based firms do not act soon, the European Union and the UK will set the standards via policies like the Digital Services Act and the Online Safety Bill, respectively. This could lead to suboptimal regulatory conditions for American companies and the broader U.S. innovation ecosystem.


Large scale digital platforms, particularly social media platforms, have created and exacerbated significant harms to individual and communal sovereignty, mental health, consumption practices, and public goods, such as robust information access. Journalists, academics, and civil society have been warning about the harms for nearly a decade now, garnering significant attention from lawmakers and the public.

Despite emerging agreement among industry officials and the public on high-level goals such as online safety and security, consumer protection, user choice, and trustworthy information access, little progress has been made to address consumer welfare needs and to hold individuals and organizations accountable when harms materialize. Meanwhile, more sophisticated technologies like generative AI are being deployed in the mass market,1 introducing new challenges to consumer and communal wellbeing.

As concerns about these platforms grow, the lack of comprehensive policies becomes increasingly evident. Technology companies are actively lobbying the United States Congress to avoid external regulation, while Congress struggles to understand the scope of the problem and what it can do to address it amidst a politically charged environment.

The critical question is: can we break this standstill under the current conditions? We believe the answer is yes. Drawing on historical examples of industry-led improvement, we propose that the most promising path forward is a commitment to industry-wide voluntary standards setting.

Industry-wide standardization has a long history, both domestically and internationally. In fields like accounting, health care, and agriculture, industry standards promote best practices that ensure safety and quality control for consumers. Standards, which provide a common language to measure and evaluate the performance of a product or service, aim to reflect the shared values and responsibilities we as a society project upon each other and our world. The standards development process relies on cross-sectoral experts to form technical and operational standards for technology development and deployment.

Standards setting—in its traditional and tested form—has not yet penetrated the digital services space. In this position piece, we make the case for why standards setting is the most viable way for social media governance (as one type of digital service) to move beyond the status quo, which currently fails to prioritize product safety and quality, and why it can serve as a means to an end for smart government regulation.

The Current Strategy Is Not Working

Firstly, a lot of effort has been focused on U.S. Congressional action. However, Congress is not well positioned to address granular digital technology problems–at least not with the approach we have now.

Between 2019 and 2023, around 60 unique bills related to digital platform governance were introduced in Congress, focused specifically on declining information integrity, extremism and incitement to violence online, and diminishing press freedoms.2 None of the bills passed; in fact, only about six percent made it out of committee.3

When considering the processes of Congress, the complexity of the issues at hand, and the hyper-political framing of social media governance, the lack of movement on this legislation is unsurprising. Congress is simply not well positioned to directly address the intricate problems of digital platforms. It has hundreds of competing priorities and serves as a generalist body rather than a group of subject matter experts. It is not built to move fast or to pass one-off legislation addressing the myriad risks created by social media products, nor does it have the measurements and information needed to make informed policy choices.

Ultimately, three important factors play into the lack of movement: money, politics, and firm secrecy. Large technology companies spent a record-breaking $69 million on lobbying the federal government in 2022—higher than both the defense and pharmaceutical industries.4 Compounding that, technology policy has become entangled with partisan politics. As a result, it is hard to see how a split 118th Congress could build the consensus needed to pass the numerous pieces of legislation required to address broader societal concerns, such as the mental health crisis, radicalization, and consumer privacy. Finally, platform companies have virtually no industry-specific disclosure practices. This makes it hard for external actors, including the government, to measure and assess the actions that industry has previously taken in an attempt to protect consumer and communal welfare.


Despite these challenges, we still look to Congress to act–and rightly so–particularly as consumer safety concerns continue to mount. For Congress to do this effectively, it is important to recognize that it has the tools and position to create macro-level change. Its primary responsibilities and strengths lie in funding government functions and programs, holding informative hearings to shape the legislative process, and conducting oversight of the executive branch. Therefore, we must consider how we can best leverage these strengths to achieve our goals in platform governance.

Secondly, private sector solutions have been fragmented, ad hoc, and platform-specific, an approach that ignores the fundamental interrelation between digital spaces and how harms propagate across online experiences.

Nearly all social media companies have tested solutions to address the harms of social media on users. Of the 73 proposals identified in DIGI’s DGDP Index, 65 have been tested by the industry. Even President Donald Trump’s Truth Social has content moderation policies embedded in its Terms of Service that appear to limit harmful content across the ideological spectrum.5

While almost 90 percent of proposed solutions have been tested on at least one platform, only 7 percent have been fully implemented industry-wide.6 And although platform-specific responses are necessary, there are spillover effects from one platform to another: the average social media user maintains accounts on 6.6 platforms.7 It is easy for activity on Instagram to influence activity on Reddit, Twitter, Facebook, and so on. Additionally, even if Reddit, for example, has safe practices, any single consumer is still subject to risks on other, seemingly comparable platforms. Without best practices in place across platforms, there is no guarantee to consumers that their online experience has uniform safety considerations.

Moreover, it is difficult for external stakeholders to know how effective consumer safety features are (of the ones that have been tested) because, again, companies have little to no disclosure practices. Some platform companies have recently participated in third-party audits of their products,8 but even these are difficult to trust because auditors are beholden to the data the platform companies provide. We see this playing out in the algorithmic auditing space, for example, where companies participating in “collaborative audits” end up compromising the integrity of independent review.9


Finally, “responsible” innovation teams at platform companies that are meant to champion societally-conscious product and policy choices are often underfunded and, in some cases, completely deprioritized. This makes it nearly impossible for the groups tasked with speaking on behalf of the consumer to penetrate the full operations of a company—and that is if the teams exist at all. Towards the end of 2022, Meta chose to dissolve its Responsible Innovation Team.10 In March 2023, Microsoft announced it was shutting down its Ethics & Society team.11

Thirdly, legislation and private sector governance have primarily focused on the harms caused by the technology of today, with little consideration of where digital services are headed.

We are only now grappling with consumer harms caused by platforms founded nearly two decades ago. But the digital space is constantly evolving, and so is the harm landscape. The Metaverse, web3, and consumer-facing applications of large language models are already hitting the market–whether you believe in their viability or not–and yet we have not even scratched the surface within the policy world to mitigate forecasted risks.

Good consumer welfare practices require processes and systems that allow governance to keep up with the pace of technology. Right now, we do not have those processes in place. Experts and pundits have called for global bodies of governance for a range of digital services from social media to general purpose AI. Yet, these recommendations lack the specificity, precedent, and incentives to catalyze real change. So we continue to rely on ad hoc and reactionary proposals from governments, and underfunded, cagey, and piecemeal solutions from industry.

We do not need to reinvent the wheel with new conceptions of global governance bodies, though; consumer and societal risks emerge with any new technology or innovation, and, historically, we have somewhat methodically created standards systems at both the domestic and international levels. Those systems are industry-led and therefore more equipped to move at a speed close to that of innovation itself.


The Case For Industry Standards

“The leaders of the new standards committees did not argue that capitalists should be left alone in their selfish pursuits of profit; rather, they believed that some problems of industrial society could be resolved more efficiently by cooperation among experts.”

– Andrew L. Russell, Open Standards and the Digital Age

Consumer and communal welfare problems, such as poor safety and quality, have led to a growing distrust of social media companies.

Facebook is the largest social media platform; across its family of products, its parent company Meta reaches 3.74 billion users.12 Yet, in a recent study, 72 percent of Internet users trust Facebook “not much” or “not at all” to responsibly handle their personal information.13 Beyond Facebook and privacy, only about 17 percent of young adults trust social media platforms to provide them with accurate information,14 and 41 percent of Americans have personally experienced some form of online harassment, with a growing percentage of those saying they experience “severe” harassment.15 Social media companies have also been losing consumer confidence amid high-profile cases like Cambridge Analytica, the Facebook whistleblower disclosures, and the numerous media-covered Congressional hearings aimed at uncovering social media’s data and safety practices.

Ultimately, there are real risks to consumer and communal welfare involving online platforms. Platforms have increased mental and physical health risks: there is mounting evidence that social media has increased depression and anxiety in both young adults and older adults.16 They have also introduced financial risks, such as pervasive false digital advertising, misleading financial advice, and scams.17 Consumers are also subject to new reputational and social risks, which can have implications for their professional opportunities. For example, the non-consensual release of sexual material, or pervasive online defamation that reaches a scale only possible through online mediums, can have serious implications for a consumer’s career and private life.


There are also risks to public goods, such as diminishing press freedom as news outlets grow beholden to social media companies to host and promote content.18 Additionally, social media companies create privacy risks, incentivized by their two-sided marketplaces and data-intensive products and services.

Finally, there are risks to individual and communal sovereignty, which we see through addictive product features and foreign influence over elections, respectively. With regard to the latter: in September 2022, the U.S. intelligence community released information that Russia has spent over $300 million on election interference since 2014.19 A portion of that money was used by Russia’s Internet Research Agency to interfere with U.S. elections through social media troll farms, compromising our democratic process.

These risks, at a high level, are not new, of course. However, digital architectures, companies’ data practices, and new interactions and networks of people exacerbate the risks and cause them to take on different forms. Dealing with them will therefore require special attention to ensure our governance systems are updated for the new landscape.

Nearly every social media platform has in recent years been forced to grapple with these issues, and some have even built a competitive advantage on addressing them—for example, Signal. However, consumers are now recognizing that they deserve a baseline level of safety and security online and, for that reason, there is both a need and an opportunity for the industry to collaborate on standards.

Standards setting offers a collaborative and expert-led path forward to develop shared measurements, evaluation schemes, and best practices for consumer and communal welfare that are global in nature.

Standard setting in technology industries refers to the process of establishing technical specifications and guidelines for the design, development, deployment, and interoperability of technology products and services. This process involves bringing together industry stakeholders, such as developers, service providers, regulators, and standards organizations, to create and implement common technical and operational standards. Fundamentally, standards set out a common understanding among experts of “how things should be done if they are to be done effectively.”20

What is compelling about standards setting is that it is a known quantity: a process with a history of private sector engagement and success from the consumer welfare perspective. The ISO (International Organization for Standardization), an independent, non-governmental international organization, alone has a membership of 168 national standards bodies.21 It works across a number of sectors, including pharmaceuticals, energy technology, information security, and more. And although voluntary standards are non-binding, they often lead to mandatory standards enforced within a jurisdiction.

For digital platforms, the standards-setting process offers a collaborative and ongoing medium to develop a common industry-wide language to measure and evaluate the performance of online products and services, which is an important piece of the puzzle that is currently missing. It allows us to use a familiar and tested process to solve these somewhat novel problems, which has implications for global governance of digital platforms–not just domestic.

Industry-led standards development increases consumer confidence, builds trust with government, and can align with the fiduciary responsibilities of firms–all while supporting existing government and public interest initiatives.

Industry-level standards setting has significant upside for the private sector. Standards have the potential to increase consumer confidence in social media companies, which could help platforms like Facebook, Instagram, and Twitter with user retention and growth. For example, industry standards could establish a range of safety guidelines for children’s use of social media products, which would give parents confidence about uniform safety measures across participating platforms.

They can also build trust with governments by demonstrating a willingness to create and participate in a robust self-regulation apparatus, as well as by developing a track record of compliance, facilitating interoperability, improving transparency, and enhancing security. Further, standards development facilitates more intentional public-private partnerships through a collaborative process among experts across government and the private sector, which is critical to cultivating a healthy relationship between these two camps. With the European Union and UK government ready to move forward on regulation,22 and the US debating whether to follow along, it would be well timed for American firms–which dominate the digital services space–to signal their willingness to self-regulate.

Additionally, standards can create market advantages for the companies whose technical or operational standard is voted into effect, since they have the benefit of already complying with it.23 At the same time, standards can help level the playing field across companies because, in the best case, platforms go through the same processes and practices to uphold some baseline level of safety and quality assurance. This eliminates the tradeoff between quality and safety on the one hand and first mover advantage on the other. (We can see this playing out right now in the generative AI space between Microsoft and Google, with Google moving faster than planned to release Bard—compromising quality assurance—because of Microsoft’s move with OpenAI’s ChatGPT and GPT-4.24)

Finally, one of the biggest advantages of standards setting for private companies is that it allows for pooled resources and collaboration, ultimately saving time and money.25 Nearly every platform company has tested internal standards efforts, responsible innovation teams, and compliance teams. Industry-wide efforts function as a force multiplier for each individual firm because companies are exposed to new ideas, leverage external resources to facilitate the standards process, and benefit from the wisdom of crowds. Companies have already signaled they are interested in collaborations to solve these difficult problems. As just one example, Facebook launched a Deepfake Detection Challenge, which leans on academics and other industry leaders.26 Standards could provide more opportunities like this, with added due process and scale, and the opportunity to build market advantages by socializing adopted methods as industry-wide standards (as noted above).

On the government side, the advantage is that technology companies take responsibility for the problem space, developing shared measurements and best practices that can then become the basis for legislation and regulation. It encourages firms to be more transparent and to develop a shared language and measurement scheme for each risk space. This, in the long run, equips lawmakers with the information and infrastructure to develop smart rules and enforcement schemes to protect consumer and communal welfare. And history bears this out: industry-led standards have paved the way for savvy regulation in a range of industries, including telecommunications, agriculture, healthcare, and food. They can do the same for social media, and for digital services more broadly.

From Standards To Smart Regulation

In the early 19th century, concerns about the quality and safety of medicines were growing, and there was a lack of standardization in the manufacturing and labeling of drugs. To address these issues, eleven physicians met in Washington, D.C. to establish the United States Pharmacopeia (USP) in 1820 as a non-governmental organization that set quality standards for drugs and their ingredients.27 USP published a reference book, United States Pharmacopeia, which listed the standards for purity, strength, and labeling of drugs.

When the Pure Food and Drug Act, also known as the “Wiley Act,” was passed nearly a century later in 1906, USP became the official pharmacopeia of the United States.28 The law now required drugs to meet the standards set forth in the USP. It also established what is now known as the Food and Drug Administration (FDA) to enforce the law. Since then, the USP has continued to evolve and update its standards to keep pace with advances in science and technology. Today, it sets standards for not only drugs, but also dietary supplements, food ingredients, and other healthcare products.

USP’s standards play a vital role in ensuring the safety and effectiveness of drugs in the United States. By providing a reference for quality standards, the organization helps to prevent the distribution of substandard or adulterated drugs and ensures that patients receive the correct dosage and labeling information for their medications.

This brief historical case study is just one example of how self-regulation can lead to smart government intervention grounded in technical, industry-based best practices. Standards can provide a framework for developing shared measurement, a shared understanding of harms, and methods for protecting consumer and communal welfare, as was the case with the Pure Food and Drug Act.

Industry-led standards can also demonstrate to lawmakers and regulators that companies can self-coordinate to develop and enforce their own standards, which can allow them to avoid harsh or ill-informed government regulation. This also provides an incentive for industry practices to be transparent, accountable, and in line with social and environmental values in order to build the right level of trust with governments and consumers.

In the case of social media, industry standards can lay the foundation for a few things: consumer safety and confidence, and legislative and regulatory input that has more depth and durability than lobbying. Ideally, once standards are created, documented, and applied, it becomes easier for Congress to codify those standards in law and appoint a regulatory office to enforce them, just as in the case of the FDA.29 This benefits firms that opted into the standards pre-regulation, because they are already in compliance. Additionally, domestic enforcement means that firms no longer have to worry about American competitors who opt to ignore the standards for the sake of, for example, first mover advantage.

The bottom line is that voluntary standards setting serves as a means to an end for government regulation. The long-term play should be jurisdictional enforcement of standards via law, as well as an assessment by governments and civil society of whether the standards that are built are enough to effectively protect consumer and communal welfare. To do that effectively, though, we need the foundation of measurements and best practices to guide the dialogue. (That is especially true considering the limited scientific and technical capacity within the U.S. government on digital technology issues.30)


Conclusion And Next Steps

The standards process is neither a perfect nor a quick fix, of course: developing standards is an extremely tedious process, and voluntary standards are only as strong as their weakest link. However, standards are a necessary ingredient in the innovation ecosystem, driving transparency, consensus, and due process via deliberate and thorough engagement with experts across the public and private sectors. The standards-setting ecosystem also provides the digital services industry with a template (and existing infrastructure) for a robust, rules-based process for self-governance, rather than requiring it to reinvent the wheel.

What is promising is that there have been scattered efforts to develop standards for social media companies. For example, the Global Alliance for Responsible Media (GARM) is a collaboration between advertisers, agencies, and social media platforms which aims to “develop and implement global standards for digital advertising, including issues related to brand safety, ad fraud, and hate speech.”31

The Sustainability Accounting Standards Board (SASB) undertook a Content Moderation on Internet Platforms research project designed to “help companies manage the complex and evolving landscape of content moderation on internet platforms, while promoting user safety, privacy, and free expression.”32 Finally, the Digital Trust & Safety Partnership, a participant in the World Economic Forum’s Global Coalition for Digital Safety, has been developing best practices and assessments for digital service companies.33

These dispersed efforts are a good start, but we need more. The move towards proper standards is about developing a sustained, robust, and comprehensive approach to creating and implementing measurements and best practices across the board. This is the direction we need to go for digital services more broadly, and the time is especially ripe for digital platforms like social media. To do this well, a few things are required at a high level:

1. The development of a list of key risks that have the need and potential to be effectively addressed by standards in the near future (taking a risk-based approach to ensure the scope and intent of the initiative are focused on the major consumer and communal welfare needs of the time). This could be done through an initial cross-sectoral working group, which would include digital platform companies, related technology companies, civil society, government, academia, and other topic-specific experts–for example, journalists;

2. The establishment of topic-specific consortia to address the identified risks or problems and, ideally, a standards body to house the consortia and other working groups focused on digital platform standards;

3. Investment by companies in internal standards leaders and delegations, including training for employees to engage effectively in the standards setting process;

4. Intentional and active engagement with academia, government, and civil society to ensure that multi-stakeholder interests and ideas are represented throughout the standards process.

Luckily, these pieces have been coordinated countless times for other industries; we believe it is possible and within the interests of platform companies, governments, consumers, and civil society to make progress on this front. In fact, the Democracy and Internet Governance Initiative is already working with a number of stakeholders on first steps in setting up a more robust standards setting ecosystem.

We have reached a critical moment regarding the governance of digital platforms. Social media companies are no longer young and naive—and neither are their consumers. Standards offer a viable path forward to ensure we move the needle on governance of digital services, while paving the way for smart government intervention.


Endnotes

1 Heilweil, R. (2023, January 5). What is generative AI, and why is it suddenly everywhere? Vox. Retrieved from https://www.vox.com/recode/2023/1/5/23539055/generative-ai-chatgpt-stable-diffusion-lensa-dall-e

2 Schultz, J. (2023, March 22). Analyzing the landscape of solutions to social media’s harms. Belfer Center for Science and International Affairs. Retrieved from https://www.belfercenter.org/publication/analyzing-landscape-solutions-social-medias-harms

3 Ibid.

4 Feiner, L. (2023, January 23). Apple ramped up lobbying spending in 2022, outpacing tech peers. CNBC. Retrieved from https://www.cnbc.com/2023/01/23/apple-ramped-up-lobbying-spending-in-2022-outpacing-tech-peers.html

5 Rosen, D. (2022, September 19). Truth Social’s censorship, terms of service defy free speech promises. Public Citizen. Retrieved from https://www.citizen.org/news/truth-socials-censorship-terms-of-service-defy-free-speech-promises

6 Schultz, Analyzing the landscape of solutions to social media’s harms.

7 Ruby, D. (2023, April 12). Social media users in the world (2023 demographics). Demand Sage. Retrieved from https://www.demandsage.com/social-media-users

8 For example, Facebook published an article in 2020 about a third-party audit, conducted by EY, of its Community Standards Enforcement Report metrics. More here: https://about.fb.com/news/2020/08/independent-audit-of-enforcement-report-metrics/

9 Sloane, M. (2021, March 17). The algorithmic auditing trap. OneZero (Medium). Retrieved from https://onezero.medium.com/the-algorithmic-auditing-trap-9a6f2d4d461d

10 Bell, K. (2022, September 8). Meta dissolves team responsible for discovering ‘potential harms to society’ in its own products. Engadget. Retrieved from https://www.engadget.com/meta-responsible-innovation-team-disbanded-194852979.html

11 Belanger, A. (2023, March 14). Report: Microsoft cut a key AI ethics team. Ars Technica. Retrieved from https://arstechnica.com/tech-policy/2023/03/amid-bing-chat-controversy-microsoft-cut-an-ai-ethics-team-report-says/

12 Dixon, S. (2023, March 14). Topic: Facebook. Statista. Retrieved from https://www.statista.com/topics/751/facebook/

13 Kelly, H., & Guskin, E. (2021, December 25). Americans widely distrust Facebook, TikTok and Instagram with their data, poll finds. The Washington Post. Retrieved from https://www.washingtonpost.com/technology/2021/12/22/tech-trust-survey/

14 Ray, J. (2021, November 22). Young people rely on social media, but don’t trust it. Gallup. Retrieved from https://news.gallup.com/opinion/gallup/357446/young-people-rely-social-media-don-trust.aspx

15 The state of online harassment. Pew Research Center. (2021, January 13). Retrieved from https://www.pewresearch.org/internet/2021/01/13/the-state-of-online-harassment/

16 Karim, F., Oyewande, A. A., Abdalla, L. F., Chaudhry Ehsanullah, R., & Khan, S. (2020). Social media use and its connection to mental health: A systematic review. Cureus, 12(6), e8627. https://doi.org/10.7759/cureus.8627

17 National Association of Broadcasters. (n.d.). Retrieved from https://www.nab.org/bigtech/

18 Rosenblum, D. (2007). What anyone can know: The privacy risks of social networking sites. IEEE Security & Privacy, 5, 40–49. https://doi.org/10.1109/MSP.2007.75

19 Merchant, N. (2022, September 13). US: Russia spent $300M to covertly influence world politics. AP News. Retrieved from https://apnews.com/article/russia-ukraine-putin-biden-politics-presidential-elections-03d0ae84fb34833b78b1753d0a9602db

20 Hayns, S. (2020, August). The importance of setting standards to support Environment Bill delivery. Wildlife and Countryside Link. Retrieved from https://www.wcl.org.uk/the-importance-of-setting-standards.asp

21 About us. ISO. (2023, April 3). Retrieved from https://www.iso.org/about-us.html

22 Wheeler, T. (2023, March 8). The UK and EU establish positions as regulatory first movers while the US watches. Brookings. Retrieved from https://www.brookings.edu/blog/techtank/2023/03/08/the-uk-and-eu-establish-positions-as-regulatory-first-movers-while-the-us-watches/

23 IEEE Standards Association. (2022, October 11). 3 ways technology standards can benefit your organization. Retrieved from https://standards.ieee.org/beyond-standards/industry/technology-industry/3-ways-technology-standards-can-benefit-your-organization/

24 Nieva, R. (2023, February 7). Google debuts a ChatGPT rival called Bard in limited release. Forbes. Retrieved from https://www.forbes.com/sites/richardnieva/2023/02/06/google-bard/?sh=913bbd4152d6

25 IEEE Standards Association, 3 ways technology standards can benefit your organization.


26 Deepfake Detection Challenge: AWS and new academics join, initial dataset released. Meta AI. (2019, October 21). Retrieved from https://ai.facebook.com/blog/deepfake-detection-challenge-aws-and-new-academics-join/

27 United States Pharmacopeia celebrates 200 years of building trust in medicines, supplements and foods. United States Pharmacopeia (USP). (2020, January 6). Retrieved from https://www.usp.org/news/usp-celebrates-200-years-of-building-trust-in-medicines-supplements-and-foods

28 FDA. (2019, April 24). Part I: The 1906 Food and Drugs Act and its enforcement. U.S. Food and Drug Administration. Retrieved April 13, 2023, from https://www.fda.gov/about-fda/changes-science-law-and-regulatory-authorities/part-i-1906-food-and-drugs-act-and-its-enforcement

29 Former FCC Chairman Tom Wheeler, former Senior Counselor to the Chairman at the FCC Phil Verveer, and former Chief Counsel of the US DOJ Antitrust Division Gene Kimmelman have a proposal to start a government agency to regulate digital platforms. Read more: https://shorensteincenter.org/new-digital-realities-tom-wheeler-phil-verveer-gene-kimmelman/

30 Miesen, M., & Manley, L. (2020, November). Building a 21st Century Congress: Improving STEM Policy Advice in the Emerging Technology Era. Belfer Center for Science and International Affairs. Retrieved from https://www.belfercenter.org/publication/why-us-congress-and-stem-experts-must-work-together

31 World Federation of Advertisers. (n.d.). Global Alliance for Responsible Media - About GARM. Retrieved April 13, 2023, from https://wfanet.org/leadership/garm/about-garm

32 Content moderation on internet platforms. SASB. (2022, July 6). Retrieved from https://www.sasb.org/standards/process/projects/content-moderation-on-internet-platforms-research-project/

33 Digital Trust & Safety Partnership. (n.d.). Retrieved April 13, 2023, from https://dtspartnership.org/
