Influencing Mindsets: Use of Troll Farms by Governments as Effective Social Media Marketing


Cal. Poly. Pomona

IBM 3012: Principles of Marketing Management

Dr. Randy Stein

December 16, 2023

Nima Mansouri

Influencing Mindsets: Use of Troll Farms by Governments as Effective Social Media Marketing

Government troll farms now play a crucial role in the spread of propaganda and disinformation. This paper argues that troll farms effectively manipulate public opinion through sophisticated marketing tactics. First, utilizing data-driven insights, troll farms precisely tailor their messages to specific demographics, ensuring that their content resonates with targeted audience segments. Second, by repeatedly promoting certain narratives and artificially simulating widespread support, troll farms exploit the 'bandwagon effect,' leading people to adopt opinions they perceive as popular. Lastly, troll farms mimic viral marketing by creating engaging content and employing fake influencer accounts, leveraging emotional responses and perceived credibility to sway public opinion.

Troll farms are defined as organized groups of internet users, often employed or supported by governments, that engage in coordinated campaigns to influence public opinion, spread disinformation, and manipulate online discourse (Van Dijcke, 2023). These entities leverage the reach and dynamics of social media platforms to sway public perception and political narratives. Troll farms reached 140 million Americans monthly on Facebook before the 2020 U.S. election, showcasing their extensive influence in shaping political narratives (Hao, 2021). In authoritarian regimes, troll farms have been employed as tools for disseminating propaganda and suppressing dissent, with the Iranian, Russian, and Chinese governments being notable actors. The impact of Russian troll farms on major political events, such as their interference in the 2016 U.S. presidential election in favor of Donald Trump, highlights their role in significant political outcomes (Zhang, 2022). Additionally, the architecture of social media platforms plays a significant role in enhancing the efficiency of troll farms in spreading disinformation. These platforms facilitate the creation of echo chambers, a phenomenon notably evident during the 2016 U.S. presidential election (Barsotti, 2018).

Using targeted messaging and market segmentation, troll farms can influence specific demographic groups effectively. This strategy involves crafting messages that resonate deeply with audience segments, thereby manipulating public discourse. It was notably evident in the 2016 U.S. election, where Russian entities like the Internet Research Agency (IRA) utilized direct advertising to reach millions of American internet users. They crafted and disseminated microtargeted messages to various demographic and political groups. This tactic was not just about spreading a single narrative; instead, it involved pushing conflicting narratives to different groups to stir societal tensions and chaos (Kliman, 2020). The versatility of this approach was also observed during the 2017 German elections, where Russia targeted the Aussiedler community (ethnically German individuals from former Soviet republics) with Russian-language advertisements. These ads were specifically designed to increase support for the Alternative for Germany (AfD) party, leveraging the community's trust in Russian sources and allowing for a subtle yet effective influence on its political opinions (Kliman, 2020). By exploiting the nuances of various demographic groups, these entities can subtly influence political landscapes and societal discourse.

Troll farms exploit the 'bandwagon effect' through sophisticated social media strategies, as evidenced in the 2020 U.S. election. By simulating widespread support for specific narratives, they influence public opinion. Heinz College professor Ari Lightman highlights how social media platforms like Facebook create echo chambers and filter bubbles that amplify this effect (Barsotti, 2018). This is achieved by targeting content that aligns with user preferences, often leading to the rapid spread of disinformation by bots and fake accounts. The resulting misinformation feeds into the bandwagon effect, swaying public opinion by creating an illusion of majority consensus (Menczer, 2021). This approach underscores the powerful impact of social media on shaping public perceptions and political processes.

Utilizing viral and influencer marketing, troll farms build credibility and emotional resonance behind their respective agendas. Influencer marketing succeeds because of the trust and ethos social media personalities hold with their audiences: content posted by influencers garners higher engagement rates, enhancing brand visibility (Leung et al., 2022). Combining propaganda with the power of influencer marketing creates the perfect landscape for changing mindsets. A striking example of this tactic was observed in the run-up to the 2020 U.S. presidential election. Troll farms based in Kosovo and Macedonia managed to reach an extensive American audience on Facebook. Their pages became the largest Christian American and African American pages on Facebook, significantly larger than their authentic counterparts. These strategies leveraged the widespread engagement with controversial or sensational material, frequently recirculating content that had already achieved viral status. This approach not only garnered large audiences but also penetrated Facebook's Instant Articles and Ad Breaks partnership programs, underscoring how effortlessly troll farms can integrate into the digital realm and monetize their activities (Hao, 2021).

While the effectiveness of troll farms in manipulating public opinion is evident, it is also important to consider the limits of their influence. Studies suggest that many social media users, particularly younger generations, are becoming more skilled at identifying the authenticity of online content, potentially diminishing the effectiveness of these manipulation tactics (Wardle & Derakhshan, 2017). At the same time, social media platforms have intensified their efforts against troll farms, employing more sophisticated algorithms and human moderation teams to identify inauthentic content. This proactive stance presents a significant challenge to the reach and influence of troll farms (Allcott & Gentzkow, 2017). Together, platform intervention and growing public awareness can help mitigate the effects of troll farms on public discourse. Nevertheless, troll farms have continued to infiltrate the political landscape in recent years.

This paper has critically examined the role of government-operated troll farms in manipulating public opinion, highlighting their strategic use of marketing tactics to influence societal discourse. Through data-driven messaging, exploitation of the 'bandwagon effect,' and mimicry of viral marketing, these entities have demonstrated a significant capacity to shape public perception. The evidence presented, ranging from the extensive reach of troll farms before the 2020 U.S. election to their targeted campaigns against specific demographic groups, underscores the profound impact these operations can have on democratic processes and societal stability. The importance of this topic lies in its implications for democracy, free speech, and the authenticity of online interactions. Understanding the impact of troll farms is crucial for policymakers, social media companies, and the public to develop effective strategies to combat misinformation. As social media continues to be a dominant force in shaping public opinion, the need for informed, collaborative efforts to address this issue has never been more urgent.


References

Abogado, G. (2021, September 29). On Facebook and foreign propaganda: Here's how algorithms can manipulate you. Rappler. https://www.rappler.com/technology/features/facebook-algorithms-fueled-foreign-propaganda-campaigns-united-states-election-2020/

Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211–236. https://www.aeaweb.org/articles?id=10.1257%2Fjep.31.2.211

Barsotti, S. (2018, October). Weaponizing social media: Heinz experts on troll farms and fake news. Carnegie Mellon University's Heinz College. https://www.heinz.cmu.edu/media/2018/October/troll-farms-and-fake-news-social-media-weaponization

Hao, K. (2021, September 16). Troll farms reached 140 million Americans a month on Facebook before 2020 election, internal report shows. MIT Technology Review. https://www.technologyreview.com/2021/09/16/1035851/facebook-troll-farms-report-us-2020-election/

Kliman, D. (2020). Digital influence tools used by China and Russia. JSTOR. https://www.jstor.org/stable/resrep25314.8

Leung, F. F., Zhang, J. Z., Gu, F. F., Li, Y., & Palmatier, R. W. (2022, November 24). Does influencer marketing really pay off? Harvard Business Review. https://hbr.org/2022/11/does-influencer-marketing-really-pay-off

Van Dijcke, H. (2023, April 7). Like, tweet, & torment: Authoritarian troll farms. Human Rights Foundation. https://hrf.org/like-tweet-torment-authoritarian-troll-farms/

Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policy making. Council of Europe. https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c

Zhang, Z. (2022). Study confirms influence of Russian internet "trolls" on 2016 election. Columbia SIPA. https://www.sipa.columbia.edu/news/study-confirms-influence-russian-internet-trolls-2016-election
