
How Gonzalez v. Google LLC may reshape the Internet
By Novak Chen, Tushara Devapatla & Keerti Koya, Staff Writers

On February 21, the Supreme Court heard arguments for Gonzalez v. Google, a case revolving around the liabilities that companies can accrue from their hosted content. The case stems from a series of attacks on Nov. 13-14, 2015 by gunmen from the Islamic State of Iraq and Syria (ISIS), which left 130 people dead in Paris, France. Among the victims was 23-year-old Nohemi Gonzalez, a student at California State University, Long Beach. Seven months later, Gonzalez’s father, Reynaldo Gonzalez, filed a lawsuit against Google, Twitter, and Facebook. According to The Oyez Project, he claimed that Google, YouTube’s parent company, contributed to international terrorism by permitting ISIS to use its platform “to recruit members, plan terrorist attacks, issue terrorist threats, instill fear, and intimidate civilian populations.” More specifically, the lawsuit reasoned that Google helped ISIS spread its recruitment videos by using algorithms that suggest content to users based on their past viewing history. If companies continue having the power to influence the public on a mass scale, the impacts can be detrimental to society. The results of this case could change the future of the media. To prevent the recurrence of such cases, technology companies must increase their limitations on the types of content they host. At the same time, the general public must also be wary of radical content shown in their media recommendations.


After hearing both arguments, the district court granted Google’s motion to dismiss the claim. The US Court of Appeals for the Ninth Circuit affirmed, agreeing that recommending ISIS’s content falls under the purview of Section 230 of the Communications Decency Act of 1996, meaning that Google is not liable for any actions that result from its content recommendations. Credited with creating the base guidelines for the Internet, the Telecommunications Act of 1996 also protects websites from most legal liabilities by distinguishing their services from those of traditional publishers; for example, if YouTube hosted libelous content on its website, it could not be sued for it, while a traditional publisher like The New York Times could be sued for libelous content on its platforms. In addition, any provider or user of an interactive web service cannot be held liable for moderating content, whether or not that content is constitutionally protected. YouTube’s algorithm recommends videos based on metrics such as the user’s search history, viewing time, relevant topics with the highest retention rate, and their channel subscriptions. Beyond the coding of the algorithm itself and the occasional change to account for trends and software updates, there is no human involvement in the content recommendation algorithm.
Some may argue that technology companies are not liable for the content that they host because the content is made by the users on the platform, not by the creators of the platform or the platform itself. Under this view, recommendation algorithms focus on a video’s thumbnail and the corresponding clip determined by YouTube rather than on its content, meaning a personalized feed is not officially associated with the promotion of harmful content. The plaintiff, however, claims that because YouTube itself determines the thumbnail and the snippet that influence its recommendation algorithm, the resulting liabilities are not covered by Section 230. Even if thumbnails or snippets are not covered by Section 230, companies have a responsibility to make their platforms friendly for all users and should take accountability for any type of harmful media shown on their sites. To prevent the spread of extremist content, anything that draws the audience toward such content should be carefully moderated.
Moreover, others may argue that further restrictions of Section 230 would leave only major technology corporations able to take on the risks of content moderation, shutting down many smaller technology companies and limiting innovation in the field. But these restrictions would not limit innovation; in fact, they would likely increase it, because they would demand new content moderation technology and updated algorithms, similar to how patents encourage research and development of new technologies. According to the Harvard Business Review, “individuals, teams, and organizations alike benefit from a healthy dose of constraints.” The main points of an updated Section 230 would not hurt smaller companies; they would only enforce guidelines that media conglomerates and Internet corporations should have been following to begin with.
Even if technology companies are not creating content that supports violent ideas, they are still responsible for the algorithms that suggest such content. Section 230 only shields them from liability for content that they host; anything that a company creates of its own volition is not protected. Thus, companies should reevaluate their current algorithms to find what is promoting these ideas, and change the algorithms to be more critical of harmful or improper content that can motivate their audience to engage in similarly poor behavior. Companies also need to understand that even after they ban injurious content, some might still seep through their algorithms. In such cases, they must impose strict guidelines on that content while urging viewers to be cautious when consuming it, or to avoid it altogether.
To prevent further harm, the Supreme Court of the US (SCOTUS) must consider how the verdict will impact the spread of extremist content online. If Gonzalez loses the case, large tech companies will continue to expand the control their algorithms have over the public. Additionally, the US Congress can amend the Justice Against Sponsors of Terrorism Act so that giving terrorist organizations a digital platform counts as aiding and abetting terrorism. Individual tech companies need to create environments that are neither hostile nor harmful for their users by being the first to update their algorithms to stop recommending radical content. They could also display occasional official posts as users scroll, urging them not to engage with negative content if they do come across it. Ultimately, however, while Gonzalez v. Google is an extreme scenario of a lack of media regulation, the case comes down to how heavily the media influences individuals’ actions.

As consumers of the media, MSJ students should be more aware of their content recommendations and know when to restrict themselves from further exposure to harmful content. When shown information that is inappropriate or harmful, students should report such media to prevent algorithms from continuing to show it to other users. Students can also take a stand against the detrimental influence media has on their decisions by lobbying legislators to push for stricter regulations. For example, the Kids Online Safety Act would require platforms to "provide a minor (or a parent) with certain safeguards, such as settings that restrict access to a minor's personal data; and parents with tools to supervise the minor's use of a platform, such as control of privacy and account settings." As students grow to incorporate the Internet into their lives, their status as the second-largest Internet demographic can be used to lobby for a difference.
If SCOTUS, Congress, companies, and the general public push for change, society can move toward a safer future online. The Gonzalez v. Google LLC case may allow for an impactful shift in content regulations, changing media as the world knows it today. ▪
GRAPHICS BY OPINION EDITOR ANNIKA SINGH
"In mid-March, the Biden Administraton was tasked with making the decision for approval of ConocoPhillips' Willow Project, which will produce around 1.5% of the current total US oil producton and generate around 9.2 million metric tons of carbon dioxide a year for 30 years. Despite signifcant public pushback, the project was approved. This fercely opposes Biden’s plan to reduce carbon emissions by 2030, raising questons about environmental actvism and its te with politcal acton. What’s your perspectve on politcians stcking to their commitments, and what do you think the Biden Administraton should have done?"
“When people are told to recycle and reuse to protect the planet but their government is passing legislaton that does the opposite, it decreases the public's morale for mitgatng global warming. The Willow Project's benefts (energy source, jobs) are important, but it's about tme to put our planet's environmental health frst. The Biden Administraton should have only approved the Willow Project on a small scale in order to meet our climate goals.”
"I think the Biden Administraton should've discontnued Project Willow because petroleum, oil, and other non-redeemable fossil fuels are growing increasingly frowned upon. Ripping minerals from Earth and releasing their toxic emissions into the atmosphere not only damages the ecosystem and air quality, but also increases temperatures and quickens global warming. Project Willow will only add to the fuel. Right now, we should focus on more innovatve, efcient and sustainable ways of producing energy instead of drilling the next best oilfeld."
"No mater what economic or politcal benefts such a project could provide, politcians should prioritze the environment above everything else. While there are many ways to bring about politcal or economic success, we only have one planet, and it should not be jeopardized for anything."
“Politcians get voted into ofce based on their promises to their consttuents, so they should stck to their campaign commitments. The Willow Project's environmental cost is drastc, and America should have followed the morally correct over the fnancially benefcial strategy. The U.S. stll has the highest GDP of any country by a large margin, so losing out on 1.5% of our country's oil producton isn't too harmful. The Biden Administraton shouldn't have approved the project and instead should have upheld the principles that enabled its electon to the presidency.”
"The Biden Administraton shouldn't have allowed Project Willow to go through. Considering that the Biden Administraton is pushing for a greener America, this project doesn't have any benefts for our economy. Also, the United States already drills a lot of oil, so the consequence of 9.2 million tons of carbon emissions directly contradicts Biden's plan to cut emissions down by 2030. This makes him look like a hypocritcal politcian who is harming America. It seems as if money plays a large role in decision making, which is why the Biden administraton should not have allowed Project Willow to pass."

Throughout my long life of 16 years, I’ve never set my hands on floral design even once. In fact, my favorite flower attribute has always been the fact that most flowers make a healthy and tasty meal with a touch of olive oil and salt. Regardless, from pet carrots to mooncakes, my blooming creativity in the Smoke Signal’s DieHard TryHards competitions never brings disappointing results to fruition. Alas, get ready, because Aaron’s got a flowery business waiting to captivate all eyes within the vicinity.