What is Section 230?
The Supreme Court has turned its focus to big tech as the justices hear arguments involving Section 230, a controversial provision of the 1996 Communications Decency Act that shields internet providers from liability over content posted by third parties. With Congress largely at a standstill on how to regulate content monitoring and moderation, all eyes are on how the justices will respond in the first Section 230 case to reach the highest court. The upcoming hearings could play a major role in changing the status quo on platforms’ immunity for user-generated posts.
Section 230 explicitly states that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” essentially allowing online platforms to avoid responsibility for the content their users post. In recent years, social media companies have largely escaped liability for hate speech, misinformation, and other forms of harmful rhetoric that linger on their platforms.
What is Gonzalez v. Google?
The first case, Gonzalez v. Google, centers on allegations that Google subsidiary YouTube provided a platform for terrorist content and used its algorithm to recommend that content in a way that incited violence during a 2015 terror attack in Paris, France that claimed the lives of 130 people, including U.S. citizen Nohemi Gonzalez. The Gonzalez family initially filed suit in 2016, alleging that because Google’s YouTube suggests content to users based on their interests, the platform recommended ISIS videos and enabled users to find other ISIS-related content. The suit also accuses Google of failing to take action to keep ISIS off its platform, alleging that “tech companies are directly liable for committing acts of international terrorism” and secondarily liable for “conspiring with, and aiding and abetting, ISIS’s acts of international terrorism.”
What is Twitter v. Taamneh?
The second case, Twitter v. Taamneh, makes similar arguments, pointing fingers at Google, Twitter, and Facebook for their alleged role in an ISIS terrorist attack that killed 39 people in Istanbul, Turkey. Justice Kagan previously pointed out that there is an argument that Twitter is “helping by providing your service to those people with the explicit knowledge that those people are using it to advance terrorism.” The plaintiffs in this case claim that Twitter chose to take little action against ISIS’s infiltration of the platform, and they go as far as to accuse Google of sharing revenue with ISIS by reviewing and approving the terrorist group’s YouTube videos for monetization through its AdSense program.
How does the Supreme Court’s decision affect social media companies?
Both cases will be instrumental in deciding whether social media companies can be held liable for harmful content that spreads on their platforms and is amplified by their recommendation algorithms. Tech companies argue that Section 230 of the 1996 Communications Decency Act should continue to protect them from such liability, while opponents insist that social media companies must be held accountable for the impact they have on users and the general public.
Our agency president, Chris Rosica, weighs in on the issue, arguing that social media companies are for-profit enterprises and should not be given a free pass. “Imagine if a for-profit workout facility did not regulate or oversee any of the activities that happened within the walls of its business,” he says. “Gyms are responsible for rules and regulations to keep their customers or members safe. Social media, which is highly monetized, should not be any different.” Ultimately, he believes that social media companies have a moral and fiscal obligation to ensure that their users, and the public, are safe.
The social media landscape changes constantly, and it is entirely reasonable to conclude that a 27-year-old statute no longer adequately protects people. While social media is important for allowing expression and shared views, it must be monitored in a way that does not allow damaging rhetoric and lies that can lead to violence and other harmful consequences. At the core of Section 230 is the protection of the public from criminals and terrorists. Despite this, social media companies, including Google, are spending millions of dollars on lobbyists and attorneys to fight what is right and fair. These tech behemoths must be held accountable and reined in, as they perpetuate a Wild West environment and mindset that welcomes fraudsters and dangerous groups to prey on victims, a preventable problem and a responsibility they are constantly working to avoid.