Social media companies should be held liable for harm their platforms can cause

When it comes to government involvement in social media, it can be difficult to think of legislation that doesn’t encroach on citizens’ First Amendment rights, but allowing those hurt by malicious content to directly argue their cases provides a framework for accountability without doing so, arts co-editor Caroline Hohner writes.

Caroline Hohner, Arts Co-Editor

Are social media platforms publishers? Current law says they aren't, meaning a social media company bears no legal responsibility for content posted on its site. Yet if they were treated as publishers, they wouldn't be able to grow, because they could be sued over tweets, posts and updates.

The blanket protections granted by Section 230 of the Communications Decency Act, the 1996 statute that absolves platforms of liability for the content users post, are no longer enough to guard against the modern abuse of social media. However, if platforms were completely stripped of these protections, it would be nearly impossible for competitors to enter the social media industry. There has to be a middle ground.

Given the massive consumer bases of these companies and the demonstrated effects of allowing false information and the incitement of criminal activity to circulate, these companies should be held liable for the harm their platforms play host to. 

Christopher Cox, a former congressman responsible for the creation of Section 230, said that the original purpose of the statute has been warped and that “Congress should revisit the law.” 

Former President Donald Trump's tweets drove the insurrection at the Capitol last month. The eventual de-platformings on Twitter, Facebook and various other sites were voluntary, but companies' individual guidelines are not reliable enough to prevent catastrophe.

This instance of social media content leading to criminal activity is only the latest in a long line of disasters in the debate over internet liability.

A 2018 NPR article detailing the history of the statute and the tech industry's changing positions on it described an incident in which the website Backpage.com promoted advertisements for child sex trafficking.

The trafficking victims lost to the website in court, which defended itself with the broad protections of Section 230. The incident ultimately led to restrictive amendments to the statute directly addressing liability for sex trafficking.

In this instance, companies were stripped of their protections when it came to hosting illegal activity. This process can and should be applied more generally.  

Amending Section 230 to make companies liable for hosting illegal activity and false information would give them financial motivation to crack down on such content. It would also mean the victims of harmful posts or ads could seek damages without being shut down immediately by the statute's blanket protections.

Amending Section 230 would also work toward cleaning up the darker sides of social media without necessitating government censorship. When it comes to government involvement in social media, it can be difficult to think of legislation that doesn’t encroach on citizens’ First Amendment rights, but allowing those hurt by malicious content to directly argue their cases provides a framework for accountability without doing so.

Section 230 should not be an impenetrable shield against taking responsibility for criminal activity and misinformation. The statute needs to be amended to hold social media platforms liable for harmful content.