Supreme Court Addresses Social Media Liability for Terrorist Content 

The Louis D. Brandeis Center recently signed a statement calling on Twitter to adopt the International Holocaust Remembrance Alliance’s definition of anti-Semitism to better enable it to identify and remove anti-Semitic content on its platform. Last week, extremist Internet content made it to the U.S. Supreme Court’s docket. The Court heard oral arguments in the companion cases Gonzalez v. Google LLC and Twitter, Inc. v. Taamneh. In those cases, the Court will address whether the families of terrorism victims can recover under the Anti-Terrorism Act (ATA) from social media companies for recommending ISIS content and allowing their platforms to be used for ISIS recruitment. The outcome could significantly reshape the digital landscape.


Both cases involve victims of ISIS attacks. American student Nohemi Gonzalez was shot and killed by ISIS members while she was eating at a café with her friends during the “Paris Attacks” of 2015. At the time, Gonzalez was participating in a foreign exchange program to learn French. ISIS claimed responsibility for the attack in a YouTube video. 


In the Taamneh case, Nawras Alassaf, a Jordanian citizen, was also shot and killed by an ISIS member. In 2017, Alassaf was vacationing in Istanbul with his wife. He was at the Reina nightclub when ISIS members attacked it. He and 38 other people died in the attack. ISIS claimed responsibility the next day.


Social media companies claim immunity from suits under the ATA pursuant to 47 U.S.C. § 230. Section 230, enacted in 1996, provides that interactive computer service providers are generally not liable for what users, referred to as “information content providers,” say on their platforms. However, the ATA, as amended by the Justice Against Sponsors of Terrorism Act (JASTA), allows terrorism victims to recover damages not only from those who commit acts of international terrorism but also, on a theory of secondary liability, from any person who “aids and abets, by knowingly providing substantial assistance, or who conspires with the person who committed” an act of international terrorism.


Evidence shows that ISIS has used social media sites, including Twitter and Facebook, to spread extremist propaganda, recruit members, instill fear, and plan attacks. A 2015 study found that ISIS supporters controlled between 46,000 and 70,000 Twitter accounts. Included among the posts were graphic videos of their attacks. 


The plaintiffs argue that by deploying their advanced recommendation algorithms, Google and Twitter have crossed the line from interactive computer service to information content provider. The plaintiffs assert that the defendants’ algorithmic recommendations convey a message distinct from the recommended videos or posts themselves by implying that users will enjoy them. On this theory, the companies have created additional content of their own and are not protected by Section 230. The plaintiffs therefore contend that the defendants should be held liable for aiding and abetting ISIS under the ATA.


The defendants counter that this interpretation of Section 230 is too expansive. They argue that Congress intended the statute to help the internet grow by relieving website operators of the obligation to ensure that every post on their sites is legal, and that ruling for the plaintiffs would defeat this intent. The defendants further argue that even if Section 230 does not immunize them, they did not aid and abet ISIS: because their algorithms operate automatically, they did not provide assistance “knowingly,” as the statute requires. The plaintiffs respond that the companies did act “knowingly” because they knew their recommendations were helping ISIS.


Both cases highlight the difficult balance between the right of terrorism victims to recover damages from companies that facilitated the work of the terrorist groups that targeted them and the need to promote free expression and innovation on the Internet. The Supreme Court will decide the cases later this year.