Faygie Holt

(October 26, 2020 / JNS) Hate online, particularly anti-Semitism, continues to grow at alarming rates, and stopping it will require education, collaboration and a cohesive definition. Those were the findings of a first-of-its-kind, two-day symposium sponsored by the U.S. State Department.

“Nowadays, bigots everywhere can spread anti-Semitism online anonymously. In the first eight months of 2020, the Israel anti-Semitism monitoring system recorded 1.7 million anti-Semitic messages from more than 445,000 users on Twitter and YouTube; 37,000 of those messages referred to COVID-19, and many came from foreign agents like the government of Iran that ludicrously blamed Israel” for the pandemic, said U.S. Secretary of State Michael Pompeo in his opening remarks at the “Ancient Hatred, Modern Medium: Conference on Internet Anti-Semitism.”

The seven-hour conference featured recorded discussions and talks on a host of topics, from “The Psychology of Hate” to “Building Coalitions and Alliances” to the “Legal Framework” for combating anti-Semitism. Experts came from across the globe and included members of the U.S. Congress, Israeli government officials, representatives from TikTok and Facebook, university professors and others.

“The Internet can be a tremendous force for good. It can bring people together and allow everyone a voice,” said Michael Gove, a member of Parliament in the United Kingdom. “But it is also the case that the Internet can be a cause for evil” and recruit “susceptible minds” to the “gospel of evil.”

Joel Finkelstein of the Rutgers Institute for Secure Communities and director of the Network Contagion Research Institute said his group uses artificial intelligence to “track, expose and combat hate.” He explained that hate groups often use code words and memes, including colorful cartoon images, to get across their agendas and increase their appeal.

Further, there seems to be a correlation between an uptick in violent speech online and an increase in violent actions. For instance, in 2018, his group warned of increased anti-Semitism on a platform used by far-right activists, and shortly afterwards, the deadly shooting at the Tree of Life*Or L’Simcha Synagogue in Pittsburgh took place during Shabbat-morning services. Similarly, after an increase in talk about a race war came an uptick in militia groups appearing at public events and “threatening to attack people.”

Still, said Finkelstein, it’s not just the far right using the web. He noted a “parallel structure for left-wing violence, leftist anarchists and Socialists,” who tend to utilize their own code words and memes.

‘Guidance for reviewers looking at content’

According to Peter Stern, director of stakeholder engagement and content policy at Facebook, 95 percent of hate speech is removed by automation, and the company is trying to make it harder for administrators of groups that have been removed to create new groups.

“It’s easy to forget that at the end of the day, we have to come up with guidance for the more than 15,000 reviewers who will be looking at content every day,” said Stern, adding that reviewers need concrete guidance rather than being left to make judgment calls on their own.
The popular video platform TikTok is also working to combat hate on its platform. According to Jeff Collins, senior director of global trust and safety, it’s not enough to “disrupt the ecosystem of hate”; the company also tries to prevent the “amplification” of it.

One step it is taking is using algorithms to prevent “filter bubbles,” in which a user sees only one type of content. Watching, for example, cute cat videos and then getting more like-minded videos in an individual feed might not be problematic; the concern with a “filter bubble” is when it is used to target and amplify hate or other problematic content.

Collins also said it’s not always as simple as taking down content, both because of the sheer amount of material online and because “bad actors” are trying to circumvent security measures.

Another issue is defining what constitutes anti-Semitism. It’s a question that vexes not only social-media companies but many others as well, particularly as it relates to Israel and Zionism. That’s why many of the speakers stressed the importance of adopting one standard definition of anti-Semitism, specifically the International Holocaust Remembrance Alliance (IHRA) definition. However, a number of the speakers stressed, it is not enough to adopt the short definition alone; the extended definition, which includes examples of anti-Semitism in real-life speech, media and elsewhere, must be used as well. It is the extended definition that some balk at, claiming that it limits legitimate forms of expression and criticism of Israel; it was also a point that caused some bristling from conference speakers.

“IHRA allows for criticism of Israel, assuming it’s the same criticism as of other countries,” said speaker Alyza Lewin, president of the Louis D. Brandeis Center for Human Rights Under Law. She went on to say that if social-media companies recognize only some forms of anti-Semitism, they will “severely undermine their own ability to combat hate.”

Rep. Debbie Wasserman Schultz (D-Fla.), a member of the interparliamentary task force to combat online anti-Semitism who, along with her family, has been a target of anti-Semitism online, said it’s important to demand “real global accountability” in combating hate. The global reach of the Internet means that people can “spew hateful speech online” without fear of repercussions from the “seemingly protected place of their home,” she said. “It’s never been easier for anti-Semitism to spread.”

She also pushed for the adoption of the IHRA definition and said it’s vital to understand the difference between criticism of Israel and anti-Semitism.

Stern of Facebook said that “IHRA definition is helpful to us because it flags important issues we are dealing with,” and that “If you look at our policy and you look at IHRA definition, there is quite a bit of overlap.”

Address youth in school and on the sports field

Presenting a perspective from the Muslim world, H.E. Dr. Ali Rashid Al Nuaimi, a member of the Federal National Council in the United Arab Emirates, suggested that the way to combat online hate is to start with youth.

“We have to start from the schools—from within the educational system to make sure that teachers in the classroom are promoting these values: values of coexistence, of tolerance,” he said. “We have to engage, beside education, public figures. For example, in our region, football (soccer in the United States) is very famous. We should engage some of these public players because the new generation looks at them as role models.
Let them be engaged in some activities that promote co-existence.”

Al Nuaimi, who also serves as chairman of the steering board of Hedayah, the International Center of Excellence for Countering Violent Extremism, said it was also important that religious clergy from different faiths get involved and work on common initiatives and programs to help build tolerance.

While it may seem a daunting task, it is a battle that must be fought and won, stressed the speakers.

“The ancient curse of anti-Semitism is like a vampire; every time we think we’ve killed it and put it in the coffin, somehow it comes back to prowl the night,” said speaker Robert P. George, the McCormick Professor of Jurisprudence at Princeton University. “We never manage to get a dagger into the heart of the monster, but that does not mean we should give up; far from it.

“We need to learn to use social media to combat, and ultimately, defeat anti-Semitism.”