
‘The Ladder Down to Hell’: How Social Media Breeds Hate Speech

Published: March 25, 2019 • Updated: January 12, 2022

Author: Dr. Omar Suleiman

بِسْمِ اللهِ الرَّحْمٰنِ الرَّحِيْمِ

In the name of God, the Most Gracious, the Most Merciful.

This article was originally published by Bethania Palma for Snopes.com.

Islamic scholar and civil rights activist Omar Suleiman was watching in horror as details of the 15 March 2019 mosque massacre in Christchurch, New Zealand, unfolded when he received a text message from a Jewish friend who is a rabbi and Suleiman’s counterpart in the interfaith activist community. “I’m saddened and horrified by the murders in New Zealand. I’m here for you and our Muslim neighbors,” she wrote.

Scrolling back through his phone, Suleiman noticed something chilling: so little time had passed since a white supremacist gunned down 11 worshipers in a Pittsburgh synagogue that “literally she was sending me the same messages I was sending her” after that shooting, Suleiman observed.

Occurring a world apart, hate-driven mass murders in houses of worship punctuated the end of 2018 and beginning of 2019. Robert Bowers, charged with gunning down 11 worshipers at the Tree of Life synagogue in Pittsburgh in October 2018, is a 46-year-old American, while Brenton Tarrant, charged with killing 50 people in the mid-March 2019 rampage in Christchurch, is a 28-year-old Australian national. What links them is that both spent large amounts of time on social media platforms spewing and marinating in hate.

Both men subscribed to prominent white supremacist conspiracy theories contending that a plot was afoot to phase out white people. Bowers believed Jewish people were orchestrating Central American caravans in a scheme to increase non-white immigration to the U.S. in order to “replace” whites, while Tarrant believed that non-white immigrants and shifting demographics in predominantly white countries such as New Zealand were evidence of a “white genocide.”

The killings also coincided with a surge of anti-Muslim hate crimes in other regions of the world, including the United Kingdom and Canada, while in the U.S., such crimes have spiked to all-time highs. Meanwhile, researchers at the University of Warwick in England have found a strong statistical correlation between tweets posted by U.S. President Donald Trump on Islam-related topics and anti-Muslim hate crimes.
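To give a concrete sense of what a “statistical correlation” between two such series means, here is a minimal sketch in Python using entirely made-up weekly counts. It is not the Warwick team’s data or methodology (their study relied on far more rigorous statistical controls); it only illustrates the basic statistic behind the claim.

```python
# Minimal sketch with hypothetical data: Pearson correlation between two weekly
# time series -- counts of Islam-related tweets and reported anti-Muslim incidents.
# This is NOT the Warwick study's data or method; it only illustrates the statistic.

def pearson(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Hypothetical counts, aligned week by week.
tweets_per_week = [2, 0, 5, 1, 7, 3, 6, 0]
incidents_per_week = [4, 1, 9, 3, 12, 5, 10, 2]

print(f"Pearson r = {pearson(tweets_per_week, incidents_per_week):.2f}")
# A value near +1 means the two series rise and fall together; correlation on
# its own does not establish causation.
```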

The New Zealand massacre drew attention to what Suleiman sees as related problems: the proliferation of anti-Muslim hate speech on the internet, the mainstreaming of such rhetoric, and the willingness of some who consume such material to take that online activity into the real world with acts of violence and intimidation.

Referring to an echo chamber of social media personalities and websites that traffic in anti-Muslim conspiracy theories, Suleiman said, “The Islamophobia industry takes one lie and spins that out to create 100 articles and links, so even if you clarify the one lie, 100 articles pollute Google searches.” The goal, he averred, was to create so much online content that those who try to set the record straight can’t keep up.

He pointed to websites dedicated to propagating a conspiracy theory that Muslims in the U.S. (who account for just over 1 percent of the population) are devising a plot to take over the country, and sites that fabricate narratives of Muslims as barbaric or violent.

Islamophobes have “created so much suspicion about the Muslim community that the natural reaction from their readers is to look at us as the worst of the worst,” he said. “Even those that should be our allies keep a distance from us because of all the information that’s on the internet. We’re embraced in these broad coalitions of diversity, but at the end of the day, people don’t want to touch us because of the fear that’s been generated about us.”

As a result of such activity, Suleiman noted, “The mosque attack was not surprising. The only thing that’s surprising is that it happened in New Zealand.”

The Spread of Hate Online

In response to our query about these concerns, a representative from Google told us that the technology giant has been working to remove hateful content from its products, to promote high-quality, authoritative sources in search results, and to prohibit and remove hate speech and other material intended to incite violence. Google has also removed search autocomplete predictions that are hateful toward groups or individuals. A spokesperson for YouTube, which is owned by Google, also told us that the platform was taking a tougher stance on user-flagged videos that do not technically violate its policies but nonetheless include controversial religious or white supremacist content.

Just two days after the New Zealand massacre, Breitbart published a selectively framed story reporting that “Nigerian Muslim militants” had killed 120 Christians in the space of three weeks beginning in late February 2019. As Snopes.com reported, the Breitbart story failed to disclose that other forces, namely competition for land and resources, were at play in the long-running conflict, and that the largest atrocity recorded during February and March 2019 was the killing of 130 members of a mainly Muslim ethnic group by a mainly Christian group. Nonetheless, many social media users shared the Breitbart article while asserting that the news media were ignoring the killings of Christians in favor of sympathetically covering the massacre of Muslims in New Zealand.

“Memes about how many people have been killed by Muslims are definitely going around” on social media in the aftermath of New Zealand, said Elon University computer science Professor Megan Squire. “It’s whataboutism. It’s just classic, ‘Hey, look over there’ misdirection.”

The idea that the mainstream news media ignore crimes committed by Muslims is not supported by evidence. Research shows that crimes committed by Muslims receive vastly more media attention than similar acts committed by non-Muslims.

For instance, an April 2018 analysis of Washington Post and New York Times articles conducted by the Institute for Social Policy and Understanding (ISPU) found that plots by Muslims arrested on suspicion of planning violent acts received 770 percent more news coverage than similar plots by non-Muslims in the U.S., while violent acts by Muslim perpetrators received twice the media coverage of attacks carried out by non-Muslims. A January 2019 report published by the Anti-Defamation League (ADL) found that every fatal act of extremist violence in the U.S. in 2018 was linked to far-right ideologies.

“As a result, you have this situation where people are hyper-concerned about one type of crime that is in reality far less frequent because the media is so much more focused on it, while ignoring what is actually a greater threat,” said Dalia Mogahed, research director for ISPU, a think tank that researches public opinion and policies that impact Muslims.

In addition, Mogahed pointed to ISPU research showing that “on average, prosecutors [in the U.S.] sought three times the sentence length for Muslim perpetrators as for perpetrators not identified as Muslim for similar plots of attempted ideologically driven violence.” Additionally, “Muslim perpetrators received four times the average sentence as their non-Muslim counterparts for attempted plots of similar conduct.” Muslim perpetrators thus garner not only far more media attention but also heavier penalties, Mogahed said.

And when a Muslim commits violence, the entire Muslim community is expected to account for it, even though Muslims are statistically the group most likely to be victimized by terrorism worldwide, said Toronto-based journalist and writer Sarah Hagi.

“A big part of this is that we’re not included in these narratives that are created about us,” she said.

Only a minority of Americans believe false narratives about the Muslim community, according to ISPU research. But prejudice continues to be a significant problem amplified online: Squire’s research into extremist ideologies on Facebook found that anti-Muslim and anti-immigrant sentiments, which often overlap, serve as an ideological center for hate groups.

In a 26 September 2018 research paper published at Elon University, Squire concluded in an analysis of Facebook groups based on right-wing extremist ideologies that “groups with nativist bias, in particular anti-Muslim groups, are front and center in far-right politics. Our analysis also shows that anti-Muslim groups attract the same audiences as other extremist ideologies, including secessionist neo-Confederates, militant anti-government conspiracy theorists, and racist white nationalists. We show that some of the anti-Muslim groups even serve as a convenient lingua franca; their brand of hate is a common denominator that ties people of disparate ideologies together.”
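The overlap Squire describes can be pictured with a simple co-membership measure. The sketch below uses invented group names and member IDs rather than her dataset; it computes Jaccard similarity between the audiences of hypothetical ideological categories, and a category that overlaps with all the others plays exactly the “common denominator” role the paper attributes to anti-Muslim groups.

```python
# Minimal sketch of audience-overlap analysis with invented data (not Squire's
# dataset or exact method): Jaccard similarity between the member sets of
# hypothetical ideological group categories.

from itertools import combinations

group_members = {
    "anti-muslim":       {"u1", "u2", "u3", "u4", "u5"},
    "neo-confederate":   {"u2", "u6", "u7"},
    "anti-government":   {"u3", "u4", "u8"},
    "white-nationalist": {"u1", "u5", "u9"},
}

def jaccard(a, b):
    """Share of users two audiences have in common, from 0.0 (disjoint) to 1.0 (identical)."""
    return len(a & b) / len(a | b)

# A category with nonzero overlap against every other one acts as the
# "common denominator" tying otherwise separate audiences together.
for (name_a, set_a), (name_b, set_b) in combinations(group_members.items(), 2):
    print(f"{name_a} <-> {name_b}: {jaccard(set_a, set_b):.2f}")
```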

“Anti-Muslim groups weren’t the first ones on my radar,” she told us. “What I started to notice was that every single [far-right] group, no matter what the ideology, were all hating on Muslims.”

And she noticed that when she started building up a sample collection of anti-Muslim Facebook groups, “the language and images they used were way off-the-charts worse than the other groups on Facebook.”

To make matters worse, once users start looking at one such group, they’ll get suggestions for other similar ones.

“That’s the ladder down to hell,” Squire said. “You start browsing one group and you’re in that niche, fetish kind of place where you’ve been transported magically to the place you never knew existed. Now you can find other people that are also into that. It’s all the wonder and promise of the internet, but to this disastrous effect.”
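At bottom, the suggestion mechanism Squire is describing behaves like a “people who joined this group also joined” recommender. The sketch below uses hypothetical membership data and is not Facebook’s actual (and non-public) system; it only shows how co-membership alone can walk a user from one niche group toward adjacent ones.

```python
# Minimal sketch of a co-membership recommender with hypothetical data.
# NOT Facebook's actual suggestion system; illustration only.

from collections import Counter

memberships = {                      # user -> groups joined (invented)
    "u1": {"group_a", "group_b"},
    "u2": {"group_a", "group_b", "group_c"},
    "u3": {"group_b", "group_c"},
    "u4": {"group_a", "group_d"},
}

def suggest(current_group, top_n=3):
    """Rank other groups by how many of current_group's members also joined them."""
    counts = Counter()
    for groups in memberships.values():
        if current_group in groups:
            counts.update(groups - {current_group})
    return [group for group, _ in counts.most_common(top_n)]

print(suggest("group_a"))            # e.g. ['group_b', 'group_c', 'group_d']
```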

A spokesperson for Facebook told us the platform has been working with civil rights organizations to adjust its policy so that Facebook will now remove events in which people are encouraged to bring weapons to intimidate or harass others. Facebook also reported that they removed 2.9 million hate-speech posts in the third quarter of 2018.

“Hate speech goes against our Community Standards and is not allowed on Facebook,” the spokesperson said in an emailed statement. (Facebook generally comments through unnamed members of their public relations staff.) “We take action on content and accounts that break our Community Standards as soon as we’re made aware of it. We continue to invest in our safety and security teams and the technical tools we use to proactively detect hate speech. We also actively work with a range of groups on ways to improve online safety.”

Nevertheless, Facebook was caught up in the New Zealand massacre after the shooter used Facebook Live to stream video of the attack in real time. Only 200 people watched the massacre live, and Facebook moderators took the video down 29 minutes after it began. But with the aid of extremists on the message board 8chan, the mass killer’s propaganda was spread and viewed millions of times. In response, Facebook declared in a blog post about the issue that:

Immediately after the attack, we designated this as a terror attack, meaning that any praise, support, or representation violates our Community Standards and is not permitted on Facebook. Given the severe nature of the video, we prohibited its distribution even if shared to raise awareness, or only a segment shared as part of a news report.

In the first 24 hours, we removed more than 1.2 million videos of the attack at upload, which were therefore prevented from being seen on our services. Approximately 300,000 additional copies were removed after they were posted.

We’ve been asked why our image and video matching technology, which has been so effective at preventing the spread of propaganda from terrorist organizations, did not catch those additional copies. What challenged our approach was the proliferation of many different variants of the video, driven by the broad and diverse ways in which people shared it[.]
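Matching systems of the kind Facebook alludes to generally work by fingerprinting content: a compact hash is computed from an image or a video frame and compared, within some tolerance, against hashes of known material. The sketch below is not Facebook’s technology; it is a minimal perceptual “average hash” in Python, assuming the Pillow imaging library and hypothetical file names, and it illustrates why heavily edited variants (crops, overlays, re-recordings) can drift far enough from the original fingerprint to evade a match.

```python
# Minimal sketch of perceptual ("average") hashing, assuming the Pillow library
# is installed. NOT Facebook's matching technology; it only illustrates why many
# altered variants of the same footage are hard to catch by fingerprint alone.

from PIL import Image

def average_hash(path, size=8):
    """Shrink to size x size grayscale, then set one bit per pixel by comparing
    it to the mean brightness. Returns a 64-bit integer fingerprint."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of bits on which two fingerprints differ."""
    return bin(h1 ^ h2).count("1")

# Hypothetical file names. A straight re-encode usually stays within a few bits
# of the original, but cropping, borders, overlays, or filming a screen can push
# the distance past any sensible threshold, so the variant is no longer matched.
original = average_hash("frame_original.png")
variant = average_hash("frame_reuploaded.png")
print("bits differing:", hamming_distance(original, variant))
```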

A Mountain of Bigotry

“I don’t think any of these platforms take it seriously enough. I don’t know what’s in it for them, but it shouldn’t have gotten to the point where something like this [New Zealand] was possible,” Hagi told us.

Oftentimes, smaller acts that don’t necessarily capture headlines combine to create a mountain of bigotry that Muslims face on a daily basis. Suleiman told us his home address had been posted online by anti-Muslim internet users, while his 9-year-old daughter had taken to looking over her shoulder during religious observances after being terrified by the sight of armed protesters at the family’s mosque.

“My daughter first saw the protesters when she was six. It really traumatized her,” Suleiman reported. When the family traveled to Pittsburgh to offer support to the grieving Jewish community there after the Tree of Life massacre, Suleiman said his daughter connected that atrocity to the armed protesters at their mosque in Texas.

“Her first question was, ‘Was it the same people?’ Her second was, ‘Did Donald Trump send them?’ That’s the point where I thought these kids are being forced to wonder about safety and see things no child should have to see.”

Sources
  • Levine, Mike. “An ‘Odd’ FBI Case Highlights the Impact of Anti-Muslim Bias in US.” ABC News, 19 March 2019.
  • Institute for Social Policy and Understanding. “Equal Treatment? Measuring the Legal and Media Responses to Ideologically Motivated Violence in the United States.” April 2018.
  • Chalabi, Mona. “Terror Attacks by Muslims Receive 357% More Press Attention, Study Finds.” The Guardian, 20 July 2018.
  • Hagi, Sarah. “The Dangerous Normalization of Islamophobia.” GQ, 19 March 2019.
  • Institute for Social Policy and Understanding. “American Muslim Poll 2018: Pride and Prejudice.” Accessed 21 March 2019.
  • Squire, Megan. “Network Analysis of Anti-Muslim Groups on Facebook.” Elon University, 26 September 2018.
