");vwo_$('head').append(_vwo_sel);return vwo_$('head')[0] && vwo_$('head')[0].lastChild;})("HEAD")}}, R_940895_48_1_2_0:{ fn:function(log,nonce=''){return (function(x) { if(!vwo_$.fn.vwoRevertHtml){ return; }; var ctx=vwo_$(x),el; /*vwo_debug log("Revert","content",""); vwo_debug*/; el=vwo_$('[vwo-element-id="1742919897117"]'); el.revertContentOp().remove();})("HEAD")}}, C_940895_48_1_2_1:{ fn:function(log,nonce=''){return (function(x) {var el,ctx=vwo_$(x); /*vwo_debug log("editElement",".stylingblock-content-margin-cell > table:nth-of-type(1) > tbody:nth-of-type(1) > tr:nth-of-type(1) > td:nth-of-type(1) > div:nth-of-type(1) > div:nth-of-type(1) > h2:nth-of-type(1) > span:nth-of-type(1)"); vwo_debug*/(el=vwo_$(".stylingblock-content-margin-cell > table:nth-of-type(1) > tbody:nth-of-type(1) > tr:nth-of-type(1) > td:nth-of-type(1) > div:nth-of-type(1) > div:nth-of-type(1) > h2:nth-of-type(1) > span:nth-of-type(1)")).html("Hello! David Brancaccio here. Do you want instant access to the free online course - “Economics 101” - to understand basic economic concepts?");})(".stylingblock-content-margin-cell > table:nth-of-type(1) > tbody:nth-of-type(1) > tr:nth-of-type(1) > td:nth-of-type(1) > div:nth-of-type(1) > div:nth-of-type(1) > h2:nth-of-type(1) > span:nth-of-type(1)")}}, C_940895_63_1_2_0:{ fn:function(log,nonce=''){return (function(x) {var el,ctx=vwo_$(x); /*vwo_debug log("paste",".actions"); vwo_debug*/!(el=vwo_$(".actions")).parent().find('[vwo-op-1742589780926=""]').length&&el.after('
By submitting, you consent to receive information about MPR\'s programs and offerings. You may opt-out at any time clicking the unsubscribe link at the bottom of any email communication. View our Privacy Policy.
The Center for Countering Digital Hate says Facebook, Instagram, Twitter, YouTube and TikTok failed to act on nearly 90% of the posts containing “anti-Muslim hatred.”
Social media companies say they are working hard to prevent hate speech from being posted on their platforms, and to remove it when it does appear. But that remains an ongoing challenge, as they operate in numerous countries with many languages and social contexts.
A new report from the nonprofit Center for Countering Digital Hate reveals anti-Muslim hate speech and misinformation still proliferate online.
Imran Ahmed is the founder and CEO of the group. The following is an edited transcript of his conversation with Marketplace’s Kimberly Adams about CCDH’s latest research into the problem.
Imran Ahmed: We identified hundreds of bits of hatred on their platforms. Now, that in itself is problematic, that we could find it so easily, but then we reported it to the platforms using their own reporting tools, so [by] clicking “Report dangerous post.” We went back a few weeks later to check what action they had taken. What we found was really disturbing: that 9 out of 10 times, even when notified about the most egregious hatred — glorifying the terrorist at Christchurch [New Zealand], very, very dangerous conspiracy theories and extreme forms of hatred — 9 out of 10 times, they took no action whatsoever.
Kimberly Adams: And do you have any sense how that compares to the way that these platforms respond to other types of hate speech?
Ahmed: It is very comparable to the work that we’ve done studying antisemitism and misogyny. And in fact, it’s very similar numbers to those that we saw when we looked at COVID conspiracy theories and vaccine misinformation.
Adams: How have the platforms responded to your report so far?
Ahmed: To date, most of the platforms haven’t responded. Twitter said, “We know that we can do better.” YouTube appears to have told people that they have taken down a few of the videos, but they didn’t tell anyone whether they took them down after our report came out, which is what we believe happened. So with legislation coming into place around the world, whether that’s the United Kingdom’s Online Safety Bill, the European Union’s Digital Services Act, or a raft of bills that have been proposed in the U.S. Congress, I think we’re at the point now where it is time for them to take responsibility and show some accountability for the hatred that they allowed to proliferate on their platforms.
Adams: In Europe, the Digital Services Act is meant to hold social media companies more accountable for the content that’s on their platforms. What sort of impact do you think that could have on limiting anti-Muslim hate speech and hate speech in general?
Ahmed: Well, the key thing in most of the legislation is that it holds them to account to the standards that they set for themselves. And so if you fail to meet the standards that you’ve incorporated into your own community standards, you could be liable for damages. And that is, you know, that’s a perfectly reasonable way to regulate that industry.
Adams: What does this sort of hate speech look like across the different platforms? Are there differences in the way that it shows up depending on which platform you’re on?
Ahmed: The truth is that Twitter, for example, is used primarily to affect discourse, because Twitter is really quite a small platform. It has about 200 million users, and they tend to be richer, wealthier elites. So it’s where you go to try and affect elite discourse and political discourse, media discourse on Muslims. Facebook is where you drip, drip misinformation. You spread a bit of misinformation every day over a year, so that you slowly color the lens through which users see the world. Instagram, YouTube, TikTok, they’re often used as evidence points where people put richer information, misinformation, in order to persuade people, and that’s linked to from Twitter, Facebook and other spaces. So they all form part of a coherent ecosystem which is used by bad actors to spread misinformation and hatred. There is a real-world cost for hate online. The mobilizing ideology behind [the 2019] Christchurch massacre, the “great replacement” theory, was also the ideology that led to the massacre of Jewish people at the Tree of Life synagogue in Pittsburgh. If these platforms haven’t learned their lesson by now, we need the hard backstop of legislation and regulation to ensure they do.
In the full report, CCDH tracked a sample of 530 posts containing anti-Muslim hate speech or related content and found the images, videos and messages on Facebook, Instagram, Twitter and TikTok were viewed more than 25 million times.
A YouTube spokesperson responded to our request for comment and provided the following statement:
“YouTube’s hate speech and harassment policies outline clear guidelines prohibiting content that promotes violence or hatred against individuals or groups where religion, ethnicity or other protected attributes are being targeted. Of the videos flagged to us by CCDH, five have been removed for violating our hate speech policies and eight have been age-restricted.”
Jack Malon, YouTube spokesperson
YouTube also said that in the last quarter of 2021, it removed more than 410,000 videos for violating its hate speech and harassment policies and that 74% of those videos were taken down before they had more than 10 views.
While Twitter, TikTok, Facebook and Instagram did not provide us with public statements on the report ahead of our deadline, all have policies against hate speech on their platforms.