US rights group calls out Facebook over 'anti-Muslim hate' content
After congressional testimony by Facebook whistle-blower Frances Haugen about hate speech on the platform, a Muslim rights group in the US is calling for greater accountability from the social media giant.
Facebook has come under fire for allegations that it has done little to stop the spread of hate speech on its platform.
Haugen previously warned that hate content on Facebook is fanning violence in countries such as Myanmar, where ethnic cleansing against Rohingya Muslims has taken place.
US rights groups have also called out Facebook for the continued presence of anti-Muslim content on the site.
"It's very clear Facebook has played a very complicit role," Huzaifa Shahbaz, a research and advocacy coordinator at the Council on American-Islamic Relations (CAIR), told The New Arab. "Even after sustained pressure, anti-Muslim hate has continued."
CAIR is one of several civil rights organisations that have spoken out about what they describe as insufficient action by Facebook in addressing anti-Muslim hate speech, events, and forums.
In a press release earlier this month, it pointed to dehumanising language referring to Muslims in India on Facebook, which was followed by violence against them by extremists in the country.
Such content is also widely shared on WhatsApp and Instagram, both owned by Facebook.
A 2019 Southern Poverty Law Center investigation found that a closed Facebook group was used to organise the surveillance of mosques in southern and western US states.
In April, the civil rights group Muslim Advocates filed a lawsuit against Facebook, alleging that the company falsely promised to remove content that it learns violates its own standards and policies.
Facebook has defended its continued hosting of content deemed harmful by invoking free speech and saying that it is difficult to monitor all content in multiple languages.
Shahbaz, however, noted that other social media companies have taken more concrete steps to address hate speech on their platforms.
Twitter, YouTube, and Reddit have all taken action against users spreading hate and conspiracy theories.
"It's a really simple thing to do: just having a team to vet these groups," said Shahbaz.