Is Facebook Even Equipped to Regulate Hate Speech and Fake News?


On Monday, Facebook booted alt-right conspiracy theorist Alex Jones, pulling four of his pages from the social network for “glorifying violence” and “using dehumanizing language to describe people who are transgender, Muslims, and immigrants.” The ban coincides with similar moves by YouTube, Spotify and iTunes, and follows Facebook’s removal last week of four videos from Jones and Infowars for violating its hate-speech and bullying policies, at which point the company also slapped Jones with a 30-day suspension from the site.

Many are thankful that Facebook is finally addressing the issue, but critics say the move came late. Keegan Hankes, a senior research analyst at the Southern Poverty Law Center, tells Rolling Stone that Facebook has been particularly slow to remove the pages, “given all the noxious conspiracy theories that have been allowed to proliferate on the platform from Jones,” including his vile claim that the 2012 Sandy Hook massacre was a hoax. (Parents of victims are currently suing Jones for defamation.) For Facebook to remove Jones over hate speech rather than fake news is “surprising,” Hankes says, and “not a coherent line of enforcement.”

Confronted with the SPLC’s position, a Facebook spokesperson provided Rolling Stone with the company’s community-standards policies on fake news and hate speech. In essence, Facebook’s fact-checkers use a rating system to assess content accuracy and, if they find falsehoods, can push that content down the News Feed. Repeat offenses can cost a page its right to advertise. Facebook defines hate speech as a direct attack on people based on “protected characteristics,” such as race, religion, gender, sexual orientation and, sometimes, immigration status, and it removes hate groups that call for or carry out imminent violence against people based on those characteristics.


In other words, a person posting misinformation might find their content pushed farther down the News Feed, but it will not be pulled unless it constitutes hate speech or incites imminent violence.

Alex Jones addresses a Los Angeles crowd in 2016.

The Facebook spokesperson says context matters and that enforcement is nuanced. But as Hankes sees it, Jones made a business of promoting hate speech on the site, and SPLC-designated U.S.-based hate groups like The Right Stuff and the Proud Boys still operate on the platform despite inciting imminent violence. “We must have different definitions of hate speech and hate groups,” Hankes says, his frustration reflecting years of disagreement between the SPLC and Facebook over who should be removed. (The SPLC defines hate groups as having “beliefs or practices that attack or malign an entire class of people, typically for their immutable characteristics.”)

Despite Facebook’s rationale, the SPLC is not alone in being confused by how the company handles misinformation, fake news, hate speech and hate groups. The world’s largest social network has been under fire for mishandling Russia’s use of the platform in the 2016 presidential election, for the Cambridge Analytica debacle, for misinformation that led to real-world violence in countries such as India, Myanmar and Sri Lanka, and for serving as a funhouse for #Pizzagate believers like Jones.

Facebook itself has been unclear about how it will enforce its own policies. Last month, Facebook CEO Mark Zuckerberg said in a Recode interview that “as abhorrent as some of this content can be, I do think that it gets down to this principle of giving people a voice.” Zuckerberg added that, as a Jew, he found Holocaust deniers “deeply offensive,” but that taking offense did not warrant removing content if people were simply getting information wrong. “It’s hard to impugn intent and to understand intent,” Zuckerberg said.


For Hankes, Zuckerberg’s comments on allowing Holocaust deniers to remain on the site are “really troubling” and show that he “is willing to give the benefit of the doubt to far-right actors.” Others, like Vera Eidleman, a fellow at the American Civil Liberties Union, acknowledge the blunders yet back the company’s reluctance to restrict content. “Given its role as a modern public square, Facebook should resist calls to censor offensive speech,” Eidleman writes in an email to Rolling Stone. She points to how Facebook’s “decisions to silence individuals for posting hateful speech have backfired time and time again,” including the recent shutdown of the counter-rally page “No Unite The Right 2” for “inauthentic” behavior after anonymous, fake accounts tried to disrupt the real event.

So what, exactly, is Facebook’s responsibility? Back in April, Zuckerberg testified before Congress and promised lawmakers he would address the dire effects of misinformation. Facebook has since been on the hook for tackling fake news and hate groups and for safeguarding personal information. Whether it will succeed remains unknown.

It is not a question of Facebook’s sincerity but of its capability. Siva Vaidhyanathan, a professor of media studies at the University of Virginia and author of Antisocial Media: How Facebook Disconnects Us and Undermines Democracy, believes the main problem with Facebook is its grandiose attempt to operate at such enormous scale. He tells Rolling Stone that the social network has volunteered to rid the German elections of anti-immigration sentiment, keep the Russians out of French elections and block foreign advertisements in the Irish abortion referendum. Now, Facebook is trying not to screw up the U.S. midterm elections while clumsily enforcing its policies. “Facebook is trying desperately to make sense of this monster, and it just refuses to concede that it’s too big to govern,” Vaidhyanathan says.


Sure, Zuckerberg believes that he can “give people the power to build a community and bring the world closer together.” But does that mean people should rely on him to stop election interference or protect them from hate groups and ranting windbags like Jones? “We’ve all been tricked to look for companies to be ethical and responsible when in fact that’s the job of the State,” Vaidhyanathan says. “We should expect companies to do everything they can to maximize revenue and look to the State to curb their excesses and punish their violations of rules and laws.”

Since Zuckerberg’s trip to Congress, Brookings Institution research fellow Nicol Turner-Lee says she has seen “movement” among senators and representatives interested in passing privacy and speech legislation, but she doubts anything will pass soon. “Legislators are going through the same type of ambivalence as the companies about what they should do in terms of regulating free speech,” Turner-Lee says. “The question of [why] a private corporation wants to put its feet in the middle of unresolved, unsettled debates on the order of power and wealth and race and speech in this world is beyond me.”


