During a recent interview at the Aspen Ideas Festival, a Facebook executive provided a fascinating example of the kind of choices the company faces when combating the fake-news problem.
Facebook’s chief product officer, Chris Cox, said the company’s ability to outsource credibility rating to fact-checkers meant Facebook didn’t need to make “difficult calls” like whether a recent controversial Time cover was misleading.
“The partnership with fact-checkers means that we can rely on institutions that have standards, are showing their work, and allow us to not be in a situation where we feel like we need to be making these really difficult calls,” Cox said. “And they are difficult calls. I mean, the cover of Time magazine is a difficult call.”
He’s referring to a widely discussed photo illustration that showed President Donald Trump towering over a crying little girl against a red background.
The implication of Time’s illustration is that the girl was separated from her mother at the US-Mexico border under the Trump administration’s “zero tolerance” immigration policy, which separated nearly 3,000 children from their parents.
That conclusion would be misleading: A man identified as the girl’s father told Reuters that she hadn’t been separated from her mother and that she was being held with her mother at a detention facility in Dilley, Texas.
Details like these complicate the process of determining whether an illustration like Time’s cover is misleading. “It was part of the debate in the fact-checking community this week,” Cox said. (Snopes, a leading fact-checking website, wrote a blog post about the controversy on June 22.)
Though Facebook uses third-party fact-checkers to determine whether items should be demoted in the News Feed, it doesn’t fact-check photos in the US, so it did not demote the Time cover, a Facebook representative told Business Insider.
Time’s editor defended the cover, telling The Washington Post in a statement that “our cover and our reporting capture the stakes of this moment.”
Earlier this year, Facebook published for the first time the internal guidelines its moderators use to police the social network. The guide runs 64 pages, according to Cox, and includes this section specifically on “false news”:
“Reducing the spread of false news on Facebook is a responsibility that we take seriously. We also recognize that this is a challenging and sensitive issue. We want to help people stay informed without stifling productive public discourse. There is also a fine line between false news and satire or opinion. For these reasons, we don’t remove false news from Facebook but instead, significantly reduce its distribution by showing it lower in the News Feed.”
You can read the entire quote and the longer interview at Wired.