Facebook's plan to curb 'fake news' may create more problems than it solves
After coming under fire for the flood of fake election news, Facebook has announced new measures to respond to the problem, but they may not be enough to satisfy its critics.
"Normally we wouldn't share specifics about our work in progress, but given the importance of these issues and the amount of interest in this topic, I want to outline some of the projects we already have underway," Zuckerberg announced in a Nov. 21 post.
Those projects include improving technical systems "to detect what people will flag as false before they do it themselves." The social media company has also reached out to third-party fact-checkers to verify information, and it is now looking at ways to publicly label stories that have been flagged as false. In addition, Facebook has taken steps to disincentivize the production of fake news for profit by cutting unreliable providers off from its advertising platform.
John Pavlik, a professor of media and journalism at Rutgers University, said that Facebook's policy changes to curb fake news are a response to public pressure in the aftermath of the presidential election. "Facebook is responding to reaction that says, 'You're letting something very bad happen that potentially could have influenced the outcome of the election.'"
Following Donald Trump's victory, BuzzFeed looked into the impact of fake news on the 2016 election, finding that social media users engaged with fake election news more often than they engaged with news from traditional media outlets. BuzzFeed also investigated a hub of pro-Trump fake news sites in the Balkans churning out sensational headlines that may have given Trump a boost in the polls.
Mark Zuckerberg denied allegations that Facebook's practices contributed to the election outcome. He defended Facebook saying that only a small fraction of the content featured on the site is fraudulent. "Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other."
According to Pavlik, who is doubtful that fraudulent headlines swayed the 2016 election, "Fake news is influential, but in different ways." He continued that "there is a lot of evidence over the past century that mediated fake news that is perceived as news can definitely affect people, and there is in fact evidence that it can cause a panic."
In 1938, during the early days of radio broadcasting, CBS aired an adaptation of H.G. Wells's novel The War of the Worlds, interspersed with dramatized news bulletins describing an alien invasion. The broadcast became famous for causing panic among listeners who were unable to tell that it was fiction.
Within the relatively new space of digital media, Pavlik noted, "a lot of people don't necessarily have a lot of critical ability."
Many are pleased that Facebook is taking steps to address the proliferation of false news, but some of the proposed projects could open the door to unseen abuses.
Mark Glaser is the executive editor at MediaShift, an online platform providing media and technology analysis and education. He applauded Facebook for taking a "proactive" approach to fake news, "instead of denying that it's a problem," but said its remedies could have downsides.
When it comes to Facebook's program to make it easier for users to report fake stories and flag false or misleading content, it is possible that people may try to "game the system and mark true stories as false," according to Glaser. "Just like anything else, when you give people more power to moderate and tag, they can often get things wrong."
Another concern is over whom Facebook will cooperate with for third-party verification. In his Nov. 21 announcement, Zuckerberg indicated only that Facebook has "reached out" to fact-checking organizations.
Glaser stressed that Facebook "should provide more transparency about who is doing the fact-checking and how they are making decisions."
Prof. Joseph Turow, an expert in new media at the University of Pennsylvania's Annenberg School for Communication, explained that good fact-checking is a heavy lift.
What Facebook may be trying to accomplish is incredibly complicated. "I honestly don't know how you can automate the process of trying to check facts," Turow said. The Annenberg School operates the reputable online fact-checker FactCheck.org, but even they are only able to look at "a small, tiny percentage of some of the claims by various politicians."
The issue of fake news intersects with what Pavlik describes as "a crossroads" in journalism. Mainstream news providers are losing revenue and readership. Meanwhile, the volume of data passing through the internet was expected to exceed a zettabyte (1 trillion gigabytes) by the end of 2016, just one indicator that enough information, and enough variety of information, is available at the click of a button to feed almost any viewpoint.
The competitive information environment and consumers' unprecedented ability to tailor their news feeds to their worldview have also intersected with a heightened skepticism of traditional, mainstream news media, Turow explained.
"We are moving into an era, partly because of Donald Trump, but it certainly precedes him...where people are willing to throw overboard the kind of notion of a profession of journalism," he said. Within academia there has been a historical tendency to be skeptical of mass media and mass communications, but this can be taken to an extreme where all basic facts become disputed.
"Journalism as a profession exists, but the organizations that have historically supported them are going south pretty quickly," Turow said. "It is making the whole idea of being a journalist, a fact-checker like that an explorer of fact, precarious."
In an article published in August, William Davies, a professor at Goldsmiths, University of London, suggested that western democracies may be entering "the age of post-truth politics," fueled by the combination of populist movements and social media. "Individuals have growing opportunities to shape their media consumption around their own opinions and prejudices, and populist leaders are ready to encourage them."
These kinds of consequences raise the question of whether the new information space should remain a totally ungoverned wild west.
In the past, Facebook has been criticized for having too heavy a hand in mediating information on its platform. In August, Facebook changed its policies to reduce the role of human editors in writing the summaries of Trending news. Facebook assured its users that despite the changes, humans would still be involved "to ensure that the topics that appear in Trending remain high-quality."
Only hours after the policy change, fake news headlines were among the top stories.
This kind of action and reaction puts Facebook in a difficult spot. It wants to remain an open space for all viewpoints, but it also appears to be taking on increasing editorial tasks.
Facebook may claim to be a technology company, but according to Pavlik, they are functionally a media company.
"I think they function as a media company, but they don't want to admit they're a media company," he said. "I think they're trying to avoid that responsibility."
Facebook's Zuckerberg has repeatedly emphasized that the platform does not seek to become a mediator for truth or fact. According to Facebook, it takes misinformation "seriously," but the CEO stops far short of making any kind of commitment to seek out truth and facts, saying, "our goal is to show people the content they will find most meaningful."
Despite the pervasiveness of false or misleading news, no source has cornered the market on truth.
"The bigger problem is that people don't trust mainstream media, so they are looking for news that fits with their agenda -- and that often means spreading fake news because it sounds right to their worldview," Glaser said. If mainstream media wants to repair its reputation, it has to clean house as well, providing transparent, fact-based news "without spreading the latest rumors in a search for more page views."