Mark Zuckerberg Continues to Miss the Point on Facebook as a Media Entity - Fortune

Posted: 17 Nov 2016 10:49 AM PST
As critics slam Facebook for the role they believe it, and in particular its penchant for fake news stories, played in the election of Donald Trump, CEO Mark Zuckerberg continues to resist any attempt to pin some of the blame on his company. But in doing so, he misses the point.

Over the weekend, the Facebook co-founder took to the site to respond to some of those criticisms. He said he “cares deeply” about the fake news problem and wants to get it right. But he also said that he doesn’t believe fake news contributed to the election’s outcome.

“Of all the content on Facebook, more than 99% of what people see is authentic,” Zuckerberg wrote. And since only a very small number of those hoaxes relate to politics, and an even smaller number relate to Clinton specifically, he argued that “this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.”

The Facebook CEO also scoffed at the suggestion that the company is actually a media entity, or that it should behave like one, something many have argued for some time. “Facebook is mostly about helping people stay connected with friends and family. News and media are not the primary things people do on Facebook, so I find it odd when people insist we call ourselves a news or media company,” Zuckerberg said. “We are serious about building planes to beam internet access, but we don’t call ourselves an aerospace company.”

Zuckerberg responded to the “filter bubble” argument by saying the company’s research shows that people “are exposed to more diverse content on Facebook and social media than on traditional media like newspapers and TV.” In other words, filter bubbles used to be worse.

All of these arguments are true to some extent, but they still miss the larger point. And if anything, they reinforce the idea that Zuckerberg is desperately trying to avoid responsibility for the way Facebook influences the media consumption of its users. Even some of his own staffers are wrestling with this problem, according to a New York Times report.

As a number of people have pointed out, the social network can’t have it both ways. It can’t argue that Facebook is hugely influential for advertisers, or that it plays a key role in social movements like the Arab Spring, and yet also argue that fake news has no effect on users.

To borrow Zuckerberg’s aerospace analogy, if Facebook flew planes and 1% of them crashed and killed all of their passengers, most people probably wouldn’t accept the argument that this is a small number, or that Facebook isn’t really an airplane company because its real focus is the technology that keeps planes from crashing most of the time.

In a sense, debating whether Facebook should call itself a media company or not is a red herring. Whatever it calls itself, it is one of the largest distributors of news that has ever existed.

This problem doesn’t just affect Facebook either. Google has also come under fire for pointing to fake news sites in Google News, including one site that topped its election-related results despite reporting the wrong numbers. The company said it is “looking into” what happened.

In today’s news environment, traditional outlets have lost control over the distribution of their content, and platforms like Facebook and Google have taken over. In the process, they have become the most powerful media companies in history. And that brings with it certain responsibilities.

It’s obvious why Facebook doesn’t want to admit that it has these responsibilities, of course.
As a prominent Silicon Valley venture capitalist and Facebook investor told my colleague Jeff Roberts recently, admitting this would open the company up to an avalanche of criticism for being biased toward one side or the other. And that would be a recipe for disaster.

That’s exactly what happened during the Trending Topics fiasco, when editors responsible for weeding out bad sites were accused of bias against conservative sites. Facebook’s response was to fire them all and sweep the problem under the rug by pretending it could all be done algorithmically.

According to a report from Gizmodo, Facebook’s engineers actually developed a way of using its algorithms to detect and block fake news, but a test of the feature found that a disproportionate number of those stories came from right-wing sites. Because the social network was afraid of more backlash over allegations of bias, Gizmodo says, the feature was never released (Facebook has since denied that this occurred).

In his post, Zuckerberg said the company is working on ways for users to flag fake news. But are users who see fake articles that reinforce their beliefs going to flag them? That seems unlikely.

The core problem for Facebook is that what it wants more than anything else is engagement, and as a former Facebook staffer argued recently, “bullshit is highly engaging.” That means that despite Zuckerberg’s protest that he cares deeply about the problem, finding a solution is never going to be as economically advantageous as not finding one.

One other point: The Facebook CEO says he doesn’t want his company to become “arbiters of truth.” But the reality is, as technology publisher Tim O’Reilly pointed out in a recent blog post, that the social network is already doing this on a regular basis by deleting posts and censoring content it deems unacceptable.

Figuring out what “fake news” consists of and flagging it in some way, or even down-ranking it in the news feed, may be a difficult technological challenge. And it may expose Facebook to charges of being biased in one direction or another. But the status quo is not going to work.

Regardless of whether he personally wants to grapple with this thorny problem or not, Zuckerberg will have to admit that Facebook is a large and increasingly powerful player in the news business, and that certain duties and responsibilities come along with that. And the sooner he does so, the better.

This article originally appeared on Fortune.com