Can Facebook Fix Its Own Worst Bug?
In early January, I went to see Mark Zuckerberg at MPK20, a concrete-and-steel building on the campus of Facebook’s headquarters in Menlo Park, California. The Frank Gehry-designed building has a pristine nine-acre rooftop garden, yet much of the interior appears unfinished. Many of the internal walls are unpainted plywood. The space looks less like the headquarters of one of the world’s wealthiest companies and more like a Chipotle with standing desks. It’s an aesthetic meant to reflect one of Facebook’s founding ideologies: that things are never quite finished, that nothing is permanent, that you should always look for a chance to take an ax to your surroundings.
The mood in overwhelmingly liberal Silicon Valley at the time, days before Donald Trump’s inauguration, was grim. But Zuckerberg is preternaturally unable to look anything other than excited about the future. “Hey, guys!” he beamed, greeting me and Mike Isaac, a Times colleague who covers Facebook.
“2016 was an interesting year for us,” he said as the three of us, plus a public relations executive, sat in a glass-walled conference room. (No one, not even Zuckerberg, has a private office.) It was an understatement and a nod to the obvious: Facebook had become a global political and cultural force, and the full implications of that transformation had begun to come into view last year. “If you look at the history of Facebook, when we started off, there really wasn’t news as part of it,” Zuckerberg went on. But as Facebook grew and became a bigger part of how people learn about the world, the company had been slow to adjust to its new place in people’s lives. The events of 2016, he said, “set off a number of conversations that we’re still in the middle of.”
Nearly 2 billion people use Facebook every month, about 1.2 billion of them daily. The company, which Zuckerberg co-founded in his Harvard dorm room 13 years ago, has become the largest and most influential entity in the news business, commanding an audience greater than that of any American or European television news network, any newspaper in the Western world and any online news outlet. It is also the most powerful mobilizing force in politics, and it is fast replacing television as the most consequential entertainment medium. Just five years after its initial public offering, Facebook is one of the 10 highest market-capitalized public companies in the world.
But over the course of 2016, Facebook’s gargantuan influence became its biggest liability. During the U.S. election, propagandists — some working for money, others for potentially state-sponsored lulz [mischief] — used the service to turn fake stories into viral sensations, like the one about Pope Francis’ endorsing Trump (he hadn’t). With its huge reach, Facebook has begun to act as the great disseminator of misinformation and half-truths swirling about the rest of media. It sucks up lies from cable news and Twitter, then precisely targets each lie to the partisan bubble most receptive to it.
After studying how people shared 1.25 million stories during the campaign, a team of researchers at Massachusetts Institute of Technology and Harvard implicated Facebook and Twitter in the larger failure of media in 2016, finding that social media created a right-wing echo chamber: a “media network anchored around Breitbart developed as a distinct and insulated media system, using social media as a backbone to transmit a hyperpartisan perspective to the world.” After the election, former President Barack Obama bemoaned “an age where there’s so much active misinformation and it’s packaged very well and it looks the same when you see it on a Facebook page or you turn on your television.”
Zuckerberg offered a few pat defenses of Facebook’s role. “I’m actually quite proud of the impact that we were able to have on civic discourse over all,” he said in January. Misinformation on Facebook was not as big a problem as some believed it was, but Facebook nevertheless would do more to battle it, he pledged.
It was hard to tell how seriously Zuckerberg took the criticisms of his service and its increasingly paradoxical role in the world. Across the globe, Facebook now seems to benefit actors who want to undermine the global vision at its foundation. Supporters of Trump, European right-wing nationalists who aim to turn their nations inward and dissolve alliances, even ISIS, with its skillful social-media recruiting and propagandizing, have all sought to split the Zuckerbergian world apart. And they are using his own machine to do it.
SINCE ELECTION DAY Silicon Valley has been consumed with a feeling of complicity. Trump had benefited from a media environment that is now shaped by Facebook — and, more to the point, shaped by a single Facebook feature, the same one to which the company owes its remarkable ascent to social-media hegemony: the computationally determined list of updates you see every time you open the app. The list has a formal name, News Feed. But most users are apt to think of it as Facebook itself.
If it’s an exaggeration to say that News Feed has become the most influential source of information in the history of civilization, it is only slightly so. Facebook created News Feed in 2006 to solve a problem: In the social-media age, people suddenly had too many friends to keep up with. To figure out what any of your connections were up to, you had to visit each of their profiles to see if anything had changed. News Feed fixed that. Every time you open Facebook, it hunts through the network, collecting every post from every connection. Then it weighs the merits of each post before presenting you with a feed sorted in order of importance: a hyperpersonalized front page designed just for you.
Scholars and critics have been warning of the solipsistic irresistibility of algorithmic news at least since 2001, when the constitutional-law professor Cass R. Sunstein warned, in his book “Republic.com,” of the urgent risks posed to democracy “by any situation in which thousands or perhaps millions or even tens of millions of people are mainly listening to louder echoes of their own voices.” (In 2008, I piled on with my own book, “True Enough: Learning to Live in a Post-Fact Society.”) In 2011, the digital activist and entrepreneur Eli Pariser gave this phenomenon a memorable name in the title of his own book: “The Filter Bubble.”
Facebook says its own researchers have been studying the filter bubble since 2010. In 2015, they published an in-house study, which was criticized by independent researchers, concluding that Facebook’s effect on the diversity of people’s information diet was minimal. When News Feed did show people views contrary to their own, they tended not to click on the stories. For Zuckerberg, the finding let Facebook off the hook.
Then, last year, Facebook’s domination of the news became a story itself. In May, Gizmodo reported that some editors who had worked on Facebook’s Trending Topics section had been suppressing conservative points of view. To smooth things over, Zuckerberg convened a meeting of conservative media figures and eventually significantly reduced the role of human editors. Then in September, Facebook deleted a post that included the photojournalist Nick Ut’s iconic photo of a naked 9-year-old girl, Phan Thi Kim Phuc, running in terror after a napalm attack during the Vietnam War, on the grounds that it ran afoul of Facebook’s prohibition of child nudity.
Facebook, under criticism, reinstated the picture, but the photo incident highlighted the difficulty of building a policy framework for what Facebook was trying to do. Zuckerberg wanted to become a global news distributor that is run by machines, rather than by humans who would try to look at every last bit of content and exercise considered judgment. “It’s something I think we’re still figuring out,” he told me in January. “There’s a lot more to do here than what we’ve done. And I think we’re starting to realize this now as well.”
It struck me as an unsatisfying answer, and it became apparent that Zuckerberg seemed to feel the same way. A month after the first meeting, Zuckerberg wanted to chat again.
THE ZUCKERBERG WHO GREETED us was less certain in his pronouncements, more questioning. Earlier, Zuckerberg’s staff had sent me a draft of a 5,700-word manifesto that, I was told, he spent weeks writing. The document, “Building Global Community,” argued that until now, Facebook’s corporate goal had merely been to connect people. According to the manifesto, Facebook’s next focus will be developing the social infrastructure for community — “for supporting us, for keeping us safe, for informing us, for civic engagement, and for inclusion of all.” If it was a nebulous crusade, it was also vast in its ambition.
“There are questions about whether we can make a global community that works for everyone,” Zuckerberg writes, “and whether the path ahead is to connect more or reverse course.” He also confesses misgivings about Facebook’s role in the news. “Giving everyone a voice has historically been a very positive force for public discourse because it increases the diversity of ideas shared,” he writes. “But the past year has also shown it may fragment our shared sense of reality.”
At the time, the manifesto was still only a draft. When I suggested that it might be perceived as an attack on Trump, he looked dismayed. A few weeks earlier, there was media speculation, fueled by a postelection tour of America by Zuckerberg and his wife, that he was laying the groundwork to run against Trump in 2020, and he took pains to shoot down the rumors.
If the company pursues the aims outlined in “Building Global Community,” the changes will echo across media and politics, and some are bound to be considered partisan. The risks are especially clear for changes aimed at adding layers of journalistic ethics across News Feed, which could transform the public’s perception of Facebook, not to mention shake the foundations of its business.
THE SOLUTION TO THE BROADER misinformation dilemma — the pervasive climate of rumor, propaganda and conspiracy theories that Facebook has inadvertently incubated — may require something that Facebook has never done: ignoring the likes and dislikes of its users. Facebook’s entire project, when it comes to news, rests on the assumption that people’s individual preferences ultimately coincide with the public good, and that if it doesn’t appear that way at first, you’re not delving deeply enough into the data. By contrast, decades of social-science research shows that most of us simply prefer stuff that feels true to our worldview even if it isn’t true at all and that the mining of all those preference signals is likely to lead us deeper into bubbles rather than out of them.
After the election, Margaret Sullivan, the Washington Post columnist and a former public editor of The Times, called on Facebook to hire an executive editor who would monitor News Feed with an eye to fact-checking, balance and editorial integrity. Jonah Peretti, the founder of BuzzFeed, told me that he wanted Facebook to use its data to create a kind of reputational score for online news.
Late last year, Facebook outlined a modest effort to curb misinformation. News Feed would now carry warning labels: If a friend shares a viral story that has been shot down by one of Facebook’s fact-checking partners (including Snopes and PolitiFact), you’ll be cautioned that the piece has been “disputed.” But even that slight change has been met with fury on the right, with Breitbart and The Daily Caller fuming that Facebook had teamed up with liberal hacks motivated by partisanship. If Facebook were to take more significant action, like hiring human editors or paying journalists, the company would instantly become something it has long resisted: a media company rather than a neutral tech platform.
IN MANY WAYS, THE WORRY over how Facebook changes the news is really a manifestation of a grander problem with News Feed, which is simply dominance itself. News Feed’s aggressive personalization wouldn’t be much of an issue if it weren’t crowding out every other source.
By my second meeting with Zuckerberg, Facebook had announced plans for the Facebook Journalism Project, in which the company would collaborate with news companies on new products. Facebook also created a project to promote “news literacy” among its users, and it hired the former CNN news anchor Campbell Brown to manage the partnership between it and news companies. Zuckerberg’s tone toward critics of Facebook’s approach to news had grown far more conciliatory. “I think it’s really important to get to the core of the actual problem,” he said. “I also really think that the core social thing that needs to happen is that a common understanding needs to exist. And misinformation I view as one of the things that can possibly erode common understanding. But sensationalism and polarization and other things, I actually think, are probably even stronger and more prolific effects. And we have to work on all these things. I think we need to listen to all the feedback on this.”
Still, Zuckerberg remained preoccupied with the kind of problems that could be solved by the kind of hyperconnectivity he believed in, not the ones caused by it. “There’s a social infrastructure that needs to get built for modern problems in order for humanity to get to the next level,” he said. “Having more people oriented not just toward short-term things but toward building the long-term social infrastructure that needs to get built across all these things in order to enable people to come together is going to be a really important thing over the next decades.”
Zuckerberg continued, “We’re getting to a point where the biggest opportunities I think in the world … problems like preventing pandemics from spreading or ending terrorism, all these things, they require a level of coordination and connection that I don’t think can only be solved by the current systems that we have.” What’s needed, he suggested, is some global superstructure to advance humanity.
Zuckerberg is arguing for a kind of digital-era version of the global institution-building that the Western world engaged in after World War II. But because he is a chief executive and not an elected president, there is something frightening about his project. He is positioning Facebook — and, considering that he commands absolute voting control of the company, himself — as a critical enabler of the next generation of human society. His mission drips with megalomania, albeit of a particularly sincere sort.
Building new “social infrastructure” usually involves tearing older infrastructure down. If you manage the demolition poorly, you might undermine what comes next. In the case of the shattering media landscape, Zuckerberg may yet come up with fixes for it. But in the meantime, Facebook rushes headlong into murky new areas, uncovering new dystopian possibilities at every turn.
A FEW MONTHS AFTER WE SPOKE, Facebook held its annual developer conference in San Jose, California. At last year’s show, Zuckerberg introduced an expanded version of Facebook’s live streaming service, which he had promised would revolutionize how we communicate. Live had generated iconic scenes of protest, but it had also been used to broadcast a terrorist attack in Munich and at least one suicide. Hours before Zuckerberg’s appearance, a Cleveland man who had killed a stranger and posted a video on Facebook shot himself after a manhunt.
But as he took the stage in San Jose, Zuckerberg was ebullient. For a brief moment, there was a shift in tone: Statesman Zuck. “In all seriousness, this is an important time to work on building community,” he said. He offered Facebook’s condolences to the family of the victim in Cleveland; the incident, he said, reminded Facebook that “we have a lot more to do.”
Zuckerberg then pivoted to Facebook’s next marvel, a system for digitally augmenting your pictures and videos. The technical term for this is “augmented reality.” The name bursts with dystopian possibilities — fake news on video rather than just text — but Zuckerberg never mentioned them. The statesman had left the stage; before us stood an engineer.