The streets of Davos, Switzerland, were iced over on the night of January 25, 2018, which added a slight element of danger to the prospect of trekking to the Hotel Seehof for George Soros’ annual banquet. The aged financier has a tradition of hosting a dinner at the World Economic Forum, where he regales tycoons, ministers, and journalists with his thoughts about the state of the world. That night he began by warning in his quiet, shaking Hungarian accent about nuclear war and climate change. Then he shifted to his next idea of a global menace: Google and Facebook. “Mining and oil companies exploit the physical environment; social media companies exploit the social environment,” he said. “The owners of the platform giants consider themselves the masters of the universe, but in fact they are slaves to preserving their dominant position … Davos is a good place to announce that their days are numbered.”

Across town, a group of senior Facebook executives, including COO Sheryl Sandberg and vice president of global communications Elliot Schrage, had set up a temporary headquarters near the base of the mountain where Thomas Mann put his fictional sanatorium. The world’s biggest tech companies often establish receiving rooms at the world’s biggest elite confab, but this year Facebook’s pavilion wasn’t the usual scene of airy bonhomie. It was more like a bunker—one that saw a succession of tense meetings with the same tycoons, ministers, and journalists who had nodded along to Soros’ broadside.

Over the previous year Facebook’s stock had gone up as usual, but its reputation was rapidly sinking toward junk bond status. The world had learned how Russian intelligence operatives used the platform to manipulate US voters. Genocidal monks in Myanmar and a despot in the Philippines had taken a liking to the platform. Mid-level employees at the company were getting both crankier and more empowered, and critics everywhere were arguing that Facebook’s tools fostered tribalism and outrage. That argument gained credence with every utterance of Donald Trump, who had arrived in Davos that morning, the outrageous tribalist skunk at the globalists’ garden party.

This article appears in the May 2019 issue of WIRED.


Mark Zuckerberg had recently pledged to spend 2018 trying to fix Facebook. But even the company’s nascent attempts to reform itself were being scrutinized as a possible declaration of war on the institutions of democracy. Earlier that month Facebook had unveiled a major change to its News Feed rankings to favor what the company called “meaningful social interactions.” News Feed is the core of Facebook—the central stream through which flow baby pictures, press reports, New Age koans, and Russian-made memes showing Satan endorsing Hillary Clinton. The changes would favor interactions between friends, which meant, among other things, that they would disfavor stories published by media companies. The company promised, though, that the blow would be softened somewhat for local news and publications that scored high on a user-driven metric of “trustworthiness.”
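The mechanics of a change like this can be pictured as a re-weighting step in a ranking function. The sketch below is purely illustrative (Facebook’s actual News Feed model, its factors, and their weights have never been public): friend content gets a boost, publisher content gets demoted, and trusted or local outlets claw some of that back.

```python
# Illustrative only: a toy ranking function with invented weights, showing the
# shape of the "meaningful social interactions" change described above.
def rank_score(post):
    score = post["base_relevance"]
    if post["source"] == "friend":
        score *= 1.5   # favor interactions between friends
    elif post["source"] == "publisher":
        score *= 0.6   # demote publisher stories overall...
        if post.get("trustworthiness", 0.0) > 0.7 or post.get("local", False):
            score *= 1.3   # ...but soften the blow for trusted/local outlets
    return score

feed = [
    {"id": "a", "source": "friend", "base_relevance": 1.0},
    {"id": "b", "source": "publisher", "base_relevance": 1.0, "trustworthiness": 0.9},
    {"id": "c", "source": "publisher", "base_relevance": 1.0, "trustworthiness": 0.2},
]
# Sort the feed so the friend post outranks both publishers, and the
# trusted publisher outranks the untrusted one.
ranked = sorted(feed, key=rank_score, reverse=True)
```

Even in this toy version, the publishers’ questions are visible in the code: everything hinges on who assigns `trustworthiness`, and on whether anyone outside the company ever gets to see it.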

Davos provided a first chance for many media executives to confront Facebook’s leaders about these changes. And so, one by one, testy publishers and editors trudged down Davos Platz to Facebook’s headquarters throughout the week, ice cleats attached to their boots, seeking clarity. Facebook had become a capricious, godlike force in the lives of news organizations; it fed them about a third of their referral traffic while devouring a greater and greater share of the advertising revenue the news industry relies on. And now this. Why? Why would a company beset by fake news stick a knife into real news? And what would Facebook’s algorithm deem trustworthy? Would the news executives even get to see their own scores?

Facebook didn’t have ready answers to all of these questions; certainly not ones it wanted to give. The last one in particular—about trustworthiness scores—quickly inspired a heated debate among the company’s executives at Davos and their colleagues in Menlo Park. Some leaders, including Schrage, wanted to tell publishers their scores. It was only fair. Also in agreement was Campbell Brown, the company’s chief liaison with publishers, whose job description includes absorbing some of the impact when Facebook and the news industry crash into one another.

But the engineers and product managers back at home in California said it was folly. Adam Mosseri, then head of News Feed, argued in emails that publishers would game the system if they knew their scores. Plus, they were too unsophisticated to understand the methodology, and the scores would constantly change anyway. To make matters worse, the company didn’t yet have a reliable measure of trustworthiness at hand.

Heated emails flew back and forth between Switzerland and Menlo Park. Solutions were proposed and shot down. It was a classic Facebook dilemma. The company’s algorithms embrace choices so complex and interdependent that it’s hard for any human to get a handle on it all. If you explain some of what is happening, people get confused. They also tend to obsess over tiny factors in huge equations. So in this case, as in so many others over the years, Facebook chose opacity. Nothing would be revealed in Davos, and nothing would be revealed afterward. The news execs would walk away unsatisfied.

After Soros’ speech that Thursday night, those same editors and publishers headed back to their hotels, many to write, edit, or at least read all the news pouring out about the billionaire’s tirade. The words “their days are numbered” appeared in headline after headline. The next day, Sandberg sent an email to Schrage asking if he knew whether Soros had shorted Facebook’s stock.

Far from Davos, meanwhile, Facebook’s product engineers got down to the precise, algorithmic business of implementing Zuckerberg’s vision. If you want to promote trustworthy news for billions of people, you first have to specify what is trustworthy and what is news. Facebook was having a hard time with both. To define trustworthiness, the company was testing how people responded to surveys about their impressions of different publishers. To define news, the engineers pulled a classification system left over from a previous project—one that pegged the news category as stories involving “politics, crime, or tragedy.”
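A leftover rule like that is simple enough to express in a few lines. The sketch below is a hypothetical reconstruction, assuming topic tags arrive from some upstream tagging step; the point is only how narrow the bucket was.

```python
# Hypothetical sketch of the leftover rule described above: a story counts as
# "news" only if it involves politics, crime, or tragedy.
NEWS_TOPICS = {"politics", "crime", "tragedy"}

def is_news(story_topics):
    """True if any of the story's topic tags fall in the narrow 'news' bucket."""
    return bool(NEWS_TOPICS & set(story_topics))
```

Under a rule like this, a health or science story simply is not “news,” and gets no protection from the demotion.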

That particular choice, which meant the algorithm would be less kind to all kinds of other news—from health and science to technology and sports—wasn’t something Facebook execs discussed with media leaders in Davos. And though it went through reviews with senior managers, not everyone at the company knew about it either. When one executive learned about it recently in a briefing with a lower-level engineer, they say they “nearly fell on the fucking floor.”

The confusing rollout of meaningful social interactions—marked by internal dissent, blistering external criticism, genuine efforts at reform, and foolish mistakes—set the stage for Facebook’s 2018. This is the story of that annus horribilis, based on interviews with 65 current and former employees. It’s ultimately a story about the biggest shifts ever to take place inside the world’s biggest social network. But it’s also about a company trapped by its own pathologies and, perversely, by the inexorable logic of its own recipe for success.

Facebook’s powerful network effects have kept advertisers from fleeing, and overall user numbers remain healthy if you include people on Instagram, which Facebook owns. But the company’s original culture and mission kept creating a set of brutal debts that came due with regularity over the past 16 months. The company floundered, dissembled, and apologized. Even when it told the truth, people didn’t believe it. Critics appeared on all sides, demanding changes that ranged from the essential to the contradictory to the impossible. As crises multiplied and diverged, even the company’s own solutions began to cannibalize each other. And the most crucial episode in this story—the crisis that cut the deepest—began not long after Davos, when some reporters from The New York Times, The Guardian, and Britain’s Channel 4 came calling. They’d learned some troubling things about a shady British company called Cambridge Analytica, and they had some questions.


It was, in some ways, an old story. Back in 2014, a young academic at Cambridge University named Aleksandr Kogan built a personality questionnaire app called thisisyourdigitallife. A few hundred thousand people signed up, giving Kogan access not only to their Facebook data but also—because of Facebook’s loose privacy policies at the time—to that of up to 87 million people in their combined friend networks. Rather than simply use all of that data for research purposes, which he had permission to do, Kogan passed the trove on to Cambridge Analytica, a strategic consulting firm that talked a big game about its ability to model and manipulate human behavior for political clients. In December 2015, The Guardian reported that Cambridge Analytica had used this data to help Ted Cruz’s presidential campaign, at which point Facebook demanded the data be deleted.

This much Facebook knew in the early months of 2018. The company also knew—because everyone knew—that Cambridge Analytica had gone on to work with the Trump campaign after Ted Cruz dropped out of the race. And some people at Facebook worried that the story of their company’s relationship with Cambridge Analytica was not over. One former communications official remembers being warned by a manager in the summer of 2017 that unresolved elements of the story remained a grave vulnerability. No one at Facebook, however, knew exactly when or where the unexploded ordnance would go off. “The company doesn’t know yet what it doesn’t know yet,” the manager said. (The manager now denies saying so.)

The company first heard in late February that the Times and The Guardian had a story coming, but the department in charge of formulating a response was a house divided. In the fall, Facebook had hired a brilliant but fiery veteran of tech industry PR named Rachel Whetstone. She’d come over from Uber to run communications for Facebook’s WhatsApp, Instagram, and Messenger. Soon she was working with Zuckerberg on public events, joining Sandberg’s senior management meetings, and making decisions—like picking which outside public relations firms to cut or retain—that normally would have rested with those officially in charge of Facebook’s 300-person communications shop. The staff quickly sorted into fans and haters.

And so it was that a confused and fractious communications team huddled with management to debate how to respond to the Times and Guardian reporters. The standard approach would have been to correct misinformation or errors and spin the company’s side of the story. Facebook ultimately chose another tack. It would front-run the press: dump a bunch of information out in public on the eve of the stories’ publication, hoping to upstage them. It’s a tactic with a short-term benefit but a long-term cost. Investigative journalists are like pit bulls. Kick them once and they’ll never trust you again.

Facebook’s decision to take that risk, according to multiple people involved, was a close call. But on the night of Friday, March 16, the company announced it was suspending Cambridge Analytica from its platform. This was a fateful choice. “It’s why the Times hates us,” one senior executive says. Another communications official says, “For the last year, I’ve had to talk to reporters worried that we were going to front-run them. It’s the worst. Whatever the calculus, it wasn’t worth it.”

The tactic also didn’t work. The next day the story—focused on a charismatic whistle-blower with pink hair named Christopher Wylie—exploded in Europe and the United States. Wylie, a former Cambridge Analytica employee, was claiming that the company had not deleted the data it had taken from Facebook and that it may have used that data to swing the American presidential election. The first sentence of The Guardian’s reporting blared that this was “one of the tech giant’s biggest ever data breaches” and that Cambridge Analytica had used the data “to build a powerful software program to predict and influence choices at the ballot box.”

The story was a witch’s brew of Russian operatives, privacy violations, confusing data, and Donald Trump. It touched on nearly all the fraught issues of the moment. Politicians called for regulation; users called for boycotts. In a day, Facebook lost $36 billion in its market cap. Because many of its employees were compensated based on the stock’s performance, the drop did not go unnoticed in Menlo Park.

To this emotional story, Facebook had a programmer’s rational response. Nearly every fact in The Guardian’s opening paragraph was misleading, its leaders believed. The company hadn’t been breached—an academic had fairly downloaded data with permission and then unfairly handed it off. And the software that Cambridge Analytica built was not powerful, nor could it predict or influence choices at the ballot box.

But none of that mattered. When a Facebook executive named Alex Stamos tried on Twitter to argue that the word breach was being misused, he was swatted down. He soon deleted his tweets. His position was right, but who cares? If someone points a gun at you and holds up a sign that says hand’s up, you shouldn’t worry about the apostrophe. The story was the first of many to illuminate one of the central ironies of Facebook’s struggles. The company’s algorithms helped sustain a news ecosystem that prioritizes outrage, and that news ecosystem was learning to direct outrage at Facebook.

As the story spread, the company started melting down. Former employees remember scenes of chaos, with exhausted executives slipping in and out of Zuckerberg’s private conference room, known as the Aquarium, and Sandberg’s conference room, whose name, Only Good News, seemed increasingly incongruous. One employee remembers cans and snack wrappers everywhere; the door to the Aquarium would crack open and you could see people with their heads in their hands and feel the warmth from all the body heat. After saying too much before the story ran, the company said too little afterward. Senior managers begged Sandberg and Zuckerberg to publicly confront the issue. Both remained publicly silent.

“We had hundreds of reporters flooding our inboxes, and we had nothing to tell them,” says a member of the communications staff at the time. “I remember walking to one of the cafeterias and overhearing other Facebookers say, ‘Why aren’t we saying anything? Why is nothing happening?’ ”

According to numerous people who were involved, many factors contributed to Facebook’s baffling decision to stay mute for five days. Executives didn’t want a repeat of Zuckerberg’s ignominious performance after the 2016 election when, mostly off the cuff, he had proclaimed it “a pretty crazy idea” to think fake news had affected the result. And they continued to believe people would figure out that Cambridge Analytica’s data had been useless. According to one executive, “You can just buy all this fucking stuff, all this data, from the third-party ad networks that are tracking you all over the planet. You can get way, way, way more privacy-violating data from all these data brokers than you could by stealing it from Facebook.”

“Those five days were very, very long,” says Sandberg, who now acknowledges the delay was a mistake. The company became paralyzed, she says, because it didn’t know all the facts; it thought Cambridge Analytica had deleted the data. And it didn’t have a specific problem to fix. The loose policies that allowed Kogan to collect so much data had been tightened years before. “We didn’t know how to respond in a system of imperfect information,” she says.

Facebook’s other problem was that it didn’t understand the wealth of antipathy that had built up against it over the previous two years. Its prime decisionmakers had run the same playbook successfully for a decade and a half: Do what they thought was best for the platform’s growth (often at the expense of user privacy), apologize if someone complained, and keep pushing forward. Or, as the old slogan went: Move fast and break things. Now the public thought Facebook had broken Western democracy. This privacy violation—unlike the many others before it—wasn’t one that people would simply get over.

Finally, on Wednesday, the company decided Zuckerberg should give a television interview. After snubbing CBS and PBS, the company summoned a CNN reporter who the communications staff trusted to be reasonably kind. The network’s camera crews were treated like potential spies, and one communications official remembers being required to monitor them even when they went to the bathroom. (Facebook now says this was not company protocol.) In the interview itself, Zuckerberg apologized. But he was also specific: There would be audits and much more restrictive rules for anyone wanting access to data. Facebook would build a tool to let users know if their data had ended up with Cambridge Analytica. And he pledged that Facebook would make sure this kind of debacle never happened again.

A flurry of other interviews followed. That Wednesday, WIRED was given a quiet heads-up that we’d get to chat with Zuckerberg in the late afternoon. At about 4:45 pm, his communications chief rang to say he would be calling at 5. In that interview, Zuckerberg apologized again. But he brightened when he turned to one of the topics that, according to people close to him, truly engaged his imagination: using AI to keep humans from polluting Facebook. This was less a response to the scandal than to the backlog of accusations, gathering since 2016, that Facebook had become a cesspool of toxic virality, but it was a problem he actually enjoyed figuring out how to solve. He didn’t think that AI could completely eliminate hate speech or nudity or spam, but it could get close. “My understanding with food safety is there’s a certain amount of dust that can get into the chicken as it’s going through the processing, and it’s not a large amount—it needs to be a very small amount,” he told WIRED.

The interviews were just the warmup for Zuckerberg’s next gauntlet: a set of public, televised appearances in April before three congressional committees to answer questions about Cambridge Analytica and months of other scandals. Congresspeople had been calling on him to testify for about a year, and he’d successfully avoided them. Now it was time, and much of Facebook was terrified about how it would go.

As it turned out, most of the lawmakers proved astonishingly uninformed, and the CEO spent most of the day ably swatting back soft pitches. Back home, some employees stood in their cubicles and cheered. When a plodding Senator Orrin Hatch asked how, exactly, Facebook made money while offering its services for free, Zuckerberg responded confidently, “Senator, we run ads,” a phrase that was soon emblazoned on T-shirts in Menlo Park.



The Saturday after the scandal broke, Sandberg told Molly Cutler, a top lawyer at Facebook, to create a crisis response team. Make sure we never have a delay responding to big issues like that again, Sandberg said. She put Cutler’s new desk next to hers, to guarantee Cutler would have no problem convincing division heads to work with her. “I started the role that Monday,” Cutler says. “I never made it back to my old desk. After a couple of weeks someone on the legal team messaged me and said, ‘You want to pack up your things? It seems like you are not coming back.’ ”

Then Sandberg and Zuckerberg began making a huge show of hiring humans to keep watch over the platform. Soon you couldn’t listen to a briefing or meet an executive without being told about the tens of thousands of content moderators who had joined the company. By the end of 2018, about 30,000 people were working on safety and security, which is roughly the number of newsroom employees at all the newspapers in the United States. Of those, about 15,000 are content reviewers, mostly contractors, employed at more than 20 giant review factories around the world.

Facebook was also working hard to create clear rules for enforcing its basic policies, effectively writing a constitution for the 1.5 billion daily users of the platform. The instructions for moderating hate speech alone run to more than 200 pages. Moderators must undergo 80 hours of training before they can start. Among other things, they must be fluent in emoji; they study, for example, a document showing that a crown, roses, and dollar signs might mean a pimp is offering up prostitutes. About 100 people across the company meet every other Tuesday to review the policies. A similar group meets every Friday to review content policy enforcement screwups, like when, as happened in early July, the company flagged the Declaration of Independence as hate speech.

The company hired all of these people in no small part because of pressure from its critics. It was also the company’s fate, however, that the same critics discovered that moderating content on Facebook can be a miserable, soul-scorching job. As Casey Newton reported in an investigation for The Verge, the average content moderator in a contractor’s outpost in Arizona makes $28,000 per year, and many of them say they have developed PTSD-like symptoms due to their work. Others have spent so much time looking through conspiracy theories that they’ve become believers themselves.

Ultimately, Facebook knows that the job will have to be done primarily by machines—which is the company’s preference anyway. Machines can browse porn all day without flatlining, and they haven’t learned to unionize yet. And so simultaneously the company mounted a huge effort, led by CTO Mike Schroepfer, to create artificial intelligence systems that can, at scale, identify the content that Facebook wants to zap from its platform, including spam, nudes, hate speech, ISIS propaganda, and videos of children being put in washing machines. An even trickier goal was to identify the stuff that Facebook wants to demote but not eliminate—like misleading clickbait crap. Over the past several years, the core AI team at Facebook has doubled in size annually.

Even a basic machine-learning system can pretty reliably identify and block pornography or images of graphic violence. Hate speech is much harder. A sentence can be hateful or prideful depending on who says it. “If you not my , then you are done,” could be a death threat, an inspiration, or a lyric from Cardi B. Imagine trying to decode a similarly complex line in Spanish, Mandarin, or Burmese. False news is equally tricky. Facebook doesn’t want lies or bull on the platform. But it knows that truth can be a kaleidoscope. Well-meaning people get things wrong on the internet; malevolent actors sometimes get things right.

Schroepfer’s job was to get Facebook’s AI up to snuff on catching even these devilishly ambiguous forms of content. With each category the tools and the success rate vary. But the basic technique is roughly the same: You need a collection of data that has been categorized, and then you need to train the machines on it. For spam and nudity these databases already exist, created by hand in more innocent days when the threats online were fake Viagra and sexy memes, not Vladimir Putin and Nazis. In the other categories you need to construct the labeled data sets yourself—ideally without hiring an army of humans to do so.
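The “labeled data in, classifier out” loop described here can be shown at toy scale. The sketch below trains a tiny bag-of-words naive Bayes model on a handful of hand-labeled posts; real systems use deep networks and millions of labels, and none of these examples or field names come from Facebook.

```python
# Toy version of the pipeline described above: hand-labeled examples go in,
# a classifier comes out. Naive Bayes over bags of words, nothing more.
import math
from collections import Counter, defaultdict

def train(examples):
    """examples: list of (text, label) pairs. Returns a simple count model."""
    word_counts = defaultdict(Counter)   # label -> word -> count
    label_counts = Counter()
    for text, label in examples:
        label_counts[label] += 1
        word_counts[label].update(text.lower().split())
    vocab = {w for counts in word_counts.values() for w in counts}
    return word_counts, label_counts, vocab

def classify(model, text):
    word_counts, label_counts, vocab = model
    total = sum(label_counts.values())
    best_label, best_logprob = None, -math.inf
    for label in label_counts:
        logprob = math.log(label_counts[label] / total)
        n_words = sum(word_counts[label].values())
        for w in text.lower().split():
            # Laplace smoothing so an unseen word doesn't zero out a class
            logprob += math.log((word_counts[label][w] + 1) / (n_words + len(vocab)))
        if logprob > best_logprob:
            best_label, best_logprob = label, logprob
    return best_label

labeled = [
    ("cheap pills buy now", "spam"),
    ("limited offer buy cheap pills", "spam"),
    ("lunch with friends today", "ok"),
    ("photos from our trip", "ok"),
]
model = train(labeled)
```

The hard part, as the article notes, is never the training loop; it is getting a labeled set that is large, clean, and honest about what counts as a violation.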

One idea Schroepfer discussed enthusiastically with WIRED involved starting off with just a few examples of content identified by humans as hate speech and then using AI to generate similar content and simultaneously label it. Like a scientist bioengineering both rodents and rat terriers, this approach would use AI to both create and identify ever-more-complex slurs, insults, and racist crap. Eventually the terriers, specially trained on superpowered rats, could be set loose across all of Facebook.
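The generate-and-label idea can be sketched in miniature. The mutation scheme here (leetspeak substitutions on a stand-in seed word) is invented for illustration; the point is only that a few human labels can be expanded into a much larger labeled set without more human work.

```python
# Rough sketch of the "rats and rat terriers" idea above: take a human-labeled
# seed, generate mutated variants, and label the variants automatically so the
# training set grows. The substitution table is invented for illustration.
SUBS = {"a": "@", "e": "3", "i": "1", "o": "0"}

def variants(seed):
    """Return the seed plus every leetspeak variant reachable via SUBS."""
    out = {seed}
    for ch, sub in SUBS.items():
        out |= {v.replace(ch, sub) for v in list(out)}
    return out

seeds = {"badword"}   # stand-in for a human-labeled slur
training_set = {(v, "hate") for s in seeds for v in variants(s)}
```

A real system would generate with a model rather than a substitution table, but the economics are the same: one human label fans out into many machine labels.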

The company’s efforts in AI that screens content were nowhere roughly three years ago. But Facebook quickly found success in classifying spam and posts supporting terror. Now more than 99 percent of content created in those categories is identified before any human on the platform flags it. Sex, as in the rest of human life, is more complicated. The success rate for identifying nudity is 96 percent. Hate speech is even tougher: Facebook finds just 52 percent before users do.
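Figures like these are proactive-detection rates: of the violating content that was ultimately acted on, the share the automated systems flagged before any user reported it. A minimal sketch of the arithmetic, with hypothetical field names:

```python
# Sketch of the metric behind those percentages: of all violating items that
# were acted on, what fraction did the automated system flag before any user
# report? (Field names here are hypothetical.)
def proactive_rate(actioned_items):
    flagged_first = sum(1 for item in actioned_items if item["found_by"] == "system")
    return flagged_first / len(actioned_items)

items = [
    {"id": 1, "found_by": "system"},
    {"id": 2, "found_by": "system"},
    {"id": 3, "found_by": "user_report"},
]
```

Note what the metric does not measure: violating content that neither the system nor any user ever catches is invisible to it, which is one reason such percentages flatter the detector.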

These are the kinds of problems that Facebook executives love to talk about. They involve math and logic, and the people who work at the company are some of the most logical you’ll ever meet. But Cambridge Analytica was mostly a privacy scandal. Facebook’s most visible response to it was to amp up content moderation aimed at keeping the platform safe and civil. Yet sometimes the two big values involved—privacy and civility—come into opposition. If you give people ways to keep their data completely secret, you also create secret tunnels where rats can scurry around undetected.

In other words, every choice involves a trade-off, and every trade-off means some value has been spurned. And every value that you spurn—particularly when you’re Facebook in 2018—means that a hammer is going to come down on your head.


Crises offer opportunities. They force you to make some changes, but they also provide cover for the changes you’ve long wanted to make. And four weeks after Zuckerberg’s testimony before Congress, the company initiated the biggest reshuffle in its history. About a dozen executives shifted chairs. Most important, Chris Cox, longtime head of Facebook’s core product—known internally as the Blue App—would now oversee WhatsApp and Instagram too. Cox was perhaps Zuckerberg’s closest and most trusted confidant, and it seemed like succession planning. Adam Mosseri moved over to run product at Instagram.

Instagram, which was founded in 2010 by Kevin Systrom and Mike Krieger, had been acquired by Facebook in 2012 for $1 billion. The price at the time seemed ludicrously high: That much money for a company with 13 employees? Soon the price would seem ludicrously low: A mere billion dollars for the fastest-growing social network in the world? Internally, Facebook at first watched Instagram’s relentless growth with pride. But, according to some, pride turned to suspicion as the pupil’s success matched and then surpassed the professor’s.

Systrom’s glowing press coverage didn’t help. In 2014, according to someone directly involved, Zuckerberg ordered that no other executives should sit for magazine profiles without his or Sandberg’s approval. Some people involved remember this as a move to make it harder for rivals to find employees to poach; others remember it as a direct effort to contain Systrom. Top executives at Facebook also believed that Instagram’s growth was cannibalizing the Blue App. In 2017, Cox’s team showed data to senior executives suggesting that people were sharing less inside the Blue App in part because of Instagram. To some people, this sounded like they were simply presenting a problem to solve. Others were stunned and took it as a sign that management at Facebook cared more about the product they had birthed than one they had adopted.

Most of Instagram—and some of Facebook too—hated the idea that the growth of the photo-sharing app could be seen, in any way, as trouble. Yes, people were using the Blue App less and Instagram more. But that didn’t mean Instagram was poaching users. Maybe people leaving the Blue App would have spent their time on Snapchat or watching Netflix or mowing their lawns. And if Instagram was growing quickly, maybe it was because the product was good? Instagram had its problems—bullying, shaming, FOMO, propaganda, corrupt micro-influencers—but its internal architecture had helped it avoid some of the demons that haunted the industry. Posts are hard to reshare, which slows virality. External links are harder to embed, which keeps the fake-news providers away. Minimalist design also minimized problems. For years, Systrom and Krieger took pride in keeping Instagram free of hamburgers: icons made of three horizontal lines in the corner of a screen that open a menu. Facebook has hamburgers, and other menus, all over the place.

Systrom and Krieger had also seemingly anticipated the techlash ahead of their colleagues up the road in Menlo Park. Even before Trump’s election, Instagram had made fighting toxic comments its top priority, and it had rolled out an AI filtering system in June 2017. By the spring of 2018, the company was working on a product to alert users that “you’re all caught up” when they’d seen all the new posts in their feed. In other words, “put your damn phone down and talk to your friends.” That may be a counterintuitive way to grow, but earning goodwill does help over the long run. And sacrificing growth for other goals wasn’t Facebook’s style at all.

By the time the Cambridge Analytica scandal hit, Systrom and Krieger, according to people familiar with their thinking, were already worried that Zuckerberg was souring on them. They had been allowed to run their company reasonably independently for six years, but now Zuckerberg was exerting more control and making more requests. When conversations about the reorganization began, the Instagram founders pushed to bring in Mosseri. They liked him, and they viewed him as the most trustworthy member of Zuckerberg’s inner circle. He had a design background and a mathematical mind. They were losing autonomy, so they might as well get the most trusted emissary from the mothership. Or as Lyndon Johnson said about J. Edgar Hoover, “It’s probably better to have him inside the tent pissing out than outside the tent pissing in.”

Meanwhile, the founders of WhatsApp, Brian Acton and Jan Koum, had moved outside of Facebook’s tent and commenced fire. Zuckerberg had bought the encrypted messaging platform in 2014 for $19 billion, but the cultures had never entirely meshed. The two sides couldn’t agree on how to make money—WhatsApp’s end-to-end encryption wasn’t originally designed to support targeted ads—and they had other differences as well. WhatsApp insisted on having its own conference rooms, and, in the perfect metaphor for the two companies’ diverging attitudes over privacy, WhatsApp employees had special bathroom stalls designed with doors that went down to the floor, unlike the standard ones used by the rest of Facebook.

Eventually the battles became too much for Acton and Koum, who had also come to believe that Facebook no longer intended to leave them alone. Acton quit and started funding a competing messaging platform called Signal. During the Cambridge Analytica scandal, he tweeted, “It is time. #deletefacebook.” Soon afterward, Koum, who held a seat on Facebook’s board, announced that he too was quitting, to play more Ultimate Frisbee and work on his collection of air-cooled Porsches.

The departure of the WhatsApp founders created a brief spasm of bad press. But now Acton and Koum were gone, Mosseri was in place, and Cox was running all three messaging platforms. And that meant Facebook could truly pursue its most ambitious and important idea of 2018: bringing all those platforms together into something new.


By the late spring, news organizations—even as they jockeyed for scoops about the latest meltdown in Menlo Park—were starting to buckle under the pain caused by Facebook’s algorithmic changes. Back in May of 2017, according to Parse.ly, Facebook drove about 40 percent of all outside traffic to publishers. A year later it was down to 25 percent. Publishers that weren’t in the category “politics, crime, or tragedy” were hit much harder.

At WIRED, the month after an image of a bruised Zuckerberg appeared on the cover, the numbers were even more stark. One day, traffic from Facebook suddenly dropped by 90 percent, and for four weeks it stayed there. After protestations, emails, and a raised eyebrow or two about the coincidence, Facebook finally got to the bottom of it. An ad run by a liquor advertiser, targeted at WIRED readers, had been mistakenly categorized as engagement bait by the platform. In response, the algorithm had let all the air out of WIRED’s tires. The publication could post whatever it wanted, but few would read it. Once the error was identified, traffic soared back. It was a reminder that journalists are just sharecroppers on Facebook’s giant farm. And sometimes conditions on the farm can change without warning.

Inside Facebook, of course, it was not surprising that traffic to publishers went down after the pivot to “meaningful social interactions.” That outcome was the point. It meant people would be spending more time on posts created by their friends and family, the genuinely unique content that Facebook offers. According to multiple employees, a handful of executives considered it a small plus, too, that the news industry was feeling a little pain after all its negative coverage. The company denies this—“no one at Facebook is rooting against the news industry,” says Anne Kornblut, the company’s director of news partnerships—but, in any case, by early May the pain seemed to have become perhaps excessive. A number of stories appeared in the press about the damage done by the algorithmic changes. And so Sheryl Sandberg, who colleagues say often responds with agitation to negative stories, sent an email on May 7 calling a meeting of her top lieutenants.

That kicked off a wide-ranging conversation that ensued over the next two months. The key question was whether the company should introduce new factors into its algorithm to help serious publications. The product team working on news wanted to increase the amount of public content—things shared by news organizations, businesses, celebrities—allowed in News Feed. They also wanted the company to provide stronger boosts to publishers deemed trustworthy, and they suggested the company hire a large team of human curators to elevate the highest-quality news inside of News Feed. The company discussed setting up a new section on the app entirely for news and directed a team to quietly work on developing it; one of the team’s ambitions was to try to build a competitor to Apple News.

Some of the company’s most senior execs, notably Chris Cox, agreed that Facebook needed to give serious publishers a leg up. Others pushed back, especially Joel Kaplan, a former deputy chief of staff to George W. Bush who was now Facebook’s vice president of global public policy. Supporting high-quality news outlets would inevitably make it look like the platform was supporting liberals, which could lead to trouble in Washington, a town run mainly by conservatives. Breitbart and the Daily Caller, Kaplan argued, deserved protections too. At the end of the climactic meeting, on July 9, Zuckerberg sided with Kaplan and announced that he was tabling the decision about adding ways to boost publishers, effectively killing the plan. To one person involved in the meeting, it seemed like a sign of shifting power. Cox had lost and Kaplan had won. Either way, Facebook’s overall traffic to news organizations continued to plummet.


That same evening, Donald Trump announced that he had a new pick for the Supreme Court: Brett Kavanaugh. As the choice was announced, Joel Kaplan stood in the background at the White House, smiling. Kaplan and Kavanaugh had become friends in the Bush White House, and their families had become intertwined. They had taken part in each other’s weddings; their wives were best friends; their kids rode bikes together. No one at Facebook seemed to really notice or care, and a tweet pointing out Kaplan’s attendance was retweeted a mere 13 times.

Meanwhile, the dynamics inside the communications department had gotten even worse. Elliot Schrage had announced that he was going to leave his post as VP of global communications. So the company had begun looking for his replacement; it focused on interviewing candidates from the political world, including Denis McDonough and Lisa Monaco, former senior officials in the Obama administration. But Rachel Whetstone also declared that she wanted the job. At least two other executives said they would quit if she got it.

The need for leadership in communications only became more apparent on July 11, when John Hegeman, the new head of News Feed, was asked in an interview why the company didn’t ban Alex Jones’ InfoWars from the platform. The honest answer would probably have been to just admit that Facebook gives a rather wide berth to the far right because it’s so worried about being called liberal. Hegeman, though, went with the following: “We created Facebook to be a place where different people can have a voice. And different publishers have very different points of view.”

This, predictably, didn’t go over well with the segments of the media that actually try to tell the truth and that have never, as Alex Jones has done, reported that the children massacred at Sandy Hook were actors. Public fury ensued. Most of Facebook didn’t want to respond. But Whetstone decided it was worth a try. She took to the @facebook account—which one executive involved in the decision called “a big fucking marshmallow we shouldn’t ever use like this”—and started tweeting at the company’s critics.

“Sorry you feel that way,” she typed to one, and explained that, instead of banning pages that peddle false news, Facebook demotes them. The tweet was very quickly ratioed, a Twitter term of art for a statement that no one likes and that receives more comments than retweets. Whetstone, as @facebook, also declared that just as many pages on the left pump out misinformation as on the right. That tweet got badly ratioed too.

Five days later, Zuckerberg sat down for an interview with Kara Swisher, the influential editor of Recode. Whetstone was in charge of prep. Before Zuckerberg headed to the microphone, Whetstone supplied him with a list of rough talking points, including one that inexplicably violated the first rule of American civic discourse: Don’t invoke the Holocaust while trying to make a nuanced point.

About 20 minutes into the interview, while ambling through his answer to a question about Alex Jones, Zuckerberg declared, “I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down, because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong.” Sometimes, Zuckerberg added, he himself makes errors in public statements.

The comment was absurd: People who deny that the Holocaust happened generally aren’t just slipping up in the midst of a good-faith intellectual disagreement. They’re spreading anti-Semitic hate—intentionally. Soon the company announced that it had taken a closer look at Jones’ activity on the platform and had finally chosen to ban him. His past sins, Facebook decided, had crossed into the domain of standards violations.

Eventually another candidate for the top PR job was brought into the headquarters in Menlo Park: Nick Clegg, former deputy prime minister of the United Kingdom. Perhaps in an effort to disguise himself—or perhaps because he had decided to go aggressively Silicon Valley casual—he showed up in jeans, sneakers, and an untucked shirt. His interviews must have gone better than his disguise, though, as he was hired over the luminaries from Washington. “What makes him incredibly well qualified,” said Caryn Marooney, the company’s VP of communications, “is that he helped run a country.”



At the end of July, Facebook was scheduled to report its quarterly earnings in a call to investors. The numbers were not going to be good; Facebook’s user base had grown more slowly than ever, and revenue growth was taking a huge hit from the company’s investments in hardening the platform against abuse. But in advance of the call, the company’s leaders were nursing an additional concern: how to put Instagram in its place. According to someone who saw the relevant communications, Zuckerberg and his closest lieutenants were debating via email whether to say, essentially, that Instagram owed its spectacular growth not primarily to its founders and vision but to its relationship with Facebook.

Zuckerberg wanted to include a line to this effect in his script for the call. Whetstone counseled him not to, or at least to temper it with praise for Instagram’s founding team. In the end, Zuckerberg’s script declared, “We believe Instagram has been able to use Facebook’s infrastructure to grow more than twice as quickly as it would have on its own. A big congratulations to the Instagram team—and to all the teams across our company that have contributed to this success.”

After the call—with its payload of bad news about growth and investment—Facebook’s stock dropped by nearly 20 percent. But Zuckerberg didn’t forget about Instagram. A few days later he asked his head of growth, Javier Olivan, to draw up a list of all the ways Facebook supported Instagram: running ads for it on the Blue App; including link-backs when someone posted a photo on Instagram and then cross-published it in News Feed; allowing Instagram to access a new user’s connections in order to recommend people to follow. Once he had the list, Zuckerberg conveyed to Instagram’s leaders that he was pulling away the supports. Facebook had given Instagram servers, health insurance, and the best engineers in the world. Now Instagram was just being asked to give a little back—and to help seal off the vents that were allowing people to leak away from the Blue App.

Systrom soon posted a memo to his entire staff explaining Zuckerberg’s decision to turn off supports for traffic to Instagram. He disagreed with the move, but he was committed to the changes and was telling his staff that they had to go along. The memo “was like a flame going up inside the company,” a former senior manager says. The document also enraged Facebook, which was terrified it would leak. Systrom soon departed on paternity leave.

The tensions didn’t let up. In the middle of August, Facebook prototyped a location-tracking service inside of Instagram, the kind of intrusion that Instagram’s management team had long resisted. In August, a hamburger menu appeared. “It felt very personal,” says a senior Instagram employee who spent the month implementing the changes. It felt particularly wrong, the employee says, because Facebook is a data-driven company, and the data strongly suggested that Instagram’s growth was good for everyone.

Friends of Systrom and Krieger say the stress was wearing on the founders too. According to someone who heard the conversation, Systrom openly wondered whether Zuckerberg was treating him the way Donald Trump was treating Jeff Sessions: making life miserable in hopes that he’d quit without having to be fired. Instagram’s managers also believed that Facebook was being miserly about their budget. In past years they had been able to almost double their number of engineers. In the summer of 2018 they were told that their growth rate would drop to less than half of that.

When it was time for Systrom to return from paternity leave, the two founders decided to make the leave permanent. They made the decision quickly, but it was far from impulsive. According to someone familiar with their thinking, their unhappiness with Facebook stemmed from tensions that had brewed over many years and had boiled over in the past six months.

And so, on a Monday morning, Systrom and Krieger went into Chris Cox’s office and told him the news. Systrom and Krieger then notified their team about the decision. Somehow the news reached Mike Isaac, a reporter at The New York Times, before it reached the communications teams for either Facebook or Instagram. The story appeared online a few hours later, as Instagram’s head of communications was on a plane circling above New York City.

After the announcement, Systrom and Krieger decided to play nice. Soon there was a lovely photograph of the two founders smiling next to Mosseri, the obvious choice to replace them. And then they headed off into the unknown to take time off, decompress, and figure out what comes next. Systrom and Krieger told friends they both wanted to get back into coding after so many years away from it. If you need a new job, it’s good to learn how to code.


Just a few days after Systrom and Krieger quit, Joel Kaplan roared into the news. His dear friend Brett Kavanaugh was now not just a conservative appellate judge with Federalist Society views on Roe v. Wade; he had become an alleged sexual assailant, purported gang rapist, and national symbol of toxic masculinity to somewhere between 49 and 51 percent of the country. As the charges multiplied, Kaplan’s wife, Laura Cox Kaplan, became one of the most prominent women defending him: She appeared on Fox News and asked, “What does it mean for men in the future? It’s very serious and very troubling.” She also spoke at an #IStandWithBrett press conference that was livestreamed on Breitbart.

On September 27, Kavanaugh appeared before the Senate Judiciary Committee after four hours of wrenching recollections by his primary accuser, Christine Blasey Ford. Laura Cox Kaplan sat right behind him as the hearing descended into rage and recrimination. Joel Kaplan sat one row back, stoic and thoughtful, directly in view of the cameras broadcasting the scene to the world.

Kaplan isn’t widely known outside of Facebook. But he’s not anonymous, and he wasn’t wearing a fake mustache. As Kavanaugh testified, journalists started tweeting a screenshot of the tableau. At a meeting in Menlo Park, executives passed around a phone showing one of these tweets and stared, mouths agape. None of them knew Kaplan was going to be there. The man who was supposed to smooth over Facebook’s political dramas had inserted the company right into the middle of one.

Kaplan had long been friends with Sandberg; they’d even dated as undergraduates at Harvard. But despite rumors to the contrary, he had told neither her nor Zuckerberg that he would be at the hearing, much less that he would be sitting in the gallery of supporters behind the star witness. “He’s too smart to do that,” one executive who works with him says. “That way, Joel gets to go. Facebook gets to remind people that it employs Republicans. Sheryl gets to be shocked. And Mark gets to denounce it.”

If that was the plan, it worked to perfection. Soon Facebook’s internal message boards were lighting up with employees mortified at what Kaplan had done. Management’s initial response was limp and lame: A communications officer told the staff that Kaplan attended the hearing as part of a planned day off in his personal capacity. That wasn’t a good move. Someone visited the human resources portal and noted that he hadn’t filed to take the day off.

The hearings were on a Thursday. A week and a day later, Facebook called an all-hands to discuss what had happened. The giant cafeteria in Facebook’s headquarters was cleared to create space for a town hall. Hundreds of chairs were arranged with three aisles to accommodate people with questions and comments. Most of them were from women who came forward to recount their own experiences of sexual assault, harassment, and abuse.

Zuckerberg, Sandberg, and other members of management were standing on the right side of the stage, facing the audience and the moderator. Whenever a question was asked of one of them, they would stand up and take the mic. Kaplan appeared via video conference looking, according to one viewer, like a hostage trying to smile while his captors stood just offscreen. Another participant described him as “looking like someone had just shot his dog in the face.” This participant added, “I don’t think there was a single male participant, except for Zuckerberg looking down and sad onstage and Kaplan looking dumbfounded on the screen.”

Employees who watched expressed different emotions. Some felt empowered and moved by the voices of women in a company where top management is overwhelmingly male. Another said, “My eyes rolled to the back of my head” watching people make specific personnel demands of Zuckerberg, including that Kaplan undergo sensitivity training. For much of the staff, it was cathartic. Facebook was finally reckoning, in a way, with the #MeToo movement and the profound bias toward men in Silicon Valley. For others it all seemed ludicrous, narcissistic, and emblematic of the liberal, politically correct bubble that the company occupies. A guy had sat in silence to support his best friend who had been nominated to the Supreme Court; as a consequence, he needed to be publicly flogged?

In the days after the hearings, Facebook organized small group discussions, led by managers, in which 10 or so people got together to discuss the issue. There were tears, grievances, emotions, debate. “It was a really bizarre confluence of a lot of issues that were popped in the zit that was the SCOTUS hearing,” one participant says. Kaplan, though, seemed to have moved on. The day after his appearance on the conference call, he hosted a party to celebrate Kavanaugh’s lifetime appointment. Some colleagues were aghast. According to one who had taken his side during the town hall, this was a step too far. That was “just spiking the football,” they said. Sandberg was more forgiving. “It’s his house,” she told WIRED. “That is a very different decision than sitting at a public hearing.”

In a year during which Facebook made endless errors, Kaplan’s insertion of the company into a political maelstrom seemed like one of the clumsiest. But in retrospect, executives aren’t sure that Kaplan did lasting harm. His blunder opened up a series of useful conversations in a workplace that had long focused more on coding than inclusion. Also, according to another executive, the episode and the press that followed surely helped appease the company’s would-be regulators. It’s useful to remind the Republicans who run most of Washington that Facebook isn’t staffed entirely by snowflakes and libs.


That summer and early fall weren’t kind to the team at Facebook charged with managing the company’s relationship with the news industry. At least two product managers on the team quit, telling colleagues they had done so because of the company’s cavalier attitude toward the media. In August, a jet-lagged Campbell Brown gave a presentation to publishers in Australia in which she declared that they could either work together to create new digital business models or not. If they didn’t, well, she’d be unfortunately holding hands with their dying business, like in a hospice. Her off-the-record comments were put on the record by The Australian, a publication owned by Rupert Murdoch, a canny and persistent antagonist of Facebook.

In September, however, the news team managed to convince Zuckerberg to start administering ice water to the parched executives of the news industry. That month, Tom Alison, one of the team’s leaders, circulated a document to most of Facebook’s senior managers; it began by proclaiming that, on news, “we lack clear strategy and alignment.”
