There was a time when Mark Zuckerberg didn’t see the mainstream media as the enemy. He even allowed me, a card-carrying legacy media person, into his home. In April 2018, as part of my years of embedding at Facebook to write a book, I ventured there to hear his plans to do the right thing. Over the previous two years, Zuckerberg’s company had been roundly criticized for its failure to curb disinformation and hate speech. Now the young founder had a plan to address it.
Part of the solution, he told me, was more content moderation. He would hire many more people to vet posts, even at significant cost to Facebook. He would also step up efforts to use artificial intelligence to proactively remove harmful content. “It’s no longer enough to give people tools to say what they want and then just have our community flag them and try to respond afterwards,” he told me as we sat in his sunroom. “We need to get in there more and just play a more active role.” He admitted he had been slow to realize how harmful toxic content on Facebook was, but now he was committed to fixing the problem, even if it took years. “I think we’re doing the right thing,” he told me. “It’s just that we should have done it sooner.”
Seven years later, Zuckerberg no longer thinks moderation is the right thing. In a five-minute video, he characterized his earlier support for it as a regrettable capitulation to government jawboning about Covid and other subjects. He announced a shift away from content moderation (no more proactive removals and downgrades of misinformation and hate speech) and the end of a fact-checking program aimed at debunking falsehoods circulating on his platforms. Fact-checking by trusted sources will be replaced by “community notes,” a crowdsourcing approach in which users provide alternative views on the veracity of posts. That technique is the exact thing he told me in 2018 was no longer enough. While he admits his changes will allow “more bad stuff” onto the platforms, he now says that’s worth it so more “free expression” can flourish.
The policy shift was one of several signals that, whether Zuckerberg wanted to do it all along or not, Meta was positioning itself in sync with the new Trump administration. You’ve heard the litany, which has become a meme in itself. Meta promoted its top lobbyist, former GOP operative Joel Kaplan, to chief global affairs officer; he immediately appeared on Fox News (and only Fox News) to introduce the new policies. Zuckerberg also announced that Meta would move employees who write and review content from California to Texas to “help remove concerns that biased employees are over-censoring content.” He disbanded Meta’s DEI program. (Where’s Sheryl Sandberg, who was so proud of Meta’s diversity efforts? Sheryl? Sheryl?) And Meta changed some of its terms of service specifically to allow users to demean LGBTQ people.
Now that a week has passed since Meta’s turnaround, and since my first viewing of Zuckerberg’s speech, I’m particularly haunted by one aspect: he seems to have debased the basic practice of traditional journalism, deeming it nothing better than the unvetted pronouncements of podcasters, influencers, and countless random people on his platforms. This came through in his Reel when he repeatedly used the term “legacy media” as a pejorative: a force that, in his view, encourages censorship and stifles free expression. All this time I thought the opposite!
A hint of his revised view of trustworthiness comes from the switch from fact-checkers to community notes. It’s true that the fact-checking process hasn’t worked well, in part because Zuckerberg didn’t defend the checkers when bad-faith critics accused them of bias. It’s also reasonable to expect that community notes can be a useful signal that a post may be in error. But the power of refutation fails when participants in the conversation reject the idea that differences of opinion can be resolved by convincing evidence. That’s a core difference between fact-checking, which Zuckerberg has gotten rid of, and the community notes he’s implementing. The fact-checking worldview assumes that definitive facts, arrived at through research, through talking to people, and sometimes even through believing your own eyes, can be conclusive. The trick is to recognize authorities who have earned public trust by pursuing the truth. Community notes welcome alternative views, but judging which ones are trustworthy is left entirely up to you. There is a saying that the antidote to bad speech is more speech. But if verifiable facts cannot successfully disprove easily debunked flapdoodle, we are stuck in a suicidal quicksand of Babel.
This is the world that Donald Trump, Zuckerberg’s new role model, has consciously been working to bring about. 60 Minutes correspondent Lesley Stahl once asked Trump why he insulted reporters who were just doing their jobs. “Do you know why I do this?” he responded. “I do it to discredit you all and demean you all, so that when you write negative stories about me, no one will believe you.” In 2021, Trump further revealed his intent to benefit from an attack on the truth. “If you say it enough and keep saying it, they will start to believe you,” he said at a rally. A corollary: if social media spreads enough lies, people will believe them too. Especially if previously trusted authorities have been discredited and disparaged.