On Monday, a new coronavirus disinformation video exploded across the internet. Created by the right-wing site Breitbart, it was a clip of a press conference held by a group calling itself America's Frontline Doctors, and it contained dangerously false claims about the coronavirus, including that masks are useless and that hydroxychloroquine cures the disease. (There is no known cure.) The video was a test of social media platforms' stated policies against pandemic disinformation, and by some measures they passed. By Tuesday morning, Facebook, Twitter, and YouTube had all taken the video down for violating their policies on false information about treatments and cures for Covid.
For Facebook, the episode could be seen as a particular success. Many people, including the company’s own employees, have argued that it moves too slowly in response to false and harmful posts on the platform. Here, Facebook was the first major platform to act. There was just one problem: The video had already been viewed more than 20 million times by the time Facebook took it down on Monday night, according to NBC News. The horse was miles away before the barn doors were closed.
On the eve of an extremely high-profile congressional hearing on antitrust and competition issues in Big Tech, the episode has revived a common critique of Facebook: that the platform is simply too big to police effectively, even when it has the right policies in place. As The New York Times’ Charlie Warzel put it on Twitter, “facebook cannot manage mis/disinformation at its scale. if videos can spread that widely before the company takes note (as they have time and time again) then there’s no real hope. it’s not a matter of finding a fix – the platform is the problem.”
This is a very popular view, but it doesn't make a great deal of sense. It's true that no site that relies on user-generated content, and has millions or billions of users, can ever perfectly enforce its content rules at scale. But in no industry, save perhaps airlines and nuclear power plants, do we suggest that anything short of perfection is equivalent to failure. No one says there are simply too many people in the world to enforce laws at scale; we just employ a ton of cops. (Of course, the protest movement against police violence has powerfully argued that those funds would be better spent elsewhere—a question for another article.) The issue is whether Facebook can get from where it is now—taking so long to crack down on a flagrantly misleading video created by one of its own official news partners that tens of millions of users had already seen it—to a place where it isn't lurching from one disinformation crisis to the next. And there's no reason to think it couldn't make progress toward that goal if only it invested more resources in the task.
“They need to hire more content moderators, a whole lot more of them,” said Jennifer Grygiel, a communications professor at Syracuse University. “It’s a myth to create this concept that it’s too big to moderate, there’s too much content.”
In 2019, CEO Mark Zuckerberg said Facebook would spend more than $3.7 billion on platform safety—more, he pointed out, than Twitter’s entire annual revenue. The much more relevant number, however, is Facebook’s revenue, which last year was about $70 billion. In other words, Zuckerberg was claiming credit for devoting just over 5 percent of the company’s revenue to making its product safe.
While Facebook barely cracked Forbes' ranking of the 100 biggest companies by revenue last year, its $24 billion in profit before taxes easily made it one of the world's most profitable. Why? Because its costs are so much lower than most other huge companies'. Ford Motor Company had $160 billion in revenue in 2018 but cleared only $4.3 billion in pretax profit. Building cars costs money. Ford faces tough competition from lots of other manufacturers, meaning it is under pressure both to invest in making cars people want to drive and to charge prices people are willing to pay. Meanwhile, it must comply with extensive safety and emissions requirements imposed by the government.