Advertisers don’t like controversy and right now that’s a big and growing problem for Facebook. Even other social media companies, like Twitter, are starting to feel the sting.
Facebook has long taken a hands-off approach to content moderation, and while this upset some activists, it never hurt the company's bottom line. Until now.
Over the years, Facebook’s content policies have been fairly permissive. As long as you or your group weren’t documenting or advocating directly illegal activity, you could pretty much post away. The algorithms cared mostly about signals of interest, like interactions, so they promoted whatever was popular and paid no heed to which side of any given issue that effectively put the company on. It promised a kind of corporate neutrality in which the final product would be shaped by the users, in a way resembling democracy.
The facade of corporate neutrality wouldn’t hold for long, however. Critics and activists increasingly noted that far-right groups were among Facebook’s largest ad buyers, and that Mark Zuckerberg’s most high-profile meetings and dinners were with right-wing celebrities and politicians.
Further, it was revealed that firms like Cambridge Analytica were using Facebook’s ad platform to build voter profiles down to the individual level. Some of these campaigns were funded by Russian groups, and some even funded both sides of a conflict – purely for the sake of inflaming it.
But we’re just a neutral corporation, Facebook would continue insisting… it’s just that people are starting to see through it.
Facebook becomes the controversy
There doesn’t even seem to be a particular action by Facebook that finally crossed the line. It was, instead, the accumulation of many years of playing down and ignoring external controversies that made Facebook, itself, the controversy.
Karl Popper noted this problem in his concept of the paradox of tolerance: “In order to maintain a tolerant society, the society must be intolerant of intolerance.”
Facebook was casually tolerant of intolerance for a long time – as long as no one directly advocated for anything illegal. Then came a steady trickle of evidence that people were being radicalized by groups and political operatives advocating intolerant beliefs. Some even turned to violence, like the Christchurch mosque shooter, who live-streamed his attack on Facebook so his friends and fans could watch.
So what changed?
It’s important to emphasize that Facebook hasn’t done anything different this time. There is no specific, new scandal. Facebook’s behavior and policies have been remarkably consistent on this topic over the last few years – and it was able to ride that strategy to more than $70 billion in annual advertising revenue.
What’s changed is the social mood in relation to these transgressions. Three years ago, corporate board rooms weren’t discussing Popper’s paradox of tolerance. Society, at large, was more sympathetic to the corporate neutrality argument. Why alienate your potential customers if you can avoid taking sides? Quite often, corporations would even take that a step further, and blame any controversy on “both sides,” as if the act of pointing out problematic behavior was in itself problematic.
Something changed in America when we watched what happened to George Floyd. Many people who had once been able to dismiss or ignore systemic racism were suddenly forced to confront one of its cruelest and most disgusting consequences. It’s not something one can stay neutral on, and it’s certainly not a situation that can be spun to blame “both sides.”
Facebook’s problem is that it didn’t catch this shift in its consumers’ preferences, and it didn’t change fast enough.
Twitter and Reddit reacted faster
While the nature of social media itself invites controversy, Reddit and Twitter did a much better job of getting out in front of this issue than Facebook did. They’re still suffering some collateral damage on top of the ongoing weakness in the advertising market, but they took much bolder steps than Facebook did – even at the risk of alienating their user bases.
Twitter recently took the bold step of putting warnings on tweets that appear to incite or encourage violence. That may sound mild, but some of the offending tweets came from the president himself. With 35-40% of the population still supporting him, it was a risky position to take. It was probably also the right one, since Trump continues to use the service – and Twitter isn’t the target of a large and growing ad boycott right now, like Facebook is. Still, at least a few companies have cut ties with Twitter because its actions were so limited, and so long in coming.
Reddit also recently banned more than 2,000 subreddits that it accused of advocating violence or bigotry. Like Twitter’s move, this included a direct jab at the president: the highest-profile ban was /r/The_Donald, a subreddit once immensely popular with Donald Trump fans. Although the ban is recent, the subreddit had already lost most of its traffic to various “subreddit quarantines” and other manual actions the site’s administrators had taken to reduce its influence. There’s no word on whether any of Reddit’s advertisers are cutting contracts over its policies, but some advertisers are simply turning the lights off on social media in general.
Will Facebook recover?
Probably. As long as Facebook has the traffic, it will be able to find someone to buy its ad space. Revenues, bonuses, and salaries might all take a hit if the company isn’t as competitive as it once was, but it would have to keep antagonizing advertisers – willfully – for several more years before it came anywhere near bankruptcy. Long before that, shareholders would start demanding changes in corporate leadership. All of that will probably be moot, however, as Facebook’s executives are likely already plotting a way to get back on advertisers’ good sides.
Before large brands start to open up large advertising accounts on Facebook again, Facebook will probably have to make some major changes to the platform. They’ll need to show that they take action – pre-emptively – against groups that promote racism, political violence, and dangerous conspiracy theories. They’ll need to disclose their political connections more transparently, or at least hide them better. They’ll need to handle the upcoming elections without causing any further controversies or featuring prominently in resultant investigations of foreign campaign interference.
Then again, none of those controversies should have been difficult to avoid in the first place. If Facebook’s leadership walked into them all willingly, maybe they’ll just keep walking into the next controversy, and the one after that…