I think I understand why Facebook Chief Executive Officer Mark Zuckerberg hasn't publicly responded to the Cambridge Analytica scandal. He's stuck in a catch-22. Any fix for Facebook's previous big problem — fake news — would make the current big problem with data harvesting worse.
Facebook is a media company and one of Americans' top sources of information, but its de facto anonymity and general lack of responsibility for user-generated content make it easy for propagandists to exploit. Making matters worse, it isn't willing to impose tighter identification rules for fear of losing too many users, and it doesn't want to be held responsible in any way for content, preferring to present itself as a neutral platform. So Zuckerberg has been trying to fix the problem by showing people more material from friends and family and by prioritizing "trusted publishers" and local news sources over purveyors of fake news.
"Making sure time spent on Facebook is time well spent," as Zuckerberg puts it, should lead to the collection of better-quality data. If nobody is setting up fake accounts to spread disinformation, users are more likely to be their normal selves. Anyone analyzing these healthier interactions will likely have more success in targeting commercial and, yes, political offerings to real people. This would inevitably be a smaller yet still profitable enterprise, and no longer a growing one, at least in the short term. But the Cambridge Analytica scandal shows people may not be okay with Facebook's data gathering, improved or not.
The scandal follows the revelation, news to most Facebook users, that until 2015, application developers on the social network's platform could obtain information about a user's Facebook friends after asking permission in only the most perfunctory way. The 2012 Obama campaign used this functionality. So, in a more underhanded way, did Cambridge Analytica, which may or may not have used the data to help elect President Donald Trump.
Many people are angry at Facebook for not acting more resolutely to prevent CA's abuse, but if that were the whole problem, it would have been enough for Zuckerberg to apologize and point out that the offending functionality hasn't been available for several years. The #deletefacebook campaign — now backed by WhatsApp co-founder Brian Acton, whom Facebook made a billionaire — is, however, powered by a bigger problem than that. People are worried about the data Facebook is accumulating about them and about how this data is used. Facebook itself works with political campaigns to help them target messages; it did so for the Trump campaign, too, perhaps helping it more than CA did.
The anger over the CA incident is akin to the more benign anti-Facebook outbreak in 2014 after revelations that Facebook had been running secret psychological experiments on users, attempting to alter their mood by tweaking their newsfeeds. People may give up personal data easily for the sake of convenience, but they hate being turned into guinea pigs.
Is there a Zuckerberg response that would reassure users that this is not going to happen to them? In theory, sure. Zuckerberg could say his platform would reject all political advertising, take measures against all data scraping and provide no data to political actors. That, however, would be a slippery slope; nobody wants to be a guinea pig for big corporations, either. Give users an inch and they'll take a mile, destroying Facebook's painstakingly built microtargeting-based business model. Or if they don't, they'll take precautions, disguise themselves, and delete or obscure much of their personal data.
Smaller sacrifices, however, may be useless against the critical mass of popular disapproval that has accumulated while Zuckerberg struggled with his minimalist solution to the fake news issue. What do people want from him, anyway? Do they want an environment that produces lots of quality data or do they want Facebook to stop collecting data? Perhaps both? But then, how would Facebook make money?
Or perhaps even neither? Would the world be a worse place without Facebook? What would we lose? People can always have an uncivil conversation with bots about divisive politics on Twitter. They can stay in touch with friends, family, neighbors and co-workers on any of the numerous messenger apps. Young people are giving up on Facebook, and Germany's new digital minister, Dorothee Bär, recently teased it for turning into "a senior citizens' network." But what's keeping the older generations on it except inertia?
Zuckerberg, who is expected to break his silence soon, probably won't make any radical moves. But what if he did?
Sometimes, dreams help clarify reality. I have this picture of him in my mind, framed like an auto-playing video. Quietly and choking up a little as he speaks, the Facebook CEO makes an announcement. "We've come so far from that dorm room at Harvard," he says. "Perhaps too far. I'm sad to announce that today, we're closing the main Facebook app and website: It's clear that it's been abused by anyone and everyone, including ourselves, and you folks no longer want it. We'll still help connect the world through Instagram, Messenger and WhatsApp. We promise they won't turn into another Facebook."
Would there be many people — except perhaps the remaining Facebook shareholders — who wouldn't heave a sigh of relief? I know I would.