Political pressure is gradually forcing Facebook executives to take responsibility for the content that appears on the social network — at this point, mainly for the advertising. So far, they are doing a good job of avoiding the real subject.
On Wednesday, Chief Operating Officer Sheryl Sandberg addressed the recent discovery that ads on Facebook could be targeted at "Jew haters" — or, rather, at the small number of people who self-describe as such. Fixing that required temporarily suspending all targeting by profession. Sandberg promised a new level of human oversight, so that new targeting categories would require approval from an actual Facebook employee.
On Thursday, Chief Executive Officer Mark Zuckerberg weighed in on a scandal with potentially more dire consequences for his company — Russian influence operations through Facebook during the 2016 presidential election. Facebook accounts, apparently set up in Russia, purchased some 3,000 ads during the U.S. election campaign. Zuckerberg explained that this took place "programmatically through our apps and website without the advertiser ever speaking to anyone at Facebook."
Zuckerberg insisted that Facebook wasn't going to police content: "We don't check what people say before they say it, and frankly, I don't think our society should want us to. Freedom means you don't have to ask permission first, and that by default you can say what you want." Zuckerberg offered some policy changes on ads, similar to Sandberg's: more human oversight, and greater transparency about who's paying for "political ads" and what ads an entity is running.
Zuckerberg tried to offer a sop to critics: "But even without our employees involved in the sales, we can do better," he said, without specifying what he meant. Within its current model, however, Facebook cannot really do much better. Only once Facebook is forced to admit that it's really a media company (just one that doesn't pay for content) will that stance have to change.
It's pointless to force a Facebook page to list what ads it's running to different audiences. A Russian troll farm can set up any number of pages and use any number of payment options; that's what troll farms are for. It's also impossible to screen for "political ads" programmatically: Filtering by certain words or expressions will stop a lot of legitimate ads, and a player intent on getting a message across will soon find a way to bypass the filters. Involving humans in screening all ads is the only foolproof solution. It would bring Facebook's practice closer to that of a traditional media company, which always knows what ads it's running, who paid for them and what legal problems may arise from running them. This may be inevitable unless Facebook is prepared to wage war on the U.S. political establishment in the name of information freedom — something Zuckerberg has been disinclined to do so far.
While it would be fair for Facebook to accept media regulation to level the playing field, it's a shame Zuckerberg isn't putting up that fight. It would give the public a chance to find out more about the way Facebook advertising works today.
A 2016 paper on the company's ad targeting, written by Facebook's Neha Bhargava and Dan Chapsky in cooperation with Northwestern University's Brett Gordon and Florian Zettelmeyer, sheds some light on the issue. It notes that targeting engines "stack the deck" by sending ads to the people most likely to purchase the advertised product or service, "making it very difficult to tell whether the ad itself is actually having any incremental effect." Applied to the "Russian ads," this suggests that Facebook's engine was programmed to preach to the converted, or at least the easily swayed. That, of course, is the problem with the whole concept of Russian propaganda warfare: Messages generated by Russian propagandists bounce around echo chambers that are full of similar home-grown content, produced by and for people who love this kind of thing. Figuring out whether they had an effect is like deciding whether it was the tenth drink that did all the damage or the previous nine.
The other revelation from the paper is that, when it comes to commercial ads, the efficiency of Facebook advertising is rather low. For the paper, 12 experiments were run with millions to hundreds of millions of ad impressions. These were commercial ads, for financial, telecom and tech products, as well as retail sales. The average click-through rate — the ratio of the number of times users click on an ad to the number of times the ad is shown — was 1.6%.
In the third quarter of 2016, the average cost of one click-through on Facebook in the U.S. was 27.3 cents. The average cost per 1,000 impressions was $7.19. Depending on how the alleged Russian operators paid — per click or per impression — $100,000 would have bought them 366,300 clicks or 13.9 million impressions; this implies a 2.6% click-through rate, higher than in the Facebook experiments, meaning that it probably made more sense to pay per click.
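The arithmetic behind those figures can be checked directly. The sketch below uses only the numbers quoted above (27.3 cents per click, $7.19 per 1,000 impressions, a $100,000 budget); the variable names are illustrative, not from any source.

```python
# Verify the back-of-the-envelope math on the alleged $100,000 ad buy,
# using the Q3 2016 U.S. averages cited in the text.
budget = 100_000.0
cost_per_click = 0.273        # dollars per click-through
cost_per_mille = 7.19         # dollars per 1,000 impressions

clicks = budget / cost_per_click                # paying per click
impressions = budget / cost_per_mille * 1_000   # paying per impression

# If the same budget had to deliver those clicks from those impressions,
# the implied click-through rate would be:
implied_ctr = clicks / impressions

print(f"{clicks:,.0f} clicks, {impressions:,.0f} impressions, "
      f"implied CTR {implied_ctr:.1%}")
```

Running this reproduces the article's figures: roughly 366,300 clicks or 13.9 million impressions, for an implied click-through rate of about 2.6% — above the 1.6% average in the Facebook experiments, which is why paying per click would have been the better deal.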
Facebook's real-life click-through rate suggests the precision of its targeting may be somewhat overblown. In 2016, according to the Data and Marketing Association, the average response rate to old-fashioned direct mailings to prospect lists was 2.9%. That's higher than Facebook's real and implied click-through rates.
Of course, that's not an argument Zuckerberg can really make. Some would argue that even 366,300 clicks could have made a difference in a close election. Facebook may yet find more Russian-bought ads: Zuckerberg said the investigation is ongoing. And, of course, no tech company likes to advertise that its advanced methods lead to similar outcomes as tried-and-true legacy advertising techniques.
In some ways, it's a good thing that the debate is not focusing on whether Facebook ads could have affected the outcome of the 2016 election. Regardless of its efficiency, advertising should be transparent, and it should be run by humans, not highly imperfect robots. But the public should also have a realistic view of the power of Facebook and other social platforms when it comes to influencing opinion. Zuckerberg and Sandberg have offered some concessions — but it's far from enough. What's needed is full disclosure on how many accounts on the network are fake, action against their prevalence and full transparency about programmatic advertising, the core of Facebook's business model.