Be Careful What You Wish For When It Comes to “Fake News,” Cautions Facebook CSO Alex Stamos

Facebook's chief security officer Alex Stamos, the Mr. Spock of the tech world, shown here testifying before a U.S. Senate subcommittee in May 2014, while he was head of security at Yahoo (screenshot via C-SPAN)

As the U.S. Department of Justice’s special counsel Robert Mueller and both houses of Congress continue to investigate Russian meddling in the 2016 election, including Russia’s use of social-media platforms such as Facebook to influence America’s internal politics, demands that these platforms police themselves are growing louder.

The debate over how to self-police blew up recently on — where else? — Twitter, with Quinta Jurecic, a research analyst at the Brookings Institution and an associate editor of the think tank’s Lawfare blog, crossing swords with Facebook chief security officer Alex Stamos, who warned critics of unintended Orwellianism, ominously quoting Oscar Wilde: “When the gods wish to punish us they answer our prayers.”

Jurecic initiated the discussion by retweeting an Axios.com article about Facebook’s new ad-review policy that quotes an email in which Facebook founder and CEO Mark Zuckerberg warned advertisers that ads pertaining to “politics, religion, ethnicity or social issues” would be subjected to “manual review.”

Jurecic suggested that human review, as opposed to electronic vetting via algorithm, might be a “red herring,” deeming it the “flipside of treating The Algorithm as a holy, neutral god (which got us into this mess).”

In follow-up tweets, Jurecic opined that “algorithms are not neutral, they are designed” and asked rhetorically whether Facebook could “design an algorithm to review these ads more quickly than humans?” She further speculated that the problem might lie not with the algorithm (the arbiter of all things internet) but with its maker, suggesting that Facebook’s algorithms might have been “designed poorly and irresponsibly” — though she admitted, “I am open to being told I’m wrong about this!”

Stamos, who worked as Yahoo’s chief information security officer before joining Facebook in 2015, was more than up to the challenge, responding to Jurecic in an extensive and thought-provoking thread.

“I am seeing a ton of coverage of our recent issues driven by stereotypes of our employees and attacks against fantasy, strawman tech cos,” Stamos tweeted, adding, “[L]ots of journalists have celebrated academics who have made wild claims of how easy it is to spot fake news and propaganda.”

(Indeed, studies have noted the difficulty in defining and identifying so-called fake news, and distinguishing it from reliable information and/or opinion.)

“[I]f you don’t worry about becoming the Ministry of Truth with ML [machine learning] systems trained on your personal biases, then it’s easy!” he chided a little further down, urging critics to consider the “downside” of ML systems that rely on “ideologically biased training data…. If you call for less speech by the people you dislike but also complain when the people you like are censored, be careful,” he added.
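Stamos’s point about biased training data describes a textbook machine-learning failure mode: a classifier faithfully learns whatever systematic slant its labels carry. Here is a minimal sketch in Python (purely hypothetical toy data and labels, with scikit-learn used for brevity — it implies nothing about Facebook’s actual systems) of a “fake news” model trained on labels from a reviewer who flagged only one faction’s posts:

```python
# Toy illustration: a "fake news" classifier trained on ideologically
# skewed labels learns the labeler's bias, not ground truth.
# (Hypothetical data; scikit-learn for brevity -- not Facebook's stack.)
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

texts = [
    "vote for the reds",       # labeled 1 ("fake") -- labeler dislikes the reds
    "big reds rally tonight",  # labeled 1 ("fake")
    "vote for the blues",      # labeled 0 ("real") -- labeler favors the blues
    "blues town hall tonight", # labeled 0 ("real")
]
labels = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(texts)
model = LogisticRegression().fit(X, labels)

# Two otherwise identical announcements, differing only in faction name:
test = ["reds announce policy plan", "blues announce policy plan"]
print(model.predict(vectorizer.transform(test)))
# Expected output: [1 0] -- the "reds" post is flagged as fake purely
# because of who posted it, not because of what it says.
```

Scaled up to billions of posts, that last line is the automated Ministry of Truth Stamos warns about: a system that censors by faction rather than by truthfulness.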

Many of the responses to Stamos’s tweetage were savage and cynical, referencing Zuckerberg’s comments after the 2016 election, wherein the tech mogul dismissed as risible the charge that Facebook had been gamed by Russian bots.

Zuckerberg ate his words after Stamos issued a statement in September revealing that Facebook had indeed published thousands of ads associated with inauthentic accounts that subsequent analysis indicated “were affiliated with one another and likely operated out of Russia.”

Though Facebook has rightly taken a credibility hit, the fact is that to the extent the Russians may have interfered with a U.S. election, they did so by weaponizing the gullibility of the social network’s users, exploiting good old-fashioned confirmation bias and a collective decline in our critical-thinking skills.

One of the more salient replies to Stamos came from an individual who posted a link to a recent New York Times op-ed by Nina Jankowicz, a fellow at the Woodrow Wilson Center’s Kennan Institute.

Jankowicz argues that the best defense against Russian disinformation is an offense girded by training the American public in “critical reading and analysis skills for the digital age” and an investment in the Fourth Estate “to ensure that it is driven by truth, not clicks.”

Otherwise, she concludes, Americans will continue to be easy marks for online confidence men who exploit the “weaknesses of our own making.”

Click here to follow Alex Stamos on Twitter, here to follow Quinta Jurecic, and here to read Stamos’s extensive Twitter thread about policing content online.


About Stephen Lemons

Stephen Lemons is an award-winning investigative journalist with more than 20 years of experience covering everything from government corruption to white-supremacist gangs. In addition to Front Page Confidential, his work has appeared in Phoenix New Times, the Los Angeles Times, Salon.com, and the Southern Poverty Law Center’s Intelligence Report magazine.