The Future

Experts: Facebook’s media survey could actually work if it doesn’t get gamed

It depends on how many users see it and what Facebook does with the data.

Facebook has a plan to make sure the news people see is “high quality” by surveying users about the sources they trust. But the survey, published by BuzzFeed, is only two questions long — and experts say that if Facebook doesn’t take steps to curb troll responses and other forms of bias, it could end up boosting the visibility of partisan fake news sites.

“I think even the two questions could give you some really valuable information, and I wouldn’t say it’s terrible to ask those things,” said Carrie Brown, director of social journalism at the CUNY Graduate School of Journalism. “But it depends on how you use it, and how much weight you give it.”

This is the survey, as reported by BuzzFeed:

Do you recognize the following websites?
- Yes
- No

How much do you trust each of these domains?
- Entirely
- A lot
- Somewhat
- Barely
- Not at all

The truncated survey has the potential to collect useful data, according to experts we spoke to, but it’s also vulnerable to a range of problems that affect self-reported surveys, which can produce inconsistent or inaccurate results.

It’s possible that only Facebook users who feel strongly about the news will respond to the survey, for instance, skewing the results with their extreme views. It’s inevitable that gangs of internet trolls will try to rig the system by coordinating their responses on 4chan or in Discord chats. And maybe other respondents, anticipating those phenomena, will alter their own responses to try to compensate, making the data Facebook collects even less dependable.

The survey’s phrasing, which uses the word “trust,” could also encourage responses that draw more on partisanship and emotion than on objective criteria about the accuracy of different sources of news. At the end of the day, it could just tell the company which outlets various groups of readers prefer — a trove of data the company already has access to from observing the media diets of its users.

“I think this is going to lend itself to conservatives saying ‘I trust Fox News’ and liberals saying ‘I trust Democracy Now,’” Brown said. “It does seem vulnerable to that.”

If fewer users have access to the survey, the impact of trolls could be minimized.

According to David Carroll, a professor of media design at the New School, the effectiveness of the two-question survey will depend largely on which users it’s presented to and which publications they’re polled about. One major concern — that the survey will be brigaded by extremists — could be alleviated by keeping the sample at the small end of what would still yield statistically meaningful results. Unlike Time’s Time 100 poll, which a coordinated campaign by 4chan users once hijacked to send the site’s founder, Christopher Poole, to the top spot, Facebook could make its survey available to only some users. If fewer users have access to the survey, the impact of trolls would be minimized — and as long as the sample is large enough and randomized, the results should not differ meaningfully from those of a much larger sample.
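To make the sampling point concrete, here is a minimal sketch in Python. The population size and “true” trust rate are made-up numbers, and nothing here reflects Facebook’s actual methodology; the simulation just shows that a uniform random sample of a few thousand users estimates a trust rate about as well as a sample a hundred times larger.

```python
import random

# Hypothetical population: 1,000,000 users, 38% of whom would say they
# trust a given outlet. Both figures are invented for illustration.
random.seed(42)
POPULATION_SIZE = 1_000_000
TRUE_TRUST_RATE = 0.38
population = [random.random() < TRUE_TRUST_RATE for _ in range(POPULATION_SIZE)]

def estimate_trust(sample_size: int) -> float:
    """Poll a uniform random sample and return the observed trust rate."""
    sample = random.sample(population, sample_size)
    return sum(sample) / sample_size

# Sampling error shrinks roughly as 1/sqrt(n): a few thousand random
# respondents land within about a percentage point of the true rate.
for n in (100, 1_000, 10_000, 100_000):
    print(f"n={n:>7}: estimated trust rate = {estimate_trust(n):.3f}")
```

The caveat from the article still applies, though: random sampling only blunts brigading if trolls can’t self-select into the sample.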

But the company probably won’t say precisely how the survey is rolled out or how the data is ultimately used, Carroll said, because that could give bad actors insight into how to game the system.

“I think it’s technically possible for the survey to contribute toward trustworthiness but I am skeptical it will solve the overarching problem, especially if it’s not being managed by experts in survey design and public opinion research,” Carroll said.

If Facebook wants to make good use of the data, Carroll and Brown agreed, it should treat any results from the survey as only one input in a complex system. In particular, Brown pointed to research showing that genuine news sites have a very different fingerprint from fringe sites in terms of the third-party technology they deploy to serve ads and track readers — differences that the social network could leverage to keep bad links off its platform. And Facebook, she pointed out, is already sitting on a wealth of data that could be mined to understand how different groups of users feel about different types of news.

Judging which news sources are legitimate is not a position Facebook ever wanted to be in. After the 2016 election, Mark Zuckerberg dismissed as “crazy” the idea that misinformation on the site had affected the outcome. After criticism, he posted a note apologizing for his language, but doubled down on the position that Facebook had been a minor factor in the election. Later, though, he pledged to explore how the site could fight misinformation. In the post announcing the survey, he said the changes would mean news makes up about 4 percent of users’ feeds, down from 5 percent.

An awkward step in that journey came when the company assembled a panel of outside organizations, including Snopes and PolitiFact, to mark questionable articles on the site with a “disputed” tag. But the company rolled back the “disputed” tags late last year, replacing them with related articles that “give more context” to questionable stories. To Carroll, that movement away from human judgment calls is a mistake.

“I think they should consult experts,” he said. “They’re experts for a reason. But there’s a political reason for them not to do the right thing. That’s unfortunate, and it’s why they’re looking for technical solutions to avoid a human problem.”

Carroll also pointed to a question raised by The Atlantic’s Alexis Madrigal: whether Facebook will share the results of the survey with media organizations themselves. If the social network shared that data — which it probably won’t, he cautioned — it could give publications a wealth of insight into how to build trust with readers.

And though she had misgivings about the methodology of the Facebook survey, Brown agreed that it’s a good idea for journalists to try to better understand how their audiences feel about their work.

“I think journalists need to respect their audiences enough to care about what people find credible, so I don't think the idea of Facebook surveying their users about news is a terrible idea on its face,” she said. “In fact, scoffing at the stupidity of our audiences is part of what got us in the situation in which we are relying on Facebook to give us clicks and engagement in the first place.”