Will Facebook's crowdsourced news rating move backfire?

Sources that are widely respected will now get better play in users’ news feeds, while those that aren’t will get less


Facebook says it will now crowdsource the trustworthiness of the news it displays in users’ feeds. As with any effort that relies on the supposed wisdom of the masses, the inevitably sarcastic question arises: what could possibly go wrong?

“As part of our ongoing quality surveys, we will now ask people whether they’re familiar with a news source and, if so, whether they trust that source,” the Facebook founder Mark Zuckerberg wrote in a blog post last week.

“The idea is that some news organisations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don’t follow them directly.”

Under the changes, Mr Zuckerberg added, sources that are widely respected will get better play in users’ news feeds, while those that aren’t will get less. Users won’t see more news as a result – it accounts for about 4 per cent of feeds, he added – but what they do see will hopefully be of better quality.

The effect on media businesses is likely to be profound, given that a large portion of Facebook’s 2 billion users get much of their news from the site. More than two-thirds of Americans, for example, get news from Facebook, Twitter and other social media services, according to a Pew Research Center study last autumn.

News organisations that end up on the good side of Facebook’s trustworthiness polls stand to benefit in two ways. Not only are they likely to see upticks in referral traffic, and therefore potentially higher ad revenue, they’ll also receive a reputation bump that they can leverage to convince users to pay for subscriptions.

On the other hand, news outlets that end up on the wrong side of the surveys will inevitably suffer. Relatively new businesses – those without decades or more of track records – could find it tougher to build their audiences.

In that way, Facebook is creating its own sort of net neutrality problem. Boosting reputable news organisations is a valid goal, but should it come at the expense of new challengers to the older guard?

Mr Zuckerberg explained that the crowdsourcing option was one of three possible routes the company considered as part of its ongoing effort to clean up news feeds, which he acknowledged had become vulnerable to sensationalism, misinformation and polarisation.

The first was for Facebook itself to decide trustworthiness, “but that’s not something we’re comfortable with”, he wrote. The company also considered asking outside experts to decide, but ultimately found that wouldn’t solve the objectivity problem either.

So, the crowd it is.

Although there’s no perfect solution to the unmistakable trust issue that Facebook and all social media companies are facing, this may in fact be the worst possible option.

Crowdsourcing has proven to be fraught with problems in almost every instance in which it has been applied, from Amazon product reviews to YouTube video comments. It invites the most vocal users – and potentially those who are most on the fringe – to try to skew results by finding loopholes in the system. It doesn’t necessarily reflect the views of the majority.


Facebook has so far been vague about how it will compile and administer its trustworthiness index, but it will need to be vigilant against groups of users trying to game the system in their favour.

Given that social media companies have so far failed miserably at policing such manipulation – Twitter this past weekend informed nearly 700,000 users that they had been exposed to Russian propaganda – it’s an open question whether that’s even possible.

The crowdsourcing move is thus an effort to stave off having to make an even tougher decision. If Facebook is seeking true impartiality, the only way to accomplish it may be to remove news postings from users’ feeds entirely, a highly unlikely outcome given how integral they’ve become.

The more probable outcome is that Facebook goes back to the first two options it considered – either judging trustworthiness itself or relying on outside experts to do so.

Again, neither option is perfect and both are open to criticism. But it’s becoming increasingly obvious that Facebook is a media company and, eventually, it will have to act like one.

There's little difference between the company displaying news from, say, The New York Times or The National, and a newspaper running stories from a wire service such as Reuters or Associated Press.

If newspapers must shoulder some responsibility for the third-party content they publish, there doesn’t seem to be much reason why Facebook should be exempt from the same.

Mr Zuckerberg says he isn’t “comfortable” with judging the trustworthiness of news outlets, doubtless because he doesn’t want to alienate users of any particular political stripe or affiliation.

But, sooner or later, Facebook is going to have to get comfortable with making those hard decisions. Offloading them to the public sphere, where they can be twisted and corrupted, only stands to make the matter worse.