This is how Facebook's news feed fact-checking will work in the UK

Facebook is partnering with the British firm Full Fact to help cleanse its platform of misinformation as part of its third-party fact-checking efforts

Facebook's effort to stem the flow of fake news has finally come to the UK, with the social media firm enlisting the help of the British fact-checking charity Full Fact.

Facebook started partnering with third-party fact-checkers in December 2016, in the wake of the US elections which saw disinformation widely strewn across the platform. Now Full Fact's team of six fact-checkers – who will work on Facebook alongside their other responsibilities – will be able to review stories, images and videos posted on Facebook and rate them as true, false or a blend of fact and inaccuracy.

Users about to share content Full Fact has checked will be notified of the review’s results, and given more information about the claim and its sources. Posts labelled as false will still be shareable on the platform, but Facebook’s algorithm will show them lower in users’ news feeds, a move intended to decrease the number of people who end up seeing the content.

Given the deluge of content published on Facebook every day, one might wonder how Full Fact will winnow out which claims to check. According to Katie Bamber, Full Fact’s head of communications, the social network will provide the charity with “a queue of content” that is potentially false.

A post on Facebook’s blog last year explained that such a queue is generated automatically based on users’ reports and sceptical comments on a given piece of content. According to a Facebook spokesperson, one signal it will be looking out for is geography. For example, if someone from the UK flags a post from a UK publisher as potentially false, or a potentially false post originates outside the UK but is seen by people in the UK, these could provide reasons to flag the content for review by Full Fact.

It will be up to Full Fact, though, to choose what particular posts to assess, as Facebook will have no say on either what claims Full Fact reviews or the content of its fact-checks. “Factchecking is slow, careful work — so realistically we're not going to be able to factcheck everything that appears on Facebook,” Bamber explains.

“That's one of the reasons we will be prioritising content that we think could potentially do most damage to health, people's safety or to democratic processes.” In a press release, Full Fact highlighted false stories about cancer cures or dubious claims spread in the aftermath of terror attacks as textbook examples of stories with the potential to seriously harm people’s health and safety.

Full Fact will be able to review content not included in Facebook’s queue, but Bamber says that the charity has no plans to do so, although that could change in the future. Every three months, Full Fact will publish a report on the state of misinformation on the platform, and the effectiveness of the fact-checkers’ work.

This project will be funded by Facebook as part of the firm's third-party fact-checking initiative. The funding, Bamber confirmed, will depend on how many pieces of content Full Fact checks. Facebook and Full Fact had previously collaborated on a one-off project in 2017, when Full Fact produced a list of ten tips for spotting "fake news", which Facebook subsequently promoted in users’ news feeds.

While the new partnership is the first to be launched in the UK, Facebook has been working with fact-checking organisations since late 2016, following public outcry over the role “fake news” and foreign-planted disinformation might have played in the US presidential election. As of today, the company has similar arrangements with 49 fact-checkers in 24 countries, all of which are signatories of the International Fact Checking Network Code of Principles.

Facebook has been keen to highlight its fact-checking efforts: in October 2018 the company said that various pieces of research seemed to indicate that its work with external fact-checkers, among other countermeasures, was bringing down the amount of misinformation shared and interacted with on the website. On the other hand, in December 2018, The Guardian reported that some fact-checkers involved in the initiative had grown disillusioned with Facebook’s real willingness to quash misinformation, lamenting that the company never released data on the impact fact-checking was having on the platform.

The revelation, on the pages of The New York Times, that Facebook had hired opposition research firm Definers to disseminate misleading news stories about its critics – including Apple and Hungarian philanthropist George Soros – also added to some fact-checkers’ misgivings about the company’s real commitment to cleansing its platform of junk content.

This article was originally published by WIRED UK