Facebook is now rating users on how trustworthy they are in identifying and reporting real and false news.
The ratings system was first reported by The Washington Post. A Post reporter interviewed Facebook product manager Tessa Lyons, who leads the company's efforts to fight misinformation.
Lyons told the newspaper the system was developed and put into place over the past year.
In January, Facebook announced a similar system to produce ratings for the trustworthiness of news sources. At that time, Facebook founder Mark Zuckerberg said those ratings would be based on information provided by Facebook users.
Zuckerberg said news sources receiving higher trustworthy ratings from the community would be prioritized in the social media service's News Feed.
The new ratings system is designed to predict how effective a user is at identifying and reporting false news stories. Every user is given a rating between zero and one, the Post reports.
Lyons told the newspaper such a system is necessary because some users are incorrectly reporting whether a story is true or false.
It's "not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they're intentionally trying to target a particular publisher," Lyons said.
As an example, she said a user's trustworthiness rating would go up if they previously reported a news story as false, and the story was later confirmed as false by an independent fact checker.
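The mechanics of such a score have not been published, but the behavior Lyons describes, a value between zero and one that rises as a user's reports are confirmed by fact checkers, could be sketched as a smoothed accuracy average. The function name and the smoothing prior below are illustrative assumptions, not Facebook's actual method:

```python
def trust_score(confirmed_reports, total_reports, prior=0.5, prior_weight=2):
    """Hypothetical sketch: return a score in [0, 1] based on how often
    a user's false-news reports were later confirmed by fact checkers.

    A smoothing prior keeps users with few reports near a neutral 0.5
    instead of jumping to 0 or 1 after a single report. This is an
    illustration of the described behavior, not Facebook's real system.
    """
    return (confirmed_reports + prior * prior_weight) / (total_reports + prior_weight)

# A user with no history starts neutral; confirmed reports raise the score,
# unconfirmed reports lower it.
print(trust_score(0, 0))    # neutral starting point
print(trust_score(9, 10))   # mostly confirmed reports -> high score
print(trust_score(1, 10))   # mostly unconfirmed reports -> low score
```

Under this sketch, a user whose reports keep being confirmed trends toward 1, matching Lyons' example of a rating going up after a confirmed report.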
In addition to information from users, Facebook uses machine learning systems to choose stories to be checked for truthfulness. The company has a partnership with several major news and fact-checking organizations that examine news stories reported as possibly being false.
Lyons noted that the numbered rating is not the single measure Facebook uses to judge a user's overall trustworthiness. She said the company also uses other "signals" to rate users, but did not provide further details.
Efforts to fight misinformation
In addition to taking steps to fight misinformation, Facebook has also sought to limit efforts by foreign organizations to influence the U.S. political process. Facebook previously found evidence that false accounts created in Russia and other nations were used to try to influence American voters in the 2016 election.
On Tuesday, Facebook said it had identified and removed hundreds of accounts linked to Russia and Iran. It said the accounts were part of separate disinformation campaigns on Facebook.
In announcing the findings, Facebook chief Zuckerberg said there was still a lot the company does not know about the operations. However, he described the campaigns as "sophisticated" and well-financed efforts that are likely to continue.
"You're going to see people try to abuse the services in every way possible...including now nation states," Zuckerberg said.
Spokesmen for both Iran and Russia denied any state involvement in the activities described by Facebook.
This week, American software maker Microsoft reported it had taken control of several websites created by hackers linked to Russia's government. The company said the websites were made to look like they belonged to the U.S. Senate and conservative research groups. But they were actually false websites created in an effort to gather personal details of users.
Microsoft warned the hacking incidents were further evidence that Russia is expanding its attacks before U.S.congressional elections in November.