Facebook Has a Secret 'Trustworthiness Score' For Users
Facebook has developed a rating system that scores users on their trustworthiness, the Washington Post reports. The system is part of a broader effort to fight misinformation on the platform.
Facebook has long had a problem with fake and misleading stories on its platform, and the company has launched several initiatives to curb their spread. One of them is a feature that asks users to weigh in on how truthful certain news stories are. If a particular story is reported as false by many people, Facebook employees will review it and determine whether that assessment is accurate.
But that process creates another problem: how do you tell whether the users reporting news stories are themselves trustworthy? To solve this, Facebook created another metric that tracks how often a particular user's assessments agree with Facebook's own conclusions. If users report a story as fake when it's actually true, their trustworthiness score goes down; if they use the tool as intended, it goes up.
It’s “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher,” Facebook product manager Tessa Lyons told the Post.
The "trustworthiness score" is a hidden value only visible to Facebook’s team, and is a single number ranging from 0 to 1. Facebook says that this is just one of several metrics the company uses to evaluate reports from users.
Source: The Washington Post