
Wednesday, August 22, 2018

Facebook Is Ranking Users’ Trustworthiness Without Telling Them

Facebook has been ranking users on their “trustworthiness,” assigning each a score between zero and one, according to a report in The Washington Post.

Tessa Lyons, a Facebook product manager, explained to The Post that the new reputation ranking is intended to help crack down on people who report news stories as fake simply because they disagree with them politically.

She explained that it is “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher.” The score is reportedly one of “thousands” of behavioral clues Facebook takes into account when determining whether an account is flagging content maliciously.

“I like to make the joke that, if people only reported things that were false, this job would be so easy!” Lyons explained. “People often report things that they just disagree with.”

The Post put the new revelations about Facebook’s ranking system into the context of the recent actions the site, and its competitor Twitter, took against Alex Jones:
The system Facebook built for users to flag potentially unacceptable content has in many ways become a battleground. The activist Twitter account Sleeping Giants called on followers to take technology companies to task over the conservative conspiracy theorist Alex Jones and his Infowars site, leading to a flood of reports about hate speech that resulted in him and Infowars being banned from Facebook and other tech companies’ services. At the time, executives at the company questioned whether the mass reporting of Jones’s content was part of an effort to trick Facebook’s systems.

The Post also asserted that “experts” claim people on the right coordinate “harassment campaigns” through mass reporting.

To assuage fears about the system, Lyons said the score is not an “absolute indicator of a person’s credibility,” according to the paper’s assessment.

Explaining the concept further, Lyons said, “[...] if someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker, then we might weight that person’s future false-news feedback more than someone who indiscriminately provides false-news feedback on lots of articles, including ones that end up being rated as true.”
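Neither Lyons nor The Post describes the actual formula, but a minimal sketch of this kind of track-record weighting, with every name and the smoothing rule invented here purely for illustration, might look like this:

```python
# Hypothetical sketch only; Facebook's real scoring is not public.
# All class/function names and the smoothing rule are assumptions.
from dataclasses import dataclass

@dataclass
class Reporter:
    confirmed_reports: int = 0  # reports later rated false by fact-checkers
    total_reports: int = 0      # all false-news reports this user has filed

    def trust_score(self) -> float:
        """Score between 0 and 1; Laplace smoothing keeps new users near 0.5."""
        return (self.confirmed_reports + 1) / (self.total_reports + 2)


def weighted_flag_count(reporters: list) -> float:
    """Sum each flag weighted by the reporter's track record, so many flags
    from unreliable accounts count less than a few from reliable ones."""
    return sum(r.trust_score() for r in reporters)


# A user whose past reports were usually confirmed outweighs one who
# flags indiscriminately.
careful = Reporter(confirmed_reports=9, total_reports=10)        # ~0.83
indiscriminate = Reporter(confirmed_reports=1, total_reports=20)  # ~0.09
print(round(weighted_flag_count([careful, indiscriminate]), 2))   # ~0.92
```

Under a toy scheme like this, a flood of flags from indiscriminate reporters barely moves the weighted total, which matches the behavior Lyons describes.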

Facebook is cautious about telling reporters how its behavioral signals are... Read More HERE
