Facebook says it will punish individuals who repeatedly share false information. The company introduced new warnings that will inform users that repeatedly sharing false claims might result in “their posts moved lower down in News Feed so other people are less likely to see them.”
Until now, the company’s policy has been to down-rank individual posts that are debunked by fact-checkers. But posts can go viral long before they are reviewed by fact-checkers, and there was little incentive for users not to share these posts in the first place. With the change, Facebook says it will warn users about the consequences of repeatedly sharing misinformation.
Pages that are considered repeat offenders will show pop-up warnings when new users attempt to follow them, and individuals who regularly share false information will receive notices that their posts may be less visible in News Feed as a result. The notices will also link to the fact check for the post in question and give users the chance to delete the post.
The update comes after a year in which Facebook has struggled to control viral misinformation about the coronavirus pandemic, the presidential election and COVID-19 vaccines. “Whether it’s false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we’re making sure fewer people see misinformation on our apps,” the company wrote in a blog post.
Facebook didn’t say how many posts it would take to trigger the demotion in News Feed, but the company has used a similar “strike” system for pages that share misinformation. (That policy has drawn controversy after reports that Facebook officials removed “strikes” from popular conservative pages last year.)
Researchers who study misinformation have pointed out that it’s often the same individuals behind the most viral false claims. A recent report from the Center for Countering Digital Hate found that the vast majority of anti-vaccine misinformation was linked to just 12 individuals.