In an effort to combat misinformation on its platform, Facebook is introducing new ways to alert users when they interact with content that has been flagged as false by a fact-checker, as well as tougher penalties for people who share false information repeatedly.
From false or misleading content about COVID-19 and vaccines to misinformation about climate change, elections, and other topics, the tech giant is working to combat false information and ensure that fewer people see it on its platform.
The company has also revamped the notifications that appear when users share content that has been deemed false by fact-checkers.
“We currently notify people when they share content that a fact-checker later rates, and now we’ve redesigned these notifications to make it easier to understand when this happens,” Facebook said in its blog post. The notification includes a link to the fact-checker’s article debunking the claim, as well as a prompt to share that article with their followers. It also warns that people who repeatedly share false information may have their posts moved lower in News Feed, making them less visible to others.
How will Facebook users be notified about fake news pages?
If you try to like such a page, a pop-up will appear warning that the page has “repeatedly shared false information,” as confirmed by independent fact-checkers.
You will then be given the option of going back or following the page anyway. The idea is that people can make an informed decision about whether they want to follow the page.
Facebook also announced stronger penalties for individual accounts that repeatedly spread misinformation: if an account keeps sharing content that has been rated false by the company’s fact-checking partners, the distribution of all posts from that account in News Feed will be reduced.