Facebook has rolled out a series of updates to its website policies aimed at maintaining the integrity of information flowing through its platform and fighting misinformation and harmful content, including hate speech.
The world’s leading social media company said Wednesday it had introduced a new section on its Community Standards site where users can track monthly updates and stay informed of policy changes, which the company says are shaped by new trends flagged by content reviewers and outside experts.
Facebook will closely monitor online communities, or Facebook Groups, and will curb their influence by reducing their reach if they are found to “repeatedly share misinformation.” The new limitation took effect worldwide on Wednesday and reduces the overall News Feed distribution of Groups whose content is rated false by independent fact-checkers.
Facebook also launched a new signal called “Click-Gap” globally on Wednesday, incorporating it into News Feed ranking. The signal is designed to limit the spread of websites that are disproportionately popular on Facebook relative to the rest of the web, helping steer people away from low-quality content on the platform. Facebook has in recent months ramped up its fight against misinformation and false content as it faces growing pressure from the public and regulators over how it manages content on its site.
The social media company has been criticized for failing to act quickly enough to stop a suspected terrorist from live-streaming mass terror attacks on two mosques in New Zealand last month. The massacre, which killed 50 people, was briefly broadcast on Facebook before the video clips were removed from the platform.