
Meta

Meta's tactics for fighting disinformation on their platforms include blocking and removing fake accounts, finding and taking down bad actors, limiting the spread of misinformation, giving people more context and accurate information, and constantly evolving their policies to ensure they remain relevant and timely. They now have more than 40,000 people working on safety and security across their platforms to combat such threats. They have also expanded their third-party fact-checking capacity in the Russian and Ukrainian languages across the region and have provided additional financial support to Ukrainian fact-checking partners.

More broadly, they continue to grow the number of independent fact-checking partners they work with, have improved their machine learning capabilities to find and remove violating behaviour more effectively, have disrupted more than 100 coordinated inauthentic behaviour campaigns, and have grown their team focused on influence operations to more than 200 people across the company. Details of Meta's Third-Party Fact-Checking Program are published on Meta's website, and the criteria for partners include adherence to the International Fact-Checking Network's Code of Principles.

They continue to publish their Community Standards Enforcement Report, providing metrics on how well they enforce their policies globally. Meta have their own definitions of disinformation and misinformation, which can be found on page 2 of their submission to the UN Special Rapporteur on Freedom of Opinion and Expression for the Report on Disinformation.

Facebook have expanded their Community Standards to specifically include their approach to misinformation and to clarify their actions in this space. Meta's policies are available on their website.