This year we have all seen, vividly, the clear connection between disinformation and real-world harm. The ´óÏó´«Ã½â€™s North America Editor, Jon Sopel, spoke movingly at the ´óÏó´«Ã½ Academy Trust In News conference about reporting on the events of 6 January, when, incredibly, the foundations of the most powerful democracy on the planet were shaken by the impact of disinformation. And Covid-19 has also made the potential harm of fake cures and bad health advice crystal clear. The drip drip drip of bad information corrodes civic debate, threatens democracy and costs lives. So, what to do about it?
The Trust In News conference brought together expert speakers from Chennai to Brazil and from California to Indonesia over three days to share what we are finding out: the contours of the emerging disinformation landscape and what news organisations and tech platforms around the world are doing in response.
First, those pushing disinformation are learning from us - the established news organisations. Increasingly, their content is highly produced. We have relied on our own high production values as one of the ways audiences know they can trust us: we can’t rely on that implicit signifier any longer.
They are using apparently trustworthy sources. Anti-vax content, for instance, often uses interviews with people who have medical degrees.
And there is frequently a grain of truth to what is claimed, which makes untangling the true from the false harder. Go back and watch ´óÏó´«Ã½ Reality Check’s Peter Mwai. Speed matters - but so does accuracy, of course.
But in this new environment, we need to double down on what makes us different. ´óÏó´«Ã½ Director-General Tim Davie stressed the crucial importance of news that is impartial. That was echoed by Clara Jiménez Cruz from Maldita, a Spanish non-profit focused on stopping disinformation: she said audiences won’t believe your fact-checking content unless they think you are impartial.
We heard there is a hunger for holding power to account. Worldwide, many politicians are using their vast online reach and emotionally satisfying messages to push disinformation. Maria Clara Pestre from AFP talked about how to fact-check the Brazilian president, Jair Bolsonaro - being battle-ready involves keeping alert, having experts on standby and, above all, working fast.
But a theme raised several times was that we have to learn from those pushing disinformation - the bad guys, as one speaker put it. Disinformation often gives people agency, social capital and community: we heard, for example, that QAnon supporters disproportionately work at night. Rebecca Skippage, who leads on disinformation at ´óÏó´«Ã½ Monitoring, has written about this as part of her Reuters Institute Journalist Fellowship. Building on her findings, ´óÏó´«Ã½ Monitoring is working with ´óÏó´«Ã½ Local Radio to listen to our audiences’ experience of the disinformation they come across in their networks.
Cameron Barr from the Washington Post laid down the challenge of telling stories in ways that will resonate - he said the Post was looking at more innovation in data visualisation, and at new audio and video formats to enlighten and inform. Santanu Chakrabarti, who leads Audience Insight at ´óÏó´«Ã½ World Service, made a prediction: India has super-cheap mobile data charges, and the effect has been that audiences want to communicate with more pictures and less text. When data that cheap becomes available in other countries, audiences there will also look for content that contains more images.
And no one can have watched day two’s session without reflecting on the human cost of disinformation - especially for women and people of colour. If you didn’t see it, go back and watch it, including Marianna Spring’s very powerful testimony. Fighting disinformation now means being involved with a story in a very personal way long after you have published your journalism. Trolling has a cost - and we all need to protect those on the frontline.
So, what action can we take? Through the Trusted News Initiative, we are working with our partners where we can. That partnership runs from Facebook, Microsoft, Google and Twitter through to Reuters, AFP, AP, the EBU, The Hindu and many others. Being part of this partnership will never muzzle our journalism, though.
Almost two years ago, we came together with our partners to fight the most serious disinformation - content that poses an immediate threat to life or to the integrity of the electoral process worldwide, and that has the potential to go viral.
It’s about agreeing standards for countering some of the most harmful disinformation.
But it is also about creating a system. We have a fast alert - a sort of break-glass moment - to warn each other when we come across disinformation that meets that high bar, including the most potentially harmful imposter content: material that purports to come from the ´óÏó´«Ã½, for instance, but doesn’t. One example of the fast alert in use came when Covid conspiracists in several countries urged people to attack 5G infrastructure.
We also talk about the evolution of disinformation, about media education - and about technological solutions.
The ´óÏó´«Ã½ World Service is investing in research to find out what works. Dr Rasmus Kleis Nielsen, Director of the Reuters Institute for the Study of Journalism, presented research designed to counter false and misleading information about Covid-19 on social media platforms across the UK, Brazil and India.
Meanwhile, Project Origin is an engineering approach spearheaded by the ´óÏó´«Ã½, Microsoft, CBC/Radio-Canada and the New York Times to detect the provenance of media and fight deep fakes. The ambition is to make it a global standard.
This is a work in progress - but the role of the ´óÏó´«Ã½ has never been more important.