Evidence Toolkit

Pilot ended 28th December 2018
Evidence Toolkit trailer
Team up with BBC School Report and take fake news to task with the Evidence Toolkit. Equip yourself with all you need to dissect the news and figure out what's really going on.

The Inside Story

To explain the project, here's Professor Chris Reed of the Centre for Argument Technology at the University of Dundee.

Can you sum up the project?

The Evidence Toolkit is designed to equip you with a toolbox full of gizmos that help you dissect news and figure out what's really going on. It's being deployed as part of BBC School Report, which this year is focusing on the challenge of fake news.

How was it created?

A lot of thinking around fake news focuses on how an article relates to other things: where a picture comes from, what the source of an article is, and so on. We're focusing on a different and insightful angle: digging into the actual content of the article. Looking at claims and evidence (and, crucially, at how they are connected), and looking at how well an article presents an impartial, balanced assessment, are all ways of probing news.

We've carefully selected news articles from recent media and teased them apart to show the many ways in which evidence, claims, and objections come together into a coherent story. Each article is then presented as a series of challenges for the user: Where is the claim? What type of evidence is supplied? How well is that evidence working? How well are alternative views considered?
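To make that structure concrete, here is a minimal sketch of how an annotated article and its challenges might be modelled. Every name and category below is a hypothetical illustration, not the toolkit's actual schema:

```python
from dataclasses import dataclass, field
from enum import Enum


class EvidenceType(Enum):
    """Broad evidence categories a reader might be asked to identify (illustrative)."""
    EXPERT_OPINION = "expert opinion"
    STATISTICS = "statistics"
    EXAMPLE = "example"
    COMMON_KNOWLEDGE = "common knowledge"


@dataclass
class Challenge:
    """One question posed to the user about a passage of the article."""
    prompt: str          # e.g. "Where is the claim?"
    passage: str         # the span of article text under scrutiny
    options: list[str]   # candidate answers shown to the user
    answer_index: int    # position of the correct option


@dataclass
class AnnotatedArticle:
    """A news article teased apart into claims, evidence, and objections."""
    headline: str
    claims: list[str]
    evidence: list[tuple[str, EvidenceType]]
    objections: list[str]
    challenges: list[Challenge] = field(default_factory=list)
```

Tying each challenge to a specific passage mirrors the question sequence described above: find the claim first, then the evidence type, then judge how well the evidence works.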

Many of the ideas and skills are quite slippery, so we use a series of illustrations from Radio 4's Moral Maze showing how expert debaters put the concepts to work in real situations.

How was it built?

The software has been developed using a variety of web technologies at the front end to ensure a clean and consistent user experience across platforms. At the back end, there is some state-of-the-art deep learning going on to supply the ReasonChecker, which acts as a guide to help you if you get stuck.
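The interview doesn't name the model behind the ReasonChecker beyond "deep learning", so as a rough, hypothetical illustration of the underlying task, here is a minimal PyTorch sketch of a sentence-level classifier that tags sentences as claim, evidence, or other. The architecture, label set, and sizes are assumptions made for the example, not the project's actual system:

```python
import torch
import torch.nn as nn

LABELS = ["claim", "evidence", "other"]  # illustrative label set


class SentenceClassifier(nn.Module):
    """Toy BiLSTM that maps a token-ID sequence to a claim/evidence/other label."""

    def __init__(self, vocab_size: int, embed_dim: int = 128, hidden: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, len(LABELS))

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) integer-encoded sentences
        embedded = self.embed(token_ids)
        _, (h_n, _) = self.encoder(embedded)
        # concatenate the final forward and backward hidden states
        sentence_vec = torch.cat([h_n[0], h_n[1]], dim=-1)
        return self.head(sentence_vec)  # (batch, len(LABELS)) logits


model = SentenceClassifier(vocab_size=10_000)
dummy = torch.randint(0, 10_000, (2, 20))  # two fake 20-token sentences
print(model(dummy).argmax(dim=-1))         # predicted label index per sentence
```

A production system would more likely fine-tune a pretrained transformer and predict relations between spans as well as labels, but the shape of the task is the same: sentences in, claim/evidence judgements out.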

Any challenges, and what were the solutions?

There are two big challenges. First, we had to trawl through hundreds of hours of Moral Maze episodes to find examples that lay out clear illustrations of often pretty tricky ideas. The solution: lots of listening and searching!

The second big challenge is known in scientific research as “argument mining”: automatically finding claims and reasons. It's one of the toughest AI language-processing tasks around at the moment, which is why we need to be clear that this prototype is not an infallible oracle. But it represents the state of the art. You can find out more from this article.

What do you hope to learn from it being on Taster?

The challenge we're addressing is how to help young people in particular deal with a world in which fake news and biased reporting are ever more common. It's partly about getting across a demanding skillset, and partly about doing it in a way that is engaging and appealing. So the big questions for this Taster deployment are whether we can get the skills across and whether we can do it in a way that's still fun.

What next?

Potential future steps include exploring further collaborations between ARG-tech and additional BBC departments, alongside Radio 4's Moral Maze, to scale up the public availability of tools that help all of us deal with the onslaught of conflicting media.
