
Scientists aim to spot abusers from their hands

Image: a man's hands in handcuffs (Getty Images)

Scientists are hoping to find a way to identify child sex abusers just from images of their hands.

Often, the backs of hands are the only visible features of abusers in footage and images shared online.

A new study aims to discover whether our hands are truly unique by looking at physical differences between them.

Researchers plan to do this by training computers to spot anatomical features in anonymous images sent by the public.

Scientists hope this will allow them to design algorithms that help police link suspects to crimes from images of their hands alone.
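To make the idea concrete, here is a minimal, purely illustrative sketch of how an image-matching system of this general kind might work. It is not the H-Unique team's published method: it reuses a generic ImageNet-pretrained network rather than a model trained on hand anatomy, and all file names are hypothetical placeholders. Each hand image is mapped to a numerical feature vector, and candidate matches are ranked by how similar their vectors are:

```python
# Illustrative only: a generic image-embedding pipeline, not the
# H-Unique project's actual method. The backbone is an off-the-shelf
# ImageNet ResNet-18; all file names are hypothetical placeholders.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Reuse a pretrained CNN as a feature extractor by dropping its
# classification head, leaving a 512-dimensional feature vector.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed(path: str) -> torch.Tensor:
    """Map one hand image to a feature vector."""
    img = Image.open(path).convert("RGB")
    with torch.no_grad():
        return backbone(preprocess(img).unsqueeze(0)).squeeze(0)

def similarity(a: torch.Tensor, b: torch.Tensor) -> float:
    """Cosine similarity between two embeddings (closer to 1 = more alike)."""
    return torch.nn.functional.cosine_similarity(a, b, dim=0).item()

# Compare a query image against a gallery of known hand images
# (hypothetical paths) and rank the closest matches first.
# query = embed("query_hand.jpg")
# ranked = sorted(gallery_paths,
#                 key=lambda p: similarity(query, embed(p)), reverse=True)
```

The actual research described in this article targets specific anatomical features, such as vein patterns, skin creases, freckles, moles and scars, which would require specialised models and training data well beyond this generic example.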

'Step-change in science'

Scientists from the universities of Lancaster and Dundee are now calling for more than 5,000 "citizen scientists" to take part in their study, so there is enough data to establish beyond reasonable doubt whether our hands are unique.

Forensic anthropologist Prof Dame Sue Black said: "Our hands display many anatomical differences due to our development, influence of genetics, ageing, environment or even accidents.

"We know that features such as vein patterns, skin creases, freckles, moles, and scars are different between our right and left hands, and even different between identical twins.

"We are looking to deliver a step-change in the science so we can analyse, and understand, all the factors that make a hand unique."

A web-based app allowing anyone aged 18 and over to contribute their images to the project is available on smartphones at h-unique.lancaster.ac.uk.

The images will not be shared with any external agencies and will be destroyed at the end of the five-year research project, which is funded by a €2.5m (£2.1m) grant from the European Research Council.

Dr Bryan Williams, lecturer in biometrics and human identification at Lancaster University, said: "The tools we will develop will reliably and robustly inform decisions in criminal courts.

"They could also be used to assist law enforcement agencies to rapidly and autonomously analyse hours of footage and thousands of offensive images."