Fingerprints


The National Institute of Justice has recently awarded a substantial grant to researchers at Virginia Tech to develop a way of measuring, and a standard for, the sufficiency of the information available in fingerprint patterns. This is an important step forward. While prints taken in controlled conditions - at border checkpoints, for instance - can be analysed by machine, those from, say, the scene of a crime are much less clean, and may be distorted or partial.

Such fingerprints cannot be analysed by machine; instead they rely on the judgement of human fingerprint experts. Unfortunately, there is no quantitative standard used by the worldwide fingerprint community to determine the quantity and quality of the information in an image, or even the number of points of comparison required for an identification.

For those who are not familiar with the details, I should perhaps explain that fingerprinting in criminal cases does not rely on a complete match between the two prints, only on a certain number of 'points' on the prints matching. The number of points 'required' seems to vary with the expert. In the US, fingerprint experts have declared positive matches in court after finding fewer than eight matching points. In the past this has led to innocent people being arrested.

For instance, in 2004 three FBI fingerprint experts declared that Oregon lawyer Brandon Mayfield's fingerprints matched a partial fingerprint found on a bag in Madrid containing detonators. They claimed that there were 15 points of similarity. US officials called the match "absolutely incontrovertible", and Mayfield was immediately taken into custody. In point of fact, the fingerprint belonged to Algerian Ouhnane Daoud, and shortly afterwards the FBI was forced to climb down and eat humble pie over the whole affair.

Clearly, it's important to establish a baseline for just what does constitute a match, and for the uncertainty attached to that match. Equally clearly, the fewer points you match, the greater the chance of false positives. Unfortunately, the more points you require, the fewer matches can be done by machine, and the more you have to rely on human experts, who are slower, more expensive, and (arguably) more error prone. It is this that makes the forthcoming Virginia Tech study so important. If we can at least get everyone using the same measuring stick, we are in with a chance. That on its own is not enough - as far as I am aware, there is actually no scientific proof that fingerprints are unique - but you can't even start to look at anything else without such standards. Fingers crossed that the study comes up with some useful metrics!
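The false-positive trade-off can be sketched with a toy model. If we pretend (quite unrealistically) that each compared minutia point matches a random other print independently with some fixed probability, then the chance of a purely coincidental match at k or more points is a binomial tail sum. All the numbers below are invented purely for illustration - they are not real fingerprint statistics:

```python
from math import comb

def chance_match_prob(n, k, p):
    """Probability that at least k of n compared minutiae match purely
    by coincidence, assuming each matches independently with probability
    p - a gross simplification of real fingerprint comparison."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Toy numbers: suppose each point has a 1-in-10 chance of a coincidental
# match, and 20 points are available for comparison.
for k in (8, 12, 16):
    print(f"threshold of {k} points: {chance_match_prob(20, k, 0.1):.2e}")
```

Even in this crude model, dropping the threshold from sixteen points to eight raises the chance of a coincidental match by several orders of magnitude, which is the intuition behind the worry about low point counts.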
http://www.physorg.com/news178810066.html
http://www.washingtonpost.com/wp-dyn/articles/A64711-2004May28.html
http://www.onin.com/fp/fphistory.html

In a totally ironic piece of serendipity, having written the above, I was looking through the Royal Society's 'Trailblazing' website with a view to reporting on it when I discovered an entry for 1891 - Galton's paper on the proof that fingerprints are unique. Galton's fingerprint categorisation system is still in use today. It was even more ironic that my pdf reader reported the file was damaged and couldn't be shown!

In spite of this little hiccup (and it may have been a problem with my pdf reader), the site is a superb collection of classic pioneering papers published by the Royal Society in the 350 years of its existence. You can read Dirac's paper predicting the existence of the positron (you need to be good at sums for this one), Watson and Crick on the structure of DNA, Captain James Cook on the prevention of scurvy, and Isaac Newton on light and colour. Well worth a look for anyone interested in science and its history.
http://trailblazing.royalsociety.org/

Alan Lenton
6 December, 2009

