Television is filled with police procedurals that portray latent print and other forensic examiners as prestigious professionals who use hard evidence to convict hardened criminals. The reality is that no matter how much it is glamorized, fingerprint analysis has its share of flaws. It comes with undeniable benefits, to be sure, but it is equally important to consider its disadvantages.
While DNA and fingerprint analyses do not prove the guilt or innocence of suspects, they can provide compelling evidence. Unfortunately, only about 1 percent of major crimes yield these types of hard evidence. Juries are therefore more often forced to rely on subjective forms of evidence such as eyewitness testimony. Although opposing counsel has the right to cross-examine witnesses, psychological studies have shown that cross-examination does not always overcome the flaws inherent in personal testimony, such as storytelling bias and the reconstruction and distortion of memory.
In 1924, James W. Preston of Los Angeles was arrested on a minor charge. Soon afterward, Los Angeles newspapers ran stories based on misinformation stating that fingerprint evidence had identified him as the assailant in a recent robbery and shooting. The jury convicted Preston on the strength of the news stories even though none of that evidence was presented at trial; two years later, the real felon was discovered after being arrested for other burglaries. In 2004, Brandon Mayfield of Oregon was wrongly arrested in connection with the Madrid, Spain, train bombings after FBI investigators claimed a 100 percent fingerprint match. Weeks later, an Algerian national was found to be the true source of the print, leaving the public to question the validity of fingerprint analysis.
Becoming a latent print examiner requires a bachelor’s degree, a minimum of 80 hours of formal training and at least two years of full-time experience. Less desirable duties of an examiner include preparing court exhibits, providing testimony, preparing reports on print examinations and training other officers and investigators in proper fingerprinting techniques.
An alternative to traditional fingerprint analysis called brain fingerprinting emerged in the early 1990s. Like traditional fingerprinting, brain fingerprinting aims to determine with a high degree of accuracy whether a suspect was present at a crime scene. Its proponents estimate that the technique applies to approximately 60 to 70 percent of major crimes, giving it the potential to have a significant impact on the criminal justice system. A judge first ruled brain fingerprinting evidence admissible in court in a 2002 case, and you may see it used more frequently in the future.