Criminal Television: Affecting more than just entertainment

Let’s create a scenario. A man is shot dead in the middle of a busy Brooklyn street. A suspect is later found a few blocks away from the crime scene, a gun in his backpack and blood on his shirt. A group of eyewitnesses immediately identifies him as the shooter. Is this evidence sufficient to incriminate the man beyond a reasonable doubt? Historically, this would have been more than enough to link a suspect to a crime. However, with the growing popularity of crime-based shows such as CSI: Crime Scene Investigation and Bones, jurors are shifting gears.

A new trend has emerged in which jurors are no longer supporting convictions without DNA analysis. In one recent case, a jury even let a man who had been identified by multiple eyewitnesses as a shooter walk free for lack of DNA evidence. The trend has been dubbed the CSI Syndrome, the CSI Infection, or, most commonly, the CSI Effect. It refers to the belief that jurors now demand a higher standard of certainty before supporting a guilty verdict: they expect significantly more forensic evidence from the prosecution and have raised the bar for what counts as adequate proof. Correspondingly, jurors give circumstantial evidence far less weight than they did decades ago. What’s the issue with this emerging trend? Reality and television differ vastly.

Although in a television show an acting detective may conveniently find and collect a perfect fingerprint or DNA sample, in reality such luck is rarely so painless. Fingerprint collection is subject to inaccuracy depending on the quality of the equipment and the competence of the lab personnel; prints degrade over time; and collection can be nearly impossible in public places flooded with fingerprints. There have been many cases of false imprisonment caused by identification errors made by expert examiners—such as the arrest of Oregon lawyer Brandon Mayfield in connection with the Madrid terrorist bombing, after a fingerprint recovered in Spain was found to be similar to his.

Three top-ranking FBI analysts made the same identification mistake when they linked Mayfield’s fingerprint to the bombing. Following his false imprisonment, the Federal Bureau of Investigation was forced to recant its long-standing claim that fingerprint identification was an exact science yielding 100 percent certainty. Similarly, over the last 100 years there have been numerous cases in which convictions were reversed after the underlying fingerprint identifications were shown to be faulty.

The International Association for Identification (IAI) offers certification tests to people already working as professionals in the field. Astonishingly, over half of the examiners who take the test fail it. Furthermore, virtually no crime laboratory in the United States requires certification as a condition of employment, and no judge requires an examiner to be certified before testifying in court. For this reason, many believe the issue is not the physical evidence but rather the expert analyzing it. Rick Jackson, for example, was falsely imprisoned after two fingerprint analysts positively linked him to a bloody crime-scene fingerprint. Although the prosecution had two analysts confirming the connection, the defense’s two certified analysts rejected the identification—demonstrating human fallibility.

Other DNA sources, such as blood, semen, skin, saliva, or hair, are not excluded from imprecision either. Although human DNA is thought to hold the full genetic “blueprint,” nearly 95 percent of DNA is still not understood. An interesting article published by the United Kingdom’s WideShut Webcast: Alternative News and Views shed light on a novel controversy. Dating back to the era of the Romans, criminal trials have been governed by the principle of “innocent until proven guilty.” With the rising popularity of crime-scene DNA analysis, however, this ancient standard is being reversed: with DNA collection, a suspect is essentially guilty until proven innocent.

Similar to the human errors seen in fingerprint identification, the same fallibilities can occur with DNA collection. In 2012, for example, a man named Adam Scott was positively matched to DNA found on a rape victim 300 miles away in Manchester. Scott, who had never been to Manchester and who was incarcerated on unrelated charges at the time of the incident, was in no way connected to the crime aside from the unusual DNA match. How could this have happened? A private firm in England had failed to properly dispose of a used plastic tray during the robotic DNA extraction process—contaminating the evidence and wrongfully incriminating an innocent man.

Even where the risk of human error in DNA collection is low, the process comes with high costs. If a person is circumstantially and undeniably linked to a crime, is DNA analysis still needed to further validate their guilt? Is it better to be safe than sorry? This relatively new phenomenon has gained international support since its introduction into criminal investigation in 1986 and has resulted in the exoneration of 330 people within the United States (as of 09/03/2015). Of these 330 wrongfully convicted people, the average length of time served was 14 years, and the average age at the time of incarceration was 26.5. Twenty of the 330 were on death row, and 16 others were charged with capital crimes. Given DNA’s high success rate within our criminal justice system, should it become a regular part of our procedures? Do the pros outweigh the cons?