
Behind Glass Eyes: A.I. Facial Recognition and the Evolved Horror of Eyewitness Error

The recent boom in the use of Artificial Intelligence models such as ChatGPT, Google Bard, and Microsoft Copilot (with a future offering from Apple on the horizon) shows that society is increasingly willing to embrace and rely upon artificial intelligence to handle a great number of tasks for us. Certainly, artificial intelligence can and does provide a great deal of utility to society. However, some technologies must be rolled out more cautiously than others, with facial recognition software likely sitting very near the top of the “tread cautiously” list.

Indeed, while controversy over law enforcement’s use of facial recognition is not new,1 a recent lawsuit filed in Texas will send chills down your spine, as it illustrates the abhorrent dangers of trusting algorithms more than we trust ourselves.

The Unfortunate Story of Mr. Murphy

In January of 2022, Harvey Murphy, a 61-year-old grandfather, found himself locked in a Texas jail, accused of the armed robbery of a Sunglass Hut store. He was arrested not because of crackerjack detective work, not because police had his fingerprints, not because of any DNA left at the scene, but because a “loss prevention” employee at Sunglass Hut used facial recognition software that matched blurry CCTV footage of the robbery to Mr. Murphy. There was just one glaring problem: Mr. Murphy was, demonstrably and without question, hundreds of miles away in California at the time of the crime.

What was Mr. Murphy doing in California, you may ask, and how did they know it with enough certainty to secure his release without a trial? In a twist, Harvey Murphy’s alibi for the Texas robbery, his documented presence in California, stemmed from the fact that he was in a Sacramento jail at the time, having been arrested for violating a condition of his probation by not reporting his whereabouts. His probation stemmed from nonviolent crimes. Ironically, the very system intended to hold him accountable provided the proof he needed to exonerate himself from the unjust accusation of armed robbery.

And if the loss of liberty was not egregious enough, during his unjust incarceration, Mr. Murphy suffered an unspeakable horror. Hours before he was to be released from jail, Mr. Murphy “was followed into the bathroom by three violent criminals. He was beaten, forced on the ground, and brutally gang raped,” according to the lawsuit filed by Mr. Murphy against the retailer.

It Has Happened Here

What happened to Mr. Murphy, which seems torn right out of an episode of Black Mirror, is of course tragic and terrifying, but there is nothing new under the sun. Eyewitness testimony, long considered the gold standard of courtroom evidence, has been repeatedly shown in recent years to be inherently unreliable.

Let’s remember Mr. Wilton Dedge, the Florida man who spent approximately 20 years behind bars before DNA finally set him free. Mr. Dedge was convicted in 1982 of aggravated battery and burglary. At his trial, the prosecution presented eyewitness identification, microscopic hair comparison, snitch testimony, and dog-scent evidence to secure the conviction, all of which were eventually disproven by DNA evidence. Eerily, part of the defense Mr. Dedge presented at trial was sworn testimony from his mother and brother that he was not even in town when the crime occurred.

Mr. Dedge was released from prison in 2004 with assistance from The Innocence Project, an organization dedicated to exonerating the wrongly convicted that has helped secure the release of hundreds of people, many of whom, like Dedge, were imprisoned based upon eyewitness misidentification.

What’s Old is New – But is Now More Disgusting

Mr. Murphy’s ordeal echoes these injustices, but with a modern spin. Like many exonerated before him, he faced the double whammy of being stripped of his freedom and grappling with the psychological trauma of wrongful accusation. But his story takes a more surprising turn: his arrest didn’t occur in the midst of a high-pressure investigation, but during a seemingly mundane errand. Mr. Murphy was at the DMV renewing his driver’s license when he was arrested. This detail underscores the insidious reach of private-sector facial recognition, where corporations, not police officers, can hold the keys to a person’s jail cell.

Mr. Murphy’s tragic story serves as a cautionary tale about blind trust in algorithms and the dangerous ease with which corporations wielding facial recognition can influence law enforcement. The police, seemingly accepting the software’s match as gospel, apprehended Mr. Murphy based solely on the private company’s faulty analysis. This raises chilling questions about accountability: who is responsible when algorithmic bias results in an innocent person being wrongfully imprisoned?

We all have a dog in this fight.

I do not practice in the area of criminal law, and I am not licensed to practice in Texas, where these tragic events unfolded. That said, I took an oath to uphold the Constitution of the United States, and these events are precisely the type of government overreach that we, as citizens, should be protected from. Law enforcement should not be able to skirt due process by leaning on a private company’s false accusation to deprive an innocent man of his life and liberty. If artificial intelligence is going to further degrade our constitutional rights, the time to do something about it is now, not years from now when more of our liberties have been taken, all in the name of corporate profit margins.

Mr. Murphy’s story should be a wake-up call, not just for legal professionals but for all of us.

Be outraged.

In the pursuit of truth and justice, we must demand meticulousness and critical thinking, remembering the lessons of eyewitness error and recognizing the inherent fallibility of technology. We must question any blind trust placed in technology and algorithms, especially when life and liberty hang in the balance. We should oppose corporations’ use of this technology to drop the hammer of the law, all in the name of shareholder profit. And we must demand accountability, ensuring that the consequences of flawed technology don’t fall solely on the shoulders of innocent individuals like Mr. Murphy. In the land of the free, we, as Americans, should tolerate nothing less.

If you would like to take action, below is a link to locate the elected officials representing you. You can reach out to them to let them know you will not stand for this.

https://www.usa.gov/elected-officials

  1. The use of facial recognition technology by law enforcement has been a controversial topic for years. In 2021, Amazon extended “until further notice” a moratorium it had previously imposed on law enforcement’s use of its AI facial recognition software, Rekognition. And in 2023, despite previously publicly denouncing its usage, IBM signed a $69.8 million contract with the British government to develop a national biometrics platform that will offer a facial recognition function to immigration and law enforcement.
