Tuesday, 27 January 2026, 09:27
How artificial intelligence is reopening unsolved case files in police departments

Khaberni - More than 30 years after a crime whose details had vanished into the drawers, the victim's name suddenly returned to the forefront. No new witness appeared and no late confession was recorded; an algorithm simply revisited old recordings and connected two threads that had never met before. In the age of artificial intelligence, long silence is no longer the end of the story.

The silent evidence speaks
For decades, the evidence accumulating in police departments was more of a burden than an opportunity: unclear surveillance footage, incomplete fingerprints, massive communication logs that could not be sorted manually, and forensic reports closed without result.

Today the scene has changed. Image enhancement, pattern recognition, and big-data analysis have redefined the "value" of this evidence: what was unusable in an investigation in the 1990s now serves as raw material for re-analysis, and possibly for reopening cases that went unanswered for years.

In some cases, re-examining DNA databases with newer algorithms has narrowed the scope of suspicion; in others, analyzing old, poor-quality footage has extracted details the human eye could not detect at the time. Time, once the investigator's enemy, has suddenly become a technical ally.

In Alaska, this development has offered new hope for the files of missing Indigenous people, cases long considered a "dead end" because of decades of unsorted paper documents and evidence.

The new investigator… an algorithm that never sleeps
Inside modern investigation rooms, paper files are no longer the star. Artificial intelligence systems now work around the clock, scanning millions of records in record time, detecting intersections between phone locations and vehicle movements, connecting disparate reports, and extracting threads that may lead to new investigative paths.
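The kind of cross-referencing described above can be sketched in a few lines. The record format, identifiers, thresholds, and coordinates below are all hypothetical illustrations, not drawn from any actual police system:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt

@dataclass
class Ping:
    source_id: str       # hypothetical label, e.g. a phone or a plate
    when: datetime
    lat: float
    lon: float

def haversine_km(a: Ping, b: Ping) -> float:
    """Great-circle distance between two pings, in kilometres."""
    dlat = radians(b.lat - a.lat)
    dlon = radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def intersections(phones, vehicles, max_km=0.5, max_gap=timedelta(minutes=10)):
    """Return (phone_id, vehicle_id) pairs seen close together in space and time."""
    hits = set()
    for p in phones:
        for v in vehicles:
            if abs(p.when - v.when) <= max_gap and haversine_km(p, v) <= max_km:
                hits.add((p.source_id, v.source_id))
    return sorted(hits)

# Illustrative records: one phone ping and two vehicle sightings.
phones = [Ping("phone-A", datetime(1994, 3, 2, 21, 5), 61.2181, -149.9003)]
vehicles = [Ping("plate-X", datetime(1994, 3, 2, 21, 9), 61.2190, -149.8990),
            Ping("plate-Y", datetime(1994, 3, 2, 23, 0), 61.1000, -149.5000)]

print(intersections(phones, vehicles))  # [('phone-A', 'plate-X')]
```

Real systems work at far larger scale with spatial indexes rather than nested loops, but the output is the same kind of suggested link that still needs human verification.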

Startups such as "Closure" and "Longeye" have already begun providing police departments with systems that can transcribe thousands of hours of audio recordings into searchable text in minutes, classify images and videos, and link old evidence with new data.
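Once recordings have been transcribed (the speech-to-text step is assumed to have already run), making them "searchable" is essentially an indexing problem. A minimal sketch, with hypothetical recording names and transcript snippets:

```python
from collections import defaultdict
import re

def build_index(transcripts):
    """Map each lowercased word to the (recording_id, segment_no) places it appears.

    `transcripts` maps a recording id to a list of already-transcribed
    text segments."""
    index = defaultdict(set)
    for rec_id, segments in transcripts.items():
        for seg_no, text in enumerate(segments):
            for word in re.findall(r"[a-z']+", text.lower()):
                index[word].add((rec_id, seg_no))
    return index

def search(index, query):
    """Return locations where every word of the query occurs (AND semantics)."""
    sets = [index.get(w, set()) for w in query.lower().split()]
    return sorted(set.intersection(*sets)) if sets else []

# Illustrative transcripts, not real case material.
transcripts = {
    "tip-line-1994-031": ["caller mentions a blue pickup truck",
                          "truck seen near the harbor on tuesday"],
    "interview-0427":    ["suspect denies owning a pickup"],
}
index = build_index(transcripts)
print(search(index, "pickup truck"))  # [('tip-line-1994-031', 0)]
```

An inverted index like this is what lets an investigator query thousands of hours of audio in seconds instead of listening to it end to end.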

This shift has reshaped the role of the human investigator, who now analyzes and verifies results gathered by algorithms rather than merely collecting evidence. The speed is astonishing, but it is a double-edged sword: the more connections a system can suggest, the greater the need for a critical human eye asking, "Is this link logical, or just another statistical coincidence?"

In theory, the algorithm suggests but does not convict; in practice, this "intelligence" may tempt some departments to trust the results before subjecting them to sufficient skepticism.

The police chief of Anchorage describes the shift by saying that the new technology has "transformed files that seemed impossible to understand into clear summaries that can be worked on," adding that it has reactivated stagnant cases, especially those concerning missing Indigenous people, without automatically resolving them.

The dark side… biases, privacy, and deadly errors
Despite the great promise, growing reliance on artificial intelligence carries real risks. Facial recognition systems, for example, have shown higher error rates for certain population groups in multiple trials because they were trained on unbalanced data. Such errors are not merely technical; they can turn into wrongful accusations that change an innocent person's life.

Rights experts warn against the slide toward treating "algorithmic matches" as quasi-definitive evidence when a match is fundamentally a statistical probability. The greater risk is that an assisting tool becomes an invisible authority that is difficult to question or hold accountable.
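The gap between a "match" and proof can be made concrete with Bayes' rule. A small sketch with illustrative, assumed numbers (not figures from any real system):

```python
def positive_predictive_value(prevalence, sensitivity, false_positive_rate):
    """Probability that a flagged person is a true match, via Bayes' rule."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# Assumed scenario: one true target in a database of 100,000 faces; the
# matcher catches the target 99% of the time but also flags 0.1% of
# innocent people.
ppv = positive_predictive_value(1 / 100_000, 0.99, 0.001)
print(f"{ppv:.2%}")  # under 1% — nearly every flagged face is an innocent person
```

Even a seemingly accurate matcher produces mostly false alarms when the true target is rare, which is exactly why a match is a lead to investigate, not evidence in itself.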

Privacy is the most sensitive question: to what extent do police have the right to reuse old data, from city cameras to biometric records, under the pretext of solving files? And does the pursuit of justice justify expanding surveillance of citizens who were never suspects?

Legal and ethical boundaries remain nebulous
The problem does not stop at technology; it extends to a clear legislative vacuum. In many countries there are no detailed legal frameworks regulating the use of artificial intelligence in criminal investigations, and fundamental questions remain unanswered: Who reviews the algorithm and tests its biases? Who is responsible if its results lead to a judicial error? And should the accused be told that a piece of evidence is the product of algorithmic analysis?

As reliance on these tools grows, the problem escalates from a technical one to a legal and then an ethical one, touching the very essence of justice. Artificial intelligence has the power to restore hope to families that have long awaited the truth, but left unchecked it may open a new door to errors as harsh as the crimes it seeks to uncover.
