
Law enforcement is using AI to synthesize evidence. Is the justice system ready for it?

Max Dorsey, a small-town South Carolina sheriff probing a sprawling conspiracy case, has been using a little-known artificial intelligence tool to augment his investigative team.

In an effort to process the vast amount of data his agency is collecting and save investigators time, Dorsey has turned to TimePilot, software produced by the startup Tranquility AI. The platform is now being used by at least a dozen law enforcement agencies nationwide. 

“The tool allows us to sort through massive amounts of data that the human brain just cannot process because it's so much,” Dorsey said. “It’s not unusual to find a cell phone that has a terabyte of data and it is very difficult for a person to properly look through all that.”

The sheriff, whose department is located in largely rural Chester County, says he counts on TimePilot to summarize key elements of the conspiracy case and quickly surface relevant evidence. Dorsey can type in a phrase, and the AI spits back information pulled from the body of data his team has put into the system.

Dorsey said that while he can’t guarantee TimePilot is accurate 100% of the time, he does not rely on the AI’s information without going back to the source evidence and verifying it.

Maryland-based Tranquility, which came out of stealth mode in February, is one of a few new companies offering police and prosecutors artificial intelligence that synthesizes evidence to deliver neat summaries and tidily packaged insights. Other companies in the category include Truleo and Allometric. 

All three firms promise law enforcement they can save valuable time. Unlike Tranquility, Allometric offers little information on its website about how its product works and which agencies use it. Truleo discloses more, saying that investigators can access its “virtual crime analyst” for automated reports, “case query and hardening” and witness canvassing. 

Digital freedoms groups and advocates for defendants say such technology corrupts the criminal justice system. They point to the potential for exculpatory evidence to be missed when police and prosecutors are relying on AI. The concern is that cops and district attorneys are overworked and may be tempted to rely on AI summaries without combing through case data to see what TimePilot — or similar products like Allometric’s AirJustice and Truleo’s Analyst — have omitted or checking the software’s output against original evidence for accuracy.

The advocacy groups also cite the risks of AI hallucinations and sycophancy — instances when the technology tells a user what it thinks they want to hear. 

Experts and advocates interviewed for this story said they had not previously encountered artificial intelligence being used to synthesize evidence in criminal cases. They offered their assessments after reviewing the numerous claims, case studies and videotaped demos featured on Tranquility AI’s website.

“Summarizing pages and pages of evidence or hours of footage … is really just editorializing and, when liberty is at stake, these shortcuts are really dangerous,” said Tom Bowman, a former public defender who is now policy counsel for the Security and Surveillance Project at the Center for Democracy and Technology.

“You're creating risks that the AI is going to omit context, mislabel events, even overlook exculpatory evidence, and when that gets incorporated into a narrative of a case that's not just a technical flaw — it’s a civil rights violation.”

‘Human oversight must remain at the center’ 

Many of the law enforcement agencies deploying TimePilot have only had the tool on hand for a few months. How the AI will be treated by courts remains to be seen — if judges and juries are even apprised that it has been used, which experts said is far from a foregone conclusion.

A spokesperson for Tranquility did not respond to multiple requests for comment. The company emphasizes its “commitment to ethical AI development” on its website, saying that TimePilot is designed to “uncover objective facts while actively preventing bias.”

The company addresses ethics in a blog post added to the site on September 25, saying that AI “should support, not replace, judicial or law enforcement decision-making. Human oversight must remain at the center of the process.”

The blog post acknowledges concerns about expanded access to sensitive personal data through AI use, saying that records must be anonymized when possible and only used for their “intended purpose.”

Defendants and their lawyers “must have the ability to question and understand AI-influenced evidence or recommendations,” the blog post says, but does not explain how they can do so. 

The founder of Allometric, which launched this summer, said the company’s AI is now being piloted by seven jurisdictions, including one public defender’s office, and that despite its newness, it already has been used to secure at least one conviction.

“Attorneys remain ultimately responsible for identifying and disclosing exculpatory evidence and complying with their discovery obligations,” Allometric CEO Patrick Robinson said via email. “Our goal is to make the discovery process faster, more accurate, and more consistent, never to automate away professional judgment.”

Truleo’s AI product for investigators launched in June. It costs $200 per month for each user. Tejas Shastry, the company’s co-founder and chief technology officer, also placed the onus on investigators to ensure they are backstopping the AI.

“The product summarizes all evidence the investigator provides it,” Shastry said via email. “It is up to the investigator how to use those summaries to further the case.” Hundreds of police departments have signed on to use the Truleo product since the June launch, he said.

The use of such products will soon become more widespread, according to Ian Adams, a former police officer and criminal justice professor at the University of South Carolina.

“This is a new category of AI products that I see a lot of development, commercialization, and promise in,” Adams said via email. 

Adams also sees major weaknesses, particularly in terms of what he calls the “quiet structural problem of omission,” though he said that hallucinations are unlikely. 

“The ‘savings’ the vendor promises never fully materialize, because you can’t safely shortcut the due diligence,” said Adams, who has led independent research into the use of AI by law enforcement, including by evaluating Truleo. 

“While I’m optimistic about this product category, we need a lot more evidence,” he said.

The appeal of a time-saver

Most of the 12 law enforcement agencies named as clients on Tranquility’s website are in rural areas. The only large entity is the Orleans Parish district attorney's office in New Orleans. 

In July, the company inked a deal with top government information technology vendor Carahsoft — a relationship that will likely dramatically increase TimePilot’s reach. In the meantime, Tranquility has given some agencies free access through grant awards, according to law enforcement officials interviewed for this story.


Sheriff Max Dorsey is one of several U.S. law enforcement officials with early access to an AI platform that helps process evidence. Image courtesy of Max Dorsey

It is unclear how much the product costs users who have not been awarded a grant. Dorsey said he negotiated a special price by agreeing to only use TimePilot for one case but declined to say how much he paid.

A use case featured in a video on the Tranquility website shows how the AI breaks down evidence from the Boston Marathon bombing investigation for a hypothetical prosecutor. An invisible user types into a search bar, asking, “What investigative leads regarding the travel to Russia need to be followed up on?”

TimePilot instantly generates two lists. One offers several examples of “key investigative steps that should have been taken” when investigators probed a key suspect’s travel. A second gives several reasons why TimePilot thinks the travel should be considered “particularly significant.” 

The AI is not just enumerating facts but is also offering analysis by picking which facts to highlight. 

The software understands at least 120 languages and most slang, according to the Tranquility website. It also can read and summarize handwritten reports.

Investigators can dump data from a variety of sources into the platform, including from Axon, which makes automated license plate readers and police body cameras; Ring, which manufactures doorbell cameras; Cellebrite, which extracts data from cellphones; Cash App; Venmo; Prison Calls; TikTok; Instagram and Facebook, the website says.

The platform also can digest data from so-called tower dumps, which provide police with records of every device connected to a specific cell tower during a particular time period, law enforcement officials said. 

McCord Larsen, a prosecutor in rural Cassia County, Idaho, is using the product for free thanks to a grant. Larsen said he asked TimePilot to provide information on a specific evidentiary question and it zeroed in on relevant information in seconds.

“The material I am searching through is thousands of pictures, hours of video and, of course, thousands of pages,” Larsen said via email.

TimePilot’s analysis of the question Larsen asked “saves me hours of time,” he said.

“I can see where it'd be very useful with cold cases — somebody just coming in and needing, like, a crash course on what went on.”

— Kelly Marshall, police chief for Choctaw, Oklahoma

Police in Choctaw, Oklahoma, also have recently received a grant to use TimePilot, according to Kelly Marshall, the department’s chief. Tranquility’s demo of the product was impressive, she said.

“It can link a lot of clues together,” Marshall said. “I can see where it'd be very useful with cold cases — somebody just coming in and needing, like, a crash course on what went on.”

TimePilot’s ability to produce a “quick snapshot” is valuable, she said, because it “cuts to the chase.”

Andrew Guthrie Ferguson, a law professor at George Washington University and the author of the forthcoming book “Your Data Will Be Used Against You: Policing in the Age of Self-Surveillance,” said it's no coincidence that TimePilot and similar products are hitting the market now. 

“Prosecutors will soon be deluged with data from body cams, surveillance cams, and other data-rich surveillance technologies,” he said. “The temptation to upload an overwhelming amount of data into a bespoke AI model will be too strong for many offices to resist.”

‘Bring the receipts and get the pleas’

Tranquility’s website includes videotaped case studies showing TimePilot working with fact sets from well-known cases — including the Jeffrey Epstein probe and the Gabby Petito murder investigation — alongside an article about the despair victims’ loved ones feel when crimes go unsolved. Visitors are repeatedly invited to “book a demo.” 

One section of the website imagines how TimePilot could have uncovered vital evidence for investigators who spent 13 years desperately trying to find a serial killer terrorizing sex workers on Long Island. The suspect in that case, Rex Heuermann, was arrested in 2023 and has been charged with killing seven women.

“Make your investigators super human with TimePilot,” the website says. “Process months of data in minutes. … Force multiply your investigative team.”

TimePilot can do the work of 10 investigators making an average of $60,000 a year, clearing 50 backlogged cases annually, the website promises. Time saved? 14,520 hours. Cost savings are estimated to be $418,846.

In the section of the website aimed at prosecutors, similar promises are made.

“Prosecutors are overwhelmed,” the website says, asserting that plea negotiations are cut from 30 days to three on average when TimePilot is used. “Bring the receipts and get the pleas with TimePilot.”

Bowman, the former public defender, said that the marketing underscores the dangers. Overworked people will turn to Tranquility and products like it, he said, and those same people may be too busy to check their output. 

“We might want to think that prosecutors are always going to be able to exercise their discretion and say, ‘Oh, this doesn't seem like it actually matches the initial investigation or other evidence that we have,’” Bowman said, “but the reality is that when you are in a courtroom, both prosecutor and defense attorneys might not have had a good opportunity to fully review the case.”

Lawyers on both sides of criminal cases are stretched, to be sure. Another product, JusticeText, was created by technologists interested in helping public defenders keep up with better-resourced prosecutors. 

The result is audiovisual evidence management software that creates automated transcripts of body camera footage and interrogation videos and allows for AI-powered evidence review.

‘AI is not trained to be a prosecutor’

"Brady rights," a term drawn from the 1963 Supreme Court decision Brady v. Maryland, require prosecutors to share any evidence that may prove a defendant is innocent or raise questions about the truthfulness of a witness, for example. 

Brady violations can lead to conviction reversals. But with technology like TimePilot, instances of exculpatory evidence being withheld could increase as such errors become much harder to detect, according to Jumana Musa, director of the Fourth Amendment Center at the National Association of Criminal Defense Lawyers.

“AI is not trained to be a prosecutor; it is trained to look for particular things and put them together,” Musa said.

“If your idea is this person has done this thing and there's a gun and a red sweatshirt and a blue car, you say ‘Find me all of these elements,’” Musa said. “Maybe what you're missing is something else that is not a gun, a red sweatshirt or a blue car.”

It’s not clear how many convictions, if any, have resulted from TimePilot’s use so far.

“What is the failsafe? Where's the process? Is there a process?”

— Jumana Musa of the National Association of Criminal Defense Lawyers

The fact that Tranquility is not publicly revealing how its AI is trained troubles Musa. The ethics blog post on Tranquility’s website acknowledges the issue but doesn’t disclose the training, saying only that AI companies “must be able to explain how AI models work, what data they use, and what limits they carry. This makes accountability possible.”

While it is not unusual for a private AI company to guard its proprietary information, Musa said that such secrecy is not appropriate in the criminal justice context.

(Patrick Robinson, of Allometric, said the company does not train its own large language models and instead relies on application programming interfaces and models developed by “leading technology companies.”) 

Ferguson echoed Musa’s point, saying that AI “shifts primary responsibility from a democratically-authorized and licensed lawyer to an undemocratic and unlicensed algorithm.” 

The public’s perception matters, too, the experts said. Citizens know how to assess fingerprint evidence because the process for collecting fingerprints is common knowledge. Independent experts also can replicate fingerprinting based on established science.

The same is not true for a large language model using methods protected as trade secrets, Musa said.

“What is the failsafe?” she said. “Where's the process? Is there a process?”

Tranquility’s website does not address model training in detail. Neither do Truleo’s and Allometric’s. 

Tranquility’s website does, however, hint at why its leadership believes the software is vital. The preamble to the U.S. Constitution is featured prominently, highlighting the founding document’s promise of “domestic Tranquility” — with Tranquility in bold text.

For Musa, a different bedrock of the Constitution comes to mind. The Fifth Amendment promises Americans that no citizen shall be "deprived of life, liberty or property without due process of law." 

“Somebody's life and liberty is at stake, and that is where the most heightened protections should come in,” Musa said. “That is not where we should be outsourcing the development of the case to somebody's black box AI tool.”

Suzanne Smalley is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.