
At war with facial recognition: Clearview AI in Ukraine

Hoan Ton-That, CEO of Clearview AI, sat down with Click Here to discuss his company’s operations in Ukraine and the controversies that have dogged what has emerged as the world’s largest facial recognition network. (He wants 100 billion images in the database within a year. That’s almost 13 photographs for every person on Earth.)

Among other things, some 400 investigators in Ukraine are using Clearview AI to build war crimes cases, identify the dead – both Russian and Ukrainian – and tell families back in Russia that their children have died in the war. 

Not long after the company began providing free logins for Ukrainian investigators, a volunteer hacking force called the IT Army of Ukraine appeared to use a program that resembles Clearview AI for similar purposes.

The Ministry of Digital Transformation in Ukraine confirmed to Click Here that it is also using Clearview to tell families about fallen soldiers. Officials say it is partly to give Russians a taste of the true cost of the war and partly to tell families that, if they wish to pick up the bodies of their loved ones, they can travel to Ukraine to do so.

Hoan Ton-That talked about the controversy surrounding his biometric company, the perils of using technology in a war zone and what to expect from Clearview going forward. 

The interview has been edited for clarity.

Click Here: How did Clearview come to play such a major role in Ukraine?

Hoan Ton-That: When the war started, it was really shocking to me and a lot of members of our team to see the video footage, especially of women and children suffering, and it made me think, how can we help? Originally, I would see photos of captured Russian soldiers, and I realized that with that kind of photo quality, our facial recognition technology could be helpful. So I reached out to a lot of people on our advisory board to ask them, do you know anyone in the Ukrainian government? One person, Lee Wolosky, who has served on the National Security Council under three presidents, quickly said yes. We thought it could be helpful in identifying deceased soldiers and tracking misinformation.

CH: How could it be useful to track misinformation?

HT-T: You'd see a lot of things on social media saying this is a captured Russian soldier, but you might see people on the other side saying actually that's a paid actor and here's the name of the paid actor. So with this technology, the level of accuracy can be used to identify if someone is who they say they are. 

CH: Who in Ukraine has logins to Clearview AI?

HT-T: It's in six different agencies, including the National Police of Ukraine, and the people on the ground are the ones using it. Once we gave a demo, we'd give them training on how to use facial recognition technology responsibly. Part of the training is that they would send photos of unidentified people and run them through the system, and we could show them, ‘Hey, this is how you verify who it is.’ For example, if someone has a tattoo that matches online, there’s a very high chance it's the same person.

CH: You inserted this technology into a war zone, which presents a lot more problems than having a police department in the United States use it. How are you accounting for that?

HT-T: You want to be careful to make sure that they really know how to use it properly, and so there are all these scenarios that we want to make sure don't happen. For example, what if someone takes a photo of someone and says, ‘Hey, I think you're a Russian traitor,’ and then they detain them, and it's all based on incorrect information? We'd never want something like that to happen. As long as these investigations are done by trained investigators, this can be a very helpful tool. If it was used by everybody, I think that's when problems happen.

CH: What’s the most surprising use you’ve seen in Ukraine?

HT-T: War crimes investigations. We were talking to people in the National Police of Ukraine and others where they have video footage from surveillance cameras. Smartphones are more prevalent, so there's a higher chance of something being recorded. Now with this technology, if it's easy to identify someone, I think people are going to think twice about war crimes. So that was a surprise to me.

CH: So this is a subscription service, and you say that gives you more control if someone is misusing it… how does Clearview AI work?

HT-T: We vet every person who uses it to make sure they’re a government official. There’s also two-factor authentication, so they still have to verify their device before they log in. Once they have an account, there's an administrator for each agency. The administrator can see who is conducting searches and for what reason. There is an intake form that requires a case number and a crime type before conducting a search. So when people are onboarded and learning about the software, they know that their searches can be audited, because we want to make sure they're using it for the right kind of stuff. Because it’s a cloud service, we have the ability to revoke access. If there's any egregious abuse of the technology, we want to make sure we have the ability to take it away.
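The workflow he describes (vetted accounts, per-agency administrators, a mandatory intake form, auditable searches, and cloud-side revocation) is, in software terms, a simple access-control and audit pattern. The sketch below is purely illustrative: every name in it is hypothetical, and it is not based on Clearview's actual implementation, which has not been made public.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch only: none of these names come from Clearview. It
# illustrates the controls described above: vetted users, a per-agency
# administrator, a mandatory intake form, an audit log, and the vendor's
# ability to cut off a whole agency's cloud access.

@dataclass
class SearchRecord:
    user: str
    case_number: str
    crime_type: str
    timestamp: str

@dataclass
class Agency:
    name: str
    admin: str
    active: bool = True                       # vendor can flip this to revoke access
    users: set = field(default_factory=set)   # vetted, onboarded officials only
    audit_log: list = field(default_factory=list)

    def run_search(self, user: str, case_number: str, crime_type: str) -> SearchRecord:
        if not self.active:
            raise PermissionError(f"Access for {self.name} has been revoked")
        if user not in self.users:
            raise PermissionError(f"{user} is not an onboarded official")
        if not case_number or not crime_type:
            raise ValueError("Intake form requires a case number and a crime type")
        record = SearchRecord(user, case_number, crime_type,
                              datetime.now(timezone.utc).isoformat())
        self.audit_log.append(record)          # admin can review who searched and why
        return record

# Example usage with made-up values
police = Agency(name="Example National Police", admin="admin@example.org",
                users={"investigator_01"})
police.run_search("investigator_01", case_number="2022-0413", crime_type="war crime")
print(len(police.audit_log), "search(es) on record")
```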

CH: The IT Army appears to be using it. In a video, the group, a volunteer hacking force, demonstrated a facial recognition program that resembles Clearview AI. How is it that the IT Army of Ukraine appears to be using Clearview AI?

HT-T: All I can say is that everyone we've onboarded is a government official. We haven't onboarded anyone in the IT Army directly. Everyone we talk to and onboard, we give them proper training on its usage. The speculation that the IT Army is running a Clearview AI search does not match any information we have on this matter. Clearview AI is intended for use in Ukraine by law enforcement and government officials.

CH: Did maybe somebody give a username and password to somebody who's in the IT Army?

HT-T: It's possible that someone shared a screenshot or shared how it worked, but we want to make sure that whatever the usage of the technology is — say it is to identify someone deceased — that is done in a way that is positive. The policy of the National Police and all our users is to tell the family members in a humane way.

CH: Had it occurred to you that the IT Army would use this technology to notify families of dead soldiers as a propaganda tool?

HT-T: I talked to some of the officials [in the Ukrainian government] and I said, ‘Look, is this something you knew about? Is that your procedure for doing this?’ They said that’s not our official procedure. And they assured me that’s not what they want to have happen either. Again, it is war time. Tensions are really high. Those things can happen.

If I thought it would be used in a really bad way, then I don't think I’d give access to them. We think that just getting the information out in a humane way is the most important thing. What we can control as Clearview is giving access to the right people. So for example, we don't give access to the Russians or anything like that and we make sure Ukraine is trained as appropriately as possible. 

CH: Have you revoked any access related to Ukraine because you thought it wasn't being used properly? 

HT-T: No, not at this time, but the administrators of these agencies in Ukraine have the ability to do so. They can go in and audit the searches, remove access to an account and give access as they deem appropriate. Clearview AI would only revoke access to an agency if there was an egregious amount of abuse in that agency. Until something really escalates to that level, we haven't revoked any access. 

CH: The NSO Group is an Israeli company that makes surveillance software that can be remotely implanted in smartphones. It has come under heavy criticism for its tech being used by authoritarian governments to spy on citizens. With your facial recognition technology, how do you avoid the NSO trap? 

HT-T: I think NSO is a very different kind of technology than what we do. We are searching public information from the internet. So it is really just like Google. If you can type someone's name and the word LinkedIn into Google and you find their photo, then Clearview can basically do the same thing, but it's search by photo. We also work with law enforcement. NSO is different because it’s breaking into phones, very private data. Also, when they sell their software, they don't have the ability to take it back if they sell it to the wrong actor or government. Whereas Clearview's software is deployed in the cloud. If we ever found anything totally egregious or abusive, we have the ability to revoke access. We also want to be aligned with democratic countries, making sure that this is a technology that can be used responsibly and across democratic nations.

CH: Can you imagine a scenario, years and years from now, when everyone has this capability, when it's built into VR glasses or into a phone?

HT-T: I can imagine augmented reality being an interesting way to deploy it, like on military bases. In Afghanistan, when they were pulling out, they had a situation at a checkpoint where terrorists could blow people up because soldiers were looking at IDs up close. Being able to verify someone at a distance, in a hands-free way, I think that's a very positive kind of use case.


Dina Temple-Raston is the Host and Managing Editor of the Click Here podcast as well as a senior correspondent at Recorded Future News. She previously served on NPR’s Investigations team, focusing on breaking news, national security, technology, and social justice, and she created and hosted the award-winning Audible podcast “What Were You Thinking.”

Sean Powers is a Senior Supervising Producer for the Click Here podcast. He came to Recorded Future News from the Scripps Washington Bureau, where he was the lead producer of “Verified,” an investigative podcast. Previously, he was in charge of podcasting at Georgia Public Broadcasting in Atlanta, where he helped launch and produced about a dozen shows.