
How facial recognition tech is identifying people in Ukraine

Last month, controversial facial recognition company Clearview AI announced it had given its technology to the Ukrainian government.

The BBC has been given evidence of how it is being used – in more than 1,000 cases – to identify both the living and the dead.

A man lies motionless on the floor, his head tilted down. His body is naked, apart from a pair of Calvin Klein boxers. His eyes are ringed with what look like bruises.

The body was found in Kharkiv, eastern Ukraine, in the wreckage of war. The BBC has seen pictures taken at the scene, but does not know the circumstances around his death. There is clear evidence of head trauma. He had a tattoo on his left shoulder.

Ukrainian authorities didn’t know who the man was, so they decided to turn to a cutting-edge method: facial recognition using artificial intelligence.

Clearview is perhaps the most famous facial recognition system in the world. The company has scraped billions of photos from social media companies including Facebook and Twitter to create an enormous database of what its CEO and founder Hoan Ton-That calls “a search engine for faces.”

“It kind of works like Google. But instead of putting in a string of words or text, the user puts in a photo of a face,” says Ton-That.
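
Clearview has not published the details of its system, but face search engines of this kind generally work in two steps: each photo is converted into a numerical "embedding" vector, and the query face is then compared against every stored vector to find the closest matches. The Python sketch below illustrates that idea only; the embed_face placeholder, the example URLs, and the randomly generated images are hypothetical stand-ins, not Clearview's actual model or data.

import numpy as np

# Illustrative sketch of a generic face-search pipeline, not Clearview's code.

def embed_face(image):
    """Stand-in for a real face-embedding model (typically a neural network
    that returns a fixed-length vector). Here we simply flatten and normalise
    the pixels so the example runs end to end."""
    v = image.astype(np.float32).ravel()
    return v / (np.linalg.norm(v) + 1e-9)

def search(query_image, database, top_k=5):
    """Rank stored images by cosine similarity to the query face and
    return the top_k (source, score) pairs."""
    q = embed_face(query_image)
    scores = {src: float(np.dot(q, embed_face(img))) for src, img in database.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]

# Hypothetical usage: 'database' maps a source URL to a scraped face crop.
rng = np.random.default_rng(0)
database = {f"https://example.com/photo_{i}": rng.random((64, 64)) for i in range(100)}
query = database["https://example.com/photo_42"] + 0.01 * rng.random((64, 64))
print(search(query, database, top_k=3))

In a production system the placeholder model would be a trained neural network, and the brute-force comparison would be replaced by an approximate nearest-neighbour index, since the database reportedly holds billions of images.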

The company has faced a string of legal challenges. Facebook, YouTube, Google, and Twitter have sent cease-and-desist letters to Clearview, asking it to stop using pictures from their sites. The U.K.’s Information Commissioner’s Office even fined the company for failing to inform people it was collecting photos of them.

Now, its use by the Ukrainian government has raised questions over the implications of infusing this powerful technology into an active war.

Clearview is used extensively by law enforcement agencies in the U.S. Ton-That says 3,200 government agencies have either bought or trialled the technology.

When Russia invaded Ukraine, Clearview’s founder saw another application for the technology.

“We saw images of people who were prisoners of war and fleeing situations, and you know, it got us thinking that this could potentially be a technology that could be useful for identification, and also verification,” he says.

He quickly offered the Ukrainian government the technology – an offer that was accepted.

Back in Kharkiv, authorities took a picture of the dead man’s face – his head held up, his sunken eyes directed toward the camera.

They snapped a picture, and ran it through Clearview’s database. The search returned several pictures of someone who looked very similar to the dead man.

One picture had been taken on what looks like a hot day. The man was shirtless. He had a tattoo on his left shoulder.

The design matched. They had a name.

Using facial recognition to identify the dead is not new, and Clearview isn’t the only platform being used to do it in Ukraine.

“We’ve been using this stuff for years now,” says Aric Toler, research director at Bellingcat, an organization that specialises in investigative journalism.

In 2019, Bellingcat used facial recognition technology to help identify a Russian man who had filmed the torture and killing of a prisoner in Syria. This is not facial recognition’s first war.

But its use in Ukraine is more wide-ranging than in any previous conflict. Toler says that he uses the facial recognition platform FindClone in Russia, and that it has been particularly helpful for identifying dead Russian soldiers.

As with Clearview, FindClone searches through publicly available internet images, including Russian social media pages. Even people who do not have social media accounts can be found.

“They might not have a social media profile, but their wives or girlfriends might … sometimes they do have profiles and they live in a small town with a big military base. Or they may have a lot of friends who are currently in their unit,” Toler says.

This last point is fundamental in understanding the power of facial recognition technology.

Critics point out that facial recognition technology is by no means always correct – and that in a time of war, errors could have potentially disastrous consequences.

Clearview isn’t just being used to identify dead bodies in Ukraine. The company also confirmed it was being used by the Ukrainian government at checkpoints to help identify enemy suspects.

Clearview showed the BBC an email from a Ukrainian agency confirming that the system was being used to identify the living.

“The system gave us the opportunity to quickly confirm the accuracy of the data of detained suspects,” reads the email, from a Ukrainian official who did not want to be named.

“During the use of Clearview AI, more than 1,000 search queries were performed to conduct the appropriate verification and identification.”

This worries some analysts.

Conor Healy is a facial recognition expert at IPVM, an organization that reviews security technology.

“It’s important for the Ukrainian forces to recognize that this is not a 100% accurate way of determining whether somebody is your friend or your foe,” Healy says.

“It shouldn’t be a life-or-death technology where you either pass or fail, where you could get imprisoned or, god forbid, even killed. That’s not how this should be used at all.”

Others have issued more dire warnings. Albert Fox Cahn, of the watchdog group Surveillance Technology Oversight Project, has called it “a human rights catastrophe in the making.”

“When facial recognition makes mistakes in peacetime, people are wrongly arrested. When facial recognition makes mistakes in a war zone, innocent people get shot,” he told Forbes.

Ton-That has defended the accuracy of Clearview’s technology, saying tests had found it to be more than 99% accurate.

Much depends, though, on the quality of the image, the position of the head, and whether the face is covered, for example by a mask.
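
Simple arithmetic shows why that caveat matters. The sketch below is only an illustration: it assumes the claimed 99% figure would hold for every real-world query, which degraded images at a checkpoint may not support, and it is not a measurement of Clearview's actual error rate in Ukraine.

# Back-of-the-envelope illustration only, assuming the claimed benchmark
# accuracy applied to every real-world query (it may not).
claimed_accuracy = 0.99   # figure Ton-That cites from tests
queries_reported = 1000   # searches mentioned in the Ukrainian official's email

expected_errors = queries_reported * (1 - claimed_accuracy)
print(f"Expected misidentifications: about {expected_errors:.0f} of {queries_reported} searches")
# Prints "about 10": even a small error rate translates into real people
# wrongly flagged when the stakes are detention or worse.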

Then there is the issue of privacy, which has been problematic for Clearview in the U.S. and Europe. The company pulls publicly available pictures from firms like Facebook and Instagram to build its database.

But it didn’t ask social media companies, or anyone in fact, whether it could scrape these pictures. If you are reading this, you are almost certainly in the database, though you likely didn’t give Clearview permission to use your image.

Facial recognition technology might be useful to the Ukrainian authorities in a time of war. But will they simply hand the technology back to Clearview in a time of peace?

“There are any number of examples of technologies that are introduced in wartime and that persist into peacetime,” says Healy. “I hope that that’s not the approach they take.”

 

https://www.bbc.com/news/technology-61055319

 
