Facial recognition technology widely used at sporting events, privacy watchdog says
A growing number of sports stadiums are using facial recognition technologies to surveil spectators, spurring a leading privacy nonprofit to appeal to the United Nations to step in.
Twenty-five of the top 100 soccer stadiums in the world augment their existing video surveillance with such technology, according to Privacy International (PI). The organization has submitted findings from its research and recommendations to the UN Special Rapporteur in the field of cultural rights.
The UN office is preparing a report on the right to participate in sports, which will be presented to the body’s General Assembly in October.
An increasing number of sporting arenas are testing facial recognition technologies and some have also begun experimenting with artificial intelligence in a bid to monitor and police crowds, the PI report said. It noted that French authorities plan to use AI to surveil public spaces at the Paris Olympics this summer.
PI and seven other organizations previously appealed to the French constitutional court asking it to block the use of AI at the Paris Olympics, but the court found that the technology’s deployment does not violate constitutional rights.
“The roll out of such intrusive technology does not only pose significant privacy and data protection questions, but also ethical questions around whether modern democracies should ever permit its use,” PI wrote in its letter to UN Special Rapporteur Alexandra Xanthaki.
“The radical introduction of FRT will inevitably result in the normalisation of surveillance across all societal levels and accordingly cast a ‘chilling effect’ on the exercise of fundamental rights, including the human right to participate in and enjoy sports freely,” it added.
Such technology has previously been used inappropriately, PI said, pointing to the 2023 revelation that the owner of Madison Square Garden used facial recognition to ban and even eject lawyers affiliated with firms that had sued him.
The technology has also been used at sporting events to discriminate against women in repressive societies, PI noted. In 2018, Iranian authorities arrested two women trying to attend a soccer match at Tehran’s Azadi Stadium.
The venue is now outfitted with 500 closed-circuit television cameras, which in the past have targeted women disguised as men and ejected them from matches. In December, Iranian authorities for the first time began setting aside a limited number of seats at Azadi for women.
PI is asking the UN to push states to bar the technologies from sporting events except when needed to investigate serious crimes; to ensure that AI surveillance technologies do not lead to discrimination by requiring “systematic assessments” of their impact; to monitor human rights outcomes regularly; and to ensure remedies for people whose rights have been violated.
“Facial recognition is dangerous when it makes mistakes, and even more dangerous when it works perfectly,” Jake Wiener, counsel at the Electronic Privacy Information Center, said. “Sports venues are a particularly inappropriate location for facial recognition because sport is so closely tied to speech, protest, and personal identity.”
He added that the technologies can “over-criminalize harmless behaviors, lead to wrongful arrests, and suppress speech.”
Suzanne Smalley is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.