French President Emmanuel Macron inspects equipment as he meets with members of a surveillance brigade for the Paris 2024 Olympics during a visit at the Cherbourg naval base in Cherbourg, Normandy, Jan 19, 2024. (PHOTO / AP)
PARIS - France this week tested the artificial intelligence-driven video surveillance technology it will deploy during the Olympic Games, using a Depeche Mode concert as the trial run, and called the exercise a success.
French legislation passed in 2023 permits the use of AI video surveillance for a trial period covering the Games to detect abnormal events or human behavior at large-scale gatherings.
The technology could be pivotal to thwarting an attack like the bombing at the 1996 Olympics in Atlanta or the Nice truck attack in 2016, officials say.
Rights campaigners warn the technology poses a threat to civil liberties.
What is AI-powered surveillance?
Algorithmic video surveillance uses computer software to analyze images captured by video surveillance cameras in real time.
Four companies - Videtics, Orange Business, ChapsVision and Wintics - have developed AI software that uses algorithms to analyze video streams coming from existing video surveillance systems to help identify potential threats in public spaces.
The algorithms are trained to detect pre-determined "events" and abnormal behavior and send alerts accordingly. Human beings then decide if the alert is real and whether to act on it.
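The article does not go into technical detail, but the workflow it describes (software flags a pre-determined "event", then a human operator decides whether the alert is real and whether to act) can be sketched roughly as follows. This is a minimal, hypothetical Python sketch: the Alert class, the detect_events and human_review functions, and all the example values are invented for illustration and are not drawn from Videtics, Orange Business, ChapsVision or Wintics.

from dataclasses import dataclass
from datetime import datetime

# Hypothetical illustration of the workflow described above: software flags a
# pre-determined "event" in a video stream, and a human operator decides
# whether the alert is real and whether to act on it.

@dataclass
class Alert:
    event_type: str      # e.g. "abandoned_object", "crowd_surge"
    camera_id: str
    confidence: float    # score produced by the detection model
    timestamp: datetime

def detect_events(frame_metadata: dict) -> list[Alert]:
    """Placeholder for a vendor detection model; returns flagged events."""
    alerts = []
    if frame_metadata.get("stationary_object_seconds", 0) > 120:
        alerts.append(Alert("abandoned_object",
                            frame_metadata["camera_id"],
                            confidence=0.87,
                            timestamp=datetime.now()))
    return alerts

def human_review(alert: Alert) -> bool:
    """A human operator, not the software, decides whether to act."""
    print(f"[{alert.timestamp:%H:%M:%S}] {alert.event_type} "
          f"on camera {alert.camera_id} (confidence {alert.confidence:.2f})")
    return input("Dispatch responders? [y/N] ").strip().lower() == "y"

if __name__ == "__main__":
    sample = {"camera_id": "CAM-42", "stationary_object_seconds": 300}
    for alert in detect_events(sample):
        if human_review(alert):
            print("Alert escalated to operators on the ground.")
        else:
            print("Alert dismissed.")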
What will the algorithms be looking for?
The law allows eight different types of "event" to be flagged by AI surveillance software during the Games, including: crowd surges; abnormally heavy crowds; abandoned objects; the presence or use of weapons; a person on the ground; a fire breaking out; and contravention of rules on the direction of traffic.
Within these categories, specific thresholds (number of people, type of vehicle, timing, etc.) can be set manually to cater for each individual event, location or threat.
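One way such manually set, per-event and per-location thresholds might be expressed is sketched in the short, purely illustrative Python snippet below; the event names, locations and numeric values are invented assumptions, not figures taken from the French decree or from any of the vendors.

# Hypothetical sketch of per-event, per-location threshold configuration,
# as described above; the event names and numbers are invented examples.

THRESHOLDS = {
    # (event_type, location) -> manually set trigger value
    ("crowd_density", "stade_de_france"): 4.0,    # people per square metre
    ("crowd_density", "metro_platform"): 3.0,
    ("abandoned_object", "metro_platform"): 120,  # seconds left unattended
}

def should_flag(event_type: str, location: str, measured_value: float) -> bool:
    """Flag an event only if a threshold is configured and exceeded."""
    threshold = THRESHOLDS.get((event_type, location))
    return threshold is not None and measured_value >= threshold

# Example: a density of 3.5 people/m2 triggers an alert on a metro platform
# but not at the stadium, because each location has its own threshold.
print(should_flag("crowd_density", "metro_platform", 3.5))   # True
print(should_flag("crowd_density", "stade_de_france", 3.5))  # False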
Who will use AI-powered surveillance?
National and local police, firefighters and public transport security agents will all have access to AI-powered surveillance.
Software developed by Wintics and tested at the Depeche Mode concert will be deployed in the Paris region and on public transport.
Paris Police chief Laurent Nunez described the trial as largely a success.
"Everything went relatively well, all the lights are green (for future use)," he said.
Will facial recognition be used?
It should not be. The new law continues to ban facial recognition in most cases, and French authorities have said it is a red line not to be crossed.
Nonetheless, rights campaigners are concerned that mission creep risks setting in down the line.
"Software that enables AI-powered video surveillance can easily enable facial recognition. It's simply a configuration choice," said Katia Roux of Amnesty International France.
The legal framework regulating facial recognition remained too fuzzy, and the technical and legal safeguards were insufficient, according to Amnesty International.
Wintics Co-founder Matthias Houllier said his software's algorithms were not trained for facial recognition.
"There's no personal identification method in our algorithms," he said. "It's technically excluded."
How will privacy be protected?
France's Interior Ministry has created an evaluation committee to keep tabs on civil liberties throughout the trial period.
Led by a high-ranking official within France's top administrative court, the committee also comprises the head of the country's privacy watchdog, CNIL, four lawmakers and a mayor.