The challenge
In view of the new risks that have emerged in the 21st century, the European Union is committed to improving the safety of public places and individuals against threats such as terrorism. Modern technologies have the potential to further involve and empower citizens in the protection of public spaces, but the use of security-related technology is controversial under data protection standards. Surveillance technology can be seen as intrusive and can pose significant risks to privacy and individual freedoms.
As a result, the Partnership for Security in Public Spaces promotes actions that strike an optimal balance between implementing security measures, promoting innovation and European technological sovereignty, and respecting the GDPR, people's privacy, and fundamental rights. In this regard, Action 3 – Evaluate the application of Artificial Intelligence technologies – recognised the need for a study on safe and smart cities, where discussions over the use of surveillance technologies in public spaces are ongoing. The action also calls on data protection authorities to allow more flexibility in experimenting with and deploying innovative solutions.
What is being done
This Action resulted in the publication of a study examining the use of facial recognition technology (FRT) in public spaces. Its goal is to examine the difficulties and opportunities presented by European legislation for a responsible FRT framework. It looks at:
- the features and risks of FRTs,
- current experiments,
- the applicability of European law, with guidelines for entities using facial recognition technologies,
- recommendations for European legislators.
The first public-space experiment in France was carried out during the Nice Carnival, and it aimed to address the problems and obstacles of FRT from a field perspective. Authorities tested various scenarios, such as identifying specific persons in an uncontrolled setting. The goal was to check whether the technology could be used to locate a vulnerable individual, such as a missing child or a person suffering from dementia, a fleeing subject, or a person of interest.
This was accomplished by collecting biometric data, which was retained for 0.2 seconds before being destroyed unless there was a match. The experiment showed the system to be highly reliable while remaining mindful of privacy.
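The retention logic described above can be illustrated with a minimal, hypothetical sketch. The embedding size, similarity threshold, and watchlist contents below are illustrative assumptions, not details from the report: a captured biometric template is compared against a watchlist and the template itself is discarded within the 0.2-second window, whether or not a match is found.

```python
import time
import numpy as np

RETENTION_SECONDS = 0.2   # retention window described in the experiment
MATCH_THRESHOLD = 0.8     # illustrative similarity threshold (assumption)


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def process_detection(embedding: np.ndarray, watchlist: dict) -> str | None:
    """Compare a freshly captured face embedding against a small watchlist.

    Returns the matched identity (or None). In either case, the local
    reference to the biometric data is dropped before the retention
    window expires; only the match result is passed on.
    """
    captured_at = time.monotonic()
    match = None
    for identity, reference in watchlist.items():
        if cosine_similarity(embedding, reference) >= MATCH_THRESHOLD:
            match = identity
            break
    # Enforce the short retention window, then discard the biometric data.
    assert time.monotonic() - captured_at < RETENTION_SECONDS, "retention window exceeded"
    del embedding
    return match


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical watchlist entry, e.g. a reported missing person.
    watchlist = {"missing_person_01": rng.normal(size=128)}
    # A probe embedding close to the reference, simulating a true match.
    probe = watchlist["missing_person_01"] + rng.normal(scale=0.05, size=128)
    print(process_detection(probe, watchlist))  # expected: missing_person_01
```

This is only a sketch of the privacy-by-design principle at stake: the raw biometric template is short-lived, and only a minimal match decision survives the retention window.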
This experiment, along with the others included in the report, illustrates cities' and law enforcement agencies' demand for these technologies in the current security context, while also underlining the hazards posed by their use. Such studies can also lay the groundwork for a European legislative framework and help create a trustworthy environment for FRT rollouts.
More information
About the Security in Public Spaces Partnership
About the Final Action Plan of the Security in Public Spaces
About Action 3 Final Report
- Tags
- Urban Case Study, Security in public spaces, Application of AI, inclusive technologies, Action 3, FRT, City of Nice