In law enforcement and national security, facial recognition may have its benefits, and as the past decade brought great progress in the technology's efficiency, a growing number of states have introduced it, allowing for the large-scale and continuous monitoring of their citizens. International experience nonetheless shows that despite the obvious risks the technology poses to fundamental rights, states fail to create the necessary regulatory framework for its use, which can lead to indiscriminate and abusive deployment.
Moreover, the introduction of facial recognition systems is almost never preceded by public debate. Its widespread application, combined with the omission of public consultation, can lead to the normalisation of continuous surveillance and to violations of rights, where states gain the ability to know where we go, whom we meet, and which pubs or churches we visit.
The rights to freedom of expression and of assembly are jeopardised by the technology's capacity for the mass collection and storage of facial profiles of persons participating in public events, including demonstrations, as well as their categorisation by race, gender or age. The mere possibility of surveillance can have a chilling effect on citizens and deter them from exercising their democratic rights, or even expose them to direct danger in authoritarian regimes. In Russia, for example, people were only allowed to join a demonstration if they passed through a gate equipped with a facial recognition system, leaving them no choice but to surrender their biometric data if they wanted to exercise their fundamental rights.
It is also well documented that facial recognition systems identify people of colour and women with lower accuracy than white men, and false positives reinforce the already existing discriminatory practices of law enforcement against minorities, increasing the number of unfounded stops and arrests. In the United States, a client of the civil liberties organisation ACLU was arrested in front of his family on the basis of an inaccurate match produced by the algorithm and was detained for 30 hours before being found innocent. It was later revealed that Detroit police use facial recognition almost exclusively against African Americans, with a 96 percent error rate.
An even greater danger than inaccuracy is that continuous monitoring renders privacy illusory: the immense amount of data gathered enables accurate inferences about a person's private life even where direct state surveillance cannot reach. This means that in the wrong hands the technology can be used as a tool of complete social control.
The HCLU is monitoring the development and application of this surveillance technology in Hungary. The law grants the right to use facial recognition systems to two bodies: the National Security Service (NBSZ) and the National Directorate-General for Aliens Policing. In 2019, we asked the National Authority for Data Protection and Freedom of Information to investigate how these two organisations handle the data thus obtained. The investigation revealed that, to date, there is no mass surveillance in the case of the NBSZ: facial recognition is used only in individual cases, limited in time and space. In 2018, the system produced 6,000 matches, on the basis of which 209 checks and four arrests were carried out.
The infrastructure for mass surveillance, however, appears to be taking shape in Hungary: in 2020, the feeds of 35,000 public cameras were routed to the Government Data Center within the framework of Project Dragonfly, creating a central database.
The National Security Services Act allows the secret services to access this database and apply facial recognition software to the data held within it, on the basis of a mere director's order and without any external control over authorisation. For these reasons, we urge legislators to create a precise legal framework for the use of facial recognition systems, in a manner transparent to the public and with the involvement of civil society organisations.