Last Tuesday (21), 50 parliamentarians from different parties presented bills banning the use of facial recognition in public spaces. The action involved state deputies and councilors from 12 states and the Federal District and is the result of the #SaiDaMinhaCara campaign, organized by Coding Rights, MediaLab/UFRJ, Rede Lavits, the Brazilian Institute for Consumer Protection (IDEC) and the Center for Security and Citizenship Studies (CESeC).
In São Paulo, the bill was presented to the Legislative Assembly jointly by representatives Isa Penna (PCdoB), Leci Brandão (PCdoB) and Erika Malunguinho (PSOL) and is being processed as PL 385/2022. The text prohibits the government from “obtaining, acquiring, retaining, selling, possessing, receiving, requesting, accessing, developing, improving or using facial recognition technologies or information derived from facial recognition technology”, as well as contracting third parties for these services.
Facial recognition has been deployed in several places without due regard for the guarantee of rights. In 2021, the city of Recife announced that 108 digital clocks equipped with facial recognition instruments would be installed throughout the city.
In Rio de Janeiro, in 2019, the Military Police implemented a pilot project for video surveillance through facial recognition. The cameras were initially installed in the Copacabana neighborhood, but the project later expanded to the surroundings of the Maracanã stadium and Santos Dumont airport, in the central region of the city. Later, the government of Cláudio Castro (PL) proposed installing the technology in the community of Jacarezinho, the scene of the deadliest police massacre ever recorded in Rio de Janeiro, which resulted in 28 deaths.
Pablo Nunes, coordinator of Panóptico, a CESeC project that monitors the use of facial recognition technologies for policing, highlights that all these experiences are taking place without any regulatory legislation.
“There is no minimal basis for this type of technology to be used. That is, operational protocols, regulation, clear rules on who will access people’s data, when and how, the life cycle of this data, and who will be responsible for errors during implementation. We don’t have this basic groundwork”, he says.
The results of the implementation of facial recognition so far show a strong racial bias. A study carried out by the Security Observatories Network in 2019, with data collected in four Brazilian states (Bahia, Rio de Janeiro, Santa Catarina and Paraíba), shows that of 151 people arrested, 90% were black.
In an interview with Brasil de Fato, Nunes lays out the main problems of the unregulated use of this technology, especially in the context of public security, where it mainly affects the Black population. “We have to understand that facial recognition means police stops and incarceration. We already know that a large share of the people approached by the police, especially in a violent way, are young Black people. Here in Rio, 63% of the city’s population that is stopped by the police is Black, and we also know that violence permeates these encounters between young Black people and the police”, he maintains.
Another important aspect is the emphasis on incarceration. “Brazil has one of the largest prison populations in the world, and one of the fastest growing. And we have not seen an improvement in our public security from this increase in incarceration; on the contrary. Facial recognition is a new bet on incarceration as the solution to the country’s public security problems.”
Read the full interview:
Brasil de Fato: What is the central line of the bills being presented? Was it a collective discussion of the organizations?
Pablo Nunes: The #SaiDaMinhaCara campaign is the result of a collective effort by CESeC, which runs the Panóptico project, Coding Rights, IDEC and MediaLab/UFRJ. These organizations came together and drafted a model bill banning facial recognition at the state and municipal levels. Based on this model, we contacted parliamentarians throughout Brazil to see if there was any interest in filing these bills, and we carried out the campaign.
Facial recognition has been used in many everyday institutions such as banks. Even the federal government has encouraged people to register their faces. What are the risks of adopting these technologies and why is it important to prevent their advancement?
The focus of the bills that we helped file is somewhat more specific: the use of these technologies in public spaces. There are dozens, hundreds of applications of these algorithms, and we focused on their operation in public spaces, because it is in this use that we find the greatest number of possible violations of privacy, human rights and other legal guarantees.
But this does not change the fact that these algorithms can produce biases that harm access to rights. Take Gov.br, the platform through which we access most services operated by the federal government: if it is coupled with facial recognition, certain people will have more difficulty validating their registration to access those rights. We saw this happen during the covid pandemic, mainly with the creation of the emergency aid, which required validation through facial recognition. This ended up creating a series of difficulties for a certain portion of the population in having these rights guaranteed.
Another application that can also hinder access to rights is in public transport, since many Bilhete Único cards, which go by different names in other states, are linked to the cardholder’s identity and perform facial recognition through cameras attached to the card validation equipment. People often run into difficulties and find their public transport cards blocked over a supposed fraud that facial recognition claims to have found. This is a drama, well documented by Coding Rights, experienced by trans people, whose identity is often questioned by these systems. As a result, these people end up losing access to their Bilhete Único cards.
Panóptico’s research focuses on issues related to public security. What does it show about the way facial recognition has been applied in Brazil?
It has been applied in a totally unregulated way. There is no specific regulation for the use of facial recognition in public security in Brazil. In fact, the General Data Protection Law (LGPD) does not cover this use, because article 4 of the law determines that public security and national defense fall outside the scope of its provisions. So we have a scenario of complete deregulation, combined with a growing number of projects being developed by police forces and municipal guards throughout the national territory.
It’s a troubling scenario, because there is no minimal basis for this type of technology to be used. That is, operational protocols, regulation, clear rules on who will access people’s data, when and how, the life cycle of this data, and who will be responsible for errors during implementation. We don’t have this basic groundwork for starting these projects.
And once they are running, there is no minimal follow-up. The police officers interviewed during our research said they do not keep any accounting of how many people were recognized correctly, how many wrongly, how many people were stopped, how many faces were captured, or how large this database of captured faces is. We simply do not know. And it is very important that these basic data be collected, because they would tell us how efficient, or not, this technology is for use in our daily lives, in the national context.
What we found here in Rio de Janeiro was a report on one day of camera use on the outskirts of Maracanã, showing that 63% of the people arrested that day were wrongly arrested: they were not people with arrest warrants in their name. So what we have seen is precisely a lack of attention to, and regulation of, the use of these technologies. We know that facial recognition technologies produce far more harm than advances, more damage than positive elements for guaranteeing rights or improving the management of security agencies. That is why our position is for a ban.
How does structural racism connect with this issue? And in relation to the identification of trans people, how does this happen?
Structural racism is fully connected with this issue on several levels. First, in the design of the algorithms, which take the structure of white faces as the standard, so that everything outside this standard is treated as an outlier. Racism is also a central element in the operation of these systems and in the choice of places where the cameras are installed. So in Rio de Janeiro we have, first, a cordon around Copacabana controlling certain people’s access through facial recognition – we know that for decades Copacabana has been sealed off against the entry of young Black people, mainly from the peripheries here in Rio. And we also see cameras being used in favelas to control those populations.
And beyond that, we have seen how the police act on these alerts. We have to understand that facial recognition means police stops and incarceration. We already know that most of the people approached by the police, especially in a violent way, are young Black people. Here in Rio, 63% of the city’s population that is stopped by the police is Black, and we also know that violence permeates these encounters between young Black people and the police. And also in terms of incarceration: Brazil has one of the largest prison populations in the world, and one of the fastest growing. And we have not seen an improvement in our public security from this increase in incarceration, quite the opposite. With facial recognition, we are making a new bet on incarceration as the solution to the country’s public security problems.
There is a discourse on security that ends up appealing to people’s fear. Can this lead them to consider these problems as something minor?
We have a scenario in which our rights are placed on a scale and we have to decide which right we are willing to lose in order to gain another. When, in fact, we should have a State that protects and promotes all the rights provided for in its Constitution. And it is important to emphasize that not everyone is asked to be flexible about their rights. It is mainly the Black population, which has suffered the greatest collateral effects of these public security experiments, that is asked to forfeit part of its rights for the supposed greater good.
It is sad to see that, many times, we as a society – parliamentarians, civil society organizations and others – end up accepting that these biases, these side effects of the use of facial recognition and other public security policies, fall on the Black population, and only criticize and confront the advance of this machine of rights violations when the victims are white. This scenario is very serious, but it is what we have seen in Brazil today.
Editing: Thalita Pires