Controversial Facial Recognition is Tracking Kids with Suspected Criminal Profiles in Buenos Aires

October 18, 2020


Deploying a live facial recognition system on top of CONARC could put the future of children in peril.

Technology is always scrutinized through a lens of scepticism. Despite the many advancements Artificial Intelligence and its subfields are contributing, algorithmic bias remains the biggest challenge among experts. And when the technology is integrated into draconian laws, the infringement of human rights is amplified.

George Floyd’s death cast a spotlight on the misuse of technology by authorities. And while many tech organizations have grown apprehensive about the technology’s negative impact on society, some organizations continue to use it perilously.

Ultimately, technology is a double-edged sword.

In a move to identify juvenile offenders and mitigate child-related crime, Argentina’s live facial recognition technology has come under the radar of Human Rights Watch. Integrated with the Consulta Nacional de Rebeldías y Capturas, or CONARC (the National Register of Fugitives and Arrests), this machine learning system uses the photo IDs registered on the national registry to scan for real-time matches through the city’s subway cameras. Once the system catches anything suspicious, it raises an alarm and alerts the authorities.
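The matching step described above can be illustrated with a minimal sketch. NtechLab has not published its actual pipeline, so everything here is an assumption for illustration: the function names, the use of cosine similarity on face embeddings, and the threshold value are all hypothetical, not the deployed system.

```python
import numpy as np

def cosine_similarity(a, b):
    # Similarity between two face-embedding vectors, in [-1, 1].
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_registry(face_embedding, registry, threshold=0.6):
    """Hypothetical sketch of registry matching: compare a face embedding
    captured from a camera frame against reference embeddings derived from
    registry photo IDs, and return the best match above the threshold
    (or None if no entry is similar enough to raise an alert)."""
    best_id, best_score = None, threshold
    for person_id, ref_embedding in registry.items():
        score = cosine_similarity(face_embedding, ref_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id
```

The sketch also hints at the article’s concern: if the reference embeddings come from outdated photos, or the model was trained only on adult faces, a fixed threshold will either miss real matches or trigger alerts on the wrong people.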

Developed by NtechLab, the facial recognition system produces many errors, as its algorithms are trained, tested and tuned only on adult faces. The three algorithms NtechLab submitted for testing were found to be highly error-prone by the National Institute of Standards and Technology in the USA. Moreover, the system fails to recognize children who are even a year or two older than their photos. The photo IDs used for live facial recognition are also not recent, so comparisons are made against outdated images, posing an even higher risk of human rights infringement.

Human Rights Watch has criticised the government and sent a letter to the Mayor of Buenos Aires about publishing children’s personal data online without any regard for privacy. The use of CONARC has already raised eyebrows regarding the authenticity of the published data.

Available online since 2009, CONARC takes the form of a Google spreadsheet with no password to protect the privacy of the suspects, and it has been publishing personal information about people suspected of alleged crimes. The type of crime leading to each arrest is not specified in CONARC, and many discrepancies appear in the ages of the juveniles listed. Many children between 16 and 17 years of age are charged with criminal conspiracy, while a one-year-old was listed for the crime of malicious intent.

This contradicts international human rights law concerning children and juvenile offenders, which promises to protect the privacy of the child at all stages of proceedings. Since 2019, the United Nations Human Rights Council has been warning the government about the misuse of the system, yet the names of 66 children were added to the database after the UNHRC warning.

HRW’s review found that between May 2017 and May 2020, the names of 166 children were added to the database, and names continue to be added and matched using live facial recognition.

The Human Rights Watch report notes that, apart from the discrepancies mentioned above, the database also contains typographical errors, conflicting details, and multiple national ID numbers assigned to single individuals, raising the risk of mistaken matches.

Official documents from the Argentine Computer Law Observatory state that many adults have been detained based solely on automated alerts from the facial recognition system. With continued use of the live facial recognition system, the lives and futures of many children and adults are in danger.
