Algorithmic Necropolitics7

7 Translated from Portuguese from Silva, Tarcízio. Racismo algorítmico: inteligência artificial e discriminação nas redes digitais. São Paulo: Edições Sesc, 2022. Chapter 4 (n/p). The organisers of this dossier wish to thank the author and the publisher for their generous permission to publish this version.

Abstract

This chapter of the author’s 2022 book Racismo algorítmico: inteligência artificial e discriminação nas redes digitais (Algorithmic racism: artificial intelligence and discrimination in digital networks) demonstrates the racism encoded in artificial intelligence. It addresses the material and symbolic, often lethal, violence inflicted upon Black and poor individuals and populations by the deployment of predictive systems built on, and fed back by, datasets that reflect a history of exploitation and segregation. It begins with a historical overview of the normalisation of hypervigilance and violent control over racialized populations in the United States and in Brazil. It then shows the continuity of that control in contemporary algorithmic classification systems such as facial recognition, predictive policing, and health and security risk scores. Aligned with official narratives of racial harmony and the meritocratic justification of colour-blind policy, these systems are deemed neutral by their developers and by the private enterprises and public institutions that employ them, who insidiously ignore their bias.

Keywords:
necropolitics; race and technology; surveillance; facial recognition; artificial intelligence

Introduction

“Didn’t they see I was wearing my school clothes, Mom?” were Marcus Vinícius’ last words as he died in his mother’s arms. A 14-year-old resident of Complexo da Maré, in Rio de Janeiro, he was shot by the police during an operation. “Thugs don’t carry school bags”, said his mother, Bruna da Silva, to journalists in her teenage son’s defence, holding his backpack and notebooks. She knew the usual accusations against the victim that would follow (see Barbon, 2018).

The ideological tools of Black genocide (Flauzina, 2014) instantly deny, in this case as in hundreds of others every year, the entire symbolic complexity of being Black in Brazil today. The minutiae of necropolitics and of Black genocide in Brazil cannot be apprehended without honestly considering the abyss between the different social and political realities produced by the hegemonic culture, the single story told by White supremacy.

At the end of the 19th century, the sociologist W. E. B. Du Bois, drawing on Frederick Douglass, proposed the concept of the “colour line”. The concept was designed to understand the United States, but it can easily be extended to much of the continent’s Afro-diasporic reality. The permanent and relative public opacity of post-abolition physical, political and economic oppression in a country divided by contrasts was made possible by White control of formal knowledge, of official representations and of school curricula. Yet at times, by their peculiar situation, Afro-diasporic populations were allowed to develop a double consciousness:

[…] a world which yields him no true self-consciousness, but only lets him see himself through the revelation of the other world. It is a peculiar sensation, this double-consciousness, this sense of always looking at one’s self through the eyes of others, of measuring one’s soul by the tape of a world that looks on in amused contempt and pity (Du Bois, 2018, p. 8).

In countries like Brazil, Black individuals’ and political minorities’ belief in the existence of equality and in the rule of law is only possible thanks to the control of interpretations and representations of social reality. Different classification marks on “killable” bodies act in an apparently paradoxical way. While the ideology of institutions such as the military police conditions their members to dehumanise Black people, hegemonic power erases data, information and, above all, the possibility of a critical, propositional reflection on the abysmal inequality existing in the country.

The right to physical, intellectual and political self-defence is denied to the majority. The belief in the school uniform as a marker of innocence and respect for social rules did not allow Marcus Vinícius to understand that he had become, from an early age, a target for the police. As a young Black boy living in a favela, the heuristics of race and spatialization made him that target. In the words of Beatriz Nascimento, there is a dual or triple society in terms of rights, respect for humanity and the framing of hegemonic social representations. Such dissonance creates, for Black Brazilians, a greater burden of social effort: “In a White society, in which your behaviour has to be guaranteed according to White dictates, you, as a Black person, become moonstruck, you start to live another life, floating without any ground on which to land, without reference, without parameters” (Nascimento, 2018, p. 249).

Social classifications are instrumentalized as exercises of power and as records - or as refusals or falsifications of records - of the impact of racial relations. In a society driven by the imbrication of racism in technologies, the processing of automated decisions about individuals and groups based on machine learning intensifies the tendency to erase subjects and render inequalities opaque.

The colonial genesis of necropolitics and carceral imagination

In order to deal with the technologies that organise and classify individuals by racializing logics, we need to recover the history of processes of normalisation, hyper-surveillance and violent control of certain groups.

The colonial project, especially after the 18th century, transformed the face of the world and of humanity - even as it promoted dehumanisation. As in the so-called “conquest” of regions like the Americas, that project began with the genocide of native peoples and led to the production of material technologies and to the management of large populations for the benefit of a Eurocentric project. During that period, practices of extermination, expropriation, domination, exploitation, genocide, torture and sexual violence were naturalised, in a global hierarchy in which such horrors became spatially stratified (Maldonado-Torres, 2018).

However, “given the order of things in the modern/colonial world, questions about colonisation and decolonization cannot be relevant, except as a mere historical curiosity” (Maldonado-Torres, 2018, p. 33). This rests on denials that reproduce the invisibilization of the role played by the colonial project and by White supremacy in institutions and constructs such as prisons, the city and the law, in addition to the police apparatus and its practices. Surveillance and hierarchically racialized social classifications are the fulcrum of the differential allocation of humanity that sanctions the global maintenance of capitalism (Almeida, 2018, p. 56). The horror of the other and of otherness, supported by fictions of the “races” outside the Eurocentric hegemonic pattern, enabled the development of technologies for discipline, control and punishment perfected throughout the history of colonial invasion and extraction.

In her groundbreaking book Dark Matters: On the Surveillance of Blackness, Simone Browne unveils the genesis of contemporary surveillance, especially in the centuries of slaver colonialism in the Americas. She recovers Frantz Fanon’s pioneering characterisation of Modernity as the process of “classifying” men through “records, files, time sheets, and identity documents that together form a biography, and sometimes an unauthorised one, of the modern subject” (Browne, 2015, p. 16). This resonates with the datafied representations of subjects in contemporary databases.

Before social classification was systematised by state mechanisms, the colonial-slavery project dealt with the challenge of managing the horrors of the forced migration of millions of enslaved Africans. Such management was accomplished by technologies that transformed humans into merchandise, that dehumanised Africans on the basis of Christian ideology and scientific racism, and by control tactics to prevent escapes and insurrections.

Browne argues that “the history of branding in transatlantic slavery anticipates the ‘social sorting’ [...] contemporary surveillance practices, including passports, identification documents, or credit bureau databases” (ibidem, p. 44). Shared on different scales by English, Dutch, Spanish and Portuguese slaveholders, the practice of iron branding was adopted on a large scale, and not only as an instrument to torture and subjugate enslaved people. It was also used for commercial and international body management - some marks distinguished, prior to the individual slaveholder’s marking, which imperial nation had kidnapped the enslaved - and to classify individuals suitable for exploitation. Marking was also used to impose an extra layer of marginalisation, as in the case of the letter F for “fugido” (runaway) in Brazil (Oliveira Filho, 2009).

Colonial projects based on the labour of enslaved people were similar in adapting technologies to control and criminalise populations. American and Brazilian historians have recorded the mandatory use of lanterns by enslaved people who needed to move around the city without raising suspicion (Silva, 2008). For illiterate slaveholders, the lantern substituted for the written note - bearing the name of the enslaved person, of the “owner” and of the task in progress - that allowed the former to move around town after curfew.

In recent history, manifestations of racial differentiation still remain in the relationship between individuals and identification documents, in a context of hypersurveillance in everyday life. Such is the value of showing one’s employment record to avoid being framed by the laws that have criminalised vagrancy from the imperial period to the present, and especially during the military dictatorship (see Pires, 2015); or of carrying purchase invoices in response to the generalised suspicion to which Black and poor persons are subjected when they wander into a shop (Menezes, 2009).

The modernisation of scientific-colonial racism in the fields of phrenology and criminology extended, through the use of biometrics, the praxis of persecuting Black and enslaved people. Browne also drew attention to the role of newspaper advertisements in the persecution of resisting and runaway enslaved people - a form of consumption of Black subjects by a presumably White public. The readers of these advertisements are understood as an “‘imaginary community’ of surveillance: the eyes and ears of face-to-face watching, observing, and regulating” (Browne, 2015, p. 72). The tactic was common across the slaveholding world, where Blacks whose human rights had been violated came to be ideologically constructed, also in newspapers, as “opposed to work and freedom, requiring constant vigilance” (Abreu, 2012, p. 95).

Brazilian authorities, by their “recourse to editing posture codes” - municipal bylaws - “tried at all costs (almost always unsuccessfully) to control the steps of the African and Afro-descendant population” (Silva, 2008, p. 2). They created legal instruments and police infrastructure to promote social stratification, defining which groups should be placed under perennial suspicion and constantly monitored. The mass media of the time became a tool not only to support surveillance, but also to promote and normalise distributed surveillance carried out by hegemonic groups, in consonance with the racist State.

At the end of the 19th century in São Paulo, although the number of captives had always been limited, this did not prevent the authorities from justifying strict measures of control and surveillance (Machado, 2004, p. 85). These were intensified when the city became a route and destination for streams of runaway enslaved people.

A relationship of mutual dependency ensued between the military police apparatus in Brazil, already developing at the beginning of the 19th century, and the slaveholding and corrupt “elites”. The control and genocide of subaltern populations was instilled into police ideologies and practices, especially in the face of the threat posed by abolitionist movements.

Among the numerous recorded cases that naturalised the use of police force by the White middle classes in defence of their everyday privileges, it is worth mentioning the persecution of Joaquim Mina, a freedman, in the city of Itu in 1856. Joaquim, a healer and counsellor of other enslaved Black people, was persecuted by a group of the town’s slaveholding citizens. He was reported to the police for alleged practices of “witchcraft” and for disseminating ideas of insubordination that would influence the enslaved in the region. Among the informers was the physician Ricardo Gumbleton Daunt, most irritated, who claimed special offence to his profession: some slave owners preferred to have their victims’ health treated by a healer rather than seek a diploma-holding physician qualified and authorised by Emperor Pedro II himself.

The physician, by the mid-19th century, did not accept competition from someone he did not consider human, especially because he, himself, was also a slave owner. How could he accept that “a disorderly Black man with vicious behaviour would spoil the only slave he could purchase in a long time?” (Lima, 2009, n/p).

The maintenance and intensification of racist violence in countries like Brazil and the United States, even among low-income Whites, involves layers of White privilege in the distribution of different resources. Two essential aspects of the management of lower-class affiliation to White supremacy, in economic and political terms, are noteworthy: the individualist, neoliberal projection of the possibility of social climbing, based on the hegemonic cultural equation of whiteness with success; and the introjection of the - sometimes tacit, sometimes explicit - knowledge of which groups are “killable” and subject to exclusion.

W. E. B. Du Bois used the term “psychological wage” for the way in which certain guarantees of respect for humanity and access to public resources, albeit minimal, motivated the White American proletariat to prioritise racial privilege over class interest, preventing the alliance of broad population strata against exploitation by the real owners of power and capital (Nopper, 2016). Similarly, in the 1960s, Abdias Nascimento drew attention to the racialized perception of coercive power in the hands of the ruling classes, manipulated as “an instrument capable of granting or denying African descendants access and mobility to socio-political and economic positions” (Nascimento, 1978, p. 76).

The evolution of communication and information media and technology was driven by ideologies affiliated with White supremacy, resulting in the incorporation of a “carceral imagination” into artefacts and culture. Ruha Benjamin’s concept offers a lens into this matter:

Visions of development and progress often build on forms of social and political subjugation that require actualization as new techniques of classification and control. As researchers set out to study values, assumptions, and desires that shape science and technology, we must also remain mindful of racial anxieties and fears that shape the design of technoscience. (Benjamin, 2020, p. 19)

Understanding algorithmic carceral technologies, such as the spread of facial recognition, means recognising that the current “carceral imagination” in countries shaped by colonialism and White supremacy turns on “who and what is fixed in the same place - classified, cornered and/or coerced” (ibidem, p. 20), and on how technologies and institutions are created to maintain and promote exploitative social hierarchies.

This dynamic, central to structural racism in view of the construction and constant updating of Black people as representations of danger and otherness, was fully coupled with the culture of incarceration (Borges, 2019) as the country’s solution for deviants. If we agree with Achille Mbembe when he states that “racism is above all a technology destined to enable the exercise of biopower,” and that its function is “to regulate the distribution of death and allow the murderous functions of the state” (Mbembe, 2018, p. 18), then the establishment of more or less diffuse racial caste structures in countries like Brazil and the United States (Alexander, 2018) has promoted, among the different groups in those countries, a differential introjection of the relationship with the police and incarceration.

Facial recognition and technochauvinism

In November 2019, a news report by TV Itapoan, in the state of Bahia, interviewed the mother of a teenager who had been mistaken for a drug dealer. On the basis of a mistaken identification from security camera images, he was approached inside the metro in Salvador and taken to the police station. According to the woman, who remained anonymous for her own safety, her son arrived home shaken by the violent treatment he had received and needed several minutes to calm down before telling her what had happened. After that, he became “afraid to go to school, afraid to take the bus, afraid to take the subway, emotionally disturbed” (see Silva, 2019).

In January 2020 in Detroit, in the United States, Robert Williams, 42, was approached by the police at home, in front of his wife and his 2- and 5-year-old daughters, accused of robbing a watch shop. The officers had found his name and address in a neighbouring town by running an image from a surveillance camera against the department’s facial recognition database. When his wife asked the officer where he was being taken, she received a curt “Google it” as an answer. At the police station, Williams had to show officers repeatedly how little the photo of the man in the security camera footage looked like him, or like his photo in the database. Despite this, the police were slow to doubt the authority of the computer system. After 30 hours of wrongful arrest, Williams had to pay bail, and he still suffers the family, personal and psychological sequelae of the episode (Hill, 2020).

Although shocking, cases like these are becoming more and more frequent, albeit underreported. Facial recognition for police purposes has been around for over twenty years, but the combination of cheaper technology, larger biometric databases, legislative leniency and corporate lobbying has accelerated its use lately. While other cases of false positives may be even more disturbing, these two bring up key points about the problem of facial recognition: its relationship with public transportation infrastructure and with the right to the city; and the normalisation of computational decision-making as an escape from the individualisation of human responsibility.

Similarly to what happened with the feeding of personal information to social media platforms, the normalisation of biometric data collection and processing in urban spaces begins with the obtention of seemingly positive or harmless benefits. In 2018 in the city of São Paulo, for example, the first subway line licensed to the private sector, ViaQuatro, of the CCR group, tried to frame as a positive innovation the installation of digital advertising panels that counted the people who looked at the screens and supposedly recognized their facial expressions. After a request by a consumer protection agency, the São Paulo courts ordered the removal of that technology (G1 São Paulo, 2018).

The public normalisation of this technology in the country and in the State of São Paulo has taken great strides since its initial adoption by transportation networks. An exploratory survey of facial recognition systems adopted by public authorities showed that, of the - growing - number of reported cases, 44% occur in public transport equipment and infrastructure.8

8 See the infographic produced by Instituto Igarapé at https://igarape.org.br/infografico-reconhecimento-facial-no-brasil/

In early 2020, the government of São Paulo inaugurated the “Facial and Digital Biometric Identification Laboratory” to advance the management of digital biometric data and to promote the use and standardisation of facial recognition (São Paulo, 2020). On the occasion, Governor João Dória celebrated, in technical jargon, the inauguration of a laboratory that “locates the bandit before he commits the crime” (sic) by means of the over 30 million citizen photographs in its database. Closing his speech, he congratulated the Chief of the Civil Police, “who had never conducted as many arrests as over the past 13 months” (São Paulo, 2020). The laboratory is part of the Ricardo Gumbleton Daunt Identification Institute - named after an innovative criminologist and forensic fingerprint analyst who, in turn, was named after his grandfather, the slave-owning physician mentioned in the previous section.

The lobby of artificial intelligence and public repression technology companies has taken advantage of the wave of far-right political projects around the world, from Trump to Bolsonaro. Wilson Witzel, elected governor of Rio de Janeiro for supporting Bolsonaro’s violent and segregationist ideas, made the reinforcement of public policing through surveillance and facial recognition systems one of his main campaign flagships. Adopted with pomp, but without much preparation or discussion, at the 2019 Carnival, with the support of partner companies, the systems were deactivated the following year, supposedly due to the impact of the pandemic (Heringer, 2020). Witzel was removed by the High Court on corruption charges in 2020 and lost office in May 2021, becoming the first Rio de Janeiro governor to be impeached since the 1964-85 military dictatorship.

This particularly reactionary atmosphere makes room for technology companies to promote their products along with repressive political projects. The convergence of the fear of public spaces with the belief that more police and more technological devices - both factors permeated by racism - would be the solution promotes what Meredith Broussard calls technochauvinism.

Technochauvinism is the belief that tech is always the solution. [...] [It] is often accompanied by fellow-traveller beliefs such as Ayn Randian meritocracy; technolibertarian political values; celebrating free speech to the extent of denying that online harassment is a problem; the notion that computers are more “objective” or “unbiased” because they distil questions and answers down to mathematical evaluation. (Broussard, 2018, p. 7-8)

The police officer who trusts the algorithmic system more than his own eyes, even when face to face with a suspect who is the victim of a facial recognition false positive, represents one of the most eloquent embodiments of the interface between racism and technochauvinism. Diluting responsibility by attributing to technology the agency for decisions about approach, identification, classification or conviction - by means of devices such as facial recognition, predictive policing and risk scores - is one of the greatest dangers of algorithmic racism.

The mistakes and successes of recognition: penal selectivity

Facial recognition technologies are incredibly inaccurate, and their mistakes are not unknown to academia, activists or public authorities. Yet facial recognition systems continue to expand, despite the growing number of state and independent reports that verify their weaknesses. In addition to the more comprehensive studies on computer vision inaccuracy cited above, two cases deserve attention. Between 2016 and 2019, researchers at the University of Essex followed a series of exploratory facial recognition deployments by the Metropolitan Police in London, which ran watch lists of thousands of wanted people’s faces against various public spaces. About 38% of the program’s indications were deemed not credible by the police even before the suspects were approached. Even with this filter, the approaches carried out were prone to error: about 63% of the individuals approached were “false positives” - people who were not the ones wanted.

Nonetheless, study leaders Pete Fussey and Daragh Murray (2019) also highlight a particular problem in terms of the project’s cost-effectiveness. When the number of wanted people included in the watch lists is compared with the approaches indicated and carried out, the discrepancy is even greater: datasets with more than 2,400 suspects generated only eight arrests. Even worse proportional data was reported in Brazil, where a huge facial recognition infrastructure was deployed during the Micareta (a traditional Carnival-like event, held on a different date) in Feira de Santana, Bahia. The collection of 1.3 million snapshots led to only 18 warrants (G1 Bahia, 2019).
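
As a rough illustration of why these proportions matter, the short sketch below redoes the arithmetic implied by the figures above. The reported numbers are those of the London trials and of Feira de Santana; the normalisation to 100 approaches is ours, for readability.

```python
# Rough arithmetic on the reported live facial recognition outcomes.
# The figures come from the cases cited above; the rest is illustrative.

def precision(true_matches: int, flagged: int) -> float:
    """Fraction of acted-on alerts that actually identified a wanted person."""
    return true_matches / flagged

# London trials: about 63% of the individuals approached were false
# positives, i.e. only ~37% of approaches found the person sought.
approaches = 100          # normalised to 100 approaches for readability
false_positives = 63
print(precision(approaches - false_positives, approaches))   # ~0.37

# Watch lists vs. outcomes: over 2,400 listed suspects, eight arrests ...
print(8 / 2400)           # ~0.003: roughly one arrest per 300 listed suspects
# ... and in Feira de Santana, 1.3 million captured faces, 18 warrants.
print(18 / 1_300_000)     # ~1.4 warrants per 100,000 faces captured
```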

They also noted the problem of “street officers not waiting for the decision-making process in the control room - a clear example of presumption in favour of intervention” (Fussey and Murray, 2019, p. 125), which reinforces the danger of the relationship between technology and police culture, regardless of the accuracy or inaccuracy of the recognition analysis.

One of the cases reported in the London study was a violent approach to a 14-year-old Black boy, dressed in school clothes - also a “false positive”. Despite those mistakes, the city police planned to expand the system the following year. As stated by a police commissioner, the institution wanted to “make sure these deployments are effective in fighting crime but are also accepted by the public. Londoners expect us to deploy this technology responsibly” (Dearden, 2020). However, an investigation by The Independent had shown that individuals approached by the police during the testing stages did not receive explanations about the motives for the approach or about the technology involved (Dearden, 2018).

* * *

The National Institute of Standards and Technology in the United States has been analysing facial recognition and biometrics resources for verification and identification since 1994, with increasing coverage due to the development of this market. Some previous editions had, though inconsistently, included measurements of demographic variables such as race, gender and age; after public pressure on the topic, a detailed report was published in 2019 on the demographic effects of 189 tested algorithms from countries such as the United States and China. The study identified that “false positive” error rates are ten to one hundred times higher for photos of Black, Asian and Native American individuals than for White individuals. For the Black population, errors were consistently more prominent in police systems.

It is important to recognize the authors’ points of view implicit in the risks indicated in the report. The document had State control of people’s access and circulation as its main concern:

in a one-to-one access control, false negatives inconvenience legitimate users; false positives undermine a system owner’s security goals. On the other hand, in a one-to-many deportee detection application, a false negative would present a security problem, and a false positive would flag legitimate visitors. (Grother et al., 2019, p. 6)
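
The distinction the report draws between one-to-one verification and one-to-many search can be made concrete with a minimal sketch. This is a schematic illustration under invented parameters - the cosine-similarity metric, the threshold and the false-match rate are placeholders, not any vendor’s actual pipeline.

```python
import numpy as np

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings (placeholder metric)."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """One-to-one: compare the probe against a single enrolled template.
    Here a false positive defeats the system owner's security goal."""
    return similarity(probe, enrolled) >= threshold

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6):
    """One-to-many: search the probe against an entire watch list. Every
    comparison is a fresh chance of a false match, so alarms multiply
    with gallery size."""
    scores = {name: similarity(probe, tmpl) for name, tmpl in gallery.items()}
    above = {name: s for name, s in scores.items() if s >= threshold}
    return max(above, key=above.get) if above else None

# If each comparison has a per-pair false-match rate p, a gallery of n
# faces produces at least one false alarm with probability 1 - (1 - p)**n.
p, n = 1e-4, 30_000_000   # n echoes the ~30 million photos cited above
print(1 - (1 - p) ** n)   # effectively 1.0: false alarms become routine
```

The demographic finding described above adds a further twist: the per-pair rate p is not uniform but ten to one hundred times higher for some groups, so the false alarms concentrate on those already over-surveilled.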

The ability of technology providers to audit their own systems, the acceptance of error rates, the way in which the systems are installed and sold and, finally, the way in which government buyers evaluate or accept such errors are just as important as - or more important than - the metric indicators of accuracy or error.

In societies such as Brazil’s, where racist penal selectivity is the rule, tracking technologies can only further the mass incarceration of specific groups. With more than 700,000 people incarcerated, Brazil has the third largest prison population in the world, behind only the United States and China. In 2000, the number was much smaller: 232,000 people. The reasons for this escalation include instruments that intensify enforcement, such as Law 11.343 (the “Anti-Drug Law”). The sluggishness and inhumanity of the Judiciary resulted in 292,000 people imprisoned without conviction in the 2016 survey. Almost half of pre-trial prisoners had been incarcerated for more than ninety days without trial or verdict (Departamento Penitenciário Nacional, 2016).

Among the prison population, 64% are Black and 75% did not complete high school. When we cross this data with the reasons for arrest, we come across the criminalization of blackness and poverty. Among men, 26% have been arrested for trafficking and 12% for theft, while 62% of women are held for trafficking and 11% for theft.

Such differences have consequently produced new layers of immediate discrepancies in the application of algorithmic prison technologies. A pioneering study by the Rede de Observatórios da Segurança (Security Observatory Network) showed that 90.5% of those arrested on the basis of facial recognition were Black, with the states of Bahia, Rio de Janeiro and Santa Catarina leading the use of this technique (Nunes, 2021).

The data was collected from press coverage of approaches and from spontaneous statements by public security departments, but the accuracy of the information and the transparency of State initiatives still fall short. Pablo Nunes (2021) argues that facial recognition systems are presented as a way of modernising police practice but, according to Barbon (2019), “they have actually represented a setback regarding efficiency, transparency, accountability and the protection of the population’s personal information”.

The widespread contempt for instruments of public transparency, such as the Access to Information Law, is related to the social incentive to stigmatise citizens who have somehow come into contact with the prison system. Jurists observe that, without the necessary transparency from government agencies about the systems adopted, their accuracy rates and how they are trained, it is likely that “mass incarceration will become more and more intense, mainly based on the false identification of suspects” (Silva and Silva, 2019, p. 14). Exploratory data suggests that this is a growing trend - and facial recognition is merely the most visible algorithmic prison technology in the process.

Risks, spatialized and embodied

The efforts to restrict the movement of people in segregated public space unfold on several fronts, such as urban planning, transportation and housing. The differential policing of regions supposedly more violent or problematic, from the State’s point of view, in order to promote the incarceration of specific groups, is entangled with the history of the police institution itself. In the name of efficiency, however, predictive policing and risk scores for regions and populations increase segregation.

Crime maps and “hot spots” are traditional resources. In Brazil, they go back to the Imperial period (Ferreira, 2011), and over the past decades they have been transformed by data digitalisation and the seduction of algorithmic potential. The building of “control centres” strengthens the role of business normativity in police practice (Cardoso, 2019), managing information in the pursuit of public safety and prioritising indicators and the reach and reproduction of spatialized goals.

The core of predictive policing is to allocate human resources - such as patrols and the watch for suspicious behaviour - and technological equipment, such as surveillance cameras, directing proactive surveillance over spaces according to criminality rates measured over time. However, selective policing practices, driven by an imaginary of who the criminal is and of which types of crimes are observed and registered, generate a criminalising feedback loop around certain regions and populations.
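
This feedback dynamic can be made visible with a toy simulation. The sketch below is deliberately minimal and all its numbers are invented: two districts share the same underlying offence rate, but patrols follow past records, so an initial recording bias reproduces itself.

```python
import random

random.seed(0)

# Two districts with the SAME underlying offence rate.
true_rate = {"district_a": 0.05, "district_b": 0.05}

# Historical records over-represent district A (e.g. past selective
# policing), so the "data-driven" allocation starts out biased.
recorded = {"district_a": 120, "district_b": 60}

for year in range(10):
    total = sum(recorded.values())
    for district in recorded:
        patrols = int(100 * recorded[district] / total)  # patrols follow records
        # Offences only enter the data where patrols are looking:
        new_records = sum(
            1 for _ in range(patrols) if random.random() < true_rate[district]
        )
        recorded[district] += new_records

share_a = recorded["district_a"] / sum(recorded.values())
print(f"district A's share of records after 10 rounds: {share_a:.0%}")
# The initial 2:1 imbalance persists even though the underlying offence
# rates are identical: the records measure policing, not crime.
```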

As we have seen, the criminal types that produce both police engagement and high levels of incarceration are largely associated with drug trafficking and crimes against property, such as theft. A survey coordinated by Jacqueline Sinhoretto on racial inequality in police lethality and red-handed arrests in São Paulo reveals that “police surveillance singles out Black people and recognizes them as criminal suspects, whose illegal conducts are therefore monitored more intensely; while whites, less targeted by police surveillance, enjoy less visibility before the police and are, therefore, less frequently caught in their criminal activity” (Sinhoretto et al., 2014, p. 27). One of the study’s most shocking indicators is police lethality: 61% of the victims surveyed are Black, a rate three times that of White citizens. Additionally, 96% of police-inflicted deaths were either shelved or did not lead to an indictment.

When structural constructions of racism are added to these practices, to the framing of favelas and urban peripheries as criminalised spaces, and to the violent typifying of conducts such as drug trafficking, police power actually means the freedom to kill in a discretionary way.

Since the 19th century, the criminalization of the use, cultivation and sale of substances such as those derived from cannabis has been conducted on the basis of the fear of ideological subversion of social and labour relations. “Angolan tobacco,” “African tobacco” or even, shamelessly, “Black people’s tobacco” were terms used throughout the 19th and 20th centuries to refer to marijuana, which was criminalised for the first time in Brazil and has a relevant role in this history. A physician from the state of Sergipe, of Lombrosian inspiration, defended its international criminalization from a shamelessly eugenicist perspective directed against Black people. The institutional and social breadth of such ideological heritage easily justifies the murder of the Black and poor population perpetrated by police officers, especially in favelas and urban peripheries, through the frequent allegation that the victim was involved in drug trafficking (Ribeiro Junior, 2016).

Based on the analysis of four thousand drug trafficking sentences in São Paulo, journalists of the Pública press agency concluded that Black people are more often convicted while carrying smaller quantities of drugs. In the case of marijuana, the average quantity apprehended ranged from 136.5g for Black convicts to 482.4g for White convicts. The most noticeable discrepancy was in the typification of those caught carrying up to 10 grams of marijuana: 68.4% of the Black citizens approached by the police were classified as drug dealers, against 18.1% of the White citizens (Domenici et al., 2019).

Even without counting the frequent use of so-called “bust kits”,9 uncritical acquiescence to the value of police agents’ testimonial evidence in the “war on drugs” is the first tool of the State’s selective racist criteria for incarcerating the Black and the poor. The feeding of algorithmic systems with data generated by institutions in the funnels of public safety, which tie factors of blackness and poverty to outcomes such as incarceration and death, should not, therefore, be naturalised.

9 In Portuguese, “kit flagrante”: a quantity of illicit drugs carried by corrupt police officers in order to incriminate citizens by claiming it was in their possession.

The case of COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) has become emblematic of algorithmic evaluation in criminal justice, particularly of recidivism risk scoring. COMPAS’ main product is a set of risk scores based on 137 variables and questions related to the defendants. A recidivism risk score from 1 to 10 is supposed to help judges and parole counsellors decide on bail, sentences and the possibility of alternative sentences, considering the probable or possible repetition of offences.

Journalists of the ProPublica news agency identified a disproportionate attribution of risk between White and Black defendants. In one case, a White man, Vernon Prater, arrested for theft but with a history of armed robberies and aggravated theft, received a score of 3, which represents low risk. A young Black woman, Brisha Borden, also arrested for theft, had only a history of juvenile offences, but received a recidivism score of 8, which is high. Comparing the offences is as shocking as the system’s predictive (in)capacity: with a low score, Vernon reoffended and took part in a robbery of great proportions; Brisha, with a high score, did not (see Angwin et al., 2016).

However, the 137 variables explain only part of the problem. The composition of the variables in the questionnaire is telling. Among the items in the form, one could find: How many of your friends or acquaintances have already been arrested? How many times have you moved in the past year? Do you have a nickname? Have your friends and neighbours ever been victims of crimes? How many of your friends use illegal substances? Have you ever been suspended from school? How often do you get bored? Do you agree that a starving person has the right to steal? Do you agree that the law does not help the average citizen?

Besides the most obvious questions, such as the type of crime and the history of offences, the supposedly predictive variables also survey social conditions, relationships and even declared attitudes. Sociologist Ruha Benjamin points out that, because “all these variables are structured by racial domination - from job market discrimination to ghettoization - the survey measures the extent to which an individual’s life chances have been impacted by racism without ever asking an individual’s race” (Benjamin, 2019, p. 81).
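
Benjamin’s point - that such a model can measure race without asking it - can be demonstrated with a minimal synthetic sketch. Everything below is invented for illustration: the feature names echo the questionnaire items listed above, the weights are arbitrary, and the group-level differences are simulated effects of structural segregation, not COMPAS data.

```python
import random

random.seed(1)

def synthetic_defendant(group: str) -> dict:
    """Questionnaire-style features whose distributions differ by group
    because of simulated structural conditions (e.g. neighbourhood
    policing intensity drives 'friends arrested')."""
    exposure = 0.7 if group == "over_policed" else 0.2
    return {
        "friends_arrested": sum(random.random() < exposure for _ in range(10)),
        "moves_last_year": random.randint(0, 2) + (1 if group == "over_policed" else 0),
        "school_suspension": random.random() < exposure,
    }

def risk_score(x: dict) -> int:
    """A COMPAS-like weighted sum clipped to the 1-10 scale.
    Note that 'race' is never an input."""
    raw = 1 + 0.8 * x["friends_arrested"] + 0.7 * x["moves_last_year"] \
            + 3.0 * x["school_suspension"]
    return min(10, max(1, round(raw)))

for group in ("over_policed", "less_policed"):
    scores = [risk_score(synthetic_defendant(group)) for _ in range(1000)]
    print(group, "mean score:", round(sum(scores) / len(scores), 1))
# The over-policed group scores systematically higher even though the
# scoring function never sees group membership: the proxies carry it.
```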

An additional result of this process is that the distribution of the recidivism risk score has become extremely biased and prejudicial against Black Americans, whose scores concentrate at the top of the scale, while those of White Americans concentrate at the bottom.

The necropolitical organisation of the world involves a constant transformation of the mechanisms of violence, punishment and classification of individuals by hegemonic powers, the heirs of colonialism. Algorithmic technologies, and the definition of the acceptable limits of what counts as quality and efficiency in artificial intelligence, are shaped by this state of power relations. Achille Mbembe writes that “[necropolitics involves] a production of borders and hierarchies, zones and enclaves; the subversion of existing property regimes; the classification of people according to different categories; resource extraction; and, finally, the production of a broad reserve of cultural imaginaries” (Mbembe, 2018, p. 39).

Over the past centuries, the prison has become one of the most important features of our imagination about society, to the point that we assume, as Angela Davis (2018) points out, “their existence as something natural. Prison has become an essential ingredient in our common sense”. Such imaginaries articulate structural dimensions such as ideology and culture, connected to the State’s condition as a reproducer of White supremacy, as well as to its feedback in the form of data and the production of imaginaries, stereotypes, views and perverse classifications regarding life, death and violence.

Criminalization of racialized visualities and faces

Tamir Rice, 12 years old, like Marcus Vinícius and thousands of Black children around the world, also mistakenly believed that he could live his childhood to its fullest, as White American boys do.

He was playing in a park with a toy gun when another resident on the block called the police, claiming that someone - probably a teenager - was threatening people with a toy gun. Police officers sent to the scene received information by radio that there was a man carrying a real gun. They arrived at the scene and murdered Tamir immediately, in a state where it is legal to carry weapons in public spaces as long as they are in plain sight.

The reproduction of racist inferences that identify Black individuals and members of other racialized groups as potentially violent is one of the pillars of White supremacy, insofar as it justifies violence and control and prevents class-based solidarity and associativism. Mistaking everyday objects for weapons and shooting before asking are routine among Brazilian police. In recent killings, police officers claimed to have mistaken a range of objects, from umbrellas to drills, for weapons (Notícia Preta, 2019).

In an experiment publicised on Twitter, Nicolas Kayser-Bril, a member of AlgorithmWatch, uploaded to Google Vision - a visual computing service that labels objects and concepts in images - two photos of people holding a portable thermometer, an artefact popularised during the Covid-19 pandemic. In the photograph where a white hand holds the thermometer, the label with the highest confidence was “technology”, at 68%, and no label with a negative connotation appeared. In the photo showing a Black person’s hand holding the thermometer, “gun” led with 88%, and “firearm” appeared with 65%.

Because it was publicised on Twitter, the experiment was doubly enlightening about the affiliation of part of the techno-scientific community with the belief in technological neutrality. Dozens of people immediately claimed that the position of the thermometer was the reason for the difference in labelling. In response, developer Bart Nagel cropped the photo of the Black person’s hand and ran it through Google Vision again, comparing it with a version of the same photo edited to make the hand look white. In the first picture, the tag “gun” appeared with 61% confidence; in the second, the tag did not appear.
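
For readers who wish to reproduce this kind of probe, a minimal sketch using the Google Cloud Vision Python client follows. It assumes a configured Google Cloud project and credentials; the image file names are placeholders, and the labels returned will vary as the underlying model is updated.

```python
# Minimal label-detection probe with the Google Cloud Vision client
# (pip install google-cloud-vision). Assumes GOOGLE_APPLICATION_CREDENTIALS
# points at valid credentials; the file names below are placeholders.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

def labels_for(path: str) -> list[tuple[str, float]]:
    """Return (label, confidence) pairs for one local image file."""
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    return [(lbl.description, round(lbl.score, 2))
            for lbl in response.label_annotations]

# Compare the labels assigned to two otherwise similar photographs:
for path in ("thermometer_light_skin.jpg", "thermometer_dark_skin.jpg"):
    print(path, labels_for(path))
```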

In a note issued to AlgorithmWatch, Google claimed that it does not systematically discriminate. The company apologised “to anyone who may have been offended” and argued that this is not a consistent bias. It agrees, however, that “when someone complains, many have already been disproportionately impacted by the model’s biased performance” (Kayser-Bril, 2020). This confession says a lot about a stance on possible damages that is only reactive - let us remember that the company also offers products and services to military and police institutions.

With a less immediately disastrous impact, but established on the same basis, automated content restriction and moderation systems have been employed in environments such as Facebook and Instagram to prevent the visibility or dissemination of violent images. Among the documented cases of problems of - at least - accuracy is one reported by illustrator Gabriel Jardim. On November 20, no less - National Black Awareness Day in Brazil - an illustration by the artist was prevented from being promoted on the platform.

Figure 1
Driver Lewis Hamilton, by Gabriel Jardim (2020)

There was an attempt to boost the illustration’s reach through Facebook/Instagram’s ad service in order to promote a store. It was not possible, however, because the image allegedly included “sales of ammunition, firearms, paintball, pellet guns or other types of weapons”. In the illustrator’s opinion, the image was refused because “the drawing is set in a hood with Black characters”. In the image, it is possible to recognize the visual elements of a favela or popular neighbourhood, but nothing that indicates violence. In Brazil, however, the hegemonic imagery culture favours violence in journalistic coverage and in fictional narratives set in such spaces.

In both cases, the production of automated identification systems for objects or contexts is based on a history of representations, stereotypes and framings that produce visibilities and invisibilities in a disproportionate way, linked to asymmetries of racial and power relations. Specific spaces were bequeathed to Black Brazilians, such as “the space of poorly paid work, of precarious labour, of domestic work, of the favela, of the hill (morro)10 and of prisons” (Jardim, 2017, p. 196) - and the collective and creative construction of other imaginaries related to these places is not allowed or accepted.

10 Translator’s Note: favelas in Rio de Janeiro are typically set on hillsides.

In addition to the aforementioned platform moderation scandals, another leak of moderation rules, this time from TikTok, relates to the criminalisation or rejection of perceptions of poverty just reported. The leaked documents revealed rules for human moderation against fat bodies, bodies considered “abnormal”, “old” or “ugly”, and the display of dilapidated environments or substandard dwellings such as slums - content that is not deleted, but intentionally hidden (Biddle et al., 2020).

* * *

Beyond objects and contexts, one can notice that - under new names - the worst beliefs of 19th-century scientific racism, such as phrenology, are being restored and normalised. The effort to discover facial features linked to criminal tendencies was long ago dismissed as pseudoscience, but artificial intelligence has given new life to some advocates of these ideas.

From England to China, the intention to identify criminals’ facial patterns, expression patterns, or even body movements has undergone a sort of normalisation within the machine learning community. A study published in 2016 by Chinese researchers claims to have identified patterns in the faces of criminals by exploring a dataset of 2,000 photos. The scientific community refuted the study on ethical, political and technical grounds, but the researchers replied in an additional publication arguing that, “like most technologies, machine learning is neutral” (Wu and Zhang, 2016, p. 2), and that the study should be praised for supposedly being the “first to study face-induced inference on criminality and remaining free from any subjective biases of human observers” (Ibidem, p. 9). The study, however, offers absolutely no discussion of the social impacts that could derive from the system’s possible implementation, nor any reflection on criminology or on how the people included in the database had come to be incarcerated.

Other technologies already transformed into services trade on this same simplification of supposed hints of emotions and of dangerous inner states. Headquartered in England, WeSee promises, even with low-quality videos, to be able to “determine an individual’s mental state or intentions from their facial expressions, postures, gestures and movements” (China Daily, 2018), even when these are imperceptible to human eyes. The possibility that such identifications may simply encourage “society to double down on its existing priorities on crime detection” (Pasquale, 2018, p. 26) - priorities tied to specific spaces, target groups and criminal types - is a motivation that cannot be ruled out, alongside the commercial interest of developers and entrepreneurs invested in the carceral imagination. Israeli startup Faception, for example, gained media and investor attention by promising “personality facial analytics”. Applying facial recognition and machine learning techniques, it is supposedly able to classify faces as those of potential researchers, “high IQs”, paedophiles, poker players, terrorists or white-collar criminals. In its promotional materials, the “terrorist” category is highlighted with a graphic outlining the faces of Arab men supposedly identified as potential terrorists. Within a few months, the company received 625 thousand dollars in investment from Silicon Valley funds.11

In addition to representing a worrying regression to pseudoscientific ideologies and beliefs typical of physiognomy (Bendel, 2018), accepting discourses, business prospects and public policies based on new biometric surveillance practices reframes what Simone Browne calls the “digital epidermalization” of racism. At territorial, legal or digital borders, challenging the marks turned into biometrics “could allow us to critically rethink our contact moments with increasingly technological borders” (Browne, 2010, p. 139).

To let die

One of the main ways in which the damage of colonialism and its necropolitical logic is maintained is its recurrent ordering of life and death, mediated by opaque selectivity or harmful passivity. Literally, “to let die” is an important pillar of this process. It entails a differential valuation of humanities classified by race, gender and nationality around the world. Drawing on Foucault’s writings on biopower, Mbembe claims that:

Such power is defined in relation to a biological field - from which it takes control and in which it is inscribed. This control assumes the distribution of the human species into groups, the subdivision of the population into subgroups, and the establishment of a biological break between ones and others (Mbembe, 2018, p. 17).

The internalisation of this biological break, i.e., racism, spreads into practices and actions, unfortunately including those of groups dedicated to the protection of life and health. Researchers at American universities carried out an important study of commercial algorithms that predict medical care needs, searching for possible biases and discriminatory results across demographic subgroups. They found that millions of Black patients had received risk scores that harmed them in terms of the care and resources they would receive (Obermeyer et al., 2019).

Among patients assigned the same risk score in medical screenings, Black patients were in fact - at alarming rates - much sicker than White patients. When they investigated the origin of this discrepancy, the researchers discovered two biased variables in the data fed to the system. Both were based on the history of resources and expenses directed at patients in the same condition: Black patients, poorer on average, were unable to spend the same amount of money on their own treatments, while physicians and other healthcare professionals often decided to allocate fewer resources to them.

The algorithmic systems analysed in the study took the history of how much had been spent as a reliable indicator of the severity of a medical condition. Implementing automated systems on such premises disregards economic variables on the patients’ side - disadvantages that are largely the result of racism - and disregards discriminatory variables on the professionals’ side: a mostly White workforce that has historically not given equal attention to its patients.
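The mechanism can be made concrete in a few lines of code. The sketch below - written in Python with entirely synthetic data and hypothetical names, as an illustration of the logic reported by Obermeyer et al. (2019), not of the audited system itself - gives two groups identical distributions of illness, but gives one of them reduced historical access to care. When past cost is used as the risk score and the top decile is selected for extra care, the historically under-treated group receives far fewer care slots and must be far sicker to obtain one:

    # Synthetic illustration of proxy-label bias in a care-allocation score.
    # Hypothetical names and data: a sketch of the logic reported by
    # Obermeyer et al. (2019), not the actual system they audited.
    import random

    random.seed(42)

    def make_patient(group):
        # True illness burden follows the same distribution in both groups.
        illness = random.gauss(5.0, 2.0)
        # Group "B" has historically had reduced access to care, so the same
        # illness produces less recorded spending.
        access = 1.0 if group == "A" else 0.7
        cost = max(0.0, illness * 1000.0 * access + random.gauss(0.0, 500.0))
        return {"group": group, "illness": illness, "cost": cost}

    patients = [make_patient(g) for g in ("A", "B") for _ in range(5000)]

    # The "risk score" is past cost - exactly what a regressor trained with
    # cost as its label converges towards. The top decile is flagged for
    # extra care resources.
    threshold = sorted(p["cost"] for p in patients)[int(0.9 * len(patients))]
    flagged = [p for p in patients if p["cost"] >= threshold]

    for g in ("A", "B"):
        slots = [p for p in flagged if p["group"] == g]
        mean_ill = sum(p["illness"] for p in slots) / len(slots) if slots else float("nan")
        print(f"group {g}: {len(slots)} extra-care slots, "
              f"mean illness among flagged = {mean_ill:.2f}")

Running the sketch, group A captures the overwhelming majority of extra-care slots, while the few group-B patients who are flagged are, on average, considerably sicker - the same signature the audit found in the real data.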

There is a growing body of research on how racism leads health professionals to evaluate and care for Black patients in discriminatory ways. The situation is especially serious at the intersection with gender, producing inhumane rates for Black women, above all those who are pregnant or giving birth. Studies have registered differences in access to health resources (Goes and Nascimento, 2013); inequality in the administration of anaesthesia (Leal et al., 2005); and a gap between the care provided to Black and to White babies (Greenwood et al., 2020), among other serious problems.

Therefore, when dealing with attempts to automate health processes and assessments, or to steer them through algorithms, we must take into account the “mediators” that Jurema Werneck described as the professional’s human factor, which includes not only their qualifications but also “their possibilities of favouring or limiting user access to various necessary resources” (Werneck, 2016, p. 544).

In that same study on algorithmic racism in the scoring of patients’ criticality, the metric of previously spent amounts was uncritically applied as a shortcut for assessing patients’ real condition. The algorithmic system thus reproduced, intensified and hid granular racist decisions by physicians working for the health insurers, clinics and hospitals that had provided the data used to train it.

The case points to several facets and variables of artificial “unintelligence”. The most obvious refer to the incompetence, to say the least, of developers who treated the “resources spent” metric as equivalent to “health condition”, and to the inhumane negligence of providers and private hospitals that used the algorithmic system to optimise costs without requiring prior audits - institutions that should be well aware of the discriminatory realities of the public health system.

The case also highlights something else. If the commercialization of algorithmic systems is characterised above all by the attempt to impose opacity on the workflows that sustain them, what can we say about machine learning systems grounded in thousands or millions of data points produced by racist decisions already in progress?

Every time a physician ignored the pain of a Black person, chose a less effective procedure because it was cheaper, or offered care in a discriminatory way, that action directly harmed the patient and was added, as a data point, to the datasets that would enable the scaled automation of racist decisions (Benjamin, 2019). Ironically, the audit performed in the study shed light not only on “artificial unintelligence”, but also on the horrors of decisions made by co-signers of a racial contract (Mills, 2014) in favour of violent whiteness, even before such decisions became data. In a study relevant to this discussion, Sueli Carneiro (2005) analyses the relationship between raciality, morbidity and mortality, one that extends from police stations to hospitals: “The representations of raciality act by impacting morbidity and mortality processes, which makes biopower an operator in the distribution of vitalism and death, always unbalanced towards death’s side for racial groups considered undesirable” (Carneiro, 2005, p. 323).

If in countries like Brazil police officers are fully involved in exercising the power to kill, physicians and healthcare professionals would, in theory, be involved in the preservation of life. However, the data uncovered in cases like the one above remind us that differential access to public services and to quality private services creates new inequities when mediated by algorithmic systems. Algorithms cannot, therefore, be considered neutral, at the risk of adding yet another layer of racialized violence.

Even if some racist decisions are made unconsciously by their perpetrators, their acts result in very real and lethal harm. Paradoxically, those decisions are aggregated as observable and comparable data in a system that reproduces them. Add to that another problem: audits such as the one above are still rare. Only a tiny portion of algorithmic systems is analysed with such breadth of data and level of attention (Epstein et al., 2018).

We cannot, therefore, allow everyday discriminatory actions to be gathered as data to feed machine learning systems before they become a source of data for the collective scrutiny of racist dynamics in society.

Fuentes/Sources

  • BIDDLE, Sam; RIBEIRO, Paulo Victor; DIAS, Tatiana. 2020. “Censura invisível”. The Intercept Brasil. 16 March 2020. Available at: https://theintercept.com/2020/03/16/tiktok-censurourostos-feios-e-favelas-para-atrair-novos-usuarios. [Accessed Oct. 2021]
  • HK EDITION. 2018. “Facial Recognition Tech - HK Can Rise to the Occasion”. China Daily. 27 July 2018. Available at: http://www.chinadaily.com.cn/hkedition/2018-07/27/content_36654608.htm. [Accessed Oct. 2021]
  • INSTITUTO IGARAPÉ. 2021. “Infográfico Reconhecimento Facial no Brasil”. Available at: https://igarape.org.br/infografico-reconhecimento-facial-no-brasil. [Accessed Oct. 2021]
  • KAYSER-BRIL, Nicolas. 2020. “Google Apologizes After Its Vision AI Produced Racist Results”. AlgorithmWatch. 7 April 2020. Available at: https://algorithmwatch.org/en/story/google-vision-racism/. [Accessed Oct. 2021]

Referencias/References

  • ABREU, Tenner Inauhiny de. 2012. “Nascidos no grêmio da sociedade”: racialização e mestiçagem entre os trabalhadores na Província do Amazonas (1850-1889). Dissertação de mestrado, Universidade Federal do Amazonas.
  • ALEXANDER, Michelle. 2012. A nova segregação: racismo e encarceramento em massa. São Paulo: Boitempo.
  • ALMEIDA, Silvio. 2018. Racismo estrutural. Belo Horizonte: Letramento.
  • BENDEL, Oliver. 2018. “The Uncanny Return of Physiognomy”. The 2018 AAAI Spring Symposium Series.
  • BENJAMIN, Ruha. 2019. Race After Technology: Abolitionist Tools for the New Jim Code. Cambridge: Polity Press.
  • BENJAMIN, Ruha. 2019. “Assessing Risk, Automating Racism”. Science. Vol. 366, nº. 6464, p. 421-422.
  • BENJAMIN, Ruha. 2020. “Retomando nosso fôlego: estudos de ciência e tecnologia, teoria racial crítica e a imaginação carcerária”. In: SILVA, Tarcízio (org.). Comunidades, algoritmos e ativismos digitais: olhares afrodiaspóricos. São Paulo: LiteraRUA.
  • BORGES, Juliana. 2019. Encarceramento em massa. São Paulo: Pólen.
  • BROUSSARD, Meredith. 2018. Artificial (Un)intelligence: How Computers Misunderstand the World. Cambridge, MA: The MIT Press.
  • BROWNE, Simone. 2010. “Digital Epidermalization: Race, Identity and Biometrics”. Critical Sociology. Vol. 36, nº. 1, p. 131-150.
  • BROWNE, Simone. 2015. Dark Matters: On the Surveillance of Blackness. London: Duke University Press.
  • CARDOSO, Bruno. 2019. “Estado, tecnologias de segurança e normatividade neoliberal”. In: BRUNO, Fernanda; CARDOSO, Bruno; KANASHIRO, Marta; GUILHON, Luciana; MELGAÇO, Lucas (orgs.). Tecnopolíticas da vigilância: perspectivas da margem. São Paulo: Boitempo.
  • CARNEIRO, Aparecida Sueli. 2005. A construção do outro como não-ser como fundamento do ser. Tese de doutorado, Universidade de São Paulo.
  • DAVIS, Angela. 2018. Estarão as prisões obsoletas? Rio de Janeiro: Difel.
  • DEPARTAMENTO PENITENCIÁRIO NACIONAL. 2017. Levantamento nacional de informações penitenciárias: junho de 2016. Brasília-DF: Governo Federal.
  • DU BOIS, W. E. B. 2018. The Souls of Black Folk. Oxford: Oxford University Press.
  • EPSTEIN, Ziv; PAYNE, Blakeley H.; SHEN, Judy Hanwen; DUBEY, Abhimanyu; FELBO, Bjarke; GROH, Matthew; OBRADOVICH, Nick; CEBRIAN, Manuel; RAHWAN, Iyad. 2018. “Closing the AI Knowledge Gap”. arXiv, nº. 1803.07233.
  • FERREIRA, Ricardo Alexandre. 2011. Crimes em comum: escravidão e liberdade sob a pena do Estado imperial brasileiro (1830-1888). São Paulo: Editora Unesp.
  • FLAUZINA, Ana Pinheiro. 2014. “As fronteiras raciais do genocídio”. Direito.UnB. Vol. 1, nº. 1, p. 119-146.
  • GOES, Emanuelle F.; NASCIMENTO, Enilda R. 2013. “Mulheres negras e brancas e os níveis de acesso aos serviços preventivos de saúde: uma análise sobre as desigualdades”. Saúde em Debate. Vol. 37, nº. 99, p. 571-579.
  • GREENWOOD, Brad N.; HARDEMAN, Rachel R.; HUANG, Laura; SOJOURNER, Aaron. 2020. “Physician-Patient Racial Concordance and Disparities in Birthing Mortality for Newborns”. Proceedings of the National Academy of Sciences. Vol. 117, nº. 35, p. 21194-21200.
  • GROTHER, Patrick; NGAN, Mei; HANAOKA, Kayee. 2019. Face Recognition Vendor Test (FRVT) - Part 3: Demographic Effects. Maryland: National Institute of Standards and Technology.
  • JARDIM, Suzane. 2017. “A reconstrução do mínimo: falsa ordem democrática e extermínio”. In: BUENO, Winnie; PINHEIRO-MACHADO, Rosana; BURIGO, Joanna; SOLANO, Esther (orgs.). Tem saída? Ensaios críticos sobre o Brasil. Porto Alegre: Zouk.
  • LEAL, Maria do Carmo; GAMA, Silvana Granado Nogueira da; CUNHA, Cynthia Braga da. 2005. “Desigualdades raciais, sociodemográficas e na assistência ao pré-natal e ao parto, 1999-2001”. Revista de Saúde Pública. Vol. 39, nº. 1, p. 100-107.
  • LIMA, Adriano B. M. 2009. “Feitiço pega sempre: alforrias e curandeirismo no oeste paulista (século XIX)”. Anais do Encontro Escravidão e Liberdade no Brasil Meridional. Curitiba: Universidade Federal do Paraná.
  • MACHADO, Maria Helena P. T. 2004. “Sendo cativo nas ruas: a escravidão urbana na cidade de São Paulo”. In: PORTA, Paula (org.). História da cidade de São Paulo. São Paulo: Paz e Terra.
  • MALDONADO-TORRES, Nelson. 2018. “Analítica da colonialidade e da decolonialidade - algumas dimensões básicas”. In: BERNARDINO-COSTA, Joaze; MALDONADO-TORRES, Nelson; GROSFOGUEL, Ramón (orgs.). Decolonialidade e pensamento afrodiaspórico. Belo Horizonte: Autêntica.
  • MBEMBE, Achille. 2018. Necropolítica. São Paulo: N-1 Edições.
  • MENEZES, Elisa Matos. 2009. O inimputável: crimes do Estado contra a juventude criminalizada. Monografia de graduação em Antropologia, Universidade de Brasília.
  • MILLS, Charles W. 2014. The Racial Contract. New York: Cornell University Press.
  • NASCIMENTO, Abdias. 1978. O genocídio do negro brasileiro: processo de um racismo mascarado. Rio de Janeiro: Paz e Terra.
  • NASCIMENTO, Beatriz. 2018. Beatriz Nascimento, quilombola e intelectual: possibilidades nos dias de destruição. São Paulo: Filhos da África.
  • NOPPER, Tamara K. 2016. “Strangers to the Economy: Black Work and the Wages of Non-Blackness”. In: SAUCIER, P. Khalil; WOODS, Tryon P. (eds.). Conceptual Aphasia in Black: Displacing Racial Formation. Lanham: Lexington Books.
  • OBERMEYER, Ziad; POWERS, Brian; VOGELI, Christine; MULLAINATHAN, Sendhil. 2019. “Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations”. Science. Vol. 366, nº. 6464, p. 447-453.
  • OLIVEIRA FILHO, Roque F. de. 2009. Crimes e perdões na ordem jurídica colonial: Bahia (1750/1808). Tese de doutorado, Universidade Federal da Bahia.
  • PASQUALE, Frank. 2018. “When Machine Learning is Facially Invalid”. Communications of the ACM. Vol. 61, nº. 9, p. 25-27.
  • PIRES, Thula Rafaela de Oliveira. 2015. Colorindo memórias e redefinindo olhares: ditadura militar e racismo no Rio de Janeiro. Relatório da Comissão da Verdade do Rio. Rio de Janeiro: Comissão Nacional da Verdade.
  • RIBEIRO JUNIOR, Antônio Carlos. 2016. “As drogas, os inimigos e a necropolítica”. Cadernos do CEAS. nº. 238, p. 596-610.
  • SILVA, Rosane Leal; SILVA, Fernanda dos Santos Rodrigues. 2019. “Reconhecimento facial e segurança pública: os perigos da tecnologia no sistema penal seletivo brasileiro”. Anais do Congresso Internacional de Direito e Contemporaneidade: mídias e direitos da sociedade em rede. Santa Maria: Universidade Federal de Santa Maria.
  • SILVA, Wellington Barbosa da. 2008. “Burlando a vigilância: repressão policial e resistência negra no Recife no século XIX (1830-1850)”. Revista África e Africanidades. Ano 1, nº. 1, p. 1-18.
  • SINHORETTO, Jacqueline; SILVESTRE, Giane; SCHLITTLER, Maria Carolina. 2014. Desigualdade racial e segurança pública em São Paulo: letalidade policial e prisões em flagrante. Relatório de pesquisa. São Carlos: Departamento de Sociologia da Universidade Federal de São Carlos.
  • WERNECK, Jurema. 2016. “Racismo institucional e saúde da população negra”. Saúde e Sociedade. Vol. 25, nº. 3, p. 525-549.
  • WU, Xiaolin; ZHANG, Xi. 2016. “Responses to Critiques on Machine Learning of Criminality Perceptions”. arXiv, nº. 1611.04135v3.
  • 7
    Translated from Portuguese from Silva, Tarcízio. Racismo algorítmico: inteligência artificial e discriminação nas redes digitais. São Paulo: Edições Sesc, 2022. Chapter 4 (n/p). The organisers of this dossier wish to thank the author and the publisher for their generous permission to publish this version.
  • 8
    See the infographic produced by Instituto Igarapé (2021) at https://igarape.org.br/infografico-reconhecimento-facial-no-brasil/
  • 9
    In Portuguese, “kit flagrante”. It refers to a quantity of illicit drugs carried by corrupt police officers in order to incriminate citizens by claiming it was in their possession.
  • 10
    Translator’s Note: favelas in Rio de Janeiro are typically set on hillsides.
  • 11
    Data provided by Crunchbase. For more: https://www.crunchbase.com/organization/faception/company_financials
  • Translation

    Isabel Hardgrave
  • Technical revision

    Horacio F. Sívori

Publication Dates

  • Publication in this collection
    04 Dec 2023
  • Date of issue
    2023

History

  • Received
    12 Oct 2022
  • Accepted
    16 Sept 2023