While inclusion and accuracy are worthy goals in the abstract, given the encoding of long-standing racism in discriminatory design, what does it mean to be included, and hence more accurately identifiable, in an unjust set of social relations? Innocence and criminality are not objective states of being that can be detected by an algorithm but are created through the interaction of institutions and individuals against the backdrop of a deeply racialized history, in which Blackness is coded as criminal. Inclusion in this context is more akin to possession, as in Fanon’s plea that the “tool never possess the man,” where possession alerts us to the way freedom is constrained.
Consider a population-wide facial recognition program in which the Zimbabwean government has contracted a China-based company to track millions of Zimbabwean citizens, making the Chinese database more comprehensive by “more clearly identify[ing] different ethnicities.” The benefit for Zimbabwe is access to a suite of technologies that can be used by law enforcement and other public agencies, while the arrangement positions China to become “the world leader in artificial intelligence.”63 Transnational algorithmic diversity training par excellence! Perhaps. Or, better, neocolonial extraction for the digital age, in which the people whose faces populate the database have no rights vis-à-vis the data or the systems built with their biometric input. Not only that: since the biggest application of facial recognition is in law enforcement and immigration control, Zimbabwe is helping Chinese officials become more adept at criminalizing Black people within China and across the African diaspora.