Even when used ethically, facial recognition technology carries a whole host of personal privacy concerns. Ethnic recognition, on the other hand, is an entirely different beast. With few (if any) legitimate uses and countless opportunities for misuse, ethnic recognition tools have the potential to automate racial profiling on a large scale. But Moscow-based tech company NtechLab seems poised to roll the technology out anyway. A post on the company’s website claims that one of its next products will be able to “recogniz[e] a person’s ethnicity.”
NtechLab’s facial recognition software powers Moscow’s 170,000-strong network of surveillance cameras and the eerily accurate app FindFace, which reportedly could identify random strangers with 70 percent accuracy back in 2017. The team of Russian artificial intelligence researchers behind NtechLab has won some of the most coveted awards in the facial recognition community, and the program’s accuracy has even been independently verified (not to mention lauded) by the U.S. Department of Commerce and the University of Washington.
Facial recognition is one thing, but ethnic recognition appears to have no meaningful use aside from amplifying racism and powering prejudice on a mass scale. The prospect of such a technology raises privacy concerns well beyond Russian borders: any product that automates racial classification inherently invites misuse, and no apparent practical application comes close to justifying those risks.
“[W]hen you think about how this could be misused, imagine any kind of ethnic recognition laid over CCTV cameras,” Caroline Sinders, a machine learning designer and Harvard Kennedy School affiliate, told The Outline. “How many black bodies will be more surveilled now? How many people will be accused of committing crimes that they're not committing because they looked suspicious or were loitering? How many Arab men for example will be tracked and followed in public areas because they potentially look like a terrorist?”
NtechLab’s description of the product doesn’t address any of these concerns, of course. Instead, it merely touches on the vague, PR-friendly benefits of the service at large: “Facial recognition algorithms analyze photographic images for the characteristics visible to an untrained eye. We can determine with good accuracy age, gender and certain emotions, as well as if the person is wearing sunglasses or a mustache. Those classifiers may have practical applications in retail, healthcare, entertainment and other industries by delivering accurate and timely demographic data to enhance the quality of service. Thus the algorithm has the ability to confirm the accuracy of manually entered demographic records, or to narrow the database search.”
NtechLab did not immediately return The Outline’s request for comment, though we’ll update this post if we hear back.