By J. Wylie Donald
A coded gaze is a coded gaze is a coded gaze. With apologies to Gertrude Stein ("Rose is a rose is a rose is a rose"), she never bought insurance for artificial intelligence applications. Specifically, she never had to concern herself with claims of racial discrimination arising from the adoption of facial recognition technology.
What are we talking about here? Is it really possible that surveillance cameras, scanning technologies, or security protocols could lead to a race discrimination claim? It is just software, right? As one metropolitan police department's website claimed, facial recognition simply "does not see race." Ones and zeros, right? The camera takes a picture, the software finds a match. No directions, no commentary, no words at all.