Malaika Norman

Racial Bias in Facial Recognition Technology

Updated: Nov 9, 2021

Racial bias in facial recognition technology (FRT) is producing discriminatory policies and practices because of the technology's direct link to the photography industry and that industry's history of racial bias against Black people and people of color.

FRT uses cameras to capture images of an individual’s face in order to identify them accurately. This means it needs to be demographically sensitive; however, that is not the reality at most FR software companies. The same technology that unlocks our phones, protects our bank details, and is used by governments and law enforcement agencies around the world should leave no room for racial bias.


Uber (in the UK) is currently being sued over its “racist” facial recognition algorithm, which cannot accurately identify Black people and people of color. When drivers of color attempt to log into the app to work, it locks them out because it cannot identify them. According to the Independent, “The Independent Workers’ Union of Great Britain (IWGB) is bringing the action on behalf of an anonymous driver who says they were locked out so many times that their account was terminated, claiming indirect racial discrimination”.


To understand the systemic impact racism has had on FRT in modern society, and the companies working to solve this problem, it is important to understand the technology's origin: the camera.



The first partially successful photograph from a camera was made in approximately 1826 by Nicéphore Niépce, in an era when racism was rampant and widely accepted, meaning that diversity and inclusion were hardly anyone's priority. As a result, the camera was developed and marketed with only white people in mind, producing images that would accurately capture them.


Because this is the origin and basis of photography, that bias has trickled down into even the most modern imaging technology today, and Uber is only one of many examples of this bias and its consequences.


Artificial intelligence learns from the data it is fed, but diversified data alone is not the solution. The solution is an algorithm that can recognize and identify people of all skin tones accurately, and evidence that it actually does.
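
To make that concrete, here is one way an auditor might check a system for exactly this kind of bias: compute the false non-match rate (how often genuine users are wrongly rejected, as Uber's drivers were) separately for each demographic group. This is a minimal, hypothetical sketch in Python, not any vendor's actual evaluation code; the function and audit data are illustrative assumptions.

from collections import defaultdict

def false_non_match_rates(results):
    # results: (group, matched) pairs for genuine verification attempts,
    # where matched is True when the system recognized the right person.
    attempts = defaultdict(int)
    failures = defaultdict(int)
    for group, matched in results:
        attempts[group] += 1
        if not matched:
            failures[group] += 1
    return {group: failures[group] / attempts[group] for group in attempts}

# Hypothetical audit data: each tuple is one genuine login attempt.
audit = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]
print(false_non_match_rates(audit))  # {'group_a': 0.33, 'group_b': 0.67}

A fair algorithm should show roughly equal rates across groups; a large gap between groups is precisely the disparity described above.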


Up-and-coming FR companies are aware of the clear racial bias within the industry, and as FRT is implemented in more areas of day-to-day life, government, and law enforcement, solving it becomes increasingly important.


Facedapter is one of the few companies working to solve the issue of racial bias in FRT while maintaining its speed, quality, and security.


Using different types of cameras, Facedapter cross-matches the images captured by each camera to ensure accuracy. This reduces the problem of distinguishing people who look alike and enables recognition even in the dark. Its single software platform can be paired with any camera type, including 2D, 3D, near-infrared, shortwave-infrared, and thermal cameras.
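
As a rough illustration of what cross-matching across cameras can look like in general (a minimal sketch of a common technique, not Facedapter's patented method; the modality names and weights are assumptions), one standard approach is to fuse the match scores produced by each camera:

# Illustrative score-level fusion across hypothetical camera modalities.
MODALITY_WEIGHTS = {"2d": 0.4, "near_infrared": 0.3, "thermal": 0.3}

def fused_match_score(scores):
    # scores: {modality: match score in [0, 1]} for the cameras present.
    # Weights are renormalized over the modalities actually available.
    total_weight = sum(MODALITY_WEIGHTS[m] for m in scores)
    return sum(MODALITY_WEIGHTS[m] * s for m, s in scores.items()) / total_weight

# A confident thermal match can compensate for a poor 2D image,
# which is how recognition can still work in the dark.
print(fused_match_score({"2d": 0.35, "thermal": 0.92}))  # ~0.59

Fusing evidence from several sensors means no single camera's weakness, such as poor low-light performance or a training bias in one modality, decides the outcome on its own.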


The first of its kind in cross-matching facial recognition, the software uses patented biometric authentication to bridge the gap between sensing technologies and everyday forms of identification: a single piece of software that matches infrared, 3D, or thermal facial images to pre-existing 2D images, such as government-issued IDs and passports.
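
The standard pattern for this kind of cross-modal matching (again a hedged sketch of the general technique, not the patented implementation; the embeddings and threshold are made up) is to map every image, whatever the sensor, into a shared embedding space and compare vectors:

import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norms

def verify(probe, reference, threshold=0.6):
    # Accept when the live capture and the 2D ID photo are close enough
    # in the shared space. The threshold is illustrative only.
    return cosine_similarity(probe, reference) >= threshold

# Toy vectors standing in for the outputs of modality-specific encoders
# trained to map into the same space.
thermal_probe = [0.20, 0.90, 0.40]   # hypothetical thermal capture
passport_photo = [0.25, 0.85, 0.45]  # hypothetical 2D passport photo
print(verify(thermal_probe, passport_photo))  # True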


As FRT is already being implemented in airports as a form of self-identification, governments and regulatory bodies need to create frameworks and policies that prevent facial recognition algorithms from perpetuating racial bias, so that this kind of discrimination does not happen again.

