Colour-blind tech

Jörn Dunker - 17th Jun, 2020

The issues around higher arrest and stop-and-search figures for people of colour, both here in the UK and in the US, have put technology under the spotlight.

Surveillance systems are being produced and sold to councils, police forces and shops, but serious flaws have been revealed that can produce biased results. Face-scanning systems are reputedly trained and tested on photos of mostly white, male faces. As a result, the software becomes noticeably less reliable when it is presented with a woman or anyone with darker skin.
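To see how this plays out, here is a minimal sketch in Python, using invented synthetic data and a deliberately simple classifier rather than any vendor's real system: when one group dominates the training set, a single headline accuracy figure hides how much worse the model does on the under-represented group.

# Illustrative sketch only: hypothetical groups A and B, invented numbers.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Two classes per group; the true decision boundary differs by group.
    X = rng.normal(shift, 1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    return X, y

# Group A dominates the training data (95% of examples); group B is rare.
Xa, ya = make_group(950, shift=0.0)
Xb, yb = make_group(50, shift=1.5)
X_train = np.vstack([Xa, Xb])
y_train = np.concatenate([ya, yb])

model = LogisticRegression().fit(X_train, y_train)

# Evaluate on balanced held-out sets, disaggregated by group rather than
# averaged into one blended number.
for name, shift in [("group A", 0.0), ("group B", 1.5)]:
    X_test, y_test = make_group(500, shift)
    print(name, accuracy_score(y_test, model.predict(X_test)))

Reporting accuracy per group rather than one blended number is what makes this kind of gap visible in the first place, and it is exactly the approach the audits below took.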

Amazon’s facial recognition software Rekognition was less reliable at identifying gender if the person was darker-skinned or female. Researchers from the Georgia Institute of Technology found that self-driving car systems were better at detecting light-skinned pedestrians, raising the risk of accidents where the cars fail to identify black people.

In 2018, MIT researchers found that facial recognition systems were wrong in roughly a third of cases involving black women but were almost 100% accurate when identifying white males. Again, this was attributed to the systems being trained largely on pictures of white men. OK, tech can't change the world, but SURELY it can fix this?
