Self-Driving Cars And Racist Technology

Like many, I’m waiting for self-driving cars, but I’m also increasingly concerned about how safe they will be. Now there’s another issue. The technology meant to address those safety issues looks to have been programmed with a racial bias. It identifies white faces readily, but the darker a person’s skin, the harder it is for the machine to recognize them as a person, or, in the case of self-driving cars, as a pedestrian. Researchers from Georgia Tech found that these detection systems consistently performed worse at recognizing people with darker skin tones. And it’s not only self-driving cars: Google’s image-recognition AI couldn’t recognize Black people, and couldn’t tell the difference between them and a dark ape. The researchers called such findings alarming, and I hope you will too. There are apparently radar systems that can detect pedestrians regardless of skin tone, but they are costly, and including them would make the cars much more expensive.
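To make the disparity concrete: the Georgia Tech researchers essentially compared how often detectors found pedestrians in images grouped by skin tone. Below is a minimal sketch of that kind of per-group comparison; the data and field names are purely hypothetical, not taken from the study.

```python
# Hypothetical sketch: comparing pedestrian-detection rates by skin-tone group.
# The records below are illustrative only, not data from the Georgia Tech study.
from collections import defaultdict

# Each record: (skin_tone_group, was_the_pedestrian_detected)
results = [
    ("lighter", True), ("lighter", True), ("lighter", True), ("lighter", False),
    ("darker", True), ("darker", False), ("darker", False), ("darker", True),
]

totals = defaultdict(int)   # images per group
hits = defaultdict(int)     # successful detections per group

for group, detected in results:
    totals[group] += 1
    if detected:
        hits[group] += 1

for group in sorted(totals):
    print(f"{group}: detection rate {hits[group] / totals[group]:.0%}")
# A consistent gap between the two groups' rates is the kind of
# disparity the researchers reported.
```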

It seems to me that since these machines were programmed by humans, and since the algorithms they run on were devised by humans, the time has come to change the algorithms. That should be the responsibility of the researchers who erred in the first place by letting their own view of race shape the technology. So my message to the companies developing AI for self-driving cars is this: correct the racial biases the original engineers programmed in before you even think about cost.

Implicit Bias

Implicit bias refers to beliefs that unconsciously drive decisions and behavior. Such biases obviously become part of what lies behind racism. As far as the judicial system goes, racist behavior has been studied in juries, judges, and prosecutors, the people who put defendants away. Now there is growing awareness that this extends to public defenders as well: they apparently spend less time with defendants of color. Implicit bias is exacerbated by stress, exhaustion, and speed, three things that weigh heavily on public defenders. And it’s not only the amount of time a public defender may spend with a defendant; implicit bias can affect … Continue reading “Implicit Bias”