Self-Driving Cars Are More Likely To Run You Over If You Are Black, Study Suggests

acri1



Congresswoman Alexandria Ocasio-Cortez took a lot of criticism in January for suggesting that algorithms could have biases.

“Algorithms are still made by human beings, and those algorithms are still pegged to basic human assumptions,” she said at the annual MLK Now event. “They’re just automated assumptions. And if you don’t fix the bias, then you are just automating the bias.”

Of course, she is correct. There are numerous examples of this in consumer technologies, from facial recognition tech that doesn't recognize non-white skin tones, to cameras that tell Asian people to stop blinking, to racist soap dispensers that won't give you soap if you're black.

Things get especially alarming when this tech is scaled up beyond soap dispensers and mobile phones, which brings us to the latest example: it appears that self-driving cars could have a racism problem too.

A new study from the Georgia Institute of Technology has found that self-driving vehicles may be more likely to run you over if you are black. The researchers found that, just like the soap dispenser, systems like those used by automated cars are worse at spotting darker skin tones.

According to the team's paper, which is available to read on arXiv, they were motivated by the "many recent examples of [machine learning] and vision systems displaying higher error rates for certain demographic groups than others." They point out that a "few autonomous vehicle systems already on the road have shown an inability to entirely mitigate risks of pedestrian fatalities," and recognizing pedestrians is key to avoiding those deaths.

They collected a large set of photographs showing pedestrians of various skin tones (classified using the Fitzpatrick scale) under a variety of lighting conditions, and fed them into eight different image-recognition systems. The team then analyzed how often the machine-learning systems correctly identified the presence of people across all skin tones.
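In rough outline, that analysis boils down to computing a detection rate per skin-tone group. Here is a minimal Python sketch of the idea, assuming a dataset where each pedestrian instance carries a Fitzpatrick annotation and a detected/missed outcome; the function name, the field names, and the light/dark binning (types I-III vs. IV-VI) are illustrative, not the paper's actual code:

```python
from collections import defaultdict

def recall_by_group(instances):
    # instances: iterable of dicts like {"fitzpatrick": 5, "detected": True}
    hits = defaultdict(int)
    totals = defaultdict(int)
    for inst in instances:
        group = "light" if inst["fitzpatrick"] <= 3 else "dark"
        totals[group] += 1
        hits[group] += int(inst["detected"])
    # Fraction of pedestrians each group that the model actually found.
    return {g: hits[g] / totals[g] for g in totals}

# Made-up numbers for illustration (NOT the study's data):
sample = (
    [{"fitzpatrick": 2, "detected": True}] * 90
    + [{"fitzpatrick": 2, "detected": False}] * 10
    + [{"fitzpatrick": 5, "detected": True}] * 85
    + [{"fitzpatrick": 5, "detected": False}] * 15
)
print(recall_by_group(sample))  # {'light': 0.9, 'dark': 0.85}
```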

They found a bias within the systems: an automated vehicle relying on them would be less likely to spot someone with a darker skin tone, and so more likely to carry on driving into them. On average, the systems were 5 percent less accurate at detecting people with darker skin tones. This held true even after taking into account the time of day and partial obstruction of the view of the pedestrians.
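That "taking into account" step can be pictured as a stratified comparison: compute the light/dark gap separately within each (time of day, occlusion) bucket, then average those per-bucket gaps, so that neither confounder can drive the result on its own. A hypothetical sketch in the same vein as the one above, with made-up field names rather than the study's actual methodology:

```python
from collections import defaultdict

def stratified_gap(instances):
    # Tally detections/totals per (time_of_day, occluded) stratum and group.
    strata = defaultdict(lambda: {"light": [0, 0], "dark": [0, 0]})
    for inst in instances:
        key = (inst["time_of_day"], inst["occluded"])
        group = "light" if inst["fitzpatrick"] <= 3 else "dark"
        cell = strata[key][group]
        cell[0] += int(inst["detected"])  # detections
        cell[1] += 1                      # pedestrians
    # Average the light-minus-dark recall gap across comparable strata.
    gaps = []
    for cell in strata.values():
        (lh, lt), (dh, dt) = cell["light"], cell["dark"]
        if lt and dt:  # only strata where both groups appear
            gaps.append(lh / lt - dh / dt)
    return sum(gaps) / len(gaps) if gaps else None
```

A positive result would mean lighter-skinned pedestrians are detected more often even with lighting and occlusion held comparable.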



The study did have limits; it used models created by academics rather than the car manufacturers themselves, but it's still useful in flagging up a recurring problem to tech companies, one that could be solved by simply including a wide and accurate variety of humans when rigorously testing new products.
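If manufacturers did test that way, the check could even be automated: fail the model's test suite whenever per-group detection rates drift apart. A hypothetical example building on the sketch above (load_labeled_validation_set and the two-point tolerance are made up):

```python
def test_detection_parity():
    # recall_by_group as sketched earlier; the dataset loader is hypothetical.
    rates = recall_by_group(load_labeled_validation_set())
    assert max(rates.values()) - min(rates.values()) < 0.02, rates
```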

After all, it's not just skin tones that algorithms can be biased against. Voice recognition systems seem to struggle more with women's voices than with men's, and women are 47 percent more likely to sustain an injury while wearing a seat belt because car safety is mostly designed with men in mind.

"We hope this study provides compelling evidence of the real problem that may arise if this source of capture bias is not considered before deploying these sort of recognition models," the authors concluded in the study.

Fingers crossed that Tesla and Google are feeding their machine-learning algorithms more varied skin-tone data than the academic models were given; otherwise we could soon face a situation where an AI is physically able to kill you, and is more likely to do so if you are not white.


Self-Driving Cars Are More Likely To Run You Over If You Are Black, Study Suggests

I'm all for science and technology but this is :huhldup:
 

GunRanger

"The study did have limits; it used models created by academics rather than the car manufacturers themselves, "

:beli:


Call me when Teslas are killing people more than human drivers do
 

mc_brew

not shocking... this is a problem that won't ever go away either, since white people don't even accept the idea that bigotry (except against them) exists....
 

ORDER_66

:snoop: the problem is not the fukking machines.... it's the person who creates the shyt...:stopitslime:

how hard is it to have a computer program stop for ANY large object regardless of skin tone... why is skin tone even necessary??? :what: what happens when the car is out in the country, is it gonna hit a huge ass bear because it's not PROGRAMMED to see black or dark objects???:hhh:
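For what it's worth, with a range sensor like lidar or radar that braking decision really can be appearance-blind, since it keys off distance and speed alone; the bias in the study lives one layer earlier, in the camera-based recognition models that decide whether an object is a pedestrian at all. A toy sketch (made-up function, numbers, and thresholds):

```python
def should_emergency_brake(obstacle_distance_m: float,
                           speed_mps: float,
                           decel_mps2: float = 6.0,
                           margin_m: float = 2.0) -> bool:
    # Brake if the obstacle sits inside our stopping distance plus a margin.
    # Nothing here depends on what the obstacle looks like.
    stopping_distance = speed_mps ** 2 / (2 * decel_mps2)
    return obstacle_distance_m <= stopping_distance + margin_m

# At 15 m/s (~54 km/h): 225 / 12 = 18.75 m to stop, so brake at 20 m out.
print(should_emergency_brake(obstacle_distance_m=20.0, speed_mps=15.0))  # True
```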
 

Professor Emeritus

Congresswoman Alexandria Ocasio-Cortez took a lot of criticism in January for suggesting that algorithms could have biases.

You'd have to be either ignorant or a straight liar if you're giving someone heat for claiming algorithms have biases.

The biases in algorithms are a known, studied part of the field. There's a popular-level book, "Weapons of Math Destruction," written by someone straight out of the industry, that covers a lot of the issues pretty well. Biases in algorithms can be even worse than human bias, because there's no opportunity to challenge the algorithm or get it to "change its mind," and the algorithm's creators can just claim "we didn't do it, the algorithm did" and wash their hands clean.
 