NYPD's Big Artificial-Intelligence Reveal

ORDER_66

I dont care anymore 2026
Joined
Feb 2, 2014
Messages
148,670
Reputation
16,745
Daps
590,364
Reppin
Queens,NY
I agree that this is an inherent bias in data submission to worry about. I guess as someone with a degree in Computer Engineering who has dealt with this stuff at least at an academic level, I might have a slight favoritism bias towards it. But at the same time, I'm speaking as someone who can envision working on such software myself. I can put myself in the shoes of someone creating this, and I can't imagine that the first thought for someone working on it wouldn't be how to minimize human biases as much as possible in data submission. It seems like such a "first thought" concept when approaching this problem.

this won't end well.... because the person who creates this is inherently biased...:beli: all these shows and movies about the dire warnings of A.I. and people are still gung ho to make them work... Person of Interest was a great show tackling this very idea of using an A.I. for law enforcement....The creator of the machine was a decent person, but he had to teach the machine to VALUE the importance of human life, not be biased, not be selfish...Problem is the NYPD is under a mandate to arrest people for crimes... and their perception is that black and poor neighborhoods are where crime is supposedly high... they'll deploy this A.I. against us...Not everyone.
 

Dr. Acula

Hail Hydra
Supporter
Joined
Jul 26, 2012
Messages
26,262
Reputation
8,928
Daps
139,884
Breh I'm black and not naive to the issues, and I understand the concerns. But I guess having personal experience with the subject matter has either informed me better or made me more biased about the matter.

A human-created system is still subject to the errors of humans, but like I said, just from my academic career, there is a genuine effort from a design standpoint to actively control those biases as much as possible.

This is stated by the NYPD itself, so take it with a grain of salt, but it's mentioned in the article:

New York police point out that the software was designed to exclude race and gender from its algorithm. Based on internal testing, the NYPD told Fast Company, the software is no more likely to generate links to crimes committed by persons of a specific race than a random sampling of police reports.
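The NYPD hasn't published its test methodology, so purely as an illustration: a check like the one they describe might compare how often each group shows up in the software's generated links versus a random sample of police reports. All names here are hypothetical.

```python
from collections import Counter

def rate_by_group(labels):
    """Share of records per group label, e.g. {"A": 0.5, "B": 0.5}."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {g: n / total for g, n in counts.items()}

def max_rate_gap(flagged, baseline):
    """Largest absolute difference in group share between records the
    software linked to crimes and a random sample of police reports.
    A gap near zero is the 'no more likely than random' claim."""
    f, b = rate_by_group(flagged), rate_by_group(baseline)
    groups = set(f) | set(b)
    return max(abs(f.get(g, 0.0) - b.get(g, 0.0)) for g in groups)
```

A real audit would need statistical significance testing on top of a raw gap like this, which is exactly why outside transparency matters more than internal numbers.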

If you asked me to create software that is supposed to unemotionally examine data and give you a statistical probability based on that data, my mind is going to switch into a mode of intense pushback against making it possible for submissions to include any descriptive details about a criminal; I would only want the data related to the crime committed. If you asked me to allow the input of race, name, height, etc. after it's known that eyewitness testimony is not the best, then I'm vehemently saying no. But not everyone is me, so I dunno.
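That kind of pushback can be baked into the submission layer itself. A minimal sketch, with made-up field names (nothing here reflects the NYPD's actual schema):

```python
# Whitelist crime-scene facts; hard-reject offender-descriptive fields.
ALLOWED_FIELDS = {"crime_type", "location", "time_of_day", "weapon", "entry_method"}
BLOCKED_FIELDS = {"race", "gender", "name", "height", "age"}

def validate_submission(record: dict) -> dict:
    """Return only whitelisted fields; refuse any record that tries to
    submit descriptive details about a suspect."""
    present_blocked = BLOCKED_FIELDS & record.keys()
    if present_blocked:
        raise ValueError(f"descriptive fields not allowed: {sorted(present_blocked)}")
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
```

The design choice is the point: a whitelist plus a hard error makes excluding race/gender a structural property of the system rather than a policy someone has to remember to follow.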

Like I said, I think instead of outright rejection of the idea, pushing for full transparency of the software should be a priority.
 

ORDER_66


Do YOU believe them to stay true to their word???:what:
 

Dr. Acula

:stopitslime: I feel like you only focused on one part of my whole post. I said to take it with a grain of salt. The solution to your concerns is the constant transparency of the system, which I stated.
 

Sonic Boom of the South

Louisiana, Army 2 War Vet, Jackson State Univ Alum
Supporter
Joined
May 1, 2012
Messages
82,113
Reputation
24,391
Daps
297,588
Reppin
Rosenbreg's, Rosenberg's...1825, Tulane
It can also help catch serial killers who have a signature pattern of habits in their crimes. You don't need this technology to racially profile people, as we see it's done without computer assistance already.

In fact, I would argue racial profiling is an issue of human biases and prejudices. A computer, if the only information it has is a dataset of crimes that have been committed, with probability used to determine the likelihood of who the offender is, at least won't consider personal racial prejudices in its determination. The more you remove it from a human, the more you also remove it from human error and biases.

The rights group quoted in the article calls for transparency of the software to ensure racial prejudices are not present. I agree with them on that. The details of crimes submitted should rely only on details of the crimes, and the design should ensure only objective facts about the case are considered. Anyone with a slightly scientific mind and an idea of how to approach data analysis would already consider that, which includes the people most likely to create such software in the first place.
U typed a whole bunch of conflicting nothing
 

Cadillac

Veteran
Joined
Oct 17, 2015
Messages
42,426
Reputation
6,306
Daps
140,273
Some of y'all said what I was thinking. Incoming: some more profiling
 

BushidoBrown

Superstar
Joined
Oct 30, 2015
Messages
7,357
Reputation
1,563
Daps
20,041
Reppin
Brooklyn
If this was the only thing being used to convict someone, I would see the issue. But not many prosecutors are going to push forward on charging someone and possibly lose a case on the "likelihood" you committed the crime based on what a computer says.

Also, like I said, it's not a revolutionary idea in the first place. If you've taken college probability, you could implement the same system.
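The "college probability" point above can be sketched in a few lines: pattern linkage is basically a weighted similarity score between crime records. The field names and weights below are invented for illustration, not the NYPD's actual model.

```python
# Hypothetical weights: how much each matching attribute counts
# toward two crimes being part of the same pattern.
WEIGHTS = {"crime_type": 3, "entry_method": 2, "time_of_day": 1, "premise": 1}

def similarity(c1: dict, c2: dict, weights: dict = WEIGHTS) -> float:
    """Weighted share of matching attributes between two crime records,
    in [0, 1]; higher means more likely to belong to the same pattern."""
    total = sum(weights.values())
    matched = sum(w for k, w in weights.items() if c1.get(k) == c2.get(k))
    return matched / total
```

A production system would learn the weights from historical linked cases rather than hand-picking them, but the core idea really is undergraduate-level probability and counting.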
u clearly have a well informed technical background in this stuff based off ur replies in this thread :hubie:

still some very :mjpls: elements about the whole project
 

stave

Superstar
Joined
Jul 18, 2017
Messages
6,171
Reputation
1,811
Daps
22,630
Minority Report realities inching closer by the day

no bias... it's just software :mjpls:

and it's been shared so people can make their own tweaks :mjpls:


imagine if it really was unbiased and showed the actual number of cac crimes being committed and left unpursued :sas1:


it would be by mistake of course, the software is too smart to be biased. It learns on its own who and where the bad guys are :mjpls:

because Americans aren't ready to admit who the biggest enemy here is :sas2:
 

ORDER_66


Yep, a lot of white neo-Nazi chatter the NSA be intercepting, but they're only focused on BLACK IDENTITY EXTREMISTS... :mjpls:
 