
(ThreatPost)   Amazon's facial recognition "misidentified" more than 100 photos of US and UK lawmakers as criminals. Sure Jan, misidentified   (threatpost.com)
    More: Amusing, Facial recognition system, Facial recognition technology, use of facial recognition technology, Police, Amazon's face recognition platform, police departments, Paul Bischoff, facial recognition platform  

1187 clicks; posted to Main » on 30 Jun 2020 at 3:50 PM (2 weeks ago)



46 Comments
 
 
2020-06-30 9:00:53 AM  
There are no bugs in software, only features.
 
2020-06-30 9:20:08 AM  
[Fark user image]

Wanted for High Treason and generally being an asshat.
 
2020-06-30 3:51:34 PM  
[Fark user image]
 
2020-06-30 3:56:01 PM  
typo.

fecal recognition
 
2020-06-30 3:57:35 PM  
It should say that it misidentified all the rest as not being criminals
 
2020-06-30 3:58:44 PM  

MrBallou: [Fark user image image 185x273]
Wanted for High Treason and generally being an asshat.


Wanted for rape:

[Fark user image]



Wanted for destruction of government records:

[upload.wikimedia.org image]


Silver alert, suffering from dementia and missing from basement:

[api.time.com image]
 
2020-06-30 4:01:23 PM  
[YouTube: Facial Recognition: Last Week Tonight with John Oliver (HBO), video jZjmlJPJgug]
 
2020-06-30 4:01:27 PM  

gar1013: MrBallou: [Fark user image image 185x273]
Wanted for High Treason and generally being an asshat.

Wanted for rape:

[Fark user image 196x257]


Wanted for destruction of government records:

[upload.wikimedia.org image 850x1260]

Silver alert, suffering from dementia and missing from basement:

[api.time.com image 850x599]


And, of course, the Trumpers had to show up. Stick to the Pol tab, dude.
 
2020-06-30 4:01:46 PM  
False positives are the key bug in all neural net based software.

Including the wet one in your head.
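
A minimal sketch of why that's inherent (all numbers and distributions below are invented for illustration; this is not Rekognition's actual scoring): a neural-net face matcher ultimately thresholds a similarity score, and any threshold loose enough to admit real matches also admits a fraction of non-matches, the false positives.

    # Illustrative only: fabricated similarity scores for pairs of
    # *different* people, thresholded the way a face matcher would be.
    import numpy as np

    rng = np.random.default_rng(0)
    non_match_scores = rng.normal(loc=0.45, scale=0.15, size=10_000)

    threshold = 0.80  # hypothetical confidence cutoff, not Amazon's actual default
    false_positive_rate = np.mean(non_match_scores >= threshold)
    print(f"False positive rate at {threshold:.0%}: {false_positive_rate:.2%}")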
 
2020-06-30 4:02:09 PM  
[Fark user image]
 
2020-06-30 4:05:06 PM  
Algorithms can't be biased. They can be ineffective on certain skin tones, or face types, or badly programmed, but they, literally, cannot be racially biased.
 
2020-06-30 4:05:13 PM  
They will work the bugs out and the cameras will be everywhere.
 
2020-06-30 4:06:30 PM  
[pics.me.me image]
 
2020-06-30 4:08:39 PM  

gar1013: MrBallou: [Fark user image image 185x273]
Wanted for High Treason and generally being an asshat.

Wanted for rape:

[Fark user image 196x257]


Wanted for destruction of government records:

[upload.wikimedia.org image 850x1260]

Silver alert, suffering from dementia and missing from basement:

[api.time.com image 850x599]


*Activate Bullwinkle voice*
"No one ever believes my bullshiat, but this time for sure!!!"
*Deactivate Bullwinkle voice*
 
2020-06-30 4:10:03 PM  
Are you sure that is misidentified? I mean, at least 51 Senators allowed a man who is doing nothing about bounties on US soldiers to escape impeachment. If anything I'd say it was being conservative in its estimate, especially when you factor in the people in the House who voted to let him off the hook.
 
2020-06-30 4:11:21 PM  

ClavellBCMI: gar1013: MrBallou: [Fark user image image 185x273]
Wanted for High Treason and generally being an asshat.

Wanted for rape:

[Fark user image 196x257]


Wanted for destruction of government records:

[upload.wikimedia.org image 850x1260]

Silver alert, suffering from dementia and missing from basement:

[api.time.com image 850x599]

And, of course, the Trumpers had to show up. Stick to the Pol tab, dude.


I missed where you complained about the other politicians depicted.

Oh yeah, because you are a rape apologist.
 
2020-06-30 4:11:53 PM  

Albert911emt: gar1013: MrBallou: [Fark user image image 185x273]
Wanted for High Treason and generally being an asshat.

Wanted for rape:

[Fark user image 196x257]


Wanted for destruction of government records:

[upload.wikimedia.org image 850x1260]

Silver alert, suffering from dementia and missing from basement:

[api.time.com image 850x599]

*Activate Bullwinkle voice*
"No one ever believes my bullshiat, but this time for sure!!!"
*Deactivate Bullwinkle voice*


Truth, like Holy Water, is your enemy, huh?
 
2020-06-30 4:13:19 PM  

alex10294: Algorithms can't be biased. They can be ineffective on certain skin tones, or face types, or badly programmed, but they, literally, cannot be racially biased.


Police aren't racially biased. It's simply that their fingers can be ineffective at resisting trigger pulls on certain skin tones.
 
2020-06-30 4:14:38 PM  
AI picks up on racially biased data much faster than humans, not only because it can (usually) be trained faster, but also because unlike humans, it has no outside experience to tell it what racism is or why it's bad.

I thought we already learned this lesson with Microsoft's "Tay" chatbot.
 
2020-06-30 4:17:22 PM  

gar1013: ClavellBCMI: gar1013: MrBallou: [Fark user image image 185x273]
Wanted for High Treason and generally being an asshat.

Wanted for rape:

[Fark user image 196x257]


Wanted for destruction of government records:

[upload.wikimedia.org image 850x1260]

Silver alert, suffering from dementia and missing from basement:

[api.time.com image 850x599]

And, of course, the Trumpers had to show up. Stick to the Pol tab, dude.

I missed where you complained about the other politicians depicted.

Oh yeah, because you are a rape apologist.


Nice of you to smart your own post, Trumper.
 
2020-06-30 4:19:18 PM  
Sorry if I was under a rock when this was mentioned, but after hearing this issue mentioned several times here, I never asked WHY Amazon is working on facial recognition in the first place.
 
2020-06-30 4:20:29 PM  
[Fark user image]
 
2020-06-30 4:26:00 PM  

alex10294: Algorithms can't be biased. They can be ineffective on certain skin tones, or face types, or badly programmed, but they, literally, cannot be racially biased.


Sure they can, if they are programmed to be.
 
2020-06-30 4:27:10 PM  

gar1013: MrBallou: [Fark user image image 185x273]
Wanted for High Treason and generally being an asshat.

Wanted for rape:

[Fark user image 196x257]


Wanted for destruction of government records:

[upload.wikimedia.org image 850x1260]

Silver alert, suffering from dementia and missing from basement:

[api.time.com image 850x599]


BSAB, and the red team has all the same accusations, but checkmate, libs! The Rs have only one person accused of all 3 instead of 3 separate people!
 
2020-06-30 4:27:20 PM  

alex10294: Algorithms can't be biased. They can be ineffective on certain skin tones, or face types, or badly programmed, but they, literally, cannot be racially biased.


They can if you program them to be racially biased.
 
2020-06-30 4:27:31 PM  

Resident Muslim: Sorry if I was under a rock when this was mentioned, but after hearing this issue mentioned several times here, I never asked WHY Amazon is working on facial recognition in the first place.


Because certain people want to pay them very large amounts of money for it.
 
2020-06-30 4:33:15 PM  
Poor Fark Trumpies have no sense of humor. SAD!
 
2020-06-30 4:34:24 PM  

alex10294: Algorithms can't be biased. They can be ineffective on certain skin tones, or face types, or badly programmed, but they, literally, cannot be racially biased.


Why not?
 
2020-06-30 4:36:39 PM  

anfrind: AI picks up on racially biased data much faster than humans, not only because it can (usually) be trained faster, but also because unlike humans, it has no outside experience to tell it what racism is or why it's bad.

I thought we already learned this lesson with Microsoft's "Tay" chatbot.


Good old ingrained 'Merican racism naturally baked right into all of our technology.
 
2020-06-30 4:38:43 PM  

ClavellBCMI: gar1013: MrBallou: [Fark user image image 185x273]
Wanted for High Treason and generally being an asshat.

Wanted for rape:

[Fark user image 196x257]


Wanted for destruction of government records:

[upload.wikimedia.org image 850x1260]

Silver alert, suffering from dementia and missing from basement:

[api.time.com image 850x599]

And, of course, the Trumpers had to show up. Stick to the Pol tab, dude.


Why?  With Drew's blessing, the politics tab has spilled over into all of the other tabs like an exploding 100,000 gallon septic tank.  Face it, he frequently posts political articles to other tabs at the same time as the politics tab.

When the guy who runs the site does it, who are you to tell a user not to do it?
 
2020-06-30 4:45:46 PM  

alex10294: Algorithms can't be biased. They can be ineffective on certain skin tones, or face types, or badly programmed, but they, literally, cannot be racially biased.


Yes they can. They are ineffective on skin tones and the like because of the bias.
 
2020-06-30 5:11:44 PM  
Clearly Amazon needs to work on its false negative problem.
 
2020-06-30 5:24:13 PM  

Albert911emt: Poor Fark Trumpies have no sense of humor. SAD!


I was thinking it's more like the Fark hate brigade can dish it out but can't take it.
 
2020-06-30 5:27:10 PM  

alex10294: Algorithms can't be biased. They can be ineffective on certain skin tones, or face types, or badly programmed, but they, literally, cannot be racially biased.


I would have thought it would be pretty easy to program it to identify non-whites as criminals.

Unless you're saying the algorithm can't be racist, but the programmer can. In which case, not much of a difference...
 
2020-06-30 5:29:52 PM  

mrlewish: alex10294: Algorithms can't be biased. They can be ineffective on certain skin tones, or face types, or badly programmed, but they, literally, cannot be racially biased.

Yes they can. They are ineffective on skin tones and the like because of the bias.


He probably thinks he is engaging in clever word play because most people think bias means something about how a person thinks, without realizing that objects can have bias (e.g. a ball that isn't perfectly round and rolls off to the left).

The program does what the programmer tells it to do; whether that is what the programmer intended depends on the skill of the programmer.  Like the ball, the fact that the program skews one way is a bias.
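
To make that concrete, a short synthetic sketch (both error rates below are invented): the skew described here is just a measurable gap in per-group error rates, which is essentially what the audit in the article quantified.

    # Illustrative only: compare fabricated misidentification rates across
    # two groups; the gap between them is the "skew", i.e. the bias.
    import numpy as np

    rng = np.random.default_rng(1)
    errors_group_a = rng.random(5_000) < 0.02   # ~2% misidentified (made up)
    errors_group_b = rng.random(5_000) < 0.07   # ~7% misidentified (made up)

    for name, errs in (("group A", errors_group_a), ("group B", errors_group_b)):
        print(f"{name}: misidentification rate = {errs.mean():.1%}")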
 
2020-06-30 5:35:35 PM  

Resident Muslim: Sorry if I was under a rock when this was mentioned, but after hearing this issue mentioned several times here, I never asked WHY Amazon is working on facial recognition in the first place.


Because there would be hell to pay if the government did it. It's "cleaner" if the private sector does it for them. Same with GPS tracking. Government provides the satellites, and private sector logs location data through cell phone. Give corporations first amendment rights, and let them censor content.

And on and on... The next hotness will be contact tracing.
 
2020-06-30 5:35:55 PM  
Obviously, they were guilty of something!

[Fark user image]
 
2020-06-30 5:39:04 PM  

NotThatGuyAgain: ClavellBCMI: gar1013: MrBallou: [Fark user image image 185x273]
Wanted for High Treason and generally being an asshat.

Wanted for rape:

[Fark user image 196x257]


Wanted for destruction of government records:

[upload.wikimedia.org image 850x1260]

Silver alert, suffering from dementia and missing from basement:

[api.time.com image 850x599]

And, of course, the Trumpers had to show up. Stick to the Pol tab, dude.

Why?  With Drew's blessing, the politics tab has spilled over into all of the other tabs like an exploding 100,000 gallon septic tank.  Face it, he frequently posts political articles to other tabs at the same time as the politics tab.

When the guy who runs the site does it, who are you to tell a user not to do it?


Because only Trumpers are supposed to be quarantined to the pol tab. Good side politics can be discussed in any tab, any thread...
 
2020-06-30 5:55:42 PM  

RogermcAllen: mrlewish: alex10294: Algorithms can't be biased. They can be ineffective on certain skin tones, or face types, or badly programmed, but they, literally, cannot be racially biased.

Yes they can. They are ineffective on skin tones and the like because of the bias.

He probably thinks he is engaging in clever word play because most people think bias means something about how a person thinks, without realizing that objects can have bias (e.g. a ball that isn't perfectly round and rolls off to the left).

The program does what the programmer tells it to do; whether that is what the programmer intended depends on the skill of the programmer.  Like the ball, the fact that the program skews one way is a bias.


Neural nets 'fix' that issue.

The bias is now hidden in the training set and nobody can prove anything. You can exclude features like race from the training set, but the neural net will find a proxy (e.g. zip code). That happens only if 'race/ethnicity' is actually predictive of something (e.g. Jewish ancestry IS predictive of winning Nobel prizes... why? is the question). I'll leave it as an exercise for the reader to find additional examples where 'race' is predictive, despite not really existing.
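
A quick synthetic demonstration of that proxy effect (every variable, rate, and correlation here is fabricated; none of this is Rekognition's pipeline): withhold the protected attribute from the model, give it a correlated stand-in like zip code, and its outputs still split along group lines.

    # Illustrative only: "group" is never a feature, yet the model
    # reconstructs it through correlated proxies (zip code, income).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)
    n = 20_000
    group = rng.integers(0, 2, size=n)                  # protected attribute, withheld
    zip_code = (rng.random(n) < 0.2 + 0.6 * group).astype(float)  # proxy: 80% vs 20% by group
    income = rng.normal(50 - 10 * group, 15, size=n)    # weaker second proxy

    # Biased historical labels: the outcome itself depends on group.
    label = (rng.random(n) < 0.1 + 0.5 * group).astype(int)

    X = np.column_stack([zip_code, income])             # note: no "group" column
    model = LogisticRegression(max_iter=1000).fit(X, label)
    proba = model.predict_proba(X)[:, 1]

    for g in (0, 1):
        print(f"group {g}: mean predicted risk = {proba[group == g].mean():.1%}")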
 
2020-06-30 6:16:43 PM  

PerpetualPeristalsis: alex10294: Algorithms can't be biased. They can be ineffective on certain skin tones, or face types, or badly programmed, but they, literally, cannot be racially biased.

Why not?


Because they're computer code. If they're bad at doing something, it's because they're bad at doing it, like older film or newer cameras making Black skin really dark and featureless. The film isn't biased; it's just not good at imaging black skin. It's like saying my vacuum cleaner is biased against dog hair because it won't pick it up, or my old car was biased against running well. The way you would express this is that "the algorithm does a poor job discriminating between faces when they have non-white skin tones and should be improved, or needs better input from cameras." Black faces are quite hard to photograph well in imperfect light; just ask any photographer.
 
2020-06-30 6:21:26 PM  
As long as nobody's looking, Amazon would want to overestimate criminal potential for the police.

Too bad somebody's looking at the moment.
 
2020-06-30 7:28:09 PM  
"Bischoff also found that the platform was racially biased, misidentifying non-white people at a higher rate than white people."

Perhaps they benchmarked it with regular police officers.

"Our suspect is black, he's black, let's get him. Oh, he says his innocent? That's resisting arrest, I feel threatened, ventilate him!" [blam blam blam].
 
2020-06-30 7:49:16 PM  

alex10294: PerpetualPeristalsis: alex10294: Algorithms can't be biased. They can be ineffective on certain skin tones, or face types, or badly programmed, but they, literally, cannot be racially biased.

Why not?

Because they're computer code. If they're bad at doing something, it's because they're bad at doing it, like older film or newer cameras making Black skin really dark and featureless. The film isn't biased; it's just not good at imaging black skin. It's like saying my vacuum cleaner is biased against dog hair because it won't pick it up, or my old car was biased against running well. The way you would express this is that "the algorithm does a poor job discriminating between faces when they have non-white skin tones and should be improved, or needs better input from cameras." Black faces are quite hard to photograph well in imperfect light; just ask any photographer.


So you are splitting hairs by saying that the program isn't biased, only the inputs and/or outputs are?
 
2020-06-30 7:56:09 PM  

alex10294: PerpetualPeristalsis: alex10294: Algorithms can't be biased. They can be ineffective on certain skin tones, or face types, or badly programmed, but they, literally, cannot be racially biased.

Why not?

Because they're computer code. If they're bad at doing something, it's because they're bad at doing it, like older film or newer cameras making Black skin really dark and featureless. The film isn't biased; it's just not good at imaging black skin. It's like saying my vacuum cleaner is biased against dog hair because it won't pick it up, or my old car was biased against running well. The way you would express this is that "the algorithm does a poor job discriminating between faces when they have non-white skin tones and should be improved, or needs better input from cameras." Black faces are quite hard to photograph well in imperfect light; just ask any photographer.


The film actually is biased. It was specifically designed to perk up white skin tones. I'll grant you that people with dark black skin are going to be difficult to photograph anyway, because their skin tends to scatter light. They may look black to you, but a lot of their skin works like a mirror.
 
2020-06-30 11:32:38 PM  

jfclark27: Resident Muslim: Sorry if I was under a rock when this was mentioned, but after hearing this issue mentioned several times here, I never asked WHY Amazon is working on facial recognition in the first place.

Because there would be hell to pay if the government did it. It's "cleaner" if the private sector does it for them. Same with GPS tracking. Government provides the satellites, and private sector logs location data through cell phone. Give corporations first amendment rights, and let them censor content.

And on and on... The next hotness will be contact tracing.


I'm not saying corporate hands are clean; it's just that for something like this, wouldn't you think a company with more plausible deniability would be Microsoft/Google/Apple?
They all can claim they did it for login/security purposes.

The only thing Amazon can claim is facial recognition for deliveries, and even that is not plausible, because it means you need to be home to receive your goods.
 
2020-07-01 1:50:05 PM  

Resident Muslim: jfclark27: Resident Muslim: Sorry if I was under a rock when this was mentioned, but after hearing this issue mentioned several times here, I never asked WHY Amazon is working on facial recognition in the first place.

Because there would be hell to pay if the government did it. It's "cleaner" if the private sector does it for them. Same with GPS tracking. Government provides the satellites, and private sector logs location data through cell phone. Give corporations first amendment rights, and let them censor content.

And on and on... The next hotness will be contact tracing.

I'm not saying corporate hands are clean; it's just that for something like this, wouldn't you think a company with more plausible deniability would be Microsoft/Google/Apple?
They all can claim they did it for login/security purposes.

The only thing Amazon can claim is facial recognition for deliveries, and even that is not plausible, because it means you need to be home to receive your goods.


Well, Amazon started out as a book store...

Camera identifies you through facial recognition. Ads targeted directly to you, based on your location, for nearby services? Provide the tech to law enforcement, get tax breaks? Tax breaks pay for R&D?

Who knows. Knowledge is power. Soon, corporations will have more knowledge power than the government has fire power. And, corporations are people with rights. Perhaps a corporation can run for president?
 
Displayed 46 of 46 comments


This thread is closed to new comments.
