(Replying to PARENT post)

Some commenters here mentioned they may have set up the test incorrectly, and that may be, but I think the problem this highlights most is that technology used improperly, especially by law enforcement, can have major ramifications and consequences. Is the contractor that makes software for your local department going to follow best practices and have the algorithm audited by experts? Will they release the code? Those are real considerations for any program that has the potential to help ruin someone’s life.

It also highlights the fact that algorithms are not immune to bias when they are designed by humans. Obviously I don’t think this bias is intentional, but so much bias isn’t intentional and happens anyway.

👀alphabettsy 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

Things I want to know:

- Show us the side-by-side images of the false-positives. Are the matches plausible?

- What is the demographic distribution of the mugshot database? If the data is disproportionately skewed toward one group, then that bias would be reflected in the false-positives. A casual skimming of some mugshot websites shows a potentially significant racial skew.

👀joemaller1 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

They have Montana Rep. Greg Gianforte in the list of false positives. Did they call it a false positive because they are sure he wasn't included in the 25,000 mugs they loaded? His mug is out there.

https://www.google.com/amp/s/amp.usatoday.com/amp/756343001

👀twothamendment 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

https://www.theverge.com/2018/7/26/17615634/amazon-rekogniti...

>Reached by The Verge, an Amazon spokesperson attributed the results to poor calibration. The ACLU’s tests were performed using Rekognition’s default confidence threshold of 80 percent β€” but Amazon says it recommends at least a 95 percent threshold for law enforcement applications where a false ID might have more significant consequences.

Presented without real comment on my part.

👀cthalupa 🕑7y 🔼0 🗨️0
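The 80-versus-95 threshold trade-off the Amazon spokesperson describes can be sketched with a quick simulation. The impostor-score distribution below is synthetic (Rekognition's real score distribution is not public); the point is only that raising the match threshold can only shrink the set of false positives, at the cost of missing more true matches.

```python
import random

random.seed(0)

# Synthetic similarity scores (0-100) for 10,000 non-matching face pairs.
# The Beta(2, 5) shape is an illustrative stand-in, not Rekognition's
# actual impostor distribution.
impostor_scores = [random.betavariate(2, 5) * 100 for _ in range(10_000)]

def false_positives(scores, threshold):
    """Count non-matching pairs whose similarity clears the threshold."""
    return sum(1 for s in scores if s >= threshold)

print("false positives at 80%:", false_positives(impostor_scores, 80))
print("false positives at 95%:", false_positives(impostor_scores, 95))
```

Note the threshold only moves points along the error trade-off curve; it does not make the underlying similarity ranking any more accurate.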

(Replying to PARENT post)

I don't see why this is a bad thing.

Falsely arrest 28 members of Congress due to poor face recognition, and the problem of facial recognition in law enforcement is resolved the next day.

👀AdmiralAsshat 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

> People of color were disproportionately falsely matched in our test.

They are trying to make this racially charged without giving enough information to verify their claims. If you use a dataset of mugshots, that's statistically going to have more data on people of color. If you have more data on people of color, it is more likely to match people of color. Claiming the algorithm is racist because your data is racist is inflammatory bullshit.

👀andrewguenther 🕑7y 🔼0 🗨️0
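The base-rate argument in this comment can be illustrated with a back-of-envelope model. All numbers are hypothetical: give every same-group comparison the identical per-pair false-match rate, and skew only the gallery's demographic makeup. The over-represented group still collects several times more false matches, even though the matcher itself treats everyone identically.

```python
# Back-of-envelope model: identical per-pair false-match rate for every
# same-group comparison; only the gallery's demographic makeup differs.
# All numbers below are hypothetical, chosen only to illustrate base rates.

PER_PAIR_FMR = 0.001  # same false-match rate for everyone, per comparison

gallery = {"A": 2_000, "B": 8_000}  # mugshot database skewed toward group B
probes  = {"A": 50, "B": 50}        # the probe set itself is balanced

def expected_false_matches(group):
    # Assume false matches mostly arise between same-group pairs, since
    # faces within a demographic group tend to be more confusable. Each
    # probe then gets one chance to false-match per same-group gallery entry.
    return probes[group] * gallery[group] * PER_PAIR_FMR

for group in ("A", "B"):
    print(group, expected_false_matches(group))  # B draws 4x the false matches
```

Under these assumptions the disparity in false matches is driven entirely by the gallery's composition, which is exactly the commenter's point about the data rather than the algorithm.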

(Replying to PARENT post)

I don't see any mention of them verifying those members of congress are not in fact the same people as the mugshots (/s)
👀mnx 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

Maybe their face recognition tech can see the future?
👀ggambetta 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

Well, it's not like police are using Amazon's face recognition technology in dragnets.

Oh, wait: https://www.usatoday.com/story/tech/talkingtech/2018/06/29/c...

👀reaperducer 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

People have similar faces. Who knew?

The technology is not perfect, and it should never be used as evidence of a crime, just as an indication that two images might be of the same person and a prompt for humans to look at them.

And then those humans will also make mistakes. People do look alike. And there are twins.

I think a jury should require more evidence than just similar appearance. But such a match is a strong indication of where to look.

👀stretchwithme 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

The scary part is that people believe "science" because it is created by people who they think are "clever" and therefore not likely to be wrong.

How many people have been falsely convicted because "DNA"? Billion-to-one odds of a match, "and yet, sir, you claim you were not even in the area?".

There should be a way of recognising the parts of the science that are basically correct and the parts that are less reliable, open to bias, or simply broken due to incorrect process or a mistake in the lab.

👀lbriner 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

At the risk of being downvoted... it recognizes 28 members of Congress as criminals. I’d call that a good start.
👀ericcumbee 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

I work in the industry and obviously I can't say for certain, but I don't think Amazon has anywhere near the most accurate face recognition software available. If you want to see where the tech is in terms of performance, the various NIST tests are highly informative.
👀drpgq 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

Amazon really has to take this back to the drawing board. A ~5% false-match rate is pretty terrible...
👀ejlangev 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

Could this be down to the people training the algorithm being predominantly middle class white people? It's a pretty well known phenomenon that people are bad at recognising features of people of other races.
👀aarong11 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

In other news, Amazon's "Freudian Slip" Machine is 100% accurate.
👀jamisteven 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

America is in too much danger to take any chances. The 28 positives should be held indefinitely until they can prove they're innocent.

Presumption of Guilt is the new normal, isn't it?

👀squozzer 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

There are definitely concerns about false positives, but they have to be weighed against the current system. Are the results more accurate and effective than what we get today?

When the gov steers opinion, we call it manufactured consent, when public advocacy organizations engage in sloppy methodology to further a cause, I propose calling it manufactured outrage.

👀mc32 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

The algo missed 507 of those crooks!
👀imnotlost 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

Are they sure this was a mistake? I wouldn't be terribly surprised to find out if that many had mugshots.
👀exabrial 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

Sure. "Falsely" :-)
👀knorker 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

maybe we should let the cops have this...
👀j_lane 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

What percentage of people in the openly available mugshot database were people of colour, and why should that not be relevant?
👀hanselot 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

predictive analytics..?
👀nolite 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

The Internet used to be delivered through unreliable 28.8 kbps modems. I didn't think for a second that we should cease using the Internet.

Bone marrow transplants for cancer treatment used to have a 20% failure rate. I didn't think for a second that we should cease using bone marrow transplants.

And yet why does the ACLU think we should cease using a technology to deliver potential location hits on wanted criminals because it's not 100% perfect?

👀sadris 🕑7y 🔼0 🗨️0

(Replying to PARENT post)

Only 28? Should be a complete 1:1, meaning they all seem crooked in one way or another...
👀aurizon 🕑7y 🔼0 🗨️0