tdubz


Registered: 02/26/12
Posts: 5,586
A New Program Judges If You’re a Criminal From Your Facial Features
    #23846644 - 11/18/16 08:53 PM (7 years, 2 months ago)

https://motherboard.vice.com/read/new-program-decides-criminality-from-facial-features

Like a more crooked version of the Voight-Kampff test from Blade Runner, a new machine learning paper from a pair of Chinese researchers has delved into the controversial task of letting a computer decide on your innocence. Can a computer know if you’re a criminal just from your face?

In their paper ‘Automated Inference on Criminality using Face Images’, published on the arXiv pre-print server, Xiaolin Wu and Xi Zhang from China’s Shanghai Jiao Tong University investigate whether a computer can detect if a human could be a convicted criminal just by analysing his or her facial features. The two say their tests were successful, and that they even found a new law governing “the normality for faces of non-criminals.”

They described the idea of algorithms that can match, and even exceed, human performance in face recognition to infer criminality as "irresistible". But as a number of Twitter users and commenters on Hacker News point out, by stuffing biases into artificial intelligence and machine learning algorithms, the computer could act on those biases. The researchers maintain, though, that the data sets were controlled for race, gender, age, and facial expressions.

Tim Maughan ✔ @timmaughan
Imagine this with drones, every CCTV camera in every city, the eyes of self driving cars, everywhere there's a camera... https://twitter.com/elizabeth_joh/status/799658586078969856
11:07 AM - 18 Nov 2016 · 48 Retweets · 38 likes
The images used in the research were standard ID photographs of Chinese males between the ages of 18 and 55, with no facial hair, scars, or other markings. Wu and Zhang stress that the ID photos used were not police mugshots, and that out of 730 criminals, 235 committed violent crimes “including murder, rape, assault, kidnap, and robbery.”

The two state they purposely removed "any subtle human factors" from the assessment process. But as long as data sets are finely controlled, can human bias really be completely eradicated? Wu told Motherboard that human bias didn't come into it. "In fact, we got our first batch of results a year ago. We went through very rigorous checking of our data sets, and also ran many tests searching for counterexamples but failed to find any," said Wu.

Here’s how it worked: Wu and Zhang fed facial images of 1,856 people, half of them convicted criminals, into a machine learning algorithm, then observed whether any of their four classifiers (each using a different method of analysing facial features) could infer criminality.
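The experiment described here is, in outline, standard supervised binary classification. As a rough, hypothetical sketch (the study's data and code are not public), here is the same shape of pipeline with random feature vectors standing in for face images and scikit-learn's logistic regression standing in for the paper's classifiers; on pure noise the held-out accuracy naturally lands near chance:

```python
# Hypothetical sketch of the experimental setup: 1,856 labeled samples,
# half tagged as "criminal", split into train and held-out test sets.
# Random vectors replace the real face images, which are not public.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_features = 1856, 128            # 1,856 faces, toy feature size
X = rng.normal(size=(n_samples, n_features))  # stand-in for image features
y = np.repeat([0, 1], n_samples // 2)         # 0 = non-criminal, 1 = criminal

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

# Any binary classifier slots in here; the paper compared four of them.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.2f}")   # near 0.5 on random features
```

On real images the features would come from the face photos themselves (the paper's best classifier, a convolutional neural network, learns them directly), but the train/test discipline is the same.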

They found that all four of their different classifiers were mostly successful, and that the faces of criminals and those not convicted of crimes differ in key ways that are perceptible to a computer program. Moreover, “the variation among criminal faces is significantly greater than that of the non-criminal faces,” Wu and Zhang write.

“All four classifiers perform consistently well and produce evidence for the validity of automated face-induced inference on criminality, despite the historical controversy surrounding the topic,” the researchers write. “Also, we find some discriminating structural features for predicting criminality, such as lip curvature, eye inner corner distance, and the so-called nose-mouth angle.” The best classifier, known as the Convolutional Neural Network, achieved 89.51 percent accuracy in the tests.

“By extensive experiments and vigorous cross validations,” the researchers conclude, “we have demonstrated that via supervised machine learning, data-driven face classifiers are able to make reliable inference on criminality.”
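Cross validation, which the authors lean on for that claim, just means repeatedly holding out a different slice of the data for testing and averaging the scores. A minimal sketch with scikit-learn, again on synthetic features (so the scores hover near 0.5 rather than the paper's reported numbers):

```python
# Minimal k-fold cross validation sketch. Random features stand in for
# the face data, so each fold's accuracy sits near chance level.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(1856, 64))
y = np.repeat([0, 1], 928)

# 10 folds: each sample is held out for testing exactly once.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=1)
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=cv)
print(f"mean accuracy over 10 folds: {scores.mean():.3f}")
```

The point of the procedure is that a classifier which merely memorized its training set scores poorly on the held-out folds, which is why the authors cite it as evidence their result is not overfitting.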

While Wu and Zhang admit in their paper that they are “not qualified to discuss or to debate on societal stereotypes,” the problem is that machine learning is adept at picking up on human biases in data sets and acting on them, as multiple recent incidents have shown. The pair admit they’re on shaky ground. “We have been accused on Internet of being irresponsible socially,” Wu said.

Stephen Mayhew @mayhewsw
This paper is the exact reason why we need to think about ethics in AI. https://arxiv.org/abs/1611.04135
3:57 PM - 17 Nov 2016 · 551 Retweets · 477 likes
In the paper the researchers go on to quote the philosopher Aristotle: “It is possible to infer character from features.” But surely that is a judgment for human psychologists, not machines? One major concern going forward is false positives, that is, identifying innocent people as guilty, especially if this program is ever used in real-world criminal justice settings. The researchers said the algorithms did throw up some false positives (identifying non-criminals as criminals) and false negatives (identifying criminals as non-criminals), which increased when the faces were randomly labeled for control tests.
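Those two error types fall directly out of a confusion matrix. A toy illustration with invented labels, using scikit-learn (the numbers here are made up for the example, not taken from the paper):

```python
# How false-positive and false-negative rates are computed from a
# confusion matrix. The labels below are invented for illustration.
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = np.array([0, 0, 0, 0, 1, 1, 1, 1])  # 0 = non-criminal, 1 = criminal
y_pred = np.array([0, 1, 0, 0, 1, 1, 0, 1])  # one wrong call in each class

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
fpr = fp / (fp + tn)  # innocent people flagged as criminal
fnr = fn / (fn + tp)  # criminals the classifier misses
print(fpr, fnr)  # 0.25 0.25
```

In a criminal justice setting the two rates carry very different costs, which is why critics single out the false-positive rate rather than the headline accuracy figure.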

Online critics have lambasted the paper. “I thought this was a joke when I read the abstract, but it appears to be a genuine paper,” said a user on Hacker News. “I agree it's an entirely valid area of study...but to do it you need experts in criminology, physiology and machine learning, not just a couple of people who can follow the Keras instructions for how to use a neural net for classification.”

Read more: Google-Backed A.I. Aims to Help Journalists Write Better News Stories

Others questioned the validity of the paper, noting that one of the researchers is listed as having a Gmail account. “First of all, I don't think this is satire. I'll admit that the use of a gmail account by a researcher at a Chinese uni is facially suspicious,” posted another Hacker News reader.

Wu had an answer for this, however. “Some questioned why I used gmail address as a faculty member in China. In fact, I am also a professor at McMaster University, Canada,” he told Motherboard.


Kryptos
Stranger
Registered: 11/01/14
Posts: 12,263
Last seen: 1 day, 4 hours
Re: A New Program Judges If You’re a Criminal From Your Facial Features [Re: tdubz]
    #23850148 - 11/19/16 11:07 PM (7 years, 2 months ago)

Rebirth of Phrenology? Why not, seems like we're swinging back into the post-depression 30's again everywhere else.

Guess I should probably go shave and get a haircut, can't look like I was out all night stealing and smoking dope.



Copyright 1997-2024 Mind Media. Some rights reserved.
