
julia crawford

Took the red AND the blue pills
Member
Oct 27, 2017
35,090
This is interesting. Got this email today, and it was pretty cool to see Google doing this.

Google's Cloud Vision API is a service for developers that allows them to, among other things, attach labels to photos identifying the contents.

The tool can detect faces, landmarks, brand logos, and even explicit content, and has a host of uses from retailers using visual search to researchers identifying animal species.

In an email to developers on Thursday morning, seen by Business Insider, Google said it would no longer use "gendered labels" for its image tags. Instead, it will tag any images of people with "non-gendered" labels such as "person."

Google said it had made the change because it was not possible to infer someone's gender solely from their appearance. It also cited its own ethical rules on AI, stating that gendering photos could exacerbate unfair bias.
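The change Google describes could be approximated client-side as a simple label-remapping pass. A minimal sketch of that idea (the gendered label names here are assumptions for illustration, not taken from the API's actual vocabulary):

```python
# Hypothetical sketch of the policy described above: gendered labels
# ("Man", "Woman", etc. -- names assumed, not from the API docs)
# are replaced with the non-gendered label "Person".
GENDERED_LABELS = {"man", "woman", "boy", "girl", "male", "female"}

def degender_labels(labels):
    """Replace any gendered label with 'Person', dropping duplicates
    while preserving the original order."""
    result = []
    for label in labels:
        normalized = "Person" if label.lower() in GENDERED_LABELS else label
        if normalized not in result:
            result.append(normalized)
    return result

print(degender_labels(["Woman", "Smile", "Man", "Outdoors"]))
# -> ['Person', 'Smile', 'Outdoors']
```

Both "Woman" and "Man" collapse into a single "Person" tag, which matches the behavior the email describes.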

Frederike Kaltheuner, a tech policy fellow at Mozilla with expertise on AI bias, told Business Insider that the update was "very positive."

She said in an email: "Anytime you automatically classify people, whether that's their gender, or their sexual orientation, you need to decide on which categories you use in the first place — and this comes with lots of assumptions.

"Classifying people as male or female assumes that gender is binary. Anyone who doesn't fit it will automatically be misclassified and misgendered. So this is about more than just bias — a person's gender cannot be inferred by appearance. Any AI system that tried to do that will inevitably misgender people."

Via Business Insider
 

zou

Member
Oct 29, 2017
743
lol, this is such bullshit. What use is a classification service if it can't classify gender?
 

Cyanity

Member
Oct 25, 2017
9,345
lol, this is such bullshit. What use is a classification service if it can't classify gender?
I know nothing about this ai, but am gonna go out on a limb and guess that it will classify images on a sliding scale of "femininity" or "masculinity" now, instead of sorting data in a binary way?
 

MikeHattsu

Member
Oct 25, 2017
8,913
I know nothing about this ai, but am gonna go out on a limb and guess that it will classify images on a sliding scale of "femininity" or "masculinity" now, instead of sorting data in a binary way?

Just says "Person" (picture from the article):
[image: screenshot of the Cloud Vision demo labeling the photo "Person"]


Ya can try it here for yourself too: Vision AI | Cloud Vision API | Google Cloud (cloud.google.com)
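For anyone curious what using it programmatically looks like, here's a hedged sketch with the official google-cloud-vision Python client. The `detect_labels` call needs the package installed plus API credentials, so treat it as illustrative; the `label_names` helper is plain Python:

```python
def label_names(annotations, min_score=0.5):
    """Extract description strings from label annotations above a
    confidence threshold. Works on any objects exposing .description
    and .score, mirroring the API's LabelAnnotation fields."""
    return [a.description for a in annotations if a.score >= min_score]

def detect_labels(image_path):
    """Sketch of a Cloud Vision label-detection request (requires the
    google-cloud-vision package and credentials; not run here)."""
    from google.cloud import vision  # deferred: only needed for real calls
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    return label_names(response.label_annotations)
```

Under the change in the article, a photo of a person should come back with labels like "Person" rather than "Man" or "Woman".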
 
Oct 26, 2017
8,686
I don't know about this - it depends on what the tagged data will be used for. No classification is perfect but avoiding it will also have an effect.
 

Border

Banned
Oct 25, 2017
14,859
So if I want a picture of a woman eating spaghetti, it would just return photographs of any and all humans eating spaghetti (including men and children)? I don't see how that is particularly helpful.
 
Oct 26, 2017
19,722
Google said it had made the change because it was not possible to infer someone's gender solely from their appearance.
Makes sense to me.

So if I want a picture of a woman eating spaghetti, it would just return photographs of any and all humans eating spaghetti (including men and children)? I don't see how that is particularly helpful.
Truly though, is it that bothersome to sort through pictures of people eating spaghetti?
 
Oct 25, 2017
3,789
How would you even define gender by looks?

This isn't quite the right question. Humans do it all the time, all day, every day. Yeah, you can get it wrong, but it's so infrequent that it's considered less work than the alternative, which is to manually ask every person. It doesn't have to be 100% accurate to be useful. Of course, usefulness is at odds with how the people being identified feel, which is what this is trying to address, but there was always a pretty clear decision boundary. I think the galaxy-brain version is "what did you need gender info for anyway?" I can't think of any good reason.
 

uzipukki

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
5,722
How about youtube? Gonna tackle that cesspool at some point?
 

The Albatross

Member
Oct 25, 2017
38,950
I think this is the right decision and I applaud Google for it. It's not an easy call, and part of why they deserve some credit is precisely because it's hard and they'll lose profit by doing the right thing.

It's an interesting decision as far as artificial intelligence is concerned, wherein an ethical limitation is being put on an artificial intelligence that, you assume, is being developed to mimic human intelligence.

Of course an artificial intelligence algorithm can infer gender by looks; the whole point of how artificial intelligence systems work is that you train them to eventually get something more right than wrong. It can infer gender by looks the same way an AI can infer gender by your last Google search, or the politician you're likely to vote for, or the products you're likely to buy, or the route you took home from work, or the timbre of your voice, or the song you're listening to on Spotify, or any number of tens of thousands of data points that an AI can be trained to infer anything about anybody. Is it right all the time? No, of course not. That's the point of "training" an AI: it's going to be wrong, which is why AIs are "trained" in the first place. AIs guess, and they start by guessing more wrong than right and are progressively trained to guess more right than wrong.

Humans infer gender incessantly; it might be one of our most basic intuitions about other humans, and even the most egalitarian-minded people will infer gender if they're pressed... If someone runs up to you and steals your wallet and police ask you to describe the person, even the most egalitarian-minded victim of theft will probably say, "Above average height, kinda long hair, male..." Well, what is average height? 5'5"? 5'9"? 6'1"? What is long hair? Shoulder length? Halfway down your back? Just covering your ears? Are you going to be right all of the time, most of the time, part of the time? Who knows. To say that "humans cannot infer gender" is to imply that there is no implicit gender bias, which is an illiberal, incorrect, flawed argument. It's like someone saying, "Well, I don't see color, so I can't be biased against skin color." "I don't see gender, so I can't have implicit gender bias." Bullshit. Human intelligence infers gender, but your better angels take over to inform you against gender bias, or your worse angels don't let your better angels step in.

I think this is a good thing because it will take the bias out of tools that are developed with Google's image classification. If you're building a stock photography search tool and basing it off of this, and you search for "male programmer," the classification system is not going to use human bias -- of which gender is an inescapable bias today -- to return a result, or at least, the tools will begin to be less informed by gender bias in their classification systems.

Stating that an AI cannot be trained to identify gender is an egalitarian, ethical statement, and it's about drawing a line in the sand of what the right thing to do is, not the profitable thing to do. If it were self-evident then it would not be laudable, and this would not be a remarkable story worth writing about or sharing.
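The "guess more wrong than right, then get trained" idea above can be shown with a textbook toy: a one-feature perceptron learning a threshold rule. This is a classroom sketch with made-up data, nothing resembling Google's actual models:

```python
# Toy illustration of training by correcting wrong guesses:
# a one-feature perceptron nudged toward a decision boundary.
def train(samples, epochs=20, lr=0.1):
    w, b = 0.0, 0.0            # start out knowing nothing
    for _ in range(epochs):
        for x, y in samples:   # y is the true class, 0 or 1
            guess = 1 if w * x + b > 0 else 0
            err = y - guess    # only wrong guesses move the weights
            w += lr * err * x
            b += lr * err
    return w, b

# Points below 0.5 are class 0, above are class 1.
data = [(0.1, 0), (0.2, 0), (0.4, 0), (0.6, 1), (0.8, 1), (0.9, 1)]
w, b = train(data)
accuracy = sum((1 if w * x + b > 0 else 0) == y for x, y in data) / len(data)
print(accuracy)  # -> 1.0
```

The untrained model classifies everything the same way; after a few passes of being corrected, it separates the two classes, which is exactly the "progressively trained to guess more right than wrong" loop.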
 
Oct 26, 2017
2,237
I suppose identifying something as a person or not a person would be easier than "not a person / person who is a ____ or ______ or _____", etc. Too many subjective variables to account for I'd imagine.
 

Ragnorok64

Banned
Nov 6, 2017
2,955
So what's the actual impact of this even going to be? It seems like it'd be wildly unhelpful for advertisers trying to do targeted marketing and even just to people trying to do a Google image search.

Is this going to affect normal image searches?
 
OP

julia crawford

Took the red AND the blue pills
Member
Oct 27, 2017
35,090
Is this going to affect normal image searches?

Probably not. There's an unending number of images whose contents you can parse with just their metadata, and Google Images worked well enough before they had these kinds of services. I guess it's just for people using the API, people like me, who used it for their own projects.