this post was submitted on 25 Mar 2021
62 points (100.0% liked)

Technology


I found this while browsing Reddit, and, going past the first reaction of "this is terrible," I think it can spark an interesting discussion about machine learning and how our own societal problems can end up creating bad habits and immortalizing those issues when we build such systems.

So, what do you guys think about this?

top 23 comments
[–] [email protected] 12 points 3 years ago (1 children)

If the Google English dictionary allowed for gender-neutral pronouns, this wouldn't be a problem. I'm less mad about an algorithm accurately representing a sexist culture, and more upset that it inaccurately interpreted the original text.

[–] [email protected] 4 points 3 years ago (2 children)

As I stated in another comment, in cases like this with gender neutrality it could easily use "he/she" instead of assuming; it would interpret the text more accurately while being respectful, and without going outside the defined dictionary by using "they" or something like that.

[–] [email protected] 11 points 3 years ago (1 children)

As a lifelong grammarian, I've always hate hate hated that English lacks a technically "correct" gender-neutral third person singular pronoun, and I'm frankly rather relieved to see an emerging consensus forming around "they." It may seem awkward for now, but this is how languages evolve - a grammatical "error" gets wedged into a niche to serve a linguistic need. The change is already happening, and in fifty years no one will remember or care that it used to be wrong.

[–] [email protected] 7 points 3 years ago (1 children)

Boy, I wish I could say the same for Portuguese (my native language). It has the same issue, but the second someone tries to bring the idea up, they are instantly treated like "Twitter cancer trying to destroy our language".

[–] [email protected] 6 points 3 years ago

Same here in Germany.

[–] [email protected] 3 points 3 years ago* (last edited 3 years ago)

"They" can be used for a singular person, according to some dictionaries: https://www.merriam-webster.com/dictionary/they

It's not ideal in all situations though, since we need to be able to differentiate between plural and singular pronouns. He/she would make more sense in this context.

[–] [email protected] 8 points 3 years ago (1 children)

The problem with AI is that we don't "program" it directly. It learns on its own, absorbing any data you throw at it and naïvely interpreting it, just like a small child might make inappropriate comments based on what they have heard, since being respectful to other people requires awareness of them.

[–] [email protected] 9 points 3 years ago* (last edited 3 years ago)

Exactly. The problem is, with a small child you can properly teach them what's right and wrong, while with AI it's much more complicated to do so. There should be some consideration by the people who develop this kind of software (in this case Google) about the issues it can create, since it basically parrots societal behaviors.

[–] [email protected] 7 points 3 years ago* (last edited 3 years ago) (1 children)

In the case of Google Translate (or any translation tool, for that matter) it's not even an issue with the ML algorithm itself: separate translations can be produced specifically for languages that have non-gendered pronouns, saying something like "they" or "he/she" or whatever. For other, less concrete cases it's a different issue, of course.

I am actually against using ML everywhere it remotely makes sense. IMO the entire movement has been made worse by the hype around it, which skewed its applications away from topics where its use could be very helpful and bring improvements to society (things like science and medicine), toward things which are easy to monetize, and we now have people with PhDs in ML trying to discover new ways to keep users on YouTube longer to watch more ads.

My point being that, if you could throw away all the unnecessary applications of ML where gender/race/ethnicity bias could be a problem (like automated job hiring, crime profiling, information gathering for monetization purposes), there aren't that many things left, and for the ones that are left the easy fix would be just getting more non-standard data, where [semi-]supervised learning is concerned of course.

But maybe I'm wrong; I'm curious what you think.
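The "he/she" fix described above could be sketched as a post-processing pass over the translated text. This is purely illustrative: the language set, the pronoun table, and the `neutralize` function are assumptions for the sketch, not anything Google Translate actually exposes.

```python
# Sketch: when the source language's third-person pronoun carries no
# gender (e.g. Hungarian "ő"), rewrite gendered English pronouns as
# "he/she" instead of guessing one.

# Assumption: a hand-made set of languages with genderless pronouns.
GENDERLESS_PRONOUN_LANGS = {"hu", "fi", "tr"}

# Gendered English pronouns and a neutral rendering for each.
NEUTRAL = {
    "he": "he/she", "she": "he/she",
    "him": "him/her",
    "his": "his/her",
    "her": "him/her",  # ambiguous (object vs possessive); sketch picks one
}

def neutralize(translation: str, source_lang: str) -> str:
    """Replace gendered pronouns when the source gave no gender info."""
    if source_lang not in GENDERLESS_PRONOUN_LANGS:
        return translation  # gender in the source is real information; keep it
    out = []
    for word in translation.split():
        key = word.lower().strip(".,!?")
        if key in NEUTRAL:
            replacement = NEUTRAL[key] + word[len(key):]  # keep punctuation
            if word[0].isupper():
                replacement = replacement.capitalize()
            out.append(replacement)
        else:
            out.append(word)
    return " ".join(out)

print(neutralize("She is a doctor.", "hu"))  # -> He/she is a doctor.
print(neutralize("She is a doctor.", "en"))  # -> She is a doctor. (unchanged)
```

A real system would need actual parsing rather than word splitting, but the point stands: when the source sentence contains no gender, the output doesn't have to invent one.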

[–] [email protected] 5 points 3 years ago

I'm not that knowledgeable about ML, but from what I've seen, I wholeheartedly agree. For tasks where any bias is an issue, it shouldn't be used unless it can be developed in a way that properly deals with those biases. Failing to do so always ends up reinforcing the issues you mentioned.

[–] [email protected] 7 points 3 years ago* (last edited 3 years ago)

Google Translate is based on AI, so someone on Mastodon suggested it might be a gender bias in the training data.

Also, English has "they" for gender neutral, get on that, Google.

[–] [email protected] 6 points 3 years ago

Haha, "they" is a gender-neutral singular pronoun; it's in the dictionary. I don't mind machine learning in a sense, because it helps to illuminate things like this. If it just spits out what it's learned, that's literally what humans do and what we are learning in turn. The only difference is humans can review things like this and change.

[–] [email protected] 4 points 3 years ago (1 children)

Sorry, but I laughed a bit when you said "immortalizing those issues"; it would be so funny if our society's issues went into machine algorithms lmao.

[–] [email protected] 3 points 3 years ago (1 children)

It is unironically a potential issue, like the Tay incident.

[–] [email protected] 5 points 3 years ago

In the Tay case, what happened is what you'd expect. If you go to Twitter to learn social norms, you are going to fail miserably.

Plus, people knew it was a bot, and they started messing with it for the lulz. Nothing different from the Justin Bieber to North Korea case.

[–] [email protected] 3 points 3 years ago (1 children)
[–] [email protected] 8 points 3 years ago

Idk why you are getting downvoted; as a Hungarian, this is really fucking useful.

None of the pronoun battle is going on here, there is only a single pronoun, and we use it for everyone, end of story.

[–] [email protected] 3 points 3 years ago

The data fed to the algorithm is probably not balanced in terms of gender.
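That kind of imbalance is easy to measure. A toy check of gendered-pronoun frequency in a corpus might look like this (the four sentences are made up for illustration, not real training data):

```python
# Toy check of gendered-pronoun balance in a training corpus.
from collections import Counter
import re

corpus = [
    "He is an engineer.",
    "He fixes things.",
    "He reads a lot.",
    "She is a nurse.",
]

counts = Counter()
for sentence in corpus:
    for token in re.findall(r"[a-z]+", sentence.lower()):
        if token in {"he", "she"}:
            counts[token] += 1

print(counts)  # Counter({'he': 3, 'she': 1}) -- a 3:1 skew the model will absorb
```

A model trained on text with that skew will reproduce it: given an ungendered source pronoun, "he" simply becomes the statistically safer guess.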

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

See also: Penis is OK, Vagina is a Content Violation

Edit: no idea why I was recommended a 3-year-old post...

[–] [email protected] 1 points 3 years ago

Bojler eladó :) (Hungarian: "Boiler for sale")

Come on though, nobody takes GTranslate seriously. This is probably the smallest translation error in existence.

[–] [email protected] -3 points 3 years ago (1 children)

The computer is trying to translate a language without pronouns into a language with pronouns; it did its best and exactly what we expected of it.

[–] [email protected] 5 points 3 years ago (1 children)

It could and should still do better than this, especially considering the stereotypical assumptions it made.

In such cases it should, for example, make it clearer that the gender isn't defined, by using "(he/she)" instead of just assigning one.

[–] [email protected] 3 points 3 years ago

Which would help with understanding it. Maybe a singular "they" could also be better than a random pronoun. I also tested some other sentences: "Egy ember" (a person) was translated to "a man". But Translate has improved a lot in the last couple of years, since the translations at least make sense now.