If the Google English dictionary allowed for gender-neutral pronouns, this wouldn't be a problem. I'm less mad about an algorithm accurately representing a sexist culture, and more upset that it inaccurately interpreted the original text.
Technology
This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.
Ask in DM before posting product reviews or ads. All such posts otherwise are subject to removal.
Rules:
1: All Lemmy rules apply
2: Do not post low-effort content
3: NEVER post naziped*gore stuff
4: Always post article URLs or their archived version URLs as sources, NOT screenshots. Help the blind users.
5: Personal rants about Big Tech CEOs like Elon Musk are unwelcome (this does not include posts about their companies affecting a wide range of people)
6: no advertisement posts unless verified as legitimate and non-exploitative/non-consumerist
7: Crypto-related posts, unless essential, are disallowed
As I stated in another comment, in gender-neutral cases like this it could easily use "he/she" instead of assuming. That would interpret the text more accurately while staying respectful, and without going outside the defined dictionary by using "them" or something like that.
As a lifelong grammarian, I've always hate hate hated that English lacks a technically "correct" gender-neutral third person singular pronoun, and I'm frankly rather relieved to see an emerging consensus forming around "they." It may seem awkward for now, but this is how languages evolve - a grammatical "error" gets wedged into a niche to serve a linguistic need. The change is already happening, and in fifty years no one will remember or care that it used to be wrong.
Boy, I wish I could say the same for Portuguese (my native language). It has the same issue, but the second someone tries to bring the idea up, they are instantly treated like the "Twitter cancer trying to destroy our language".
Same here in Germany.
"They" can be used for a singular person according to some dictionaries: https://www.merriam-webster.com/dictionary/they
It's not ideal in all situations though, since we need to be able to differentiate between plural and singular pronouns. He/she would make more sense in this context.
The problem with AI is that we don't "program" it directly. It learns on its own, absorbing whatever data you throw at it and interpreting it naïvely, just like a small child might make inappropriate comments based on what they've heard; being respectful of other people requires awareness of them.
Exactly. The problem is, with a small child, you can properly teach it what's right and wrong, while with AI it's much more complicated to do so. There should be some consideration taken by people who develop this kind of software (in this case Google) about the issues it can create, since it basically parrots societal behaviors.
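To make that "parroting" concrete, here's a toy sketch of how a naive maximum-likelihood learner inherits whatever imbalance its corpus has. The corpus and the noun/pronoun pairs are invented for illustration; real translation models are vastly more complex, but the underlying effect is the same:

```python
from collections import Counter

# Toy "training corpus": (noun, pronoun) pairs a model might observe.
# The gender imbalance here is deliberate and purely illustrative.
corpus = [
    ("doctor", "he"), ("doctor", "he"), ("doctor", "she"),
    ("nurse", "she"), ("nurse", "she"), ("nurse", "he"),
]

# Count how often each noun co-occurs with each pronoun.
counts = {}
for noun, pronoun in corpus:
    counts.setdefault(noun, Counter())[pronoun] += 1

def pick_pronoun(noun):
    """Naive maximum-likelihood choice: whichever pronoun was seen most."""
    return counts[noun].most_common(1)[0][0]

print(pick_pronoun("doctor"))  # -> "he"  (seen in 2 of 3 examples)
print(pick_pronoun("nurse"))   # -> "she" (seen in 2 of 3 examples)
```

Nothing here is "sexist code"; the bias lives entirely in the data, which is why fixing the corpus (or the output) matters more than fixing the algorithm.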
in the case of google translate (or any translation tool, for that matter) it's not even an issue with the ml algorithm itself: separate handling can be added specifically for languages that have non-gendered pronouns, to output something like "they" or "he/she" or whatever. for other, less clear-cut cases it's a different issue of course
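A minimal sketch of that separate handling, assuming the tool knows the source pronoun was gender-neutral (like Hungarian "ő"). The function name and the pronoun mapping are made up for illustration, not anything Google Translate actually exposes:

```python
import re

# Hypothetical post-processing step: if the source pronoun was
# gender-neutral, rewrite whichever gendered English pronoun the model
# guessed into an explicit "he/she" form. Naive on purpose: it doesn't
# distinguish possessive "her" from object "her".
GENDERED = {"he": "he/she", "she": "he/she", "him": "him/her",
            "her": "him/her", "his": "his/her"}

def degender(translation: str, source_has_neutral_pronoun: bool) -> str:
    if not source_has_neutral_pronoun:
        return translation
    def sub(match):
        word = match.group(0)
        repl = GENDERED[word.lower()]
        # Preserve sentence-initial capitalization.
        return repl.capitalize() if word[0].isupper() else repl
    pattern = r"\b(" + "|".join(GENDERED) + r")\b"
    return re.sub(pattern, sub, translation, flags=re.IGNORECASE)

print(degender("She is beautiful. He is clever.", True))
# -> "He/she is beautiful. He/she is clever."
```

This sidesteps the training data entirely: the model can keep guessing, and the known properties of the source language override the guess.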
i am actually against using ml wherever it only remotely makes sense. imo the entire movement has been made worse by the hype around it, which skewed its applications away from topics where its use could be very helpful and bring improvements to society (things like science and medicine), toward things which are easy to monetize. we now have people with phds in ml trying to discover new ways to keep users on youtube longer to watch more ads
my point being that, if you could throw away all the unnecessary applications of ml where gender/race/ethnicity bias could be a problem (like automated job hiring, crime profiling, information gathering for monetization purposes), there aren't that many things left, and for the ones that are left the easy fix would just be getting more non-standard data, where [semi-]supervised learning is concerned of course
but maybe i'm wrong, i'm curious what you think
I'm not that knowledgeable about ML, but from what I've seen, I wholeheartedly agree. For tasks where any bias is an issue it shouldn't be used, unless it can be developed in a way that properly deals with those biases. Failing to do so always ends up reinforcing the issues you mentioned.
Google Translate is based on AI, so someone on Mastodon suggested it might be a gender bias in the training data.
Also, English has "they" for gender neutral, get on that, Google.
Haha, "they" is a gender-neutral singular pronoun; it's in the dictionary. I don't mind machine learning in a sense, because it helps to illuminate things like this. If it just spits out what it's learned, that's literally what humans do, and what we are learning in turn. The only difference is humans can review things like this and change.
Sorry, but I laughed a bit when you said "immortalizing those issues"; it would be so funny if our society's issues ended up in machine algorithms lmao.
It is unironically a potential issue, like the Tay incident.
In the Tay case, what happened is what you'd expect. If you go to Twitter to learn social norms, you are going to fail miserably.
Plus, people knew it was a bot, and they started messing with it for the lulz. No different from the Justin Bieber to North Korea case.
Based Hungarian
Idk why you are getting downvoted, as a Hungarian, this is really fucking useful.
None of the pronoun battle is going on here, there is only a single pronoun, and we use it for everyone, end of story.
The data fed to the algorithm is probably not balanced in terms of gender.
See also: Penis is OK, Vagina is a Content Violation
Edit: no idea why I was recommended a 3-year-old post...
Bojler eladó :) ("boiler for sale", for the non-Hungarians)
Come on though, nobody takes Google Translate seriously. This is probably the smallest translation error in existence.
the computer is trying to translate a language without pronouns into a language with pronouns; it did its best, and exactly what we expected of it.
It could and should still do better than this, especially considering the stereotypical assumptions it made.
In such cases it should, for example, make it clear that the gender isn't defined by using "(he/she)" instead of just assigning one.
which would help with understanding it. Maybe a singular "they" would be better than a random pronoun too. Also, I tested some other sentences: "Egy ember" (a person) was translated to "a man". But Translate has improved a lot over the last couple of years, since the translations at least make sense now.