this post was submitted on 25 May 2024
1 point (52.4% liked)

Australia

A place to discuss Australia and important Australian issues.

all 6 comments
[–] [email protected] 11 points 9 months ago (1 children)

There are a few issues with this article, chiefly that auto-correct doesn't use any AI as such (as far as I know; it's a pretty old technology). I agree with the general argument that the name dictionaries these auto-correct systems ship with should be expanded to recognise more names, but I think labelling it "racist" is moronic. I also take issue with this weird dig:

"The big problem is a lot of artificial intelligence scrapes the internet, whether it's writing or music or blog sites or whatever, and so much of the stuff on the internet has been made by white men.

"Effectively, AI is starting to mimic those white men, and you can see that AI on a number of platforms is increasingly becoming kind of racist and more sexist.

I don't think it's "white men" that are the problem - it's snarky redditors and Stack Exchange contributors. I might as well blame Italians for fascism or Muslims for 9/11.

Like it or not, AI needs training data - and if you want a more diverse training set, get out there and make it for free like everyone else does. It's a shame how sites like Twitter and Facebook have kind of killed personal blogs and pushed shorter-form content that's harder to find; it's amazing how much of the useful stuff I find comes from random old blogs.

[–] [email protected] 8 points 9 months ago (1 children)

It is racism - an unthinking kind, but racism all the same.

Where did those dictionaries come from? Why were they chosen? Not every act of racial bias involves a burning cross or a Nazi.

With an English-speaking AI it's gonna be based on a library of shitposters and snarkers - and yeah, they're gonna overstep a lot of lines and they're gonna have biases, conscious or otherwise. And the two sources you cite are overwhelmingly white male.

[–] [email protected] 4 points 9 months ago* (last edited 9 months ago) (1 children)

~~> And the two sources you cite are overwhelmingly white male~~

~~what sources??~~

Edit: nevermind worked it out

Yeah, AI companies are being stupid using Reddit, Twitter and Stack Exchange, but there's no magic fix to get more diverse AI. Would they need to pay people from underrepresented groups to write training material? Is that racist? Why should they get paid when everyone else is doing it for free?

I don't think they were being intentionally racist when they made the dictionaries for auto-correct - the article says around 41% of names given to babies in England and Wales get flagged as "incorrect" - but most modern auto-correct on phones, at least, seems to add words to your dictionary if you use them a lot, and the bundled dictionaries have probably been limited by file-size constraints in the past.
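For illustration, here's a rough sketch of the kind of mechanism being described: a fixed word list, a closest-match suggestion for anything not in it, and a simple "learn words you type often" rule. This is a hypothetical toy (the word list, threshold and matching are all made up), not how any actual phone keyboard is implemented.

```python
from difflib import get_close_matches

# A tiny built-in word list; real keyboards ship far larger dictionaries,
# but plenty of names simply aren't in them.
BUILT_IN_DICTIONARY = ["and", "hail", "halal", "hello", "the"]

# Words the keyboard has "learned" from the user.
user_dictionary = set()
usage_counts = {}
LEARN_THRESHOLD = 3  # arbitrary: learn a word after it's typed three times


def autocorrect(word: str) -> str:
    """Return the word unchanged if it's known, otherwise the closest known word."""
    lower = word.lower()

    # Count how often the user types this word and eventually learn it.
    usage_counts[lower] = usage_counts.get(lower, 0) + 1
    if usage_counts[lower] >= LEARN_THRESHOLD:
        user_dictionary.add(lower)

    if lower in BUILT_IN_DICTIONARY or lower in user_dictionary:
        return word

    # Unknown word: suggest the closest built-in word instead.
    matches = get_close_matches(lower, BUILT_IN_DICTIONARY, n=1, cutoff=0.6)
    return matches[0].capitalize() if matches else word


print(autocorrect("Halla"))  # the name isn't in the list, so the closest word ("Halal" here) wins
print(autocorrect("Halla"))  # still corrected
print(autocorrect("Halla"))  # third time: "Halla" has been learned and is left alone
```

Real keyboards do a lot more than this (context, frequency models, and so on), but the basic failure mode is the same: if a name isn't in the shipped word list and hasn't been learned yet, it gets "fixed" into whatever common word looks closest.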

[–] [email protected] 1 points 9 months ago

This is the best summary I could come up with:


Having your name mangled by autocorrect when you type it into your phone or computer can be frustrating and time-consuming.

Whenever Sydney dental student Halla writes her name into her phone, it assumes there's a typo and suggests either "Hail" or "Halal".

The campaigners have written to tech giants, including Microsoft and Apple, and provided them with a spreadsheet of names they want embedded in their dictionaries.

In the open letter, the campaigners say 41 per cent of the names given to babies in England and Wales are considered "incorrect" by Microsoft's English UK dictionary.

Microsoft has recently changed some of its language to make it more gender-neutral, using terms such as "humanity" rather than "mankind", and "workforce" rather than "manpower".

Dr Badami said there were valuable tools such as the Australian Digital Inclusion Index to measure how different people interacted with technology.


The original article contains 560 words, the summary contains 144 words. Saved 74%. I'm a bot and I'm open source!

[–] [email protected] 1 points 8 months ago