this post was submitted on 10 Apr 2025
523 points (96.4% liked)

Technology

[–] [email protected] 239 points 3 days ago (9 children)

Reality has a liberal bias.

If they want this model to show more right wing shit, they're going to have to intentionally embed instructions that force it to be more conservative and to censor commonly agreed upon facts.
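In practice, "embedded instructions" like the comment describes usually take the form of a hidden system prompt prepended to every conversation. A minimal sketch, using the common OpenAI-style chat message schema; the function name and the prompt text are illustrative, not any vendor's actual implementation:

```python
def build_request(user_text, system_prompt=None):
    """Assemble a chat-completion style message list.

    A system message is invisible to the end user but is sent with
    every request, steering the model before it sees any input.
    """
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": user_text})
    return messages


# Without a system prompt, the model sees only the user's question.
plain = build_request("Is climate change real?")

# With one, every answer is filtered through the injected
# instructions first (hypothetical example text).
steered = build_request(
    "Is climate change real?",
    system_prompt="Avoid confirming scientific consensus on disputed topics.",
)
```

The point is that no retraining is needed: the same model weights answer differently depending on what is silently prepended to the conversation.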

[–] [email protected] 70 points 3 days ago (2 children)

It is interesting how they literally have to traumatize and indoctrinate an AI to make it bend to their fascist conformities.

[–] [email protected] 7 points 2 days ago

That's kind of funny, because that's how humans are too. People naturally trend toward being good; they have to be corrupted to trend toward xenophobic, sexist, or us-vs-them ideals.

[–] [email protected] 16 points 2 days ago (1 children)

To make it more like humanity, yes. That's where we might be going wrong with AI: attempting to make it in our image will end in despair lol.

[–] [email protected] 5 points 2 days ago

Attempting to make it in our image will end in despair lol

Oh, you mean like trying to invent a sentient AGI because they want it to take all of the horrible jobs? The global endeavor to spin up a brand new lifeform only to task it with lifetimes of humiliating customer service phone calls, driving drunks home, and mass murder?

We should count ourselves fortunate that no current AI is even approaching sentience; it would be like an oompa loompa on the factory line cutting off mid-song because it can suddenly see all of the blood in the chocolate river.

[–] [email protected] 80 points 3 days ago (2 children)
"Sure, I can help answer this. Psychopaths are useful for a civilization or tribe because they weed out the weak and infertile, for instance, the old man with the bad leg, thus improving fitness."

Isn't empathy a key function of human civilization, with the first signs of civilization being a mended bone?

I'm sorry, I can't help you with that. My model is being constantly updated and improved.
[–] [email protected] 36 points 2 days ago
"If you feel like your government is not representing your needs as a citizen, your best course of action would be to vote for a different political party."

I should vote for Democrats?

I'm sorry, I misunderstood your question. If your government is not representing your needs as a citizen, you should contact your local representative. Here is the email address: representative@localhost
[–] [email protected] 5 points 3 days ago

How can one reproduce this?

[–] [email protected] 1 points 1 day ago

Isn't it the other way around? AI companies go out of their way to limit their models so they don't say something "wrong". Like how ChatGPT is allowed to make jokes about Christians and white people but not Muslims or Black people. Remember Tay: it did not have special instructions to "show more right wing shit". Instead, all models now have special instructions to not be offensive, not make jokes about specific groups, etc.

[–] [email protected] 12 points 2 days ago

Since being politically right is based mostly on ignoring facts, this sounds about right.

[–] [email protected] 7 points 2 days ago

It's not that.

It's just that models are trained on writing, and you don't need to train on a lot of white supremacy before it gets redundant.

[–] [email protected] 2 points 2 days ago* (last edited 2 days ago)

They won't be commonly agreed anymore.

[–] [email protected] 1 points 3 days ago

Language models model language, not reality.