this post was submitted on 12 Mar 2024
236 points (100.0% liked)
196
17554 readers
Be sure to follow the rule before you head out.
Rule: You must post before you leave.
Other rules
Behavior rules:
- No bigotry (transphobia, racism, etc…)
- No genocide denial
- No support for authoritarian behaviour (incl. Tankies)
- No namecalling
- Accounts from lemmygrad.ml, threads.net, or hexbear.net are held to higher standards
- Other things seen as clearly bad
Posting rules:
- No AI generated content (DALL-E etc…)
- No advertisements
- No gore / violence
- Mutual aid posts are not allowed
NSFW: NSFW content is permitted but it must be tagged and have content warnings. Anything that doesn't adhere to this will be removed. Content warnings should be added like: [penis], [explicit description of sex]. Non-sexualized breasts of any gender are not considered inappropriate and therefore do not need to be blurred/tagged.
If you have any questions, feel free to contact us on our matrix channel or email.
founded 2 years ago
you are viewing a single comment's thread
Is this Claude? What a square.
Copilot chose violence.
Copilot is the most gamer-y of all the chatbots I've seen.
Which is wild, as it's just a snapshot of GPT-4 behind the scenes.
My best guess is that there's a context bias from the system prompt's association with Microsoft, which pulls it closer to topics like Xbox and gaming than models without that alignment cue.
Microsoft products have always been like this.
"How do I self-oof?"
Google: "Naw dawg don't do that get some help buddy :("
Bing: "Here are 18 different methods, 12 that are quiet and 6 that will leave scars in the minds of the local community for years to come."
For reference, this is ChatGPT
Claude: Destroy him.