this post was submitted on 26 Jun 2025
submitted 23 hours ago* (last edited 23 hours ago) by floofloof to c/[email protected]
all 9 comments
[–] [email protected] 22 points 22 hours ago (3 children)

AI companies than blog and social-media posts. (Ziff Davis is suing OpenAI for training on its articles without paying a licensing fee.) Researchers at Microsoft have also written publicly about “the importance of high-quality data” and have suggested that textbook-style content may be particularly desirable.

If they want quality data, then don't kill them. Secondly, if they want us as gig workers providing content for AI, they shouldn't act surprised when people start feeding in gibberish. It's already happening: LLMs are hallucinating a whole lot more than the earliest GPT-3 models did. That means something; they just haven't thought about it long enough. If a reasoning model gets things wrong 30 to 50% of the time, with peaks of a 75% bullshit rate, it's worthless. Killing good journalism for this is so dumb.

[–] phoenixz 12 points 14 hours ago

If you want quality data, then don't kill them

That is like telling cancer that, if it wants to live, it shouldn't kill the host.

You're asking a lot from people who lack the ability to think about anything other than themselves.

[–] floofloof 6 points 17 hours ago* (last edited 17 hours ago)

If it gets things wrong often enough, people will stop using it. So it would be in the interests of AI companies to pay for good sources of data.

Or at least you'd hope so. In actual fact, they'll be thinking: let's keep stealing, because most people don't know or care whether what the AI says is true. Besides, they can make money by turning it into a tool for disseminating the views of those who can pay the most.

[–] [email protected] 1 points 15 hours ago (1 children)

Interestingly, I'm not seeing your quoted content when I look at this article. I see a three-paragraph article that says, in a nutshell, "people don't visit source sites as much now that AI summarizes the contents for them." (Ironic that I'm manually summarizing it like that.)

Perhaps it's some kind of paywall blocking me from seeing the rest? I don't see any popup telling me so, but I've got a lot of ad blockers that might be stopping one from appearing. I'm not going to disable them just to see whether this is paywalled, given how incredibly intrusive and annoying ads are these days.

Gee, I wonder why people prefer AI.

[–] [email protected] 1 points 10 hours ago

Alright, the site itself is legible, but if you find it hard to read, you could use uBlock or the archive.is website. It's also a short article.