this post was submitted on 28 May 2025
132 points (95.8% liked)

all 27 comments
[–] [email protected] 92 points 1 day ago (2 children)

Gross. I'm so sick of these AI tech bros shoving it into every corner of my life.

[–] [email protected] 50 points 1 day ago (1 children)

It's probably long past time to remove Telegram from your life.

[–] [email protected] -1 points 1 day ago (1 children)

I really like AI, but only when I can activate it myself in apps or websites.

[–] [email protected] -1 points 1 day ago* (last edited 1 day ago) (1 children)

There are valuable uses for AI in general. LLMs are not one of them.

[–] [email protected] 2 points 1 day ago (1 children)
[–] [email protected] 0 points 1 day ago (1 children)

They're nothing but manufactured bullshit machines.

[–] [email protected] -1 points 1 day ago (1 children)

Which models did you use and what did you ask? The big free models are very good already (Claude 3.7, OpenAI 4o, DeepSeek R1). Mistral's free model is not so good.

[–] [email protected] 1 points 1 day ago

All of them

[–] [email protected] 24 points 1 day ago

grok is about to enter its furry terrorist arc

[–] [email protected] 12 points 1 day ago (1 children)

So you’re telling me that Grok is so bad that they have to pay developers to integrate it? How backwards. People are willing to pay for Grok’s competitors.

[–] [email protected] 12 points 1 day ago

No, the world's richest Nazi wants access to all of the communications of Tele-Nazis.

[–] [email protected] 32 points 1 day ago

Telegram was always a bad actor with misleading claims about their communications service. Not shocking at all.

[–] [email protected] 20 points 1 day ago (1 children)

Hold on.... Why are they paying telegram and not the other way around?!

To ask is to answer 🐸

[–] [email protected] 13 points 1 day ago

...because Grok is being used to push certain "alternative facts" to those who are susceptible.

[–] [email protected] 10 points 1 day ago (1 children)

What's worse than enshittening?

[–] [email protected] 9 points 1 day ago (1 children)
[–] [email protected] 5 points 1 day ago

Telegram with Grok.

[–] [email protected] 6 points 1 day ago

Is it gonna be opt-out or opt-in?

[–] [email protected] 5 points 1 day ago (1 children)

Oh wow. Integrating an LLM into the absolute bottom of human communication: CSAM, drugs, revenge porn, criminal chats, illegal weapons trade and so much more … this will work out well.

[–] [email protected] 7 points 1 day ago (3 children)

We only know about all of that bottom-of-the-barrel stuff because it's not encrypted. I guarantee you encrypted messengers like Signal are much worse when it comes to that.

[–] [email protected] 3 points 1 day ago (1 children)

And yet, most radicalized chats are on Telegram, or so we were led to believe.

[–] toastmeister 0 points 1 day ago

Isn't telegram more like Instagram with feeds?

[–] [email protected] 2 points 1 day ago

Believe it or not, as far as we can tell Signal is just less convenient for large groups, and it very much looks like Telegram is worse. But it doesn't matter: this is about an AI (Grok) being integrated into an environment known for bad data. Grok will interact with bad actors and train on awful data.