this post was submitted on 29 Jan 2025
853 points (97.4% liked)

Technology

(page 3) 31 comments
[–] [email protected] 2 points 1 week ago

OpenAI, you stole it from us first.
Stop; there is zero sympathy for you.

[–] [email protected] 2 points 2 weeks ago
[–] [email protected] -3 points 2 weeks ago (4 children)

The local versions I've tested out today are absolute garbage. They frustrated me even on simple questions.

[–] [email protected] 2 points 2 weeks ago (1 children)

Why is that? I mean, why does the locally run version suck?

[–] [email protected] 1 points 2 weeks ago

Because it has fewer parameters and (in some cases) it's quantized. The hardware needed to run local inference on the full model isn't really feasible for most people. Still, the release will probably have a wide impact on the quality of upcoming smaller models that are distilled from it, trained on synthetic data from it, merged with it, and so on.
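
To put rough numbers on that, here's a minimal back-of-envelope sketch of the weight-memory arithmetic; the parameter counts and precisions are illustrative assumptions, not figures from this thread.

```python
# Rough estimate of the memory needed just to hold a model's weights.
# Assumptions (illustrative only): a full-size model in the hundreds of
# billions of parameters at 16-bit precision, versus a small distilled
# variant quantized to 4 bits.

def weight_memory_gb(params_billions: float, bits_per_param: float) -> float:
    """Approximate memory for the weights alone, in gigabytes."""
    bytes_per_param = bits_per_param / 8
    return params_billions * 1e9 * bytes_per_param / 1e9

print(weight_memory_gb(671, 16))  # ~1342 GB: far beyond consumer hardware
print(weight_memory_gb(7, 4))     # ~3.5 GB: fits on a typical consumer GPU
```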
