this post was submitted on 26 Feb 2025
LocalLLaMA
Community to discuss LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
It's comparable to the M4 Pro in memory bandwidth but has way more RAM for the price.
Good point. You can't even get an M-series Pro chip with 128GB. Only the Max and Ultra lines go that high, and then you'll end up spending at least twice as much.