this post was submitted on 24 Jun 2025
27 points (80.0% liked)
Ollama - Local LLMs for everyone!
A place to discuss Ollama, covering basic use, extensions and add-ons, integrations, and using it in custom code to create agents.
you are viewing a single comment's thread
I’m in the process of trying them out again.
Phi-4 has been okay for me, and I use DeepSeek-R1 32B quantized for some coding tasks. Both are a lot for my aging M1 MacBook Pro to handle.
Lately I’ve been trying DeepSeek-R1 8B for document summaries; it’s pretty fast but janky.
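For anyone who wants to try the same thing: summarizing a document against a local Ollama instance is just a POST to its REST API at the default port. A minimal sketch; the model tag and prompt wording are my own choices, not anything canonical:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_summary_request(document: str, model: str = "deepseek-r1:8b") -> dict:
    """Build the JSON payload Ollama's /api/generate endpoint expects."""
    return {
        "model": model,
        "prompt": f"Summarize the following document in 3 bullet points:\n\n{document}",
        "stream": False,  # ask for one complete response instead of a token stream
    }


def summarize(document: str) -> str:
    """Send the document to a locally running Ollama server and return its reply."""
    payload = json.dumps(build_summary_request(document)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Setting `"stream": False` keeps the client dead simple; with streaming on, you would read newline-delimited JSON chunks instead.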
What I’m working towards is setting up an RSS app and feeding that into a local model (FreshRSS, I think, lets you subscribe to a combined feed) to build a newspaper out of my news subscriptions, but that’s not viable until I get a computer to run as a server.
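The RSS-to-newspaper idea can be prototyped without FreshRSS or a dedicated server: pull the feed XML, collect titles and descriptions, and hand the batch to the local model as one prompt. A stdlib-only sketch of the feed-parsing half; the item limit and prompt format are placeholders I made up:

```python
import xml.etree.ElementTree as ET


def extract_items(rss_xml: str, limit: int = 10) -> list[dict]:
    """Pull title/description pairs out of a standard RSS 2.0 feed."""
    root = ET.fromstring(rss_xml)
    items = []
    for item in root.iter("item"):
        items.append({
            "title": item.findtext("title", default=""),
            "summary": item.findtext("description", default=""),
        })
        if len(items) >= limit:
            break
    return items


def build_digest_prompt(items: list[dict]) -> str:
    """Format the collected stories into one prompt for the local model."""
    lines = [f"- {it['title']}: {it['summary']}" for it in items]
    return "Write a short newspaper-style digest of these stories:\n" + "\n".join(lines)

# Feeding the resulting prompt to the model is then the same single POST to
# Ollama's /api/generate endpoint as any other request.
```

A combined FreshRSS feed would slot straight into `extract_items`, since it exposes ordinary RSS.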