Do you use Ollama to help with schoolwork or work? Maybe to help you code projects, or to teach you how to do something?

What are your preferred models, and why?

[–] wise_pancake 2 points 1 week ago

I’m in the process of trying local models out again.

Phi-4 has been okay for me, and I use DeepSeek R1 32B quantized for some coding tasks. Both are a lot for my aging M1 MacBook Pro to handle.

Lately I’ve been trying DeepSeek 8B for document summaries, and it’s pretty fast but janky.
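For the summaries I’m basically just making one POST to Ollama’s local HTTP API per document. A rough sketch of the kind of call I mean (the deepseek-r1:8b tag and the prompt wording are placeholders, not exactly what I run):

```python
import requests

def summarize(text: str, model: str = "deepseek-r1:8b") -> str:
    """Ask a local Ollama model for a short summary of one document."""
    resp = requests.post(
        "http://localhost:11434/api/generate",  # Ollama's default local endpoint
        json={
            "model": model,
            "prompt": f"Summarize the following document in a few bullet points:\n\n{text}",
            "stream": False,  # return one JSON object instead of a token stream
        },
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(summarize(open("article.txt").read()))
```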

What I’m working towards is setting up an RSS app and feeding it into a local model (FreshRSS, I think, lets you subscribe to a combined feed) to build a newspaper out of my news subscriptions, but that isn’t viable until I get a computer to run as a server.
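The rough shape I have in mind: pull the combined feed, run each entry through a local model, and stitch the summaries into one page. A sketch of what I mean (I haven’t built this yet; the feed URL and model tag are placeholders, and it leans on the feedparser library):

```python
import feedparser  # pip install feedparser
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "deepseek-r1:8b"                        # placeholder model tag
FEED_URL = "https://example.com/combined.rss"   # e.g. a FreshRSS combined feed

def summarize(text: str) -> str:
    """Two-sentence summary of one feed item via the local Ollama API."""
    resp = requests.post(OLLAMA_URL, json={
        "model": MODEL,
        "prompt": f"Summarize this news item in two sentences:\n\n{text}",
        "stream": False,
    }, timeout=300)
    resp.raise_for_status()
    return resp.json()["response"]

def build_newspaper(feed_url: str) -> str:
    """Turn an RSS feed into one markdown 'newspaper' of summaries."""
    feed = feedparser.parse(feed_url)
    sections = []
    for entry in feed.entries[:20]:  # cap the items per run
        body = entry.get("summary", entry.get("title", ""))
        sections.append(f"## {entry.title}\n{summarize(body)}")
    return "\n\n".join(sections)

if __name__ == "__main__":
    print(build_newspaper(FEED_URL))
```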