
Ollama - Local LLMs for everyone!


A place to discuss Ollama, from basic use to extensions and add-ons, integrations, and using it in custom code to create agents.


Do you use it to help with schoolwork or work? Maybe to help with coding projects, or to teach you how to do something?

What are your preferred models and why?

[–] [email protected] 3 points 1 week ago (3 children)

I haven't been able to find a model that is both performant and useful on my machines (an RTX 3060 12GB and an M4 Mac mini), but I'm open to suggestions! I know I want to use local LLMs more, but I feel their utility is limited on consumer hardware.

[–] [email protected] 1 point 1 week ago

Have you checked out the Gemma 3 variants? They're quite good in my opinion.
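
If anyone wants to try one quickly, here's a minimal sketch using the official ollama Python package (assuming the Ollama server is running locally and you've already pulled a Gemma 3 tag; the tag and prompt below are just examples, not a recommendation of a specific size):

```python
# Minimal sketch: query a local Gemma 3 model through Ollama's Python client.
# Assumes the Ollama server is running and a tag has been pulled first,
# e.g. `ollama pull gemma3:4b`. Pick a size that fits your hardware
# (larger tags need more VRAM/unified memory).
import ollama

response = ollama.chat(
    model="gemma3:4b",  # illustrative tag, swap for whatever you pulled
    messages=[{"role": "user", "content": "Explain model quantization in one paragraph."}],
)
print(response["message"]["content"])
```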
