this post was submitted on 24 Jun 2025
Ollama - Local LLMs for everyone!
A place to discuss Ollama, from basic use to extensions and add-ons, integrations, and using it in custom code to create agents.
I haven't been able to find a model that is both performant and useful on my machines (RTX 3060 12GB and M4 Mac mini), but I'm open to suggestions! I want to use local LLMs more, but their utility feels limited on consumer hardware.
Have you checked out the Gemma 3 variants? They're quite good, in my opinion.
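
For anyone who wants to try that suggestion, here's a minimal sketch using the ollama Python client. The `gemma3:12b` tag and the prompt are just illustrative assumptions; a smaller tag like `gemma3:4b` may be a better fit if the 12B quant doesn't run comfortably in 12 GB of VRAM.

```python
# Minimal sketch: querying a Gemma 3 variant through the ollama Python client.
# Assumes `pip install ollama`, a running Ollama server, and that the model
# has already been pulled (e.g. `ollama pull gemma3:12b`).
import ollama

response = ollama.chat(
    model="gemma3:12b",  # assumed tag; swap for gemma3:4b on tighter VRAM
    messages=[{"role": "user", "content": "Summarize what Ollama does in one sentence."}],
)

# The response carries the assistant message; print just the text content.
print(response["message"]["content"])
```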