Hey everyone, I’ve been working on something in my free time: a wrapper that lets you use open-weight models (Mistral, LLaMA, etc.) through a clean, unified interface.
The longer-term idea is to make it easier to run larger models (70B+, fine-tuned variants, or your own checkpoints) without needing a GPU cluster of your own.
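To give a sense of what I mean by "clean interface", the goal is that calling an open model feels roughly like the sketch below. This is just an illustration using the OpenAI-compatible client pattern; it isn't the project's actual API, and the base URL, key, and model name are placeholders:

```python
from openai import OpenAI

# Illustrative only: assumes an OpenAI-compatible endpoint, which may not be
# how the real service works. URL, key, and model name are placeholders.
client = OpenAI(base_url="https://umbraai.xyz/v1", api_key="YOUR_KEY")

response = client.chat.completions.create(
    model="mistral-7b-instruct",
    messages=[{"role": "user", "content": "Summarize this in one sentence: ..."}],
)
print(response.choices[0].message.content)
```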
Still very much a side project: I’m building this for fun and learning as I go, and I’m curious whether it’s useful to anyone else.
Link’s here if you want to try it: https://umbraai.xyz/