Does this have fast onboard RAM like the Framework Desktop? If so, this would actually be quite interesting. Up to 96 GB usable for AI models.
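For a rough sense of what 96 GB buys you: at 4-bit quantization a model needs roughly half a byte per parameter plus some runtime overhead, so even a 70B model fits with room to spare. A hedged back-of-envelope sketch (the 0.5 bytes/param and 20% overhead figures are rules of thumb, not measured numbers):

```python
# Rough back-of-envelope: which quantized models fit in ~96 GiB of usable memory?
# Assumes ~0.5 bytes/parameter at 4-bit quantization plus ~20% overhead for
# KV cache and runtime; real numbers vary by runtime and context length.
GIB = 1024 ** 3

def approx_footprint_gib(params_billions: float, bytes_per_param: float = 0.5,
                         overhead: float = 1.2) -> float:
    """Very rough memory footprint of a quantized model, in GiB."""
    return params_billions * 1e9 * bytes_per_param * overhead / GIB

for name, size_b in [("8B", 8), ("32B", 32), ("70B", 70), ("120B", 120)]:
    fp = approx_footprint_gib(size_b)
    print(f"{name:>5}: ~{fp:5.1f} GiB -> {'fits' if fp <= 96 else 'too big'} in 96 GiB")
```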
How tf can a device whose only purpose is literally to be storage attached to a network be "AI"? WTF?
It could be an all-in-one box that allows for running local AI models like Llama, DeepSeek, Immich AI search and so on.
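If the hardware can take it, "local AI on the NAS" is mostly a software question. A minimal sketch of what that looks like in practice, assuming you've installed Ollama (one common local model runner) on the box and already pulled a model; the host, port, and model name are just illustrative defaults, not anything specific to this device:

```python
# Minimal sketch: ask a locally hosted model a question via Ollama's HTTP API.
# Assumes Ollama is running on the box and the model has already been pulled
# (e.g. `ollama pull llama3`); adjust the URL and model to match your setup.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port

payload = json.dumps({
    "model": "llama3",                      # any model you've pulled locally
    "prompt": "Summarize what a NAS does in one sentence.",
    "stream": False,                        # return one JSON blob, not a stream
}).encode()

req = urllib.request.Request(OLLAMA_URL, data=payload,
                             headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```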
It's a trend for NAS at the moment. They're all advertising themselves as AI.
"Network-attached storage" is a pretty reductive label for what these machines actually do nowadays. The home ones like this especially tend to end up pulling multiple duties: media server, homelab, and so on. Heck, a dedicated box for local AI projects is exactly the use case I built into my custom NAS with an old 2080 I had lying around after an upgrade, precisely so I can keep using my desktop for other things while a model is running.
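A quick sanity check I find useful for that kind of setup is confirming the dedicated GPU is actually the one doing the work while a model is loaded. A small sketch, assuming `nvidia-smi` is installed on the NAS; the query fields used here are standard, but exact availability depends on your driver version:

```python
# Check that the NAS's dedicated GPU (an old 2080 in my case) is the one busy
# with inference, leaving the desktop's GPU free for other work.
import subprocess

fields = "name,memory.used,memory.total,utilization.gpu"
out = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.strip().splitlines():
    name, used, total, util = [part.strip() for part in line.split(",")]
    print(f"{name}: {used}/{total} MiB VRAM in use, {util}% busy")
```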