this post was submitted on 25 Feb 2025
49 points (100.0% liked)

Framework Laptop Community

2872 readers
115 users here now

founded 3 years ago
[–] [email protected] 24 points 6 days ago (44 children)

New products:

  • AMD Ryzen 300 13" motherboards
  • Translucent bezels (just 13"?) and expansion cards
  • ITX desktop
  • 12" 2-in-1 convertible

Not too exciting for me, personally.

[–] [email protected] 21 points 6 days ago* (last edited 6 days ago) (41 children)

I can't recall anyone ever asking FW for a desktop PC. It's about the one modern electronics product that's already modular. And it looks like they're only selling them with mobile processors, too, which... why?

As for the convertible, there's no AMD option?

Nor did I see any details regarding pricing, and their website is 404'd to hell.

E: Got into the website, still no pricing available for the 12" convertible.

[–] [email protected] 6 points 6 days ago (17 children)

Yea, I don't see the point of the desktop either, but it sounds more like they're pitching it as an AI server for the small players.

[–] [email protected] 2 points 5 days ago* (last edited 5 days ago) (1 children)

For inference (running previously-trained models that need lots of RAM), the desktop could be useful, but I would be surprised if training anything bigger than toy examples on this hardware would make sense because I expect compute performance to be limited.

Does anyone here have recent practical experience with ROCm, and how it compares with the far-more-dominant CUDA? I would imagine compatibility is much better now that most models are built on PyTorch, which is supported, but how does performance compare to a dedicated Nvidia GPU?
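For what it's worth, the ROCm builds of PyTorch reuse the `torch.cuda` namespace, so most CUDA-targeting code runs unchanged; you can tell the backends apart via `torch.version.hip`, which is a version string on ROCm builds and `None` on CUDA builds. A minimal probe (a sketch, assuming a recent PyTorch; `gpu_backend` is just an illustrative helper name):

```python
def gpu_backend():
    """Report which GPU backend this PyTorch install would use."""
    try:
        import torch
    except ImportError:
        return "no-torch"          # PyTorch not installed at all
    if not torch.cuda.is_available():
        return "cpu"               # no usable GPU device found
    # ROCm builds expose the GPU through torch.cuda but set torch.version.hip
    if getattr(torch.version, "hip", None):
        return "rocm"
    return "cuda"

if __name__ == "__main__":
    print(gpu_backend())
```

Because of this aliasing, compatibility problems tend to show up at the kernel/performance level (unsupported ops falling back, missing tuned kernels) rather than as outright API breakage.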

[–] [email protected] 2 points 5 days ago (1 children)

ROCm is complete garbage. AMD announces at an event every year that "PyTorch works now!" and it never does.

ZLUDA is supposedly a good alternative to ROCm, but I haven't tried it.

[–] [email protected] 1 points 5 days ago* (last edited 5 days ago)

Thanks for the comment. I've seen similar claims, but I wasn't seeing anyone use AMD GPUs for AI unless they were somehow incentivized by AMD, which made me suspicious.

In principle, more competition in the AI hardware market would be amazing, and Nvidia GPUs do feel overpriced, but I personally don't want to deal with the struggles of early adoption.
