this post was submitted on 11 Jun 2023
Furry Technologists
I have a 6800 XT and it works well for Stable Diffusion under Arch Linux. I can also run local LLMs, but the last time I worked with LLMs on the GPU I had to do a bit of tweaking and install ROCm forks of things like bitsandbytes.
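For anyone attempting the same setup, a minimal sketch of sanity-checking that a ROCm build of PyTorch actually sees the card before layering Stable Diffusion or LLM tooling on top. The ROCm version in the wheel index URL is an assumption (check the PyTorch install page for the current one), and whether you need the GFX override depends on your exact card.

```python
# Minimal sketch, assuming PyTorch was installed from a ROCm wheel index, e.g.:
#   pip install torch --index-url https://download.pytorch.org/whl/rocm5.6
# (the ROCm version in that URL is an assumption; use whatever the PyTorch
# install page currently lists for your ROCm release).
import torch

# ROCm builds of PyTorch expose the AMD GPU through the CUDA API surface,
# so torch.cuda.* is the right check even on Radeon hardware.
print(torch.cuda.is_available())       # True if the GPU is visible to PyTorch
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # e.g. "AMD Radeon RX 6800 XT"

# Some RDNA2 cards need the environment variable
#   HSA_OVERRIDE_GFX_VERSION=10.3.0
# set before launch so ROCm libraries treat them as supported; the 6800 XT
# (gfx1030) usually works without it, but it's worth knowing about.
```

8-bit loading via bitsandbytes is the part that still tends to need a ROCm fork rather than the upstream package, which is the tweaking mentioned above.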