Never had issues with Nvidia :p... feels like I'm the only one
linuxmemes
Hint: :q!
Community rules
1. Follow the site-wide rules
- Instance-wide TOS: https://legal.lemmy.world/tos/
- Lemmy code of conduct: https://join-lemmy.org/docs/code_of_conduct.html
2. Be civil
- Understand the difference between a joke and an insult.
- Do not harass or attack users for any reason. This includes using blanket terms, like "every user of thing".
- Don't get baited into back-and-forth insults. We are not animals.
- Leave remarks of "peasantry" to the PCMR community. If you dislike an OS/service/application, attack the thing you dislike, not the individuals who use it. Some people may not have a choice.
- Bigotry will not be tolerated.
3. Post Linux-related content
- Including Unix and BSD.
- Non-Linux content is acceptable as long as it makes a reference to Linux. For example, the poorly made mockery of sudo in Windows.
- No porn, no politics, no trolling or ragebaiting.
4. No recent reposts
- Everybody uses Arch btw, can't quit Vim, <loves/tolerates/hates> systemd, and wants to interject for a moment. You can stop now.
5. 🇬🇧 Language/язык/Sprache
- This is primarily an English-speaking community. 🇬🇧🇦🇺🇺🇸
- Comments written in other languages are allowed.
- The substance of a post should be comprehensible for people who only speak English.
- Titles and post bodies written in other languages will be allowed, but only as long as the above rule is observed.
6. (NEW!) Regarding public figures
We all have our opinions, and certain public figures can be divisive. Keep in mind that this is a community for memes and light-hearted fun, not for airing grievances or leveling accusations.
- Keep discussions polite and free of disparagement.
- We are never in possession of all of the facts. Defamatory comments will not be tolerated.
- Discussions that get too heated will be locked and offending comments removed.
Please report posts and comments that break these rules!
Important: never execute code or follow advice that you don't understand or can't verify, especially here. The word of the day is credibility. This is a meme community -- even the most helpful comments might just be shitposts that can damage your system. Be aware, be smart, don't remove France.
It's not just you. Perhaps it depends on the distro?
I just had to click around a little when setting up Ubuntu 22.04 and it was done.
I currently use Pop!_OS, and that just came with them. Even then, with most other distros I tried, it was one command or one click in the package manager and done
I know the open source ones are a lot more finicky so maybe also depends on what you get :3
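For context, the "one command" on Ubuntu-family distros usually looks something like this (a sketch; tooling and package names vary by distro, and the driver series number below is just an example):

```shell
# List the driver packages the distro recommends for the detected GPU
ubuntu-drivers devices

# Install the recommended proprietary NVIDIA driver in one shot
sudo ubuntu-drivers autoinstall

# Or pin a specific driver series instead (535 is an example, not a recommendation)
sudo apt install nvidia-driver-535
```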
Zorin comes with the Nvidia drivers if you want them
It's mostly when you're trying to optimize for power on a non-standard distro. By default they're kind of a power hog, but you can sort of turn off the GPU when not in use; it's just finicky because Nvidia doesn't want open source drivers that can go that low level. Thankfully I don't have to worry about it anymore after getting a non-Nvidia laptop for my latest daily driver.
Funny thing is... I was gonna get my PC with an AMD card, but because the one I wanted was out of stock, I got upgraded (depending on how you want to look at it) to an Nvidia one :3
I may go AMD next time I swap it, but as I've not had any problems as of yet, I'm not in a major rush
My advice is to generally opt for integrated graphics on mobile, unless you absolutely need a dedicated GPU. I did on my last computer (training ML models can often be sped up with CUDA cores), but the trade-off was it breaking three times when updating my Nvidia drivers (had to chroot in and manually update, a huge pain to deal with), so I specifically went away from Nvidia drivers on my latest laptop.
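For anyone who ends up in the same spot, the chroot-and-reinstall recovery mentioned above usually goes roughly like this from a live USB. This is a sketch only: the device name, partition layout, and driver package are all assumptions you must adjust for your own system.

```shell
# Mount the broken root filesystem (replace /dev/nvme0n1p2 with YOUR root partition)
sudo mount /dev/nvme0n1p2 /mnt

# Bind-mount the virtual filesystems the chroot will need
for d in dev proc sys; do sudo mount --bind "/$d" "/mnt/$d"; done

# Enter the broken system (run interactively; the commands below are
# typed inside the chroot shell, not executed as part of this script)
sudo chroot /mnt

# Inside the chroot: reinstall the driver and rebuild the initramfs
# (Ubuntu/Debian-style commands; driver series 535 is just an example)
apt install --reinstall nvidia-driver-535
update-initramfs -u
```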
Same here. I've always grabbed the latest drivers from the Nvidia page and installed the .run file manually from the command line. From there, everything just works.
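The manual .run workflow tends to look like this (a sketch: the installer wants the display server stopped and kernel headers installed, and the version number below is a hypothetical placeholder, not a recommendation):

```shell
# Switch to a text console so X/Wayland isn't holding the GPU
sudo systemctl isolate multi-user.target

# Hypothetical version string; check nvidia.com for the current one
VER=550.54.14
chmod +x "NVIDIA-Linux-x86_64-${VER}.run"
sudo "./NVIDIA-Linux-x86_64-${VER}.run"

# Return to the graphical session once the installer finishes
sudo systemctl isolate graphical.target
```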
As long as I revert to the open source driver before doing major OS upgrades, I haven't had issues in years either. Last time I tried AMD, though, it was a shit show.
I have a better one. Installing ATI drivers mid 2000s.
Adjusting for overscan in the 2000s....
Can I ask for help here?
I've got 3 displays, right...a 1080p75 and a 4k60/444 on my Nvidia GeForce 1660, and a 1080p60 on my onboard graphics (AMD).
It works reasonably under X11, but I can't get 4k60 (only 30) in Wayland. And I'm not really sure I've got 4:4:4 either. prime-select seems to keep forgetting my setting in Wayland, too.
I'm using Tumbleweed with Plasma as my desktop.
I think it's because of the mismatched refresh rates, and I think NVIDIA is working on a fix, but that may be outdated info I'm remembering. NVIDIA has said they are committed to fixing the remaining issues with Wayland support.
Not the right place to ask. Try the official forums of your distro, or one of the many Linux communities on Lemmy.
4k60/444
Is that HDR? I can tell you right now that HDR is still experimental on all Wayland compositors (Plasma seems to be the farthest along, but still not reliable), and will never be implemented in X11.
Not quite HDR, similar but different.
4:4:4 refers to chroma subsampling: essentially, how much bandwidth is allocated to chroma versus luma. 4:4:4 allows each pixel in a 4x2 array to be a unique color, which isn't possible with 4:2:2 or 4:2:0.
It's a feature you really want when using a 4k TV as a monitor (as I am), because without it, text can be very fuzzy and difficult to read, especially in certain color combinations (e.g. red-on-black, as Konsole will do when there's an error).
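As a rough illustration of what the J:a:b notation means for a 4x2 pixel block (my own sketch of the arithmetic, not from any spec text):

```python
# J:a:b chroma subsampling over a J-wide, 2-row pixel block:
#   J = width of the reference block,
#   a = chroma samples in the first row,
#   b = additional chroma samples in the second row.

def chroma_samples(j: int, a: int, b: int) -> int:
    """Total chroma samples available in a j-wide, 2-row pixel block."""
    return a + b

def pixels_per_chroma(j: int, a: int, b: int) -> float:
    """How many pixels share one chroma sample (1.0 = every pixel unique)."""
    return (j * 2) / chroma_samples(j, a, b)

for j, a, b in [(4, 4, 4), (4, 2, 2), (4, 2, 0)]:
    print(f"{j}:{a}:{b} -> {chroma_samples(j, a, b)} chroma samples per {j}x2 block, "
          f"{pixels_per_chroma(j, a, b):g} pixel(s) per chroma sample")
```

At 4:4:4 every pixel gets its own chroma sample, so sharp color edges survive; at 4:2:0 four pixels share one, which is exactly what smears red-on-black terminal text.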
I remember around 15 years ago I was excited to get my first computer with a dedicated graphics card, a laptop with Nvidia Optimus. It was also around the time I was just beginning to get into Linux. I found an Ubuntu forum post with detailed instructions on installing Ubuntu and setting it up properly on that exact laptop, so I tried to follow that.
It didn't help that I was unfamiliar with using the terminal at the time. But even so, this was before tools like Bumblebee were in a usable state (is Bumblebee still the preferred way to use Optimus?). I remember getting to the part about graphics switching and seeing some messy, confusing hack for it. I don't remember the specifics, but I think it involved importing a script and using diff to patch something. And I think all it did was disable the very GPU I was looking forward to trying out.
I jumped back and forth between distros and Windows 7 a lot at that time. But it was such a shitty experience, all because of Nvidia, that I have never purchased any of their products since then. I've owned a lot of computers in that time, and I'm just one lost customer. I hope Nvidia looks at AMD sales and wonders how many of them are users that Nvidia lost because of things like that.
Cudas *
Honestly, all it takes these days is reading the news.