Sure, but when you jump into the middle of two other people’s conversation to say you disagree, it’s not up to them to automatically know that you’re actually disagreeing with something different from what they’re discussing lol
Nvidia AI GPUs don’t have <16GB of RAM. They have 48GB+, and some have over 180GB. They were talking about desktop GPUs, the RTX series. It makes no sense to congratulate Chinese GPU makers for having 16GB of RAM while being angry at Nvidia for having 188GB - not a typo, one hundred and eighty-eight GB of RAM - like their H100 NVL.
The <16GB complaint is a common and long-standing one about their desktop GPUs.
And VRAM limitations almost certainly don’t apply to Chinese GPUs: the process node is the only limitation in China’s manufacturing capability.
I would absolutely not be surprised if they have a 256GB+ card that is equal to an 8-year-old Nvidia/AMD card at 1/5th the price and just dump more VRAM in… for the same reason that GPUs are better than CPUs for models, many GPUs are better than a single fast GPU (rough sketch below).
And that’s still only barely relevant because, as I said previously, workstation GPUs etc. don’t need that - there are plenty of workloads that fit the bill for a card from 8 years ago, and they never mentioned anything about large cards, ML workloads, or gaming.
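To make that parallelism point concrete, here’s a rough sketch - illustrative only, assuming PyTorch, and the helper name data_parallel_forward is made up - of splitting one batch across however many GPUs happen to be in the box. Each replica works on its slice at the same time, which is the whole reason a pile of modest cards can stand in for one fast one.

```python
# Rough sketch, not a benchmark: split a batch across every visible GPU and let
# each replica process its own slice. Assumes PyTorch; falls back to CPU if no CUDA.
import copy
import torch
import torch.nn as nn

def data_parallel_forward(model: nn.Module, batch: torch.Tensor) -> torch.Tensor:
    if torch.cuda.is_available():
        devices = [f"cuda:{i}" for i in range(torch.cuda.device_count())]
    else:
        devices = ["cpu"]
    chunks = batch.chunk(len(devices))  # one slice of the batch per device
    # One replica per device; CUDA kernel launches are asynchronous, so the
    # replicas can crunch their slices concurrently.
    partial = [copy.deepcopy(model).to(d)(c.to(d)) for d, c in zip(devices, chunks)]
    return torch.cat([p.cpu() for p in partial])  # gathering syncs each device

if __name__ == "__main__":
    toy = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 16))
    print(data_parallel_forward(toy, torch.randn(64, 1024)).shape)  # torch.Size([64, 16])
```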
They specifically pointed out that Nvidia GPUs have less than 16GB of RAM on their lower- and mid-end cards. They’re clearly talking about desktop cards, because this is only ever a complaint about desktop cards, as I just pointed out. Is 188GB less than 16GB? No.
You jumped into the middle of someone else’s conversation and misunderstood what was being discussed. Take the L and move along.
“They specifically pointed out that Nvidia GPUs have less than 16GB of RAM on their lower- and mid-end cards”
… as a bracket at the end of their comment indicating their distaste for Nvidia’s shitty practices
You always argue in bad faith, and whilst downvotes don’t “mean” anything, they do at the very least prove that something you’re doing or saying is not likeable… take that on board, adjust your tone, and admit when you’re wrong (which is most of the time)
And why would that shitty practice be relevant in this discussion? Because they were talking about desktop GPUs.
Take your own advice. You’re bringing up downvotes to try to support your mistake - that’s an obviously bad-faith argument. The irony of what you’re saying is astounding.
I’m saying that 8GB GPUs are still useful as workload accelerators in workstations etc.
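For what it’s worth, here’s the back-of-envelope version of that claim - a tiny sketch where the numbers (2 bytes per parameter for fp16, a rough 1.5x allowance for activations) and the fits_in_vram name are all my own illustrative assumptions, not measurements. A mid-sized model sits comfortably inside 8GB while a 7B-class model doesn’t.

```python
# Back-of-envelope sketch with illustrative assumptions, not measurements:
# weights at 2 bytes/param (fp16) plus a rough 1.5x allowance for activations.
def fits_in_vram(params_millions: float,
                 bytes_per_param: float = 2.0,
                 activation_overhead: float = 1.5,
                 vram_gb: float = 8.0) -> bool:
    weights_gb = params_millions * 1e6 * bytes_per_param / 1e9
    return weights_gb * activation_overhead <= vram_gb

print(fits_in_vram(1500))  # ~1.5B params in fp16 -> True, fits an 8GB card
print(fits_in_vram(7000))  # ~7B params in fp16 -> False, needs a bigger card
```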
Cool, you’re not the original person I replied to.
Right, and this is a public message board on which multiple people can carry on a conversation
Sure, but when you jump into the middle of two other people’s conversation to say you disagree, it’s not up to them to automatically know that you’re actually disagreeing with something different from what they’re discussing lol
That is literally what you’re discussing… they never said anything about gaming, they just said GPUs… just admit you made an assumption, mate
Nvidia AI GPUs don’t have <16GB of RAM. They have 48GB+, and some have over 180GB. They were talking about desktop GPUs, the RTX series. It makes no sense to congratulate Chinese GPU makers for having 16GB of RAM while being angry at Nvidia for having 188GB - not a typo, one hundred and eighty-eight GB of RAM - like their H100 NVL.
The <16GB complaint is a common and long-standing one about their desktop GPUs.
There was no mention of desktop, OR RTX