NVIDIA is full of shit
-
they pay because AMD (or any other for that matter) has no product to compete with a 5080 or 5090
That’s exactly it, they have no competition at the high end
-
Because they choose not to go full idiot, though. They could make their top-line cards competitive if they crammed enough into the pipeline and required a dedicated PSU, but that's not where their product line is aimed. That's why it's smart.
For reference: AMD has the most deployed GPUs on the planet as of right now. There's a reason it's in every gaming console except the Switch 1/2, and why OpenAI just partnered with them for chips. The goal shouldn't just be to make a product that churns out results at the cost of everything else, but to be cost-effective and efficient. Nvidia fails at that on every level.
this openai partnership really stands out, because the server world is dominated by nvidia, even more than in consumer cards.
-
My mind is still blown on why people are so interested in spending 2x the cost of the entire machine they are playing on AND a hefty power utility bill to run these awful products from Nvidia. Generational improvements are minor on the performance side, and fucking AWFUL on the product and efficiency side. You'd think people would have learned their lessons a decade ago.
If you're on Windows it's hard to recommend anything else. Nvidia has DLSS supported in basically every game. For recent games there's the new transformer DLSS. Add to that ray reconstruction, superior ray tracing, and a steady stream of new features. That's the state of the art, and if you want it you gotta pay Nvidia. AMD is about 4 years behind Nvidia in terms of features. Intel is not much better. The people who really care about advancements in graphics and derive joy from that are all going to buy Nvidia because there's no competition.
-
this openai partnership really stands out, because the server world is dominated by nvidia, even more than in consumer cards.
Yup. You want a server? Dell just plain doesn't offer anything but Nvidia cards. You want to build your own? The GPGPU stuff like zluda is brand new and not really supported by anyone. You want to participate in the development community, you buy Nvidia and use CUDA.
-
this openai partnership really stands out, because the server world is dominated by nvidia, even more than in consumer cards.
Actually... not true. Nvidia recently became bigger in the DC because of their terrible inference cards being bought up, but AMD overtook Intel on chips with all major cloud platforms last year, and their Xilinx chips are slowly overtaking the sales of regular CPUs for special-purpose processing. By the end of this year, I bet AMD will be the most deployed brand in datacenters globally. FPGA is the only path forward in the architecture world at this point for speed and efficiency in single-purpose processing. Nvidia doesn't have a competing product.
-
If you're on Windows it's hard to recommend anything else. Nvidia has DLSS supported in basically every game. For recent games there's the new transformer DLSS. Add to that ray reconstruction, superior ray tracing, and a steady stream of new features. That's the state of the art, and if you want it you gotta pay Nvidia. AMD is about 4 years behind Nvidia in terms of features. Intel is not much better. The people who really care about advancements in graphics and derive joy from that are all going to buy Nvidia because there's no competition.
First, DLSS is supported on Linux.
Second, DLSS is kinda bullshit. The article goes into details that are fairly accurate.
Lastly, AMD is at parity with Nvidia on features. You can see my other comments, but AMD's goal isn't selling cards for gamers. Especially ones that require an entire dedicated PSU to power them.
-
Actually... not true. Nvidia recently became bigger in the DC because of their terrible inference cards being bought up, but AMD overtook Intel on chips with all major cloud platforms last year, and their Xilinx chips are slowly overtaking the sales of regular CPUs for special-purpose processing. By the end of this year, I bet AMD will be the most deployed brand in datacenters globally. FPGA is the only path forward in the architecture world at this point for speed and efficiency in single-purpose processing. Nvidia doesn't have a competing product.
we're talking GPUs, idk why you're bringing FPGA and CPUs in the mix
-
Yup. You want a server? Dell just plain doesn't offer anything but Nvidia cards. You want to build your own? The GPGPU stuff like zluda is brand new and not really supported by anyone. You want to participate in the development community, you buy Nvidia and use CUDA.
yeah, I helped raise hw requirements for two servers recently, an alternative to nvidia wasn't even on the table
-
My mind is still blown on why people are so interested in spending 2x the cost of the entire machine they are playing on AND a hefty power utility bill to run these awful products from Nvidia. Generational improvements are minor on the performance side, and fucking AWFUL on the product and efficiency side. You'd think people would have learned their lessons a decade ago.
But but but but but my shadows look 3% more realistic now!
-
First, DLSS is supported on Linux.
Second, DLSS is kinda bullshit. The article goes into details that are fairly accurate.
Lastly, AMD is at parity with Nvidia on features. You can see my other comments, but AMD's goal isn't selling cards for gamers. Especially ones that require an entire dedicated PSU to power them.
Don't you mean NVidia's goal isn't selling cards for gamers?
-
My mind is still blown on why people are so interested in spending 2x the cost of the entire machine they are playing on AND a hefty power utility bill to run these awful products from Nvidia. Generational improvements are minor on the performance side, and fucking AWFUL on the product and efficiency side. You'd think people would have learned their lessons a decade ago.
Cause numbers go brrrrrrrrr
-
My mind is still blown on why people are so interested in spending 2x the cost of the entire machine they are playing on AND a hefty power utility bill to run these awful products from Nvidia. Generational improvements are minor on the performance side, and fucking AWFUL on the product and efficiency side. You'd think people would have learned their lessons a decade ago.
Once the 9070 dropped, all arguments for Nvidia stopped being worthy of consideration outside of very niche/fringe needs.
-
Don't you mean NVidia's goal isn't selling cards for gamers?
No. AMD. See my other comments in this thread. Though they are in every major gaming console, the bulk of AMD sales are aimed at the datacenter.
-
Because they choose not to go full idiot, though. They could make their top-line cards competitive if they crammed enough into the pipeline and required a dedicated PSU, but that's not where their product line is aimed. That's why it's smart.
For reference: AMD has the most deployed GPUs on the planet as of right now. There's a reason it's in every gaming console except the Switch 1/2, and why OpenAI just partnered with them for chips. The goal shouldn't just be to make a product that churns out results at the cost of everything else, but to be cost-effective and efficient. Nvidia fails at that on every level.
Then why does Nvidia have so much more money?
-
Yup. You want a server? Dell just plain doesn't offer anything but Nvidia cards. You want to build your own? The GPGPU stuff like zluda is brand new and not really supported by anyone. You want to participate in the development community, you buy Nvidia and use CUDA.
Fortunately, even that tide is shifting.
I've been talking to Dell about it recently, they've just announced new servers (releasing later this year) which can have either Nvidia's B300 or AMD's MI355x GPUs. Available in a hilarious 19" 10RU air-cooled form factor (XE9685), or ORv3 3OU water-cooled (XE9685L).
It's the first time they've offered a system using both CPU and GPU from AMD - previously they had some Intel CPU / AMD GPU options, and AMD CPU / Nvidia GPU, but never before AMD / AMD.
With AMD promising release day support for PyTorch and other popular programming libraries, we're also part-way there on software. I'm not going to pretend like needing CUDA isn't still a massive hump in the road, but "everyone uses CUDA" <-> "everyone needs CUDA" is one hell of a chicken-and-egg problem which isn't getting solved overnight.
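For in-house code, the PyTorch side of that is less painful than it sounds. Here's a minimal sketch (mine, not from Dell or AMD; it assumes a PyTorch build for either CUDA or ROCm is installed) showing why the same device-agnostic code can run on either vendor's GPUs: PyTorch's ROCm builds expose AMD hardware through the same torch.cuda API.

```python
# Hypothetical sketch: PyTorch's ROCm builds surface AMD GPUs through the same
# torch.cuda API that the CUDA builds use, so device-agnostic code runs on either.
import torch

# Picks the GPU if one is visible (CUDA on Nvidia, ROCm/HIP on AMD), else CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(1024, 1024).to(device)
x = torch.randn(64, 1024, device=device)

with torch.no_grad():
    y = model(x)

print(f"ran on {device}, output shape: {tuple(y.shape)}")
```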
Realistically facing that kind of uphill battle, AMD is just going to have to compete on price - they're quoting 40% performance/dollar improvement over Nvidia for these upcoming GPUs, so perhaps they are - and trying to win hearts and minds with rock-solid driver/software support so people who do have the option (ie in-house code, not 3rd-party software) look to write it with not-CUDA.
To note, this is the 3rd generation of the MI3xx series (MI300, MI325, now MI350/355). I think it might be the first one to make the market splash that AMD has been hoping for.
-
Then why does Nvidia have so much more money?
See the title of this very post you're responding to. No, I'm not OP lolz
-
See the title of this very post you're responding to. No, I'm not OP lolz
He's not OP. He's just another person...
-
Fortunately, even that tide is shifting.
I've been talking to Dell about it recently, they've just announced new servers (releasing later this year) which can have either Nvidia's B300 or AMD's MI355x GPUs. Available in a hilarious 19" 10RU air-cooled form factor (XE9685), or ORv3 3OU water-cooled (XE9685L).
It's the first time they've offered a system using both CPU and GPU from AMD - previously they had some Intel CPU / AMD GPU options, and AMD CPU / Nvidia GPU, but never before AMD / AMD.
With AMD promising release day support for PyTorch and other popular programming libraries, we're also part-way there on software. I'm not going to pretend like needing CUDA isn't still a massive hump in the road, but "everyone uses CUDA" <-> "everyone needs CUDA" is one hell of a chicken-and-egg problem which isn't getting solved overnight.
Realistically facing that kind of uphill battle, AMD is just going to have to compete on price - they're quoting 40% performance/dollar improvement over Nvidia for these upcoming GPUs, so perhaps they are - and trying to win hearts and minds with rock-solid driver/software support so people who do have the option (ie in-house code, not 3rd-party software) look to write it with not-CUDA.
To note, this is the 3rd generation of the MI3xx series (MI300, MI325, now MI350/355). I think it might be the first one to make the market splash that AMD has been hoping for.
AMD’s also apparently unifying their server and consumer gpu departments for RDNA5/UDNA iirc, which I’m really hoping helps with this too
-
I know Dell has been doing a lot of AMD CPUs recently, and those have definitely been beating Intel, so hopefully this continues. But I'll believe it when I see it. These things rarely pan out in terms of price/performance and support.
-
See the title of this very post you're responding to. No, I'm not OP lolz
They have so much money because they're full of shit? Doesn't make much sense.
-