NVIDIA is full of shit

Technology
  • See the title of this very post you're responding to. No, I'm not OP lolz

    He's not OP. He's just another person...

  • Fortunately, even that tide is shifting.

    I've been talking to Dell about it recently, they've just announced new servers (releasing later this year) which can have either Nvidia's B300 or AMD's MI355x GPUs. Available in a hilarious 19" 10RU air-cooled form factor (XE9685), or ORv3 3OU water-cooled (XE9685L).

    It's the first time they've offered a system using both CPU and GPU from AMD - previously they had some Intel CPU / AMD GPU options, and AMD CPU / Nvidia GPU, but never before AMD / AMD.

    With AMD promising release day support for PyTorch and other popular programming libraries, we're also part-way there on software. I'm not going to pretend like needing CUDA isn't still a massive hump in the road, but "everyone uses CUDA" <-> "everyone needs CUDA" is one hell of a chicken-and-egg problem which isn't getting solved overnight.

    Realistically, facing that kind of uphill battle, AMD is just going to have to compete on price - they're quoting a 40% performance-per-dollar improvement over Nvidia for these upcoming GPUs, so perhaps they are - and try to win hearts and minds with rock-solid driver/software support, so that people who do have the option (i.e. in-house code, not 3rd-party software) look to write it with not-CUDA.

    To note, this is the 3rd generation of the MI3xx series (MI300, MI325, now MI350/355). I think it might be the first one to make the market splash that AMD has been hoping for.

    AMD’s also apparently unifying their server and consumer GPU departments for RDNA5/UDNA, iirc, which I’m really hoping helps with this too.
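The 40% perf/dollar claim a couple of paragraphs up is easy to sanity-check with toy numbers. A minimal sketch - every throughput and price below is a made-up placeholder, not a real B300/MI355x figure; only the 40% target comes from the thread:

```python
# Toy sanity check of a "40% better performance per dollar" claim.
# All throughput and price numbers here are hypothetical placeholders.
def perf_per_dollar(throughput: float, price: float) -> float:
    return throughput / price

nvidia = perf_per_dollar(throughput=100.0, price=40000.0)  # baseline card
amd = perf_per_dollar(throughput=98.0, price=28000.0)      # slightly slower, much cheaper

improvement = amd / nvidia - 1.0
print(f"perf/$ improvement: {improvement:.0%}")  # prints "perf/$ improvement: 40%"
```

The point being: you don't need to win on raw throughput to win on perf/dollar - a slightly slower part at a much lower price clears the bar.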

  • I know Dell has been doing a lot of AMD CPUs recently, and those have definitely been beating Intel, so hopefully this continues. But I'll believe it when I see it. These things often don't pan out in terms of price/performance and support.

  • See the title of this very post you're responding to. No, I'm not OP lolz

    They have so much money because they're full of shit? Doesn't make much sense.

  • They have so much money because they're full of shit? Doesn't make much sense.

    Stock isn't money in the bank.

  • Stock isn't money in the bank.

    No one said anything about stock.

  • This post did not contain any content.

    Have a 2070s. Been thinking for a while now my next card will be AMD. I hope they get back into the high end cards again 😕

  • But but but but but my shadows look 3% more realistic now!

    The best part is, for me, ray tracing looks great. When I'm standing there and slowly looking around.

    When I'm running and gunning and shit's exploding, I don't think the human eye is even capable of perceiving the difference between raster and ray tracing at that point.

  • This post did not contain any content.

    And it only took 15 years to figure it out?

  • Have a 2070s. Been thinking for a while now my next card will be AMD. I hope they get back into the high end cards again 😕

    The 9070 XT is excellent and FSR 4 actually beats DLSS 4 in some important ways, like disocclusion.

  • The 9070 XT is excellent and FSR 4 actually beats DLSS 4 in some important ways, like disocclusion.

    Concur.

    I went from a 2080 Super to the RX 9070 XT and it flies. Coupled with a 9950X3D, I still feel a little bit like the GPU might be the bottleneck, but it doesn't matter. It plays everything I want at way more frames than I need (240 Hz monitor).

    E.g., Rocket League went from struggling to keep 240 fps at lowest settings, to 700+ at max settings. Pretty stark improvement.
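For a sense of how big that jump is in frame-time terms, here are the same two FPS figures converted into per-frame budgets (only the 240 and 700 come from the comment):

```python
# Convert the quoted FPS numbers into per-frame time budget (milliseconds).
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

before = frame_time_ms(240)  # ~4.17 ms per frame, lowest settings (old card)
after = frame_time_ms(700)   # ~1.43 ms per frame, max settings (9070 XT)

print(f"{before:.2f} ms -> {after:.2f} ms per frame")  # 4.17 ms -> 1.43 ms per frame
```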

  • Because they choose not to go full idiot, though. They could make top-line cards that compete if they slammed enough into the pipeline and required a dedicated PSU, but that's not where their product line intends to go. That's why it's smart.

    For reference: AMD has the most deployed GPUs on the planet as of right now. There's a reason why it's in every gaming console except the Switch 1/2, and why OpenAI just partnered with them for chips. The goal shouldn't just be making a product that churns out results at the cost of everything else; it should be cost-effective and efficient. Nvidia fails at that on every level.

    Unfortunately, this partnership with OpenAI means they've sided with evil and I won't spend a cent on their products anymore.

  • This post did not contain any content.

    I wish I had the money to change to AMD

  • I wish I had the money to change to AMD

    This is a sentence I never thought I would read.

    ^(AMD used to be cheap)^

  • This post did not contain any content.

    It covers the breadth of problems pretty well, but I feel compelled to point out that there are a few times where things are misrepresented in this post e.g.:

    Newegg selling the ASUS ROG Astral GeForce RTX 5090 for $3,359 (MSRP: $1,999)

    eBay Germany offering the same ASUS ROG Astral RTX 5090 for €3,349.95 (MSRP: €2,229)

    The MSRP for a 5090 is $2k, but the MSRP for the 5090 Astral -- a top-end card being used for overclocking world records -- is $2.8k. I couldn't quickly find the European MSRP but my money's on it being more than 2.2k euro.

    If you’re a creator, CUDA and NVENC are pretty much indispensable, or editing and exporting videos in Adobe Premiere or DaVinci Resolve will take you a lot longer[3]. Same for live streaming, as using NVENC in OBS offloads video rendering to the GPU for smooth frame rates while streaming high-quality video.

    NVENC isn't much of a moat right now, as both Intel and AMD's encoders are roughly comparable in quality these days (including in Intel's iGPUs!). There are cases where NVENC might do something specific better (like 4:2:2 support for prosumer/professional use cases) or have better software support in a specific program, but for common use cases like streaming/recording gameplay the alternatives should be roughly equivalent for most users.

    as recently as May 2025 and I wasn’t surprised to find even RTX 40 series are still very much overpriced

    Production apparently stopped on these for several months leading up to the 50-series launch; it seems unreasonable to harshly judge the pricing of a product that hasn't had new stock for an extended period of time (of course, you can then judge either the decision to stop production or the still-elevated pricing of the 50 series).


    DLSS is, and always was, snake oil

    I personally find this take crazy given that DLSS2+ / FSR4+, when quality-biased, average visual quality comparable to native for most users in most situations - and that was with DLSS2 in 2023, not even DLSS3, let alone DLSS4 (which is markedly better on average). I don't really care how a frame is generated if it looks good enough (and doesn't come with other notable downsides like latency). This almost feels like complaining about screen-space reflections being "fake" reflections. Like, yeah, it's fake, but if the average player experience is consistently better with it than without it, then what does it matter?

    Newer manufacturing nodes are getting expensive as all fuck. If it's more cost-efficient to use some of that die area for specialized cores that can do high-quality upscaling instead of natively rendering everything with all the die space, then that's fine by me. I don't think deriding DLSS (and its equivalents like FSR and XeSS) as "snake oil" is the right takeaway. If the options are (1) spend $X on a card that outputs 60 FPS natively or (2) spend $X on a card that outputs upscaled 80 FPS at quality good enough that I can't tell it's not native, then sign me the fuck up for option #2. People less fussy about static image quality and more invested in smoothness can be perfectly happy with 100 FPS and marginally worse image quality. Not everyone is as sweaty about static image quality as some of us in the enthusiast crowd are.

    There are some fair points here about RT (though I find exclusively using path tracing for RT performance testing a little disingenuous given the performance gap), but if RT performance is the main complaint then why is the sub-heading "DLSS is, and always was, snake oil"?


    obligatory: disagreeing with some of the author's points is not the same as saying "Nvidia is great"
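    The option-1 vs option-2 trade-off above is just cost-per-frame arithmetic. A toy sketch - the $500 card price is a made-up placeholder, only the 60/80 FPS figures come from the comment:

```python
# Cost-per-frame comparison of the two options in the comment:
# same hypothetical card price, native 60 FPS vs upscaled 80 FPS.
price = 500.0  # placeholder for the "$X" in the comment

cost_per_fps_native = price / 60    # dollars per FPS, rendering natively
cost_per_fps_upscaled = price / 80  # dollars per FPS, with upscaling

print(f"native:   ${cost_per_fps_native:.2f}/FPS")    # native:   $8.33/FPS
print(f"upscaled: ${cost_per_fps_upscaled:.2f}/FPS")  # upscaled: $6.25/FPS
```

    Same money, roughly 25% cheaper per frame with upscaling - which is the whole argument for spending die area on upscaling hardware, provided the output genuinely looks close enough to native.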

  • This post did not contain any content.

    Is it because it's not how they make money now?

  • It covers the breadth of problems pretty well, but I feel compelled to point out that there are a few times where things are misrepresented in this post e.g.:


    I think DLSS (and FSR and so on) are great value propositions, but they become a problem when developers use them as a crutch. At the very least, your game should not need them at all to run on high-end hardware at max settings; they should be options for people on lower-end hardware to either lower settings or combine higher settings with upscaling. When they become mandatory, they stop being a value proposition, because the benefit stops being a benefit and starts being necessary just for baseline performance.

  • Once the 9070 dropped all arguments for Nvidia stopped being worthy of consideration outside of very niche/fringe needs.

    Got my 9070 XT at retail (well, retail + VAT, but that's retail for my country) and my entire PC cost less than a 5090.

  • they pay because AMD (or any other for that matter) has no product to compete with a 5080 or 5090

    What do you even need those graphics cards for?

    Even the best games don't require those and if they did, I wouldn't be interested in them, especially if it's an online game.

    Probably only a couple people would be playing said game with me.

  • Have a 2070s. Been thinking for a while now my next card will be AMD. I hope they get back into the high end cards again 😕

    AMD only releases high-end cards for servers and high-end workstations.
