NVIDIA is full of shit

Technology
  • This post did not contain any content.

    Have a 2070s. Been thinking for a while now my next card will be AMD. I hope they get back into the high end cards again 😕

  • But but but but but my shadows look 3% more realistic now!

    The best part is, for me, ray tracing looks great. When I'm standing there and slowly looking around.

    When I'm running and gunning and shit's exploding, I don't think the human eye is even capable of comprehending the difference between raster and ray tracing at that point.

  • This post did not contain any content.

    And it only took 15 years to figure it out?

  • Have a 2070s. Been thinking for a while now my next card will be AMD. I hope they get back into the high end cards again 😕

    The 9070 XT is excellent and FSR 4 actually beats DLSS 4 in some important ways, like disocclusion.

  • The 9070 XT is excellent and FSR 4 actually beats DLSS 4 in some important ways, like disocclusion.

    Concur.

    I went from a 2080 Super to the RX 9070 XT and it flies. Coupled with a 9950X3D, I still feel a little bit like the GPU might be the bottleneck, but it doesn't matter. It plays everything I want at way more frames than I need (240 Hz monitor).

    E.g., Rocket League went from struggling to keep 240 fps at lowest settings, to 700+ at max settings. Pretty stark improvement.

  • Because they choose not to go full idiot, though. They could make their top-end cards compete if they crammed enough into the pipeline and required a dedicated PSU, but that's not where their product line is aimed. That's why it's smart.

    For reference: AMD has the most deployed GPUs on the planet right now. There's a reason it's in every gaming console except the Switch 1/2, and why OpenAI just partnered with them for chips. The goal shouldn't just be making a product that churns out results at the cost of everything else, but making one that's cost-effective and efficient. Nvidia fails at that on every level.

    Unfortunately, this partnership with OpenAI means they've sided with evil and I won't spend a cent on their products anymore.

  • This post did not contain any content.

    I wish I had the money to change to AMD

  • I wish I had the money to change to AMD

    This is a sentence I never thought I would read.

    ^(AMD used to be cheap)^

  • This post did not contain any content.

    It covers the breadth of problems pretty well, but I feel compelled to point out that there are a few places where things are misrepresented in this post, e.g.:

    Newegg selling the ASUS ROG Astral GeForce RTX 5090 for $3,359 (MSRP: $1,999)

    eBay Germany offering the same ASUS ROG Astral RTX 5090 for €3,349.95 (MSRP: €2,229)

    The MSRP for a 5090 is $2k, but the MSRP for the 5090 Astral -- a top-end card being used for overclocking world records -- is $2.8k. I couldn't quickly find the European MSRP but my money's on it being more than 2.2k euro.

    If you’re a creator, CUDA and NVENC are pretty much indispensable, or editing and exporting videos in Adobe Premiere or DaVinci Resolve will take you a lot longer[3]. Same for live streaming, as using NVENC in OBS offloads video rendering to the GPU for smooth frame rates while streaming high-quality video.

    NVENC isn't much of a moat right now, as both Intel and AMD's encoders are roughly comparable in quality these days (including in Intel's iGPUs!). There are cases where NVENC might do something specific better (like 4:2:2 support for prosumer/professional use cases) or have better software support in a specific program, but for common use cases like streaming/recording gameplay the alternatives should be roughly equivalent for most users.
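
    If you want to sanity-check that claim on your own machine, here's a minimal sketch of one way to do it, assuming an ffmpeg build with all three vendors' encoders enabled and a hypothetical gameplay.mkv test clip: encode the same source with each hardware encoder at the same bitrate and compare the outputs (by eye, or with a tool like VMAF).

    ```python
    # Minimal sketch: encode one clip with each vendor's H.264 hardware
    # encoder at the same bitrate for a side-by-side quality comparison.
    # Assumes ffmpeg was built with nvenc/amf/qsv support and that the
    # matching GPU/driver is present for each encoder you actually run;
    # "gameplay.mkv" is a placeholder clip name.
    import subprocess

    ENCODERS = {
        "nvidia": "h264_nvenc",  # NVENC
        "amd": "h264_amf",       # AMF (VCN)
        "intel": "h264_qsv",     # Quick Sync, including iGPUs
    }

    for vendor, codec in ENCODERS.items():
        subprocess.run(
            ["ffmpeg", "-y", "-i", "gameplay.mkv",
             "-c:v", codec, "-b:v", "6M",  # a typical streaming bitrate
             f"out_{vendor}.mp4"],
            check=True,
        )
    ```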

    as recently as May 2025 and I wasn’t surprised to find even RTX 40 series are still very much overpriced

    Production apparently stopped on these for several months leading up to the 50-series launch; it seems unreasonable to harshly judge the pricing of a product that hasn't had new stock for an extended period of time (of course, you can then judge either the decision to stop production or the still-elevated pricing of the 50 series).


    DLSS is, and always was, snake oil

    I personally find this take crazy, given that quality-biased DLSS 2+ / FSR 4+ average visual quality comparable to native for most users in most situations. And that was with DLSS 2 back in 2023, not even DLSS 3, let alone DLSS 4 (which is markedly better on average). I don't really care how a frame is generated if it looks good enough (and doesn't come with other notable downsides like latency). This almost feels like complaining about screen space reflections being "fake" reflections. Like yeah, it's fake, but if the average player experience is consistently better with it than without it, then what does it matter?

    Ever more advanced manufacturing nodes are getting expensive as all fuck. If it's more cost-efficient to spend some of that die area on specialized cores that do high-quality upscaling, rather than throwing all of it at native rendering, then that's fine by me. I don't think dismissing DLSS (and its equivalents like FSR and XeSS) as "snake oil" is the right takeaway. If the options are (1) spend $X on a card that outputs 60 FPS natively or (2) spend $X on a card that outputs upscaled 80 FPS at quality good enough that I can't tell it's not native, then sign me the fuck up for option #2. People less fussy about static image quality and more invested in smoothness can be perfectly happy with 100 FPS and marginally worse image quality. Not everyone is as sweaty about static image quality as some of us in the enthusiast crowd.
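
    To put rough numbers on that die-area trade-off, here's a back-of-envelope sketch. The 2/3-per-axis figure is the internal render scale typically used by DLSS/FSR "Quality" mode; the real speedup is smaller than the pixel ratio suggests, because fixed per-frame costs and the upscaling pass itself aren't free.

    ```python
    # Back-of-envelope: "Quality" upscaling renders at ~2/3 of the output
    # resolution per axis, so the GPU shades roughly (2/3)^2 = ~44% of
    # the pixels each frame before the upscaler fills in the rest.
    native_pixels = 3840 * 2160             # 4K output: ~8.3M pixels
    internal_pixels = 2560 * 1440           # typical "Quality" internal render
    print(internal_pixels / native_pixels)  # ~0.44, less than half the shading work
    ```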

    There are some fair points here about RT (though I find exclusively using path tracing for RT performance testing a little disingenuous given the performance gap), but if RT performance is the main complaint, then why is the sub-heading "DLSS is, and always was, snake oil"?


    obligatory: disagreeing with some of the author's points is not the same as saying "Nvidia is great"

  • This post did not contain any content.

    Is it because it's not how they make money now?

  • I don't think dismissing DLSS (and its equivalents like FSR and XeSS) as "snake oil" is the right takeaway. If the options are (1) spend $X on a card that outputs 60 FPS natively or (2) spend $X on a card that outputs upscaled 80 FPS at quality good enough that I can't tell it's not native, then sign me the fuck up for option #2.

    I think DLSS (and FSR and so on) are great value propositions, but they become a problem when developers use them as a crutch. At the very least, your game should not need them at all to run on high-end hardware at max settings; they should then be options for people on lower-end hardware to either lower settings or combine higher settings with upscaling. When they become mandatory they stop being a value proposition, since the benefit stops being a benefit and starts just being necessary for baseline performance.

  • Once the 9070 dropped, all arguments for Nvidia stopped being worthy of consideration outside of very niche/fringe needs.

    Got my 9070 XT at retail (well, retail + VAT, but that's retail for my country) and my entire PC costs less than a 5090.

  • they pay because AMD (or any other for that matter) has no product to compete with a 5080 or 5090

    What do you even need those graphics cards for?

    Even the best games don't require those and if they did, I wouldn't be interested in them, especially if it's an online game.

    Probably only a couple people would be playing said game with me.

  • Have a 2070s. Been thinking for a while now my next card will be AMD. I hope they get back into the high end cards again 😕

    AMD only releases high-end parts for servers and high-end workstations

  • I don’t really care how a frame is generated if it looks good enough (and doesn’t come with other notable downsides like latency). This almost feels like complaining about screen space reflections being “fake” reflections. Like yeah, it’s fake, but if the average player experience is consistently better with it than without it then what does it matter?

    But it does come with increased latency. It also disrupts the artistic vision of games. With MFG you're seeing more fake frames than real frames. It's deceptive, and like snake oil in that Nvidia isn't distinguishing between fake frames and real frames. When they claim "the RTX 5070 has the same performance as the RTX 4090," that's with 3 fake frames for every real frame, which is incredibly deceptive.
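
    To make that concrete, a quick sketch with illustrative numbers:

    ```python
    # Illustrative only: with 4x multi-frame generation, one displayed
    # frame in four is actually rendered. The FPS counter quadruples,
    # but input latency still tracks the rendered rate (and in practice
    # frame generation buffers a frame, so latency gets slightly worse).
    rendered_fps = 30
    mfg_factor = 4                             # 1 real + 3 generated frames
    displayed_fps = rendered_fps * mfg_factor  # 120 on the overlay
    latency_ms = 1000 / rendered_fps           # still ~33 ms per real frame
    print(displayed_fps, round(latency_ms, 1))
    ```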

  • Got my 9070 XT at retail (well, retail + VAT, but that's retail for my country) and my entire PC costs less than a 5090.

    Yeah I got a 9070 + 9800x3d for around $1100 all-in. Couldn’t be happier with the performance. Expedition 33 running max settings at 3440x1440 and 80-90fps

  • they pay because AMD (or any other for that matter) has no product to compete with a 5080 or 5090

    I have overclocked my AMD 7900XTX as far as it will go on air alone.

    Undervolted every step on the frequency curve, cranked up the power, 100% fan duty cycles.

    At its absolute best, it trades blows with the 4090D, and is 6% slower than the RTX 4090 Founders Edition (the slowest of the stock 4090 lineup).

    The fastest AMD card is equivalent to a 4080 Super, and the next gen hasn't shown anything new.

    AMD needs a 5090-killer. Dual socket or whatever monstrosity which pulls 800W, but it needs to slap that greenbo with at least a 20-50% lead in frame rates across all titles, including raytraced. Then we'll see some serious price cuts and competition.

  • This post did not contain any content.

    Folks, ask yourselves, what game is out there that REALLY needs a 5090? If you have the money to piss away, by all means, it's your money. But let's face it, games have plateaued and VR isn't all that great.

    Nvidia's market is not you anymore. It's the massive corporations and research firms for useless AI projects or number crunching. They have more money than all gamers combined. Maybe time to go outside; me included.

  • Folks, ask yourselves, what game is out there that REALLY needs a 5090? If you have the money to piss away, by all means, it's your money. But let's face it, games have plateaued and VR isn't all that great.

    Nvidia's market is not you anymore. It's the massive corporations and research firms for useless AI projects or number crunching. They have more money than all gamers combined. Maybe time to go outside; me included.

    Cyberpunk 2077 with the VR mod is the only one I can think of. Because it’s not natively built for VR you have to render the world separately for each eye leading to a halving of the overall frame rate. And with 90 fps as the bare minimum for many people in VR you really don’t have a choice but to use the 5090.

    Yeah it’s literally only one game/mod, but that would be my use case if I could afford it.
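
    The frame-budget math shows why (rough sketch with round numbers; this ignores reprojection tricks that can hide missed frames):

    ```python
    # Rough VR budget: both eyes must render inside one headset refresh,
    # so a non-native mod roughly doubles the scene cost per frame.
    headset_hz = 90
    frame_budget_ms = 1000 / headset_hz  # ~11.1 ms per displayed frame
    per_eye_ms = frame_budget_ms / 2     # ~5.6 ms of scene work per eye
    print(round(frame_budget_ms, 1), round(per_eye_ms, 1))
    ```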

  • Cyberpunk 2077 with the VR mod is the only one I can think of. Because it’s not natively built for VR you have to render the world separately for each eye leading to a halving of the overall frame rate. And with 90 fps as the bare minimum for many people in VR you really don’t have a choice but to use the 5090.

    Yeah it’s literally only one game/mod, but that would be my use case if I could afford it.

    Also the Train Sim World series. Those games make my tower complain, and my laptop give up.
