
NVIDIA is full of shit

Technology
  • "and the drivers, for which NVIDIA has always been praised, are currently falling apart"

    What? They've been shit since HL2.

  • AMD & Intel ARC are king now. All that CUDA nonsense is just price-hiking justification.

  • Only Cyberpunk 2077 in path tracing. It's the only game I haven't played yet, because I want to run it on ultra settings. But for that amount of money, I'd better wait until the real 2077 to see that happen.

    The studio has done a great job. You have most certainly heard it already, but I am willing to say it again: the game is worth playing at whatever quality you can afford, short of stutter-level low fps. The story is so touching that it outplays the graphics completely (though I do share the desire to play it on ultra settings - I will do that one day myself).

  • Those 4% can make an RTX 5070 Ti perform at the levels of an RTX 4070 Ti Super, completely eradicating the reason you’d get an RTX 5070 Ti in the first place.

    You'd buy a 5070 Ti for a 4% increase in performance over the 4070 Ti Super you already had? Ok.

  • Bought my first AMD card last year, never looked back

    AMD’s Windows drivers are a little rough, but the open source drivers on Linux are spectacular.

  • Since when did gfx cards need to cost more than a used car?

    We are being scammed by Nvidia. They are selling stuff whose equivalent 20 years ago would have been some massive research prototype. And there would have been, like, 2 of them in an Nvidia bunker somewhere, powering Deep Thought while it calculated the meaning of life, the universe, and everything.

    3k for a gfx card. Man, my whole PC cost 500 quid and it runs all my games and PCVR just fine.

    Could it run better? Sure

    Does it need to? Not for 3 grand...

    Fuck me!.....

    I haven’t bought a GPU since my beloved Vega 64 for $400 on Black Friday 2018, and the current prices are just horrifying. I’ll probably settle with midrange next build.

  • Those 4% can make an RTX 5070 Ti perform at the levels of an RTX 4070 Ti Super, completely eradicating the reason you’d get an RTX 5070 Ti in the first place.

    You'd buy a 5070 Ti for a 4% increase in performance over the 4070 Ti Super you already had? Ok.

    They probably mean the majority of people, not 4070 Ti owners. For them, buying that 4070 Ti would be a better choice already.

  • I assume people mean 3440x1440 when they say 1440 as it’s way more common than 2560x1440.

    Most people do not use WS, as evidenced by the mixed-bag support it gets. "1440" monitors are by default understood to mean 2560x1440, since 16:9 is still considered the default by the vast majority of businesses and people alike. You may operate as if most people on 1440+ are on WS, but that's a very atypical assumption.

    Raytracing, sure, but otherwise the 4090 is actually better than the 5070 in many respects. So you’re paying a comparable price for raytracing and Windows dependency; if that is important to you, then go right ahead. Ultimately, though, my point is that there is no point in buying the insanely overpriced Nvidia offerings when you have excellent AMD offerings for a fraction of the price that don’t have all sorts of little pitfalls/compromises. The Nvidia headaches are only worth it for performance, and unless you 3-4x your investment, you’re not getting more of it. So the 5070 is moot.

    I’m not sure what you’re comparing at the end unless you meant a 9070XT which I don’t use/have and wasn’t comparing.

    I’m not sure what you’re comparing at the end unless you meant a 9070XT which I don’t use/have and wasn’t comparing.

    Sorry, I thought I read that you had the 9070 XT, which is better than the 9070 that you have. The 9070 and the 5070 are the same price and are neck and neck in performance, so the Nvidia card isn't "insanely overpriced" compared to AMD's offerings, is it? The 9070 isn't a "fraction of the price" of the equivalent Nvidia card; it's the same price.

    As you said, there are 40-series cards that are better than the 50-series cards, apart from probably the 5090, and those are cheaper than the 9070.

  • I’m not even against tricks like upscaling and such, to be honest. If it looks good, I’ll take it lol. But I do agree they don’t feel like long-term, hardened solutions vs something more like “raw performance.” And there’s no doubt there is a certain elegance to AMD’s cards.

    And there’s no doubt there is a certain elegance to AMD’s cards.

    What exactly do you mean by this?

  • @RazgrizOne @FreedomAdvocate The reason I decided on AMD after being team green nearly all my life (aka >20 years): I feel like AI frame generation and upscaling are anti-consumer, because they hide the real performance behind non-reproducible image generation. And if you look closely... this is how Nvidia gets its performance lead over AMD.

    Calling DLSS "anti consumer" is one of the dumbest things I've read about PC gaming in a long time.

  • Low-rent comment.

    First: https://www.corsair.com/us/en/explorer/gamer/gaming-pcs/rtx-5090-5080-and-5070-series-gpus-everything-you-need-to-know/

    Second: you apparently are unaware, so just search up the phrase, but as this article very clearly explains... it's shit. It's not innovative, interesting, or improving performance; it's a marketing scam. Games would run better and more efficiently if you just lowered the requirements. It's like saying you want food to taste better, and then they serve you a vegan version of it. AMD's version is technically more useful, but it's still a dumb trick.

    First: https://www.corsair.com/us/en/explorer/gamer/gaming-pcs/rtx-5090-5080-and-5070-series-gpus-everything-you-need-to-know/

    What exactly am I supposed to be looking at here? Do you think that says that the GPUs need their own PSUs? Do you think people with 50 series GPUs have 2 PSUs in their computers?

    It's not innovative, interesting, or improving performance; it's a marketing scam. Games would run better and more efficiently if you just lowered the requirements.

    DLSS isn't innovative? It's not improving performance? What on earth? Rendering a frame at a lower resolution and then using AI to upscale it to look the same or better than rendering it at full resolution isn't innovative?! Getting an extra 30fps vs native resolution isn't improving performance?! How isn't it?

    You can't just "lower the requirements" lol. What you're suggesting is make the game worse so people with worse hardware can play at max settings lol. That is absolutely absurd.

    Let me ask you this - do you think that every new game should still be being made for the PS2? PS3? Why or why not?

  • AMD is at least playing the smart game with their hardware releases, making generational leaps instead of just jacking up power requirements and clock speeds as Nvidia does. Hell, even Nvidia's latest Jetson lines are just recooked versions of years-old hardware.

    AMD is at least playing the smart game with their hardware releases, making generational leaps instead of just jacking up power requirements and clock speeds as Nvidia does.

    AMD could only do that because they were so far behind. GPU manufacturers, at least Nvidia, are approaching the limits of what they can do with current fabrication technology, short of simply throwing "more" at it. Without a breakthrough in tech, all they can really do is jack up power requirements and clock speeds. AMD will be there soon too.

  • Concur.

    I went from a 2080 Super to the RX 9070 XT and it flies. Coupled with a 9950X3D, I still feel a little bit like the GPU might be the bottleneck, but it doesn't matter. It plays everything I want at way more frames than I need (240 Hz monitor).

    E.g., Rocket League went from struggling to keep 240 fps at lowest settings, to 700+ at max settings. Pretty stark improvement.

    I went from a 2080 Super to the RX 9070 XT and it flies.

    You went from a 7-year-old GPU to a brand-new top-of-the-line one; what did you expect? That's not a fair comparison lol. It's got nothing to do with FSR4 vs DLSS4.

  • Calling DLSS "anti consumer" is one of the dumbest things I've read about PC gaming in a long time.

    @FreedomAdvocate Do you remember the time when AMD was called out for even the smallest difference from a default render? Now that Nvidia basically uses some kind of statistical guessing method, no one is allowed to call them out?
    I call them out because they have basically removed any possibility for consumers to compare other graphics cards with theirs. Or did I miss Nvidia making DLSS / frame generation and all those features available on other GPU brands?
    Do you know the AI models behind all this, or how they would perform on other hardware? Do we want to talk about how they pressure the media in exchange for access to test samples? Yes, IMHO there is a lot that is anti-consumer here...

  • Nvidia is using the "it's fake news" strategy now? My, how the mighty have fallen.

    I've said it many times, but publicly traded companies are destroying the world. The fact that they have to increase revenue every single year is not sustainable, and it leads to employees being underpaid, products that are built more cheaply, and invasive data collection to offset their previous poor decisions.

    @FreedomAdvocate Do you remember the time when AMD was called out for even the smallest difference from a default render? Now that Nvidia basically uses some kind of statistical guessing method, no one is allowed to call them out?
    I call them out because they have basically removed any possibility for consumers to compare other graphics cards with theirs. Or did I miss Nvidia making DLSS / frame generation and all those features available on other GPU brands?
    Do you know the AI models behind all this, or how they would perform on other hardware? Do we want to talk about how they pressure the media in exchange for access to test samples? Yes, IMHO there is a lot that is anti-consumer here...

    No, I don’t remember that. What are you talking about?

    Why would Nvidia make DLSS work on other brands' hardware? It's hardware-dependent, btw; it needs their CUDA cores.

  • No, I don’t remember that. What are you talking about?

    Why would Nvidia make DLSS work on other brands' hardware? It's hardware-dependent, btw; it needs their CUDA cores.

    @FreedomAdvocate https://forums.tomshardware.com/threads/ati-cheating-on-benchmarks.877565/

    Read about how they got grilled for that in the early 2000s.

    And as for how much Nvidia influences the media ->

  • First: https://www.corsair.com/us/en/explorer/gamer/gaming-pcs/rtx-5090-5080-and-5070-series-gpus-everything-you-need-to-know/

    What exactly am I supposed to be looking at here? Do you think that says that the GPUs need their own PSUs? Do you think people with 50 series GPUs have 2 PSUs in their computers?

    It's not innovative, interesting, or improving performance; it's a marketing scam. Games would run better and more efficiently if you just lowered the requirements.

    DLSS isn't innovative? It's not improving performance? What on earth? Rendering a frame at a lower resolution and then using AI to upscale it to look the same or better than rendering it at full resolution isn't innovative?! Getting an extra 30fps vs native resolution isn't improving performance?! How isn't it?

    You can't just "lower the requirements" lol. What you're suggesting is make the game worse so people with worse hardware can play at max settings lol. That is absolutely absurd.

    Let me ask you this - do you think that every new game should still be being made for the PS2? PS3? Why or why not?

    Like I said... you don't know what DLSS is, or how it works. It's not using "AI"; that's just marketing bullshit. Apparently it works on some people 😂

    You can find tons of info on this (which is why I told you to search it up), but it uses rendering tables, inference sorting, and pattern recognition to quickly render scenes, along with other tricks that video formats have used for ages, to render images at a higher resolution cheaply from the point of view of the GPU. You render a scene a dozen times once, then it regurgitates those renders from memory again if they are shown before being ejected from the cache on the card. It doesn't upsample, it doesn't intelligently render anything new, and there is no additive anything. It seems you think it's magic, but it's just fast sorting and memory tricks.

    Why you think it makes games better is subjective, but it solely works to run games with the same details at a higher resolution. It doesn't improve rendered scenes whatsoever. It's literally the same thing as lowering your resolution and increasing texture compression (same effect on cached rendered scenes), since you bring it up. The effect for the user is a higher FPS at a higher resolution, which you could achieve by just lowering your resolution. It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.

    Go read up.

  • No, I don’t remember that. What are you talking about?

    Why would Nvidia make DLSS work on other brands' hardware? It's hardware-dependent, btw; it needs their CUDA cores.

    @FreedomAdvocate ... that question is completely beside the point: their current behavior is not very consumer-friendly, or put more harshly, anti-consumer.

    Second, CUDA is not hardware-dependent 😉 https://github.com/vosen/ZLUDA/tree/master | https://www.xda-developers.com/nvidia-cuda-amd-zluda/ (see the sketch below for why a drop-in replacement like ZLUDA is possible at all).

    "Imagine a world where no one needed a brand-specific addition to have modern features"... those ideas have existed for ages (DX / OpenGL / Vulkan...). Now ask yourself why Nvidia always tries to operate outside of those APIs.

    ....
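
    The ZLUDA claim above hinges on the fact that the CUDA driver API is just a shared library that applications load at runtime, so a different library exporting the same symbols (which is what ZLUDA provides) can stand in for Nvidia's. Here is a minimal sketch of that idea; it assumes Linux and Python's ctypes, and the ./zluda path and script name are illustrative, not from this thread:

    ```python
    # cuda_probe.py - talk to the CUDA driver API purely through its shared
    # library. Pointing the loader at a ZLUDA build runs this same script
    # against ZLUDA's libcuda instead of Nvidia's, e.g.:
    #   LD_LIBRARY_PATH=./zluda python cuda_probe.py
    import ctypes

    cuda = ctypes.CDLL("libcuda.so.1")  # whichever libcuda the loader finds first

    assert cuda.cuInit(0) == 0  # CUresult 0 means CUDA_SUCCESS

    count = ctypes.c_int()
    assert cuda.cuDeviceGetCount(ctypes.byref(count)) == 0

    for i in range(count.value):
        device = ctypes.c_int()
        assert cuda.cuDeviceGet(ctypes.byref(device), i) == 0
        name = ctypes.create_string_buffer(128)
        assert cuda.cuDeviceGetName(name, 128, device) == 0
        # With ZLUDA in front, this prints the non-Nvidia GPU it maps to.
        print(f"device {i}: {name.value.decode()}")
    ```

    This only shows that the API boundary is swappable; how complete or fast ZLUDA is on any given GPU is a separate question.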

  • Like I said... you don't know what DLSS is, or how it works. It's not using "AI"; that's just marketing bullshit. Apparently it works on some people 😂

    You can find tons of info on this (which is why I told you to search it up), but it uses rendering tables, inference sorting, and pattern recognition to quickly render scenes, along with other tricks that video formats have used for ages, to render images at a higher resolution cheaply from the point of view of the GPU. You render a scene a dozen times once, then it regurgitates those renders from memory again if they are shown before being ejected from the cache on the card. It doesn't upsample, it doesn't intelligently render anything new, and there is no additive anything. It seems you think it's magic, but it's just fast sorting and memory tricks.

    Why you think it makes games better is subjective, but it solely works to run games with the same details at a higher resolution. It doesn't improve rendered scenes whatsoever. It's literally the same thing as lowering your resolution and increasing texture compression (same effect on cached rendered scenes), since you bring it up. The effect for the user is a higher FPS at a higher resolution, which you could achieve by just lowering your resolution. It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.

    Go read up.

    I 100% know what DLSS is, though by the sounds of it you don't. It is "AI" as much as any other thing is "AI". It uses models to "learn" what it needs to reconstruct and how to reconstruct it.

    What do you think DLSS is?

    You render a scene a dozen times once, then it regurgitates those renders from memory again if they are shown before being ejected from the cache on the card. It doesn't upsample, it doesn't intelligently render anything new, and there is no additive anything. It seems you think it's magic, but it's just fast sorting and memory tricks.

    This is blatantly and monumentally wrong lol. You think it's literally rendering a dozen frames and then just picking the best one to show you out of them? Wow. Just wow lol.

    It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.

    That's not what I claimed though. Where did I claim that?

    What it does is allow you to run a game at higher settings than you otherwise could at a given framerate, with little to no loss of image quality. Where you could previously only run a game at 20fps at 1080p Ultra settings, you can now run it at 30fps at "1080p" Ultra, whereas to hit 30fps otherwise you might have to drop everything to Low settings. (See the back-of-the-envelope sketch below for where that headroom comes from.)

    Go read up.

    Ditto.
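
    For what it's worth, the framerate side of the argument above is mostly pixel arithmetic. A rough sketch: the scale factors below are the commonly cited per-axis values for DLSS's modes, and the assumption that shading cost scales roughly with pixels rendered is a simplification, not a benchmark from this thread:

    ```python
    # Back-of-the-envelope math on upscaling: render internally at a lower
    # resolution, then reconstruct to the output resolution. Scale factors
    # are the commonly cited ones for DLSS modes; treat them as approximate.
    OUTPUT = (2560, 1440)  # example target resolution

    MODES = {
        "Native":            1.0,
        "Quality":           2 / 3,
        "Balanced":          0.58,
        "Performance":       0.5,
        "Ultra Performance": 1 / 3,
    }

    native_pixels = OUTPUT[0] * OUTPUT[1]

    for mode, scale in MODES.items():
        w, h = int(OUTPUT[0] * scale), int(OUTPUT[1] * scale)
        # Shading cost grows roughly with pixel count, so this ratio hints
        # at the framerate headroom each mode frees up.
        print(f"{mode:>17}: renders {w}x{h} "
              f"({w * h / native_pixels:.0%} of native pixel work)")
    ```

    At "Quality", the GPU shades only about 44% of the native pixels, which is where the extra frames come from; the real argument in this thread is how much image quality the reconstruction gives back.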
