
NVIDIA is full of shit

Technology
  • First: https://www.corsair.com/us/en/explorer/gamer/gaming-pcs/rtx-5090-5080-and-5070-series-gpus-everything-you-need-to-know/

    What exactly am I supposed to be looking at here? Do you think that says that the GPUs need their own PSUs? Do you think people with 50 series GPUs have 2 PSUs in their computers?

    It’s not innovative, interesting, or improving performance; it’s a marketing scam. Games would run better and more efficiently if you just lowered the requirements.

    DLSS isn't innovative? It's not improving performance? What on earth? Rendering a frame at a lower resolution and then using AI to upscale it to look the same or better than rendering it at full resolution isn't innovative?! Getting an extra 30fps vs native resolution isn't improving performance?! How isn't it?

    You can't just "lower the requirements" lol. What you're suggesting is make the game worse so people with worse hardware can play at max settings lol. That is absolutely absurd.

    Let me ask you this - do you think that every new game should still be made for the PS2? The PS3? Why or why not?

    Like I said...you don't know what DLSS is, or how it works. It's not using "AI", that's just marketing bullshit. Apparently it works on some people 😂

    You can find tons of info on this (which is why I told you to search it up), but it uses rendering tables, inference sorting, and pattern recognition to quickly render scenes, along with other tricks that video formats have used for ages, to render images at a higher resolution cheaply from the point of view of the GPU. You render a scene a dozen times once, then it regurgitates those renders from memory again if they are shown before being ejected from cache on the card. It doesn't upsample, it doesn't intelligently render anything new, and there is no additive anything. It seems you think it's magic, but it's just fast sorting memory tricks.

    Why you think it makes games better is subjective, but it solely works to run games with the same details at a higher resolution. It doesn't improve rendered scenes whatsoever. It's literally the same thing as lowering your resolution and increasing texture compression (same effect on cached rendered scenes), since you bring it up. The effect on the user is a higher FPS at a higher resolution, which you could achieve by just lowering your resolution. It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.

    Go read up.
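
As an aside on the performance claims being argued above: the raw win from DLSS-style upscaling is just pixel arithmetic. A minimal sketch in Python, assuming the commonly cited per-axis render scales for the DLSS presets (roughly 0.67 / 0.58 / 0.5 / 0.33 - approximations, not official figures):

```python
# Rough pixel-count arithmetic for upscaling presets.
# The scale factors are the commonly cited per-axis DLSS presets;
# treat them as assumptions, not official numbers.
PRESETS = {
    "Native":            1.0,
    "Quality":           0.667,
    "Balanced":          0.58,
    "Performance":       0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution before the upscaler runs."""
    return round(out_w * scale), round(out_h * scale)

if __name__ == "__main__":
    out_w, out_h = 3840, 2160  # 4K output
    for name, scale in PRESETS.items():
        w, h = internal_resolution(out_w, out_h, scale)
        saved = 1 - (w * h) / (out_w * out_h)
        print(f"{name:>17}: renders {w}x{h} ({saved:.0%} fewer pixels shaded)")
```

Shading far fewer pixels is where the extra frames come from; the upscaler then has to reconstruct the missing detail at output resolution, which is the part the two commenters are arguing about.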

  • No, I don’t remember that. What are you talking about?

    Why would Nvidia make DLSS work on other brands' hardware? It's hardware dependent, btw - it needs their CUDA cores.

    @FreedomAdvocate ... this question is totally unimportant next to the fact that their current behaviour is not very consumer friendly - or, put more bluntly, anti-consumer.

    Second, CUDA is not hardware dependent 😉 https://github.com/vosen/ZLUDA/tree/master | https://www.xda-developers.com/nvidia-cuda-amd-zluda/

    "Imagine a world where noone needed a brand specific addition to have modern features" ... oh those ideas exist since centuries ( DX / OpenGL / Vulkan .... ) ... now ask yourself why nvidia always tries to operate outside of those api's ?

    ....

  • Like I said...you don't know what DLSS is, or how it works. It's not using "AI", that's just marketing bullshit. Apparently it works on some people 😂

    You can find tons of info on this (which is why I told you to search it up), but it uses rendering tables, inference sorting, and pattern recognition to quickly render scenes, along with other tricks that video formats have used for ages, to render images at a higher resolution cheaply from the point of view of the GPU. You render a scene a dozen times once, then it regurgitates those renders from memory again if they are shown before being ejected from cache on the card. It doesn't upsample, it doesn't intelligently render anything new, and there is no additive anything. It seems you think it's magic, but it's just fast sorting memory tricks.

    Why you think it makes games better is subjective, but it solely works to run games with the same details at a higher resolution. It doesn't improve rendered scenes whatsoever. It's literally the same thing as lowering your resolution and increasing texture compression (same effect on cached rendered scenes), since you bring it up. The effect on the user is a higher FPS at a higher resolution, which you could achieve by just lowering your resolution. It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.

    Go read up.

    I 100% know what DLSS is, though by the sounds of it you don't. It is "AI" as much as any other thing is "AI". It uses models to "learn" what it needs to reconstruct and how to reconstruct it.

    What do you think DLSS is?

    You render a scene a dozen times once, then it regurgitates those renders from memory again if they are shown before being ejected from cache on the card. It doesn't upsample, it doesn't intelligently render anything new, and there is no additive anything. It seems you think it's magic, but it's just fast sorting memory tricks.

    This is blatantly and monumentally wrong lol. You think it's literally rendering a dozen frames and then just picking the best one to show you out of them? Wow. Just wow lol.

    It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.

    That's not what I claimed though. Where did I claim that?

    What it does is allow you to run a game at higher settings than you could usually at a given framerate, with little to no loss of image quality. Where you could previously only run a game at 20fps at 1080p Ultra settings, you can now run it at 30fps at "1080p" Ultra, whereas to hit 30fps otherwise you might have to drop everything to Low settings.

    Go read up.

    Ditto.

  • @FreedomAdvocate ... this question is totally unimportant next to the fact that their current behaviour is not very consumer friendly - or, put more bluntly, anti-consumer.

    Second, CUDA is not hardware dependent 😉 https://github.com/vosen/ZLUDA/tree/master | https://www.xda-developers.com/nvidia-cuda-amd-zluda/

    "Imagine a world where noone needed a brand specific addition to have modern features" ... oh those ideas exist since centuries ( DX / OpenGL / Vulkan .... ) ... now ask yourself why nvidia always tries to operate outside of those api's ?

    ....

    Second, CUDA is not hardware dependent

    That's essentially an emulation layer. Nvidia make DLSS specifically for their GPUs, which have CUDA cores on them. It's the reason why DLSS doesn't work on their pre-CUDA core hardware.

    Could they make DLSS work on AMD's hardware? Sure, they could - but it would not be DLSS as we know it, and again - why would they? They are allowed to make stuff exclusively for their hardware.

  • @FreedomAdvocate https://forums.tomshardware.com/threads/ati-cheating-on-benchmarks.877565/

    Read about that - when they got grilled in the early 2000s.

    And how much Nvidia influences the media ->

    But nvidia got dragged across the coals for using frame-gen in their performance benchmarks too. Did you miss that?

    Also, ATI wasn't owned by AMD then... AMD acquired ATI in 2006. Your link is from 2001.

    Also no one should be listening to official GPU manufacturer benchmark results. No one. Review companies do their own benchmarking, and you do know that you can turn off DLSS and DLSS Frame-Gen, don't you? I haven't seen any reviewers only compare DLSS+Frame-Gen on an nvidia card to native-with-no-frame-gen on AMD cards. You must have, so can you link to any?

  • I went from a 2080 Super to the RX 9070 XT and it flies.

    You went from a 7-year-old GPU to a brand-new, top-of-the-line one - what did you expect? That's not a fair comparison lol. It's got nothing to do with FSR4 vs DLSS4.

    what did you expect?

    I expected as much. 👍

    The thing to which I was concurring was simply that they said the 9070 was excellent.

    nothing to do with FSR4 vs DLSS4

    The 2080 Super supports DLSS. 🤷‍♂️

    I'm just posting an anecdote, bro. Chill.

    Also the 2080 Super was released in 2019, not 2018. 👍

  • I 100% know what DLSS is, though by the sounds of it you don't. It is "AI" as much as any other thing is "AI". It uses models to "learn" what it needs to reconstruct and how to reconstruct it.

    What do you think DLSS is?

    You render a scene a dozen times once, then it regurgitates those renders from memory again if they are shown before being ejected from cache on the card. It doesn't upsample, it doesn't intelligently render anything new, and there is no additive anything. It seems you think it's magic, but it's just fast sorting memory tricks.

    This is blatantly and monumentally wrong lol. You think it's literally rendering a dozen frames and then just picking the best one to show you out of them? Wow. Just wow lol.

    It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.

    That's not what I claimed though. Where did I claim that?

    What it does is allow you to run a game at higher settings than you could usually at a given framerate, with little to no loss of image quality. Where you could previously only run a game at 20fps at 1080p Ultra settings, you can now run it at 30fps at "1080p" Ultra, whereas to hit 30fps otherwise you might have to drop everything to Low settings.

    Go read up.

    Ditto.

    I 100% know what DLSS is, though by the sounds of it you don't. It is "AI" as much as any other thing is "AI". It uses models to "learn" what it needs to reconstruct and how to reconstruct it.

    No, you don't. https://en.m.wikipedia.org/wiki/Deep_Learning_Super_Sampling

    This is blatantly and monumentally wrong lol. You think it's literally rendering a dozen frames and then just picking the best one to show you out of them? Wow. Just wow lol.

    Literally in the docs: https://raw.githubusercontent.com/NVIDIA/DLSS/main/doc/DLSS_Programming_Guide_Release.pdf

    What it does is allow you to run a game at higher settings than you could usually at a given framerate, with little to no loss of image quality. Where you could previously only run a game at 20fps at 1080p Ultra settings, you can now run it at 30fps at "1080p" Ultra, whereas to hit 30fps otherwise you might have to drop everything to Low settings.

    No it doesn't. It allows you to run a game at a higher resolution for no reason at all, instead of dropping to a lower resolution that your card can handle natively. That's it.

    Keep claiming otherwise, and you're just literally denying reality and the Nvidia link to the docs right in front of you.
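
For readers following the back-and-forth: both the Wikipedia article and the Nvidia programming guide linked above describe DLSS as a temporal upscaler driven by a trained network. As a heavily hedged toy - my own sketch, not anything taken from those docs, and not DLSS's actual algorithm - the general shape of any temporal upscaler looks roughly like this; the part DLSS replaces with a neural network is the blend/rejection logic:

```python
# Toy sketch of a temporal upscaler: blend a reprojection of the previous
# high-res output with a naively upscaled new low-res frame.
# This is NOT DLSS's actual algorithm (DLSS swaps the hand-tuned blend and
# sample-rejection heuristics for a trained network); it only illustrates
# the "reconstruct from history + new samples" idea.
import numpy as np

def upscale_nearest(frame: np.ndarray, scale: int) -> np.ndarray:
    """Naive nearest-neighbour upscale of an (H, W) frame."""
    return np.repeat(np.repeat(frame, scale, axis=0), scale, axis=1)

def temporal_upscale(low_res, history, scale=2, blend=0.9):
    """Mostly keep the accumulated history, mix in the new samples."""
    current = upscale_nearest(low_res, scale)
    if history is None:
        return current
    # Real upscalers reproject history with per-pixel motion vectors and
    # reject stale samples; a fixed blend factor stands in for that here.
    return blend * history + (1.0 - blend) * current

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    history = None
    for frame_idx in range(4):
        low_res = rng.random((540, 960))      # 960x540 internal render
        history = temporal_upscale(low_res, history)
        print(frame_idx, history.shape)       # (1080, 1920) output each frame
```

The point is that each displayed frame is reconstructed from the new low-resolution render plus accumulated history, not picked out of a stash of previously rendered frames.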

  • But nvidia got dragged across the coals for using frame-gen in their performance benchmarks too. Did you miss that?

    Also, ATI wasn't owned by AMD then... AMD acquired ATI in 2006. Your link is from 2001.

    Also no one should be listening to official GPU manufacturer benchmark results. No one. Review companies do their own benchmarking, and you do know that you can turn off DLSS and DLSS Frame-Gen, don't you? I haven't seen any reviewers only compare DLSS+Frame-Gen on an nvidia card to native-with-no-frame-gen on AMD cards. You must have, so can you link to any?

    @FreedomAdvocate so you didn't read the heise link, which showed you that the pre-release tests had strict rules on how to test, including frame-gen settings ...

  • Second cuda is not hardware dependend

    That's essentially an emulation layer. Nvidia make DLSS specifically for their GPUs, which have CUDA cores on them. It's the reason why DLSS doesn't work on their pre-CUDA core hardware.

    Could they make DLSS work on AMD's hardware? Sure, they could - but it would not be DLSS as we know it, and again - why would they? They are allowed to make stuff exclusively for their hardware.

    @FreedomAdvocate ZLUDA is a reimplementation of an API, not an emulation.

  • I 100% know what DLSS is, though by the sounds of it you don't. It is "AI" as much as any other thing is "AI". It uses models to "learn" what it needs to reconstruct and how to reconstruct it.

    No, you don't. https://en.m.wikipedia.org/wiki/Deep_Learning_Super_Sampling

    This is blatantly and monumentally wrong lol. You think it's literally rendering a dozen frames and then just picking the best one to show you out of them? Wow. Just wow lol.

    Literally in the docs: https://raw.githubusercontent.com/NVIDIA/DLSS/main/doc/DLSS_Programming_Guide_Release.pdf

    What it does is allow you to run a game at higher settings than you could usually at a given framerate, with little to no loss of image quality. Where you could previously only run a game at 20fps at 1080p Ultra settings, you can now run it at 30fps at "1080p" Ultra, whereas to hit 30fps otherwise you might have to drop everything to Low settings.

    No it doesn't. It allows you to run a game at a higher resolution for no reason at all, instead of dropping to a lower resolution that your card can handle natively. That's it.

    Keep claiming otherwise, and you're just literally denying reality and the Nvidia link to the docs right in front of you.

    Linking to an 81-page document isn’t helpful. What specifically in there are you referring to?

    No it doesn’t. It allows you to run a game at a higher resolution for no reason at all

    Other than the reasons like I said - running it at higher settings while maintaining a playable framerate. The point is you don’t have to lower settings as much with DLSS.

    You fundamentally don’t understand what it is and what it allows you to do.

  • @FreedomAdvocate so you didn't read the heise link, which showed you that the pre-release tests had strict rules on how to test, including frame-gen settings ...

    Nvidia can say what they want, but reviewers didn’t follow those rules.

    Sounds like you need to find better GPU review sites.

  • @FreedomAdvocate ZLUDA is a reimplementation of an API, not an emulation.

    I said it’s essentially emulation, which it is. It’s like WINE, which is also essentially emulation but isn’t emulation.

  • Linking to an 81-page document isn’t helpful. What specifically in there are you referring to?

    No it doesn’t. It allows you to run a game at a higher resolution for no reason at all

    Other than the reasons like I said - running it at higher settings while maintaining a playable framerate. The point is you don’t have to lower settings as much with DLSS.

    You fundamentally don’t understand what it is and what it allows you to do.

    You didn't read up on it huh?

    Hilarious.

  • I said it’s essentially emulation, which it is. It’s like WINE, which is also essentially emulation but isn’t emulation.

    @FreedomAdvocate there is a reason the name WINE = "Wine Is Not an Emulator" is used. So don't call an API reimplementation an emulation, especially since other API reimplementations have been shown to be better than the original implementation from the hardware provider (for example, DXVK on AMD > AMD's original DX implementation). But this gets us far from the original topic. My point was: if Nvidia wanted real competition, they would have included all those fancy new features in official APIs like DX or Vulkan or any other.

    They didn't ... and while that's not directly against the consumer, it works against the consumer in the end.
    So I have brought up another point for why I call Nvidia anti-consumer ... whether you like it or not.
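
To make the reimplementation-vs-emulation point concrete: CUDA's driver API, like any C API, is just a set of exported functions, and whatever library exports them satisfies the caller. A minimal sketch, with the caveat that the library names are my assumption and that this only runs on a machine where some CUDA driver (Nvidia's, or a drop-in reimplementation such as ZLUDA provides) is installed:

```python
# Why "reimplementation, not emulation" matters: the CUDA driver API is a
# set of exported C functions. A reimplementation can ship a library that
# exports the same symbols, and unmodified callers keep working.
# Library names below are assumptions for Linux/Windows.
import ctypes
import sys

def load_cuda_driver() -> ctypes.CDLL:
    # Whatever provides these symbols "is" the CUDA driver as far as the
    # caller is concerned, whoever actually implemented it.
    name = "nvcuda.dll" if sys.platform == "win32" else "libcuda.so.1"
    return ctypes.CDLL(name)

if __name__ == "__main__":
    cuda = load_cuda_driver()
    if cuda.cuInit(0) != 0:                 # CUresult 0 == CUDA_SUCCESS
        raise SystemExit("cuInit failed")
    count = ctypes.c_int()
    cuda.cuDeviceGetCount(ctypes.byref(count))
    print(f"{count.value} CUDA-capable device(s) reported")
```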

  • This post did not contain any content.

    You don't need NVENC; the AMD and Intel versions are very good. If you care about maximum quality, you would software encode for the best compression.
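
To illustrate the trade-off being described: hardware encoders (NVENC, AMD's AMF, Intel's QSV) are fast, while software x264/x265 at slow presets usually wins on quality per bitrate. A rough sketch that just assembles typical ffmpeg invocations - the encoder and option names are the common ones and depend on your ffmpeg build, so treat them as assumptions and check `ffmpeg -encoders` locally:

```python
# Sketch of the encoder trade-off: hardware encoders (NVENC / AMF / QSV)
# are fast, software x265 at a slow preset usually wins on quality per
# bitrate. Encoder/option names depend on your ffmpeg build -- treat
# them as assumptions.
import shlex
import subprocess

ENCODERS = {
    "software_x265": ["-c:v", "libx265", "-preset", "slow", "-crf", "18"],
    "nvidia_nvenc":  ["-c:v", "hevc_nvenc", "-preset", "p7", "-cq", "19"],
    "amd_amf":       ["-c:v", "hevc_amf", "-quality", "quality"],
    "intel_qsv":     ["-c:v", "hevc_qsv", "-global_quality", "19"],
}

def encode_cmd(src: str, dst: str, encoder: str) -> list[str]:
    """Build an ffmpeg command line for the chosen encoder."""
    return ["ffmpeg", "-i", src, *ENCODERS[encoder], dst]

if __name__ == "__main__":
    cmd = encode_cmd("clip.mkv", "clip_x265.mkv", "software_x265")
    print(shlex.join(cmd))              # inspect the command first...
    # subprocess.run(cmd, check=True)   # ...then run it if ffmpeg is installed
```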

  • Since when did gfx cards need to cost more than a used car?

    We are being scammed by nvidia. They are selling stuff whose equivalent 20 years ago would have been some massive research prototype. And there would be, like, 2 of them in an nvidia bunker somewhere powering Deep Thought whilst it calculated the meaning of life, the universe, and everything.

    3k for a gfx card. Man, my whole PC cost 500 quid and it runs all my games and PCVR just fine.

    Could it run better? Sure

    Does it need to? Not for 3 grand...

    Fuck me!.....

    When advanced nodes stopped giving you Moore transistors per $

  • I plan on getting at least a 4060, and I'll be sitting on that for years. I'm on a 2060 right now.

    My 2060 alone can run at least 85% of all the games in my libraries across platforms. But I want at least 95% or 100%.

    Why not just get a 9060 XT and get that to 99%? (Everything but ray tracing in Black Myth: Wukong.)

  • Then why does Nvidia have so much more money?

    Because of vendor lock-in.

  • Like I said...you don't know what DLSS is, or how it works. It's not using "AI", that's just marketing bullshit. Apparently it works on some people 😂

    You can find tons of info on this (which is why I told you to search it up), but it uses rendering tables, inference sorting, and pattern recognition to quickly render scenes, along with other tricks that video formats have used for ages, to render images at a higher resolution cheaply from the point of view of the GPU. You render a scene a dozen times once, then it regurgitates those renders from memory again if they are shown before being ejected from cache on the card. It doesn't upsample, it doesn't intelligently render anything new, and there is no additive anything. It seems you think it's magic, but it's just fast sorting memory tricks.

    Why you think it makes games better is subjective, but it solely works to run games with the same details at a higher resolution. It doesn't improve rendered scenes whatsoever. It's literally the same thing as lowering your resolution and increasing texture compression (same effect on cached rendered scenes), since you bring it up. The effect on the user is a higher FPS at a higher resolution, which you could achieve by just lowering your resolution. It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.

    Go read up.

    It actually doesn't do what you said anymore - not since an update in like 2020 or something. It's an AI model now.

  • He’s talking about DLSS upscaling - not DLSS Frame Generation - which doesn’t add latency.

    It does add latency - you need 1-2 ms to upscale the frame. However, if you are using a lower render resolution (instead of going up in output resolution while keeping the internal render resolution the same), then overall latency will be lower, because you have a higher frame rate.
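
Putting that into rough numbers (the 1-2 ms upscale cost is taken from the comment above; every other figure here is a made-up but plausible assumption):

```python
# Back-of-envelope frame-time math for upscaling latency.
# Only the ~1.5 ms upscale cost comes from the discussion above;
# the rest are illustrative assumptions.
def frame_time_ms(shade_ms: float, upscale_ms: float = 0.0) -> float:
    return shade_ms + upscale_ms

native_ms = 1000 / 60                 # assume 16.7 ms per frame at native res
# Rendering at ~58% per axis (about a third of the pixels) won't cut shading
# time perfectly linearly; assume it drops to ~60% of the native cost.
upscaled_ms = frame_time_ms(native_ms * 0.6, upscale_ms=1.5)

print(f"native:   {native_ms:.1f} ms/frame ({1000 / native_ms:.0f} fps)")
print(f"upscaled: {upscaled_ms:.1f} ms/frame ({1000 / upscaled_ms:.0f} fps)")
# The upscale pass itself adds ~1.5 ms, but the whole frame still finishes
# sooner, so per-frame latency ends up lower, not higher.
```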
