
NVIDIA is full of shit

Technology
  • Like I said...you don't know what DLSS is, or how it works. It's not using "AI", that's just marketing bullshit. Apparently it works on some people 😂

    You can find tons of info on this (why I told you to search it up), but it uses rendering tables, inference sorting, and pattern recognition to quickly render scenes with other tricks that video formats have used for ages to render images at a higher resolution cheaply from the point of view of the GPU. You render a scene a dozen times once, then it regurgitates those renders from memory again if they are shown before ejected from cache on the card. It doesn't upsample, it doesn't intelligently render anything new, and there is no additive anything. It seems you think it's magic, but it's just fast sorting memory tricks.

    Why you think it makes games better is subjective, but it solely works to run games with the same details at a higher resolution. It doesn't improve rendered scenes whatsoever. It's literally the same thing as lowering your resolution and increasing texture compression (same effect on cached rendered scenes), since you bring it up. The effect on the user is a higher FPS at a higher resolution, which you could achieve by just lowering your resolution. It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.

    Go read up.

    I 100% know what DLSS is, though by the sounds of it you don't. It is "AI" as much as any other thing is "AI". It uses models to "learn" what it needs to reconstruct and how to reconstruct it.

    What do you think DLSS is?

    You render a scene a dozen times once, then it regurgitates those renders from memory again if they are shown before ejected from cache on the card. It doesn’t upsample, it doesn’t intelligently render anything new, and there is no additive anything. It seems you think it’s magic, but it’s just fast sorting memory tricks.

    This is blatantly and monumentally wrong lol. You think it's literally rendering a dozen frames and then just picking the best one to show you out of them? Wow. Just wow lol.

    It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.

    That's not what I claimed though. Where did I claim that?

    What it does is allow you to run a game at higher settings than you usually could at a given framerate, with little to no loss of image quality. Where you could previously only run a game at 20fps at 1080p Ultra settings, you can now run it at 30fps at "1080p" Ultra, whereas to hit 30fps otherwise you might have to drop everything to Low settings.

    Go read up.

    Ditto.
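
    For anyone wondering what "reconstruct" actually means here, this is the general shape of a model-based temporal upscaler. It is a heavily simplified sketch of the technique, not Nvidia's actual code; the function names, the fixed blend weight, and the toy resolutions are all made up for illustration - in the real thing a trained network decides how to combine the samples per pixel.

```python
import numpy as np

def warp(history: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Re-project last frame's high-res output using per-pixel motion vectors
    (nearest-neighbour gather; real upscalers filter far more carefully)."""
    h, w = history.shape[:2]
    ys, xs = np.indices((h, w))
    src_y = np.clip((ys - motion[..., 1]).round().astype(int), 0, h - 1)
    src_x = np.clip((xs - motion[..., 0]).round().astype(int), 0, w - 1)
    return history[src_y, src_x]

def upscale_frame(low_res, motion, history, blend=0.9):
    """One step of a generic temporal upscaler: upscale the new low-res samples,
    then blend them with the re-projected history. A fixed weight stands in for
    the learned, per-pixel decision a DLSS-style network would make."""
    scale = history.shape[0] // low_res.shape[0]
    naive = np.repeat(np.repeat(low_res, scale, axis=0), scale, axis=1)
    return blend * warp(history, motion) + (1.0 - blend) * naive

# A 720p render accumulated toward a 1440p output, frame after frame.
low = np.random.rand(720, 1280, 3)      # what the GPU actually rendered this frame
history = np.zeros((1440, 2560, 3))     # last frame's upscaled output
motion = np.zeros((1440, 2560, 2))      # no camera movement in this toy example
out = upscale_frame(low, motion, history)
print(out.shape)                        # (1440, 2560, 3)
```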

    @FreedomAdvocate ... this question is totally beside the point, given that their current behavior is not very consumer friendly or, put more bluntly, is anti-consumer.

    Second, CUDA is not hardware dependent 😉 https://github.com/vosen/ZLUDA/tree/master | https://www.xda-developers.com/nvidia-cuda-amd-zluda/

    "Imagine a world where noone needed a brand specific addition to have modern features" ... oh those ideas exist since centuries ( DX / OpenGL / Vulkan .... ) ... now ask yourself why nvidia always tries to operate outside of those api's ?

    ....

    Second, CUDA is not hardware dependent

    That's essentially an emulation layer. Nvidia make DLSS specifically for their GPUs, which have CUDA cores on them. It's the reason why DLSS doesn't work on their pre-CUDA core hardware.

    Could they make DLSS work on AMD's hardware? Sure, they could - but it would not be DLSS as we know it, and again - why would they? They are allowed to make stuff exclusively for their hardware.

  • @FreedomAdvocate https://forums.tomshardware.com/threads/ati-cheating-on-benchmarks.877565/

    Read about how they got grilled for it in the early 2000s.

    And how much nvidia influences media ->

    But nvidia got dragged over the coals for using frame-gen in their performance benchmarks too. Did you miss that?

    Also ATI wasn't owned by AMD then... AMD acquired ATI in 2006. Your link is from 2001.

    Also no one should be listening to official GPU manufacturer benchmark results. No one. Review companies do their own benchmarking, and you do know that you can turn off DLSS and DLSS Frame-Gen, don't you? I haven't seen any reviewers only compare DLSS+Frame-Gen on an nvidia card to native-with-no-frame-gen on AMD cards. You must have, so can you link to any?
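
    As a rough illustration of why frame-gen numbers get treated separately in reviews, the back-of-the-envelope arithmetic looks like this (simplified: generation overhead and frame pacing are ignored, and the figures aren't from any specific benchmark):

```python
def framegen_numbers(rendered_fps: float, generated_per_rendered: int = 1):
    """Displayed frame rate rises with interpolated frames, but the game still
    simulates and reacts at the rate of the frames it actually renders."""
    displayed_fps = rendered_fps * (1 + generated_per_rendered)
    simulated_frame_ms = 1000.0 / rendered_fps
    return displayed_fps, simulated_frame_ms

for fps in (30, 60):
    shown, sim_ms = framegen_numbers(fps)
    print(f"{fps} fps rendered -> ~{shown:.0f} fps displayed, "
          f"game still updates every ~{sim_ms:.0f} ms")
```

    Which is why an FPS figure with frame-gen on and one with it off aren't measuring the same experience, and reviewers report them separately.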

  • I went from a 2080 Super to the RX 9070 XT and it flies.

    You went from a 7-year-old GPU to a brand new top-of-the-line one, what did you expect? That's not a fair comparison lol. Got nothing to do with FSR4 vs DLSS4.

    what did you expect?

    I expected as much. 👍

    The thing I was concurring with was simply that they said the 9070 was excellent.

    nothing to do with FSR4 vs DLSS4

    The 2080 Super supports DLSS. 🤷‍♂️

    I'm just posting an anecdote, bro. Chill.

    Also the 2080 Super was released in 2019, not 2018. 👍

  • I 100% know what DLSS is, though by the sounds of it you don't. It is "AI" as much as any other thing is "AI". It uses models to "learn" what it needs to reconstruct and how to reconstruct it.

    No, you don't. https://en.m.wikipedia.org/wiki/Deep_Learning_Super_Sampling

    This is blatantly and monumentally wrong lol. You think it's literally rendering a dozen frames and then just picking the best one to show you out of them? Wow. Just wow lol.

    Literally in the docs: https://raw.githubusercontent.com/NVIDIA/DLSS/main/doc/DLSS_Programming_Guide_Release.pdf

    What it does is allow you to run a game at higher settings than you could usually at a given framerate, with little to no loss of image quality. Where you could previously only run a game at 20fps at 1080p Ultra settings, you can now run it at 30fps at "1080p" Ultra, whereas to hit 30fps otherwise you might have to drop everything to Low settings.

    No it doesn't. It allows you to run a game at a higher resolution for no reason at all, instead of dropping to a lower resolution that your card can handle natively. That's it.

    Keep claiming otherwise, and you're just literally denying reality and the Nvidia link to the docs right in front of you.

  • @FreedomAdvocate So you didn't read the heise link, which showed you that pre-release tests had strict rules on how to test, including frame-gen settings ...

  • @FreedomAdvocate ZLUDA is a reimplementation of an API, not an emulation.

  • Linking to an 81-page document isn’t helpful. What specifically in there are you referring to?

    No it doesn’t. It allows you to run a game at a higher resolution for no reason at all

    Other than the reasons I said - running it at higher settings while maintaining a playable framerate. The point is you don’t have to lower settings as much with DLSS.

    You fundamentally don’t understand what it is and what it allows you to do.
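
    To put rough numbers on the "higher settings at a playable framerate" point: the saving comes from how many pixels are actually shaded per frame before reconstruction. The per-axis scale factors below are the commonly cited presets for this kind of upscaler, used purely as an illustration rather than an official spec:

```python
OUTPUT = (3840, 2160)   # the output resolution you ask for (4K in this example)

# Commonly cited per-axis render scales for upscaler quality presets.
PRESETS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

native_px = OUTPUT[0] * OUTPUT[1]
print(f"Native       : {native_px:>10,} pixels shaded per frame")
for name, s in PRESETS.items():
    w, h = int(OUTPUT[0] * s), int(OUTPUT[1] * s)
    px = w * h
    print(f"{name:<13}: {px:>10,} pixels ({w}x{h}, about {px / native_px:.0%} of native)")
```

    Shading a quarter to a half as many pixels is where the headroom to keep settings on Ultra comes from; whether the reconstructed image holds up is the part people argue about.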

  • @FreedomAdvocate So you didn't read the heise link, which showed you that pre-release tests had strict rules on how to test, including frame-gen settings ...

    Nvidia can say what they want, but reviewers didn’t follow those.

    Sounds like you need to find better GPU review sites.

  • @FreedomAdvocate ZLUDA is a reimplementation of an API, not an emulation.

    I said it’s essentially emulation, which it is. It’s like WINE, which is also essentially emulation but isn’t emulation.
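
    For anyone following along, the WINE/ZLUDA distinction being argued here is roughly this: an emulator simulates foreign hardware or a foreign instruction set, while a reimplementation provides the same API names and signatures on top of a different backend, so existing programs run unchanged. A toy sketch of the reimplementation idea - invented names, nothing to do with how ZLUDA or CUDA actually work:

```python
class OriginalRuntime:
    """Stands in for the vendor's own library: the API applications are written against."""
    def alloc(self, nbytes: int) -> bytearray:
        return bytearray(nbytes)              # pretend this talks to the vendor's GPU

class ReimplementedRuntime:
    """Same method names and signatures, so callers don't change, but the work is
    handed to a different backend. Nothing simulates a foreign instruction set,
    which is why this is closer to WINE than to an emulator."""
    def alloc(self, nbytes: int) -> bytearray:
        return self._other_vendor_alloc(nbytes)

    def _other_vendor_alloc(self, nbytes: int) -> bytearray:
        return bytearray(nbytes)              # pretend this talks to the other GPU stack

def application(runtime) -> int:
    """Code written against the original API; it neither knows nor cares
    which implementation is underneath."""
    return len(runtime.alloc(1024))

print(application(OriginalRuntime()))         # 1024
print(application(ReimplementedRuntime()))    # 1024, same API, different backend
```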

  • You didn't read up on it, huh?

    Hilarious.