NVIDIA is full of shit
-
I assume people mean 3440x1440 when they say 1440, as it’s way more common than 2560x1440.
Most people do not use WS, as evidenced by the mixed-bag support it gets. 1440 monitors are by default understood to be 2560x1440, as it’s 16:9, which is still considered the “default” by the vast majority of businesses and people alike. You may operate as if most people using 1440+ are on WS, but that’s a very atypical assumption.
Ray tracing, sure, but otherwise the 4090 is actually better than the 5070 in many respects. So you’re paying a comparable price for ray tracing and Windows dependency; if that’s important to you, then go right ahead. Ultimately, though, my point is that there’s no reason to buy the insanely overpriced Nvidia offerings when you have excellent AMD offerings for a fraction of the price that don’t have all sorts of little pitfalls/compromises. The Nvidia headaches are worth it for performance, which you’re not getting more of unless you 3-4x your investment. So the 5070 is moot.
I’m not sure what you’re comparing at the end, unless you meant a 9070 XT, which I don’t use/have and wasn’t comparing.
I’m not sure what you’re comparing at the end, unless you meant a 9070 XT, which I don’t use/have and wasn’t comparing.
Sorry, I thought I read that you had the 9070 XT, which is better than the 9070 that you have. The 9070 and the 5070 are the same price and are neck and neck in performance, so the Nvidia card isn't "insanely overpriced" compared to AMD's offerings, is it? The 9070 isn't a "fraction of the price" of the equivalent Nvidia card; it's the same price.
As you said, there are 40-series cards that are better than the 50-series cards, apart from probably the 5090, and the prices on those are lower than the 9070's.
-
I’m not even against tricks like upscaling and such, to be honest. If it looks good, I’ll take it lol. But I do agree they don’t feel like long-term, hardened solutions vs something more like “raw performance.” And there’s no doubt there is a certain elegance to AMD’s cards.
And there’s no doubt there is a certain elegance to AMD’s cards.
What exactly do you mean by this?
-
@RazgrizOne @FreedomAdvocate The reason I decided on AMD after being team green nearly all my life (aka >20 years): I feel like AI frame generation and upscaling are anti-consumer, because they hide the real performance behind non-reproducible image generation. And if you look closely... this is how Nvidia has its performance lead over AMD.
Calling DLSS "anti-consumer" is one of the dumbest things I've read about PC gaming in a long time.
-
Low-rent comment.
Second: you're apparently unaware, so just search up the phrase, but as this article very clearly explains... it's shit. It's not innovative, interesting, or improving performance; it's a marketing scam. Games would run better and more efficiently if you just lowered the requirements. It's like saying you want food to taste better, but then they serve you a vegan version of it. AMD's version is technically more useful, but it's still a dumb trick.
What exactly am I supposed to be looking at here? Do you think that says the GPUs need their own PSUs? Do you think people with 50-series GPUs have 2 PSUs in their computers?
It's not innovative, interesting, or improving performance; it's a marketing scam. Games would run better and more efficiently if you just lowered the requirements.
DLSS isn't innovative? It's not improving performance? What on earth? Rendering a frame at a lower resolution and then using AI to upscale it to look the same as or better than rendering it at full resolution isn't innovative?! Getting an extra 30fps vs native resolution isn't improving performance?! How isn't it?
You can't just "lower the requirements" lol. What you're suggesting is making the game worse so people with worse hardware can play at max settings lol. That is absolutely absurd.
Let me ask you this - do you think that every new game should still be made for the PS2? PS3? Why or why not?
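If it helps, here's the shape of the trick in toy form. The resolutions and the nearest-neighbour resize below are stand-ins I picked for illustration; the part that makes DLSS actually look good is the trained model (fed motion vectors and previous frames) that replaces the naive resize:

```python
# Toy sketch of upscaled rendering: render fewer pixels, then upscale to the
# output resolution. The nearest-neighbour resize is a stand-in for the
# trained network DLSS really uses - only the cost arithmetic carries over.
import numpy as np

def render(width: int, height: int) -> np.ndarray:
    # Stand-in for the expensive part: cost scales with pixel count.
    return np.random.rand(height, width, 3)

def naive_upscale(img: np.ndarray, out_w: int, out_h: int) -> np.ndarray:
    # Nearest-neighbour resize; DLSS uses a neural network here instead.
    rows = np.arange(out_h) * img.shape[0] // out_h
    cols = np.arange(out_w) * img.shape[1] // out_w
    return img[rows][:, cols]

OUT_W, OUT_H = 3840, 2160   # resolution the player sees
IN_W, IN_H = 2560, 1440     # resolution the GPU actually renders

frame = naive_upscale(render(IN_W, IN_H), OUT_W, OUT_H)
print(f"output: {frame.shape[1]}x{frame.shape[0]}, "
      f"rendered only {IN_W * IN_H / (OUT_W * OUT_H):.0%} of its pixels")
```

The point is just the arithmetic: the GPU renders less than half the pixels it ends up showing.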
-
AMD is at least playing the smart game with their hardware releases, going for generational leaps instead of just jacking up power requirements and clock speeds as Nvidia does. Hell, even Nvidia's latest Jetson lines are just recooked versions of years-old parts.
AMD is at least playing the smart game with their hardware releases, going for generational leaps instead of just jacking up power requirements and clock speeds as Nvidia does.
AMD could only do that because they were so far behind. GPU manufacturers, at least Nvidia, are approaching the limits of what they can do with current fabrication technology, other than simply throwing "more" at it. Without a breakthrough in tech, all they can really do is jack up power requirements and clock speeds. AMD will be there soon too.
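To put rough numbers on "jack up power requirements": dynamic power scales roughly with voltage squared times frequency, and higher clocks usually demand more voltage. A back-of-envelope sketch, where every figure is made up for illustration, not measured from any real GPU:

```python
# Illustrative only: why raising clocks costs disproportionate power.
# Dynamic power ~ C * V^2 * f, and hitting higher f typically needs higher V.
base_freq, base_volt, base_power = 2.5, 1.0, 300  # GHz, V, W (made up)

for gain in (1.0, 1.1, 1.2, 1.3):
    volt = base_volt * (1 + 0.5 * (gain - 1))  # assumed voltage/freq curve
    power = base_power * (volt / base_volt) ** 2 * gain
    print(f"+{gain - 1:.0%} clocks -> ~{power:.0f} W "
          f"for at best +{gain - 1:.0%} perf")
```

Under those assumed numbers, a 30% clock bump costs roughly 70% more power, which is the losing trade being described.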
-
Concur.
I went from a 2080 Super to the RX 9070 XT and it flies. Coupled with a 9950X3D, I still feel a little bit like the GPU might be the bottleneck, but it doesn't matter. It plays everything I want at way more frames than I need (240 Hz monitor).
E.g., Rocket League went from struggling to keep 240 fps at lowest settings, to 700+ at max settings. Pretty stark improvement.
I went from a 2080 Super to the RX 9070 XT and it flies.
You went from a 7-year-old GPU to a brand-new top-of-the-line one - what did you expect? That's not a fair comparison lol. It's got nothing to do with FSR4 vs DLSS4.
-
Calling DLSS "anti-consumer" is one of the dumbest things I've read about PC gaming in a long time.
@FreedomAdvocate Do you remember the time when AMD was called out for even the smallest difference from a default render? Now that Nvidia basically uses some kind of statistical guessing method, no one is allowed to call them out?
I call them out because they've basically removed the possibility for any consumer to compare other graphics cards with theirs. Or did I miss Nvidia making DLSS / frame generation and all those features available on other GPU brands?
Do you know the AI models behind all this and how they would perform on other hardware? Do we want to talk about how they try to control media access to tests? Yes, imho there is a lot that's anti-consumer here...
-
Nvidia is using the "it's fake news" strategy now? My, how the mighty have fallen.
I've said it many times, but publicly traded companies are destroying the world. The fact that they have to increase revenue every single year is not sustainable, and it just leads to employees being underpaid, products built ever cheaper, and invasive data collection to offset their previous poor decisions.
-
@FreedomAdvocate Do you remember the time when AMD was called out for even the smallest difference from a default render? Now that Nvidia basically uses some kind of statistical guessing method, no one is allowed to call them out?
I call them out because they've basically removed the possibility for any consumer to compare other graphics cards with theirs. Or did I miss Nvidia making DLSS / frame generation and all those features available on other GPU brands?
Do you know the AI models behind all this and how they would perform on other hardware? Do we want to talk about how they try to control media access to tests? Yes, imho there is a lot that's anti-consumer here...
No, I don’t remember that. What are you talking about?
Why would Nvidia make DLSS work on other brands' hardware? It's hardware-dependent btw - it needs their CUDA cores.
-
No, I don’t remember that. What are you talking about?
Why would Nvidia make DLSS work on other brands' hardware? It's hardware-dependent btw - it needs their CUDA cores.
@FreedomAdvocate https://forums.tomshardware.com/threads/ati-cheating-on-benchmarks.877565/
Read about how they got grilled for that in the early 2000s.
And as for how much Nvidia influences media ->
Nvidia significantly influences early tests of the GeForce RTX 5060
Early tests of the GeForce RTX 5060 have almost no informative value. Nvidia specifies the game selection, settings and comparison models.
heise online (www.heise.de)
-
What exactly am I supposed to be looking at here? Do you think that says the GPUs need their own PSUs? Do you think people with 50-series GPUs have 2 PSUs in their computers?
It's not innovative, interesting, or improving performance; it's a marketing scam. Games would run better and more efficiently if you just lowered the requirements.
DLSS isn't innovative? It's not improving performance? What on earth? Rendering a frame at a lower resolution and then using AI to upscale it to look the same as or better than rendering it at full resolution isn't innovative?! Getting an extra 30fps vs native resolution isn't improving performance?! How isn't it?
You can't just "lower the requirements" lol. What you're suggesting is making the game worse so people with worse hardware can play at max settings lol. That is absolutely absurd.
Let me ask you this - do you think that every new game should still be made for the PS2? PS3? Why or why not?
Like I said... you don't know what DLSS is or how it works. It's not using "AI"; that's just marketing bullshit. Apparently it works on some people.
You can find tons of info on this (which is why I told you to search it up), but it uses rendering tables, inference sorting, and pattern recognition to quickly render scenes, along with other tricks that video formats have used for ages, to render images at a higher resolution cheaply from the GPU's point of view. You render a scene a dozen times once, then it regurgitates those renders from memory again if they're shown before being ejected from the cache on the card. It doesn't upsample, it doesn't intelligently render anything new, and there is no additive anything. It seems you think it's magic, but it's just fast memory-sorting tricks.
Why you think it makes games better is subjective, but it solely works to run games with the same details at a higher resolution. It doesn't improve rendered scenes whatsoever. It's literally the same thing as lowering your resolution and increasing texture compression (same effect on cached rendered scenes), since you bring it up. The effect on the user is higher FPS at a higher resolution, which you could achieve by just lowering your resolution. It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.
Go read up.
-
No, I don’t remember that. What are you talking about?
Why would Nvidia make DLSS work on other brands' hardware? It's hardware-dependent btw - it needs their CUDA cores.
@FreedomAdvocate ... that question is totally beside the point, which is that their current behavior is not very consumer-friendly - or, put more harshly, anti-consumer.
Second, CUDA is not hardware-dependent:
https://github.com/vosen/ZLUDA/tree/master | https://www.xda-developers.com/nvidia-cuda-amd-zluda/
"Imagine a world where no one needed a brand-specific addition to have modern features"... oh, those ideas have existed for ages (DX / OpenGL / Vulkan...)... now ask yourself why Nvidia always tries to operate outside of those APIs?
....
-
Like I said... you don't know what DLSS is or how it works. It's not using "AI"; that's just marketing bullshit. Apparently it works on some people.
You can find tons of info on this (which is why I told you to search it up), but it uses rendering tables, inference sorting, and pattern recognition to quickly render scenes, along with other tricks that video formats have used for ages, to render images at a higher resolution cheaply from the GPU's point of view. You render a scene a dozen times once, then it regurgitates those renders from memory again if they're shown before being ejected from the cache on the card. It doesn't upsample, it doesn't intelligently render anything new, and there is no additive anything. It seems you think it's magic, but it's just fast memory-sorting tricks.
Why you think it makes games better is subjective, but it solely works to run games with the same details at a higher resolution. It doesn't improve rendered scenes whatsoever. It's literally the same thing as lowering your resolution and increasing texture compression (same effect on cached rendered scenes), since you bring it up. The effect on the user is higher FPS at a higher resolution, which you could achieve by just lowering your resolution. It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.
Go read up.
I 100% know what DLSS is, though by the sounds of it you don't. It is "AI" as much as anything else is "AI". It uses models to "learn" what it needs to reconstruct and how to reconstruct it.
What do you think DLSS is?
You render a scene a dozen times once, then it regurgitates those renders from memory again if they're shown before being ejected from the cache on the card. It doesn't upsample, it doesn't intelligently render anything new, and there is no additive anything. It seems you think it's magic, but it's just fast memory-sorting tricks.
This is blatantly and monumentally wrong lol. You think it's literally rendering a dozen frames and then just picking the best one to show you out of them? Wow. Just wow lol.
It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.
That's not what I claimed though. Where did I claim that?
What it does is allow you to run a game at higher settings than you usually could at a given framerate, with little to no loss of image quality. Where you could previously only run a game at 20fps at 1080p Ultra settings, you can now run it at 30fps at "1080p" Ultra, whereas to hit 30fps otherwise you might have to drop everything to Low settings.
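Ballpark math for that 20fps example, under the simplifying assumptions that frame cost scales with pixels rendered and that the upscale pass costs a fixed few milliseconds (both assumptions are mine, so treat the result as an upper bound):

```python
# Ballpark fps math for upscaling at a "Quality"-style ~67% per-axis scale.
native_fps = 20.0
axis_scale = 0.67                   # per-axis internal render scale
pixel_fraction = axis_scale ** 2    # ~45% of the pixels get rendered
upscale_ms = 3.0                    # assumed fixed cost of the upscale pass

native_ms = 1000.0 / native_fps
dlss_ms = native_ms * pixel_fraction + upscale_ms
print(f"{native_fps:.0f} fps native -> ~{1000.0 / dlss_ms:.0f} fps upscaled")
```

In practice you land below that bound, since not all frame time is resolution-dependent, which is how you end up around the 20fps-to-30fps kind of gain.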
Go read up.
Ditto.
-
@FreedomAdvocate ... that question is totally beside the point, which is that their current behavior is not very consumer-friendly - or, put more harshly, anti-consumer.
Second, CUDA is not hardware-dependent:
https://github.com/vosen/ZLUDA/tree/master | https://www.xda-developers.com/nvidia-cuda-amd-zluda/
"Imagine a world where no one needed a brand-specific addition to have modern features"... oh, those ideas have existed for ages (DX / OpenGL / Vulkan...)... now ask yourself why Nvidia always tries to operate outside of those APIs?
....
Second, CUDA is not hardware-dependent:
That's essentially an emulation layer. Nvidia make DLSS specifically for their GPUs, which have CUDA cores on them. It's the reason why DLSS doesn't work on their pre-CUDA core hardware.
Could they make DLSS work on AMDs hardware? Sure, they could - but it would not be DLSS as we know it, and again - why would they? They are allowed to make stuff exclusively for their hardware.
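For anyone following along, "emulation layer" here means something intercepts the CUDA-shaped calls an app makes and forwards them to another vendor's runtime. A toy sketch of the idea, with all names invented for illustration (ZLUDA's real implementation targets the actual CUDA driver API and is vastly more involved):

```python
# Toy sketch of an API translation layer. Every name here is made up;
# this shows the shape of the idea, not ZLUDA's actual implementation.

class OtherVendorRuntime:
    """A different vendor's API with its own call shape."""
    def alloc_buffer(self, nbytes: int) -> bytearray:
        return bytearray(nbytes)

class CudaShapedShim:
    """Exposes the call signature the app expects, forwards elsewhere."""
    def __init__(self, backend: OtherVendorRuntime):
        self.backend = backend

    def malloc(self, nbytes: int) -> bytearray:
        # Same name/signature the CUDA-targeting app was written against.
        return self.backend.alloc_buffer(nbytes)

def cuda_app(runtime) -> int:
    # The app only ever talks to the CUDA-shaped interface.
    buf = runtime.malloc(1024)
    buf[0] = 42
    return buf[0]

print(cuda_app(CudaShapedShim(OtherVendorRuntime())))  # runs on the "other" backend
```

The catch, as noted above, is that the translated calls still have to land on hardware with equivalent capabilities, which is why a shim alone doesn't get you DLSS.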
-
@FreedomAdvocate https://forums.tomshardware.com/threads/ati-cheating-on-benchmarks.877565/
Read about how they got grilled for that in the early 2000s.
And as for how much Nvidia influences media ->
Nvidia significantly influences early tests of the GeForce RTX 5060
Early tests of the GeForce RTX 5060 have almost no informative value. Nvidia specifies the game selection, settings and comparison models.
heise online (www.heise.de)
But Nvidia got dragged over the coals for using frame-gen in their performance benchmarks too. Did you miss that?
Also, ATI wasn't owned by AMD then... AMD acquired ATI in 2006. Your link is from 2001.
Also, no one should be listening to official GPU manufacturer benchmark results. No one. Review outlets do their own benchmarking, and you do know that you can turn off DLSS and DLSS frame-gen, don't you? I haven't seen any reviewers only compare DLSS+frame-gen on an Nvidia card to native-with-no-frame-gen on AMD cards. You must have, so can you link to any?
-
I went from a 2080 Super to the RX 9070 XT and it flies.
You went from a 7-year-old GPU to a brand-new top-of-the-line one - what did you expect? That's not a fair comparison lol. It's got nothing to do with FSR4 vs DLSS4.
what did you expect?
I expected as much.
The thing I was concurring with was simply that they said the 9070 was excellent.
nothing to do with FSR4 vs DLSS4
The 2080 Super supports DLSS.
I'm just posting an anecdote, bro. Chill.
Also the 2080 Super was released in 2019, not 2018.
-
I 100% know what DLSS is, though by the sounds of it you don't. It is "AI" as much as anything else is "AI". It uses models to "learn" what it needs to reconstruct and how to reconstruct it.
What do you think DLSS is?
You render a scene a dozen times once, then it regurgitates those renders from memory again if they're shown before being ejected from the cache on the card. It doesn't upsample, it doesn't intelligently render anything new, and there is no additive anything. It seems you think it's magic, but it's just fast memory-sorting tricks.
This is blatantly and monumentally wrong lol. You think it's literally rendering a dozen frames and then just picking the best one to show you out of them? Wow. Just wow lol.
It absolutely does not make a game playable while otherwise unplayable by adding details and texture definition, as you seem to be claiming.
That's not what I claimed though. Where did I claim that?
What it does is allow you to run a game at higher settings than you usually could at a given framerate, with little to no loss of image quality. Where you could previously only run a game at 20fps at 1080p Ultra settings, you can now run it at 30fps at "1080p" Ultra, whereas to hit 30fps otherwise you might have to drop everything to Low settings.
Go read up.
Ditto.
I 100% know what DLSS is, though by the sounds of it you don't. It is "AI" as much as anything else is "AI". It uses models to "learn" what it needs to reconstruct and how to reconstruct it.
No, you don't. https://en.m.wikipedia.org/wiki/Deep_Learning_Super_Sampling
This is blatantly and monumentally wrong lol. You think it's literally rendering a dozen frames and then just picking the best one to show you out of them? Wow. Just wow lol.
Literally in the docs: https://raw.githubusercontent.com/NVIDIA/DLSS/main/doc/DLSS_Programming_Guide_Release.pdf
What it does is allow you to run a game at higher settings than you usually could at a given framerate, with little to no loss of image quality. Where you could previously only run a game at 20fps at 1080p Ultra settings, you can now run it at 30fps at "1080p" Ultra, whereas to hit 30fps otherwise you might have to drop everything to Low settings.
No, it doesn't. It allows you to run a game at a higher resolution for no reason at all, instead of dropping to a lower resolution that your card can handle natively. That's it.
Keep claiming otherwise, and you're just flat-out denying reality and the Nvidia docs linked right in front of you.
-