NVIDIA is full of shit
-
Since when did gfx cards need to cost more than a used car?
We are being scammed by Nvidia. They're selling stuff whose equivalent 20 years ago would have been some massive research prototype. And there would be, like, 2 of them in an Nvidia bunker somewhere, powering Deep Thought whilst it calculated the meaning of life, the universe, and everything.
3k for a gfx card. Man my whole pc cost 500 quid and it runs all my games and pcvr just fine.
Could it run better? Sure
Does it need to? Not for 3 grand...
Fuck me!.....
When advanced nodes stopped giving you Moore transistors per $
-
I plan on getting at least a 4060 and I'm sitting on that for years. I'm on a 2060 right now.
My 2060 alone can run at least 85% of all games in my entire libraries across platforms. But I want at least 95% or 100%
Why not just get a 9060 xt and get that to 99%?(everything but ray tracing black myth wukong)
-
Then why does Nvidia have so much more money?
Because of vendor lock in
-
Like I said... you don't know what DLSS is, or how it works. It's not using "AI", that's just marketing bullshit. Apparently the marketing works on some people.
You can find tons of info on this (why I told you to search it up), but it uses rendering tables, inference sorting, and pattern recognition to quickly render scenes, plus other tricks that video formats have used for ages, to render images at a higher resolution cheaply from the point of view of the GPU. You render a scene a dozen times, then it regurgitates those renders from memory if they come up again before being ejected from the cache on the card. It doesn't upsample, it doesn't intelligently render anything new, and there is no additive anything. It seems you think it's magic, but it's just fast sorting and memory tricks.
Why you think it makes games better is subjective, but all it does is run games with the same details at a higher resolution. It doesn't improve rendered scenes whatsoever. It's literally the same thing as lowering your resolution and increasing texture compression (same effect on cached rendered scenes), since you bring it up. The effect on the user is a higher FPS at a higher resolution, which you could achieve by just lowering your resolution. It absolutely does not make an otherwise unplayable game playable by adding details and texture definition, as you seem to be claiming.
Go read up.
It actually doesn't do what you said anymore, not since an update in like 2020 or something. It's an AI model now
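For a rough picture of what "AI model" means here, a toy sketch only (nothing like NVIDIA's actual network, which is proprietary; the architecture and names below are made up for illustration): a small CNN takes the low-res frame plus the previous high-res output warped by motion vectors, and predicts the new high-res frame as a correction on top of a cheap upsample.

```python
# Toy sketch of a DLSS-style learned upscaler. NOT NVIDIA's real network;
# everything here is an illustrative assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyUpscaler(nn.Module):
    def __init__(self, scale=2):
        super().__init__()
        self.scale = scale
        # 3 channels for the upsampled low-res frame + 3 for the warped history
        self.net = nn.Sequential(
            nn.Conv2d(6, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, low_res, warped_prev):
        # Cheap bilinear upsample as a starting point; the network predicts
        # a residual correction, which is what makes this more than plain
        # upsampling or replaying cached frames.
        base = F.interpolate(low_res, scale_factor=self.scale,
                             mode="bilinear", align_corners=False)
        return base + self.net(torch.cat([base, warped_prev], dim=1))

frame_lr = torch.rand(1, 3, 540, 960)    # 960x540 internal render
prev_hr = torch.rand(1, 3, 1080, 1920)   # last 1080p output, motion-warped
out = ToyUpscaler()(frame_lr, prev_hr)
print(out.shape)  # torch.Size([1, 3, 1080, 1920])
```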
-
He’s talking about DLSS upscaling - not DLSS Frame Generation - which doesn’t add latency.
It does add latency: the upscale itself takes 1-2 ms per frame. However, if you use it to drop the internal render resolution (rather than to push the output resolution up while rendering internally at the same resolution), overall latency will be lower, because you get a higher frame rate.
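To put numbers on that (these frame times are made-up but plausible assumptions, not measurements):

```python
# Back-of-envelope frame times. Illustrative assumptions: a GPU that renders
# natively at 4K in ~16.7 ms, at 1080p in ~6 ms, plus a ~1.5 ms upscale pass.
native_4k_ms = 16.7
render_1080p_ms = 6.0
upscale_ms = 1.5

upscaled_ms = render_1080p_ms + upscale_ms
print(f"native 4K: {1000 / native_4k_ms:.0f} fps ({native_4k_ms} ms/frame)")
print(f"1080p + upscale: {1000 / upscaled_ms:.0f} fps ({upscaled_ms} ms/frame)")
# native 4K: 60 fps (16.7 ms/frame)
# 1080p + upscale: 133 fps (7.5 ms/frame)
```

The upscale pass adds time to each frame, but the frame as a whole finishes much sooner, which is why end-to-end latency drops.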
-
After being on AMD for years, I recently went back to Nvidia for one reason: NVENC works way better for encoding livestreams and videos than AMD's encoder.
Fixed in the 9000 series.
-
Why not just get a 9060 xt and get that to 99%?(everything but ray tracing black myth wukong)
Why not - no?
I don't even care for the monkey game.
-
It does add latency, you need 1-2ms to upscale the frame. However, if you are using a lower render resolution (instead of going up in resolution while rendering internally the same) then the latency will be lower because you have a higher frame rate
Yeah, so it doesn’t add latency. It takes like 1-2ms iirc in the pipeline, which like you said is less than/the same/negligibly more than it would take to render at the native resolution.
-
@FreedomAdvocate There is a reason the name WINE = "Wine Is Not an Emulator" is used. So don't call an API reimplementation emulation, especially since some API reimplementations have proven better than the hardware vendor's original implementation (example: DXVK on AMD > AMD's original DX implementation). But this is getting far from the original topic. My point was: if Nvidia wanted real competition, they would have put all those fancy new features into official APIs like DX or Vulkan.
They didn't... and while that's not directly against the consumer, it works against the consumer in the end.
So I've brought up another point for why I call Nvidia anti-consumer, whether you like it or not.

I'm not sure if English isn't your first language, or if you're just being wilfully obtuse, but I didn't call it emulation. I said it is essentially emulation, like WINE. I know WINE isn't emulation, which is why I said it is "essentially" emulation: it's doing the same thing, converting calls from one set of APIs to work on other hardware/architectures. It's not emulation, but it's essentially the same thing.
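If it helps, here's the translation-layer idea in toy form (every API name below is invented for illustration; this is nothing like real WINE or DXVK internals):

```python
# Toy translation layer: exposing an "old API" call by reimplementing it on
# top of a "new API" backend, WINE/DXVK-style. All names are made up.
class NewAPI:
    def submit(self, vertices, shader):
        print(f"backend drawing {len(vertices)} verts with {shader}")

class OldAPICompat:
    """Implements the old API's surface; every call is translated, not emulated."""
    def __init__(self):
        self.backend = NewAPI()

    def draw_triangles(self, tri_list):
        # The old API took triangle lists; the new backend wants flat vertices,
        # so the compat layer reshapes the arguments and forwards the call.
        flat = [v for tri in tri_list for v in tri]
        self.backend.submit(flat, shader="default")

OldAPICompat().draw_triangles([[(0, 0), (1, 0), (0, 1)]])
```

No instructions get interpreted or hardware simulated; calls are just mapped from one interface to another, which is why a good reimplementation can even beat the original.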
Why would Nvidia want competition? AMD don't want competition either, but they made FSR work on everything because they were so far behind Nvidia (and because it was all done in software, requiring no special hardware) that they had to give it away to try and catch up.
Companies making proprietary tech is not anti-consumer - unless of course you think that everything other than making everything free and open source is “anti-consumer”, which I am thinking you might?
-
You didn't read up on it huh?
Hilarious.
I know what DLSS is. I want to know exactly where in there it says what you think it does.
So are you going to actually substantiate your claims with evidence/proof or no?
-
It actually doesn't do what you said anymore, not since an update in like 2020 or something. It's an AI model now
The docs I linked LITERALLY explain what it is, how it works, and the mechanics behind it. Not until DLSS 4, released earlier this year, do the docs mention employing models for anything at all.
-
Yeah, so it doesn’t add latency. It takes like 1-2ms iirc in the pipeline, which like you said is less than/the same/negligibly more than it would take to render at the native resolution.
Which also means it's not possible to use it to get to 1000 fps: if the upscale alone costs 1-2 ms, the frame time can never drop below that, which caps you at roughly 500-1000 fps.
-
Why not - no?
I don't even care for the monkey game.
Just that one particular game runs poorly on AMD. The new cards are pretty good, even at ray tracing now
-
It absolutely is, because ray tracing isn't just about how precise or good the reflections/shadows look. It's also about reflecting, or getting shadows from, things that are outside your field of view. That's the biggest difference.
One of the first "holy shit!" moments for me was playing Doom, I think it was, and walking down a corridor and being able to see that there were enemies around the corner by their reflection on the opposite wall. That was never possible before, and it's only possible thanks to ray tracing. Same with seeing shadows from enemies that are behind you, off-screen to the side.
But we can do those reflections with raster already? And you skipped my point. You mention walking down a hall; I said when I'm running and gunning and shit's exploding. When you're turning and focusing on shooting an enemy, there's no way I'm noticing or paying attention to the reflection of an enemy down a hall.
-