
Decline I hate Digital Foundry.

Joined
Jan 7, 2012
Messages
14,361
If video cards had never wasted silicon on RT, we'd probably have 120 fps at native resolution as the expected minimum for new games on midrange hardware.
 

Cologno

Unwanted
Joined
Jan 3, 2024
Messages
293
I don't know, I don't have a dog in the RT vs non-RT fight, but I legit thought CP2077 looked better without RT lighting; RT shadows were pretty cool though. Path tracing is a whole other thing.
 

Zewp

Arcane
Joined
Sep 30, 2012
Messages
3,590
Codex 2013
yeah, fuck nvidia for pushing technology forward. let's just all take it up the ass from amd instead where they refuse to implement any new features unless nvidia goes first and they scramble to get something out of the door to have parity.
also, amd couldn't compete because rdna1 was a shit buggy unfinished architecture and also had no RT hardware built in to begin with. amd's problem was that they didn't even follow directx or vulkan development to see what nvidia was up to.

GPUs aren't AMD's primary business. They are for Nvidia, though.
 

Ba'al

Scholar
Joined
Jun 26, 2016
Messages
175
Gay tracing is theoretically a cool concept but to this day it's only used as a marketing gimmick because people with no artistic sense think moar grafix = better. The resident retard saying that the remastered version of Wind Waker looks better than the original is a perfect example of that.
 

antimeridian

Learned
Patron
Joined
May 18, 2021
Messages
278
Codex Year of the Donut
But when it comes to games themselves, if anything the main DF members (aside from Rich) tend to prefer older and non-AAA games (hell, in almost every other DF Direct he's in, John goes on some rant against AAA games :-P, despite all the flashy technical stuff they cover being mainly available in AAA games).
I listen to their podcast occasionally and John once came right out and said he would have been happier if graphics stopped progressing after PS2 era. I get the sense he is burned out on the typical DF style analysis and just wants to focus on the retro content. His end of year lists usually feature a lot of interesting smaller titles compared to the triple-A stuff (and his 2023 GOTY was Armored Core VI anyway which is :obviously:). Yeah I still have issues with the guy and DF in general, but as someone with interest in the technical side of games, there is nobody on their level.

Anyone out there blaming them for graphics consoomer audience mentality and $2K GPUs is cracked, graphics whoring is a story much, much older than DF. If anything they've done some good work trying to clear up some of the common misconceptions about game tech and performance.
 

soutaiseiriron

Educated
Joined
Aug 8, 2023
Messages
226
GPUs aren't AMD's primary business. They are for Nvidia, though.
That's not really my problem. If they refuse to put serious effort into their GPUs, which means they totally lack features compared to the competition, then I have no incentive to buy them either. If their upscaling tech isn't at least on Intel's level by the next upgrade cycle then I just won't bother. Also, why the fuck did they buy ATI if they didn't give a single shit about HPC, media, or GPGPU? It was clear by 08-09 that this was going to take off with PS3 clusters and later CUDA. Jesus Christ. I guess they've made low-effort bux off of consoles anyway.
The resident retard saying that the remastered version of Wind Waker looks better than the original is a perfect example of that.
I just prefer a less desaturated look and I like the bloom and shadows. Go yell at the staff who worked on both the original and the remaster if you think the artistic sense was ruined and that they don't understand anything.
Though that version is also superior for gameplay reasons.
 

Rincewind

Magister
Patron
Joined
Feb 8, 2020
Messages
2,544
Location
down under
Codex+ Now Streaming!
Good points Bad Sector and soutaiseiriron about RT being useful for non-realistic rendering.

I guess I just don't like chasing realism, so I'm looking at all this from that angle. Many of my favourite games just need the ability to blit rectangles, so... :)

Btw, the most recent 3D game I played was ELEX, and my view is you don't quite need anything more advanced than that to create a good game. That John guy from the DF channel seemed like a sensible person to me, so yeah, PS2 level graphics are pretty much sufficient I think.

I'd rather play the original Gothic 1 from 2001 than some new AAA cutscene "game".
 

tritosine2k

Erudite
Joined
Dec 29, 2010
Messages
1,564
DF knows what's up.
But it's almost like their hands are tied even though they know perfectly well what's up.

I expect them to jump ship for eyewear stuff once it crosses into mainstream (and you can't f around without MSAA there luckily).
 

Gerrard

Arcane
Joined
Nov 5, 2007
Messages
12,148
yeah, fuck nvidia for pushing technology forward.
Nvidia is "pushing the technology forward" in the same way that Creative "pushed the technology forward" in terms of sound in games. You think sound in games today is different compared to 20 years ago? The only different thing you might find is some games use some custom software sound engine that does the things the hardware would be doing if there actually was any progress done in that time.
 

Azdul

Magister
Joined
Nov 3, 2011
Messages
3,414
Location
Langley, Virginia
If video cards had never wasted silicon on RT, we'd probably have 120 fps at native resolution as the expected minimum for new games on midrange hardware.
Many relatively new games - like Baldur's Gate 3 - do not use RT at all.

What if you combine such a game with a graphics card that is a true rasterization beast, like the Radeon 7900 GRE?
  • 140 fps at 1080p
  • 130 fps at 1440p
  • 75 fps at 4k
If all the silicon dedicated to matrix-multiply-accumulate acceleration were spent on 'rasterization' instead, you still probably wouldn't reach 120 fps at 4K.

The problem is that you can achieve 120 fps at 4K with good image quality using image reconstruction and frame generation, so Tensor core / Wave MMA silicon is useful even if the game does not use Ray Tracing.

And both Nvidia and AMD sell the same chips to gamers and to people training neural networks, so expect upcoming graphics chips to dedicate even more silicon to matrix-multiply-accumulate. Most games won't take advantage of it, because they need to run on a PlayStation 5.
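Rough napkin maths for that claim (the internal resolution and frame generation overhead below are assumptions, and it ignores CPU limits and fixed per-frame work, so treat it as an upper bound):

```python
# Napkin maths for the "4K 120 fps via reconstruction + frame generation" claim.
# Assumes rendering cost scales roughly linearly with pixel count.

native_4k_fps = 75                     # 7900 GRE in BG3 at native 4K (figure from above)
scale = (3840 * 2160) / (2560 * 1440)  # 4K output reconstructed from a 1440p internal render

reconstructed_fps = native_4k_fps * scale    # ~169 fps worth of actually rendered frames
presented_fps = reconstructed_fps * 2 * 0.9  # frame generation doubles presented frames,
                                             # assuming ~10% overhead for the interpolation pass
print(f"rendered: ~{reconstructed_fps:.0f} fps, presented: ~{presented_fps:.0f} fps")
```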
 

Lyric Suite

Converting to Islam
Joined
Mar 23, 2006
Messages
56,903
The problem with RT is that you just know eventually developers aren't going to bother painting proper shadows 'cause, just turn on RT bro.

I think Control is already there, given how big of a difference there is between RTX on and off in that game.

Games that were made to be "complete" without factoring in Ray Tracing generally don't look much different whether it's on or off, but why bother doing that when you can just tell the player to turn Ray Tracing on?
 

abija

Prophet
Joined
May 21, 2011
Messages
2,921
Many relatively new games - like Baldur's Gate 3 - do not use RT at all.

What if you combine such a game with a graphics card that is a true rasterization beast, like the Radeon 7900 GRE?
  • 140 fps at 1080p
  • 130 fps at 1440p
  • 75 fps at 4k
If all the silicon dedicated to matrix-multiply-accumulate acceleration were spent on 'rasterization' instead, you still probably wouldn't reach 120 fps at 4K.

Why would you go for that model? https://tpucdn.com/review/baldur-s-...nce-analysis/images/performance-3840-2160.png
Nevermind that bg3 isn't exactly a well optimized game. It's barely good enough for a type of game that doesn't require good fps.

Even deciding to go the "upscaling is required" route would have meant a good hardware upscaler, not generic AI bullshit. Neah, pushing graphics forward was not what made them push RT and the tensor cores.
 

Azdul

Magister
Joined
Nov 3, 2011
Messages
3,414
Location
Langley, Virginia
Many relatively new games - like Baldur's Gate 3 - do not use RT at all.

What if you combine such a game with a graphics card that is a true rasterization beast, like the Radeon 7900 GRE?
  • 140 fps at 1080p
  • 130 fps at 1440p
  • 75 fps at 4k
If all the silicon dedicated to matrix-multiply-accumulate acceleration were spent on 'rasterization' instead, you still probably wouldn't reach 120 fps at 4K.
Why would you go for that model? https://tpucdn.com/review/baldur-s-...nce-analysis/images/performance-3840-2160.png
Nevermind that bg3 isn't exactly a well optimized game. It's barely good enough for a type of game that doesn't require good fps.
I don't know much about Radeons - I just assumed that a $549 card from 2024 would be better than a $999 card from 2022. Apparently that's not how Radeons work ;).

Still - no stable 120 fps at 4k.
Even deciding to go the "upscaling is required" route would have meant a good hardware upscaler, not generic AI bullshit.
A hardware upscaler is not magic.

I want the image to be stable between frames (no shimmering), with no ghosting or aliasing artifacts. Sharp text and UI, and smooth depth of field. As close as possible to an image rendered at 100k and scaled down to the 4K of the display.

A hardware upscaler can do 2x or 4x resolution, with some artifacts that you'll get used to. Deep networks get better all the time, and at some point they will be able to fool the visual cortex without getting caught.
Neah, pushing graphics forward was not what made them push RT and the tensor cores.
'Pushing graphics forward' is too vague.

You cannot push the beautiful 2D graphics of the Neo Geo or Amiga to better simulate reality, even if from an artistic point of view they are much more satisfying than early 3D.

In the same way, you cannot push rasterization closer to physical reality in a technologically sound manner. As John Carmack said, most of a modern rasterization engine is literally 'smoke and mirrors'. That said, you can still achieve an aesthetically pleasing image using rasterization. But you cannot get rid of the last 5% of visual errors without simulating the physical behaviour of photons using RT.
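To be precise - 'simulating physical behaviour of the photons' means estimating the rendering equation (Kajiya), which ray tracing does directly and which rasterization can only approximate piecewise with baked data and screen-space tricks:

```latex
L_o(\mathbf{x}, \omega_o) = L_e(\mathbf{x}, \omega_o)
  + \int_{\Omega} f_r(\mathbf{x}, \omega_i, \omega_o)\, L_i(\mathbf{x}, \omega_i)\, (\mathbf{n} \cdot \omega_i)\, \mathrm{d}\omega_i
```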
 

tritosine2k

Erudite
Joined
Dec 29, 2010
Messages
1,564

Ray tracing is the future and ever will be.


Because there's hardly anything more ill-fitting than realtime and RT.

Also you get tons of shimmering from pixel shading that's necessarily inaccurate because there are no fractional pixel values (which you get from precomputed textures for "free").
Hence the need for TAA. Reality is that texture space precompute doesn't need it and was just fine with MSAA (nearly free for what it does).

No one in their right mind would choose crappy smeary GI over actual subpixel AA, but crappy smearing displays don't set the two apart well enough.
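Toy illustration of that shimmer (made-up signal, the sampling argument is the point): shade once per pixel center and sub-pixel highlights flicker as the image drifts, pre-filter in texture space and they stay put.

```python
import math

def specular(x):
    """Toy high-frequency shading term (think a row of tiny bright highlights)."""
    return max(0.0, math.sin(40.0 * x)) ** 8

def point_sampled(offset, pixels=8):
    # one shading evaluation per pixel centre - what a pixel shader does
    return [specular((p + 0.5 + offset) / pixels) for p in range(pixels)]

def prefiltered(offset, pixels=8, taps=64):
    # average many sub-pixel evaluations - roughly what texture-space precompute / SSAA gives
    return [sum(specular((p + (t + 0.5) / taps + offset) / pixels) for t in range(taps)) / taps
            for p in range(pixels)]

# slide the signal by 0.3 of a pixel (a tiny camera move) and see how much each pixel changes
delta_point = max(abs(a - b) for a, b in zip(point_sampled(0.0), point_sampled(0.3)))
delta_fltrd = max(abs(a - b) for a, b in zip(prefiltered(0.0), prefiltered(0.3)))
print(f"point-sampled frame-to-frame change: {delta_point:.2f}")   # large -> shimmer
print(f"pre-filtered frame-to-frame change:  {delta_fltrd:.2f}")   # much smaller -> stable
```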
 

soutaiseiriron

Educated
Joined
Aug 8, 2023
Messages
226
Nvidia is "pushing the technology forward" in the same way that Creative "pushed the technology forward" in terms of sound in games. You think sound in games today is different compared to 20 years ago? The only different thing you might find is some games use some custom software sound engine that does the things the hardware would be doing if there actually was any progress done in that time.
Creative infamously monopolized many of the features (later open-sourcing them as the widely used OpenAL) by suing and then purchasing the competition. It wasn't until AMD TrueAudio* that there started to be some hints toward better 3D positional audio again, which kind of fizzled out; the same idea was later repackaged as RTX Audio or whatever it was called, which fizzled out even worse than TrueAudio. Besides, it's a dumb comparison because it was clear by the Xbox 360 era that audio was going to be done in software in the future, since CPUs were really fast by that point. Creative just put the bullet in the corpse with their shitty drivers on Windows Vista.
AMD, NV and Intel are competing on even ground because they all have a say in steering DirectX and Khronos, and patents haven't been an issue in this field. They can all make extensions to DX or Vulkan to implement whatever they like.
*OK, I concede. I forgot AMD TrueAudio existed until the topic shifted to audio, but TrueAudio was an AMD technology that did attempt to get developers to make better audio, so thumbs up there.

Even deciding to go the "upscaling is required" route would have meant a good hardware upscaler, not generic AI bullshit. Neah, pushing graphics forward was not what made them push RT and the tensor cores.
What does "hardware upscaler" mean to you? It's as hardware as it can get considering it's running on Tensor cores which could otherwise go totally unused for games. PS4 Pro had some form of hardware acceleration for checkerboard rendering, and that hardware was wasted because temporal upscalers quickly became better. Generalization > specialization as far as upscaling goes because this field is moving too fast build hardware around.

And DLSS since 2.0 is not really "generic AI bullshit" at all. We've seen what that looks like with DLSS 1.0, which looked like shit and barely any better than just sharpening the entire image. 90% of the work is done by temporal super-sampling, which Nvidia copied from other developers; the last 10% is just a fast ML model knowing what info to discard (such as what it thinks is ghosting or moiré aliasing) rather than add.
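To illustrate what I mean by the temporal part doing the heavy lifting, here's a toy version of the accumulation step - not Nvidia's actual code, just the generic TAA idea DLSS builds on, with a hard neighbourhood clamp standing in for the network's keep/discard decision:

```python
import numpy as np

def taa_accumulate(history, current, neighborhood_min, neighborhood_max, alpha=0.1):
    """Generic temporal accumulation step (the part DLSS-style upscalers share with plain TAA).

    history          -- colour accumulated over previous frames (already reprojected)
    current          -- this frame's jittered, lower-sample-count colour
    neighborhood_*   -- min/max of the current frame's local 3x3 neighbourhood
    alpha            -- how much of the new frame to blend in each step
    """
    # Heuristic history rejection: clamp the history into the neighbourhood of the current
    # frame so stale (ghosting) samples get discarded. In an ML upscaler a trained network
    # makes this keep/discard decision instead of a hard clamp.
    clamped_history = np.clip(history, neighborhood_min, neighborhood_max)
    return (1.0 - alpha) * clamped_history + alpha * current

# Toy usage: one pixel whose history contains a stale bright value (a would-be ghost).
history = np.array([0.9, 0.9, 0.9])   # stale highlight from previous frames
current = np.array([0.2, 0.2, 0.2])   # what this frame actually sees
print(taa_accumulate(history, current, current - 0.05, current + 0.05))
```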

You're right that NV didn't do RTX to "push graphics forward" inherently. They did it to extend their lead over AMD. Nvidia historically has gone first on features, and still does. Plenty of people still think Shadowplay is something NV-exclusive. Want to know why? Because AMD didn't have anything to compete with it for like two years.

Hence the need for TAA. Reality is that texture space precompute doesn't need it and was just fine with MSAA (nearly free for what it does).
4x MSAA wasn't even free in 2006. MSAA is great for geometry aliasing (so is FXAA/MLAA/SMAA...) but it can't do anything at all for specular or subpixel aliasing, which is going to be like 90% of the aliasing in a modern game and by far the most bothersome kind.
 

tritosine2k

Erudite
Joined
Dec 29, 2010
Messages
1,564
They don't use temporal AA in Hollywood CGI and there's no shimmer (16x SSAA is usual).
You can set RAGE to 16x MSAA and forget about shimmer (because it's pre-shaded in texture space).
I used to set Mass Effect to 8x MSAA per eye in stereo in the early 2010s and forgot about shimmer at 2x60 fps (it's averaged between eyes, same as above).
 

soutaiseiriron

Educated
Joined
Aug 8, 2023
Messages
226
They don't use temporal AA in Hollywood CGI and there's no shimmer (16x SSAA is usual).
Yeah no shit, that's because it's pre-rendered Hollywood CGI, not real-time graphics. They have to show those effects on massive >8K screens, often extremely close to the camera. The point of TAA is to temporally imitate the effect of supersampling at a similar ms cost to analytical AA, which is as "free" as any AA can get, instead of actually running the game at 4x4 SGSSAA or whatever.
Rage and Mass Effect are graphically unimpressive forward rendered games made for the Xbox 360 that predated the big uptake in deferred rendering in the late 7th gen. ME ran at 2x MSAA 720p even on the 360; Rage was native 720p with no AA. Of course they'll run alright at higher MSAA on 5x more powerful PC hardware.
 

tritosine2k

Erudite
Joined
Dec 29, 2010
Messages
1,564
It's 1080p with 16x AA because they didn't even jump to 4K, PLUS RT is prohibitively expensive at high ray counts per pixel, and in realtime you don't even have 1 spp.
Even if you do, it's some glorified model viewer without LOD.
 

Bad Sector

Arcane
Patron
Joined
Mar 25, 2012
Messages
2,263
Insert Title Here RPG Wokedex Codex Year of the Donut Codex+ Now Streaming! Steve gets a Kidney but I don't even get a tag.
yeah, fuck nvidia for pushing technology forward. let's just all take it up the ass from amd instead where they refuse to implement any new features unless nvidia goes first and they scramble to get something out of the door to have parity.

The problem with AMD here is that their market share is too small and they do not seem to have the developer resources Nvidia has to court game developers, so it is much harder for AMD to introduce new features and get them adopted - Nvidia can introduce some new feature, send their army of coders to big name game developers to help them integrate it, and rely on their big market share for the rest of the developers to adopt it.

It'd be nice if AMD tried to push things but TBH i personally do not expect them to do that; the most i expect from them is to come up with new ways to push the current featureset (like chiplet-based manufacturing, which they managed to do before Nvidia/Intel).

I guess I just don't like chasing realism, so I'm looking at all this from that angle. Many of my favourite games just need the ability to blit rectangles, so... :)

Btw, the most recent 3D game I played was ELEX, and my view is you don't quite need anything more advanced than that to create a good game.

Well, i mentioned elsewhere recently that my favorite RPG is New Vegas and my favorite FPS is Quake, both being very light in terms of GPU resources (or no GPU at all in the case of Quake :-P). In terms of what i'm fine with, as long as something gives the impression of what it is supposed to be i am fine with it - e.g. i don't need realistic reflections, a spherical environment map (let alone a cube map) to give the impression that something is supposed to be "shiny" works as well as raytraced reflections for me - the latter are just a bonus, but i wouldn't ignore a game for that.

The problem with RT is that you just know eventually developers aren't going to bother painting proper shadows 'cause, just turn on RT bro.

Developers aren't painting shadows by hand anyway (well, except for stylized games that use some painted look, but those are few); the alternative to raytraced shadows has been - for the last ~15 years at least - shadowmapping, and both are renderer features.

Renderers will certainly switch to using raytraced shadows - and TBH, as someone who has written his own renderers, of all the raytracing algorithms, raytraced shadows are what i'd like the most to be able to do, as you can do much better and less glitchy shadows than anything involving shadowmapping. FWIW raytraced shadows are also the fastest of all raytraced algorithms to do - AFAIK (haven't tried it) you can do RT shadows even on a Steam Deck at full framerate.
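to make it concrete, a single shadow ray boils down to roughly this (a toy sketch with analytic spheres and no BVH - just to show how simple the visibility query is compared to tuning shadowmap resolution and bias):

```python
import math

def ray_hits_sphere(origin, direction, center, radius, max_t):
    """True if the ray origin + t*direction (0 < t < max_t) hits the sphere."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c            # direction is normalised, so the quadratic's 'a' is 1
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return 1e-4 < t < max_t           # small epsilon avoids self-intersection

def in_shadow(point, light_pos, occluders):
    """A point is shadowed iff *any* occluder blocks the segment to the light."""
    d = [light_pos[i] - point[i] for i in range(3)]
    dist = math.sqrt(sum(v * v for v in d))
    d = [v / dist for v in d]
    return any(ray_hits_sphere(point, d, c, r, dist) for c, r in occluders)

# Toy scene: one sphere sitting between the first shaded point and the light.
occluders = [((0.0, 2.0, 0.0), 0.5)]
print(in_shadow((0.0, 0.0, 0.0), (0.0, 5.0, 0.0), occluders))   # True  -> shadowed
print(in_shadow((3.0, 0.0, 0.0), (0.0, 5.0, 0.0), occluders))   # False -> lit
```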

Rage and Mass Effect are graphically unimpressive forward rendered games

IMO RAGE was graphically impressive for its megatexture approach which influenced virtualized resources in GPUs - something that several games use these days and AFAIK is a cornerstone feature of Unreal Engine 5's Nanite (as well as its inspiration).

Aesthetically the base game didn't look that great overall, though the DLC had some very good looking scenes (probably the best looking scenes in any id Tech 5 game).
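fwiw the heart of megatexture / virtual texturing is just an indirection from a huge virtual texture into a small cache of resident pages - something roughly like this (a toy sketch, not id Tech 5's actual code; page sizes and the fallback scheme are made up):

```python
PAGE_SIZE = 128                # texels per page side (assumed)
VIRTUAL_SIZE = 128 * 1024      # virtual texture side in texels (assumed)
MAX_MIP = 10                   # 128 << 10 == 131072, so mip 10 is one page covering everything

# page table: (page_x, page_y, mip) -> physical cache slot, filled in by the streaming system
resident_pages = {(3, 1, 0): 0, (4, 1, 0): 1, (0, 0, MAX_MIP): 2}

def lookup(u, v):
    """Resolve a virtual UV to (cache_slot, texel offset, mip), falling back to coarser mips."""
    x, y = int(u * VIRTUAL_SIZE), int(v * VIRTUAL_SIZE)
    for mip in range(MAX_MIP + 1):           # walk toward coarser mips on a miss
        px, py = (x >> mip) // PAGE_SIZE, (y >> mip) // PAGE_SIZE
        slot = resident_pages.get((px, py, mip))
        if slot is not None:
            return slot, ((x >> mip) % PAGE_SIZE, (y >> mip) % PAGE_SIZE), mip
    raise LookupError("the top-level page should always be resident")

print(lookup(0.003, 0.0012))   # fine page is resident -> full detail
print(lookup(0.9, 0.9))        # not streamed in yet  -> coarse fallback at mip 10
```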
 

tritosine2k

Erudite
Joined
Dec 29, 2010
Messages
1,564
IDK why deferred would be more impressive than forward; point lights are a physical impossibility and you don't bring area lights to the table anyway (nor transparency), and you can't even make a TAA toggle lmao.
 

soutaiseiriron

Educated
Joined
Aug 8, 2023
Messages
226
The problem with AMD here is that their market share is too small and they do not seem to have the developer resources Nvidia has to court game developers, so it is much harder for AMD to introduce new features and get them adopted - Nvidia can introduce some new feature, send their army of coders to big name game developers to help them integrate it, and rely on their big market share for the rest of the developers to adopt it.
They used to try a lot harder at this stuff in the early 2010s, pushing TressFX, TrueAudio, Mantle (admittedly another AMD innovation I forgot), etc. on their sponsored titles. On the hardware side, they also had a large say in developing HBM, which is foundational for AI obviously.
AMD sponsorship means a lot less these days because they don't really go for any of this anymore. The most you can expect from an AMD-sponsored title these days is a lack of competing upscaling options.
It'd be nice if AMD tried to push things but TBH i personally do not expect them to do that; the most i expect from them is to come up with new ways to push the current featureset (like chiplet-based manufacturing, which they managed to do before Nvidia/Intel).
Chiplets were just an implementation of TSMC's CoWoS; they don't really have a hand in manufacturing. There are definitely serious architectural brains that had to be put into making it work with a consumer gaming GPU though, no question there. Even if it came with the bugginess and caveats that first-generation AMD products often do.
Intel is a pretty close follower in chiplets with Ponte Vecchio in HPC and the Meteor Lake iGPU (admittedly the silicon is TSMC, even though the packaging is Intel's).

IMO RAGE was graphically impressive for its megatexture approach which influenced virtualized resources in GPUs - something that several games use these days and AFAIK is a cornerstone feature of Unreal Engine 5's Nanite (as well as its inspiration).

Aesthetically the base game didn't look that great overall, though the DLC had some very good looking scenes (probably the best looking scenes in any id Tech 5 game).
Yeah I was admittedly doing Rage dirty there. I know it was technically impressive but at the same time, it came out right around Crysis 2 which still holds up way better than Rage imo.
 

tritosine2k

Erudite
Joined
Dec 29, 2010
Messages
1,564
Well, Crysis 2 is just "fakeosity" with hand-placed point lights, like Metro 2033. Rage did GPU tracing when the Frostbite engine was still doing CPU tracing well into 2015; even BF4's precompute was CPU traced.

Carmack also says RAGE is not so much baking as early decoupled shading, but the storage tech of the time put a hard limit on it (read the interview). Ofc it's easier to give in to blind hate.
 

Rincewind

Magister
Patron
Joined
Feb 8, 2020
Messages
2,544
Location
down under
Codex+ Now Streaming!
Well, i mentioned elsewhere recently that my favorite RPG is New Vegas and my favorite FPS is Quake, both being very light in terms of GPU resources (or no GPU at all in the case of Quake :-P). In terms of what i'm fine with, as long as something gives the impression of what it is supposed to be i am fine with it - e.g. i don't need realistic reflections, a spherical environment map (let alone a cube map) to give the impression that something is supposed to be "shiny" works as well as raytraced reflections for me - the latter are just a bonus, but i wouldn't ignore a game for that.
I can certainly sympathise. Related: I had a period when I was heavily into creating 3D art, and I was on my way to becoming a "technical" 3D artist. Extreme photorealism, that kind of stuff, you know. It's kinda fun to learn those 3D tools just for learning's sake.

But then I realised, I'm just mimicking reality, basically recreating photos in a 3D package. Which is cool, but I just value hand-drawn art more, so I'd rather do that in my spare time. Just a blank sheet of paper and a pencil. You can lose yourself in all the technicalities and lose sight of the bigger picture, which is creating art. But then, some people like to be technicians, not artists.

Therefore I'm saying, all you need for good games is blitting rectangles :) And 640x480 with 256 colours is quite enough.
 
Joined
Jan 5, 2021
Messages
427
Let's remind everyone of one clear thing: Digital Foundry doesn't play games, they measure them. And they measure with fucking frame meters so they can whine and complain about how this game plays at 60 FPS moooooost of the time, but when you enter a big city while it loads it drops to 57 or sometimes even 55 FPS for a couple of seconds. Then they do a DLSS vs FSR analysis where they zoom in on the edge of 3D models and compare stills of them being blurred, and while a difference is noticeable, it's never apparent to a normal player who, you know, plays the game. As a result of this kind of excessive scrutiny, people are complaining about horrible ports when they are, in fact, not. Jedi Survivor played nice on my GTX 980. I didn't fucking spend every minute of gameplay looking at the framerate, I just played it. The Last of Us was mostly 30 FPS on PS3. GTA 5 could hardly reach 30 FPS. And people had fun.
This is just literally false.

If you think stable, steady framerates don't matter, I doubt you actually play games. And we're not talking about a dip from 60 FPS to 59 FPS; when it goes into the 20s, it's a serious issue. A significant number of players had atrocious performance and horrible stuttering in Jedi Survivor to the point where the game was borderline unplayable, but I guess they must just be whining because YOU personally had a good experience (wow, imagine being this self-obsessed!). So many people had frankly miserable experiences with the port that it's widely regarded as unoptimised. It's not just gamers whining, and you're stupid if you think so. There's a very significant difference between a game being locked at 30 FPS and a game having inconsistent framerates between 30 and 60 FPS. People complaining about one aren't complaining about the other (although games being locked at 30 sucks in its own right). Like it or not, the trend right now is for games to be terribly optimised, and so this sort of extreme FPS inconsistency is essentially standard on PC and all modern consoles.
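Here's a trivial, made-up example of why "average FPS" hides exactly this problem:

```python
# Two runs with roughly the same average frame rate: one smooth, one stuttering.
# Frame times are in milliseconds and entirely made up for illustration.
smooth  = [16.7] * 100                      # locked ~60 fps
stutter = [12.0] * 90 + [50.0] * 10         # mostly fast, with regular 50 ms spikes

for name, frames in (("smooth", smooth), ("stutter", stutter)):
    avg_fps = 1000.0 * len(frames) / sum(frames)
    worst = max(frames)
    print(f"{name:8s} avg {avg_fps:5.1f} fps, worst frame {worst:.0f} ms "
          f"(~{1000.0 / worst:.0f} fps equivalent)")
# The stuttering run actually has the *higher* average fps, yet it regularly
# dips to a 20 fps-equivalent frame, which is exactly what you feel in play.
```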

Dark Souls is virtually unplayable on the Xbox 360 because the FPS is so insanely inconsistent, in a game where you need to constantly time your attacks and rolls to be frame perfect. It's even worse than the PS3 version (which I didn't think was possible).

The same is true of FSR and DLSS. Both of them look fucking awful and I hope this trend dies out. FXAA literally looks better, and FXAA is regarded by many as extremely shitty, to the point of people putting up with jagged edges rather than using FXAA, or any AA at all if FXAA is the only option available. Temporal AA looks even worse, and combined with DLSS it makes for a frankly miserable experience that both looks awful and isn't particularly performant compared to just lowering the amount of post-processing crap cluttering up the screen.

You seem to think gamers just like to endlessly whine because they aren't getting 60 FPS 100% of the time in otherwise completely stable and playable games, and that "graphics don't matter, it's all about HAVING FUN". As if we aren't seeing more and more games being released with horrendous performance, that cover the screen in a vast array of really awful-looking post-processing effects, and which then lazily slap TAA over everything to "fix up the image" because they have essentially wasted graphics power on useless crap. It's gotten so bad that having a game which is both performant and provides a clear image where you can actually see what's on screen requires playing either old games or indie titles. AAA games are a complete write-off in this regard, as are many of the indie "thrown together in UE5 to look good" projects, both of which generally look awful most of the time and have horrible performance to boot.

This post is literally just people refusing to have some basic standards because "oh I had FUN, so nothing is wrong". Fallout New Vegas was a buggy, borderline unplayable mess at release. But people had fun. Therefore all the bugs are completely forgivable, right? WRONG. And there's a reason the community has been fixing it for over a decade. These things matter and you're selling yourself short if you put up with garbage for the sake of "having fun", especially when the garbage so often gets in the way of having fun. A game on a modern console jumping between 20 and 60 FPS is frankly unacceptable. A game smothering the screen with so much DLSS to hide its poor optimisation that I can't even see 10 meters in front of my character is frankly unacceptable. You're delusional if you think these are acceptable practices. Stop coping and start accepting the reality for what it is. Gaming isn't just fucked in terms of gameplay, it's also fucked in terms of graphics.

I'm no fan of Digital Foundry and I don't watch their content, but your OP is nothing but misinformation.
 
