
Decline I hate Digital Foundry.

Ezekiel

Arcane
Joined
May 3, 2017
Messages
5,616

Here's the video; he starts at 38:53.
The guy who speaks on the game before him said that the character models seemed 'from a bygone era', and he runs with it from there. Amazingly, he compares its models unfavorably to the then-current Forspoken preview, of all games. :lol: They all then proceed to chuckle in agreement as he tuts at the game.

Watched this for 10-20 seconds, and the bearded guy in the bottom right corner makes some pretty good observations.

That new Uncharted game just looks ghastly. Basically they achieved the low-cost "realistic" 60 FPS soap opera / daytime TV look, congratulations. I'd take mid-to-late 2000s 3D graphics at 30 FPS that *look* like a game over this any day. Uh, I want games to actually *look* like games?

I don't have a problem with the HFR cinema look. I wish all of Avatar 2 had looked like that, instead of the back and forth that made the 24 fps parts look terribly choppy. I did not watch The Hobbit in the cinema. (Did not watch them at all, because I can't watch them in the original framerate at home.) Billy Lynn's Long Halftime Walk also looked fine to me. You all think it's "soap opera" because you've had nothing but 24 fps or less since cinema began, I think. You can't even move the camera a particular way because of low-framerate choppiness, but you still see it a lot of the time in pans. Someone in the Movie thread told me it's because modern TVs are too fast. I doubt that explains every instance.
 

Ash

Arcane
Joined
Oct 16, 2015
Messages
6,715
I don't care for them, as they are graphics and tech whores, a.k.a. a cancer in gaming. Hate is a strong word though. I simply don't watch them.
 

Rincewind

Magister
Patron
Joined
Feb 8, 2020
Messages
2,531
Location
down under
Codex+ Now Streaming!
I don't care for them, as they are graphics and tech whores, a.k.a. a cancer in gaming. Hate is a strong word though. I simply don't watch them.
Exactly. We already had the perfect technology to make good games by around 2005, even if you're a graphics whore. If you're an ultra-mega-graphics-whore, add another 5-10 years, so 2010-2015. Honestly, if someone cannot make an engrossing game with 2005, 2010, or 2015 technology, the problem is elsewhere...

People have been able to make good games on 8-bit computers, and many 8-bit games are a lot better than these real-time ray-tracing wonders.

Technological innovation when it comes to gaming at this point is utterly pointless IMO.
 

Elttharion

Learned
Joined
Jan 10, 2023
Messages
1,488
Once upon a time they had useful tips in their articles detailing the performance cost of each graphical setting in some games. No idea if they still do this. They also had recommended settings for a range of GPUs; those were useful.

Then I watched a video of them and I realized that they are horrible at actually playing the games. Never cared about their opinion again.

I don't see a reason to watch them when I can just type 'name of a game + benchmark' and there are dozens of videos showing the performance without any kind of commentary.
 

Rincewind

Magister
Patron
Joined
Feb 8, 2020
Messages
2,531
Location
down under
Codex+ Now Streaming!
Yeah, they strike me as people who never really play anything and are only interested in benchmarking stuff. Like people who tune their cars and then only actually use them to drive to the mall and back twice a week...
 
Joined
Sep 1, 2020
Messages
1,124
I don't watch them usually, but I was mildly annoyed to see them retcon the history of racing games in one of their retrospectives. They seemed to be completely unaware of any game made in Europe during the 90s, and incorrectly attributed some innovations to Japanese games that came later. This is what I expect from American millennials who grew up playing the SNES and didn't bother looking further.

They're also ugly, and that's always a good reason to be suspicious.
 

ADL

Prophet
Joined
Oct 23, 2017
Messages
3,785
Location
Nantucket
Digital Foundry is a group of pseudo-intellectuals who shill relentlessly for the newest, shiniest graphics tech without regard for performance cost
Anyone who does this is good in my book. I want more games to future-proof themselves by including damn near impossible to run settings. I'm tired of my games being held back because some shitter on a GTX 1060 can't run it, or even some nerd on a 4090 whining that he's not maxing it out on a 4090. Just slap a disclaimer on path tracing options saying "THIS IS NOT MEANT FOR THE VAST MAJORITY OF USERS, IF YOU CAN'T RUN IT, COME BACK IN FIVE YEARS. LOWER YOUR SETTINGS AND SHUT THE FUCK UP IN THE MEANTIME KTHX"
 

Modron

Arcane
Joined
May 5, 2012
Messages
10,160
Anyone who does this is good in my book. I want more games to future-proof themselves by including damn near impossible to run settings. I'm tired of my games being held back because some shitter on a GTX 1060 can't run it, or even some nerd on a 4090 whining that he's not maxing it out on a 4090. Just slap a disclaimer on path tracing options saying "THIS IS NOT MEANT FOR THE VAST MAJORITY OF USERS, IF YOU CAN'T RUN IT, COME BACK IN FIVE YEARS. LOWER YOUR SETTINGS AND SHUT THE FUCK UP IN THE MEANTIME KTHX"
Nice SDG impression.
 

Hell Swarm

Educated
Joined
Jun 16, 2023
Messages
939
Again, my biggest issue is not actually DF themselves. Well. Yes it is. But no. The biggest issue is how the entire gaming sphere hangs on what DF is going to say about each game release when it is, in fact, irrelevant.

FOR FUCK'S SAKE, they did a TECH ANALYSIS of SUPER MARIO RPG on the SWITCH.

[attached: two screenshots of DF's Super Mario RPG tech analysis]

These people are beyond mental repair.
They do occasionally give an interesting analysis. Like how From Software games technically run at their target framerate, but frame pacing issues still give you major frame-rate problems, most notably in Bloodborne. Without them I would have thought From had tried to push the PS4 too hard, not that From's engine is fucked and they need to fix it or change engines.
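That's exactly the kind of thing a frame-time graph makes obvious. A minimal sketch with made-up frame times (not DF's actual capture tooling) of why "hits the average framerate" and "feels smooth" are different claims:

```python
# Minimal sketch with made-up frame times (not DF's actual capture tooling):
# bad frame pacing means frames arrive at uneven intervals even though the
# average interval, and therefore the headline fps number, looks fine.

well_paced  = [33.3] * 8            # every frame ~33.3 ms -> smooth 30 fps
badly_paced = [16.7, 50.0] * 4      # same average, but alternating intervals -> visible judder

def pacing_report(frame_times_ms):
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    # judder shows up as large frame-to-frame swings, not in the average
    max_swing = max(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))
    return f"avg {1000 / avg_ms:.1f} fps, worst frame {max(frame_times_ms):.1f} ms, max swing {max_swing:.1f} ms"

print("well paced :", pacing_report(well_paced))
print("badly paced:", pacing_report(badly_paced))
```

Both traces report the same average fps; only the second one judders.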
I love it when normies go mad because their game drops to 59 fps for a couple of seconds on their "next-gen" hardware.

That said, this mindset is what actually killed gaming. Game quality will improve only when "next-gen" hardware is no longer available/cheap.
Normalfags don't care about FPS. They never have and never will. If they're engaging in online discourse, they're no longer normalfags.
I don't watch them usually, but I was mildly annoyed from seeing them retcon the history of racing games in one of their retrospectives. They seemed to be completely unaware of any game made in Europe during the 90s, and incorrectly attributed some innovations to Japanese games that came later. This is what I expect from American millennials who grew up playing the SNES and didn't bother looking further.

They're also ugly, and that's always a good reason to be suspicious.
Big problem with all gaming history. Americans dictate it, and they have no exposure to anything but Nintendo, Sega, Microsoft and Sony. The European scene is plagued with trannies though, and they're turning off a lot of people who would otherwise support archiving this stuff. When you see "Special guest Kim Justice" on the website, you know they're not people you want to be involved with. I have a large collection I would gladly donate to a suitable museum or archivist, and I can't find any I would trust.
 

soutaiseiriron

Educated
Joined
Aug 8, 2023
Messages
224
And they measure with fucking framemeters to whine and complain about how this game plays at 60 FPS moooooost of the time but when you enter a big city while it loads it switches to 57 or sometimes even 55 FPS for a couple of seconds
they're never this anal and they just mention "with a few drops" if that's the case. they have an issue with stutter (1% and 0.1% lows), not small drops.
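for reference, this is roughly what those figures mean; a toy sketch with made-up frame times, not anyone's real capture:

```python
# Toy sketch of what "1% lows" and "0.1% lows" mean: average the slowest 1% /
# 0.1% of recorded frame times and express that as fps. A capture can average
# close to 60 fps and still have ugly lows if there are occasional long hitches.

def percentile_low_fps(frame_times_ms, fraction):
    slowest = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(slowest) * fraction))      # at least one frame
    return 1000.0 / (sum(slowest[:n]) / n)

# hypothetical capture: mostly 16.7 ms frames with a handful of 100 ms hitches
frame_times = [16.7] * 995 + [100.0] * 5
print("average fps :", round(1000 * len(frame_times) / sum(frame_times), 1))
print("1% low fps  :", round(percentile_low_fps(frame_times, 0.01), 1))
print("0.1% low fps:", round(percentile_low_fps(frame_times, 0.001), 1))
```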
Then they do a DLSS vs FSR analysis where they zoom in on the edges of 3D models and compare stills of them being blurred, and while a difference is noticeable, it's never apparent to a normal player who, you know, plays the game.
their tests of upscalers primarily focus on motion, not stills, and yes fsr is clearly the worst. fsr often looks like a blotchy mess with a lot of disocclusion and oversharpening artifacts around moving edges, while the competition understands edges significantly better because AI models are always going to have a greater understanding of these things than a hand tuned temporal algorithm. fsr isn't even great compared to TSR in unreal engine.
Jedi Survivor played nice on my GTX 980. I didn't fucking spend every minute of gameplay looking at the framerate, I just played it. The Last of Us was mostly 30 FPS on PS3. GTA 5 could hardly reach 30 FPS. And people had fun.
your gtx 980 ran that game at 40 fps and the game is a stuttery mess.
nobody is taking 30fps fun away from you
they are also not 60fps purists and frequently recommend 30fps quality modes over 60fps performance modes.

DF is a gossip shop. For hardware reviews
they aren't a hardware channel, so i'm not sure what your point is.
 

Devastator

Learned
Joined
Jan 7, 2021
Messages
255
Location
Chaotic Neutral
Thank you for listening to my rant.

I think they've long departed from their initial principles, mirroring the course of many counterparts in the industry. Initially established on a solid foundation of benchmarks, they've since moved towards embellishment to bait clicks. This echoes the familiar narrative seen in traditional media, engaged in a desperate pursuit of online audience engagement. At the expense of journalistic integrity, of course. A true race to the bottom.

DF's content creation gravitates towards sensationalism, pandering to the whims of an uninformed majority. It's the only possible outcome of the absence of any real objective, professional, and ethical standards. The prevalent discourse on social platforms fully reflects this, where debates rage over trivialities like brand rivalries or performance differences - all fueled solely by marketing narratives rather than any substantive understanding. So reviewers now resemble mere extensions of the corporate public relations machinery. While they may denounce outright failures, the bulk of their output tends towards endorsing mediocrity.

It becomes apparent that DF have accurately analyzed (and adapted to) their audience. By making more outlandish content, they've tapped into a reliable stream of clicks and improved their revenue streams. This is a cycle they can't afford to break or deviate from.

P. S. Digital Foundry's review of Baldur's Gate's performance on consoles was a masterclass in obfuscation and spin. They skillfully pulled off the following: they highlighted the game's excessive demands on console hardware, particularly in Act 3, where performance plummeted to an unacceptable level (framerates dropping into the low 20s, not to mention frame pacing issues), yet managed to frame this critique within an overall positive narrative, repeatedly emphasizing that the game's performance on consoles was generally satisfactory (even decently good). In other words, a complete disaster was presented as largely unproblematic! On the other hand... given the fervent fandom surrounding BG3, dissenting voices in the media are probably counterproductive. The prevailing strategy is to shower the game with praise, ensuring a steady stream of clicks and revenue. And for those daring to criticize, the approach seems to be to dilute any negative feedback within a sea of praise.

P. P. S. I remember GameSpot(?) publicly stating that they get greater revenue from guides and editorial pieces than from conventional reviews (I can't find the reference). So it stands to reason that Digital Foundry might similarly prioritize generating income from speculative opinions on hypothetical hardware performance.
 
Last edited:

Ravielsk

Magister
Joined
Feb 20, 2021
Messages
1,554
Digital Foundry these days is little more than a marketing outlet. Most of their videos are just running damage control for otherwise terrible products and selling overpriced hardware. I realized this when they were shilling Cyberpunk's ray tracing as a "major improvement" to the game's lighting, using still shots of alleys and bars where they would point out... well, basically slightly less artificial lighting.
This was during a time when GPU prices were at their very peak. So they were essentially saying that a roughly 2000-3000 euro upgrade was justified for an improvement that the player has no way of noticing in about 98% of the game.
Example of what I am talking about:
[attached: Cyberpunk 2077 screenshot comparison]

And it's not like they have gotten any better since. Almost all of their framerate analysis focuses exclusively on DLSS/FSR performance at the expense of native rendering. With a bit of hindsight, this especially makes all their older framerate analyses worthless, because a good chunk of them use DLSS 1.0 or 2.0, so the image quality and performance achieved there no longer match what a current-day user would get. And this is something that will only get worse with time: as new updates and GPUs come to market, even DLSS 3.0 will be replaced, and whatever result they shill now will no longer be relevant.
 

soutaiseiriron

Educated
Joined
Aug 8, 2023
Messages
224
Digital Foundry these days is little more than a marketing outlet. Most of their videos are just running damage control for otherwise terrible products and selling overpriced hardware. I realized this when they were shilling Cyberpunk's ray tracing as a "major improvement" to the game's lighting, using still shots of alleys and bars where they would point out... well, basically slightly less artificial lighting.
This was during a time when GPU prices were at their very peak. So they were essentially saying that a roughly 2000-3000 euro upgrade was justified for an improvement that the player has no way of noticing in about 98% of the game.
it was a major improvement, are they supposed to stop discussing or showcasing improvements in graphics because the hardware was new and expensive? would you have moaned about dx11 or physx comparisons 15 years ago too?
Almost all of their framerate analysis focuses exclusively on DLSS/FSR performance at the expense of native rendering.
there is nothing wrong with that considering upscaling is close to free performance at this point and that's how most people play their games + they generally also have native res comparisons. obviously sub-30fps filler footage on ultra high end games at 4k makes for really shitty video especially when youtube bitrate will eat most of the visible difference anyway.
With a bit of hindsight, this especially makes all their older framerate analyses worthless, because a good chunk of them use DLSS 1.0 or 2.0, so the image quality and performance achieved there no longer match what a current-day user would get.
not how it works
dlss 2.0 has the same performance as dlss 3.7, and the resolution scales remain the same. you can also simply drag and drop a new 3.7 .dll from github into any game that has a native implementation of 2.0 onwards. 1.0 is not API-compatible with 2.0 onwards, since they introduced a TAA component with 2.0, which was a major change, but very few games kept dlss 1.0 since it only lasted about a year and only nvidia partner titles really used it to begin with. i believe the only games that still have 1.0 are monster hunter world and anthem.
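the swap itself is about as simple as it sounds; a rough sketch (hypothetical paths, and the usual nvngx_dlss.dll filename, so check your own install first):

```python
# rough sketch of the ".dll drag and drop" described above: back up the game's
# bundled DLSS library and drop in a newer one. the paths here are made up for
# illustration, and the file is usually called nvngx_dlss.dll, but check your
# own game folder before doing anything.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")            # hypothetical install location
new_dll  = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLSS library you downloaded

old_dll = game_dir / "nvngx_dlss.dll"
if old_dll.exists():
    shutil.copy2(old_dll, old_dll.with_name(old_dll.name + ".bak"))  # keep a backup
    shutil.copy2(new_dll, old_dll)                                   # swap in the newer version
    print("replaced", old_dll)
else:
    print("no DLSS 2.x+ library found; a DLSS 1.0 title can't be upgraded this way")
```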
 

Ravielsk

Magister
Joined
Feb 20, 2021
Messages
1,554
it was a major improvement
No, it just wasn't. Technically it's a leap, but in practical terms it's a minor step up. Especially in the context of Cyberpunk 2077.
are they supposed to stop discussing or showcasing improvements in graphics because the hardware was new and expensive?
No, but there is a difference between showcasing something new and 24 minutes of shilling for a 1000+ euro upgrade that, even based on their examples, is mostly invisible (unless you play Cyberpunk as a walking sim, in which case I suppose you will notice it a lot more).
 

abija

Prophet
Joined
May 21, 2011
Messages
2,921
What annoys me is that for consoles they care about a rock-hard 60 FPS (or whatever the target is, which I can get behind), but when it comes to PCs they suck the raytracing cock even if it means 35 FPS in DLSS performance mode. OMG, the reflections! Meanwhile the actual game is a blurry, unplayable mess.

There is some sort of podcast with them and the Cyberpunk/Nvidia guys where someone said "the Matrix movie at DVD quality still looks better than modern video games". So I guess the target is 24 framegen fps at 320p.
 

soutaiseiriron

Educated
Joined
Aug 8, 2023
Messages
224
Technically it's a leap, but in practical terms it's a minor step up
Realistic bounce light, real reflections, realistic ambient occlusion, nothing having an unnatural glow (an inevitable issue with any non-RT game): that's just a "minor step up"?
No, but there is a difference between showcasing something new and 24 minutes of shilling for a 1000+ euro upgrade
They showed the game with an RTX 2060 at the end of the video, which was like $500 at the peak, and they didn't shill an upgrade, they showcased the tech. Again, are they supposed to showcase the tech without having something to run it on? Are they supposed to not show it at all? Are they supposed to wait for it to get cheaper in 3 years?
examples is mostly invisible
Individual examples can be, as with your screenshot, but each one is just a single part that adds to the totality of the image. Should they just ignore the small details and only focus on the largest and most noticeable ones (in other words, do their job badly)? Is the entire video just supposed to be this one screenshot?
[attached screenshot]
 

ultimanecat

Arcane
Joined
Mar 19, 2015
Messages
591
I will give DF this much: in an industry overrun with journalists who jerk off developers (sometimes literally) and make no attempt to hide their disdain for the people who actually buy the games, DF is maybe the last mainstream outlet actually treating games like products that need to be purchased with money and which can be faulty, poorly made, and/or unfit for purpose. There are hundreds of better options for consumer-grade news and reviews of games and hardware, but those better options tend to be ignored by devs and publishers when they complain, whereas DF has just enough juice in the industry that occasionally people on that side of things answer their emails.
 

abija

Prophet
Joined
May 21, 2011
Messages
2,921
Yeah, they treat games like products they sell advertising for.

Realistic bounce light, real reflections, realistic ambient occlusion, nothing having an unnatural glow which is an inevitable issue with any non-RT game is just a "minor step up"?

Is there any game out there where RT affects gameplay?
After 3 generations, how many games actually look good and have RT make a real difference in the visual department? Fewer than 10?
Because hardware isn't even close to "there yet", we get small buffers and horrible performance, even with upscalers on top of upscalers, spatio-temporal denoisers, antialiasing and sharpeners turning motion into a blurry smear. But hey, we gotta have those real reflections...
 

soutaiseiriron

Educated
Joined
Aug 8, 2023
Messages
224
Is there any game out there where RT affects gameplay?
is there any movie where being recorded on a consumer DSLR instead of 35mm affects the enjoyment? is there any movie where low quality VFX detract from the enjoyment (i.e. nearly anything prior to y2k)?
it's a stupid premise. visuals will go forward whether you like it or not because technology does not regress. there's always a demand for pushing forward. there's no point where people will say "yup, good enough" and stop innovating just because luddites like you hate it. ray tracing became the standard in pre-rendered animation after a long time, the same is happening to real time game graphics right fucking now.
and frankly, most of those who say that "graphics don't matter!" are total hypocrites and would unconditionally bitch and moan if a modern game released with 7th generation visuals and a $60 price tag.
Because hardware isn't even close to "there yet"
it was "there yet" 2 years ago. keep with the times. you can get playable RT performance out of even mainstream graphics cards at this point and shit like software lumen exists too and is becoming the minimum requirement to run certain games at all. by 2030, this shit isn't even going to be a topic of discussion and raytracing will just seem like the obvious answer.
upscalers, spatio-temporal denoisers, antialiasing and sharpeners turning motion into a blurry smear
i presume you're talking about TAA and derivative upscalers like dlss2/xess/fsr2, and the fact is that the blur and detail loss from them is utterly negligible compared to native-res TAA (depending on resolution scale; the target is 1:4, which all of them do well with at 4k, but at 2k and 1080p the results are spottier for sure). the alternative is having games be a complete disaster of aliasing and a hell of moire patterns, so taa is a requirement to not make the graphics look fucking broken.
also, would you have bitched about LOD models in 1998? tessellation/sub-division? if no, then temporal upscaling is good, and you have no right to complain.
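for concreteness, here's roughly what those resolution scales work out to (the per-axis factors below follow the usual quality/balanced/performance convention and are approximate, not taken from any particular game):

```python
# quick numbers behind the resolution scales mentioned above. the per-axis
# factors follow the usual quality/balanced/performance convention and are
# approximate, not taken from any particular game. a 1:4 pixel ratio means
# rendering at half the output resolution on each axis.

modes   = {"quality": 1 / 1.5, "balanced": 1 / 1.7, "performance": 1 / 2.0}   # per-axis scale
outputs = {"4K": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}

for out_name, (w, h) in outputs.items():
    for mode, s in modes.items():
        iw, ih = round(w * s), round(h * s)
        pixel_ratio = (w * h) / (iw * ih)
        print(f"{out_name} {mode:11s}: {iw}x{ih} internal (~1:{pixel_ratio:.1f} pixels)")
```

so 4k performance mode reconstructs from a 1080p-sized internal image, which is why it holds up far better than the same mode at 1080p output.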
 
Last edited:

Rincewind

Magister
Patron
Joined
Feb 8, 2020
Messages
2,531
Location
down under
Codex+ Now Streaming!
No, but there is a difference between showcasing something new and 24 minutes of shilling for a 1000+ euro upgrade that, even based on their examples, is mostly invisible (unless you play Cyberpunk as a walking sim, in which case I suppose you will notice it a lot more).
Yeah, that's something I just don't get. All that 4K TV craze, etc. 4K is nice for productivity applications on a PC, sure, and for CRT shaders, of course. But for regular games, 1080p is enough, maybe 1440p if you have a big-ass 27"+ monitor.

Then for actual movies and TV shows, I'd be fine with 720p on a good plasma or OLED forever. 1080p makes a minuscule difference for most content, and I can't imagine why anyone would prefer 4K given its numerous disadvantages (massively increased file size and bandwidth requirements).

Same deal with 24-bit / 96 kHz audio for just listening to music (it's useful in some production scenarios, though). Most people don't even know anything about digital audio, they just think "bigger numbers == better", then convince themselves that they "hear the difference", yet don't even know what kind of difference they should be listening for... Then in a hearing test it would turn out they probably can't hear a thing above 14-18 kHz, depending on age (96 kHz audio can theoretically represent information up to about 48 kHz, and 48 kHz audio up to about 24 kHz).

You lose the 18 kHz+ region by the age of 16-18 (depending on how much you abuse your ears), unless you live in a forest, away from loud cities. I'm practically deaf above 14 kHz, I've tested it numerous times, and that's pretty normal above 40. Then it only gets worse in your 50s, 60s, etc.; people normally go down to about 12 kHz or a bit lower.

No wonder lossy audio compression schemes basically brickwall-filter everything above 16-18 kHz to effectively zero, depending on the quality settings. Extremely few people can hear anything up there. Maybe some newborn babies, but they don't care much :)
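The arithmetic behind that is simple enough; a small sketch (the hearing ceilings are the rough, age-dependent figures from above, not lab data):

```python
# Small sketch of the arithmetic above: the Nyquist limit is half the sample
# rate, and the hearing ceilings are the rough, age-dependent figures from this
# post, not measured data.

for sr_hz in (44_100, 48_000, 96_000):
    print(f"{sr_hz / 1000:g} kHz sampling -> content up to ~{sr_hz / 2000:g} kHz (Nyquist)")

hearing_ceiling_khz = {"late teens": 18, "around 40": 14, "60s and up": 12}
for age, khz in hearing_ceiling_khz.items():
    print(f"{age}: hears up to roughly {khz} kHz")
```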
 
Last edited:

Ravielsk

Magister
Joined
Feb 20, 2021
Messages
1,554
Yeah, that's something I just don't get. All that 4K TV craze, etc. 4K is nice for productivity applications on a PC, sure, and for CRT shaders, of course. But for regular games, 1080p is enough, maybe 1440p if you have a big-ass 27"+ monitor.

Then for actual movies and TV shows, I'd be fine with 720p on a good plasma or OLED forever. 1080p makes a minuscule difference for most content, and I can't imagine why anyone would prefer 4K given its numerous disadvantages (massively increased file size and bandwidth requirements).
It's really just a marketing scheme to sell more TVs/monitors. We have reached a sort of technological "peak" where fundamentally there is no good reason to upgrade from a 1080p screen, but companies still need to sell products, so they keep inventing new and progressively more impractical standards. 4K and, most recently, 8K are resolutions that basically give you nothing in terms of picture clarity on a TV smaller than 65 inches, and to get the full benefit you would need a TV the size of a wall. Yet you can easily purchase 4K monitors as small as 15 inches.

Sure, there is a difference in sharpness when you take a still screenshot and compare, but fundamentally you will not notice it in-game, and when watching shows or movies it's largely pointless.
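To put a rough number on the viewing-distance point: a back-of-the-envelope sketch of angular pixel density (the ~60 pixels-per-degree acuity figure it compares against is a commonly quoted approximation, my assumption, not anything from DF):

```python
# Back-of-the-envelope sketch (all numbers illustrative): angular pixel density
# of a 65" 16:9 TV viewed from 3 m. The ~60 pixels-per-degree figure often
# quoted for normal visual acuity is an assumption here, not a hard threshold.
import math

def pixels_per_degree(h_pixels, diagonal_in, distance_m):
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)        # screen width from the diagonal
    h_fov_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
    return h_pixels / h_fov_deg

for label, h_pixels in [("1080p", 1920), ("4K", 3840)]:
    ppd = pixels_per_degree(h_pixels, diagonal_in=65, distance_m=3.0)
    print(f'{label} on a 65" TV at 3 m: ~{ppd:.0f} pixels per degree')
```

Even 1080p already lands above that acuity figure at a normal couch distance, which is the point.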
 
Last edited:
