taxalot
I'm a spicy fellow.
400hz screens, bruh.
> realtime lighting garbage they barely manage to ship

what the fuck does lighting have to do with TAA or DLSS? you mean dithered/subsampled effects/shadows/hair being reconstructed via TAA, which is a common optimization and the reason devs often don't let you turn TAA off? you can ship a game without dithered effects if you'd like, or force TAA off anyway.
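A minimal sketch of the optimization in question (assumed and heavily simplified, not any engine's actual code): an alpha = 0.25 surface is drawn as a binary dither pattern covering 1 in 4 pixels, the pattern is shifted each frame, and temporal accumulation averages the frames back into smooth transparency. Turn TAA off and you see the raw dither.

```python
# Simplified sketch: dithered transparency that only resolves correctly
# once a temporal filter averages several jittered frames together.

ALPHA = 0.25  # intended surface opacity

# 2x2 ordered-dither thresholds (normalized Bayer matrix)
BAYER = [[0.00, 0.50],
         [0.75, 0.25]]

def dithered_coverage(x, y, frame):
    # shift the dither pattern every frame (the temporal "jitter"),
    # so each pixel cycles through all four thresholds over 4 frames
    ox, oy = frame % 2, (frame // 2) % 2
    return 1.0 if ALPHA > BAYER[(y + oy) % 2][(x + ox) % 2] else 0.0

def resolve_average(x, y, cycle=4):
    # real TAA blends exponentially into a history buffer; a plain
    # average over one full pattern cycle shows the same idea
    return sum(dithered_coverage(x, y, f) for f in range(cycle)) / cycle
```

Each pixel is fully opaque in exactly one frame out of four, so the temporal average lands on the intended 0.25 everywhere, while any single frame is a visible 1-in-4 checker pattern.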
...yeah? i mean it has nothing to do with "proper area lights"(?), but yes, it would still look like moiré hell, because the geometry would still be complex. And it would look splotchy and moiré-broken if they even tried to do proper area lights.
> Graphics peaked on the PS2 era.

PS1 era, actually, thanks to heavy use of pre-rendering in 2D games, or full-3D games with limited draw distance hiding how ugly full 3D almost always is, still to date. Early rendering flaws also gave simple geometry faces a kind of normal-mapping/depth-perception effect without even trying, lol. And the low resolution otherwise left things to the imagination instead of the ugly reality seen in 90% of modern shite.
It's really just a marketing scheme to sell more TVs/monitors. We've reached a sort of technological "peak" where fundamentally there is no good reason to upgrade from a 1080p screen, but companies still need to sell products, so they keep inventing new and progressively more impractical standards. 4K and most recently 8K give you basically nothing in terms of picture clarity on a TV smaller than 65 inches, and to get the full benefit you'd need a TV the size of a wall. Yet you can easily buy monitors with 4K resolution as small as 15 inches.
Sure, there's a difference in sharpness when you take a still screenshot and compare, but in game you fundamentally won't notice it, and when watching shows or movies it's largely pointless.
I realized this when they were shilling Cyberpunk's raytracing as a "major improvement" to the game's lighting, using still shots of alleys and bars where they would point out... well, basically slightly less artificial lighting.
This, especially with a bit of hindsight, makes all their older framerate analysis worthless, because a good chunk of it used DLSS 1.0 or 2.0, so the looks and performance achieved there no longer match what a current-day user would get.
> Its really just a marketing scheme to sell more TVs/monitors. (...)

100% agree. It seems that I'm immune to most marketing bullshit, unlike, apparently, most people. It just makes me wonder why they aren't immune to it as well. I don't think I'm special, and I have the sight and hearing of an average human. If I can't perceive any difference, or only a marginal one, then what's the point? I highly doubt all those people happily buying 4K/8K screens have superhuman eyesight.
> Well yeah, but who cares. I mean, if you want to replicate the look of real-world footage, sure, it's important.

It is a major improvement from a technical perspective. "Slightly less artificial lighting" is exactly what raytracing does, and getting it right in realtime is not trivial.
> So I don't really see the point *for games*, or at least don't see it as a "holy grail" or something.

RT does these things automagically, dynamically (with interiors and exteriors) and perfectly, instead of needing a million baked lightmaps, and some things, like ambient occlusion, simply cannot be replicated well enough without it. it simplifies the development process while simultaneously giving a much better result. even for unrealistic games there are benefits: look at Wind Waker on GameCube vs Wii U; lighting improvements alone uplift the image a lot vs the flatness of the original.
> RT does these things automagically, dynamically (with interiors and exteriors) and perfectly instead of needing a million baked lightmaps (...)

See, I just don't believe that.
> just lighting improvements alone uplift the image a lot vs the flatness of the original

In 10 years people will unironically claim that Demon's Souls Remake improves on the original, smdh.
Well yeah, but who cares. I mean, if you want to replicate the look of real-world footage, sure, it's important.
But lemme stop everybody right there—why is that important for a game? A stylised look which is not physically realistic can in fact look better.
I've been dabbling in 3D rendering a bit, and quite often people need to do a lot of fakery to *deviate* from what is 100% physically correct. That includes fake light sources, and of course touching up the final images in Photoshop.
Non-physically-accurate rendering can be not just fine but highly preferable from an artistic viewpoint. Even real movies and TV shows involve tons of grading and image manipulation, so why not go straight to the "manipulated" look? With real cameras you don't have that freedom, but games are 100% CGI, so you can bend the rules of physical rendering.
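A minimal sketch of that kind of deliberate deviation (the curve and parameter names here are invented for illustration): take a physically-rendered linear-light value and push it through an artistic grade with lifted blacks, breaking physical correctness on purpose.

```python
# Made-up artistic grading curve: lifted blacks plus gamma/gain shaping.
# Physically, a fully shadowed surface is 0.0; the grade never lets the
# image reach true black, for a stylized "faded" look.

def grade(linear, lift=0.05, gamma=1.0 / 2.2, gain=0.95):
    # lift raises pure black to a stylized floor;
    # gamma and gain shape the rest of the curve, clamped to 1.0
    return min(1.0, lift + gain * (linear ** gamma))
```

The point is simply that the output is an artistic function of the physical result, not the physical result itself; a renderer that insists on correctness has to be bent to produce it.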
> It's a bit like saying "look, ray-tracing has killed painting!" Sure, ray-tracing is cool, but painters were just fine without it.

painting is extremely expressive and has a lot to do with individual skill, so no, it's nothing like that.
> In 10 years people will unironically claim that Demon's Souls Remake improves on the original (...)

i'm ignoring stylistic elements, but yes, it does look better, because WW GC is kind of desaturated, flat, and lacking in contrast. some argue that was the vision and that WW HD looks "too 3D", but i'm not sure about that, and i think technical limitations were the more likely factor, as WW HD has many of the same people credited as the original.
> This is the level of (...) they want you to put up with for realtime lighting (obliterated by hysteresis)

this is just fancy SVOGI. it's a good way to fake it and it can look pretty good, but it's not going to compare to real RT for quality.
> this is just fancy SVOGI. it's a good way to fake it and it can look pretty good, but it's not going to compare to real RT for quality.

There's no magic bullet. RT doesn't even have LOD, CP77 lags like hell, and NVIDIA-adjacent material like the Ray Tracing Gems book tells you to use static shadows for foliage (because as soon as you have a truly dynamic scene, hysteresis clogs everything up).
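The hysteresis complaint is easy to sketch (numbers and names below are illustrative, not from any shipping denoiser): temporal accumulation blends each new frame into a history buffer, `new = lerp(history, current, alpha)`, and the smaller the alpha, the cleaner the image but the longer a sudden lighting change takes to show up.

```python
# Toy model of temporal accumulation lag: after a step change in lighting
# (signal jumps 0 -> 1), the history buffer only reaches
# 1 - (1 - alpha)^n after n frames, so small alphas mean visible ghosting.

def frames_to_converge(alpha, target=0.9):
    history, frames = 0.0, 0
    while history < target:
        # exponential blend of the new frame into the history buffer
        history = history * (1 - alpha) + 1.0 * alpha
        frames += 1
    return frames
```

With a heavy alpha of 0.05, reaching 90% of the new lighting takes 45 frames, i.e. most of a second at 60 fps, which is exactly the kind of smearing people see on moving lights and foliage; a responsive alpha of 0.5 converges in 4 frames but leaves the noise unfiltered.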
> RT is a scam from nVidia and because amd was too weak to capitalize during first generation

yeah, fuck nvidia for pushing technology forward. let's all just take it up the ass from amd instead, where they refuse to implement any new features unless nvidia goes first, then scramble to get something out the door to reach parity.