Overweight Manatee
Scholar
- Joined: Sep 4, 2009
- Messages: 3,520
Wunderpurps said: Does he have 100% or even 40% more FPS? No. Numbers like that are meaningless with no context, especially with an "up to" prepended to them.
When you run a naive test and boot up the game for 2 minutes, memory is not yet fragmented and you get almost no cache misses, so you get overoptimistic benchmark results.
So if you play 5 minutes with basically zero cache misses, the gets and sets could possibly be slower, I guess. When you play for a few hours, things are less rosy, and then your instruction count doesn't matter at all because memory is fragmented and the bottleneck is fetching data into memory.
So a few guys in a few cases seemingly got much better performance for a few minutes, not specified in FPS, but most people even on short runs are not seeing more than a 5-10 FPS difference. I'm guessing that if they loaded the same save midway through the game and did the same stuff for about an hour, the frame difference would be more like 1-2 FPS most of the time.
So wasting half your CPU time doesn't matter because eventually Skyrim's horrible memory problems will slow the game down more? Impeccable logic.
Furthermore, he said up to 40% improvement. FPS tests in CPU-intensive areas show up to 40%. What more do you want?
Wunderpurps said: With the optimization, what can I say. I know for a fact you won't get any SSE instructions from your own code without writing them out yourself. When I run non-graphics code I only get about a 30% speedup compared to full debug mode, and the only option that does much aside from removing debug data is inlining.
The ability of the compiler to improve your code is inversely proportional to your ability to write good code in the first place.
For graphics code it all depends on what your app is doing, but hopefully, if you are a decent programmer with a decent engine, you are only CPU-bound when you have a legitimate reason, such as calculating a bunch of paths.
OK, can we get back to talking about Bethesda's code?