
EPIC Unreal Engine 4 / Unity 5 / CryTek CryEngine 3 / Square Enix Luminous Studio

Wirdschowerdn

Ph.D. in World Saving
Patron
Joined
Nov 30, 2003
Messages
34,465
Location
Clogging the Multiverse with a Crowbar
It's all interlinked. You want something to look better? Then you also need better animations, more physics rigs, better behaviour, etc. in order not to break the illusion of a realistic world. That will inevitably extend development time and cost... and yet the end result will just be "another bland and shit game", thanks to academically educated game designers (game design schools lol) who never bothered or learned to think outside the box.
Good thing they're working on middleware and APIs to make all of that easier, e.g.:
Animations (ANT): http://bf3blog.com/tag/ant/
Vegetation and plants (SpeedTree): http://www.speedtree.com/
Post Processing (YEBIS2, mentioned in the first post): http://www.siliconstudio.co.jp/middleware/yebis/jp/
UI (Scaleform): http://gameware.autodesk.com/scaleform
AI (look at the second CryEngine 3 video about AI above)
There's even stuff for face capture (L.A. Noire, the newest Quantic Dream game, etc.) or lip-sync (I think Valve already did that with Source)
etc.
There are always technical solutions to these problems.
And again, scaling shit back for 7-10-year-old hardware with 256MB of dedicated graphics RAM and making it run above 30 FPS also costs money and manpower, a lot actually.

I'm not sure "cause stuff gets more expensive/complicated" is a proper reason for technological stagnation, which has now been going on for almost seven years, with some hardware parts/specs in the consoles at the level of ten years ago.
Developers have brains and can decide how they want to make a game: what amount of graphical detail, what engine, whether it's 2D/retro, 3D or whatever, and they should use that freedom.
Some of them being retarded and deciding to only make flashy "AAA" games because they want to be moar like War of Gears or Call of Doody, and then going bankrupt, isn't exactly a reason to halt technology forever...

Small- and mid-budget indie games and "AAA productions" can coexist, as the likes of Minecraft prove, and we're still getting Wasteland 2, Double Fine Adventure and Shadowrun Returns.

Look, that's all nice and dandy... but just look at the leaked PS4 specs. An AMD "Steamroller" quad-core? Are you kidding me? That's actually a step down from the Cell. But at least it's supposed to have an AMD 7970 GPU that can render us those flashy graphics! Wow! But it probably can't even calculate a physics-based game like LBP anymore.

Uh-huh. So much for more progress.

And let's not forget that Epic Games right now has huge input into what hardware gets into the box. You know, those guys who make flashy engines, but nothing else.
 

MetalCraze

Arcane
Joined
Jul 3, 2007
Messages
21,104
Location
Urkanistan
Step down? The Cell is weak-ass but expensive shit that couldn't hold a candle to the Intel and AMD dual-cores coming out at the time, yet it pushed the PS3's price to $600.

But yeah, the PS4 will be weak. They will have to use mediocre hardware because parts are considerably more expensive today.

Again, it is fucking Real Time xD

Watch after 3:30

That explains why it had such huge pixels. Rendered in 1024x600 or something.
 

Marsal

Arcane
Joined
Oct 2, 2006
Messages
1,304
Look, that's all nice and dandy... but just look at the leaked PS4 specs. An AMD "Steamroller" quad-core? Are you kidding me? That's actually a step down from the Cell. But at least it's supposed to have an AMD 7970 GPU that can render us those flashy graphics! Wow! But it probably can't even calculate a physics-based game like LBP anymore.
This is not just wrong, but colossally stupid on multiple levels. Why do you comment on stuff you haven't got the slightest clue about?
 

Wirdschowerdn

Ph.D. in World Saving
Patron
Joined
Nov 30, 2003
Messages
34,465
Location
Clogging the Multiverse with a Crowbar
The AMD "Steamroller" is based on the Bulldozer architecture, which is a total failure. It's maybe good for a laptop CPU, but not for a Next Gen gaming console, you fucking moron.

The CELL, if understood and optimized well, is, after 6 years, still "respectable" in terms of GFLOPS. But I can guarantee you the Bulldozer APU bullshit will be one HUGE FLOP.

Luckily, CPU power these days isn't so important for gaming anymore, as more and more work gets off-loaded to the GPU. Still, it's pathetic and cheap. Even the Wii U has an IBM Power7 CPU, which outperforms current i7 x86 CPUs.
 

Marsal

Arcane
Joined
Oct 2, 2006
Messages
1,304
The AMD "Steamroller" is based on the Bulldozer architecture, which is a total failure. It's maybe good for a laptop CPU, but not for a Next Gen gaming console, you fucking moron.

The CELL, if understood and optimized well, is, after 6 years, still "respectable" in terms of GFLOPS. But I can guarantee you the Bulldozer APU bullshit will be one HUGE FLOP.

Luckily, CPU power these days isn't so important for gaming anymore, as more and more work gets off-loaded to the GPU. Still, it's pathetic and cheap. Even the Wii U has an IBM Power7 CPU, which outperforms current i7 x86 CPUs.
Citation needed, faggot! Let's see some tests. Oh, wait, you're making stuff up and talking out of your ass, dumbfuck!

You aren't going to try to defend the retarded shit about 7970?
 

Wirdschowerdn

Ph.D. in World Saving
Patron
Joined
Nov 30, 2003
Messages
34,465
Location
Clogging the Multiverse with a Crowbar
The AMD "Steamroller" is based on the Bulldozer architecture, which is a total failure. It's maybe good for a laptop CPU, but not for a Next Gen gaming console, you fucking moron.

The CELL, if understood and optimized well, is, after 6 years, still "respectable" in terms of GFLOPS. But I can guarantee you the Bulldozer APU bullshit will be one HUGE FLOP.

Luckily, CPU power these days isn't so important for gaming anymore, as more and more work gets off-loaded to the GPU. Still, it's pathetic and cheap. Even the Wii U has an IBM Power7 CPU, which outperforms current i7 x86 CPUs.
Citation needed, faggot! Let's see some tests. Oh, wait, you're making stuff up and talking out of your ass, dumbfuck!

You aren't going to try to defend the retarded shit about 7970?

Google it. Or try this, you lazy fuck.

And this has been around for at least half a year from multiple sources; only the more powerful GPU is news.

The AMD Bulldozer APU might be cheap as ass to manufacture, but I doubt it can do the same magic as the Cell using Havok (CPU-based physics). And PhysX (GPU-based physics) won't be an option this time, since PhysX belongs to Nvidia. But it doesn't matter, since Sony prefers to make movies and Final Fantasy bullshit instead. They're gonna be the next Sega if they keep making such stupid decisions. Maybe fire another 10k employees to set things straight?
 

Marsal

Arcane
Joined
Oct 2, 2006
Messages
1,304
Google it. Or try this, you lazy fuck.

And this has been around for at least half a year from multiple sources; only the more powerful GPU is news.
Not rumors, you retard, tests comparing Cell, Power7, Bulldozer and other x86 CPUs.

The AMD Bulldozer APU might be cheap as ass to manufacture, but I doubt it can do the same magic as the Cell using Havok (CPU-based physics). And PhysX (GPU-based physics) won't be an option this time, since PhysX belongs to Nvidia. But it doesn't matter, since Sony prefers to make movies and Final Fantasy bullshit instead. They're gonna be the next Sega if they keep making such stupid decisions. Maybe fire another 10k employees to set things straight?
OpenCL, brah. You're digging the hole deeper. Just give up and think before posting gibberish next time.
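For the record: OpenCL is a vendor-neutral compute API, so GPU physics doesn't have to mean Nvidia's PhysX at all. A minimal sketch of what a trivial particle-integration kernel plus host setup looks like (kernel and variable names are invented for illustration; error handling and cleanup omitted):

```cpp
// Hedged sketch: vendor-neutral GPU compute via OpenCL instead of PhysX.
#include <CL/cl.h>
#include <vector>
#include <cstdio>

// OpenCL C kernel: one explicit Euler integration step per particle.
static const char* kSrc = R"(
__kernel void integrate(__global float4* pos,
                        __global float4* vel,
                        const float dt) {
    size_t i = get_global_id(0);
    float4 g = (float4)(0.0f, -9.81f, 0.0f, 0.0f);
    vel[i] += g * dt;        // apply gravity
    pos[i] += vel[i] * dt;   // integrate position
}
)";

int main() {
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, nullptr);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, nullptr);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, nullptr);

    const size_t n = 4096;  // particle count
    std::vector<float> pos(n * 4, 0.0f), vel(n * 4, 0.0f);
    cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 pos.size() * sizeof(float), pos.data(), nullptr);
    cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 vel.size() * sizeof(float), vel.data(), nullptr);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
    clBuildProgram(prog, 1, &dev, nullptr, nullptr, nullptr);
    cl_kernel k = clCreateKernel(prog, "integrate", nullptr);
    float dt = 1.0f / 60.0f;
    clSetKernelArg(k, 0, sizeof(dPos), &dPos);
    clSetKernelArg(k, 1, sizeof(dVel), &dVel);
    clSetKernelArg(k, 2, sizeof(dt), &dt);
    clEnqueueNDRangeKernel(q, k, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(q, dPos, CL_TRUE, 0, pos.size() * sizeof(float),
                        pos.data(), 0, nullptr, nullptr);
    std::printf("y after one step: %f\n", pos[1]);  // releases omitted for brevity
    return 0;
}
```

Runs on any AMD, Intel or Nvidia GPU with an OpenCL driver, which is the whole point.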

If the rumors are true, this is not a bad plan from Sony, IMO. You pair up a mass produced CPU and a powerful, discrete, mass produced GPU (no need to pay R&D costs up front) and get an expensive, but powerful console. You subsidize the first wave of consoles. After next TSMC die shrink, you combine CPU and GPU into one APU and get much cheaper, less power hungry, smaller console and rake in the profits from hardware, not just games (or lower the price and enjoy a huge sales boost).
 

MetalCraze

Arcane
Joined
Jul 3, 2007
Messages
21,104
Location
Urkanistan
Morgoth said:
The CELL, if understood and optimized well, is, after 6 years, still "respectable" in terms of GFLOPS.

Yeah, it's one giant flop, especially compared to more or less modern CPUs.
 

Wirdschowerdn

Ph.D. in World Saving
Patron
Joined
Nov 30, 2003
Messages
34,465
Location
Clogging the Multiverse with a Crowbar
If the rumors are true, this is not a bad plan from Sony, IMO. You pair up a mass produced CPU and a powerful, discrete, mass produced GPU (no need to pay R&D costs up front) and get an expensive, but powerful console. You subsidize the first wave of consoles. After next TSMC die shrink, you combine CPU and GPU into one APU and get much cheaper, less power hungry, smaller console and rake in the profits from hardware, not just games (or lower the price and enjoy a huge sales boost).

It's not a bad plan for Sony since they can't afford to produce anything but cheap crap anymore, but it's bad for the customer, since it won't really be much more powerful than the PS3.

I doubt we'll see something better than, say, Beyond: Two Souls (oops, runs on PS3), except in 1080p with some more FPS on the PS4. Wow, big advancement indeed.

Not rumors, you retard, tests comparing Cell, Power7, Bulldozer and other x86 CPUs.

There are plenty of these tests out there. The POWER7 crushes all its competition. The Bulldozer is a piece of crap. But hey, it's an affordable piece of crap, so be my guest.
 

Dexter

Arcane
Joined
Mar 31, 2011
Messages
15,655
What the fuck are you talking about?
Nobody was doing PS3 games because it's shit, and especially shit HARD, to do parallelization right. It's easy if you want to do stuff like encoding, or tasks you can spread out evenly... that's why a lot of people abused PS3s for those tasks. Designing it with 8 cores/SPEs in mind was just retarded of them for the purpose of gaming (especially around 2005) and nearly fucked them over, especially at the start.
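To illustrate the difference: an evenly spreadable job parallelizes trivially, which is exactly why people farmed PS3s for number crunching. A minimal C++11 sketch of the easy case (illustrative only, function names made up):

```cpp
// Embarrassingly parallel work: chop a buffer into equal chunks, one per core.
// This is the easy case; game logic with tangled dependencies is the hard one.
#include <algorithm>
#include <cmath>
#include <thread>
#include <vector>

void process_chunk(std::vector<float>& data, size_t begin, size_t end) {
    for (size_t i = begin; i < end; ++i)
        data[i] = std::sqrt(data[i]);   // stand-in for encoding-style number crunching
}

int main() {
    std::vector<float> data(1 << 20, 2.0f);
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    size_t chunk = data.size() / n;
    for (unsigned t = 0; t < n; ++t) {
        size_t begin = t * chunk;
        size_t end = (t + 1 == n) ? data.size() : begin + chunk;
        pool.emplace_back(process_chunk, std::ref(data), begin, end);
    }
    for (auto& th : pool) th.join();    // no locks needed: chunks never overlap
}
```

No shared state, no synchronization; AI, physics and game logic are nothing like this.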

Next you're going to tell us that the CPU just needs more GHz, cause more GHz is always better and more important than architecture!

Also the newest rumors come from here: http://www.psx-sense.nl/89207/exclu...ver-de-playstation-4-inclusief-specificaties/

The Specs being:
  • AMD Fusion APU, codenamed 'Liverpool'
  • Quad-core AMD x86 at 3.2 GHz
  • ATI r10xx (3rd generation) at 800 MHz and 1843 GFLOPS, codenamed 'Tahiti'
  • 2GB of unified memory (still uncertain, this may become 4GB)
  • Blu-ray
  • HDMI 1.4 – 1080p output
  • 320GB+ HDD
  • 16GB Flash Memory
The site also specifically says that the CPU is codenamed "Steamroller", which isn't even out yet and nobody really knows shit about: http://news.softpedia.com/news/AMD-...to-Bring-Significant-Performance-264918.shtml
And yet you're still going on about it...

Their biggest problem with the PS3 wasn't even the CPU or GPU so much as the 256MB of dedicated graphics memory. There are apparently also lots of developers trying, once again, to convince them to go with 4GB of RAM this time around... Can't do proper open-world games, and especially awesome AI, without enough RAM.
It's like they're oblivious; the Xbox 360 nearly shipped with 256MB total RAM: http://www.1up.com/news/epic-games-cost-microsoft-billion
CryTek would like at least 8GB for "Next Gen" xD http://www.neoseeker.com/news/16241-crytek-wants-minimum-8gb-memory-next-console-generation/
 

Marsal

Arcane
Joined
Oct 2, 2006
Messages
1,304
There are plenty of these tests out there. The POWER7 crushes all its competition. The Bulldozer is a piece of crap. But hey, it's an affordable piece of crap, so be my guest.
Then it shouldn't be a problem to link a few, should it?

You aren't going to comment on OpenCL? You're learning!
 

Gord

Arcane
Joined
Feb 16, 2011
Messages
7,049
Comparing different CPU architectures is usually going to mean shit.
Even comparing different x86 arches is often complicated enough.
Look at Bulldozer vs Sandy Bridge. Obviously SB is much better in just about everything (with the exception of highly parallel tests, as the higher core count of Bulldozer helps in this case).
Still, if you take a more in-depth look you will even find some benches where the BD architecture is better than SB (unfortunately those are few and far between and don't contribute much to overall "real world" performance). Same if you optimize the code for BD instead of Intel's architecture.
It won't make BD shine, but it will definitely make it look like less of a fiasco.
Anyway, my point is that if you look at benches that are close to what a specific architecture is good at, they might shine, but this often tells nothing about the complete picture.
Somehow I doubt that IBM reinvented the wheel with their glorious Power7 architecture (but inb4 WIntel conspiracy theories).

As for Steamroller, they (AMD) are certainly ambitious with their roadmap. It might turn out decent enough, but I hope it's a 4-module version (with 1 module = 2 integer cores). Ultimately it will depend a lot on the software, I guess.
But then again, multiplatform might kill it: instead of optimizing for the strengths and weaknesses of the platform, devs will have to target pretty different hardware again (although differences between the PS4 and PC might be minor in that case, and OpenCL might actually bring some incline even for PC gaming).
I do agree with the critics who say that the available memory is too low, though. They should definitely go for 4 gigs minimum, better 8.
 

Wirdschowerdn

Ph.D. in World Saving
Patron
Joined
Nov 30, 2003
Messages
34,465
Location
Clogging the Multiverse with a Crowbar
The RAM they're using in consoles is gonna be a lot more expensive than what you find in PCs (DDR3). Going from 2GB to 4GB isn't as trivial cost-wise as going from 4GB to 8GB on a PC (which right now is criminally ass-cheap).

But since this is probably going to be the last console generation anyway, they should pump it full of a decent amount of memory and charge $50 more. If they also wanna add some expensive special controller gimmick instead of a simple DualShock 4, don't expect a price point lower than $550. Who's really gonna buy that, compounded with the fact that the launch lineup is gonna be shit anyway?
 

Marsal

Arcane
Joined
Oct 2, 2006
Messages
1,304
It looks good to a layman but actually they have gone full retard.

UE3 is already a super-overrated piece-of-shit console engine whose main feature is baked lighting, but now they use deferred lighting, meaning they have given up completely on the PC platform.

The reason is that you can only use one light source at a time, plus a bunch of other technical issues that basically mean it's no longer an engine that does any real dynamic lighting at all.
What exactly do you mean by this? There are some disadvantages to deferred shading, but the ability to render quite a few lights in a scene without a major performance penalty is one of its strengths. STALKER used deferred shading and it was one of the better-looking games.
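The whole point of deferred shading is that geometry is rasterized once into a G-buffer, and each light then only costs a screen-space pass, not a re-render of the scene. A toy CPU-side sketch of the lighting pass (purely illustrative, not any engine's actual code):

```cpp
// Toy deferred lighting pass: pass 1 (not shown) rasterized the scene ONCE
// into a G-buffer (position, normal, albedo); pass 2 below walks every pixel
// and accumulates all lights, which is why "many lights" is cheap here.
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float len(Vec3 a) { return std::sqrt(dot(a, a)); }

struct GBufferTexel { Vec3 position, normal, albedo; };  // written by pass 1
struct PointLight   { Vec3 position, color; float radius; };

// Pass 2: accumulate diffuse lighting per pixel, per light.
// 'out' must be pre-sized to gbuf.size() by the caller.
void lightingPass(const std::vector<GBufferTexel>& gbuf,
                  const std::vector<PointLight>& lights,
                  std::vector<Vec3>& out)
{
    for (size_t px = 0; px < gbuf.size(); ++px) {
        Vec3 c{0.0f, 0.0f, 0.0f};
        for (const PointLight& l : lights) {
            Vec3 d = l.position - gbuf[px].position;
            float dist = len(d);
            if (dist < 1e-4f || dist > l.radius) continue;  // outside light volume
            float ndotl = std::max(0.0f, dot(gbuf[px].normal, d * (1.0f / dist)));
            float atten = 1.0f - dist / l.radius;           // crude falloff
            c = c + Vec3{gbuf[px].albedo.x * l.color.x,
                         gbuf[px].albedo.y * l.color.y,
                         gbuf[px].albedo.z * l.color.z} * (ndotl * atten);
        }
        out[px] = c;
    }
}
```

Adding a light adds one screen-space loop, not another full scene render, so the "one light source" claim is the opposite of how the technique works. Shadows are a separate cost, which is the real catch.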

The RAM they're using in consoles is gonna be a lot more expensive than what you find in PCs (DDR3).
:roll: What are they going to use?
 

Gord

Arcane
Joined
Feb 16, 2011
Messages
7,049
If it's based on AMD's APU design I would expect either fast DDR3 RAM or maybe (if available at decent prices by then) DDR4. Optimally in a configuration that offers triple channel (which Steamroller should support, I think).
If they go cheap it will probably be two sticks of fast DDR3; if they go expensive it might be 3x DDR4, or something in between.

Admittedly, GPU performance in APUs depends a lot on RAM speeds, and DDR3 should be slower than e.g. the GDDR5 found in GPUs, so maybe they will change that in the PS4.
 

Marsal

Arcane
Joined
Oct 2, 2006
Messages
1,304
If it's based on AMD's APU design I would expect either fast DDR3 RAM or maybe (if available at decent prices by then) DDR4. Optimally in a configuration that offers triple channel (which Steamroller should support, I think).
If they go cheap it will probably be two sticks of fast DDR3; if they go expensive it might be 3x DDR4, or something in between.

Admittedly, GPU performance in APUs depends a lot on RAM speeds, and DDR3 should be slower than e.g. the GDDR5 found in GPUs, so maybe they will change that in the PS4.
I haven't seen GDDR4 used for quite some time (on graphics cards). Do they even make it anymore?

I think they'll go with DDR3 and a separate GDDR5 for the GPU, then switch to unified GDDR5 (or XDR2 if the price is right) when they integrate everything into the APU (if the rumors are true).
 

Davaris

Self-Ejected
Developer
Joined
Mar 7, 2005
Messages
6,547
Location
Idiocracy
What exactly do you mean by this? There are some disadvantages to deferred shading, but the ability to render quite a few lights in a scene without a major performance penalty is one of its strengths. STALKER used deferred shading and it was one of the better-looking games.

I have no interest in this stuff myself, but one of the guys at the C4 forums said back in 2008 that only one of those lights casts a shadow. So have a close look at games that use the technique to check.

In that same thread Eric Lengyel gave 6 reasons why it is a dead-end technique, but it is posted in the private area, so I don't think I should post it here.
 

Marsal

Arcane
Joined
Oct 2, 2006
Messages
1,304
What exactly do you mean by this? There are some disadvantages to deferred shading, but the ability to render quite a few lights in a scene without a major performance penalty is one of its strengths. STALKER used deferred shading and it was one of the better-looking games.

I have no interest in this stuff myself, but one of the guys at the C4 forums said back in 2008 that only one of those lights casts a shadow. So have a close look at games that use the technique to check.

In that same thread Eric Lengyel gave 6 reasons why it is a dead-end technique, but it is posted in the private area, so I don't think I should post it here.
Yes, that seems to be correct. "Dynamic shadow-casting lights are the most expensive, because they have to constantly regenerate their shadow maps." The cost can be reduced significantly if said lights are relatively static. Maybe Epic figured out a good (cheap) way to do this?
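My guess at the "cheap way" would be plain caching: keep each light's shadow map around and only regenerate it when the light or something in its volume actually moves. A hypothetical sketch of that idea (not Epic's actual code, all names invented):

```cpp
// Hypothetical shadow-map caching: a light's map is only regenerated when
// flagged dirty, so static lights pay the rendering cost once, not per frame.
#include <vector>

struct DepthMap { std::vector<float> texels; };
struct Scene {};  // stand-in for real scene data

// Placeholder for the expensive depth-only render pass.
DepthMap renderShadowMap(const Scene&) { return DepthMap{}; }

struct ShadowCastingLight {
    DepthMap cached;
    bool dirty = true;  // set by the engine when the light, or any geometry
                        // inside its volume, moves
    const DepthMap& shadowMap(const Scene& scene) {
        if (dirty) {                        // regenerate only on change
            cached = renderShadowMap(scene);
            dirty = false;
        }
        return cached;                      // otherwise reuse last frame's map
    }
};
```

With that, a level full of static lights costs almost nothing per frame; only the handful of moving lights keep paying for shadow-map renders.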
 

Davaris

Self-Ejected
Developer
Joined
Mar 7, 2005
Messages
6,547
Location
Idiocracy
Yes, that seems to be correct. "Dynamic shadow-casting lights are the most expensive, because they have to constantly regenerate their shadow maps." The cost can be reduced significantly if said lights are relatively static. Maybe Epic figured out a good (cheap) way to do this?

He also said there are problems with reflections, DR is a memory hog and alpha-blended objects must be rendered separately.
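From what I gather, the usual workaround for the alpha-blending limitation is a hybrid: a deferred pass for opaques, then a sorted forward pass for transparents. A rough sketch of the general shape (my guess, not any particular engine's pipeline):

```cpp
// Hybrid renderer sketch: transparent objects can't live in a one-surface-
// per-pixel G-buffer, so they get a separate back-to-front forward pass.
#include <algorithm>
#include <vector>

struct Object { float distanceToCamera; bool transparent; };

void renderFrame(std::vector<Object>& objects) {
    // 1) Deferred path: opaques fill the G-buffer, then lights are applied.
    for (const Object& o : objects)
        if (!o.transparent) { /* rasterize into the G-buffer */ }
    /* screen-space lighting pass over the G-buffer would go here */

    // 2) Forward path: transparents, sorted far-to-near so blending stacks up.
    std::vector<Object*> transparents;
    for (Object& o : objects)
        if (o.transparent) transparents.push_back(&o);
    std::sort(transparents.begin(), transparents.end(),
              [](const Object* a, const Object* b) {
                  return a->distanceToCamera > b->distanceToCamera;
              });
    for (Object* o : transparents) { /* forward-shade and alpha-blend */ }
}
```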

All I really know is that I think the C4 renderer has a nice clean, clear look about it, while other engines I have seen look kind of flat or grainy. Sorry, I can't get more technical than that until I start working in that area. If you want good technical information, it's better to create an account at the C4 forums and ask Eric.



One thing I did like about the Unreal 4 video is that they have dumped UnrealScript and linked Kismet directly to C++ code. I also like the flow lines indicating activity in Kismet. I hope Eric does something similar for C4, as the engine already has these components; they just need to be linked up.
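I have no idea how Epic wired it up internally, but the general concept of a graph node bound straight to native code (instead of going through a script VM) is something like this (all names invented for illustration):

```cpp
// Generic sketch of a visual-scripting node graph calling straight into
// native code, the way Kismet-style graphs bind to C++. Not Epic's API.
#include <cstdio>
#include <functional>
#include <string>
#include <vector>

struct Node {
    std::string name;
    std::function<void()> nativeFn;  // directly bound C++ code, no VM layer
    std::vector<Node*> next;         // the "wires" out of this node
};

void execute(Node* n) {
    while (n) {
        std::printf("[flow] %s\n", n->name.c_str());  // the glowing flow line
        n->nativeFn();
        n = n->next.empty() ? nullptr : n->next[0];   // follow the first wire
    }
}

int main() {
    Node openDoor{"OpenDoor", [] { std::puts("door opens"); }, {}};
    Node playSound{"PlaySound", [] { std::puts("creak.wav"); }, {}};
    playSound.next.push_back(&openDoor);  // wire: PlaySound -> OpenDoor
    execute(&playSound);
}
```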
 

Marsal

Arcane
Joined
Oct 2, 2006
Messages
1,304
He also said there are problems with reflections, DR is a memory hog and alpha-blended objects must be rendered separately.

All I really know is that I think the C4 renderer has a nice clean, clear look about it, while other engines I have seen look kind of flat or grainy. Sorry, I can't get more technical than that until I start working in that area. If you want good technical information, it's better to create an account at the C4 forums and ask Eric.
Thanks for the advice, but I don't really put much stock in the future of rendering techniques. I'm more interested in interactive aspects than pretty pictures (although they do often mesh).
 

Davaris

Self-Ejected
Developer
Joined
Mar 7, 2005
Messages
6,547
Location
Idiocracy
Thanks for the advice, but I don't really put much stock in the future of rendering techniques. I'm more interested in interactive aspects than pretty pictures (although they do often mesh).

Same here. I get excited about editors, AI techniques and streamlined workflows. That's why I liked their new Kismet integration. What surprised me is that they didn't show a Behavior Tree editor. I thought BTs had become the standard in AAA games.
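For those who haven't played with them: a behavior tree is just composite nodes (sequence = "and", selector = "or") over leaf actions, which is why BT editors are mostly tree views with status colors. A bare-bones sketch (not from any particular engine):

```cpp
// Bare-bones behavior tree: Selector succeeds on the first succeeding child,
// Sequence fails on the first failing child; leaves do the actual game work.
#include <functional>
#include <memory>
#include <vector>

enum class Status { Success, Failure };

struct BTNode {
    virtual ~BTNode() = default;
    virtual Status tick() = 0;
};

struct Action : BTNode {                   // leaf: wraps native game code
    std::function<Status()> fn;
    explicit Action(std::function<Status()> f) : fn(std::move(f)) {}
    Status tick() override { return fn(); }
};

struct Sequence : BTNode {                 // "do all of these, in order"
    std::vector<std::unique_ptr<BTNode>> children;
    Status tick() override {
        for (auto& c : children)
            if (c->tick() == Status::Failure) return Status::Failure;
        return Status::Success;
    }
};

struct Selector : BTNode {                 // "try these until one works"
    std::vector<std::unique_ptr<BTNode>> children;
    Status tick() override {
        for (auto& c : children)
            if (c->tick() == Status::Success) return Status::Success;
        return Status::Failure;
    }
};

int main() {
    Selector root;
    auto attack = std::make_unique<Sequence>();
    attack->children.push_back(std::make_unique<Action>(
        [] { return Status::Failure; }));  // "enemy visible?" -> no
    root.children.push_back(std::move(attack));
    root.children.push_back(std::make_unique<Action>(
        [] { return Status::Success; }));  // fallback: "patrol"
    root.tick();                           // ends up patrolling
}
```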
 

Oriebam

Formerly M4AE1BR0-something
Joined
Jul 6, 2011
Messages
6,193
I've been living under a rock... wasn't there some news a few months ago about AMD focusing on the cellphone market and stopping making processors, or something?
 
