Decline PS4 wins the console war against XboxONE, yet it is a hollow victory as Consolesdämmerung is upon us

J_C

One Bit Studio
Patron
Developer
Joined
Dec 28, 2010
Messages
16,947
Location
Pannonia
Project: Eternity Wasteland 2 Shadorwun: Hong Kong Divinity: Original Sin 2 Steve gets a Kidney but I don't even get a tag. Pathfinder: Wrath
Debatable. Being the same architecture as used in a PC means that most of the optimization should carry over. But in any case, a $180 card will leave it in the dust.
It can't carry over, because the consoles have standardized hardware that games can be optimized to hell for, while PCs have a wide range of specs.
 

MetalCraze

Arcane
Joined
Jul 3, 2007
Messages
21,104
Location
Urkanistan
So J_C, can you tell me why the GPU in the PS4 will run faster than a comparable GPU in a PC?

Since you aren't just repeating what the PR department tells you and all, but are a knowledgeable, literate person?
 

Wirdschowerdn

Ph.D. in World Saving
Patron
Joined
Nov 30, 2003
Messages
34,675
Location
Clogging the Multiverse with a Crowbar
Is that why there's not a single game in the Xbox/PS4 lineup that isn't a cinematic shooter or a cinematic racing game or a cinematic fighting game, no different from the same console shit of 10 years ago?

Ambitious games always take more time to develop, preferably on a final devkit.

Launch titles are almost always shitty, compromised action games of the last-gen school, put together hastily in 18 months to get to launch. Of course they started developing on PCs first to have a target.

I expect the first real next-gen titles to emerge around 2015-2016, once cross-generation development has hopefully been abandoned for good.

Of course, anyone using a brain would know that. But I can see why everything always has to be explicitly explained to you, like to a small child.
 

J_C

One Bit Studio
Patron
Developer
Joined
Dec 28, 2010
Messages
16,947
Location
Pannonia
Project: Eternity Wasteland 2 Shadorwun: Hong Kong Divinity: Original Sin 2 Steve gets a Kidney but I don't even get a tag. Pathfinder: Wrath
At least with PS4/Xbone there's a good chance now that game design will change significantly for the better, which will automatically benefit PCs too.
I don't want better graphics, to be honest; I just want the devs to use that 8 GB of RAM and make bigger game worlds. The biggest drawback of the current generation was the memory limitation.
 
Joined
Jan 7, 2012
Messages
14,337
Debatable. Being the same architecture as used in a PC means that most of the optimization should carry over. But in any case, a $180 card will leave it in the dust.

Try to play the recent Tomb Raider (running on dat 8 year old console hardware)

Why would I want to do that?

maxed out with a $180 card with decent res/hertz. Good luck.

You should be able to max out just about any game these days, assuming your limit is 1920x1080 and 45-60ish fps. Unless it's a horribly bad port.

Oh, but but it still looks so much better on PC! Yeah? Well, but the point is, it's still the same shitty game.

At least with PS4/Xbone there's a good chance now that game design will change significantly for the better, which will automatically benefit PCs too. It's a win-win. A bit nicer graphics can't hurt, either. Not worried about that, except for shallow graphics whores like Skyway, who can never get enough of staring at those Arma 3 ground textures.

I don't think anyone is disagreeing that better consoles will raise the minimum standard of graphics and potentially get us out of corridor-based game design.
 
Joined
Jan 7, 2012
Messages
14,337
What the fuck is HSA?
Heterogeneous System Architecture. GPU and CPU compute units on the same die, sharing the same cache hierarchy and memory address space. This means you can run computations on the GPU and the CPU over the same data sets without any memory transfers taking place.
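
To make that concrete, here's a minimal conceptual sketch in plain C++ (no real GPU API; gpu_kernel is just a stand-in for GPU compute work): with a discrete card, data has to be staged into a separate VRAM pool and copied back, while under a shared address space the same pointer serves both processors.

[code]
// Conceptual sketch only -- plain C++, not a real GPU API. Contrasts the
// discrete-GPU model (explicit copies across the bus) with the HSA model
// (one shared address space). gpu_kernel stands in for GPU compute work.
#include <cstdio>
#include <cstring>
#include <vector>

static void gpu_kernel(float* data, size_t n) {
    for (size_t i = 0; i < n; ++i) data[i] *= 2.0f;  // fake GPU workload
}

int main() {
    std::vector<float> host(1 << 20, 1.0f);  // data produced by the CPU

    // Discrete GPU: stage into a separate VRAM pool, compute, copy back.
    std::vector<float> vram(host.size());
    std::memcpy(vram.data(), host.data(), host.size() * sizeof(float));
    gpu_kernel(vram.data(), vram.size());
    std::memcpy(host.data(), vram.data(), host.size() * sizeof(float));

    // HSA / unified memory: CPU and GPU address the same bytes, so the
    // kernel runs directly on the CPU's buffer -- no transfers at all.
    gpu_kernel(host.data(), host.size());

    std::printf("first element: %f\n", host[0]);  // 4.0 after both passes
    return 0;
}
[/code]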

Wasn't the same architecture in place in the previous generation?
Nope, not with this marketing budget.

Edited to be a bit more accurate.
 

MetalCraze

Arcane
Joined
Jul 3, 2007
Messages
21,104
Location
Urkanistan
Ambitious games always take more time to develop, preferably on a final devkit.
So where are the announcements for at least one?

Launch titles are almost always shitty, compromised action games of the last-gen school, put together hastily in 18 months to get to launch. Of course they started developing on PCs first to have a target.
Actually, all games are developed on PC. Is somebody going to compile all the code on the downclocked laptop CPU of the Xbone/PS4?

I expect the first real next-gen titles to emerge around 2015-2016, once cross-generation development has hopefully been abandoned for good.

And why do you think they will be ambitious projects, considering that consoles have never had a single one in the whole history of gaming?

2015-2016? By that time console hardware will run slower than budget PCs lol

I don't think anyone is disagreeing that better consoles will raise the minimum standard of graphics and potentially get us out of corridor-based game design.

Because it's RAM that stops consoles from getting out of corridor-based design.

Just look at all those corridors of Skyrim! Or Oblivion! Or RDR! Or GTA4! Or Sacred 2!
 

grdja

Augur
Joined
Mar 20, 2011
Messages
250
I'd like to remind everyone that performance AMD 8-core chips (i.e. far, far above what the new consoles are getting) are being trounced by quad-core Intels in all but the most thoroughly multithreaded applications. And games don't belong in that category.

Unless you are using your GPU as a GPGPU and doing CUDA and whatnot shit, there is very little benefit to the GPU having access to the CPU cache. Plus there's the semiconductor LAW OF THE WAY FUCKING REALITY WORKS that says larger die area leads to lower yields, no exceptions. And there are issues with clock speeds when you have so much shit together pretending to be one thing.
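
For what it's worth, that yield claim is textbook. Under the standard Poisson yield model (the usual reference formula, not something from this thread), yield falls exponentially with die area at a given defect density:

[code]
% Poisson yield model: Y = fraction of good dies,
% D = defect density per unit area, A = die area.
\[ Y = e^{-D A} \]
% Doubling the die area squares the yield fraction:
% if Y(A) = e^{-DA} = 0.8, then Y(2A) = e^{-2DA} = 0.8^2 = 0.64.
[/code]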

Die-integrated graphics is nothing new; it was always relegated to the worst entry-level ranges. While HSA is a bit better, it's still mainly AMD trying to rebrand the concept to move it away from "entry-level shit". And they succeeded, with Intel going the same route (poor Intel, trying to dabble in graphics: it never worked and never will, unless they buy nVidia and dodge all the monopoly investigations that would follow).

Edit: @RAM. Yes, praise the Jeebus that both consoles now have 8 gigs. 16 would have been better for future-proofing, but it's still an improvement.
 

IDtenT

Menace to sobriety!
Patron
Joined
Jan 21, 2012
Messages
14,469
Location
South Africa; My pronouns are: Banal/Shit/Boring
Divinity: Original Sin
Ambitious games always take more time to develop, preferably on a final devkit.
So where are the announcements for at least one?
Well, to be fair, there were a lot of open-world games and far fewer corridors at E3. That should continue.

Launch titles are almost always shitty, compromised action games of the last-gen school, put together hastily in 18 months to get to launch. Of course they started developing on PCs first to have a target.
Actually, all games are developed on PC. Is somebody going to compile all the code on the downclocked laptop CPU of the Xbone/PS4?
Irrelevant. It gets compiled by a compiler made for the console architecture.
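
A small illustrative sketch of that point: one C++ source, built once per target by that target's compiler, with the preprocessor picking the platform path. The console macros below are assumptions standing in for the real (NDA'd) toolchain defines.

[code]
// Illustrative sketch: the same source compiles to PC and console builds.
// The console macros are ASSUMPTIONS, not official toolchain defines.
#include <cstdio>

static const char* target_name() {
#if defined(__ORBIS__)       // assumed PS4 toolchain define
    return "PS4";
#elif defined(_DURANGO)      // assumed Xbox One toolchain define
    return "Xbox One";
#else
    return "PC";
#endif
}

int main() {
    std::printf("compiled for: %s\n", target_name());
    return 0;
}
[/code]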
 

MetalCraze

Arcane
Joined
Jul 3, 2007
Messages
21,104
Location
Urkanistan
And why do you think they will be ambitious projects, considering that consoles have never had a single one in the whole history of gaming
See, and this is why nobody likes you, Skyshit.

Yep. You can't name any and it makes you rage.
No positive emotions in discussions with me. Can't just post bullshit and feel good and happy without that damn skyway pissing on it. Fuck that asshole.



Well, to be fair, there were a lot of open-world games and far fewer corridors at E3. That should continue.
Really? Can you name me those open-world games?

On consoles:
MGS5... GTA clone Watchdogs... and... ugh.... shit.

Irrelevant. It gets compiled by a compiler made for the console architecture.

So what?
 

IDtenT

Menace to sobriety!
Patron
Joined
Jan 21, 2012
Messages
14,469
Location
South Africa; My pronouns are: Banal/Shit/Boring
Divinity: Original Sin
I'd like to remind everyone that performance AMD 8-core chips (i.e. far, far above what the new consoles are getting) are being trounced by quad-core Intels in all but the most thoroughly multithreaded applications. And games aren't currently built to be in that category.
Fixed.

Unless you are using your GPU as a GPGPU and doing CUDA and whatnot shit, there is very little benefit to the GPU having access to the CPU cache.
That's quite presumptuous of you to say they won't use it, when the console makers are banking on it. You think they chose this architecture so that the PC master race can piss on them from online forums?

Plus there's the semiconductor LAW OF THE WAY FUCKING REALITY WORKS that says larger die area leads to lower yields, no exceptions. And there are issues with clock speeds when you have so much shit together pretending to be one thing.
Irrelevant red herring.

Die-integrated graphics is nothing new; it was always relegated to the worst entry-level ranges. While HSA is a bit better, it's still mainly AMD trying to rebrand the concept to move it away from "entry-level shit".
Bit better? Haha, no.
 

tuluse

Arcane
Joined
Jul 20, 2008
Messages
11,400
Serpent in the Staglands Divinity: Original Sin Project: Eternity Torment: Tides of Numenera Shadorwun: Hong Kong
That's quite presumptuous of you to say they won't use it, when the console makers are banking on it. You think they chose this architecture so that the PC master race can piss on them from online forums?
I think they chose it because it's the best bang for the buck they could get and they're tired of eating a billion or two in hardware subsidies at the beginning of each console generation.
 

bonescraper

Guest
Well, to be fair, there were a lot of open-world games and far fewer corridors at E3. That should continue.
Really? Can you name me those open-world games?

On consoles:
MGS5... GTA clone Watchdogs... and... ugh.... shit.
Pretty much half of the games shown... TWitcher 3, Assassin's Creed 4, Mad Max, Batman: Arkham Anotherone, Dead Rising 3, Faggot Age Drei, Saints Row 4, The Crew, NFS: Not Underground 3... and I'm pretty sure there are others that slipped my mind.
 

Gord

Arcane
Joined
Feb 16, 2011
Messages
7,049
I'd like to remind everyone that performance AMD 8-core chips (i.e. far, far above what the new consoles are getting) are being trounced by quad-core Intels in all but the most thoroughly multithreaded applications. And games don't belong in that category.

Absolutely, but now they have to adapt to multithreading, because otherwise this hardware's potential won't see the light of day.
Everybody wins.
And nobody is saying that Jaguar is faster than Intel's Core iSomething. It's much cheaper, though, which is why Sony and Microsoft went with it.
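
As a minimal sketch of what "adapting to multithreading" means here (standard C++11 only; the per-entity update is a stand-in for real game work like AI, physics, or animation), the frame's workload gets carved into one slice per core instead of running on a single fast thread:

[code]
// Minimal sketch of splitting per-frame work across many weak cores,
// standard C++11 only. The per-entity update is a stand-in for real
// game work (AI, physics, animation, ...).
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const unsigned cores = 8;                  // Jaguar-style core count
    std::vector<float> entities(80000, 1.0f);  // one value per "entity"
    std::vector<std::thread> workers;

    // Carve the entity list into one contiguous slice per core.
    const size_t chunk = entities.size() / cores;  // divides evenly here
    for (unsigned c = 0; c < cores; ++c) {
        float* slice = entities.data() + c * chunk;
        workers.emplace_back([slice, chunk] {
            for (size_t i = 0; i < chunk; ++i)
                slice[i] *= 1.5f;              // fake "update" step
        });
    }
    for (auto& w : workers) w.join();

    std::printf("checksum: %f\n",
                std::accumulate(entities.begin(), entities.end(), 0.0f));
    return 0;
}
[/code]

Whether launch-era engines actually scaled like this is another question, but it's the shape of work the 8-core design demands.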

Unless you are using your GPU as GPGPU and doing CUDA and whatnot shit there is very little benefit for GPU having access to CPU cache. Plus semiconductor LAW OF THE WAY FUCKING REALITY WORKS that says that larger die surface leads to lower yields, no exceptions. And issues about clock speeds when you have so much shit together pretending to be one thing.

Why should we care about yields? That's only Microsoft's and Sony's problem :smug:


Die-integrated graphics is nothing new; it was always relegated to the worst entry-level ranges. While HSA is a bit better, it's still mainly AMD trying to rebrand the concept to move it away from "entry-level shit". And they succeeded, with Intel going the same route (poor Intel, trying to dabble in graphics: it never worked and never will, unless they buy nVidia and dodge all the monopoly investigations that would follow).
This is the first iGPU ever in that performance category. It's roughly comparable to a 7850, while every other APU is still significantly below a 7750.
So that's basically the difference between an entry-level and a gamer GPU.
 

Micmu

Magister
Joined
Aug 20, 2005
Messages
6,163
Location
ALIEN BASE-3
I'd Xbone that redhead.
What's with the retard with the dead animal on his head, BTW?

Edit: ouch, I'm like 10 pages late. Sorry guys, continue :(
 

Gurkog

Erudite
Joined
Oct 7, 2012
Messages
1,373
Location
The Great Northwest
Project: Eternity
My uncle is a senior engineer at Intel. I should try to get him to talk about Intel's work on integrated GPUs and what he thinks about AMD buying out ATI to corner the integrated CPU/GPU market.
 

DalekFlay

Arcane
Patron
Joined
Oct 5, 2010
Messages
14,118
Location
New Vegas
I thought Xbone only had 6GB?

Even if it has eight as well, I read somewhere that MS and Sony have said only about 50% will be available to games, and that counts RAM and VRAM together. So... yeah. Still a massive step up though, thank god.
 
