Eurogamer: Xbox One eSRAM Underestimated, Significantly More Powerful: 192GB/s

The Xbox One: (Memory) 8GB of 2133MHz DDR3 (Memory Bus) 256-bit (Memory Bandwidth) 68.3 GB/s
The PS4: (Memory) 8GB of 5500MHz GDDR5 (Memory Bus) 256-bit (Memory Bandwidth) 176 GB/s
Simply by looking at these specs you can see that there is a major issue for the Xbox One: the PS4 has over 2x the memory bandwidth, which is the amount of data that can travel across the memory bus per second.
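
Those headline numbers fall straight out of the effective transfer rate and the bus width. A quick back-of-envelope check in Python (just a sketch of the arithmetic, using the rates quoted above):

```python
# Peak bandwidth = effective transfer rate (transfers/s) x bus width (bytes).
def peak_bandwidth_gbs(transfers_per_sec, bus_width_bits):
    return transfers_per_sec * (bus_width_bits / 8) / 1e9

print(peak_bandwidth_gbs(2133e6, 256))  # Xbox One DDR3-2133:   ~68.3 GB/s
print(peak_bandwidth_gbs(5500e6, 256))  # PS4 GDDR5 @ 5500MT/s: 176.0 GB/s
```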

The one thing that most people seem to disregard is that Microsoft has included eSRAM (embedded static random-access memory) on the Xbox One as a means to potentially give it a boost. There is 32MB of eSRAM capable of 102 GB/s, meaning that in a best-case scenario the Xbox One could reach roughly 170 GB/s of combined bandwidth (68.3 + 102). But as the eSRAM only holds 32MB at any single instant, fast as it is, only a small working set can live in it at a time, compared to the PS4's much heftier unified memory.
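
To put that 32MB in perspective, a single 32-bit 1080p render target is roughly 8MB, so only a handful of full-screen buffers fit in the eSRAM at once (a sketch; the buffer format is illustrative, not pulled from any actual game):

```python
# How many full-screen 1080p buffers fit in 32MB of eSRAM?
ESRAM_BYTES = 32 * 1024 * 1024       # 32MB of on-chip eSRAM

bytes_per_pixel = 4                  # e.g. an RGBA8 colour buffer
target_bytes = 1920 * 1080 * bytes_per_pixel

print(target_bytes / 2**20)          # ~7.9 MB per 1080p target
print(ESRAM_BYTES // target_bytes)   # 4 such targets fit at once
```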

Daily Reaction: Microsoft Tries to Downplay PS4
 

backbreaker65

Don't delete it now. Let's see what this dude is talking about.



I think these are the two most important things. Right?

For higher frame rates you need both higher bandwidth and lower latency. The PS4 has higher bandwidth; the Xbox has lower latency. I'm sure the system architecture has been set up to address the deficiencies in both memory setups, so it's safe to assume that both can do high frame rates.
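
For a sense of scale on the bandwidth side, dividing peak bandwidth by frame rate gives the theoretical memory-traffic budget per frame (illustrative arithmetic only; no real game sustains peak bandwidth):

```python
# Theoretical memory traffic available per frame at 60fps.
for name, peak_gbs in [("PS4 GDDR5", 176.0), ("Xbox One DDR3", 68.3)]:
    print(f"{name}: {peak_gbs / 60:.2f} GB/frame at 60fps")
```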

For larger worlds, the most important thing is the amount of available RAM. In this instance the PS4 would have a clear advantage, since it does have more usable RAM. The concern comes in when you analyze the PS4 GPU's ability to use all of that RAM efficiently. We won't really know the answer to this until some games start coming out. As of now, similar off-the-shelf GPUs can't efficiently manage over 4GB of RAM.

Also, current games are not using anywhere near the 5GB of RAM that the Xbox One has. Maybe somewhere far down the line the extra 2GB of RAM will come in handy for the PS4. In my opinion it will probably come down to developer optimizations. They may have to jump through a couple of technical hoops to get the most out of the Xbox RAM in order to match performance with the PS4.

As for the PS4 GPU being 40% more powerful: yes, that is a true statement if you look at the on-paper specs. But that doesn't mean you're going to get 40% better-performing or better-looking games. For one, system architecture and dev tools are very important when it comes to performance on these closed systems.

And for two, in real-life performance terms the difference just isn't that much. These systems are ten times as powerful as last-gen systems, yet they still can't guarantee 1080p/60fps performance. So the idea that the PS4 is just going to severely outperform the Xbox because its GPU is 1.4 times as powerful is just not based in reality.
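
For a feel of what a ~40% compute gap buys, note that 1080p pushes about 44% more pixels than 900p, so the on-paper difference is roughly one resolution step (simple pixel-count arithmetic, not a benchmark):

```python
# Pixel-count ratio between 1080p and 900p render resolutions.
print((1920 * 1080) / (1600 * 900))   # 1.44: 1080p draws 44% more pixels
```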

In conclusion: toward the end of the generation, if devs want to push the PS4 to the brink of its capabilities, they may have to make some minor cutbacks for the Xbox version. But for the most part performance will be equal. At the beginning of the generation the Xbox might even have the edge, due to Microsoft's history of quality dev tools and support.

You took an understanding of specs and made it into something else. I deleted it because it was the wrong thread. I subsequently found the post I got from one of the devs over at NeoGAF, responding to this myth that DDR3 is faster than GDDR5.

NeoGAF - View Single Post - Maybe this thread posting thing isn't all it's cracked up to be



Originally Posted by NBtoaster

Perhaps GDDR5 latency is an issue.

To make this clear once and for all:

There is no latency difference between DDR3 and GDDR5 when you just look at the bare memory chips themselves. The latency difference between DDR3 and GDDR5 comes from the different memory controllers and the different usage scenarios in PCs. DDR3 is usually used for CPUs, while GDDR5 is used for GPUs.

The main task of GPUs is rendering, which requires a lot of bandwidth. GDDR5 is a high-performance RAM type that easily outperforms DDR3 in a typical rendering scenario. To make sure that the maximum bandwidth is available most of the time, GPU memory controllers combine many memory accesses into bursts. This is bad for latency and great for bandwidth, but since GPUs aren't latency-sensitive when rendering, it doesn't affect performance.

CPUs don't need a lot of bandwidth, since they're dealing with computing. DDR3 delivers enough bandwidth for any modern-day CPU, so GDDR5 would be overkill for computing. The computing tasks of a CPU are extremely latency-critical. That's why memory controllers for CPUs work in the complete opposite way from GPU memory controllers: instead of burst-accessing the RAM, they make sure that every memory access happens as immediately as possible. This kills bandwidth, but it has a positive impact on latency, which eventually increases the computing performance of the CPU.

What does this mean for the PS4?

The PS4 uses a state-of-the-art heterogeneous processor architecture from AMD (the so-called "HSA") which combines CPU and GPU on a single chip. To ensure that such a heterogeneous processor can deliver maximum bandwidth for rendering and minimum latency for computing, AMD integrates a special DRAM controller. This DRAM controller allows the CPU memory controller low-latency access while at the same time the GPU memory controller can burst-access the RAM. That's why Sony can go for maximum bandwidth with one big GDDR5 RAM pool without having any headaches because of latency.
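
The burst-versus-latency trade-off the dev describes can be captured with a simple latency-plus-transfer-time model (a sketch with made-up latency numbers, not measured figures for either console):

```python
# Effective bandwidth under a simple model: time = latency + bytes/peak_bw.
# 1 GB/s moves 1 byte per nanosecond, which keeps the units simple.
def effective_gbs(access_bytes, latency_ns, peak_gbs):
    transfer_ns = access_bytes / peak_gbs
    return access_bytes / (latency_ns + transfer_ns)

# A small CPU-style access is latency-bound; a big GPU-style burst
# amortises the same latency and gets close to peak bandwidth.
print(effective_gbs(64, 100, 176.0))      # ~0.6 GB/s
print(effective_gbs(64_000, 100, 176.0))  # ~138 GB/s
```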


Read this article from 2008, and read the conclusion
http://theovalich.wordpress.com/200...r5-analysis-or-why-gddr5-will-rule-the-world/

Conclusion

GDDR5 ramped up during 2008 and we expect the technology to become a standard for GPU add-in boards in 2009. ATI will migrate to GDDR5, and so will Nvidia. With Intel joining the pack with Larrabee, volumes should be ready to drive the cost of GDDR5 into budget for the next generation of game consoles, starting in the 2010-11 timeframe. This is by far the most developed and well-thought-out memory standard, and it lacks the childhood sicknesses of DDR2 and DDR3. GDDR5 is coming to market as a complete product and offers a solid future roadmap, with Differential GDDR5 even surpassing XDR2 DRAM in the quest for the highest possible per-pin bandwidth. By that time, Differential GDDR5 should be cheaper than GDDR3 is today.
 

satam55

Btw, for any of you tech-savvy posters: did you watch Mark Cerny's presentation about the road to the PS4? They were considering using 128-bit GDDR5 RAM at (I think) 88 GB/s, but applying eDRAM to push the bandwidth to 1000 GB/s.
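
That narrower-bus figure checks out with the same arithmetic as the launch specs (assuming the same 5500MT/s GDDR5 on a 128-bit bus):

```python
# Halving the bus width halves peak bandwidth at the same transfer rate.
print(5500e6 * (128 / 8) / 1e9)   # 88.0 GB/s with a 128-bit bus
print(5500e6 * (256 / 8) / 1e9)   # 176.0 GB/s with the shipped 256-bit bus
```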

It would've been fast as hell, but it would've been very complicated for developers to utilize, so they went for the more convenient method to get more games ready for launch. It was also interesting to note the "time to triangle" figures, used to denote how much time it takes a team to start developing a game for each console in the PS family:

PS1 : 1-2 months
PS2 : 3-6 months
PS3 : 6-12 months
PS4 : 1-2 months

It seems that moving away from the exotic architectures used in the PS2 (EE) and PS3 (Cell) made life much easier for developers, as they already have the tools needed to create games.

Digital Foundry just got hit with the marketing stick and ran a story without fact-checking. Back to square one.

More confusion from the Microsoft camp, for no reason at all. I find it funny that Mark Cerny gives his "Road to the PS4" lecture and talks about the eDRAM design, and the very next day MS decides to feed Digital Foundry FUD.

The theoretics of the theoretical was just a hypothetical of theoretics, just a bunch of talk.

But one thing I do agree with from an MS exec: these specs mean nothing, show me the games and the experiences.

 

Heretic


Where is Ja Rule to make sense of all this?
 

MeachTheMonster

You took an understanding of specs and made it into something else. I deleted it because it was the wrong thread. I subsequently found the post I got from one of the devs over at NeoGAF, responding to this myth that DDR3 is faster than GDDR5. […]
HSA makes up for both latency and bandwidth deficiencies. Both the Xbox and PS4 benefit from this, which is why I said:

meachthemonster said:
I'm sure the system architecture has been set up to address the deficiencies in both memory setups, so it's safe to assume that both can do high frame rates.

At the end of the day, they are very similar setups with very similar CPUs/GPUs; the different types of RAM won't make any real difference as far as performance goes.
 

Rico

It doesn't matter to me... I'm going to get an Xbox, but if the PS4 is that much better I'd get one too.
 

daze23

these are just more reasons I don't wanna get stuck with hardware I can't upgrade
 

PS5 Pro


Where is Ja Rule to make sense of all this?

Someone posts a story about Xbox being good at something, and a Sony fan will always pull up a post from NeoGAF and be like "See, it's not true :sadbron:"

Is the eSRAM more powerful than originally thought? How would I know; Microsoft is saying it is. If it's not, developers will not remain quiet about it.
Then people will say "why you lie" and they'll be like... well, it depends.

What happens when Sony lies?

All I can do is look at the games, and Xbox games look superior to me as of today. So I'm good with my more-or-less-powerful setup :win:
 

Rico

Someone posts a story about Xbox being good at something, and a Sony fan will always pull up a post from NeoGAF and be like "See, it's not true :sadbron:" […]

:ld: that's the dilemma... I don't want to play any of those Sony games, and there's about a 90% risk that the online will be subpar, and they don't have any free PS4 games to give away to mask that. I do more with my Xbox than game... and Sony probably didn't know that. But if the Xbox plays MKVs then I'm good.
 