What do you expect from next gen consoles in terms of graphics?

yseJ

Empire strikes back
Joined
Apr 30, 2012
Messages
47,135
Reputation
2,894
Daps
69,612
Reppin
The Yay
Moore's law hasn't run out or slowed down. Computing power continues its exponential rise. Sony and Microsoft could choose not to release their new systems till 2020, but that wouldn't mean the technology hasn't progressed.

There are 3 guarantees in life. Death. Taxes. And computers continuing to advance exponentially.
you keep citing Moore's law...
I gotta ask you this, myself being a computer engineer...do you even understand Moore's law? :shaq2:

at its core, the law states exponential growth in the number of transistors per chip. even if we read it as exponential growth in raw processing power, that still doesn't translate directly into graphics.

at some point, we'll be able to process some of the most intricate photorealistic graphics in the world, but who will actually put it together? who will actually create billions of polygons that are perfect?

Moore's law is great for showing that a brute-force algorithm thought intractable in the late 80s is now trivial. it's not as clear-cut for graphics
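
just so we're on the same page about what the law does claim, here's a quick back-of-the-envelope sketch in python. the 1971 Intel 4004 baseline and the 2-year doubling period are the textbook figures, purely illustrative:

[code]
# what "transistor count doubles every ~2 years" compounds to.
# baseline: the 1971 Intel 4004 at ~2,300 transistors (textbook figures).
def transistors(year, base_year=1971, base_count=2300, doubling_years=2.0):
    """Projected transistors per chip under an idealized Moore's law."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1990, 2000, 2012):
    print(year, f"{transistors(year):,.0f}")
# 2012 lands around 3 billion -- the right ballpark for a high-end chip.
# note what this does NOT predict: polygons, artists, or tooling.
[/code]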
 

Liquid

Superstar
WOAT
Joined
Apr 30, 2012
Messages
37,121
Reputation
2,655
Daps
59,922
:comeon: I guess some people are just always pessimistic. :yeshrug:

Once again you have NO EVIDENCE that games aren't advancing graphically. You keep talking about short-term windows like '08-'12, but you ignore the fact that no new system was released during that period. What you don't seem to understand is that PC gaming is NOT ubiquitous. If there isn't a pool of millions of potential buyers, then no one is gonna spend $100 million to develop a game that only 1-10,000 people might buy.

And even if PC gaming were ubiquitous, there isn't a uniform system a developer can target to maximize the graphics. Developers are instead gonna build their games around the systems that are most widely used (that is, the 7-year-old consoles) and just port the games to PC. It's the reason games like Uncharted on PS3 still look as good as most games on high-performance PCs. Sure, most PC games look better; the key, however, is that those games weren't built to max out PC specs. So it's wrong to conclude that, because PC games haven't progressed tremendously in graphics over the last 5 years, a wall has been hit.
Breh, these last 4-5 years have seen the smallest graphical leaps we've ever made. The PC really hasn't been the "lead" platform for the mainstream since the mid-90s, so your whole argument is flawed from the jump.

Just look at what I consider to be the milestones from the PS2 era to the current one...all platforms.

So let's have a little breakdown of the games I remember making a leap in graphics.

2001 - Max Payne 1 (PC) - MGS2, ICO, and FFX on the PS2 - Halo on the XBOX
2002 - Metroid Prime and Resident Evil Remake on the GC
2003 - Zelda Wind Waker, Max Payne 2 on the PC (still holds up well)
2004 - DOOM 3 and Half-Life 2 on the PC, HALO 2 on the XBOX
2005 - Resident Evil 4
2006 - Oblivion
2007 - Crysis...nothing else was close
2008 - Dead Space, MGS4
2009 - Batman Arkham Asylum on the PC, Uncharted 2
2010 - Metro 2033, Heavy Rain, Mass Effect 2
2011 - BF3, Uncharted 3, Skyrim, L.A. Noire
2012 - Far Cry 3, Halo 4

Now let's look at 2007 to now. Crysis holds up against anything else out there and it's going on 6 years. Like I linked before, Batman Arkham Asylum was pretty incredible on the PC...that shyt looks amazing TODAY compared to other games that have been released...even Skyrim and Far Cry 3. Oblivion to Skyrim is another example. No doubt Skyrim looks better, but I don't think the difference is the major step up that a 5-year gap used to mean in the past. Hell, for a while DOOM 3 looked damn good given its age.

Crysis came in and crushed the buildings. That's the last game that came out where people were like :whoa: my card can't handle that...AT ALL. Not even the highest NVIDIA card could bench it. Metro 2033 was a leap with its tessellation, PhysX, and after effects, but I don't think it looks THAT much better than anything else, to tell you the truth.
 

yseJ

Empire strikes back
Joined
Apr 30, 2012
Messages
47,135
Reputation
2,894
Daps
69,612
Reppin
The Yay
Sorry but I see no evidence of a wall. I'm not old enough to remember gaming in the early or mid-90s.
:heh:

All I know is that since 98, the rate of technological change has only ACCELERATED, not decelerated. It seems as though each new system only increases the speed of change rather than slowing it down.
funny you should say this, since '98 is widely considered the peak of tech change and it's been declining since then. look it up


The adoption of HD technology required changing the physical infrastructure of our country. Computational technology on the other hand doesn't require that. All it requires is the ability to flip those 0s and 1s even faster so that more information can be encoded per second.
more information encoded per second does not mean steady advances in graphics
yet again, you don't seem to understand that the 'wall' in graphics isn't bound by computational power but rather by the complexity of the graphics being constructed and the tools to make them.

in the late 90s you could construct a tetrahedron in 3ds Max, texture it, and call it a day.

now you have to have a huge team of people just doing models, and animations are so expensive it's ridiculous.
 

daze23

Siempre Fresco
Joined
Jun 25, 2012
Messages
32,609
Reputation
2,755
Daps
45,377
[image: maxpayne3_2013_02_04_etuw4.png]
 

Liquid

Superstar
WOAT
Joined
Apr 30, 2012
Messages
37,121
Reputation
2,655
Daps
59,922
The Max Payne series has always had amazing texture work. I remember booting up Max Payne 2 about 2 years ago and couldn't believe it still looked relatively good for an 8-year-old game lol.
 
Joined
May 16, 2012
Messages
39,600
Reputation
-17,856
Daps
84,290
Reppin
NULL
you keep citing Moore's law...
I gotta ask you this, myself being a computer engineer...do you even understand Moore's law? :shaq2:

at its core, the law states exponential growth in the number of transistors per chip. even if we read it as exponential growth in raw processing power, that still doesn't translate directly into graphics.

at some point, we'll be able to process some of the most intricate photorealistic graphics in the world, but who will actually put it together? who will actually create billions of polygons that are perfect?

Moore's law is great for showing that a brute-force algorithm thought intractable in the late 80s is now trivial. it's not as clear-cut for graphics

Not a computer engineer, but I've read enough to know what the exponential rate of computing power means. It seems as though you have less knowledge than I do on this topic even though you claim to be a computer engineer. Here's Epic's Tim Sweeney explaining how more RAW computing power can produce better graphics:

at the 4:25 mark he says that to make photo-realistic games we only need 50x more computational power than the graphics cards of today...he puts it at 2 more generations before we get games that are INDISTINGUISHABLE from real life...that makes my mid-2020s date sound slightly early, but still about right

we are not very far away from playing games that are equivalent to being in the Matrix...thanks to Moore's law
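
the arithmetic behind that 50x figure is easy enough to sanity-check; here's a quick sketch, where the 18-24 month doubling range is my assumption (Sweeney doesn't give one):

[code]
import math

# how long a 50x jump in compute takes under Moore's law.
# the 18-24 month doubling periods are my assumption, not Sweeney's.
doublings = math.log2(50)    # ~5.6 doublings
for months in (18, 24):
    print(f"doubling every {months} months: ~{doublings * months / 12:.0f} years")
# ~8-11 years, i.e. roughly two console generations if the law holds
[/code]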
 
Last edited by a moderator:

daze23

Siempre Fresco
Joined
Jun 25, 2012
Messages
32,609
Reputation
2,755
Daps
45,377
The Max Payne series has always had amazing texture work. I remember booting up Max Payne 2 about 2 years ago and couldn't believe it still looked relatively good for an 8-year-old game lol.

Max Payne 3 had an incredible amount of detail. Rockstar don't fukk around

[image: maxpayne3_2013_02_04_bou7y.png]
 
Joined
May 16, 2012
Messages
39,600
Reputation
-17,856
Daps
84,290
Reppin
NULL
If you watch Tim Sweeney's speech around the 9-minute mark, he says that we know exactly how much computational power it will take to make photorealistic graphics, and that we already have the right algorithms to produce it. He estimates the computing power necessary at 5,000 teraflops, which is slightly less than what the most powerful supercomputers of today achieve with rooms full of processors. Moore's law tells us that in 10 years that computing power will be available in a $1,000 laptop, and a couple of years later in a $400 console.

We'll be able to produce creatures in the game that look human, but as he says, they might not behave human, because we still don't have algorithms for human thought, speech, etc. So the problem is not graphics; we know how to do that, and it's only a question of computing power on that front. The problem for the future of gaming, as always, will be AI and gameplay, because we still don't have the right algorithms to solve those problems.
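
you can run the projection yourself; a rough sketch, where the ~5 teraflop starting point (a high-end GPU today) and the doubling rates are my assumptions, and 5,000 teraflops is Sweeney's number:

[code]
import math

# years until Sweeney's 5,000 TFLOPS fits in a single consumer box.
# the ~5 TFLOPS start (a current high-end GPU) is my assumption.
doublings = math.log2(5000 / 5)    # ~10 doublings
for years_per_doubling in (1.0, 2.0):
    print(f"doubling every {years_per_doubling} yr: ~{doublings * years_per_doubling:.0f} years out")
# ~10-20 years, depending on how fast you believe GPUs keep doubling
[/code]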
 
Joined
May 16, 2012
Messages
39,600
Reputation
-17,856
Daps
84,290
Reppin
NULL
Here is an Intel technician showing off real-time photo-realistic rendering: Next Generation Photo-Realistic Rendering - http://www.youtube.com/watch?v=su504HbsX8c. Sure, the image being rendered is only a crown; however, it's still an example of where the technology is right now and how much it can advance in just a decade.

Photo-realistic video games should be possible in 10-15 years. The technology already exists; it's just a matter of it becoming cheap and ubiquitous enough to be used to mass-produce games.
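
for a sense of the raw arithmetic behind real-time photo-realism, here's a rough cost model; every number in it is an illustrative guess on my part, not a figure from the Intel demo:

[code]
# back-of-the-envelope: rays per second for real-time path tracing.
# all numbers are illustrative guesses, not from the Intel demo.
width, height = 1920, 1080    # 1080p
fps = 60
samples_per_pixel = 64        # enough samples for tolerably low noise
bounces = 4                   # light bounces traced per sample

rays_per_second = width * height * samples_per_pixel * bounces * fps
print(f"~{rays_per_second / 1e9:.0f} billion rays/second")
# ~32 billion rays/s, versus the hundreds of millions of rays/s that
# today's hardware manages -- which is why the demo is a single crown.
[/code]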
 

Malta

Sweetwater
Joined
Apr 30, 2012
Messages
66,896
Reputation
15,250
Daps
279,767
Reppin
Now who else wanna fukk with Hollywood Court?
I'm sorry, but this takes the cake for retarded comments in this thread. Can you provide a single shred of evidence that we have reached a point of diminishing returns? Computational power is the one thing in life that has so far seemed to defy the law of diminishing returns. Moore's law continues to give us more powerful computers that make tasks that were impossible just 10 years ago ubiquitous today.

Compare video games of today to those of 10 years ago and tell me we've reached a point of diminishing returns. I'm not that old, but I've lived thru the PS1-to-PS3 transitions and it has been nothing short of astounding. Computing power continues to advance exponentially. CGI in movies today is immensely better than it was 10 years ago, and there is no reason to believe it won't continue to get better. Everyone and their mama has been calling for the death of Moore's law for almost 50 years now, but it keeps chugging away. DARPA is now planning to build an exascale supercomputer by 2018. That is a computer 1,000x faster than the fastest supercomputer of today. That work will have the effect of making laptops and consoles in 2020 1,000x faster than they are today. You don't need to be a brain surgeon to realize that we are far from the point of diminishing returns.

Don't let the fact that we haven't had a new system in 7 years bias your reading of reality. The only reason it looks like not much has happened lately is that developers have been working on hardware that is 7 years old. We have yet to see serious developers work with technology as advanced as what will be available in the new PlayStation and Xbox. Let's wait until the new games come out before you start making declarations that have no basis in fact.


You're expecting CG-level graphics, yet you're calling my comment retarded :dry:
 

Data-Hawk

I have no strings on me.
Joined
May 6, 2012
Messages
8,423
Reputation
1,995
Daps
16,331
Reppin
Oasis
Some people are just always looking for bad news. The simple truth is, for all the people saying graphics have hit a wall: WATCH ANY fukkING NEW MOVIE. They are all using CGI that renders images that look stunningly real. The only reason we can't get those in video games right now is COST.

The movie industry is using farms of supercomputers working for weeks to render movies like Avatar or Life of Pi. Those real-life graphics currently cost millions of dollars and take weeks. However, thanks to MOORE'S LAW, which has held true for almost 50 years, that computational power will be available on a $1,000 laptop by the mid-2020s.

The issue of whether graphics can get better is NOT a question of if, but of WHEN. All you clowns saying we've hit a wall need to read a fukking book rather than doing keyboard prognostication.

Even if that power were on desktops/laptops, games still wouldn't look like that, because games have to render in real time, which is more intensive than a movie's offline rendering. You also have to think about the power/heat that would be generated. Think about the PSU/cooling systems you would need to get away with that kind of real-time rendering. I can't even imagine at this point how one PSU could handle all of that, when the movie industry is using multiple computers stored in some cold-ass room and it's still taking them a long time to render those scenes.

I don't have the clip with me, but John Carmack gave a speech a few years back about the difference between the gaming industry and the movie industry, and why the movie industry moves faster than games. He also stated that past 1 CPU you lose return on investment with each CPU you add (see the sketch below). Desktops hit that wall after we crossed the 1 GHz line, hence everything being dual-core/quad-core etc.
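
That diminishing-returns point is basically Amdahl's law. A minimal sketch, where the 10% serial fraction is just my example number, not anything Carmack quoted:

[code]
# Amdahl's law: speedup from N CPUs when a fraction s of the work is
# inherently serial. the 10% serial fraction is an example number.
def speedup(n_cpus, serial_fraction=0.10):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cpus)

for n in (1, 2, 4, 8, 64):
    print(f"{n:3d} CPUs -> {speedup(n):.1f}x")
# 2 CPUs give 1.8x, 8 give 4.7x, 64 only 8.8x -- the return on each
# extra CPU collapses, which is the wall described above.
[/code]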

Yes, computers will continue to become faster, but it's going to take a lot more than just throwing in more cores, upping the RAM, etc. Yes, walls do exist. Look at NASA, which has some of the smartest people in the world: how much has it advanced since the '60s?


In the end I don't think anybody is totally disagreeing with you, Swag; you're just making it sound like cost is the only reason we aren't there yet. In my opinion we need some type of breakthrough, and I believe it's going to come from mobile devices, since they have to add more power while finding a way not to kill your battery.
 

Data-Hawk

I have no strings on me.
Joined
May 6, 2012
Messages
8,423
Reputation
1,995
Daps
16,331
Reppin
Oasis
Also, when we say "wall" it's more like slowing down for a period of time; something will eventually come along that blows everybody away, it just won't be anytime soon. We need to figure out how to make things go faster using fewer resources, and the #1 thing is using less HEAT/power. lol, imagine your electric bill if you had movie-like graphics on your desktop updating at 60 FPS.
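
for fun, the napkin math on that bill; the wattage and the power price are illustrative guesses:

[code]
# electric-bill napkin math. the 10 kW draw (a render farm in a box,
# vs ~300-500 W for a gaming PC today) and $0.12/kWh are guesses.
watts = 10_000
price_per_kwh = 0.12
hours_per_day = 4    # an evening gaming session

daily = watts / 1000 * hours_per_day * price_per_kwh
print(f"${daily:.2f}/day, ~${daily * 30:.0f}/month")
# ~$4.80/day, ~$144/month -- before you even pay to cool the thing
[/code]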
 