CryEngine 3 'beyond' next-gen console capability

I may not like consoles? Where did I say that?

And if by "hardcore" (good lord do I hate these senseless titles) you actually mean only those who could actually have access to whatever computer hardware was available in the late 40's throughout the 60's, then yes.

I meant "you" as in people in general not at anyone specifically. And yeah I meant the 40's through the 60's. :rolleyes: Up until probably the mid 90's pc gaming wasnt a big part of the gaming pie. Yes there where plenty of people on the pc gaming, but not like it is today. The gap has narrowed considerably in the last 15 years. Thats all I meant.
 
First: This was NOT a realtime demo. If you want to compare the Square Enix video to something from 2005, use the FFVII PS3 tech demo, which looks worse than almost every good PS3 game.

Second: Back when real Killzone 2 footage was shown, I compared it in detail with that old video, and it actually surpasses it in many aspects. Weapon detail, geometry, fire dynamics, draw distance and some other stuff are better in the actual game. And before you start to discuss this further: yes, I am willing to make the same comparison again just for you, but not now, because I have other stuff to do.
Okay, I didn't realise the SE tech demo wasn't real time, but I think you still get my point: don't take a tech demo on unspecified hardware to actually mean much. That's my point of view, at least.
You do not have to upgrade your components every year to be able to play the latest games. The great thing about PC is the ability to upgrade when YOU want; the user gets to choose what level of graphics quality they would like to use: low, medium, high, very high, ultra, and even custom settings to hit any level in between the pre-determined presets. It's no harder than that. You can have a 5 year old PC and still be able to play the latest games.
Okay, you're saying you don't have to upgrade every year, which is true. I think upgrading every five years is actually pretty reasonable across the board, even for more casual gamers. But, and this is important, I'd say:

If the gamers that currently flock to consoles were to invest a similar budget into PC gaming, developers would still have to keep in mind that a rather large part of the market is using hardware that is five years old. If they want to keep selling as well as they are now, they'd still have to make sure that their games look somewhat good on five-year-old hardware.

So, I kinda fail to see what benefit it would have if there were no consoles. As long as you can't get cutting-edge hardware every two or three years for the same budget that a console would ask you to invest, large parts of the market would still be playing on outdated hardware. Nothing would be gained, if you ask me.
Pirated PC games have the DRM stripped out. I think consoles are tougher because the hardware has to be modded.
Exactly. With almost all PC games, pirating goes like this: download Daemon Tools and uTorrent, go to a torrent website, download a game by a given release group, mount the image, install the game, copy & paste the cracked files.

Almost any kind of DRM can be cracked and, to my knowledge, has been cracked. Even fully server dependent games such as World of Warcraft.

Now, on a console, you usually have to modify the hardware and/or jailbreak it, which is far more complex than downloading two programs that are openly available anywhere and copy-pasting a few files. You usually have to create physical copies of the games, too, making things even more inconvenient.

The more open nature of a computer's operating system makes it far easier to circumvent any sort of copy protection or DRM tactic. Anyone who has ever wanted to run homebrew software on a mainstream console should know that it is far more inconvenient to do so than it would be on a PC, where stuff like that is absolutely normal.
 
What is gained is the ability to utilize all aspects of a developed engine.

Not everyone will be able to experience it; but why should those who can be restricted to limitations set by those who can't? Sounds like communism.

Like I said, you can scale back graphical settings, which would inherently turn off various techniques like AA; nothing is lost for those users who cannot afford higher-end hardware. They would still get the same or a higher level of detail than a console, and have the ability to upgrade when they deem necessary.

For example, my brother only spent $500 on his gaming PC. He can run BF3 at 1680x1050 on very high settings and achieve 60 FPS. Consoles limit Frostbite 2 to 720p at 30 FPS at roughly the equivalent of medium graphical settings (not to mention the bottleneck in console compute, which limits the players on screen to 24 as opposed to the PC's 64).
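
As a rough illustration of that gap, here's a quick back-of-the-envelope comparison of raw pixel throughput for the two setups mentioned above (a sketch only; it assumes the quoted resolutions and framerates and ignores graphical settings entirely):

    # Rough pixel-throughput comparison of the two setups quoted above.
    # Ignores settings/quality; only counts pixels drawn per second.
    pc_pixels_per_sec      = 1680 * 1050 * 60   # ~105.8 million pixels/s
    console_pixels_per_sec = 1280 * 720 * 30    # ~27.6 million pixels/s

    print(pc_pixels_per_sec / console_pixels_per_sec)  # roughly 3.8x the raw pixel rate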

We are moving into 4K resolutions, 3D gaming, and multiple screens. Consoles will not be able to do this for several generations.
 
I would like to add: most computers, and the content made on them, can push far past what any PC or console can display at even 30 FPS.

Most game assets are made to a much higher degree of detail than what you see even on the best $10K+ gaming rig.

Artists can make graphics far better. Just look at these renderings. (They take anywhere from 1 to over 24 hours to render just one frame, and many of these people have more CPU power than most high-end gaming rigs.)

http://www.raph.com/3dartists/artgallery/
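
To put those render times in perspective, here's a quick calculation of how far offline rendering sits from a real-time frame budget (a sketch based on the 1-to-24-hour figures quoted above and an assumed 30 FPS target):

    # Offline render time vs. a real-time frame budget, using the render times
    # quoted above and an assumed 30 FPS target.
    realtime_budget_s = 1 / 30          # ~33 ms per frame at 30 FPS

    for hours in (1, 24):
        offline_s = hours * 3600
        print(f"{hours}h/frame is ~{offline_s / realtime_budget_s:,.0f}x slower than real time")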

But the different hardware (all of it: OpenGL vs DirectX, ATI vs Nvidia, Intel vs AMD... the list goes on) is what is holding the games back. Sure, as a whole, if every gamer had a $10K computer (and consoles died) with more RAM than many computers have hard-drive space, most games would look awesome.
But... We still have bottlenecks.
  • Each GPU and CPU has different tools, so hardware compatibility is still, and always will be, a BIG problem.
  • Each game still needs to find skilled artists to make it.
  • Even if you give the artists more headroom, they will find the limits and start complaining.
  • EDIT: I forgot to mention monitor resolution. If 3 monitors won't max out the GPU, gamers will add more.
Downsizing the graphics from those supercomputer limits back to what most people can display would mean not just changing some LODs but making a new game engine (if you want it to run on a range of hardware). So that is why very few games are made to push "very fast" computers.

Any hardware difference is going to hurt games, so I would think that if ATI, Nvidia, Intel, and AMD all picked a very strict standard (like the W3C did for HTML, only this time everyone plays by the rules) for everything a game could ever use (GPU calls, vertex buffers, particles... this list is almost infinite), then we would have a huge bottleneck removed.

Consoles are not the only thing hurting PC games (And really, the only thing they may be hurting is sales).
 
Yeah, no it didn't.

The videogame market started off with arcade machines and consoles like Pong, or stuff like video poker/football handhelds. Yes, the first game prototypes were on big computers, but no one really used PCs for gaming back then, because almost no one had a PC. A bigger PC gaming market started with the C64, but consoles, from Atari for example, were still the number one gaming platform until the gaming market crashed. Then, Nintendo released the NES (another console) and gaming became popular again.
 