Antialiasing... Do you think PD will do it?

  • Thread starter: Dracwolley
  • 104 comments
  • 8,194 views
All the shadow jaggies etc. will be fixed in the final GT5, and the game will run at a smooth 60 fps everywhere. I am running Prologue in 1080p on a Full HD Samsung, as well as on a friend's borrowed HD Ready LG. What is 720p good for? The picture is rendered at 1280x720, but an HD Ready TV has a 1366x768 panel, so it sadly rescales the picture again, resulting in a really muddy image.
 

Well, I hope the shadow jaggies are gone, but the shadows will still seem a bit hard-edged and blocky. I don't think the PS3 will be capable of soft shadows in GT5.

http://www.gamedev.net/reference/articles/article2193.asp

[Image: comparisonf.jpg, the shadow comparison from the linked article]


GT5 is akin to the image on the right. Well, maybe not quite that bad...
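
For the curious, the soft edges in that article come from percentage-closer filtering (PCF): instead of one depth comparison per pixel against the shadow map, you average several. Here is a rough CPU-side sketch of the idea; the map size, bias and shadowMap array are made up for illustration, and a real version would live in a pixel shader:

Code:
#include <algorithm>

// Hypothetical map size and depth bias, purely for illustration.
const int   kMapSize = 1024;
const float kBias    = 0.002f;

// Depth seen from the light for each texel (filled in elsewhere).
static float shadowMap[kMapSize][kMapSize];

// Hard shadow test: a single comparison gives a binary lit/unlit
// result, which is exactly what produces blocky, jagged edges.
float HardShadow(int u, int v, float fragDepth) {
    return (fragDepth - kBias > shadowMap[v][u]) ? 0.0f : 1.0f;
}

// Percentage-closer filtering: average a 3x3 block of comparisons.
// Edges now ramp from 0 to 1 (a fake penumbra), at ~9x the sampling cost.
float SoftShadowPCF(int u, int v, float fragDepth) {
    float lit = 0.0f;
    for (int dy = -1; dy <= 1; ++dy)
        for (int dx = -1; dx <= 1; ++dx) {
            int su = std::clamp(u + dx, 0, kMapSize - 1);
            int sv = std::clamp(v + dy, 0, kMapSize - 1);
            lit += (fragDepth - kBias > shadowMap[sv][su]) ? 0.0f : 1.0f;
        }
    return lit / 9.0f;  // fraction of samples lit
}

That 9x sampling cost per shadowed pixel is why it's a real question whether the PS3 has the headroom for it at 60 fps.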
 
Well, it's either antialiasing at 30 fps with fewer cars, or 16 cars on track at 60 fps with jaggies and pixelated edges. Since GT5:P already has some problems keeping a solid 60 fps, I think the PS3 just won't do much more.
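
To put rough numbers on that trade-off: the brute-force form of antialiasing is supersampling, where the frame is rendered larger and filtered down, so the fill cost multiplies directly. A toy sketch; the Pixel type and the 2x2 factor are just illustrative, and hardware MSAA is cheaper than this:

Code:
#include <cstdint>
#include <vector>

struct Pixel { uint8_t r, g, b; };  // illustrative pixel type

// 2x2 ordered-grid supersampling: render the frame at double the width
// and height (4x the pixels, so roughly 4x the fill cost), then average
// each 2x2 block down to one output pixel.
std::vector<Pixel> Downsample2x2(const std::vector<Pixel>& hi,
                                 int outW, int outH) {
    std::vector<Pixel> out(outW * outH);
    const int hiW = outW * 2;
    for (int y = 0; y < outH; ++y)
        for (int x = 0; x < outW; ++x) {
            int r = 0, g = 0, b = 0;
            for (int dy = 0; dy < 2; ++dy)
                for (int dx = 0; dx < 2; ++dx) {
                    const Pixel& p = hi[(y * 2 + dy) * hiW + (x * 2 + dx)];
                    r += p.r; g += p.g; b += p.b;
                }
            out[y * outW + x] = Pixel{static_cast<uint8_t>(r / 4),
                                      static_cast<uint8_t>(g / 4),
                                      static_cast<uint8_t>(b / 4)};
        }
    return out;
}

At 1280x720 that means shading roughly 3.7 million samples a frame instead of 0.9 million, which is exactly why AA and a locked 60 fps fight over the same budget.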
 

:D Tilt your head to the left; that picture of shadow casting looks like a gentleman standing, errr... well, use your imagination LOL

The shadows really come down to whether the engine and PS3 can manage soft stencil shadows. I don't think so. They might be able to improve them in their current state, but the chances of seeing full soft shadows are low, because it's not a small step; it's performance-intensive.
 

Lol just noticed that :lol:

Yeah, I don't think we will see soft shadows either; it's still an independent option in the menus of a lot of PC games, because it's quite intensive.

Then again, this video shows soft shadowing being done on the CPU. Cell is pretty damn powerful.

 

True enough. But then, when you read what CPU they are running it on, at what resolution and how fast... it does still show how intensive it is. I think it's a minor feature, and CPU threads should be reserved more for AI and other, more important things.
 
I do not want PD to try to improve the visuals if 60 fps is not achieved.
+1

It's tough enough as it is in GT5P, and further improved graphics could only mean a loss of FPS, which is a shame, because AA would be nice indeed...
 

It's an advert for Intel more than anything. For years we have known that the merging of CPU and GPU was on the horizon. Some of the things said, however, are pipe dreams and half-truths. Full real-time ray tracing in games, for example, is a pipe dream; rasterisation isn't going anywhere. The article is very much based on software experience; Tim Sweeney is, of course, a software engineer. Saying software rendering can be more flexible than a fixed pipeline is mind-numbingly obvious. Actually making it happen so that it is remotely as fast as hardware rendering is another thing, however. Wishful thinking right now.

http://www.tomshardware.com/reviews/ray-tracing-rasterization,2351.html
 

Read the article I linked to. Ray tracing has a lot of myth surrounding it, and a lot of hype from the casual crowd. In truth it's not the ultimate be-all-end-all solution, and it's not something the games industry should be aiming at right this second.

http://www.tomshardware.com/reviews/ray-tracing-rasterization,2351-9.html

In short, the advantages that ray tracing can actually bring, like reflections and transparency, aren't that important for most games, especially versus the massive performance cost. Does a real-time ray-traced Bugatti Veyron REALLY look that much better than a rasterized version? Sure, it can have more accurate reflections, but as the article states, the eye and brain don't typically care that much.

[Image: 3789326444_4136467ba2_b.jpg, a rasterized render]

VERSUS

[Image: 8449-nvidiaraytrace.jpg, a real-time ray-traced render]


Is real-time ray tracing REALLY that much superior to advanced rasterization in your eyes? Anyway, I'm really off topic here and should probably wind it back.
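
For anyone wondering where the cost gap comes from: a ray tracer has to solve an intersection per ray, per object, and at 1280x720 a single frame is already about 0.9 million primary rays before any reflections, shadows or antialiasing. A toy ray-sphere test in C++, with all the names invented for illustration:

Code:
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };

float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Solve |o + t*d - c|^2 = r^2 for the nearest positive t.
// This quadratic runs once per ray per sphere; a rasterizer
// touches each triangle once per frame instead.
std::optional<float> RaySphere(const Vec3& o, const Vec3& d,
                               const Vec3& c, float r) {
    Vec3 oc{o.x - c.x, o.y - c.y, o.z - c.z};
    float b = Dot(oc, d);                    // d assumed normalised
    float disc = b * b - (Dot(oc, oc) - r * r);
    if (disc < 0.0f) return std::nullopt;    // ray misses the sphere
    float t = -b - std::sqrt(disc);
    if (t < 0.0f) return std::nullopt;       // sphere behind the ray
    return t;
}

Multiply that per-ray work by every pixel, every bounce and every light, and the article's point stands: the extra accuracy rarely justifies the cost on current hardware.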
 
If we forget the ray tracing vs. rasterization subject for a while, I really think combining the CPU and GPU into one unit will help programming, and that will make games easier to make; you will only have to code for one processor.

And the case with the PS3 has been that the software and hardware designers did not communicate enough.
 

The PS3 is a case in point of pig-headed hardware engineering over software consultation. That is exactly why Tim Sweeney uses the PS3 as an example of cost and difficulty, and why he emphasises ease of use over maximum theoretical performance.
 

Yes, but in the long term, when developers acquire the skills, this pig-headed hardware will be superior. Consoles have a much longer lifespan than PCs.
 

The amount of times I have gone over this with people doesn't bear thinking about :grumpy: In the long term it won't count for a lot, because by the time most developers get a grip, the system will be last generation.

Unfortunately for Sony, Microsoft got it right IMO: a bunch of software engineers designing a console architecture and SDK gets you the best console development environment ever seen. For all that raw CPU performance, the PS3 can't compete. By the time it does, it will be irrelevant. I'm sure Sony have learnt a lesson here, though.

Ease of use, and thus a reduction in time, cost and difficulty, always beats out brute force and ignorance, ESPECIALLY when a cutting-edge console is driven by its software and third-party support. After all, games are a business. Publishers are interested more in their bottom line than anything else; it'll always be that way, like it or not. That is the fatal mistake Sony made.
 
There was also a bit of greedy thinking from Sony: they did not necessarily want third parties to even know how to make the best use of the PS3, so that Sony's own stuff would look the best. They also thought that they would still get all Japanese third-party games as exclusives for free. And MS used Sony's "blueprints" to make the processor for the 360.
Sony have learned a lot this generation, the hard way.
 

It depends how you look at it. Sony have partnered in creating a processor that will outlive this generation of consoles and will most likely be used in the PS4. For the most part, all Sony has to do with the PS4 is scale the same tech up. Microsoft, on the other hand, will most likely have to develop a whole new architecture entirely, which will also involve a whole new learning process, whilst Sony developers will be able to apply the same skills from the PS3 to the PS4. The whole point of Cell was that it was designed to be scalable; it wasn't cheap, and I can't see Sony using a completely different processor in the PS4.
Microsoft, meanwhile, will most likely go along the same lines as Sony for their next console, so technically you could say they are a generation behind.
 

Name a developer outside of Sony that wants to make a game with double the development costs of its rival platform, maybe double the time, more resources needed, more challenges. Then the end result comes out, at the very BEST, a gnat's dick better-looking than on said rival platform, and 9/10 times worse, because there are schedules to keep, budgets, deadlines.

Then tell said publisher that said game will be exclusive to the machine with the lowest install base and the highest price on the market, through the worst global recession in two decades. :ouch:

As for Sony already having a processor for the next generation, it's a dangerous gamble. Hardware moves on extremely fast; it's practically suicidal to rely on one architecture for a decade or more, no matter how much you fiddle and tweak it. Just ask Intel... Microsoft will simply look at what is the best option at the time for the money and go with that. That philosophy netted them the Xbox, and it netted them the 360. It has worked so far.

The ONLY way the PS3 can get out of the rut it is in soon enough is to sell much, much faster than the 360 and overtake it as the lead platform by a very long way. But of course, such is the vicious circle: you need good software and lower hardware prices to make that happen. That also takes time, time the PS3 doesn't have if the next generation appears in 2012/2013 as most predict.
 
Yes, but in the long term, when developers acquire the skills, this pig-headed hardware will be superior. Consoles have a much longer lifespan than PCs.

That might happen, but I doubt it. In reality, I believe the PS3 will get crappier ports, less optimised games and fewer games overall. Sony may want to support the PS3 for its 10-year life cycle (even after the PS4 comes out), the same way they have with the PS2, but game developers may not see it the same way.

You'll get a handful of well-optimised games like GT5 and Killzone, but beyond that the PS3 will suffer in both the quality and the quantity of its games.

All consoles have a longer lifespan than PCs... but PCs are (usually) required for everyday tasks anyway, so their cost can be slightly better justified. Plus, peripherals and games are cheaper. When I bought my $1300 AUD PC (when the Xbox 360 was still over $500 and the PS3 was probably $700 or more), I justified it by realising I needed a PC anyway, and a crap PC plus a console was more expensive than a good gaming PC.
 
As for Sony already having a processor for the next generation, it's a dangerous gamble. Hardware moves on extremely fast; it's practically suicidal to rely on one architecture for a decade or more, no matter how much you fiddle and tweak it. Just ask Intel... Microsoft will simply look at what is the best option at the time for the money and go with that. That philosophy netted them the Xbox, and it netted them the 360. It has worked so far.

It's ironic you should say that, because a lot of Intel's latest chips are based on architectures prior to, but not including, the Pentium 4. In a way they have gone full circle.
 

What Intel did with NetBurst was select the best architecture for that time period. The irony is that as time went on, the better solution just happened to lie down the road of older ideas. This isn't really that relevant to my point, apart from showing that you can't predict the needs of the future with ultimate accuracy, so you have to adapt quickly. Sony have created a solution to a problem in gaming that doesn't exist.

Cell is the answer to a question no one asked.
 

Most manufacturers in general are going down this multi-core route, for both CPUs and GPUs: lots of simple cores on a die as opposed to a few complex ones. Following this line of reasoning, it would be reasonable to assume that, for the most part, all that needs to be done with Cell is to scale it up, with more cores on a die. The benefit of multi-core technology, as I'm sure you know, is that it scales very well, pretty much 1:1.
 

If only it were just about how many cores you can have, or how many a developer might want. In simple terms, it isn't; it's about how your architecture performs for the task needed. This is exactly why a machine with three cores and a custom GPU somehow manages to outperform a machine with nine cores, a bunch of them unusable by the dev.

Probably because the first time you start to design a console, the number one question is what is best for the task at hand, i.e. gaming!! Microsoft's answer was a custom machine built from the ground up for games. Sony's answer was a CPU mostly intended for, and practical in, other markets. The difference is stark from the outset. Microsoft designed the hardware with a sole focus, and Sony took their eye off the ball with nearly everything. Blu-ray is another perfect example. Do you really think it came about because Sony sat down to answer the question of games needing more disc space? Or were they truthfully just looking for a platform to launch their next format?

Did the question ever actually exist for this generation??
 
Well, I hope the shadow jaggies are gone, but the shadows will still seem a bit hard-edged and blocky. I don't think the PS3 will be capable of soft shadows in GT5.

http://www.gamedev.net/reference/articles/article2193.asp

[Image: comparisonf.jpg, the shadow comparison from the linked article]

GT5 is akin to the image on the right. Well, maybe not quite that bad...
That looks more like a simple blurred-edge effect to hide the jaggies on low-res shadows.

Anyway, an effect like that would ruin the realistic hard-edged shadows on sunny tracks:

[Images: Porsche_track_day_003.JPG and Porsche_track_day_006.JPG, real-world hard-edged shadows at a track day]

What GT5 needs is to render the shadows at a higher resolution in close-ups and cockpit view; the technique is spot on, and even has edge antialiasing.

[Image: dscf2307l.jpg, close-up shadow rendering]

A true soft shadow effect would only be useful for real-time weather and time-of-day changes, but that is too expensive and complex to render in a game like GT5.
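
One standard way to get exactly that is cascaded shadow maps: spend the texels on what is near the camera and let distant geometry share a coarser map. A minimal selection sketch; the split distances and resolutions here are invented for illustration:

Code:
// Hypothetical cascade setup: near objects get a dense map,
// far objects share a coarse one.
struct Cascade {
    float maxDistance;   // view-space distance this cascade covers
    int   resolution;    // shadow-map size in texels
};

const Cascade kCascades[] = {
    {  10.0f, 2048 },   // cockpit / own car: crisp, antialiased edges
    {  50.0f, 1024 },   // nearby cars and trackside objects
    { 300.0f,  512 },   // distant scenery: jaggies too small to notice
};

// Pick the cascade for a fragment at the given camera distance.
int SelectCascade(float viewDistance) {
    for (int i = 0; i < 3; ++i)
        if (viewDistance <= kCascades[i].maxDistance) return i;
    return 2;  // beyond the last split: clamp to the coarsest map
}

Each cascade is rendered from the light with its own projection, so close-up shadows get more effective resolution without blowing the total memory budget.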
 
@Vulcan Project:

An interesting read:

For years, developers were able to take advantage of faster and faster processors from Intel and Advanced Micro Devices. All they had to do was write their program once, and it would run faster and faster as Intel and AMD cranked up the clock speed.

But overheating forced chip companies to adopt designs with two or more processor cores running at slower speeds, which meant that some applications written to run on a single thread couldn't take advantage of that extra horsepower. This has required an entirely new way of looking at software development, prompting Intel this week to release another batch of software development tools aimed at helping developers make that transition.

Major games take years to develop, meaning that most of the games released around the time that dual-core chips hit the market in 2005 were not built with two lanes in mind. The good news is that developers have found a way around this so far with patches, which alert the game that it has two cores to work with.

"I'd say we're at a 'C-plus' right now. When the first dual-core chips came out (in 2005), we were at a D-minus."
--Randy Stude,
director, Intel's game platform office The bad news is that releasing patches is only a stopgap solution until game studios sell games designed with multiple software threads in mind. More and more studios seem to be getting the message, with dozens of major titles in the works for multicore processors. But this is hard work--the abandonment of decades of programming expertise for a new way of exploiting processor power.

"I'd say we're at a 'C-plus' right now," said Randy Stude, director of Intel's game platform office, assigning a grade for the industry's progress toward parallel development. "When the first dual-core chips came out (in 2005), we were at a D-minus."

Intel and AMD have spent significant time and energy urging developers to take advantage of the "low-hanging fruit"--easy ways to make their games more aware of parallel computing. AMD even sponsored a coding competition last year to help drive those points home.

As a result, over the past year, major game studios such as Blizzard Entertainment (World of Warcraft) and Id Software (Quake and Doom) have released patches to make their games multicore-friendly.

But that's not the same as having designed the game from day 1 with multiple processors or multicore chips in mind, said Ted Pollak, an analyst at Jon Peddie Research.

"It won't give the same kind of performance, but it's going to help, and it's better than nothing," Pollak said.

According to lists supplied by Intel and AMD, just more than 25 games are available that were designed with multiple-core processors in mind. One of those games, THQ's Supreme Commander, made its debut in February.

"We feel it's a design choice you have to make from the outset," THQ spokesman Ben Collier said.

Unfortunately, it's not always that simple. Massive PC games are multiyear projects, and many companies are reluctant to tinker with code that has been well-received by the public. Some developers are just working on a single game, while others are creating game engines that will power several games.

One company thinks that it has a product that can help alleviate the long nights spent coding for multicore chips. "It's a way to continue to use serial programming but achieve a parallel approach to data parallelism," said Ray DePaul, CEO of RapidMind.

Most of the work on the RapidMind development platform has been for IBM's multicore Cell processor, but the company is working on tools to support multicore x86 chips from Intel and AMD as well, DePaul said. Developers use an API (application programming interface) to write their application, and the platform figures out how to distribute the load across multiple cores.

A company called PeakStream has a similar product that can let developers plunge right into the multicore world.

Intel thinks that developers might as well just get used to the parallel world, however. Soon all PCs will have at least dual-core chips, with quad-core desktop chips already available from Intel and coming later this year from AMD.

Console games appear headed in that direction as well, Stude said, given the use of multicore chips in the PlayStation 3 and Xbox 360.

"The learning curve is becoming less and less to get threading work done," Stude said.
 
A true soft shadow effect would only be useful for real-time weather and time-of-day changes, but that is too expensive and complex to render in a game like GT5.

It's all a bit too much for GT5 now, I think. I'll say it again: people are asking for rather more than is possible with the current hardware. The focus should be on getting it good enough not to compromise the framerate, the resolution, or any other aspects currently in place.
 
EDIT: Forget it, the thread is moving faster than I care to keep up with :p

But with regards to multicore games: it's far from 1:1 scaling. Quad cores have been around for a few years now, but for gaming a good dual core is still better than a slower quad. Even with just two cores, PC games don't fully utilise them both; most just offload some simple tasks to the second one to increase what can be done on the first before it bottlenecks.
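
That matches Amdahl's law: if only a fraction p of the frame's work can be parallelised, n cores give a speedup of 1 / ((1 - p) + p/n), which flattens out fast. A quick sketch of the arithmetic; the 40% figure is an invented example, not a measured number:

Code:
#include <cstdio>

// Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the
// parallelisable fraction of the work and n the number of cores.
double AmdahlSpeedup(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main() {
    const double p = 0.40;  // assume only 40% of a frame parallelises
    for (int n : {1, 2, 4, 8})
        std::printf("%d cores -> %.2fx\n", n, AmdahlSpeedup(p, n));
    // 1 -> 1.00x, 2 -> 1.25x, 4 -> 1.43x, 8 -> 1.54x:
    // nowhere near the "1:1" scaling hoped for above.
}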
 
@Vulcan Project:

An interesting read:

Interesting maybe, but also, no offence, irrelevant to current console technology. My point is that the PS3 was not as focused on being a games machine as Microsoft's console hardware and software have been. I don't see any defence for that, which leads on to the second point:

Predicting the future accurately is still impossible, last time I checked, and gambling on something still being with the trend a decade down the line is, once again, not the best way to build a new console.
 