1080 Vs. 720, huge PNG inside (56K you're warned!)

  • Thread starter R0M
  • 51 comments
  • 24,325 views
You can't really compare the quality of 720p and 1080i/p from a still frame of a moving image. The best way is to sample an actual game, in motion, on a particular TV. But 1080 will always carry more image information and will show a better image as long as the screen size is 40 inches or more; that's why you need the higher pixel count for image sharpness. If you compare them on a small screen, 27 inches or less, you will hardly be able to tell the difference, which would make the argument useless.
 
That's not true.
Higher resolution is higher resolution, and you will also notice the difference on a smaller screen.
The difference between 1080p (even upscaled, as in GT5P or WipEout HD) and 720p is very noticeable on my 24" SAMSUNG monitor.
 
First, let me warn anyone looking at screenshots against jumping to any conclusions about which is better... don't!

(although I certainly understand and appreciate the effort it took, and the desire people have to do it)

First of all, they are just screenshots, not video, and second of all, there are many different variables that will negatively or positively impact image quality depending on each specific set-up, source, and what is being watched.

Not to mention the fact that if you truly capture a single frame of 1080i video... you are really only seeing every other line of the image, the rest being scaled to fill the missing lines of resolution (1920x540 scaled to 1920x1080)... which will make the 1080i seem inferior. The reality is that if the 1080i is properly flagged and deinterlaced, it is absolutely identical to 1080p when watched as video... not as a single frame... and that's because the two 1920x540 interlaced fields are refreshed at such a high rate that the human eye perceives them as a single 1920x1080 frame.
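
For anyone who likes to see that concretely, here is a minimal sketch in Python/NumPy (using made-up placeholder fields, not real captured video) of how a weave deinterlacer rebuilds a full frame from the two fields, versus the "bob" approach that only uses one field:

```python
import numpy as np

# Two hypothetical 1080i fields: the "top" field carries the even-numbered
# lines of the picture, the "bottom" field carries the odd-numbered lines.
# Each field is 1920x540 (the values here are just placeholders).
top_field = np.zeros((540, 1920), dtype=np.uint8)
bottom_field = np.ones((540, 1920), dtype=np.uint8)

# Weave deinterlacing: interleave the two fields line by line to rebuild the
# full 1920x1080 frame. If both fields come from the same instant (properly
# flagged film/CG content), the result is identical to a 1080p frame.
frame = np.empty((1080, 1920), dtype=np.uint8)
frame[0::2, :] = top_field      # even lines from the top field
frame[1::2, :] = bottom_field   # odd lines from the bottom field

# "Bob" deinterlacing, by contrast, discards one field and stretches the other
# from 540 to 1080 lines, which is where the "1920x540 scaled up" look of a
# single captured field comes from.
bobbed = np.repeat(top_field, 2, axis=0)   # crude line-doubling, 540 -> 1080 lines

print(frame.shape, bobbed.shape)   # (1080, 1920) (1080, 1920)
```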

Interlaced video is an amazing technology, and while progressive video has some significant advantages, so does interlaced video WHEN properly flagged and deinterlaced for progressive displays (LCD, DLP, PDP, LCoS, etc).

For anyone who really wants to have a better understanding of the differences between interlaced and progressive video, as well as the advantages and disadvantages each technology has to offer and why they are both used, I would strongly suggest you start out by reading the wiki listings for both Interlacing and Deinterlacing - and follow some of their referenced sources for even more detailed explanations. I've checked both of these listings and can say with confidence they are both accurate and informative. 👍

In addition, there are numerous discussions about 1080p vs 1080i vs 720p vs 480p vs 480i in the Electronics & Home Theater area of the forum, as well as the PlayStation 3 area of the forum, with regard to what resolution works best for specific types of displays, so I welcome anyone interested to go and read over those threads and discussions. Not only that, but in many of those threads and discussions, links have been provided to outside sources with even more information and data that supports the information provided.

However, let me at least try and address some of the things mentioned already in this thread in an attempt to clear up some of the popular misconceptions about resolution.




Ugh, 1080i is horrible for gaming.
&
I'm no expert or anything, but I've heard that 1080i is horrible for videos, but fine for gaming.

Neither is necessarily true, nor necessarily untrue; it all depends on several factors.

If you have a display, like a CRT, that supports 1080i, then 1080i is actually ideal and will look better than 720p, and many of those types of HDTVs don't support 1080p - thus 1080i is ideal in those cases.

In addition, if you have a fixed pixel display (LCD, DLP, PDP, LCoS, etc) and it has a native resolution (the actual pixel count) of 1920x1080, and it has a video processor that can properly deinterlace, then 1080i is going to almost always look better than 720p.

However, if your display has a *native resolution of 720p (1280x720) or, as is often the case, 768p (1366x768), then you'll most often get the best results by feeding it 720p video, simply because you avoid some of the additional artifacts caused by scaling, not to mention the chance that any needed deinterlacing is done improperly, resulting in a loss of 50% of the vertical resolution... which would then require additional scaling to fill in the gaps.

* It is also critical to understand the difference between native resolution and supported resolution. Just because a display says it "supports" 1080i or even 1080p in no way means it can actually display 1080i or 1080p without downscaling it to match the actual native resolution of the display's panel.

If you own a display that does not have a 1920x1080 pixel panel, you'll almost always get a better result feeding it a 720p image.

Although in the case of content that is native 1080p, if the display has a very good video processor, then seeing as the signal is going to have to be downscaled anyway, you may very well get better results feeding it the 1080p signal.
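
To sum up the logic above in one place, here is a rough sketch (a hypothetical helper for illustration, not any real device's firmware) of the processing a fixed-pixel display has to apply to a given signal before it can show it:

```python
# A rough sketch of the decision chain described above: what a fixed-pixel
# display has to do to an incoming signal before it can light up its panel.
def processing_steps(panel_w, panel_h, src_w, src_h, interlaced):
    steps = []
    if interlaced:
        steps.append("deinterlace (quality depends on the display's video processor)")
    if (src_w, src_h) != (panel_w, panel_h):
        direction = "downscale" if src_w * src_h > panel_w * panel_h else "upscale"
        steps.append(f"{direction} {src_w}x{src_h} -> {panel_w}x{panel_h}")
    return steps or ["none - native match"]

# A 1366x768 panel fed 1080i has to deinterlace AND rescale;
# the same panel fed 720p only needs one mild upscale.
print(processing_steps(1366, 768, 1920, 1080, interlaced=True))
print(processing_steps(1366, 768, 1280, 720, interlaced=False))

# A true 1920x1080 panel with a good deinterlacer only needs the deinterlace step.
print(processing_steps(1920, 1080, 1920, 1080, interlaced=True))
```

Every extra step is another chance for a particular display to degrade the image, which is why the "best" input depends on the panel and its processor rather than on the raw numbers alone.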





GT
Overall it gives everyone a good indication of the difference in quality, and that is important. Now let's see what everyone thinks :)

It really doesn't, due to the inherent differences between video and screenshots - especially when the source is interlaced, as either you are only seeing a single field compared to a full frame, or you are seeing how that specific display was able to deinterlace it, in which case YMMV... but one cannot tell that simply by looking at a screenshot.




By the way, for anyone who doesn't know what the numbers mean:
720=1280x720 pixels
1080=1920x1080 pixels

Actually, 720 and 1080 just refer to the number of horizontal scan lines (the vertical resolution); they say nothing about how many pixels are in each line.

Case in point: although GT5:P is "1080p", the actual rendering resolution changes depending on what you are doing in the game. In races, the horizontal resolution drops from 1920 to 1280 pixels per line, and to output 1080p the PS3 scales that 1280x1080 image back out to 1920x1080. If it is set to output 720p, it instead scales the image down from 1080 lines to 720.

In fact, there are many "1080p" games that do not render a full 1920 pixels per line. Some are 1600x1080, 1440x1080, 1280x1080, and even 960x1080, like Conflict: Denied Ops.

While most "720p" PS3 games are 1280x720, a few are not even 720, let alone 1280. Like:
  • 1024x768 Beijing 2008
  • 1152x648 Fracture
  • 1152x640 Dark Sector
  • 1120x630 Alone in the Dark
  • 1024x600 Call of Duty 4 & 5
  • 1040x585 Guitar Hero
The same is true for many 360 games.

In some multi-platform cases the "720p" PS3 versions have a higher native resolution than the "720p" 360 versions, like Tomb Raider: Underworld and Elder Scrolls IV: Oblivion, which are both "full" 720p on PS3, rendered at 1280x720 (about 920,000 pixels per frame), while the 360 versions are rendered at only 1024x576 (about 590,000 pixels). That means the PS3 versions have roughly 56% more native resolution than the 360 versions... about 330,000 more pixels per frame - nearly the equivalent of 480p video on DVD.

On the other side of the coin, the 360 versions of Grand Theft Auto IV and Dark Sector are rendered at 1280x720, while the PS3 versions are rendered at 1152×640. That's a loss of 20% of the resolution, or about 180,000 pixels.
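
For anyone who wants to check the math behind those two comparisons, here is the pixel arithmetic spelled out (nothing console-specific, just multiplication):

```python
# Pixel counts behind the comparisons above (figures rounded in the post).
def pixels(w, h):
    return w * h

ps3_tr = pixels(1280, 720)    # 921,600
x360_tr = pixels(1024, 576)   # 589,824
print(ps3_tr - x360_tr)                         # ~330,000 more pixels on PS3
print(round(100 * (ps3_tr / x360_tr - 1)))      # ~56% more than 1024x576

gta_360 = pixels(1280, 720)   # 921,600
gta_ps3 = pixels(1152, 640)   # 737,280
print(gta_360 - gta_ps3)                        # ~184,000 fewer pixels on PS3
print(round(100 * (1 - gta_ps3 / gta_360)))     # ~20% less
```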

And don't think that by simply scaling these games to 1280x720 or even 1920x1080 they are going to look better. The simple fact is that scaling not only adds artificial artifacts, but it can only try to guess/predict what each additional pixel should be; effectively, all it really does is blow up the native image, as it can't create detail that was never rendered in the first place.
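
If it helps to see why, here is a tiny, purely illustrative sketch (made-up sample values, not real game data) of what a scaler is actually doing when it "creates" new pixels:

```python
import numpy as np

# A 1D row of 4 original samples, stretched to 8 by linear interpolation,
# which is the basic idea behind most upscaling.
row = np.array([10.0, 200.0, 50.0, 120.0])       # the only real detail we have
x_new = np.linspace(0, len(row) - 1, 8)          # positions of the 8 output pixels
upscaled = np.interp(x_new, np.arange(len(row)), row)

print(upscaled)
# Every output value is just a blend of its nearest originals - the scaler can
# smooth and guess, but it cannot recover detail that was never rendered.
```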

If you have a 1080p display and you do notice an improvement when feeding it a signal that is scaled by an upscaling player (like a PS3), versus feeding it the native lower resolution from the same player - that is usually an indicator that your player's video processor does a better job of scaling than your display's.

This is also why one must be careful about comparing images captured from one display and concluding they will look the same on a different display. Not all displays, video processors, deinterlacers, and scalers are created equal, and in fact there are often distinct differences, even within the same manufacturer's product line.




Not only is there a great deal of misconception regarding interlaced and progressive video, as well as the differences between native and supported resolutions, and between rendered and scaled resolutions, but there is also a great deal of misconception about having to use a "big" display in order to even see the difference between 720p and 1080p... and that is absolutely untrue... although that too has been discussed many times before in the same places mentioned earlier in this post... and if this post gets any longer I might exceed the character limit. :D


BTW: My apologies to those who don't like long posts for one reason or another, but seeing as these misconceptions come up so often, I just want to try, at least, to put some of them to rest once and for all.
 
Last edited:
In addition, if you have a fixed pixel display (LCD, DLP, PDP, LCoS, etc) and it has a native resolution (the actual pixel count) of 1920x1080, and it has a video processor that can properly deinterlace, then 1080i is going to almost always look better than 720p.
Yes, but most people who use 1080i have a display with a different native resolution, 1366x768 or something, like you also stated.
It really doesn't, due to the inherent differences between video and screenshots - especially when the source is interlaced, as either you are only seeing a single field compared to a full frame, or you are seeing how that specific display was able to deinterlace it, in which case YMMV... but one cannot tell that simply by looking at a screenshot.
That's true, maybe I shouldn't have posted "1080i is terrible" but rather "1080i on a TV that has a native resolution of 1366... blah". ;)
Actually, 720 and 1080 just refer to the number of horizontal scan lines (the vertical resolution); they say nothing about how many pixels are in each line.
Yes and no.
If a game has "1080p" or "FullHD" on the box it has to support 1080 vertical and 1920 horizontal lines. But yeah, most games, if not all, upscale the image and are rendered in a lower resolution, some games even in pretty strange ones (like Yakuza 3 wich is afaik in 1024x1024).
And don't think that by simply scaling these games to 1280x720 or even 1920x1080 they are going to look better. The simple fact is that scaling not only adds artificial artifacts, but it can only try to guess/predict what each additional pixel should be; effectively, all it really does is blow up the native image, as it can't create detail that was never rendered in the first place.
Yep, this is also the reason the 1080 screens here are so washed out: they are scaled images.


I actually agree with you on everything, and I knew all of that.
Maybe I didn't word it properly though. :lol:
 
Uh, no, not junk. Did you read what I wrote? There are advantages and disadvantages to each format. If you don't have a decent 1080i set to base this opinion on, why persist? Most HD television is broadcast in 1080i - is that junk too?

I still fail to understand how a captured 1080i image is any different from a captured 1080p image; it's all up on the screen when the capture is "taken". If there is a difference, why do we see the entire image with the 1080i - shouldn't half of it be missing? As I understand it, they are different only when displaying images in real time, not for stationary captures.

Well, I do have a 'decent' 1080i set to base this on. 720P is a better-looking picture; you are not going to argue that fact out of anyone. While 1080i is a bigger picture, that doesn't necessarily mean it is better. Did you even look at the link I posted? Here is another one that explains the debate and why it is better.

If you think of progressive as a better refresh rate, then you might get it.

I would be happier if broadcasters output full HD, but they do not because true full HD is too expensive and requires a lot more bandwidth to broadcast.

I will take a 720P res any day; you will get a clearer image with the lower res.

When the day comes that GT is output in full HD, you will see the difference between an interlaced and a progressive picture.
 
Ace, thanks, I get it now. My 1080i TV is inherently different (old rear projector, not a fixed pixel display) and a bad comparison in this case. I can see how 1080i would be compromised on an LCD.

CWR, in your opinion, it's junk. Fantastic, but I don't value your opinion on this topic. Nowhere do I say 1080i is better, in my case there are circumstances where I prefer it, and that has a lot to do with my particular TV and monitor. For GT5P it's a wash. In your dorm room it might be different. I already understood the difference between progressive and interlaced, I didn't know why it would matter in a still image, OK? Thanks for the help.
 
Last edited:
Well, I do have a 'decent' 1080i set to base this on.

What is the make and model of your 1080i set?


I will take a 720P res any day; you will get a clearer image with the lower res.

Clearer? Maybe for certain content on specific displays. Less resolution? Yes, absolutely. Just as it is wrong to say 1080i is always better than 720p, it is equally wrong to say 720p is always better than 1080i - both statements are far too vague to be accurate, and don't take into account the many variables that make them false.

You (and that very abridged article you linked to) appear to be getting hung up on the refresh rate. I suggest you investigate further, including what the human eye is capable of detecting, and why, in most cases, 1080i signals that are properly flagged and deinterlaced are indistinguishable from the same signal in 1080p... and in those cases, they are clearly going to be "better" than the same signal downscaled to 720p.


When the day comes that GT is output in full HD, you will see the difference between an interlaced and a progressive picture.

GT5:P can already output a progressive signal, and other than when racing it is full HD (1920x1080); even when racing it is 1080p (1280x1080 scaled to 1920x1080).

You can also force it to output in 1080i, and when doing so on my 1080p PDP (Vizio VP-504), as I have discussed in the aforementioned threads, it looks identical to when I set it to output in 1080p, but that display also has an excellent deinterlacer. The same is not true for my 42" LCD (Vizio VU42), which does not do proper deinterlacing, and thus I lose 50% of the vertical resolution when watching 1080i content... not to mention LCD motion blur... which again is not the fault of 1080i, as it can be just as easily seen in 1080p video.

Which brings us to the other problem: you seem to be ignoring the possibility that the reason 1080p and even 720p look better on many displays is that perhaps those displays DO NOT properly deinterlace. It's been an industry problem for years, and only recently has it significantly improved... but it has nothing to do with the quality of 1080i. It has to do with the display's deinterlacer, and based on your assessments, you possibly have a display that is not properly deinterlacing and are assuming this is the case for all displays... when it is not.
 
You (and that very abridged article you linked to) appear to be getting hung up on the refresh rate. I suggest you investigate further, including what the human eye is capable of detecting, and why, in most cases, 1080i signals that are properly flagged and deinterlaced are indistinguishable from the same signal in 1080p... and in those cases, they are clearly going to be "better" than the same signal downscaled to 720p.

For digitally rendered scenes (without post-processing) such as we have in GT5P, 720p at 60 Hz will appear smoother than 1080i at 30 Hz. Although the eye cannot process at higher than 24 Hz, input from multiple frames within the same interval will be blended together by the eye (well, the brain anyway), further enhancing the sense of smooth motion.

I myself didn't believe you could distinguish between a 30 Hz and a 60 Hz frame rate back when I was working on video games (using the old NTSC standard), but when shown the same scenes running back to back, the difference is very obvious.

Movies compensate for this by adding blurring effects to the stills themselves, so the playback will appear natural when run at 24 Hz. Also, there would be no difference between 24 Hz and 48 Hz, especially since the source material runs at 24 Hz, so you'd need to duplicate the frames anyway.
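
To put rough numbers on that trade-off (treating 1080i as 30 complete woven frames per second, as described earlier in the thread, and ignoring deinterlacing losses - a simplification, not a measurement):

```python
# Pixels delivered per second under the simplifying assumption above.
p720_pixels_per_sec = 1280 * 720 * 60      # ~55.3 million (60 full frames/s)
i1080_pixels_per_sec = 1920 * 540 * 60     # ~62.2 million (60 half-height fields/s)
print(p720_pixels_per_sec, i1080_pixels_per_sec)

# The interlaced signal carries more spatial detail per woven frame, but the
# progressive one refreshes the whole frame twice as often, which is where
# the smoother motion in rendered scenes comes from.
print(1280 * 720, "pixels x 60 full frames/s  vs ", 1920 * 1080, "pixels x 30 full frames/s")
```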
 
Ace, thanks, I get it now. My 1080i TV is inherently different (old rear projector, not a fixed pixel display) and a bad comparison in this case. I can see how 1080i would be compromised on an LCD.

CWR, in your opinion, it's junk. Fantastic, but I don't value your opinion on this topic. Nowhere do I say 1080i is better, in my case there are circumstances where I prefer it, and that has a lot to do with my particular TV and monitor. For GT5P it's a wash. In your dorm room it might be different. I already understood the difference between progressive and interlaced, I didn't know why it would matter in a still image, OK? Thanks for the help.

Yes, I am saying 1080i is junk... ;) Stick with 720P, you won't be able to tell the difference. Otherwise, spend the money to get a FULL HD set. It's all relative - what difference does it make? Your eyes can only see so well. In my eyes 1080i is just a blown-up version of 720P. All we know is what we are told and what we hear to be the truth. Even with your own eyes, that judgment is made upon someone else saying this is better than that.

Furthermore, I really don't see what difference it would make if my TV were in my house or a dorm room. A still image is confined to a computer program's properties. That 'still' image will never be as good as seeing it on the screen. Just look really closely, and you can see how 1080 is not as sharp (as you said before). If you think about it, you can see it's just a blown-up version of 720P. It looks like an upconvert, and if I am not mistaken, that is what it is.

My TV is a 32" Phillips (32PFL55...) While not a good as a Sharp or Samsung, still considered a 'decent' set. (Max output 1080i)

Digital-Nitrate, it's funny how you agree and disagree with me. I assume you are some kind of professional in the home automation field, or work for some TV manufacturer. But you know what they say, assuming makes an 🤬 out of you and me. ;)
 
Yes, I am saying 1080i is junk... ;) Stick with 720P, you won't be able to tell the difference. Otherwise, spend the money to get a FULL HD set. It's all relative - what difference does it make? Your eyes can only see so well. In my eyes 1080i is just a blown-up version of 720P. All we know is what we are told and what we hear to be the truth. Even with your own eyes, that judgment is made upon someone else saying this is better than that.

Furthermore, I really don't see what difference it would make if my TV were in my house or a dorm room. A still image is confined to a computer program's properties. That 'still' image will never be as good as seeing it on the screen. Just look really closely, and you can see how 1080 is not as sharp (as you said before). If you think about it, you can see it's just a blown-up version of 720P. It looks like an upconvert, and if I am not mistaken, that is what it is.

My TV is a 32" Phillips (32PFL55...) While not a good as a Sharp or Samsung, still considered a 'decent' set. (Max output 1080i)

Digital-Nitrate, it's funny how you agree and disagree with me. I assume you are some kind of professional in the home automation field, or work for some TV manufacturer. But you know what they say, assuming makes an 🤬 out of you and me. ;)

No, I don't agree with you, nor does science. I quite clearly said that suggesting one is better than the other is wrong because it doesn't take into account all the variables where one CAN be better than the other.

Most importantly, there is no assumption needed... your TV, a 32" Philips 32PFL55, can't possibly display 1080i at its full resolution... it's a 1366x768 LCD display. It's a fixed-pixel display and MUST scale down any signal over 1366x768 in order to display it... and considering Philips' past benchmark tests, it also very likely fails to do proper deinterlacing... either of which would cause you to get an inferior image from 1080i material.

Once again, there is a HUGE difference between supported resolution, and the native resolution. Your display has a native resolution of 1366x768... thus you'll never see 1080i displayed as a full HD image. If someone sold you that TV by saying it could display a 1080i image without having to scale it down to 1366x768, you were bamboozled.

Also, video technology and performance, as well as the human visual system, are not subjective. They are based on sound scientific fact, so there is no need to ignore the science and simply write it off as a difference of opinion. It is, however (which is what I've been saying all along), largely dependent on each person's equipment... and in your case, your TV isn't even capable of displaying deinterlaced 1080i at its original 1920x1080 resolution. Instead, like many 768p screens under 40" from over a year ago, it likely downscales the horizontal resolution from 1920 to 1366 and upscales the vertical resolution from 540 to 768. This means that in your case a 1080i signal will only give you a 1366x540 native pixel image (with the vertical resolution scaled to 768). So naturally it isn't going to look as good to you as a 1280x720 native pixel image.
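
To put rough numbers on that worst-case path (assuming the field-discarding behaviour described above - an estimate for illustration, not a measurement of that particular set):

```python
# Effective detail surviving the 1080i path on a 1366x768 panel that discards
# one field: the 1920x540 field is squeezed to 1366 wide and stretched to 768 tall.
effective_1080i = 1366 * 540      # ~738,000 pixels of genuine detail
native_720p = 1280 * 720          # ~922,000 pixels, shown with only a mild rescale
print(effective_1080i, native_720p)
print(round(100 * (1 - effective_1080i / native_720p)), "% less real detail from 1080i on such a set")
```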

However, that's because of the TV you use, and not because of the quality of a 1080i signal. That isn't an assumption, it's science.
 
Last edited:
Thanks Digital-Nitrate for your accurate messages ;)
I couldn't do better to express my point of view... :)

About the 1080i pictures above: as the camera was steady when they were captured, both odd and even fields were grabbed. That's the reason why, in that case, I don't think it's 1920x540 upscaled to 1920x1080, and why I talked about a 1080p "illusion" in my original post :)

And by the way, I enjoy well-documented and technical long posts, especially about video :D
 
1qs8i57agysgr8spr968.png


Quick, lo-res comparison, but you can see a few things. If you want more, I'll make 'em.

The biggest difference is the numbers in the middle.
 
1qs8i57agysgr8spr968.png


Quick, lo-res comparison, but you can see a few things. If you want more, I'll make 'em.

The biggest difference is the numbers in the middle.

Hmm... 720P looks nicer in my opinion... a softer image. The 1080i looks sharper, but with more jaggies visible...
 
Listen up (pay attention when reading): I've read the 1st page of the thread, and it's just a pile of BS.

The difference between 720p and 1080i mostly goes unseen, because 1080i is usually just an upscaled image of the same 720p. And if a game natively can run no higher than 720p, then there will be no difference with 1080i/p. But if a game can run 1080p, and you can't see the difference on an HDTV which does 1080p at 100 Hz, you should stop gaming immediately and go check your eyes.

It's not just about the numbers, it's about the game and how it uses the PS3. If it is cross-compiled or just written by an asshole, then it will look the same on both 1080p and 720p. Like Lego Batman: the game looks worse @1080p than Orange Box games do at 720p.

If you want to see the difference between resolutions, you should really use a full HD monitor with a PC. Like, try running a game @ 1280*720/1024, and then try running it at 1920*1080.
 
The difference between 720p and 1080i mostly goes unseen, because 1080i is usually just an upscaled image of the same 720p.
No, 1080i is interlaced and 720p is not; that alone is a huge difference.
And if a game natively can run no higher than 720p, then there will be no difference with 1080i/p.
Wrong, almost NO game is native 1080p, and they still look better on a FullHD display. If you scale up to 1080i you get a scaled, interlaced image; if you scale to 1080p you get a scaled progressive image, which is always better than an interlaced image at the same resolution, ALWAYS!
It's not just about the numbers, it's about the game and how it uses the PS3. If it is cross-compiled or just written by an asshole, then it will look the same on both 1080p and 720p. Like Lego Batman: the game looks worse @1080p than Orange Box games do at 720p.
That's true though, MCLA has the same issues (especially noticeable in the XMB).
 
No, 1080i is interlaced and 720p is not; that alone can be a huge difference.

Fixed.

As already explained in this thread, back around post #30 and below, when properly flagged and deinterlaced on a 1080p display, 1080i is ALWAYS going to look better than 720p of the same images.


Wrong, almost NO game is native 1080p

Almost NO isn't an actual figure, and it can be misinterpreted, case in point:

So you mean no game actually runs full HD on the consoles? Could you explain why?

There are actually several games that run in native 1080p for the PS3 at least. Last count I believe there were over 20, but that was some time ago.

Just keep in mind, resolution isn't really a limiting factor right now in terms of video game graphics... which are not even close to matching the photo-realism seen in 1080p movies on Blu-ray. This is also why some 720p games look better than 1080p games... the graphic artists for some of those great-looking 720p games put in a lot more work to include an amazing amount of detail for a video game, like Uncharted (which is 720p). Whereas some 1080p games have less detail, not because of the resolution, but simply because the developer didn't take the time to add it. Remember, video games are nothing more than rendered graphics... someone still has to create all that detail... and then you need a powerful enough CPU & GPU, among other things, to render it properly... and then you need the right equipment, display, settings, etc. to see the images at their best.


If you scale up to 1080i you get a scaled, interlaced image; if you scale to 1080p you get a scaled progressive image, which is always better than an interlaced image at the same resolution, ALWAYS!

Only if the interlaced video isn't properly flagged and you have a display or source player (console, disc player, cable box, sat receiver, etc) that does not do proper deinterlacing.

So no, that is not ALWAYS the case.



As I have mentioned before, there are many links in the chain of events that determines the quality of the video by the time your eyes receive the images, many of which can and do negatively impact image quality, and making blanket statements that do not take into account the many very real variables, conditions, and exceptions is misleading at best.
 
Last edited:
As already explained in this thread, back around post #30 and below, when properly flagged and deinterlaced on a 1080p display, 1080i is ALWAYS going to look better than 720p of the same images.
I didn't mean the difference in image quality, I just meant the difference between 720p and 1080i itself. 1080i isn't the same as 720p upscaled.
There are actually several games that run in native 1080p for the PS3 at least. Last count I believe there were over 20, but that was some time ago.
This is new to me, can you give me a list?
I think I only know of one, and that's Virtua Tennis 3, and I'm not even sure about that one.
GT5P, LAIR and WipEout HD are all upscaled; GT5P and WOHD look crisp like nothing else though.
EDIT:
Only if the interlaced video isn't properly flagged and you have a display or source player (console, disc player, cable box, sat receiver, etc) that does not do proper deinterlacing.
Okay maybe "always" was a bit over the top, but so far no one could prove me the opposite and ive never heard from a "payable" TV with a proper deinterlacing.
 
I went from a 52in 1080i JVC projection set to a Samsung 19in 720p LCD. None of you would give 1080i a chance after you saw what I have seen. On my old TV I couldn't see a lot of things on the screen; they simply weren't there. Also, I have read in many places that 1080p is only useful on screens bigger than 32in, because there isn't enough space on a smaller TV to see all the lines. My point... i sucks, p is a lot better, a lot. If you are a geek who can deinterlace a TV, more power to you. If you want to plug-n-play, like I do, get a small p TV right in front of your face. You guys make watching TV seem like a lot of work, when in actuality, if you just buy a Samsung your life will be a lot better, and GT will look like a new game. Can anyone argue that a 1080p image and a 720p image look different on a 19in screen?
 
Also, I have read in many places that 1080p is only useful on screens bigger than 32in, because there isn't enough space on a smaller TV to see all the lines.
That's nonsense, I have a 24" Samsung 2493HM and the difference between 720p and 1080p games is huge; you just need to sit closer of course, but higher resolution is higher resolution.
 