I'm HD Too!!

  • Thread starter: Jedi2016
Picked up my Westinghouse LVM-47W1 47" 1080p television today.

I haven't had time to do much with it yet. I hooked up the Xbox first and watched it improve in stages as I set it to widescreen mode and then stepped up from 480p all the way to 1080p.

HD video roxx. HD game roxx. Didn't play for long, since I had to go drop off the guy that helped me move it.

When I got home, I popped in The Incredibles to run a quick THX calibration on it. Can't do the color just yet (waiting on the blue filter glasses), but I fixed up the brightness/contrast.

Couple things I noticed:

1) It's HUGE.

2) I never knew "standard" DVD could look so good. I mean... DAMN.

3) It's frikkin' HUGE.

4) I think the Xbox360, even in 1080p mode, is still only upscaling, even "native" 1080p images and the dashboard, because it still has a "soft" edge on images I know are razor-sharp on my computer monitor.

5) It's ****in' HUGE.

I think that about does it for now. Did I mention how big it is?

I'll try to get my digital camera up and running and get some pics up later.

It's HUGE. I mean, seriously. I did manage to find a single stuck pixel. But this thing has over two million of them. The individual pixels are so small that even up close, I couldn't tell what color the stuck pixel was. And even when you knew it was there, you couldn't see it from more than a foot away from the screen, much less all the way back on my couch.

I'm gonna go run out and get some chow, and sit down and watch something. Episode III, maybe... :)









It's HUGE.
 
Congrats. 1080p is all the rage these days. I wonder if it's really so much better. So your TV upscales 1080i images or 720p images to 1080p? And I guess it upscales standard def images all the way to 1080p as well? Does the scaling look funny on standard def images?

Overall pretty sweet. You're on the cutting edge... for now.
 
Congrats! Welcome to 2006. :)

With the X360 1080p problem, what cables are you using? Component or VGA?

By the way, what TV did you have previously?
 
Danoff: Yeah, it upscales everything to 1080p. Very good upscaling, too; I can't imagine my DVDs would look any better through an upconverting DVD player than with what the TV is doing with them. In fact, the TV does such a good job at deinterlacing and inverse 3:2 pulldown (for film sources) that I left the DVD player on "normal" mode, even though it supports progressive. The only thing I've seen so far that looks "ugly" when upscaled is the splash screen on my DVD player, which has no anti-aliasing on the letters, so you can see a lot of jaggies there. But the movies themselves look fantastic: no jaggies at all, and a surprisingly sharp picture.

Duck, I'm using component cables. Not sure what problems the 360 has with 1080p, but it seems to be working pretty well. The games look simply fantastic. With the combination of HD with the larger screen size, I can see everything.

I moved up from a fifteen-year-old Sanyo 27" SDTV. It was so old it didn't even have S-video inputs, all it had was a single composite input. The difference between the two is staggering, even on SD sources, if for no other reason than the screen is so much bigger. And widescreen.. now I'm watching DVDs in their full resolution, where I was always losing something previously due to the size reduction to fit an anamorphic DVD image on a 4:3 TV.
 
Congrats on your new big screen, Jedi. 👍 Little over three years ago, I also jumped from a 27" to 47" and I gotta tell you....... 47" shrinks. Seriously, I was watching a DVD on it today thinking, "It's too small!". :D Since my old TV only has 480p/1080i, I would have to buy a new TV sometime. Since I live in an apartment, I don't think my next TV will be any bigger than a 47". Might even get smaller. :dopey:
 
It's pretty damn big.. hehe. I think the room is only thirteen feet wide on that wall. The TV is nearly four feet wide, that leaves only about four and a half feet on either side of it before you hit a wall. It covers nearly a third of the width of the room.
 
Hey Jedi: What made you go with the Westinghouse model vs. 'the rest out there?' Just curious, as I'm looking into moving my current Panasonic 32" that I use on my race rig into my bedroom and upgrading to something bigger (42", probably), and am definitely looking for 1080p HD input capability.

Thanks,


SRD:
 
I'm guessing the price. I'm betting he got a killer deal.
 
Size+resolution+inputs+price = good deal. :)

It's 47", which, as I said, is huge. It's native 1920x1080. It's got every kind of input known to man (2 component, 2 DVI, 1 HDMI, 1 S-video, 1 composite, plus a slew of audio inputs if you *shudder* want to use the internal speakers). And it was only $1900 with three years no interest (Best Buy). That boils down to an average monthly payment of less than sixty bucks. Which is pocket change.

And the reviews I read (and I read a LOT; I researched the hell out of this thing) all said the same thing: while it's not quite in the same league as "name-brand" TVs from Sony or whoever, it's got a better-than-average picture, and it's a veritable steal at Westinghouse's prices. One of those "best bang for your buck" TVs.

The only thing to consider if you're thinking about getting one is that technically, it's not a television.. it's classified as a monitor, because it doesn't have a TV tuner in it. If you want to watch TV on it, you'll need an external tuner (like a cable box, DirecTV or whatever). But most people have these things already.. about the only thing anyone would use the internal tuners for would be pulling TV in from over-the-air broadcasts. That's one of the ways they were able to knock some money off the price.
 
Thanks for the reply Jedi! 👍 Guess I'll be checking out "Westinghouse" the next time I drop by a Best Buy or Circuit City!!
 
Congrats on the new acquisition, and welcome to the wonderful world of true HD (1920x1080)! If you think wonderfully mastered standard DVDs look great on that display... just wait until you see what reference quality 1080p DVDs (HD DVD & Blu-ray) look like! (just be sure the HD player outputs 1080p and not 1080i)

BTW: Last month...
When I get my PS3, I'm going to buy one, maybe two Blu-Ray movies to watch on it. Depends on what's out at the time, and what format it's in (I'm ignoring all of the launch titles.. single-layer MPEG2 << dual-layer VC1). It kind of bugs me that I actually have to research specific titles beforehand to find out what codec it uses. DVD was simple, everything was the same.. hehe.
Now...
2) I never knew "standard" DVD could look so good. I mean... DAMN.
Glad to see you now recognize that MPEG2 is not the culprit for poor video. The biggest culprit is poor masters... but having a nice display also makes a big difference! :)

4) I think the Xbox360, even in 1080p mode, is still only upscaling, even "native" 1080p images and the dashboard, because it still has a "soft" edge on images I know are razor-sharp on my computer monitor.
This is my understanding as well. The XB360 can only render 1280x720 images, and then the video processor scales them to 1920x1080 if outputting in 1080p... or 1920x540 if outputting in 1080i. The PS3 can render in 1920x1080, and can also output 1080p so there is no risk of losing resolution due to improper deinterlacing.
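
If it helps to picture why upscaling can't add detail, here's a rough Python/numpy sketch of what a nearest-neighbour scaler does (real scalers use much smarter filtering; the resolutions are just the figures from this thread):

```python
import numpy as np

def nearest_neighbor_scale(frame, out_h, out_w):
    # duplicate existing pixels to fill the bigger grid -- no new detail appears
    in_h, in_w = frame.shape[:2]
    rows = np.arange(out_h) * in_h // out_h
    cols = np.arange(out_w) * in_w // out_w
    return frame[rows[:, None], cols]

frame_720p = np.zeros((720, 1280, 3), dtype=np.uint8)        # what the GPU renders
frame_1080p = nearest_neighbor_scale(frame_720p, 1080, 1920)
print(frame_1080p.shape)               # (1080, 1920, 3) -- more pixels, same detail

# a 1080i output sends two 1920x540 fields per frame period instead
top_field, bottom_field = frame_1080p[0::2], frame_1080p[1::2]
print(top_field.shape)                 # (540, 1920, 3)
```

The output has more pixels, but every one of them came from the original 1280x720 render, which is why "native" 1080p from the 360 can still look a touch soft next to a true 1920x1080 source.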

I did manage to find a single stuck pixel. But this thing has over two million of them. The individual pixels are so small that even up close, I couldn't tell what color the stuck pixel was.
To check for stuck pixels, it's a good idea to use RGB test patterns. Most quality displays, such as yours, have these built into the processor, accessible from the service menu.
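
If you can't get at the service menu, you can also roll your own full-field patterns and display them from a console or PC, like Jedi is doing with his Xbox. A minimal sketch using the Pillow imaging library (the file names are just placeholders):

```python
from PIL import Image

# full-field 1920x1080 patterns: solid red/green/blue make a stuck subpixel
# stand out, while black and white reveal lit or dead pixels
patterns = {
    "red":   (255, 0, 0),
    "green": (0, 255, 0),
    "blue":  (0, 0, 255),
    "black": (0, 0, 0),
    "white": (255, 255, 255),
}

for name, rgb in patterns.items():
    Image.new("RGB", (1920, 1080), rgb).save(f"pattern_{name}.png")
```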

I would also recommend you purchase Digital Video Essentials and/or AVIA. These are both excellent yet inexpensive calibration DVDs, and short of hiring an ISF-certified calibrator, they will not only help you accurately calibrate the display (MUCH better results than the THX test patterns), but they will also test the performance capabilities of your display and help you diagnose any problems you may be having with your particular display. 👍 👍



Danoff: Yeah, it upscales everything to 1080p. Very good upscaling, too; I can't imagine my DVDs would look any better through an upconverting DVD player than with what the TV is doing with them. In fact, the TV does such a good job at deinterlacing and inverse 3:2 pulldown (for film sources) that I left the DVD player on "normal" mode, even though it supports progressive.
I'm not familiar with that model, but many 1920x1080 (and even 1280x720) displays do not actually deinterlace; instead, the video processor scales each interlaced field and displays it progressively. Unfortunately, there is no regulation that forces manufacturers to disclose this. This means that if a TV does not properly deinterlace an interlaced image, it loses 50% of the detail: it takes each 1920x540 interlaced field, scales it to 1920x1080, and displays that progressively. The effect is the loss of 540 lines of data in every frame (which makes it look softer than native 1920x1080 images), much like how XB360 games rendered in 1280x720 frames are then scaled up to 1920x1080.
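
A toy numpy sketch of the difference, under the simplifying assumption of a static (film) frame where the two fields belong together; real deinterlacers are motion-adaptive and much smarter than this:

```python
import numpy as np

def split_fields(frame):
    # a 1080i signal carries two 1920x540 fields per frame period
    return frame[0::2], frame[1::2]                # top field, bottom field

def weave(top, bottom):
    # proper deinterlacing of film/static content: interleave the two fields
    # back together and recover all 1080 lines
    out = np.empty((top.shape[0] * 2,) + top.shape[1:], dtype=top.dtype)
    out[0::2], out[1::2] = top, bottom
    return out

def scale_single_field(field):
    # what a "scaler only" display does: stretch one 540-line field to 1080
    # lines, so half the vertical detail never makes it to the screen
    return np.repeat(field, 2, axis=0)

frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
top, bottom = split_fields(frame)
print(np.array_equal(weave(top, bottom), frame))   # True  -> nothing lost
print(scale_single_field(top).shape)               # (1080, 1920), but only 540 real lines
```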

Displays that do correct 3:2 pulldown (despite what they claim in their specs) are even less common!

These are very serious issues that consumers need to be made aware of and to voice their concerns with the display industry, including retailers!

Click HERE for more info on just how bad this situation really is.


EDIT: The good news, if you are using a PS3 or Blu-ray player, is that Blu-ray supports the original 24p film rate with its native timing. This means there is NO film-to-video conversion and thus no 3:2 pulldown. Unfortunately, HD DVD only uses 30p timing for 24p material, so it does require 3:2 pulldown, and even when properly done, 3:2 pulldown often creates artificial motion artifacts due to the repeated frames. These are most easily noticed in panning shots, where every pixel changes from frame to frame.
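
The judder comes down to simple arithmetic: with 3:2 pulldown, film frames no longer sit on screen for equal lengths of time. A quick sketch of the timing:

```python
# native 24p: every film frame is shown for the same length of time
native_24p_ms = 1000 / 24
# 3:2 pulldown to 60 fields/s: frames alternate between 3 fields and 2 fields
field_ms = 1000 / 60
pulldown_ms = [3 * field_ms, 2 * field_ms]

print(round(native_24p_ms, 1))                 # 41.7 ms per frame -> even motion
print([round(t, 1) for t in pulldown_ms])      # [50.0, 33.3] -> uneven cadence = judder
```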
 
Too bad it only has 1000:1 contrast ratio.

Jedi, can you take a pic for us of your TV on with a black screen at night without any lights on?
 
I'm not familiar with that model, but many 1920x1080 (and even 1280x720) displays do not actually deinterlace; instead, the video processor scales each interlaced field and displays it progressively. Unfortunately, there is no regulation that forces manufacturers to disclose this. This means that if a TV does not properly deinterlace an interlaced image, it loses 50% of the detail: it takes each 1920x540 interlaced field, scales it to 1920x1080, and displays that progressively. The effect is the loss of 540 lines of data in every frame (which makes it look softer than native 1920x1080 images), much like how XB360 games rendered in 1280x720 frames are then scaled up to 1920x1080.

Displays that do correct 3:2 pulldown (despite what they claim in their specs) are even less common!

These are very serious issues that consumers need to be made aware of and to voice their concerns with the display industry, including retailers!

Click HERE for more info on just how bad this situation really is.

I've always known that MPEG-2 wasn't a "problem", but I do concede that the new HD codecs are more efficient and less prone to compression artifacts and soft edges than MPEG-2 is. It works wonderfully for DVD, though.

That reminds me of something I read on the Bits a while back. Bill visited one of the studios to check out their BD stuff, and he pointed out that the single best bit of HD video that he'd ever seen in his life had been encoded with MPEG-2. So I know it's not "the devil".. hehe.

I'm also well aware of the interlacing issues.. the link you posted is referring specifically to deinterlacing 1080i material, whereas everything I'm watching is only 480i (aside from the 360, of course). And it IS doing full-bore inverse 3:2 pulldown, reintegrating the full 480p image as it was meant to be seen. I've seen a few instances where the TV has to "catch up" when I fast-forward or something, and the lack of proper deinterlacing in those instances is clear as day, everything looks very jagged and downright fugly. But once I hit "Play" again, everything snaps back to full resolution. I'm very familiar with how deinterlacing and pulldown works.. after spending half a day kicking After Effects in the pants until it properly processed a 2:3:3:2 sequence, I know just about all there is to know on the subject.. hehe. And I know exactly what to look for. The TV is doing it correctly, no doubt. But I don't have any 1080i signals to send it, except for the 360, but with game sources, it'd probably be harder to tell whether I'm seeing deinterlacing artifacts or simple aliasing.

The contrast ratio seems fine, Omnis. Yeah, blacks aren't "true" black, but they're actually a fair bit darker than they were on the old CRT that it replaced. On the flip side, whites are a good deal brighter than my old TV, although that could just be the size of the thing. Displaying an all-white image will light up most of downstairs. And the advertised contrast ratio is actually 1200:1.. hehe.

I keep forgetting to pick up batteries for my camera whenever I go out. Just bear in mind ahead of time that my camera is also quite old, and the pictures won't be the best. :) In fact, it probably won't pick up the screen at all with all the lights out. But I'll try to find a nice dark movie scene to shoot, that should do it.

I'll probably get around to getting the DVE or Avia at some point. Or I'll just convince my buddy to add it to his Blockbuster queue. For some reason, they offer it online, but not in their stores. Ijits. I have a fair few test patterns that I downloaded and am displaying through the Xbox. The one I used for dead pixels is pure black, so the dead pixel stands out pretty well on that screen when you're up close (it now appears that the pixel is truly dead and not just stuck.. it's all three subpixels locked open, which is a sign it's not getting any power).
 
I've always known that MPEG-2 wasn't a "problem"
I must have misunderstood your earlier post then where you said you didn't want the early Blu-ray releases because they were using MPEG2.

That reminds me of something I read on the Bits a while back. Bill visited one of the studios to check out their BD stuff, and he pointed out that the single best bit of HD video that he'd ever seen in his life had been encoded with MPEG-2. So I know it's not "the devil".. hehe.
Good to hear. I am always amazed to read many self proclaimed experts on AVS & HTF complain about MPEG2 when the artifacts they are seeing are clearly from the transfer and have nothing to do with the codec.

I'm also well aware of the interlacing issues.. the link you posted is referring specifically to deinterlacing 1080i material, whereas everything I'm watching is only 480i (aside from the 360, of course).
Actually, this is not the case. Deinterlacing (or lack thereof) affects ALL interlaced signals. So if a display is scaling instead of doing proper deinterlacing, then a 480i signal is effectively 240p scaled to match the display, since 240 lines is the actual resolution of each 480i field. To add to the confusion, most displays that do actually deinterlace 480i signals do not deinterlace 1080i signals... they simply scale them. 👎

And it IS doing full-bore inverse 3:2 pulldown, reintegrating the full 480p image as it was meant to be seen.
👍
 
I think in my original post about Blu-Ray, I was referring to simply wanting the better video quality overall. One of the reviews I read of The Fifth Element.. in fact, MOST reviews I read of that one, said that it offers almost zero improvement over the Superbit DVD.

As for efficiency of the codecs, though.. VC1 at 50GB is still better than MPEG-2 at 25GB.. hehe. Hell, it's better than VC1 at 25GB.. lol. I think it's mostly the size thing.. the second layer means a LOT less compression.

Is there anywhere I could rent an HD-DVD player? That would be one way to test out its deinterlacing capabilities with 1080i.
 
The contrast ratio seems fine, Omnis. Yeah, blacks aren't "true" black, but they're actually a fair bit darker than they were on the old CRT that it replaced. On the flip side, whites are a good deal brighter than my old TV, although that could just be the size of the thing. Displaying an all-white image will light up most of downstairs. And the advertised contrast ratio is actually 1200:1.. hehe.

Word. 👍

Still... pics. Now. :D
 
Can someone give a good explanation or link to what 3:2 pulldown is and how it works? The whole concept is confusing me.


KM.
 
That could take QUITE a while, Kieran.. hehe.

Try this: http://en.wikipedia.org/wiki/Telecine. There's a section partway down the page that describes how 3:2 pulldown works. Basically, it's a method used to convert 24p to 60i. Inverse 3:2 pulldown does the reverse.. it takes in 60i footage and recombines the original frames to create 24p footage.
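
If it helps, here's a little Python sketch of the cadence. It just labels whole fields, so it glosses over top/bottom field order and the "dirty" frames that mix two film frames, but it shows how 24 frames become 60 fields and how inverse pulldown gets them back:

```python
def pulldown_32(frames):
    # 3:2 pulldown: odd-numbered film frames fill 3 fields, even-numbered fill 2,
    # so 24 frames per second become 60 fields per second
    fields = []
    for i, frame in enumerate(frames):
        fields += [frame] * (3 if i % 2 == 0 else 2)
    return fields

def inverse_pulldown_32(fields):
    # inverse (reverse) pulldown: collapse the repeated fields back into frames
    frames = []
    for f in fields:
        if not frames or frames[-1] != f:
            frames.append(f)
    return frames

film = ["A", "B", "C", "D"]            # one sixth of a second at 24p
video = pulldown_32(film)              # one sixth of a second at 60i
print(video)                           # ['A','A','A','B','B','C','C','C','D','D']
print(inverse_pulldown_32(video))      # ['A','B','C','D'] -- the original 24p back
```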
 
Oh boy, that was/is confusing. I think I need to get myself some good reading material on home cinema/HDTV/etc. because all this stuff has me confused to buggery. So just ignore me, I think I've got some reading and studying to do before I can meaningfully contribute to these conversations.


KM.
 
It's the method used to display 24fps content on most displays.

Your TV (or DVD player, I guess) does this process when playing DVDs (same with HD-DVD and Blu-Ray). They take the 24fps progressive content, and since most TVs (even when displaying progressive) can only display 60 frames per second, to make it short:

Instead of having frames like this:

1:1:1:1.. (24 total)

It's like
3:2:3:2... repeating (display one frame 3 times, the next one 2 times, and so on), which spreads 24 frames across 60 fields. Then there's reverse pulldown, which does the opposite and rebuilds the original frames.
 
I think in my original post about Blu-Ray, I was referring to simply wanting the better video quality overall. One of the reviews I read of The Fifth Element... in fact, MOST reviews I read of that one, said that it offers almost zero improvement over the Superbit DVD.
This only shows how bad/suspiciously biased DVD "reviewers" have gotten over the years. I've also read these so-called reviews, and they are beyond laughable: in most cases they clearly expose the reviewer's lack of any understanding of how film is captured, transferred to video, and encoded on discs... either that, or they expose the reviewer as deliberately misleading unsuspecting readers.

Without going into great detail on the differences between MPEG-2 and VC1 or MPEG-4 (H.264), the easiest way to show that MPEG-2 is not the cause of poor video quality is to examine some of the excellent reference video that has been released on standard DVD, and even more so, as Bill Hunt discovered, the incredible HD video encoded in MPEG-2. If MPEG-2 caused these so-called artifacts that these "reviewers" love to claim that it does... then guess what... ALL MPEG-2 coded video would show similar artifacts. The simple fact is that they do not!

The sad truth is that just about anyone can not only claim to be a DVD reviewer, but can sign up as one on any number of sites, even well-established and respected ones. In most cases, all it takes is an email to the site admins saying you want to be one, and voila. As a result, not only are there countless reviewers who really have no understanding of what causes artificial artifacts, but there is no telling what kind of equipment they are actually using (as opposed to what they claim to be using) to review discs, which more often than not causes far more artifacts than the DVD transfers themselves.

Having the right equipment (players, processors, cables, displays, etc) is probably 90% of the battle in having the opportunity to see film and video at its best. Beyond that, it is then up to the studios/networks to provide the best possible source material, and then it is up to the producers to provide the best possible equipment for making the masters, then it is up to the technicians to properly use that equipment.

Now I haven't seen the masters for Columbia/TriStar's Blu-ray edition of The Fifth Element, but I have seen the BD disc and compared it to the SB edition, and anyone who says there is almost zero improvement is either lying, or doesn't have a quality system/display.

Anyone who has even a basic understanding of film and video knows that EVEN if Sony used the exact same source as was used in making the transfer for the SB edition, and EVEN if they used the exact same master (assuming that, like most studios, they now make HD transfers and scale them down for standard DVD releases), even then the Blu-ray edition will still have six times the native resolution of the SB edition. It also has a bit rate four times greater than the SB edition, and nearly twice that of any HD DVD disc.
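
To put rough numbers on that (the bit rates here are the commonly quoted maximum video rates, so treat them as ballpark figures):

```python
# assumed figures: 1920x1080 vs 720x480 frames, ~40 Mbps max video rate for
# Blu-ray vs ~9.8 Mbps for standard DVD (approximate spec maximums)
bd_pixels  = 1920 * 1080
dvd_pixels = 720 * 480
print(bd_pixels / dvd_pixels)          # 6.0 -> "six times the native resolution"

bd_mbps, dvd_mbps = 40.0, 9.8
print(round(bd_mbps / dvd_mbps, 1))    # 4.1 -> "about four times the bit rate"
```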


As for efficiency of the codecs, though.. VC1 at 50GB is still better than MPEG-2 at 25GB.. hehe. Hell, it's better than VC1 at 25GB.. lol. I think it's mostly the size thing.. the second layer means a LOT less compression.
Not necessarily. It really only means it has twice the capacity; it is already using VC-1's maximum bit rates. And if less compression is what you are after, then MPEG-2 is the codec for you, as it uses a quarter of the compression rate of VC-1 and MPEG-4. An added bonus of Blu-ray is that it can play back MPEG-2 video at nearly twice the bit rate of VC-1 on HD DVD. The reason HD DVD won't use MPEG-2 is that it doesn't have enough disc capacity, and thus it REQUIRES the use of either VC-1 or MPEG-4, which use four times the compression rate of MPEG-2.

The good news is that, despite VC-1 and MPEG-4's higher compression rates, they do not impact the quality of the video... which is why they are great codecs, especially for online distribution, as they can be downloaded at four times the rate of MPEG-2 video. And in the case of HD DVD, these new codecs are a necessity due to its limited disc capacity.
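
For what it's worth, a back-of-the-envelope sketch of what disc capacity buys you in average bit rate, ignoring audio and overhead and assuming a two-hour film (so very much ballpark):

```python
def avg_mbps(capacity_gb, runtime_min):
    # average video rate if the film filled the whole disc (decimal GB,
    # no audio or overhead -- purely back-of-the-envelope)
    bits = capacity_gb * 8 * 1000**3
    return bits / (runtime_min * 60) / 1e6

for label, gb in [("BD single layer", 25), ("BD dual layer", 50), ("HD DVD dual layer", 30)]:
    print(label, round(avg_mbps(gb, 120), 1), "Mbps")
# BD single layer 27.8 Mbps, BD dual layer 55.6 Mbps, HD DVD dual layer 33.3 Mbps
```

Of course the format's own video bit-rate cap (around 40 Mbps on Blu-ray, if I have the spec right) limits what actually gets used, but the point stands: the dual-layer disc lets the encoder sit near that cap for the entire runtime.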
 
Good choice on the TV Jedi. It's a great buy for the money, especially since it's 1080p capable. Does the Best Buy you go to have a Magnolia section? I work in Home Theater at Best Buy so I hope you got good help and such :)
 
When I buy an HDTV over 37 inches for my bedroom, I'm going for the 1080p Sony W2000 series LCD in a 40-inch size. For the moment, however, I'll stick with my 1080i 26-inch LG LCD HDTV.

But Jedi, you should have bought the 46-inch Sony W2000, because its black levels are said to be better than a plasma's.
 
When I buy an HDTV over 37 inches for my bedroom, I'm going for the 1080p Sony W2000 series LCD in a 40-inch size. For the moment, however, I'll stick with my 1080i 26-inch LG LCD HDTV.

But Jedi, you should have bought the 46-inch Sony W2000, because its black levels are said to be better than a plasma's.

I think come the new year, once we've redecorated the lounge, I'll be replacing my aging Sony 25" CRT with the Sony KDL40W LCD; it does seem to be the best around at the moment. If you shop around, you can get it for 3/4 of the retail price. 👍
 