Sony's 10,000 Nit TV at CES 2018

  • Thread starter hankolerd
  • 60 comments
  • 14,063 views
Looks like my next TV... in 5 years or so.

I'm surprised 8K is really a thing, because I wonder what kind of raw power is needed to run a game like GT Sport in real time at native 8K. The PS4 Pro can't handle GT Sport in native 4K, so I wonder what the PS5 will do. A jump to native 8K? Sounds optimistic to me.

I also wonder what the next big thing really is. Playing a racing game on an 8K Super-HDR TV? Or playing it on the next generation of VR? What do you guys think?
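As a rough back-of-the-envelope illustration of the jump involved (a minimal sketch using only the standard published resolutions, not benchmarks):

```python
# Rough pixel-throughput arithmetic for 1080p vs 4K vs 8K.
resolutions = {
    "1080p":  (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels/frame ({pixels // base}x 1080p)")

# At 60 fps, native 8K means shading roughly two billion pixels per second.
w, h = resolutions["8K UHD"]
print(f"8K @ 60 fps: {w * h * 60:,} pixels/s")
```

8K is 16x the pixels of 1080p and 4x the pixels of 4K, which is why a straight jump to native 8K rendering sounds optimistic.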
I've been in the AV industry for a little over 19 years, and 4K was discussed back then, as was a theoretical 8K. Just to clarify though, the terms 4K and 8K are lies. They suggest four and eight thousand lines of resolution. They are only half that. The four and eight came about because a UHD picture is the equivalent of four 1080 screens in a tile formation. The actual resolution of "4K" is 2.1K.
 
I recently bought a new 4K HDR OLED TV from LG. I've been very happy with it for playing GT Sport and watching movies on Netflix and Amazon. It's bigger than my last TV at 55", a nice upgrade from my old 42" Panasonic Viera TX-P42GT30B plasma. HDR looks awesome on this B7V. So far I haven't experienced any bad screen burn-in, but I have had image retention. This is remedied by alternating between movies and gaming, and by scheduling the pixel refresher to run every night when I shut down the TV. I also have no bad signs of vertical banding or poor screen uniformity, and DSE isn't really that bad just yet, but then I'd guess the panel is still breaking in. Motion judder and flicker are a problem, and sometimes it's not as smooth as I'd like or as my plasma was; my plasma suffered a little too and was never perfect with motion, but it was better and more fluid. The 2018 OLEDs might finally fix this with black frame insertion, as watching some content with TruMotion enabled on my B7 introduces interpolation artefacts. A lot of movies look amazing, though, and the TV has zero motion issues with them. All in all the TV is bright enough for me and I'm very happy with it. I will be watching the Sony A1E, the LG C8, E8 and W8, and especially the new Panasonic OLED closely.
 
I recently bought a new 4K HDR OLED TV from LG. I've been very happy with it for playing GT Sport and watching movies on Netflix and Amazon. It's bigger than my last TV at 55", a nice upgrade from my old 42" Panasonic Viera TX-P42GT30B plasma. HDR looks awesome on this B7V. So far I haven't experienced any bad screen burn-in, but I have had image retention. This is remedied by alternating between movies and gaming, and by scheduling the pixel refresher to run every night when I shut down the TV. I also have no bad signs of vertical banding or poor screen uniformity, and DSE isn't really that bad just yet, but then I'd guess the panel is still breaking in. Motion judder and flicker are a problem, and sometimes it's not as smooth as I'd like or as my plasma was; my plasma suffered a little too and was never perfect with motion, but it was better and more fluid. The 2018 OLEDs might finally fix this with black frame insertion, as watching some content with TruMotion enabled on my B7 introduces interpolation artefacts. A lot of movies look amazing, though, and the TV has zero motion issues with them. All in all the TV is bright enough for me and I'm very happy with it. I will be watching the Sony A1E, the LG C8, E8 and W8, and especially the new Panasonic OLED closely.
Avoid OLED. Because it is an organic system, the colours break down and deteriorate much faster than LCD or LED. Also, avoid running that pixel corrector thing you were talking about; it simply ages the pixels faster. When looking to replace your screen, look at Samsung. Compared with other premium brands, LG screens do not display colours with as much definition.

All that said...... OLEDs DO give better blacks than LED or LCD though.
 
I've been in the AV industry for a little over 19 years, and 4K was discussed back then, as was a theoretical 8K. Just to clarify though, the terms 4K and 8K are lies. They suggest four and eight thousand lines of resolution. They are only half that. The four and eight came about because a UHD picture is the equivalent of four 1080 screens in a tile formation. The actual resolution of "4K" is 2.1K.

That is not true. 1080p is also known as 2K, because movie resolutions differ from normal TVs'. 1080p is 1920 wide and 1080 high, while the movie format is 2048 wide and 1080 high. 1920 and 2048 are both about 2000 pixels, therefore 2K. 4K is 3840 × 2160, or 4K DCI (movie format) at 4096 × 2160; both are about 4000 pixels wide, therefore 4K.

It is all about the width.
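A minimal sketch of the two naming conventions in this exchange, using only the standard published dimensions:

```python
# "K" names track the horizontal pixel count; "1080p"-style names track
# the vertical line count.
formats = {
    "Full HD": (1920, 1080),
    "2K DCI":  (2048, 1080),
    "4K UHD":  (3840, 2160),
    "4K DCI":  (4096, 2160),
    "8K UHD":  (7680, 4320),
}

for name, (w, h) in formats.items():
    print(f"{name:8s} {w} x {h} -> ~{round(w / 1000)}K by width, "
          f"{h} lines by height")
```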
 
That is not true. 1080p is also known as 2K, because movie resolutions differ from normal TVs'. 1080p is 1920 wide and 1080 high, while the movie format is 2048 wide and 1080 high. 1920 and 2048 are both about 2000 pixels, therefore 2K. 4K is 3840 × 2160, or 4K DCI (movie format) at 4096 × 2160; both are about 4000 pixels wide, therefore 4K.

It is all about the width.
What do you do for a living?
 
What do you do for a living?

It is not about what I do today; it is about what my experience and my perspective are.

Maybe your experience is different, but as far as I know, 4K and 8K were never expected in the future. Tapes were even thrown away because they took up too much storage and nobody expected that 4K or even 8K would ever come to cinemas. (I'll show a source if you want, but I cannot find it immediately.)

I agree that later resolution revolutions were expected, but I have never heard that 2K/4K/8K refers to the pixel height. The 'p' and 'i' in 1080p or 480i are only there to tell you whether the image is progressive or interlaced.

But maybe your experience is about pixel heights. It just seems very odd to me.


EDIT:

source: http://4k.com/news/20-years-movie-titles-filmed-4k-nobody-bothered-save/

Eastman Kodak released some of the first film scanners that were able to scan at 4096 x 2160 resolution all the way back in 1992, and the first movie to be processed in 4K was actually Snow White and the Seven Dwarfs, all the way back in 1993. The original film was digitized to 4K, processed, cleaned up and then downgraded back to a much smaller digital resolution that fit the technology of the time, and the original 4K scan was simply erased.
 
It is not about what I do today; it is about what my experience and my perspective are.

Maybe your experience is different, but as far as I know, 4K and 8K were never expected in the future. Tapes were even thrown away because they took up too much storage and nobody expected that 4K or even 8K would ever come to cinemas. (I'll show a source if you want, but I cannot find it immediately.)

I agree that later resolution revolutions were expected, but I have never heard that 2K/4K/8K refers to the pixel height. The 'p' and 'i' in 1080p or 480i are only there to tell you whether the image is progressive or interlaced.

But maybe your experience is about pixel heights. It just seems very odd to me.


EDIT:

source: http://4k.com/news/20-years-movie-titles-filmed-4k-nobody-bothered-save/
That's a brilliant article. Lines of resolution have always been counted horizontally. If you look at how the system has worked since the start of time, everything is based on horizontal lines. The old Teletext system was carried on horizontal lines. The interlaced or progressive side you were just talking about is nothing more than the order the lines are fired in. The number in front of that is the resolution. That's why it's listed as 480/576/720/1080/2160. It's the horizontal lines that carry the info. You can discuss this if you want, but I spent 8.5 years working for Sony and have been working in the professional installation industry since February 2006.
 
Yep, on my cheapo non-HDR 50" TV I've been tempted to wear sunglasses.
In that one race a month ago with the setting sun, I had to wear sunglasses. I felt stupid sitting there on my rig indoors with sunglasses on, but the sun was obstructing my view of the apex in a few spots.
 
That's a brilliant article. Lines of resolution have always been counted horizontally. If you look at how the system has worked since the start of time, everything is based on horizontal lines. The old Teletext system was carried on horizontal lines. The interlaced or progressive side you were just talking about is nothing more than the order the lines are fired in. The number in front of that is the resolution. That's why it's listed as 480/576/720/1080/2160. It's the horizontal lines that carry the info. You can discuss this if you want, but I spent 8.5 years working for Sony and have been working in the professional installation industry since February 2006.

I fully agree with you. :cheers:
I know the horizontal lines contain the info, the teletext, etc. (you can even see the teletext lines if you watch TV through a video recorder, if you know what I mean: the black/white lines at the top). But I used to work with the widths when making cuts for discs. Widescreen film is shot at 2K DCI, which is 2048 pixels wide, but Blu-rays are printed for Full HD TVs, so the scans are downscaled to 1920 wide, like DVDs are to 720. The line count matters less, as Blu-rays are printed as 1080p even if the film is only 1920 by 800. Therefore I was used to thinking of 4K (like 2K, or the DVD width) as 4096, and not as 2160p, which means 2.1K like you said. In the end it is just a matter of perspective, so I understand what you are saying and why.
(The article was just to show that people didn't expect such high resolutions and threw things away.)
 
I fully agree with you. :cheers:
I know the horizontal lines contain the info, the teletext, etc. (you can even see the teletext lines if you watch TV through a video recorder, if you know what I mean: the black/white lines at the top). But I used to work with the widths when making cuts for discs. Widescreen film is shot at 2K DCI, which is 2048 pixels wide, but Blu-rays are printed for Full HD TVs, so the scans are downscaled to 1920 wide, like DVDs are to 720. The line count matters less, as Blu-rays are printed as 1080p even if the film is only 1920 by 800. Therefore I was used to thinking of 4K (like 2K, or the DVD width) as 4096, and not as 2160p, which means 2.1K like you said. In the end it is just a matter of perspective, so I understand what you are saying and why.
(The article was just to show that people didn't expect such high resolutions and threw things away.)
I could talk about this all day. I would love to get onto the subject of what the public perceive as "HD" or "4K". People invest extra money in 4K Amazon and Netflix when neither actually broadcasts in 4K or anywhere near it. A lot of companies add grain to their higher-resolution images because people complain about a super clean image running at a high refresh rate. It's hard for the brain to process super clean video.
We could also talk about all those people who pay for Sky HD or even a 4K service without knowing Sky can't even broadcast an image at 720 properly. Everything from Sky is upscaled by the box.
 
Avoid OLED. Because it is an organic system, the colours break down and deteriorate much faster than LCD or LED. Also, avoid running that pixel corrector thing you were talking about; it simply ages the pixels faster. When looking to replace your screen, look at Samsung. Compared with other premium brands, LG screens do not display colours with as much definition.

All that said...... OLEDs DO give better blacks than LED or LCD though.

The OLED colours will not deteriorate as fast as you say they will. It would take on the order of a hundred thousand hours, something like 11 hours a day for 26 years, before the organic materials and colours, one after the other, start showing signs of deterioration.

The B7's picture definition is great, colours are excellent, motion is good, detail is great and the contrast ratio is infinite, HDR looks great, and the LG is tone mapped for more pop and more dynamic highlights than the Sony or Panasonic.

HD and 4K BD and HDR look amazing. Motion is the only issue, and upscaling SD might not be as good as Sony or Samsung, but HD and 4K look brilliant and equally impressive. Cable is also perfectly passable and watchable for SD.

I have had only slight image retention, but it disappears, and I don't leave static images on the TV. No banding or burn-in or DSE. I look after the TV and don't use it for hours on end displaying static images. Sure, there are motion hiccups, but there are with the Sony and the Panasonic too.

The 2018 sets will have better processing and even better motion handling with black frame insertion, and also finer steps of colour gradation for less colour banding. Overall I'm very happy with my B7. It is a big leap over my Panasonic Viera GT50 plasma.

The only OLED issues that I feel need improving are motion interpolation artefacts with TruMotion engaged, motion judder, colour banding, and the panel lottery with panel uniformity, vertical banding and DSE. But those same defects affect LED LCD far worse.

I am excited for the self-emissive Micro LED tech, though.
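For what it's worth, the "11 hours a day for 26 years" figure works out to roughly a hundred thousand hours (simple arithmetic only; the lifetime claim itself is the poster's):

```python
# 11 hours a day for 26 years is on the order of 100,000 hours.
print(f"{11 * 365 * 26:,} hours")  # 104,390
```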
 
I fully agree with you. :cheers:
I know the horizontal lines contain the info, the teletext, etc. (you can even see the teletext lines if you watch TV through a video recorder, if you know what I mean: the black/white lines at the top). But I used to work with the widths when making cuts for discs. Widescreen film is shot at 2K DCI, which is 2048 pixels wide, but Blu-rays are printed for Full HD TVs, so the scans are downscaled to 1920 wide, like DVDs are to 720. The line count matters less, as Blu-rays are printed as 1080p even if the film is only 1920 by 800. Therefore I was used to thinking of 4K (like 2K, or the DVD width) as 4096, and not as 2160p, which means 2.1K like you said. In the end it is just a matter of perspective, so I understand what you are saying and why.
(The article was just to show that people didn't expect such high resolutions and threw things away.)

I always wondered what the black and white codes at the top of the screen were when HDTV became a thing. On a 1:1 pixel-match 1080p TV I always got barcode-like stuff running along the top of the screen. We don't have teletext here, so perhaps it was closed caption data. There was also a blue vertical line on the left, which I think is for priming a CRT screen. The blue line on the left still appears on some content.

It's a shame anamorphic Blu-rays never became a thing; they would look better now upscaled to 4K. Such a waste to store 280 black lines. What I would really like is a 4K player that downscales to 4:4:4 1080p. That would be a nice upgrade for my 1080p projector, which supports 4:4:4 content. The resolution of Blu-ray is fine; fewer compression artifacts and 4:4:4 color would be nice.
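A minimal sketch of why that downscaling trick yields full-resolution color, assuming a standard 4:2:0 source (this is just the subsampling arithmetic, not any particular player's pipeline):

```python
# In 4:2:0 video, the chroma planes are stored at half the luma
# width and height.
def chroma_plane(width, height):
    return (width // 2, height // 2)

print("4K UHD luma:  ", (3840, 2160))
print("4K UHD chroma:", chroma_plane(3840, 2160))  # (1920, 1080)

# The 4K chroma plane is already exactly 1920 x 1080, so a 2x downscale
# of the luma leaves one chroma sample per output pixel: effectively
# 4:4:4 color at 1080p.
```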
 
I could talk about this all day. I would love to get onto the subject of what the public perceive as "HD" or "4K". People invest extra money in 4K Amazon and Netflix when neither actually broadcasts in 4K or anywhere near it. A lot of companies add grain to their higher-resolution images because people complain about a super clean image running at a high refresh rate. It's hard for the brain to process super clean video.
We could also talk about all those people who pay for Sky HD or even a 4K service without knowing Sky can't even broadcast an image at 720 properly. Everything from Sky is upscaled by the box.

Yeah, because 4K is even better on a 1080p screen, because the 'expert' said so :lol:. Upscale --> downscale --> loss of quality compared to the original image. But this happens with everything people don't know much about. People love labels and expert reviews.

I always wondered what the black and white codes at the top of the screen were when HDTV became a thing. On a 1:1 pixel-match 1080p TV I always got barcode-like stuff running along the top of the screen. We don't have teletext here, so perhaps it was closed caption data. There was also a blue vertical line on the left, which I think is for priming a CRT screen. The blue line on the left still appears on some content.

It's a shame anamorphic Blu-rays never became a thing; they would look better now upscaled to 4K. Such a waste to store 280 black lines. What I would really like is a 4K player that downscales to 4:4:4 1080p. That would be a nice upgrade for my 1080p projector, which supports 4:4:4 content. The resolution of Blu-ray is fine; fewer compression artifacts and 4:4:4 color would be nice.

I don't know about blue lines, but 'side' lines are mostly solvable by screen adjustment. The black bars don't matter on a disc, and they don't need a lot of storage because they stay black. A disc is always printed as 'full', so you can't add stuff. But I know it is a waste of storage if you rip DVDs. But you are not supposed to :P.
 
As for the question: "When will 8K be standard in gaming?"

We are still two years away even from native 4K becoming a standard on consoles, as the current-gen enhanced consoles can't handle native 4K across the board, and don't even get me started on the fps. I can't play 30 fps titles anymore.
Then, the best PC gaming graphics card (the 1080 Ti) can't handle a locked 4K/60 on current demanding open-world games like Assassin's Creed Origins. And I doubt that the next flagship "Ti" Volta GPU will manage it either, because games will keep looking better, so in two years even a flagship Volta Ti will probably struggle with 4K/60 Ultra on the then-current most demanding games.

Having said that, I think 8K gaming as an affordable option on PC is at least two GPU generations, or 3-4 years, away.
And as for consoles, I see checkerboarded 8K or dynamic 8K in 2025 with a PS6 or Xbox 3.

I'd say a lot of games will be 4K or close to it on PS5. I'm betting PS5 will be a step up from Xbox One X, but not a huge one.

What I'm also betting is that 60 fps will never be standard. It will be for fighting games and most driving games, but it will never be the standard on consoles, IMO. There's just too much benefit to leaving games at 30 fps and cranking up the graphics to the max.

Sure, some of it is laziness, devs "settling" for 30 fps, and some of it is the CPUs in these consoles not being powerful enough, but most of the time there's a compromise developers can make: either 30 fps and the best graphics possible, or 60 fps and sometimes significantly worse graphics.

Usually, for a game to look great and run at 60 on consoles, it has to be designed from the ground up with that target in mind. Doom, Wolfenstein II, Battlefield 1 and GT Sport come to mind.

Most games don't need 60 fps. There are so many types of games where it's nothing but a luxury, not even close to a necessity, so I don't see 60 fps ever becoming the standard on console. And it doesn't bother me either. Most of the best games ever, in my opinion, are or were 30 fps.

4K, though, I think is within our reach, considering these half-gen consoles are already doing it or close to it. Give us a console like the PS5, which I would guess will have a 12 TFLOP GPU and hopefully a CPU comparable to a current-gen i7 with plenty of RAM when it releases, and I think 4K will be the standard the way 1080p is right now.

I'm guessing PS5 will release in 2020 or 2021, by the way.
 
I am seriously impressed they already have a prototype at such high brightness levels; I wonder how damaging it is to your eyes, though. Personally, I just want Sony to release a small Android 4K HDR TV that is not laggy, with minimal input lag, HDMI 2.1 and decent speakers.
I could talk about this all day. I would love to get onto the subject of what the public perceive as "HD" or "4K". People invest extra money in 4K Amazon and Netflix when neither actually broadcasts in 4K or anywhere near it. A lot of companies add grain to their higher-resolution images because people complain about a super clean image running at a high refresh rate. It's hard for the brain to process super clean video.
We could also talk about all those people who pay for Sky HD or even a 4K service without knowing Sky can't even broadcast an image at 720 properly. Everything from Sky is upscaled by the box.
Some strong claims there. What resolution do the likes of Amazon, Netflix and Sky broadcast their 4K content at, then?
 
I don't know about blue lines, but 'side' lines are mostly solvable by screen adjustment. The black bars don't matter on a disc, and they don't need a lot of storage because they stay black. A disc is always printed as 'full', so you can't add stuff. But I know it is a waste of storage if you rip DVDs. But you are not supposed to :P.

The vertical (fuzzy) blue line appears on the left of 4:3 content, and I can't get rid of it with screen adjustment, unless you like zoom or stretch mode. I think it's a bad conversion to digital that causes that line to be there.

Storing 1080 lines instead of 800 puts more stress on the compression algorithm, yet in quiet scenes more detail should be able to be stored to help with upscaling to 4K. Movies don't fill the entire disc anyway: http://www.avsforum.com/forum/150-b...-audio-video-specifications-thread.html?pp=60 Perhaps the 4K-mastered ones do; those use higher bitrates and x.v.Color. I haven't tried one of those yet, too busy playing GTS to watch movies!

It's a balancing act between resolution and bitrate. Blu-ray doesn't look that great when you pause it during heavy action; better than pausing Netflix or HDTV, yet still very much bit-starved. It's probably the same for 4K UHD; however, downscaling that to 1080p should provide a more stable picture with fewer up-and-down swings in resolvable detail. I wonder what's going to store video content for 8K.
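To put rough numbers on that balancing act, here is a crude bits-per-pixel comparison using only the ballpark bitrates quoted in this thread, and ignoring codec differences, which matter a lot:

```python
# Crude bits-per-pixel comparison at 24 fps for the sources discussed.
def bits_per_pixel(mbps, width, height, fps=24):
    return mbps * 1_000_000 / (width * height * fps)

sources = [
    ("Blu-ray 1080p (~50 Mbps)", 50, 1920, 1080),
    ("Netflix HD (~7 Mbps)",      7, 1920, 1080),
    ("Netflix 4K (~15 Mbps)",    15, 3840, 2160),
]
for name, mbps, w, h in sources:
    print(f"{name}: {bits_per_pixel(mbps, w, h):.2f} bits/pixel")
```

By this crude measure, a 4K stream gets well under a tenth of the bits per pixel of a 1080p Blu-ray, which is why paused frames look bit-starved.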
 
The vertical (fuzzy) blue line appears on the left of 4:3 content, and I can't get rid of it with screen adjustment, unless you like zoom or stretch mode. I think it's a bad conversion to digital that causes that line to be there.

Storing 1080 lines instead of 800 puts more stress on the compression algorithm, yet in quiet scenes more detail should be able to be stored to help with upscaling to 4K. Movies don't fill the entire disc anyway: http://www.avsforum.com/forum/150-b...-audio-video-specifications-thread.html?pp=60 Perhaps the 4K-mastered ones do; those use higher bitrates and x.v.Color. I haven't tried one of those yet, too busy playing GTS to watch movies!

It's a balancing act between resolution and bitrate. Blu-ray doesn't look that great when you pause it during heavy action; better than pausing Netflix or HDTV, yet still very much bit-starved. It's probably the same for 4K UHD; however, downscaling that to 1080p should provide a more stable picture with fewer up-and-down swings in resolvable detail. I wonder what's going to store video content for 8K.

Of course native 4K downscaled to 1080p should give better image quality than something shot in 1080p; with downscaling you also get noise reduction. 'Full' is in quotes because, of course, not every movie has the same size. But take any DVD and open it in Windows Explorer: you'll see 0 bytes of xxx available. It is not really full; it just looks like it is.

Some strong claims there. What resolution do the likes of Amazon, Netflix and Sky broadcast their 4K content at, then?

He means that not all scenes/movies/TV are captured in 4K, so they upscale to make sure everything you get is 4K. If a scene is not shot in native 4K, it will be upscaled and broadcast. So if you are watching on a 1080p screen, and the scene was shot in 1080p and upscaled to 4K, you get a loss of quality compared with a native 1080p source. On top of that, streaming almost always uses a lot of compression, which doesn't improve the quality.
 
Some strong claims there. What resolution do the likes of Amazon, Netflix and Sky broadcast their 4K content at, then?

Amazon and Netflix stream at 4K; however, it is heavily compressed. Sky, I'm assuming, is just as bad.

The only content I've seen which appears to push a half-decent bitrate is BT Sport in 4K in the UK.
 
I am seriously impressed they already have a prototype at such high brightness levels; I wonder how damaging it is to your eyes, though. Personally, I just want Sony to release a small Android 4K HDR TV that is not laggy, with minimal input lag, HDMI 2.1 and decent speakers.

Some strong claims there. What resolution do the likes of Amazon, Netflix and Sky broadcast their 4K content at, then?
They add grain to the image. There's no point having a grainy 4K image. I challenge you to watch something Amazon or Netflix claim is 4K and compare it to a 4K image from YouTube.
 
Of course native 4K downscaled to 1080p should give better image quality than something shot in 1080p; with downscaling you also get noise reduction. 'Full' is in quotes because, of course, not every movie has the same size. But take any DVD and open it in Windows Explorer: you'll see 0 bytes of xxx available. It is not really full; it just looks like it is.


He means that not all scenes/movies/TV are captured in 4K, so they upscale to make sure everything you get is 4K. If a scene is not shot in native 4K, it will be upscaled and broadcast. So if you are watching on a 1080p screen, and the scene was shot in 1080p and upscaled to 4K, you get a loss of quality compared with a native 1080p source. On top of that, streaming almost always uses a lot of compression, which doesn't improve the quality.

It just shows 0 bytes left because it's a read-only disc. I'm not sure why they don't always utilize the maximum available size; maybe it's not as simple as setting a target file size for compression. I don't know how H.264 encoders work, but with variable bitrate the result is unpredictable.

A lot of 4K UHD discs are unfortunately still upscaled. On IMDb you can see in the technical specs whether the movie actually had a 4K master. For example, Mad Max: Fury Road only has a 2K digital intermediate (master format), so any 4K UHD print of it is upscaled. To get a real 4K movie you need at least a 5K source format and a 4K digital intermediate from when the movie was made. Star Wars: The Last Jedi has a 4K digital intermediate, yet if you look at the sources there are plenty of scenes that don't make the 4K cut.

What you do get on all 4K UHD discs is DCI-P3 color instead of BT.709, and 4x the color resolution: 1080p chroma instead of 540p. HDR is also a mixed bag; added in post-processing it doesn't always look great, for example in Mad Max: Fury Road, where the added HDR highlights make the fake fire effects stand out.

Blu-ray still has plenty of life left. There's no reason it can't use the new color space or store 4:4:4 video; of course, with a new format they can charge extra for those who care. Even though I have a 4K HDR TV now, I don't see enough gain to start upgrading my 500+ Blu-ray collection again. From DVD to Blu-ray was a huge step, yet from Blu-ray to 4K UHD it's pretty minor. Perhaps in a few years, when every movie is native 4K and shot in HDR, I might change my mind. I'm getting older too and less interested in new movies; at least I hope it's that, and not movies getting worse every year :/

So far the best looking movies I have seen were shot at 60mm. It was a treat to go see Interstellar at 60mm, despite the long drive. Unfortunately it was only select scenes and the rest was still 35mm. For Blu-ray, Baraka and Samsara look exceptional. Those were scanned in 8K from 60mm negatives, then mastered in 4K and downscaled to 1080p for Blu-ray. It's all slow-moving imagery too, so there are few compression artifacts; the result looks amazing. The 96kHz soundtrack fits it well.

8K TV is nice; however, I'm still waiting for actual 1080p content coming from cable, or to stream 1080p without a ton of compression artifacts.
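The "huge step" versus "pretty minor" comparison lines up with the raw pixel counts (a minimal sketch; NTSC DVD dimensions assumed):

```python
# Pixel-count ratios behind "DVD to Blu-ray was a huge step, yet
# Blu-ray to 4K UHD is pretty minor".
dvd = 720 * 480        # NTSC DVD frame
bd  = 1920 * 1080      # Blu-ray
uhd = 3840 * 2160      # 4K UHD Blu-ray
print(f"DVD -> Blu-ray: {bd / dvd:.0f}x the pixels")   # 6x
print(f"Blu-ray -> UHD: {uhd / bd:.0f}x the pixels")   # 4x
```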
 
Amazon and Netflix stream at 4K; however, it is heavily compressed. Sky, I'm assuming, is just as bad.

The only content I've seen which appears to push a half-decent bitrate is BT Sport in 4K in the UK.
That's cool that you guys get some sports in 4K.

Here, and someone correct me if I'm wrong, all broadcast TV is still primarily 720p or 1080i. I think only a few things are broadcast in 1080p.
 
So far the best looking movies I have seen were shot at 60mm. It was a treat to go see Interstellar at 60mm, despite the long drive. Unfortunately it was only select scenes and the rest was still 35mm. For Blu-ray, Baraka and Samsara look exceptional. Those were scanned in 8K from 60mm negatives,
Don't you mean 70mm?
 
They add grain to the image. There's no point having a grainy 4K image. I challenge you to watch something Amazon or Netflix claim is 4K and compare it to a 4K image from YouTube.

It's not that they add "grain" to the image. I'm not sure where you are getting that from, without calling in your 19 years in the industry, but that's nonsense.

It's just a low bitrate. Simply put, they are cheaping out to save bandwidth. Netflix 4K streams work at around 14 Mbps, which is far, far too low a bitrate for a crisp 4K image.

The problem is twofold: the price of bandwidth, and the average connection speed.

Most people would complain if their UHD stream required a 40 or 50 Mb connection to get a decent 4K stream. So, as it stands, Netflix uses HEVC at around a 15 Mbps bitrate. Decent, but I suspect it will rise in time as more people get 4K sets. To the average punter it's a fine compromise, as it's a clear step up from 1080 content.


That's cool that you guys get some sports in 4K.

Here, and someone correct me if I'm wrong, all broadcast TV is still primarily 720p or 1080i. I think only a few things are broadcast in 1080p.

I'm not sure where you are, but the UK is also primarily 1080i, if that, for HD content. Most places are the same, to save bandwidth tbf.
 
It's not that they add "grain" to the image. I'm not sure where you are getting that from, without calling in your 19 years in the industry, but that's nonsense.

It's just a low bitrate. Simply put, they are cheaping out to save bandwidth. Netflix 4K streams work at around 14 Mbps, which is far, far too low a bitrate for a crisp 4K image.

The problem is twofold: the price of bandwidth, and the average connection speed.

Most people would complain if their UHD stream required a 40 or 50 Mb connection to get a decent 4K stream. So, as it stands, Netflix uses HEVC at around a 15 Mbps bitrate. Decent, but I suspect it will rise in time as more people get 4K sets. To the average punter it's a fine compromise, as it's a clear step up from 1080 content.




I'm not sure where you are, but the UK is also primarily 1080i, if that, for HD content. Most places are the same, to save bandwidth tbf.

Fantastic information!!
I would argue that if I'm paying extra for 4K (I don't), I would insist on it being a crisp picture. Don't you agree?
 
Fantastic information!!
I would argue that if I'm paying extra for 4K (I don't), I would insist on it being a crisp picture. Don't you agree?

Yes and no. It's a balancing act. As a rough way to think of it: Blu-ray is encoded at about 50 Mbps, and that's just 1080p; Netflix does HD at roughly 7 Mbps. Scale that up and the encoding process for streaming 4K has to drop the bitrate substantially, or it's just not manageable.

Remember, you and I are paying for the convenience of having content on demand, so something has to give. The codecs used by Netflix, YouTube etc. are about providing content in a cost-effective way.

The bandwidth to provide that at near-source bitrate is just not manageable.

If you want 4K movies, then UHD Blu-ray is the only way to get them.

4K Netflix isn't that bad at a distance tbf, so although I'd prefer it to be better, it's perfectly watchable and a good jump up from the standard HD feeds.
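Scaling the thread's own rough figures shows why a near-source 4K stream isn't manageable. A sketch only: it assumes HEVC is very roughly twice as efficient as the AVC used on 1080p Blu-ray, which is a common rule of thumb rather than a measurement:

```python
# What a "Blu-ray quality" 4K stream would roughly need.
bd_1080_mbps = 50                            # Blu-ray figure quoted above
pixel_ratio = (3840 * 2160) / (1920 * 1080)  # 4x the pixels

same_codec = bd_1080_mbps * pixel_ratio      # ~200 Mbps with the same codec
with_hevc  = same_codec / 2                  # ~100 Mbps assuming 2x from HEVC
print(f"Naive 4K at Blu-ray quality: ~{same_codec:.0f} Mbps")
print(f"With HEVC's ~2x efficiency:  ~{with_hevc:.0f} Mbps")
print("vs Netflix 4K at ~15 Mbps")
```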
 
I had fully intended to skip 4K and wait for 8K. I remember a couple of years ago reading some impressions of an 8K set (I think at CES), and the general consensus was that it was like looking through a window. Unfortunately, one of my 1080p TVs died and I found a killer deal on a 4K Bravia (the Android OS was a selling point, too), so I made the jump. I really wish I could have held out, though. I suspect 8K is probably the end point for resolution; after that, there's not much more the human eye can perceive.
I spoke with a friend of mine about this, as he sells this kind of gear. He said you'd be waiting several years before these actually hit the market, let alone at an affordable price. I've got a 4K TV but no HDR, so my next purchase will be 4K HDR. That should hold me over to get good use out of the PS5 for a few years.
 
