What definition do they record Hollywood movies in? Is it higher than 1080p?
There isn't a simple yes or no answer. Strictly speaking, only digital cameras have a "resolution," but even digitally, many directors are using what are commonly known as "4K" digital cameras. 4K is a loose reference to the horizontal pixel count (roughly 4,000 columns), whereas 1080p refers to the vertical pixel count (1,080 rows). A typical 4K camera or projector thus has a resolution of at least 3840x2160, which is four times as many pixels as 1920x1080 (twice the width and twice the height).
The most accurate way to compare the resolutions of digital cameras, though, is in terms of total pixels. A 1920x1080 camera captures about 2 MP (megapixels), while the most popular 4K camera used to make motion pictures captures over 12 MP (5760x2160). It's called the Genesis and was designed by Panavision. Several well known films were shot on Genesis cameras, including Superman Returns and Apocalypto, and it was most recently used to shoot Frank Miller's soon to be released film, The Spirit.
The Genesis is but one of several "4K" cameras in use. One could argue the most popular one right now, due to its superb picture quality and incredibly low price, is Red Digital's "Red One". It captures over 9 MP (4096x2304).
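Since all of these megapixel figures are just width times height divided by a million, here's a quick sketch of the math in Python; the list is simply the resolutions mentioned above:

```python
# Megapixels = (width * height) / 1,000,000 for the resolutions above.
resolutions = {
    "1080p (HD)": (1920, 1080),
    "Typical 4K": (3840, 2160),
    "Red One": (4096, 2304),
    "Panavision Genesis": (5760, 2160),
}

for name, (w, h) in resolutions.items():
    mp = w * h / 1_000_000
    print(f"{name}: {w}x{h} = {mp:.1f} MP")

# Output:
# 1080p (HD): 1920x1080 = 2.1 MP
# Typical 4K: 3840x2160 = 8.3 MP
# Red One: 4096x2304 = 9.4 MP
# Panavision Genesis: 5760x2160 = 12.4 MP
```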
That being said, for every film shot in 4K digital, there are ten being shot in 1080p digital... and far more still are shot on traditional 35mm film cameras.
Now this is where it gets particularly complicated. Film, unlike digital formats, has no inherent resolution. What it does have is detail, and it is by detail that we can compare film to digital formats. The comparison is imprecise, since so many factors play a role in how much detail can be captured on both film and digital formats, but in general, and under perfect conditions, it is assumed that 35mm film can capture as much detail as a 12-15 MP camera can. Taking it to extremes, 65/70mm film in equally ideal conditions can capture nearly twice as much detail.
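As a rough sketch of what those ballpark figures imply relative to a 1080p display (the ranges below are the loose ideal-condition assumptions above, not measured values):

```python
# Ballpark film "detail" equivalents from above, in megapixels.
# Loose ideal-condition assumptions, not measurements.
film_equiv_mp = {
    "35mm film": (12, 15),     # roughly a 12-15 MP camera
    "65/70mm film": (24, 30),  # nearly twice as much detail
}
display_mp = 1920 * 1080 / 1e6  # a 1080p display is ~2.1 MP

for fmt, (lo, hi) in film_equiv_mp.items():
    print(f"{fmt}: ~{lo}-{hi} MP, "
          f"about {lo / display_mp:.0f}-{hi / display_mp:.0f}x a 1080p display")

# Output:
# 35mm film: ~12-15 MP, about 6-7x a 1080p display
# 65/70mm film: ~24-30 MP, about 12-14x a 1080p display
```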
The problem is, I have seen thousands of films over the years and I have not once seen an instance where a film was captured in perfect conditions. There are so many different ways for a film to lose detail, especially when shot on film stock. The biggest factor is lighting. Nothing kills detail faster than a poorly lit shot. Perhaps the next biggest factor is the quality of the camera, especially the lenses being used. A low quality lens can cut the amount of detail that can be captured by as much as 50%.
To put it in a more real-world perspective, if you took a dozen 8-10 MP still cameras and compared shots of the exact same image under the exact same conditions, you won't likely see a dozen identical photographs. The biggest culprit is the difference in the quality of the lenses, but there are many other factors as well, so simply having a camera with a high resolution in no way guarantees you'll get great looking shots... especially compared to high quality professional motion cameras, some of which can cost as much as $100,000.
Then you have to add in the ability of the cinematographer, the editing equipment, the way the negatives are handled and stored, the way the prints are made, how the prints are handled... the list goes on and on and on.
This is why, when you go to the theater, one film may look visually amazing while another looks horrible. Film is one of the most delicate mediums we have. That being said, there are fifty-year-old films that even today look far better than most films currently being shot. Now, digital photography has certainly helped in some regards and hurt in others, but just like film, you can find movies shot on digital cameras that look amazing, and others that look terrible.
So as you can see, there isn't a simple answer, although it is certainly very safe to say that under the right conditions, and when properly used, even fifty-year-old film cameras can capture far more detail than a 1080p display could reproduce... but those are the exceptions, not the rule.
BTW: In case you didn't pick up on it, my GTP ID is a reference to my allegiance to both modern and traditional forms of film... Nitrate being a reference to cellulose nitrate, one of the first forms of film stock. As in many aspects of my life, I am both a traditionalist and a modernist.
Does Japan have a higher definition than the current standard?
If you mean "does Japan use a broadcast standard with a greater resolution than the rest of the world," then the answer is no.
However, they are currently leading the way in the development of much higher resolution digital formats.
NHK has been developing what was once called UHDV (Ultra High Definition Video), but which has since come to be known as SHV (Super Hi-Vision). It has over 33 MP (7680x4320), runs at 60 frames per second, supports up to 22.2 channels of sound, and runs on the 21 GHz frequency band... yes, it's some serious $%& to behold!
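To put those numbers in perspective, here's a back-of-the-envelope sketch of the raw data rate such a format implies. The 24 bits per pixel figure is my own illustrative assumption (8-bit RGB), not part of the SHV spec:

```python
# Rough uncompressed data rate for Super Hi-Vision video.
# Assumes 24 bits per pixel (8-bit RGB) purely for illustration.
width, height, fps = 7680, 4320, 60
bits_per_pixel = 24

pixels_per_frame = width * height                      # ~33.2 million
bits_per_second = pixels_per_frame * bits_per_pixel * fps

print(f"{pixels_per_frame / 1e6:.1f} MP per frame")
print(f"{bits_per_second / 1e9:.1f} Gbit/s of raw video")
print(f"{pixels_per_frame / (1920 * 1080):.0f}x the pixels of 1080p")

# Output:
# 33.2 MP per frame
# 47.8 Gbit/s of raw video
# 16x the pixels of 1080p
```

That works out to roughly 48 Gbit/s of uncompressed video, about sixteen times the pixel count of 1080p, which helps explain why heavy compression and a dedicated frequency band come into play.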
They have been developing it for many years now, and have even been demonstrating it publicly over the last five years. They hope to make it an international standard by 2015, but most industry analysts agree that is extremely unlikely.
The fact remains that while the majority of films look better in 1080p than in 720p (less than 1 MP), many do not due to the poor quality of the original source, and this is even more true for broadcast TV programming. Considering the still-growing popularity of 1080p (2 MP) and the lack of content that would benefit from a "4K" (8 MP) display, let alone a massive 33 MP one, this endeavor seems rather pointless... except for us fellow AV gearheads who love this sort of thing.
