Good read on current-gen frame rate benchmark

The programmers were worried it might create too much separation between the players and the characters, that the movements would feel too fluid for a game.
Seriously? :rolleyes:
And when asked if 60 FPS will become the new standard for games, Gregory was optimistic. "We hope so," he said.
Seriously?? :rolleyes:

I'm not even a 60fps snob, but this is pretty ignorant stuff.
 
At 60 frames, seeing his breathing change, or when a Clicker shows up and you hear that sound and the way he moves changes, because all the animations are that much more fluid, I think that comes across even more now. That's going to change the play experience just a little bit in the way the players experience that
Looking good up until that last bit. The difference between 30 and 60fps is astounding, so I don't think it's wise to say it's going to change the experience "just a little bit".
Naughty wasn't completely sold on the idea of implementing 60 FPS at first. The programmers were worried it might create too much separation between the players and the characters, that the movements would feel too fluid for a game.
Agree with @Wolfe here, you'd have to be as ignorant as Kaz to think that 60 fps is a bad idea. Separation? Really?
The Last of Us, going back and playing the 30 fps version feels, to quote some people in the office, 'broken.'
Brownie Points for realizing that 60 > 30 fps. Almost a complete 180 from what they said above.
And when asked if 60 FPS will become the new standard for games, Gregory was optimistic. "We hope so," he said. "It used to just be that first-person shooters were 60 by default, but a lot of other games didn't feel the need for it. I think we're showing that it does make a difference even in a non-FPS type game."
:lol:

Do they feel like they're pioneers? Is that what I'm getting from this? ND, there have been plenty, PLENTY, of non-FPS games that run at 60fps. Lose Brownie Points for being ignorant again.

Oh, and it looks like we might not even reach 60fps across the board this generation either.
 
Agree with @Wolfe here, you'd have to be as ignorant as Kaz to think that 60 fps is a bad idea. Separation? Really?

It's a very real concern actually. The effectiveness of a cinematic experience is very dependent on frame rate. Low frame rate, that is.

Think about talk shows. When the show switches from interview to musical act, the frame rate changes. That's because at higher frame rates, musicians poncing about on stage look all the more ridiculous. Think also about motion interpolation on LCD TVs. Whack on the extra software-generated frames and suddenly your highbrow drama flick looks that much more like a lowbrow TV soap.

Oddly enough, the concern is actually one of separation due to a lack of separation. Meaning that our suspension of disbelief is compromised when we see something of filmic content without filmic execution. It's confronting in the wrong way, and third-person games will be particularly susceptible. The risk is akin to how breaking the fourth wall tends to come across. Unless it's being used for deliberate effect, it's just terribly distracting.
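
For anyone wondering what those "software generated frames" actually are, here's a rough toy sketch of my own (purely conceptual: real sets estimate motion vectors rather than just crossfading):

```python
# Toy sketch of what motion interpolation conceptually does: the TV
# manufactures in-between frames from the two real frames either side,
# which is why 24fps film ends up looking like it was shot at a higher rate.
# (Crossfade only; real TVs use motion-vector estimation.)
import numpy as np

def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, n_inserted: int):
    """Generate n_inserted synthetic frames between two real frames."""
    weights = [i / (n_inserted + 1) for i in range(1, n_inserted + 1)]
    return [(1 - t) * frame_a + t * frame_b for t in weights]

# 24fps source on a 120Hz panel: four synthetic frames per pair of real ones.
frame_a = np.zeros((4, 4, 3))    # tiny dummy "images"
frame_b = np.ones((4, 4, 3))
print(len(interpolate(frame_a, frame_b, 4)), "generated frames per real pair")
```

The point being that those extra frames are manufactured, not captured, which is where the soap opera look comes from.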
 
Think about talk shows. When the show switches from interview to musical act, the frame rate changes.

Really, what frame rate do they choose? I don't watch these shows, but TV is 60Hz and 50Hz. I doubt they start lowering the frame rate mid-broadcast.
 
I could go on about how 60 > 30 regardless of the situation but I think TotalBiscuit describes it best. If you've watched this video then you'll know what I'm talking about.

There may be some 🤬 in this video so beware.
 
I know what you're referring to, LeMansAid, and this isn't directed at you:

The pursuit of a "cinematic experience" is the cancer of the modern gaming industry, and I don't use that phrase lightly. I find it absurd that a developer would resist implementing 60fps when possible in order to maintain a filmic execution, because they're worried about the game being "too fluid for a game." That's sad. I thought effects like lens flare were silly enough 15-20 years ago. Now developers are actually inclined to imitate 24fps film? And they pass it off like it's the way it's supposed to be?

The Last of Us is said to have innovative gameplay, but in general, fewer developers make games these days. The proliferation of interactive movies has flooded the industry with a new kind of mega-budget shovelware. If you ask me, the best indie games are far and away better experiences than most of what 3rd-party publishers put out.
 
Now developers are actually inclined to imitate 24fps film? And they pass it off like it's the way it's supposed to be?

The developers of The Order: 1886 were aiming for 24fps, until they hilariously decided it 'doesn't feel good', so now they're aiming for 30. I can't find any non-tabloid sources for this, though, so here's a NeoGAF thread about it (possible NSFW language, I guess; I haven't read it). I don't know why they're insisting this is intentional when practically anyone interested in gaming knows it's because the hardware can only do so much and they've chosen resolution and/or high-quality graphics over frame rate.
 
Really, what frame rate do they choose? I don't watch these shows, but TV is 60Hz and 50Hz. I doubt they start lowering the frame rate mid-broadcast.

When in an editing studio watching moving images of oneself, it quickly becomes apparent that high frames = total tool. In stark contrast to low frames... partial tool. Music videos look astonishingly amateur unless the frames are dropped to 24 or less. As for talk shows and their musicians, I'd think they must shoot at 30 frames, pushing it out at 60Hz, then switch down to the magical 24 or less (right down to 15 still works, from memory), still outputting at 60Hz. A 24fps film at 24Hz and at 60Hz look very similar; 24Hz is just that bit more stable for not having to do the horrendous frame-juggling maths required, and is therefore much less vulnerable to judder.
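
Side note on those horrendous calculations: as I understand it, this is the 3:2 pulldown a 60Hz display has to do with 24fps material, because 60 doesn't divide evenly by 24. A quick sketch of my own to show the uneven hold times (illustrative only):

```python
# Rough sketch of why 24fps content judders on a 60Hz display (3:2 pulldown)
# but not on a 24Hz, 48Hz, 96Hz or 120Hz one: the hold time per source frame
# is only even when the refresh rate is an integer multiple of the frame rate.
def hold_pattern(fps: int, refresh_hz: int, n_frames: int = 8):
    """Refresh cycles each source frame stays on screen."""
    pattern, shown = [], 0
    for frame in range(1, n_frames + 1):
        elapsed_refreshes = frame * refresh_hz // fps
        pattern.append(elapsed_refreshes - shown)
        shown = elapsed_refreshes
    return pattern

print(hold_pattern(24, 60))   # [2, 3, 2, 3, ...] -> uneven holds = judder
print(hold_pattern(24, 96))   # [4, 4, 4, 4, ...] -> even, no judder
print(hold_pattern(30, 60))   # [2, 2, 2, 2, ...] -> even, no judder
```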

Real life has motion blur, and 24fps is meant to be the sweet spot for capturing the right amount of blur on film. Unfortunately, just as people are becoming increasingly used to pitch correction, along with its sine-wavey side effects (and actually end up missing the side effects when they're not present), motion interpolation is contorting people's perception of what is natural in video. Their augmented reality looks unnatural to me, and good old non-interpolated video might look smudgy or something to them.

As for games, there are plenty that wouldn't be playable at 24fps, but that's probably as much to do with the compound effect of input lag as anything. As it turns out @Wolfe, the indie devs might actually be the ones more likely to try something at 24fps. I don't know how 24Hz recognition works on TVs, but I'd love to try perhaps a walk 'em up with a 24/24 marriage.

I think there's probably a lot that can be done towards proper syncing for video games. It may be that we won't realise what we're missing until we see genuinely smooth, as opposed to the "more frames good" mentality. I run Assetto Corsa sometimes and just shake my head at the high frame rates I get while the image is still jumping around like a lunatic. You'd think that locking at 60 would get it done at 60Hz, but no.

Also, I could be wrong on some or many of these things. Just trying to make sense of stuff I've picked up along the way.
 
Lower framerates never appear as natural to me as 60fps, motion blur or not. Forza Horizon is the steadiest 30fps game I know of, with motion blur to help compensate. While playing, the framerate is barely noticeable compared to the usual "as much as 30fps" standard. I don't mind playing a game at 30fps when it's locked like that. But if I switch back and forth with a 60fps game, Horizon's halved framerate is totally obvious. It looks like...a movie. Whereas Mario Kart 8, even in spite of its whimsy, is more like peering through a window into a fantasy world.

Maybe the display you use has something to do with it, because my CRT never misses a beat with regard to input lag or any other malady brought about by the image processing on HDTVs. I agree that motion interpolation looks wrong, but I think that's because it's terribly stuttery/inconsistent. At least it looks that way to me. :yuck:
 
When in an editing studio watching moving images of oneself, it quickly becomes apparent that high frames = total tool. In stark contrast to low frames... partial tool. Music videos look astonishingly amateur unless the frames are dropped to 24 or less. As for talk shows and their musicians, I'd think they must shoot at 30 frames, pushing it out at 60Hz, then switch down to the magical 24 or less (right down to 15 still works, from memory), still outputting at 60Hz. A 24fps film at 24Hz and at 60Hz look very similar; 24Hz is just that bit more stable for not having to do the horrendous frame-juggling maths required, and is therefore much less vulnerable to judder.

Real life has motion blur, and 24fps is meant to be the sweet spot for capturing the right amount of blur on film. Unfortunately, just as people are becoming increasingly used to pitch correction, along with its sine-wavey side effects (and actually end up missing the side effects when they're not present), motion interpolation is contorting people's perception of what is natural in video. Their augmented reality looks unnatural to me, and good old non-interpolated video might look smudgy or something to them.

As for games, there are plenty that wouldn't be playable at 24fps, but that's probably as much to do with the compound effect of input lag as anything. As it turns out @Wolfe, the indie devs might actually be the ones more likely to try something at 24fps. I don't know how 24Hz recognition works on TVs, but I'd love to try perhaps a walk 'em up with a 24/24 marriage.

I think there's probably a lot that can be done towards proper syncing for video games. It may be that we won't realise what we're missing until we see genuinely smooth, as opposed to the "more frames good" mentality. I run Assetto Corsa sometimes and just shake my head at the high frame rates I get while the image is still jumping around like a lunatic. You'd think that locking at 60 would get it done at 60Hz, but no.

Also, I could be wrong on some or many of these things. Just trying to make sense of stuff I've picked up along the way.

Assetto Corsa is a beta, and fps counters don't tell the real story. AC has all kinds of issues and is a poor reference right now, so please don't use AC as some proof of concept. Frames in computer rendering can be late or dropped for many reasons, but fps counters will still show 60fps. The game is at fault here, not 60fps. Turn 10 are not monkeys; they go for a locked 60fps every time and sacrifice graphics for it.
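
To put some numbers on the "fps counters don't tell the real story" point, here's a rough sketch of my own (nothing to do with any game's actual counter): a game can present 60 frames in a second and still feel terrible if half of them arrive late.

```python
# Minimal illustration of frame pacing vs frame rate: an fps counter only
# counts frames per second, it says nothing about how evenly they arrive.
frame_times_ms = [8.3, 25.0] * 30        # 60 frames in ~1 second, badly paced

fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
missed_budget = sum(1 for t in frame_times_ms if t > 16.7)   # late for a 60Hz refresh

print(f"fps counter says: {fps:.0f}")                             # ~60
print(f"frames that missed the 16.7 ms window: {missed_budget}")  # 30 -> visible judder
```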

100 years ago film stock cost money, and I think it's fair to say 24fps was the minimum you could get away with: you save lots of cash, use less storage, less expensive everything, really. With so much footage cut, unused or thrown away, it was only logical to use 24fps back then. I doubt they thought "oh wow, 24fps is the sweet spot".

The same can be said for digital in past years: more frames can dramatically increase file sizes. It's only now that file sizes are not a big deal.

On PC some games support 24fps. Your TV will detect it and do a 24fps pulldown.

Anyway, back on topic: gaming is best when the game runs well and always puts out a frame every 16ms to hit 60Hz or higher. Games don't get motion blur right enough to ever look good, and trying can add more lag.
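
To put a number on it: a 60Hz display refreshes every 1000/60, about 16.7ms, and the game has to finish all its work for a frame inside that window, every single time. A bare-bones sketch of the idea (hypothetical loop, not any real engine's code):

```python
# Bare-bones frame loop showing the ~16.7 ms per-frame budget needed for 60Hz.
# Hypothetical illustration only.
import random
import time

REFRESH_HZ = 60
FRAME_BUDGET = 1.0 / REFRESH_HZ              # ~0.0167 s

def run_frames(n_frames, simulate_and_render):
    for _ in range(n_frames):
        start = time.perf_counter()
        simulate_and_render()                # game update + draw for one frame
        elapsed = time.perf_counter() - start
        if elapsed > FRAME_BUDGET:
            # Missed the refresh: this frame gets held for 2+ refreshes,
            # which is what shows up on screen as a stutter.
            print(f"missed budget by {(elapsed - FRAME_BUDGET) * 1000:.1f} ms")
        else:
            time.sleep(FRAME_BUDGET - elapsed)   # wait out the rest of the refresh

# Fake workload that occasionally runs long (a 22 ms frame blows the budget).
run_frames(10, lambda: time.sleep(random.choice([0.010, 0.010, 0.022])))
```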

Motion resolution is another problem. The human eye effectively blacks out for a moment every time it moves and refocuses; you just never notice it. If it didn't, we'd get sick from the blur, and everything would look like some Hollywood film whenever we moved.

Oculus Rift is shooting for 85Hz to 120Hz and using low persistence to better match our eyes. They have some of the best tech people; they don't go for that as a joke or out of some silly belief. 24fps and 30fps are a no-go, and 60fps is not good enough.

Lastly, another good reason for blurry output is that effects and stage sets don't have to be exact. You can get away with murder.

Not that I want 60fps films, though. I haven't watched many since the late 90s, and if people want 24fps for whatever reason then go ahead.
 
Incredible. :lol: I wonder if developers like ND will eventually realize they are trying to morph video gaming into an already existent genre; are they aware movies exist? Video games should be aiming for the most fluid experience possible, not making shoddy decisions because it affects 'muh story telling'. Good thing they saw sense and tried a 60FPS mode for the remastered edition.
 
Incredible. :lol: I wonder if developers like ND will eventually realize they are trying to morph video gaming into an already existent genre; are they aware movies exist? Video games should be aiming for the most fluid experience possible, not making shoddy decisions because it affects 'muh story telling'. Good thing they saw sense and tried a 60FPS mode for the remastered edition.

Naughty Dog's vision is all about character-driven storytelling with a solid, fun game to match, so it's interesting to hear their take on it.
 
It's not about Hz though, unless you're artificially changing Hz (à la LCD TV motion interpolation). I can only assume that that is effectively like going back and changing the frame rate of the source. Native Hz can be high and it will be fine/better. Refresh rate and frame rate are totally different things.

The Hobbit was shot at 48fps and looked quite augmented and odd, really. If the question to take away from the article is "Is Naughty Dog justifiably concerned about upping to 60fps?", I'd give an emphatic yes. That doesn't necessarily mean they shouldn't do it, and it doesn't mean it would look like The Hobbit, but the concern is warranted. Also see Apocalypto, a movie that messed around with varied frame rates. It skips between a filmic and a home-video look depending on the scene. Quite nasty at times.

With TV, you've got the real-life images, the recording and processing of those images, then the displaying of the images. Discounting motion interpolation, it's the recording and processing stage that is potentially problematic. I'm just trying to work out what the equivalent stage would be in gaming. It's not the display stage, so it must be before that. It's possible that it comes down to animations rather than fps. Play some of The Last of Us with maxed motion interpolation and for sure, those extra frames are going to make it look very weird indeed, but maybe again that's because it's affecting the source (animations, in this case). Maybe fps is to games what Hz is to films, and animations are the equivalent of recording/processing.
@vasiliflame I reckon you know stuff about stuff. What do you think?

Pretty sure, though, that wanting higher everything will help generate some really crap-looking games.

Maybe the display you use has something to do with it, because my CRT never misses a beat with regard to input lag or any other malady brought about by the image processing on HDTVs. I agree that motion interpolation looks wrong, but I think that's because it's terribly stuttery/inconsistent. At least it looks that way to me. :yuck:

See, I find it weird that you'd bring up "stuttery/inconsistent" as the issues motion interpolation introduces, when the glaring and utterly damning one is that it ruins the image by giving it the "soap opera" look.

Haha, it just so happens that you are talking to a TV nut, with the best of the best at hand. Plus multiples. If you have an XBR960 you'll know precisely what I'm on about.
 
@LeMansAid -- I've never seen motion interpolation that actually resembles a smooth framerate. My sample size of HDTVs isn't terribly large, but the imperfection is what ruins it. If it was actually perfect, you wouldn't even notice. If you'd never seen what's playing before, you would just presume it was shot at "soap opera" speed.
 
Sure, the imperfections are very apparent, but the image as a whole loses its artistic integrity. That is what is immediately and cripplingly apparent. The aim of it is to get around LCD's inherent shortcomings, with the side effects being various impurities (haloing, etc.), but also the soap opera effect. Done without the impurities it would still look horrendous, and even more digital than LCD already does.

If you don't see that, I can understand why you wouldn't see what ND was concerned about.
 
It's certainly different when it's a Hollywood film I recognize. There's no denying that. However, if the HDTV could do it perfectly I would find it refreshing to see, like switching to a 60fps game after playing <30fps games for a while. Almost like being on the set yourself during filming. I think that would be cool.

I'm just not terribly attached to this artistic integrity you speak of. ;) But you can freely chalk that up to me being a non-TV-nut who rarely watches movies or TV shows (no cable/satellite/anything subscription).
 
Strange how LeMansAid brings up Assetto Corsa as possible proof of 60fps not being good. Quite shocking really.
 
It wasn't the wisest example, on reflection. Not sure if you read it, but I'm now thinking that fps in games might be equivalent to refresh rate for film, and not an issue. Feel free to "well duh" me.

@Wolfe TV nut because I have a lot of them, not because I watch a lot of TV.

I can't argue on a matter of taste with regard to motion interpolation. If you would like it (done without impurities), so be it. I think it is, and would be, dreadful.

Interesting (or "shocking" as chromatic might say) that you would marry the concepts of going from <30fps to 60fps with motion interpolation. That would, for most experts, validate ND's concern. But as explained, I'm pretty sure that they're alien concepts anyway.
 
@LeMansAid -- Sorry, that sentence of mine was unclear. I knew what you meant by TV nut; in contrast, I play on a humble CRT (zero image processing delay FTW). I was just adding that I don't even watch movies/TV, either.

I don't think what I said validates Naughty Dog's concern. Going from 24fps to "60fps" is plainly similar to going from <30fps to 60fps. But while people are used to films at 24fps, I think it's beyond ridiculous to unironically constrain a videogame to the same standard. Like Classic said, movies are already a thing. Games aren't movies.
 
I don't think what I said validates Naughty Dog's concern. Going from 24fps to "60fps" is plainly similar to going from <30fps to 60fps.

Except it's not. Going from 24Hz to 60Hz for a movie will make only the slightest difference. Going from 24fps to 60fps for a movie will make a huge difference. It's all about refresh rates vs frames per second. Playing a movie at a 96Hz refresh rate will look as the director intended; playing a movie at a motion-interpolated 100Hz (effectively upping the frame rate) will look totally different from what the director intended. Fps is to games what Hz is to movies: the higher the better. There are suggestions (http://www.100fps.com/how_many_frames_can_humans_see.htm) that we need 500fps/Hz to create a perfect image that covers all factors.

The "good read" in this thread would have benefited greatly from explaining why (and not just that) ND ended up thinking 60fps was no problem for them, and actually a good thing.

For the record, I have nine TVs that are currently used for gaming. Single plasma for general, triple plasma with cockpit for racing, dual CRTs with cockpits (for old-school racing), dual CRT for light gun stuff, and single CRT for general retro gaming. If it's not what you've got, look into getting a Sony XBR960. I use two of the three I have for the old-school racing and they are stunning. Highest number of lines displayed on any CRT TV in history. If you did end up getting one, feel free to PM me about adjusting the overscan. Actually, have you considered that you may need to do that for what you currently use?

Ah! I too often get too excited and sidetracked about TVs.
 
@LeMansAid -- I understand the point, that it's not how the director saw it while working on it. On the other hand, it wouldn't necessarily violate their creative vision. After all, 24fps isn't always a deliberate choice, it's mostly an accepted standard. I did come up with a possible analogy for your point, though -- taking a game in the opposite direction, like running a PC game on a minimum-spec system with a very poor framerate.

Kudos on naming the Sony XBR960. Looking it up, that's one heavy mofo, but it's exactly the type of display that would be an ideal upgrade for me. 👍 Our house isn't very big, and I've been concerned that when I eventually get an HDTV, I'd have to set up my CRT in another place for older games.
 
I understand the point, that it's not how the director saw it while working on it. On the other hand, it wouldn't necessarily violate their creative vision. After all, 24fps isn't always a deliberate choice, it's mostly an accepted standard.
These days more than ever, the director will only see anything remotely like their vision for a film at the editing and processing stage. It wouldn't matter what frame rate it was recorded at; most directors would still want it dropped down to 24 for the final render. If you have a device that can record at both 24fps and 60fps you could do a test: record half a minute or so of yourself singing into a hairbrush or playing air guitar at 24, then 60 frames. Watch both of them back, and I guarantee that you will prefer the 24fps version. If you dropped it further in an editing program you might like that even more.


I did come up with a possible analogy for your point, though -- taking a game in the opposite direction, like running a PC game on a minimum-spec system with a very poor framerate.
Fps is to games what Hz is to movies: the higher the better.
I did get to the bottom of it eventually. Like I said already, it would be frames of animation in a game that could cause augmented reality issues if too high, rather than the overall frame rate itself.
Kudos on naming the Sony XBR960. Looking it up, that's one heavy mofo, but it's exactly the type of display that would be an ideal upgrade for me. 👍 Our house isn't very big, and I've been concerned that when I eventually get an HDTV, I'd have to set up my CRT in another place for older games.

By HDTV I assume you mean flat panel (that Sony CRT is very much HD). Time has almost run out, though, for the natural progression from CRT. Plasma is also a phosphor-based display and would be a much more comfortable switch compared to LCD's overt digital-ness. As the RIP plasma thread shows, it has sadly been prematurely killed off. If you can find an XBR960, though, grab it. I kind of wish I was into modern fighting games; these Sonys would be the ultimate advantage with their combination of definition and lack of lag. Turns out that I just let a free one pass by, as I already own three of them. That's hard to do. If you keep checking Craigslist you'll be sure to find one eventually. Be aware, though, that sometimes people just list them as "Sony tv" or, worse still, "old tv", so it can be a bit laborious.
 
I could go on about how 60 > 30 regardless of the situation but I think TotalBiscuit describes it best. If you've watched this video then you'll know what I'm talking about.

Anything that TotalBiscuit says should be taken lightly, but I overall agree with the point that achieving a constant 60fps in a video game is sheer lunacy. To put it into perspective, movie distributors today show movies in 24fps at the theater, then upscale the film to 60fps for home releases. The first Hobbit film was the recent exception, and that was only shown in 48fps.

No, if anything, the main point of attack for game developers should be screen resolution. Even with last-gen graphics we got some solid games that ran at 1080p, and that trend should continue. That is one main advantage that game developers have over any other medium, and it is high time they pushed that weight around.
 
@LeMansAid -- Maybe I'm in a minority, but I'm just not that attached to the "movie" look, least of all when it comes to games, which are not movies and (IMO) should not try so hard to be. If you experience this separation issue with 60fps games, I'll have to concede that some people do (including a few in the gaming industry, apparently). Nonetheless, for Naughty Dog to declare 60fps a "new standard" in gaming and pat themselves on the back is ridiculous.

There's also still a huge difference between <30fps and 30fps. As I said from the beginning, I don't mind playing a game at 30fps, but missing that target with a fluctuating framerate is just plain bad. It only annoys me, but it makes my wife sick to her stomach.
 
http://www.eurogamer.net/articles/digitalfoundry-2014-the-last-of-us-face-off
Like I said really.

Anyway, trying to render at 60fps is pretty taxing and sacrifices have to be made. I doubt Naughty Dog will stick to this when other games are looking as good or better at 30fps. They thought it fine to release a game going from 20fps to 30fps all the time instead of dialling back the graphics on the PS3 to lock it at 30fps.

I would think ND will be trying to create the best-looking graphics again on PS4, so they'll have to go back to seeing what they can get away with frame-rate-wise. Maybe they can offer some options like us PC gamers have, but 30 to 60 is quite a large jump.
 