There's a performance/quality sweet spot at around 140-150% scaling, and 1440p is only 133% of 1080p. Those are linear dimensions, since that's what you actually see; in pixel counts the figures are more like 196-225% and 178% respectively. But, like-for-like, native 1440p still looks better than any supersampled 1080p material, so I think there is a benefit to a little more resolution in hardware given the current (and near-future) state of content creation. Interestingly, 4K (2160p) is exactly 150% of 1440p in linear terms, so rendering at 4K on a 1440p display would be a pretty "future proof" target in gaming terms, in my opinion.

This doesn't factor in the "AI" upscaling schemes, either. With those, you could render at (e.g.) 100%, intelligently upscale to 150% (with less total performance cost than rendering at 150% natively, and with comparable quality), then downscale to the display hardware. It might be hard to get this working at high refresh rates, though. You could also sample different regions of the screen at different densities (more in the middle, less at the periphery, say) and save some pixels there, too.

From all of the above, I'd wager that developers are generally not targeting native 4K at all. 1440p is probably it.
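For anyone who wants to sanity-check those percentages, here's a quick sketch (assuming the standard 16:9 modes for each resolution name):

```python
# Linear scale factor between two resolutions (what you perceive as
# "sharpness"/size) vs. pixel-count ratio (what the GPU actually pays for,
# which is the square of the linear factor).

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def linear_scale(lo: str, hi: str) -> float:
    """Ratio of vertical resolutions (linear dimension)."""
    return RESOLUTIONS[hi][1] / RESOLUTIONS[lo][1]

def pixel_scale(lo: str, hi: str) -> float:
    """Ratio of total pixel counts (rendering cost)."""
    wl, hl = RESOLUTIONS[lo]
    wh, hh = RESOLUTIONS[hi]
    return (wh * hh) / (wl * hl)

print(f"1440p vs 1080p: {linear_scale('1080p', '1440p'):.0%} linear, "
      f"{pixel_scale('1080p', '1440p'):.0%} pixels")
print(f"4K vs 1440p:    {linear_scale('1440p', '4K'):.0%} linear, "
      f"{pixel_scale('1440p', '4K'):.0%} pixels")
```

This prints 133%/178% for 1440p-over-1080p and 150%/225% for 4K-over-1440p, matching the figures above.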