As the games industry thunders on, we're seeing something of a plateau when it comes to graphical potential. Yes, a cursory Google search will bring up "reasons to buy a 4K TV" or "why the PS4 Pro/Xbox One X is worth it", but for the vast majority of punters, the leap in quality just isn't there. Where once 2D became 3D and later HD, with 4K you have to wonder if we're fast approaching a glass ceiling on one of gaming's formerly most bankable aspects.
Visuals have sold consoles since day one; they easily turn heads, and remain one of the most immediate ways for a customer to feel satisfied with their purchase. Hook up the new system, turn on a new TV - done.
That said, achieving what the average consumer now thinks of as a "4K image" takes far more than rounding off pixels and ensuring clarity. It demands hours, days, even years of painstaking detail work, covering everything from character models' clothing to background subtleties like foliage or the intricate billows of a cloud.
This detail doesn't come cheap either, and yet it's at the heart of every major triple-A game - not to mention a burgeoning indie market that occasionally ticks the boxes of visual splendour too.
Ultimately - and you can look to the array of Digital Foundry side-by-side comparison videos as proof - we've arrived at a point where the gap in graphical quality between 1080p and 4K often requires a magnifying glass to spot. The likes of Spider-Man, God of War and Forza Horizon 4 still look stunning on "base hardware", and it makes you wonder: if there's an unspoken level of satisfaction with where we're at, why does the industry keep pushing for more?
Does bigger really equal better - and why are we chasing it?