I skipped the text and looked at the images and was unable to understand if they were supposed to be bad or good examples. I liked them. Then I read through the text and learned that they are supposed to be bad examples.
But why though? I suspect that either I am not good at this kind of thing, or this is a purist thing, like „don’t put pineapples on pizza because they don’t do that in Italy“.
I don’t want games to look realistic. A rainy day outside looks gray and drab, there is nothing wrong with rainy days in games not looking like the real thing, but awesome and full of contrasts.
>I don’t want games to look realistic. A rainy day outside looks gray and drab, there is nothing wrong with rainy days in games not looking like the real thing, but awesome and full of contrasts.
In photography and cinematography contrast and color curves are near ubiquitously modified artistically to evoke a certain feeling. So even without 3D renderings added colors are adjusted for aesthetic over raw realism.
I totally agree. The example pictures in the article look fine.
I don't know what the author wants, but perhaps it's some kind of industry insider view, similar to how "true artists" make movies that are so dark you can't see anything, the dialog is quiet mumbling, and the sound effects are ear-shattering. Perhaps there's an equivalent to that in games.
I can see why people wouldn't like them - they're all oversaturated and most of them go for the cheesy "everything is teal and orange" or "everything is piss yellow" gradings. There's a quote I heard in a photography tutorial once that goes something like "once you've moved all the sliders to what you want, move them back 50%", and games basically don't do that.
But the biggest problem with the screenshots is they literally aren't HDR. So how can we judge their HDR?
Real life has a lot of sensations that games don't. A rainy/foggy day might look boring, but it feels nice to be out in (ideally). Also, computer audio is (or can be) about as good as humans can perceive, but displays are nowhere near that.
So both of these mean you have to jack up the sensation so people can feel something.
From what I understood, these are supposedly bad because they look like video games instead of photographs. Not sure what the problem with that is, though. I'm fine with video games looking like video games.
I truly don’t understand the author’s opinions about contrast here. The RE7 image is the only one here that looks ‘realistic’, and at a glance could be mistaken for a photograph, and he says it’s got way too much contrast.
No other image here comes anywhere even close, definitely not Zelda nor GTA5.
Personally I think the whole problem with the first 5 images is that they don't have enough contrast, and they have too much detail. The color handling isn't the only reason they don't look realistic, but making sure every single pixel's nicely exposed and that nothing gets too dark or too bright lets all the CG fakeness show through. One of the reasons the RE7 image looks better is you can't clearly see every single thing in the image.
If you take photographs outside and the sun is in the shot, you will absolutely get some blown out white and some foreground blacks, and that’s realism. The CG here is trying too hard to squeeze all the color into the visible range. To my eyes, it’s too flat and too low contrast, not too high contrast.
The Zelda screenshot he uses as an example of how good things look without HDR looks terrible to me. It is all washed out with brightness and bloom, and all the shadows in the landscape that in reality would almost be black are very light grey.
For me games being too dark and not being able to see anything is a pet peeve. I can see the point in a horror game, but I will set the gamma or turn up the brightness if it makes the game hard to play.
Oh I agree. The art director needs to be exposing the important gameplay elements to be visible. That doesn’t mean they should avoid blacks for everything though, and that’s what all images except the RE7 image are doing.
Off the top of my head, Destiny 2 is the biggest offender in this category. If I can't see shit at all, what does the feeling the artist is trying to convey even matter? I will just turn the brightness in the graphics card settings all the way up, because the cap in the in-game setting is insanely low.
Plus it's not even a horror game. Come on, it's a shooter. How does a shooter where you can't see anything even make sense?
> The RE7 image is the only one here that looks ‘realistic’, and at a glance could be mistaken for a photograph, and he says it’s got way too much contrast.
It looks like a cheap film camera or a home video screenshot. So it gives off a feeling of nostalgia to a sufficiently old person, but this is also the kind of photo you'd reject as a pro, because it's totally overexposed.
One problem with photorealism is a lot of players are on bad displays, or in bad viewing environments. Games often take this into account in their visual direction so that they will be more legible in these different environments. It used to be even worse when designing a game for say the Gameboy Advance or original Nintendo DS where you knew the screen wasn't backlit or wasn't particularly bright so your images needed to be bright and colorful. Even now, a Nintendo Switch game might be played on the bus.
For big budget games the solution for this is typically to have brightness calibration when the game first boots up, but the game itself still needs to be designed adaptively so that it's not Too Dark or Too Bright at critical points, otherwise the playability of the title is jeopardized. This runs counter to a goal of photorealism.
and found they did really well because the art was designed to look good on bad screens and poor viewing conditions. I think of it in terms of Ansel Adams' Zone System, in that the ideal image is (1) legible if you quantize it to 11 tones of grey (looks OK printed in the newspaper), but (2) has meaningful detail in most or all of those zones.
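For the curious, that legibility check is easy to prototype. A minimal sketch in Python, assuming a grayscale image as floats in 0..1 (the 11-zone count follows the Zone System; the function names are just illustrative):

```python
import numpy as np

def zone_quantize(gray, zones=11):
    """Quantize a grayscale image (floats in 0..1) to N evenly spaced tones,
    roughly mimicking Adams' Zones 0-X, to eyeball newspaper-print legibility."""
    return np.round(gray * (zones - 1)) / (zones - 1)

def zone_histogram(gray, zones=11):
    """Count how many pixels land in each zone; meaningful detail spread
    across most zones tends to survive dim, washed-out screens better."""
    idx = np.clip((gray * zones).astype(int), 0, zones - 1)
    return np.bincount(idx.ravel(), minlength=zones)
```

If most pixels pile into two or three zones, the image will likely read as mud in bad viewing conditions.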
I'm kinda disappointed that the Nintendo 3DS version didn't use the stereo effects, but they would have had to decide if her hair forms a sheet or a cone.
You're arguing that game engines should imitate photographic cameras, but they should imitate our eyes, which will never blow out whites outside in the sun.
Our eyes absolutely blow out whites in the sun. Doubly so when looking at the sun, or even at reflections immediately after being in the dark for a while, and when looking at something bright that is very near something dark in your visual field.
I’m not necessarily arguing games should imitate cameras, I really only think over-compressing the dynamic range is bad, and I don’t understand why the author is arguing for that.
Do you have a new technique to decode eye-brain perception in terms of how we perceive visual signals? Do you have a paper indicating how you make this claim for everyone?
Do you really need a paper? It’s well known that looking at the sun does damage to rods and cones, because it far exceeds their response range, long before perception gets involved.
Chemical reactions in the rods and cones are only a small portion of vision processing. The rest is in the brain, with a great deal of various processing happening, that eventually comes to cognition and understanding what you see.
And parts of the visual cognition system also synthesize and hallucinate parts of our vision as well, like the blind spot where the optic nerve meets the eye. But cognitively, the data is there, smeared across time and space (as in a SLAM algo putting the data where it should go, not what is measured).
What, exactly, is relevant about the perception and cognition systems if the signal from rods and cones is clipped or distorted? By ‘blown out’ we are talking about the rods and cones being saturated and unable to respond meaningfully. Your question doesn’t make sense, and I’m neither making claims about nor arguing over what happens in the perceptual system to bad/saturated inputs.
I don’t know what you mean by ‘in the sun’ != ‘at the sun’. I’m the one who said ‘in the sun’ and I was talking about staring at the sun. I’m not sure what your point is, but if you’re trying to say that a game render of looking at the sun is different than the experience of actually looking at the sun, then I wholly agree. A game will (rightly and thankfully) never fully recreate the experience of looking at the sun. If you’re trying to defend carlosjobim’s claim that human vision doesn’t have an absolute upper luminance limit, then I think you need to back that claim up with some evidence.
Woah, the sun is bright? How do you know this is true for everyone? Do you have a peer reviewed RCT paper posted in a high impact journal confirming this?
It's a cloudy day here and I'm within my rather dimly lit office. If I look out the window, it is no problem to see the clouds in all their detail, and I don't lose any details within the darker environment in the room. A camera will either blow out the entire sky outside the window to capture the details in the room - or make the room entirely black to capture the details of the sky through the window.
I mean, most people reading our comment thread here have their smart phone by their side and can instantly verify that eyes do not blow out whites or compress blacks like a camera. The dynamic range of our eyes is vastly superior to cameras. So aiming to imitate cameras is a mistake by game developers.
Of course, staring straight into the sun or a very bright light or reflection is a different matter.
The first three pictures in the article have direct sun visible in the sky and not clipping. I was referring to that. The sun itself does blow out when you look directly at it, but please don’t spend time staring at the sun as it will damage your eyes.
The dynamic range of human eyes is not vastly superior to cameras. Look it up, or measure. It’s easy to feel like eyes have more range because of adaptation, foveation, iris, etc.
Again, I didn’t argue that games should imitate cameras. But that would be better than what we have in games; movies look way better than the game screenshots in this article.
One big issue I never understood is why we need photorealism in games at all. It seems to benefit card manufacturers and graphics programmers, but other than that I feel it has nothing to do with game quality — and in fact may have a negative impact on it.
Photorealism is a bad idea if your movement engine isn't good enough to handle the character walking around on uneven terrain. For racing games or flight simulators or such it is less of a problem, but seeing a regular person being absolutely flummoxed by a knee-high wall is massively immersion breaking.
It's something I really noticed when playing Disaster Report 4, where the people look amazingly realistic but some restrictions are clearly just 'developers didn't make this bit walkable'.
> For racing games or flight simulators or such it is less of a problem,
Cars are also easier to make photorealistic. Less uncanny valley effect, lots of flat shiny surfaces.
What absolutely breaks immersion for me in most AAA car games is the absolute lack of crash, scratch, and dirt mechanics. Cars racing around the track for 2 hours don’t look like showroom pieces! Make ‘em dirty darn it. And when I crash into a wall …
I’m really excited to try Wreckfest 2 when I get around to it. Arcade-ish driving, not super photorealistic, they put it all on realistic soft body collision physics instead.
I seem to recall hearing that car manufacturers only allow their vehicles to be licensed for use in games if they won't really get visually damaged. Kinda funny to see cars just bounce off each other in Gran Turismo. But rally games tend to be better at that (I may have lost a door or two (or a few dozen, but who's counting) in WRC).
You might like BeamNG.drive. It has soft-body physics simulation (also for driving dynamics, so it's not arcadey) and decent graphics. It's more like a sandbox with half-done "actual game" mods AFAIU, but happens to be quite popular and very highly rated anyway. I'm on the fence about buying it myself.
I had a great time recently on my first trip to a racetrack, and the most surprising thing to me was how all the cars were utterly beat to shit. Not like in a bad way, but in like... a sports gear way? They were all working (well, mostly, one guy had a real bad time on his second lap and I'm pretty sure his engine was DONE) but the panels were quite battered, and a number had full on body damage I'm assuming from track contact.
And granted this was an amateur race day, just weekenders having a good time, but it makes sense when you think about it: if the body panels aren't like falling off and are just a bit beat up... why replace them? Especially on some of these cars (late model Corvettes and Mustangs) they don't come cheap at all, and they'll require refinishing and you have to do your livery over again too.
Like a hockey player doesn't buy a new helmet every time they get hit, they/the team would be broke before the season was out.
I think it's like porn. Not sure about you guys, but for me soft-core always looks better than HD hardcore. Soft-core encourages imagination and conveniently covers any body part that is a bit far from perfect.
And that's why I always think ladies who wear just enough clothes are way more sexy than nude ladies.
This is true in Wukong too, which is otherwise a very good-looking game. There are various points where rocks and scaffolds look just as climbable as those in the game area, yet the game engine places an invisible wall in your way. It breaks immersion instantly.
I think it's more that they didn't have the display language to mark those inaccessible parts of the world as "boring", and prevent the player from wanting to walk into that invisible wall in the first place. Or they place the invisible wall 1m in front of a real wall for NO REASON.
While also expecting you to go around searching for hidden goodies and secret paths.
I swear, the invisible walls are the only thing pushing it to a 9/10 from a 10/10 for me.
We don't need photorealism in games, but it does help with immersion. Many people, like me, feel like they are inside the game world, rather than playing a game with a TV/monitor in front of them. Photorealism is essential for this feeling - at least for me.
The most amazing gaming experience I've ever had was walking around the city at night in Cyberpunk 2077. For the first time in my life, I felt I was actually in the future. Zelda can't pull that off with me, despite being a great game from other perspectives.
I find this an interesting argument. I wonder if it's a generational thing.
If we define immersion as "your vision focuses on what's inside the screen and you ignore the world around the screen, and you mostly ignore that your control of the player character is through a keyboard and mouse", then I've experienced immersion with every first person game ever, including Minecraft. I never considered that some people might need photorealism for that at all. There was another commenter that mentioned being unable to walk over a short wall due to character controller limitations as being immersion-breaking. I agree this is annoying but the qualia of it is more like a physical confusion rather than being something that actually breaks my experience of the game.
I'm also thinking this might be related to why I find VR to be, while very cool, not some revolutionary new technology that will fundamentally change the world.
I had a similar experience in a game meant to simulate regular city car driving.
Most relevant to this comment thread, however, was the fact that the graphics were very crude, and not in a good way. I absolutely dispute the claim that realism equals immersion/presence (I'm not getting involved in the debate about the distinction between the two).
I’d argue that immersion has little to do with graphics, even for FPS. Actually I had more immersion in some text adventure games than in some AAA games — and not out of nostalgia because I never played the said text adventure games before.
I’d agree that a certain degree of graphics helps with immersion, but photorealistic graphics only offer cheap immersion which turns off the immersion centre in the brain — OK, this is just my babble, so it's 100% a guess.
Agreed. Immersion in a game world, at least for me, is less about how accurately it visually reflects reality and more about how detailed the overall world feels -- whether the designers have crafted worlds that feel like they live and breathe without you, that you could imagine inhabiting as someone other than the protagonist. For instance, I can imagine what it would be like to live in Cyberpunk 2077's Night City, whether I was a merc like V or just one of the nobodies trying to get by that you pass on the street; I can imagine living in Dishonored's Dunwall (or the sequel's Karnaca) in the chaos and uncertainty of their plagues; I can put myself in the shoes of one of the faceless, downtrodden members of the proletariat of Coalition-occupied Revachol in Disco Elysium; a lot of AAA games, on the other hand, feel like theme park rides--well-crafted experiences that are enjoyable but don't stick with you and discourage you from thinking too deeply about them because they don't withstand much scrutiny. But Cyberpunk 2077 is evidence that they don't have to be that way, and Dishonored and Disco Elysium are equally evidence that you don't need a half-billion-dollar budget and photorealistic graphics to create immersive worlds.
(edited to clarify that I'm not laboring under the misapprehension that Cyberpunk 2077 isn't a AAA game)
I recall a paper from GDC many years back that studied the perception of immersion and they measured and ranked maybe a dozen different factors. Graphics and visuals were surprisingly low on the list. The number one thing was the player’s sense of identity and clear understanding of their goals. Players tended to correlate realism with high immersion too.
That’s definitely in the same realm, but not the one I was thinking of. I believe I’m thinking of something maybe 10 years earlier, it had multiple authors, at least one woman, and some of the authors were psychology researchers who were into games. I’d wouldn’t be surprised if this is a theme and avenue of research that has come up many years at GDC.
I bought Cyberpunk when it released, I may have even pre-ordered, I don't remember. I played about 20 minutes after the title drop, you know the one. It was buggy, and didn't really look that good to me, on my Samsung 4K monitor.
I then played it again, on the same monitor, last year, and I was pleased with the gameplay, but again, I didn't find anything that remarkable about the overall graphics. The fidelity was great, especially at distance, due to 4K.
I'm 50 hours deep literally as I type this (about to launch the game), and this time, this time it is completely different. I have an LG 2K HDR screen with "Smart HDR" and I finally - finally - get it. Your eyes have to adjust just like in real life, going from dark indoors to bright outdoors. You can see the tail-lights and headlights of NPCs driving around in the mountains. Lasers sweeping over you are menacing.
Even Fallout 4, which is the first game I played in 4K 10 years ago, looks easily 10 times better in HDR. And I only have the "vanilla+" mod set, 5GB of mods, not the 105GB modset.
I coined a phrase 4 or 5 years ago, that HDR stood for: Hot Damn, Reds! And really, reds are still my least favorite part, they burn too deeply. But from watching several movies on an HDR 4K TV and being really unimpressed, to just these two games, my entire viewpoint has drastically changed.
I didn't know you could put arbitrary people into photo mode in CP2077, and also pose them and move them around, so I was just entering photo mode as best I could and lighting and fiddling with the curves; however, these all took over 4 seconds to "render" to the final image, which I found interesting: https://imgur.com/a/DTesuhF
You're not alone, Cyberpunk's blend of near-future with realism whilst maintaining a clear art style that is not total realism is very immersive. I have spent countless hours wandering around Night City, not even playing the main gameplay.
CP2077 was the game I drove most carefully in when not on a mission, just coz it felt right that V wouldn't be hooning around his home turf. The immersion was incredible.
There's something about the image quality of Cyberpunk that looks off to me, and I can't quite put my finger on it. Maybe the hair rendering? Shadowing?
It's clearly going for photo realism, but it somehow looks worse to me than older, lower-fidelity games.
DLSS really messes with the realism, however for actual gameplay it's less annoying to me than I thought it would be, based on such games as Diablo IV and others in that cohort. If you want maximum quality, don't let an AI draw what the developers (artists) intended, just draw what the developers intended. I replied to a sibling comment with 4 photo mode screenshots, and you can see that there's a lot of variation in environment lighting, and all of the ambient light is pre-arranged by the design team and developers. In CP2077 a lot of quests are "go to <location> at dusk/dawn/night/noon, or between x and y time", because they want the scene to be cinematic, and it shows. Harsh fluorescent lighting on scenes with a doctor, muted, hazy interactions with a shady character or a scene with emotional turmoil, long shadows and lots of reds at the end of a story arc.
It really feels like they put so much work into how everything looks in the primary and secondary stories.
I can agree that just "jobbing" it looks more like a run-of-the-mill shooter, though.
I do, but not like cyberpunk. I like to both read and watch movies, but I feel a lot more immersed with images than I do with words. It's not a binary rating (immersed vs not immersed), it's a gradient that makes things resonate more strongly with photorealism.
This is one reason, I believe, why some people can't stand animated cartoons. I like them but I know many people who won't even consider watching animation.
You can get immersed in anything. With games or VR, realism adds an extra depth of immersion when your brain switches to thinking the same way it does in the real world, rather than adapting to the physics or terrain of a fake world.
I think, like polygon count, resolution, FPS, etc, realism is very easy to objectively assess and compare even with no artistic background, which makes it a target both for gamers (who want to explain why they like a game, or debate which game is better) and studios who want something they can point to.
IMO it leads to really stilted experiences, like where now you have some photorealistic person with their foot hovering slightly in space, or all that but you still see leaves clipping through each other, or the uncanny valley of a super realistic human whose eyes have a robotic lock on your face, etc.
Physical interaction with game worlds (wasd and a single pivot, or maybe a joystick and a couple buttons) hasn't increased in depth in 20 years which only emphasizes the disjointedness.
I totally agree with your last paragraph, except to add: there have actually been some great advances in interaction, but people vote with their playtime, and I think the reality is that the "median gamer" is totally content with WASD + mouse/the typical controller thumbstick movement. In the same way that so many are content that many game mechanics boil down to combat and health bars.
I am personally not content with that and I explore all I can, and am trying to make games that skirt the trends a little bit.
But that stark contrast between visual fidelity but a lack of interactivity has been a pet peeve of mine for a while. You can even do so much more with just mouse and keyboard interactions, but I think it's overshadowed by the much lower risk visual fidelity goals.
A large section of the gaming public sees photo-realistic games as serious, and prefers them for high-budget games. It's a rat race for devs though - it's just incredibly expensive to create high quality models, textures, maps.
I've been playing Cyberpunk 2077, and while the graphics are great, it's clear they could do more in the visual realm. It doesn't use current gen hardware to the maximum, in every way, because they also targeted last-gen consoles. I'm thinking in particular of the PS5's incredibly fast IO engine with specialized decompression hardware. In a game like Ratchet & Clank: Rift Apart, that hardware is used to jump you through multiple worlds incredibly quickly, loading a miraculous amount of assets. In Cyberpunk, you still have to wait around in elevators, which seem like diegetic loading screens.
And also the general clunkiness of the animations, the way there's only like two or three body shapes that everyone conforms to - these things would go farther in creating a living/breathing world, in the visual realm.
In other realms, the way you can't talk to everyone or go into every building is a bit of a bummer.
I think chasing photorealism also hurts the modding community, which hurts the players. No ordinary modding community could push out photorealistic content in a realistic span of time. I think that's why we are seeing fewer and fewer mods nowadays compared to the late 90s and early 2000s.
For FPS, HL2/Doom 3 is probably the last generation that enjoyed a huge modding community. Anything beyond it pushes ordinary modders away. I believe it is still quite possible to make mods for, say, UE4, but it just takes such a long time that the projects never get finished.
In a certain way, I rather wish graphics had frozen around 2005.
Let's see how GTA VI will change this and the industry.
I personally like Cyberpunk 2077's style; it looks great maxed out with HDR. Yes, the models aren't the best, but the overall look/vibe is spectacular at times.
> In Cyberpunk, you still have to wait around in elevators, which seem like diegetic loading screens.
Cyberpunk has vanishingly few elevators. While it may be a loading hide in some spots, it's certainly not indicative of the game which otherwise has ~zero loading screens as you free roam the city including going in & out of highly detailed buildings and environments.
> I've been playing Cyberpunk 2077, and while the graphics are great, it's clear they could do more in the visual realm. It doesn't use current gen hardware to the maximum
I'm not sure how you can reach this conclusion to be honest. Cyberpunk 2077 continues to be the poster child of cutting edge effects - there's a reason Nvidia is constantly using it for every new rendering tech they come out with.
Nintendo games don't look like cheap cartoons at all. They are absolutely not photorealistic, but they do put a lot of work into the aesthetics/art, and most of the time it's really impressive once you take the hardware limitations into account.
Mario 64 ran on the same console that was known for its 3D blur.
Mario Galaxy 1&2 (which are still totally modern in terms of aesthetics) ran on what was basically an overclocked gamecube.
Mario Kart 8, which is still more beautiful than a lot of modern games, ran on the Switch, which is itself based on 2015 mid-range smartphone hardware.
I think it's more that Nintendo's choice of hardware (and its relative lack of horsepower) force them into more stylized visuals because it means photo-realism is basically off the table to start with. We the audience tend not to care, because Nintendo has capable artists who can create something aesthetically pleasing outside of "realistic" graphics.
There are tens (if not hundreds) of indie and B-games that offer the same experience as most current Nintendo titles. Nintendo is doing well more because of nostalgia - it's the parents buying those consoles for their kids because they have very fond memories with Nintendo from their own childhoods.
I don't suffer from that particular nostalgia, not having had a Nintendo console (C64/Amiga diehard here), but I bought Wii and Switch, and a couple of first-party games for each.
I considered, and passed on, the other consoles.
Nintendo is playing a different game than other console/game makers (excuse the pun), IMHO.
This. To me one of the reasons why Coffee Stain Studios is such a successful publisher is that its games typically don't push for visual realism for the sake of it (hardly possible anyway when they feature dwarves, alien species and the like).
My take is that video game devs learn to aspire to cinema, since they're both making "entertainment art that exists on a screen" and cinema is more widely accepted as art among the intelligentsia (not that I agree).
The lore was annoying to listen to; whenever I wanted to listen to an audio log, I had to stop playing the game and watch the exact same video of a man smoking and being mysterious.
The cool game mechanics were basically just the gravity gun from Half Life 2, which came out over 20 years ago.
It did have some cool environmental set pieces, but overall I just found the game too pretentious for something that was basically a rip off of the SCP wiki.
I was a bit confused by this aspect of Control. It was lauded as an example of top-tier graphics. I liked the game, but its graphics felt mid to me. Maybe due to the grey indoor environments?
Yes, but Control isn't sold "just so the end user can enjoy exercising their new GPU and monitor", it is sold for gamers to play a great game. And IMO it is Remedy's best game since Max Payne 2 (I haven't played Alan Wake 2 though) because of its gameplay and atmosphere, not because of its visuals (which, do not get me wrong, are great, but that is largely because of the art direction and visual design, not because of raytracing -- in fact I personally first played and finished the game on an RX 5700 XT which has no raytracing at all and had to tone down a few visual effects, but still found the visuals great).
I don't really see your point. It was used by benchmarking youtubers for that benchmarking, so it at least sold to them for that reason. It's also the reason I bought it: any later enjoyment is unrelated.
Control was one of the first big games to come out after Nvidia’s first line of GPUs with raytracing hardware (RTX 20xx) and one of the first games to use those hardware features. That’s why it was used as a showcase (there was probably a deal between Remedy and Nvidia to make this happen, not sure).
It was a good looking game at the time, but remember it originally came out on PS4/Xbox One and that version did NOT have raytracing.
I've wondered whether photorealism creates its own demand. Players spend hours in high-realism game worlds, their eyes adjust, and game worlds from ten years ago suddenly feel wrong; not just old-fashioned, but fake.
This is also true for non-photorealistic 3D games. They benefit from high-tech effects like outline shaders, sharp shadows, anti-aliasing and LoD blending - but all of that tech is improving over time, so older efforts don't look quite right any more, and today's efforts won't look quite right in 2045.
When a game developer decides to step off this treadmill, they usually make a retro game. I'd like to see more deliberately low-tech games which aren't retro games. If modern players think your game looks good on downlevel hardware, then it will continue to look good as hardware continues to improve - I think this is one reason why Nintendo games have so much staying power.
This has been the norm in 2D game development for ages, but it's much more difficult in 3D. For example, if the player is ever allowed to step outdoors, you'll struggle to meet modern expectations for draw distance and pop-in - and even if your game manages to have cutting-edge draw distance for 2025, who can say whether future players will still find it convincing? The solution is to only put things in the camera frustum when you know you can draw them with full fidelity; everything in the game needs to look as good as it's ever going to look.
Completely agree. People lament the death of the RTS genre for all kinds of reasons, but I think the biggest one was the early-2000s switch to 3D. Performance considerations meant you had way fewer units. The one exception, Supreme Commander, was somehow able to get around this, but it suffered heavily from the second big problem with 3D RTSes: the tiny unit models are so much harder to tell apart in 3D compared to 2D.
The RTS switch to 3D was a mistake and I think RTSes will continue to fail until their developers realize what actually makes them fun is actively hindered by this technology.
I agree it doesn't benefit most games, but it's still genuinely amazing to see sometimes.
I suspect part of the challenge with making a hit game with last-gen graphics (like Breath of the Wild) is that you need actual artists to make it look good.
How do you understand other human desires? That is, how is the desire to match reality in this medium different from other, more understandable desires?
For the same reason it was searched for in painting for so long, and for the same reason movies and plays often meticulously recreate (or film in) real locales and use period-appropriate attire: people, by and large, love looking at reality way more than stylized images.
There are exceptions, but the general public will almost always prefer a photo-realistic renaissance painting to a Picasso portrait, a lavish period piece like Titanic to an experimental set design like Dogville.
This is [...] a series examining techniques used in game graphics and how those techniques fail to deliver a visually appealing end result
All I see is opinions though. And the internet is full of them. You just have to Google "why does this game look so ...". At least if the author had compared the search stats of "good/bad/beautiful/washed out" it would've carried some weight.
The GTA 5 screenshot is a terrible example. It looks like a cheap, dead, video game environment, reminding me how far we've come.
While the graphics aren't as good as some other modern titles, the world and art design make up for it ten times over. There are a bunch of locations that could be paintings, especially:
- The first steps in Limveld
- Liurnia of the Lakes (from Stormveil)
- Leyndell
- The first look at the Scadutree
- Cerulean Coast
- Stone Coffin Fissure
- Enir Ilim
I can't remember another property with a similar diversity of incredibly beautiful and imposing areas.
In my experience Elden Ring looks better when you turn the graphics quality down. Baldur's Gate isn't particularly ugly for a '98 game.
And I agree that it would be nice to have some positive examples. I think there were a bunch of SNES games which did it well, but that may just be nostalgia.
I found this video to visualise what tone mapping is trying to achieve, and why "photorealism" is hard to achieve in computer graphics: https://www.youtube.com/watch?v=m9AT7H4GGrA
And it indirectly taught me how to use the exposure feature in my iPhone camera (when you tap a point in the picture). It's so that you choose the "middle gray" point of the picture for the tone mapping process, using your eyes, which have a much greater dynamic range than a CCD sensor. TIL.
> the exposure feature in my iPhone camera…choose the "middle gray" point of the picture for the tone mapping process
No, it uses that to set the physical exposure via the shutter speed and ISO (iPhones have a fixed aperture, so that cannot be changed). It literally says this in the video you linked. This is not tone mapping. Tone mapping in a way may also happen afterwards to convert from the wider dynamic range of the sensor if the output format has a more limited dynamic range.
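As a rough illustration of what tapping does: the camera re-meters so the tapped region lands near middle gray. A toy version of that metering step, where the 18% constant and the function name are assumptions rather than Apple's actual pipeline, and the whole exposure budget is collapsed into one multiplier:

```python
def exposure_for_tap(tapped_luminance, middle_gray=0.18):
    """Return a linear gain that puts the tapped region at middle gray.
    A real camera splits this budget across shutter speed and ISO; here it
    is collapsed into a single multiplier for illustration."""
    return middle_gray / max(tapped_luminance, 1e-6)

# Tapping a bright cloud (say ~0.8 in linear scene-referred terms) yields a
# gain < 1, darkening the whole frame so the cloud sits near 18% gray.
gain = exposure_for_tap(0.8)   # ~0.225
```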
I've heard a good point that our eyes have, in fact, a boring 1:100 range of brightness. Eyes can rapidly adjust, but the real game changer is our ability to create an image in our video memory, which has an unlimited brightness range. Eyes give us maybe a 2d uint8 framebuffer, but our mind creates and updates a float32 3d buffer. This is why this experience cannot be reproduced on a screen.
Because of fast & per-pixel level light control. Though this is true even if we completely ignore whether human eyes actually manifest a 100:1 auto-adapting dynamic range window.
Overuse of reflective surfaces is the same kind of fad we saw with bloom in the mid-2000s and early 2010s. Now that SSR everywhere is technically feasible, gamedevs want to use it everywhere. I think this started 5-10 years ago and RTX has renewed the meme, unfortunately.
Specular highlights are cheap (frame time and artist time) and beautiful when done right, so everyone tries to do them and they get overcooked.
There is a secondary problem in big budget games where modeling work gets farmed out leading to selection for "what looks good in the preview pic." In the preview pic, the asset artist gets to choose background/scene/lighting, and it's an easy trick to choose them to make the specular highlights pop. The person doing integration buys the asset, drops it in wildly different background/scene/lighting, and now the specular highlights are overcooked because the final scene wasn't chosen for the specific purpose of leveraging specular highlights.
Recently, some of it seems to be just to highlight raytracing hardware. Cyberpunk uses a lot of metal reflective surfaces to give a futuristic/tech vibe. But that's one sort of futurism. There'll be plenty of use of natural stone, wood, and tile far far into the future.
The common wisdom is that it's more difficult to make sunny and dry environments look pretty than it is overcast and wet ones. I tend to agree with this based on the end results I've seen over the many years.
That's what I used to think too.. but Spec Ops: The Line is entirely based in desert, even has a shot of sarin horror and while 'pretty' isn't the word I'd use, it is stunning.
It is amusing now that you point it out. There are always trends that come and go in these large-scale industrial artforms. As others point out, in this case it's likely a response to technical advancements and a desire to emphasize them. Another example that comes to mind here is the orangey-sunlit ears that seemed to show up everywhere to show off subsurface scattering.
Thinking back - films also are always doing some new exciting thing all at once. That wild colored lighting aesthetic of the past decade comes to mind. That's a result of refined color correction software and awesome low-cost LED lights. Or drone shots. So many drone shots.
It's usually a group-think phenomenon where everyone was previously unable to do something, and now they can, and everyone wants to try it. And then there are successes and management points at those and yells 'we want that, do that!', and distribution follows, and it becomes mandatory. Until everyone is rolling their eyes and excited about another new thing.
It's a silly phenomenon when you think about it - any true artist-director would likely push back on that with a coherent vision.
"Mann sprayed down the city’s nocturnal streets with tens of thousands of gallons of water, so that they took on an unreal, painterly glow." - New York Times
I feel like this is very much a personal preference thing.
They even called out Horizon Zero Dawn for looking very bad, and Zelda for looking very good.. while in my opinion the exact opposite is true.
I do see the point of the author: HZD goes for a "realistic", high-fidelity 3D fantasy world, yet the lighting makes no sense in physical terms. The contrast and brightness shown in the picture are all over the place, and can only be an artifact of visualising a world through a computer screen which has a very limited dynamic range - it is immersion-breaking. The Resident Evil 7 picture below looks much better. The video I linked in another comment explains why: in the physical world, the stronger the light, the more washed-out the colour will become. HZD is a saturated, high-contrast mess with too much compression in the low light, because of a bad colour mapper in their pipeline.
One can claim HZD's look is an "artistic choice" and that's inarguable, but the author believes it's simply not enough attention to the tone mapping process, which is a very complicated topic that's not usually taken seriously in game dev compared to film production.
To be fair - if I remember the location correctly - that screenshot is somewhat misleading, because its camera position is from the inside of a large ruin, with the ceiling and right wall of the "cave entrance" being just outside the frame.
No, the author posits that Zelda explicitly goes for artistry and ignores any pretense of realism (which then falls flat on its face when using an over-contrasting tone-map like in the HZD screenshot).
The problem I personally have with the Zelda example given is that it looks really bland to me - the landscape looks really washed out - the author says "Somebody would paint this. It’s artistic.", but I don't think anyone would paint with such bleached-out colours.
Oh, I see. I disagree that the original HZD had a pretense of realism though. The remastered version does and well illustrates the uncanny-ness https://www.youtube.com/watch?v=IlWK_ELBW08 . The outrageous god rays, bloom and lens flare in the remaster compensate for that because you can't actually see anything due to them blinding you...
I think with enough exposure to the overdone contrast ratios, you start to get tired of it. It sacrifices a lot of clarity.
I agree it does look good in some cases, for example I enjoy the look of Battlefield 1 a lot, but when playing it I often noticed I had issues seeing detail in darker areas.
One game that actually puts a lot of effort into this is X-plane. They use physics-based rendering, and with recent updates they have done quite a bit of work on this (clouds, atmosphere, natural-looking colors and shadows, HDR, etc.).
There's a stark contrast here with MS Flight Simulator which looks great but maybe a bit too pretty. It's certainly very pleasing to look at but not necessarily realistic.
One thing with flying is that visibility isn't necessarily that good and a big part of using flight simulators professionally is actually learning to fly when the visibility is absolutely terrible. What's the relevance of scenery if visibility is at the legal minimums? You see the ground shortly before you land, a few feet in front of you.
And even under better conditions, things are hazy and flat (both in color and depth). A crisp, high contrast, saturated view is pretty but not what a pilot deals with. A real problem for pilots is actually spotting where the airport is. Which is surprisingly hard even when the weather is nice and sunny.
An interesting HDR challenge with cockpits is that the light level inside and outside are miles apart. When flying in the real world, your eyes compensate for this when you focus on the instruments or look outside. But technically any screenshot that features a bright outside and clearly legible instruments at the same time is not very realistic but also kind of necessary. You need to do some HDR trickery to make that work. Poor readability of instruments is something X-plane addressed in one of their recent updates. It was technically correct but not that readable.
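One hypothetical way to do that "HDR trickery" is to composite two exposures of the same frame, one metered for the outside world and one for the panel, masked by the cockpit geometry. A sketch of the idea (an illustration only, not how X-plane actually implements it):

```python
import numpy as np

def reinhard(x):
    return x / (1.0 + x)

def cockpit_composite(scene, cockpit_mask, outside_gain, panel_gain):
    """Blend two tone-mapped exposures of the same linear HDR frame so both
    the bright outside view and the dim instruments stay legible.
    scene: (H, W, 3) linear radiance; cockpit_mask: (H, W), 1.0 inside the cockpit."""
    outside = reinhard(scene * outside_gain)
    inside = reinhard(scene * panel_gain)
    m = cockpit_mask[..., None]
    return inside * m + outside * (1.0 - m)
```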
X-plane rendering has made some big improvements with all this during the v12 release over the last three years.
I suspect contrast in a lot of the games he's skewering is high because they are shootery type games where players need to see things, understand them, and react to them quickly.
Also I don't necessarily see a need to make everything look like physical film.
This seems pretty irrelevant now. This article is from 2017 which is before we had proper real HDR support in Windows 10 and much better HDR support now in Windows 11.
And before we had OLED gaming monitors which can actually now display good HDR at 1000+ nits.
This was definitely during a transitional phase with mostly fake HDR techniques that needed tone-mapping. Now we have real HDR that doesn't need tone-mapping, or only a small amount of tone-mapping above the display peak nits point.
> And before we had OLED gaming monitors which can actually now display good HDR at 1000+ nits.
It’s worth pointing out these monitors for the most part cannot sustain it, or achieve it at anything other than the smallest window sizes, such as 1-3% at best.
> Now we have real HDR that doesn't need tone-mapping, or only a small amount of tone-mapping above the display peak nits point.
For the reasons outlined above (and other) tone mapping is still heavily required.
It’s worth noting that OLED TVs do a significantly better job at displaying high nits in both percentage of the display and in sustaining it. It’s my hope the monitors eventually catch up because I waited a long time for it to become monitor sized.
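To make the "small amount of tone mapping above the display peak" point concrete, here is a toy shoulder rolloff, assuming scene values in nits and a display that genuinely holds its rated peak (which, as noted above, real OLED monitors often don't):

```python
def rolloff_above_peak(nits, display_peak=1000.0, knee=0.75):
    """Leave scene values alone up to a knee, then compress everything above
    it into the remaining headroom so nothing exceeds display_peak.
    knee is the fraction of display_peak where the shoulder starts."""
    start = knee * display_peak
    if nits <= start:
        return nits
    overshoot = nits - start
    headroom = display_peak - start
    return start + headroom * (overshoot / (overshoot + headroom))
```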
> It’s worth pointing out these monitors for the most part can not sustain it or achieve it at anything other than the smallest possible window sizes, such as the 1-3% window sizes at best.
Sure, but the parts of the image that are anywhere near 1000 nits are usually quite small and are things like muzzle flashes or light fixtures or centers of explosions, or magic effects etc.
> Sure, but the parts of the image that are anywhere near 1000 nits are usually quite small and are things like muzzle flashes or light fixtures or centers of explosions, or magic effects etc.
Sure, but plenty of things are bright enough in combination at varying window sizes that combined the panels have to drop down significantly. So you might get 1000 nits for a muzzle flash but ~200nits at best for a “bright sunny day.”
The problem is way too many people (I’m not suggesting you) don’t realise this and just think they are “getting 1000nits!”
Yes, I own this display and it’s one of the better ones for brightness which is why I grabbed it.
However, even on the latest firmware, it has a bunch of issues, including with colours in HDR unfortunately. It also has incredibly aggressive ABL. Still a great display, but it has more limitations compared to the TVs than I'd like. They'll get there, hopefully, in a few more generations.
When HDR is implemented properly, and you have a proper HDR display, it's such a transformative experience! Most games, however, don't have good HDR implementations. And for whatever reason HDR on Windows is still awful in 2025.
HDR is GREAT! Everyone trying to implement HDR + tone mapping excessively just for the sake of it and exaggerating it to show-off (just like those oversaturated Samsung phone screens) is not.
Yeah. There's been a laundry list of innovations over the years where people invent something, show how it improves how a scene looks, and then for the next few years everyone turns it up to 11 and it looks like shit. Bloom, SSAO, lens flare, film grain, vignetting, DoF.
After a while people turn it back down to like a 4 and it improves things.
For Horizon Zero Dawn I'd argue that the colors are clearly an artistic choice. They're not going for realistic colors at all. And the original game and its sequel do look very, very good.
There do seem to be plenty of issues around HDR for sure, in some games I had to intentionally disable HDR on my PS5 because it just looked bad on my setup.
I was excited when I first heard about HDR but when I saw the implementation I thought: gee, they're going to screw up both the SDR and the HDR and that seems to be the case quite often. Going from SD -> HD your picture got better although it often got stretched out, but it's not so clear the HDR version of a movie is really going to be an improvement.
Note that this post is of course about high internal dynamic range specifically and the necessary tonemapping that then follows for presenting an SDR image, not about how modern games do actual HDR (but then that should be pretty similar on a high level to the extent I understand anyways).
> In the real world, the total contrast ratio between the brightest highlights and darkest shadows during a sunny day is on the order of 1,000,000:1.
And this is of course silly. In the real world you can have complete darkness, at which point dynamic range shoots up to infinity.
> A typical screen can show 8 (curved to 600:1 or so).
Not entirely sure about this either, monitors have been pulling 1000:1 and 2000:1 dynamic ranges since forever, even back in 2017 when this article was written, but maybe I just never looked too deep into it.
The static contrast ratio (1000:1+) you mention is different from effective perceived contrast after tone mapping - manufacturers' specs measure black-to-white in ideal conditions, while tone mapping algorithms must compress real-world luminance ranges (millions:1) into that limited display range while preserving perceptual detail.
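For a concrete sense of that compression, the extended Reinhard operator is one of the simplest ways to squeeze a huge scene-referred ratio into 0..1 for the display (an illustrative choice, not what the article's example games use):

```python
import numpy as np

def extended_reinhard(luminance, white_point=1.0e6):
    """Map scene-referred luminance into 0..1 for the display. white_point is
    the scene value that lands exactly at display white; everything between
    the display's native 600:1-2000:1 range and that point gets progressively
    compressed toward the top of the output range."""
    l = np.asarray(luminance, dtype=np.float64)
    return l * (1.0 + l / white_point**2) / (1.0 + l)
```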
Hmm. I like the author's main point in many video games doing this unrealistically, but there are a few sticking points that are relevant from the past few years:
- The omission of discussing HDR monitors, and how you can't really capture that on a screenshot. This is a game changer, especially with new games and monitors.
- The omission of discussing Unreal 5 games that have come out in the past few years (e.g. Talos Principle 2, Hellblade 2, Stalker 2)
- Not enough examples of games that do it well, with A/B comparisons of similar settings
- The Nintendo screenshot as an example of doing things right isn't working for me.
Another interesting example of lighting done well is Kingdom Come Deliverance 2. The details don't look nearly as nice as, e.g., a UE5 game, and it unfortunately doesn't support monitor HDR, but it has very realistic-looking lighting and scenes.
Yeah, since then there were some games with very natural looking contrast and colors, perhaps most notably Red Dead Redemption 2 (2018). Or, years later, Kingdom Come Deliverance 2 (2025), which you already mentioned. As a negative example: as far as I can tell, Horizon Forbidden West (2022) mostly doubled down on the exaggerated color contrast he criticized in the predecessor.
Concur on Forbidden West having the same problem as HZD that the author mentioned. I remember thinking about the dark/indoor areas with accent lighting, and vegetation in particular compared to similar scenes in Talos Principle 2. (Similar release dates) HZD wasn't in the same league as Talos.
The points are: game graphics are indeed suffering, but the problem is not being unlike films and photos, it's the opposite. Games should stop using tone mapping curves produced by the film industry and instead create their own, making a clean break.
Path of Exile 2 is a good recent example of a game that does a pretty good job with contrast and tone (staying true to the dark, gritty theme it is going for). I think it was smart of the devs to keep all the high contrast to effects and lighting.
I disable it everywhere I can. In Instagram for example. When it is turned on (the default) every now and then I get some crazy glaring image in my feed that hurts.
Maybe it is because I don't play games? Is HDR useful anywhere outside of games?
> ...I get some crazy glaring image in my feed that hurts.
Are you using an Apple machine to do your browsing? I have heard that Apple has (for some damn reason) decided to do this sort of crap with HDR-pictures-in-an-otherwise-SDR-document. It's nuts. This doesn't happen to me on Windows, and -because I use xorg- I've no idea what happens on Linux.
The screenshot of "Zelda: Breath of the Wild " the author holds up as an exemplar looks unrealistically tone mapped to me. The "bad" screenshots in the lede look more natural and pleasing. Not realistic -- they're too stylized for my taste -- but the Zelda screenshot is simply unrealistic in a different direction.
This is an example of a phenomenon I've seen many times on the internet:
- Person has a critique of certain media (books, authors, games etc). They are valid critiques.
- You ask what the person thinks is an example of media that doesn't have this problem, or the media they like.
- The examples given are not in the same league, or do the one thing better, and many other aspects poorly.
> The exposure level is also noticeably lower, which actually leaves room for better mid-tone saturation.
Decades ago, when I shot film, I remember discovering that I really liked how photos looked when underexposed by half a stop or so. I never knew why (and I wasn’t developing my own film, so I’ve no idea what the processor may have been doing), but I wonder if this was a contributing factor.
This is apparently an unpopular opinion, but in many games (fantasy RPGs come to mind), I like the fake look. It helps it look other-worldly, IMO. I think for something like Flight Sim, I’d prefer photorealism, but otherwise I’m fine with it looking like, well, a video game.
It might be a generational thing, too; I was born in the late 80s, and my formative years were spent playing cartoonish games like Commander Keen, Command & Conquer, etc.
> But all of them feel videogamey and none of them would pass for a film or a photograph. Or even a reasonably good offline render. Or a painting. They are instantly recognizable as video games, because only video games try to pass off these trashy contrast curves as aesthetically pleasing.
Author is fumbling the difference between aesthetics and realism. Videogames feeling videogamey? What a travesty.
It apparently took Mozilla a couple decades to allow displays to present #ff0000 as sRGB red correctly mapped into the display’s LUT, rather than as (100%, 0%, 0%) in the display’s native LUT, which is why for several years anyone using Firefox on a ProPhoto or Adobe RGB or, later, DCI-P3 or BT.2020 display would get eye-searing colors from the web that made you flinch and develop a migraine. It was, I assume, decided that the improper tone mapping curve gave their version of the web more lifelike color saturation than other browsers — at least on their majority platform Windows, which lacked simple and reasonable color management for non-professional users until Windows 11. So Firefox looked brighter, flashier on every shitty Windows display in the world, and since displays were barely capable of better than sRGB, that was good.
Unfortunately, this also meant that Firefox gave eyestrain headaches to every design professional in the world, because our pro color displays had so much more eye-stabbing color and brightness capability than everyone else’s. It sucked, we looked up the hidden preference that could have been flipped to render color correctly at any time, and it was tolerable.
Then Apple standardized DCI-P3 displays on their phones and tablets, where WebKit did the right thing — and on laptops and desktops, where Firefox did not. Safari wasn't good enough yet back then to earn conversions, though certainly it is now, and when people tried to switch from Firefox the colors looked washed out and bland next to that native display punch. So everyone who surfed the web and suffered through a bad LUT experience — literally, Firefox was jamming 100% phosphor brightness into monitors well in excess of sRGB's specified luminosity range — figured that Apple's displays were too bright, and coped by dimming their displays and complaining about Apple.
And one day, Chrome showed up; faster, lighter, and most critically, not migraine inducing. The first two advantages drew people in; the third made them feel better physically.
Designers, professionals, everyone who already had wide color monitors, and then also students, would have eventually discovered (perhaps without ever realizing it!) that with Chrome (and with Safari, if they'd put up with it), they didn't have to dim their monitors, because color wasn't forcibly oversaturated on phosphors that could, at minimum, emit 50% higher nits than the old sRGB-era displays. The web didn't cause eye strain and headaches anymore.
Firefox must have lost an entire generation of students in a year flat — along with everyone in web design, photography, and marketing who could possibly switch. Sure, Chrome was slightly better at the time; but once people got used to normal sRGB colors again, they couldn't switch back to Firefox without everything being garish and bright, and so if they wished to leave Chrome they'd exit to Safari or Opera instead.
I assume that the only reason Firefox finally fixed this was that CSS forcibly engraved into the color v3 specification a few years ago that, unless otherwise hinted, #ff0000 is in the sRGB color space and must be rendered as such. Which would have left them no room to argue; and so Firefox finally, far too late to regain its lost web designer proponents, switched the default.
As the article describes, Nintendo understands this lesson fully, and chose to ship Zelda with artistic color that renders beautifully on any crap TV display you care to assume, rather than chasing the contrast- and saturation-maximizing combination of brighter-than-sRGB and more-saturated-than-sRGB that TV manufacturers call HDR. One need only look at a Best Buy TV wall to understand: every TV is blowing out the maximum saturation and brightness possible, all peacocks with their plumage flashing as brightly as they can, in the hopes of attracting another purchase. Nintendo’s behaviors suck in a lot of ways, but their artistic output understands perfectly how to be beautiful and compelling without resorting to the Firefox approach.
(Incidentally, this is also why any site using #rrggbb looks last-century when embedded in, or shown next to, one designed using CSS color(..) clauses. It isn’t anything obvious, but once you know how to see it, it’s like the difference between 256-color ANSI and 24-bit truecolor ANSI. They’re not RGB hex codes; they’re sRGB hex codes.)
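(For anyone wondering what "correctly mapped into the display's LUT" actually means in code, here's a rough sketch. The sRGB-to-XYZ matrix is the standard one; the XYZ-to-display matrix is just a placeholder for whatever the monitor's ICC profile or EDID primaries would provide, and the function names are mine.)

    struct Vec3 { float x, y, z; };

    static Vec3 mul(const float m[3][3], Vec3 v) {
        return { m[0][0] * v.x + m[0][1] * v.y + m[0][2] * v.z,
                 m[1][0] * v.x + m[1][1] * v.y + m[1][2] * v.z,
                 m[2][0] * v.x + m[2][1] * v.y + m[2][2] * v.z };
    }

    // Standard linear-sRGB -> XYZ matrix (IEC 61966-2-1).
    static const float SRGB_TO_XYZ[3][3] = {
        { 0.4124f, 0.3576f, 0.1805f },
        { 0.2126f, 0.7152f, 0.0722f },
        { 0.0193f, 0.1192f, 0.9505f },
    };

    // Correct: interpret #ff0000 as sRGB, then map through XYZ into the panel's own space.
    Vec3 srgbToDisplay(Vec3 linearSrgb, const float xyzToDisplay[3][3]) {
        return mul(xyzToDisplay, mul(SRGB_TO_XYZ, linearSrgb));
    }
    // What old Firefox effectively did: hand (1, 0, 0) straight to the wider native
    // primaries, skipping both matrices, giving a far more saturated red than intended.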
The article lacks examples of what "done right" means. It points to some videogames that "do it terribly" (they look OK to me? not photorealistic, but not every videogame has to be like that?) but it never shows what a "correct" version of each image would look like. It just says "it looks obviously bad". Sorry, but I don't see it. I'm fine with videogames looking videogamey.
This article is just misinformed. Source: I’ve been working with color space conversion, HDR tone mapping, gamut mapping and “film look” for 20 years.
It’s clear from their critique of the first screenshots that their problem is not with HDR, but contrast levels. Contrast is a color grading decision totally separate from HDR tonemapping.
There’s then a digression about RED and Arri that is incorrect. Even their earliest cameras shot RAW and could be color matched against each other.
Then they assert that tone mapping is hampered by being a 1D curve, but this is more or less exactly how film works. AAA games often come up with their own curves rather than using stock curves like Hable or ACES, and I would assume that they’re often combined with 3D LUTs for “look” in order to reduce lookups.
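For concreteness, here is roughly what one of those stock 1D curves looks like. This is John Hable's published "Uncharted 2" filmic operator applied per channel; the A-F constants and the 11.2 white point are his example numbers, and the function names are mine.

    // John Hable's filmic tone curve, applied independently to each channel.
    // A..F are his published shoulder/toe constants; whitePoint is the linear white.
    float hablePartial(float x) {
        const float A = 0.15f, B = 0.50f, C = 0.10f,
                    D = 0.20f, E = 0.02f, F = 0.30f;
        return ((x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F)) - E / F;
    }

    float hableTonemap(float linearValue, float exposure /* e.g. 2.0f */, float whitePoint /* e.g. 11.2f */) {
        float curved = hablePartial(linearValue * exposure);
        return curved / hablePartial(whitePoint); // normalize so whitePoint maps to 1.0
    }

A studio rolling its own curve is essentially reshaping that one function; the "look" LUT mentioned above is then composed on top of it, or folded into it.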
The author is right about digital still cameras doing a very good job mapping the HDR sensor data to SDR images like JPEGs. The big camera companies have to balance “accuracy” and making the image “pleasing,” and that’s what photographers commonly call their “color science.” Really good gamut mapping is part of that secret sauce. However, part of what looks pleasing is that these are high contrast transforms, which is exactly what the author seems to not like.
They say “we don’t have the technical capability to run real film industry LUTs in the correct color spaces,” which is just factually incorrect. Color grading software and AAA games use the same GPUs and shader languages. A full ACES workflow would be overkill (not too heavy, just unnecessarily flexible) for a game, because you can do your full-on cinema color grading on your game and then bake it into a 3D LUT that very accurately captures the look.
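That baking step is conceptually tiny. The names below are made up, but the idea is just to evaluate the finished grade over a small lattice and ship the lattice; in the game, the whole chain then collapses to one trilinear texture fetch.

    #include <functional>
    #include <vector>

    struct RGB { float r, g, b; };

    // Evaluate an arbitrary grading transform over a 32^3 lattice. A real pipeline
    // would export the result as a .cube file or a 3D texture for the game to sample.
    std::vector<RGB> bakeLut3D(const std::function<RGB(RGB)>& gradeFn, int size = 32) {
        std::vector<RGB> lut(static_cast<size_t>(size) * size * size);
        for (int b = 0; b < size; ++b)
            for (int g = 0; g < size; ++g)
                for (int r = 0; r < size; ++r) {
                    RGB in { r / float(size - 1), g / float(size - 1), b / float(size - 1) };
                    lut[(static_cast<size_t>(b) * size + g) * size + r] = gradeFn(in);
                }
        return lut;
    }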
The author then shows a screenshot of Breath of the Wild, which I’m nearly positive uses a global tonemap—it just might not do a lot of dynamic exposure adjustment.
Then they evaluate a few more images before praising a Forza image for being low contrast, which again, has nothing to do with HDR and everything to do with color grading.
Ultimately, the author is right that this is about aesthetics. Unfortunately, there’s no accounting for taste. But a game’s “look” is far more involved than just the use of HDR or tone mapping.
It's not just games, it's regular day-to-day UI too. I'm using an Acer 185Hz VRR HDR10 Gaming monitor.. on Eco mode with HDR disabled. Everything just looks better with HDR turned off for some reason I can't explain.
That's normal. For HDR to look good, you need a monitor that hits approximately 1000 nits in brightness. Your monitor only hits 250, which is completely insufficient to display HDR content.
This is one of the stupid things with many monitors, showing HDR at 250 nits is worse than showing no HDR at all. So no matter what you do, 99% of HDR content will look bad on your screen.
I agree that 250 nits is too low, but my monitor clocks in at 400 and HDR already looks better, if only thanks to the increased colour channel resolution - particularly visible in highlights, clouds etc. Where there previously was just a single colour blob I now can observe details impossible to display with just eight bits per channel.
Interestingly my laptop's display reaches 500 nits and that is already painfully high outside of midday hours. My phone goes to 875 and I find that only to be useful outside in the summer sun.
The difference is between SDR and HDR.
Going full blast with the whole image at 500 nits and having an image that averages 200 nits with only peaks at 500 are two vastly different things.
I have a C3 OLED and everything also looks better with HDR off.
Games are just truly terrible about this, making scenes completely unviewable even when the HDR areas, the blacks and whites, have interactive elements in them you need to see and know about.
I have a C4 OLED and I thought what you said was also true for me until I figured out what settings I needed to change on my TV to match my console (Nintendo Switch 2). Had to turn on HGiG, manually adjust the peak brightness level on the console itself, and suddenly things looked great.
Not that many games on the console that take advantage of it, mind you. More testing needed.
> For HDR to look good, you need a monitor that hits approximately 1000 nits in brightness.
I disagree. The wide color gamut is -for me- a huge thing about HDR. My VA monitor provides ~300 nits of brightness and I've been quite happy with the games that didn't phone in their HDR implementation.
Plus, any non-trash HDR monitor will tell the computer it's attached to what its maximum possible brightness is, so the software running on that computer can adjust its renderer accordingly.
> Plus, any non-trash HDR monitor will tell the computer it's attached to what its maximum possible brightness is, so the software running on that computer can adjust its renderer accordingly.
My monitor does do that, but alas the software itself (Windows 10) wasn't good enough to adjust things correctly. It did make the decision to switch to ArchLinux easier by being one less thing I'll be missing.
VA screens have pretty damn good contrast, and OLED monitors tend to have low peak (and sometimes even spot!) brightness.
A while back, I tried an OLED gaming monitor that was widely reviewed as being very good. While it was somewhat better than the VA monitor that I've been using for years, it was nowhere near 1,500 USD good. I could see someone coming from an IPS or TN screen being very impressed with it, though.
VA screens have terrible black smearing though. I also bought an OLED display and returned it because it was just very dim. I own a mini-LED display that peaks at 1000 cd/m² full screen (it has a fan to handle the heat) and I'm still looking for an OLED replacement.
Something is poorly implemented in the Windows UI when HDR is on. On MacBooks it all looks fine: HDR content just appears brighter, and I think the rest of the UI gets slightly duller at that point. On Windows, though, enabling HDR on the desktop makes the whole screen look dull, at least it does on my 5K HDR Dell.
Not sure if I'm missing a setting, but I end up having to manually turn HDR on before playing a game and off after.
As someone who worked a lot in realistic VFX, I concur with the observation that nearly no game does tone mapping right, and my guess as to why has always been that doing it right is just very complex.
There are many, many things artists need to get right, and many of those artists have no view of the whole pipeline. Say someone creates a scene with a tree in it. What is the correct brightness, saturation and gamma of that tree's texture? And if that isn't correct, how could the lighting artist correctly set the light? And if the texture and the light are wrong, even the correct tone map will look like shit.
My experience is that you need to do everything right for a good tonemap to look realistic, and that means working like a scientist and having an idea of the underlying physical formulae and the way they have been implemented digitally. That is sadly something not many productions manage to pull off. But if you do pull it off, everything pops into place.
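A toy version of the ordering I mean, to make the point concrete: the sRGB transfer functions are the standard ones, everything else is deliberately simplified, with a bare Reinhard curve standing in for a real tonemap.

    #include <cmath>

    // Standard sRGB <-> linear transfer functions (IEC 61966-2-1).
    float srgbToLinear(float c) {
        return (c <= 0.04045f) ? c / 12.92f : std::pow((c + 0.055f) / 1.055f, 2.4f);
    }
    float linearToSrgb(float c) {
        return (c <= 0.0031308f) ? c * 12.92f : 1.055f * std::pow(c, 1.0f / 2.4f) - 0.055f;
    }

    // Decode the authored texture to linear light, do the lighting math there,
    // tone map once at the very end, then re-encode for the display. If the texture
    // was authored "by eye" at the wrong brightness or gamma, every later stage
    // inherits that error, which is exactly the problem described above.
    float shadeChannel(float albedoSrgb, float lightIntensity) {
        float albedo   = srgbToLinear(albedoSrgb);
        float radiance = albedo * lightIntensity;      // toy diffuse lighting
        float mapped   = radiance / (1.0f + radiance); // Reinhard stand-in for a real tonemap
        return linearToSrgb(mapped);
    }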
The added complication with games is of course that you can't just optimize the light for one money shot; it needs to look good from all directions. That makes it hard to match the look of a film shot, because pushing that far risks making it look like crap from other directions, which studios aren't willing to accept.
The dragon in The Hobbit isn't just about the tonemapping, it is at least as much (if not more so) a lighting issue. But the two can influence each other in a bad way.
All of the "bad" examples look like they're being played on a PC with poorly set gamma curves. Play on a TV where the curves are set up properly, because TV people actually care about color reproduction.
I skipped the text and looked at the images and was unable to understand if they were supposed to be bad or good examples. I liked them. Then k read through the text and learned that they are supposed to be bad examples.
But why though? I suspect that either I am not good at this kind of thing, or this is a purist thing, like „don’t put pineapples on pizza because they don’t do that in Italy“.
I don’t want games to look realistic. A rainy day outside looks gray and drab, there is nothing wrong with rainy days in games not looking like the real thing, but awesome and full of contrasts.
>I don’t want games to look realistic. A rainy day outside looks gray and drab, there is nothing wrong with rainy days in games not looking like the real thing, but awesome and full of contrasts.
In photography and cinematography contrast and color curves are near ubiquitously modified artistically to evoke a certain feeling. So even without 3D renderings added colors are adjusted for aesthetic over raw realism.
I totally agree. The example pictures in the article look fine.
I don't know what the author wants, but perhaps it's some kind of industry insider view similar to where "true artists' make movies that are so dark you can't see anything, and the dialog is quiet mumbling and the sound effects are ear-shattering. Perhaps there's an equivalent to that in games.
I can see why people wouldn't like them - they're all oversaturated and most of them go for the cheesy "everything is teal and orange" or "everything is piss yellow" gradings. There's a quote I heard in a photography tutorial once that goes something like "once you've moved all the sliders to what you want, move them back 50%", and games basically don't that.
But the biggest problem with the screenshots is they literally aren't HDR. So how can we judge their HDR?
> I don't know what the author wants
If only they had written an article about what they wanted...
> but perhaps it's some kind of industry insider view similar to where "true artists' make movies that are so dark you can't see anything
Nope, it's not that.
Real life has a lot of sensations that games don't. A rainy/foggy day might look boring, but it feels nice to be out in (ideally). Well, that and computer audio is/can be about as good as humans can perceive, but displays are nowhere near it.
So both of these mean you have to jack up the sensation so people can feel something.
From what I understood is that these are supposedly bad because they look like video games instead of photographs. Not sure what the problem with that is though. I'm fine with video games looking like video games.
Reminds me of how movies / shows these days have gotten so dark, when in the past even dark scenes were often lit in such a way as to show details.
I truly don’t understand the author’s opinions about contrast here. The RE7 image is the only one here that looks ‘realistic’, and at a glance could be mistaken for a photograph, and he says it’s got way too much contrast.
No other image here comes anywhere even close, definitely not Zelda nor GTA5.
Personally I think the whole problem with the first 5 images is that they don’t have enough contrast, and they have too much detail. The color handling isn’t the only reason they don’t look realistic, but making sure every single pixel’s nicely exposed and that nothing gets too dark or too bright is allowing to let all the CG fakeness show through. One of the reasons the RE7 image looks better is you can’t clearly see every single thing in the image.
If you take photographs outside and the sun is in the shot, you will absolutely get some blown out white and some foreground blacks, and that’s realism. The CG here is trying too hard to squeeze all the color into the visible range. To my eyes, it’s too flat and too low contrast, not too high contrast.
> definitely not Zelda nor GTA5.
The zelda screenshot he uses as an example of how good things look without HDR, looks terrible to me. It is all washed out with brightness and bloom, and all the shadows in the landscape that in reality would almolst be black, are very light grey.
His argument is that it looks like something someone would paint and I quite agree with that.
I agree, it is washed out, and I was trying to find what exactly in the image the author really liked, but all I saw was a faded postcard.
For me games being too dark and not being able to see anything is a pet peeve. I can see the point in a horror game, but I will set the gamma or turn up the brightness if it makes the game hard to play.
Oh I agree. The art director needs to be exposing the important gameplay elements to be visible. That doesn’t mean they should avoid blacks for everything though, and that’s what all images except the RE7 image are doing.
Off the top of my head, Destiny 2 is the biggest offender in this category. If I can't see shit at all, how does the feeling the artist is trying to convey even matter? I will just turn the brightness in the graphics card settings all the way up, because the cap in the in-game setting is insanely low.
Plus, it's not even a horror game. Come on, you're a shooter. How does a shooter where you can't see anything even make sense?
> The RE7 image is the only one here that looks ‘realistic’, and at a glance could be mistaken for a photograph, and he says it’s got way too much contrast.
It looks like a cheap film camera or a home video screenshot. So it gives off a feeling of nostalgia to a sufficiently old person, but this is also the kind of photo you'd reject as a pro, because it's totally overexposed.
One problem with photorealism is a lot of players are on bad displays, or in bad viewing environments. Games often take this into account in their visual direction so that they will be more legible in these different environments. It used to be even worse when designing a game for say the Gameboy Advance or original Nintendo DS where you knew the screen wasn't backlit or wasn't particularly bright so your images needed to be bright and colorful. Even now, a Nintendo Switch game might be played on the bus.
For big budget games the solution for this is typically to have brightness calibration when the game first boots up, but the game itself still needs to be designed adaptively so that it's not Too Dark or Too Bright at critical points, otherwise the playability of the title is jeopardized. This runs counter to a goal of photorealism.
I made thermal prints (receipt printer) of concept art from Pokémon Sun and Moon for the Nintendo 3DS and Switch, like this one
https://safebooru.org/index.php?page=post&s=view&id=1821741
and found they did really well, because the art was designed to look good on bad screens and in poor viewing conditions. I think of it in terms of Ansel Adams's Zone System, in that the ideal image is (1) legible if you quantize it to 11 tones of grey (looks OK printed in the newspaper), but (2) has meaningful detail in most or all of those zones.
I'm kinda disappointed that the Nintendo 3DS version didn't use the stereo effects, but they would have had to decide whether her hair forms a sheet or a cone.
It's not about being realistic but good looking.
Okay, the only image that looks “good” to me in terms of color handling is the RE7 image.
Zelda is gorgeous.
You're arguing that game engines should imitate photographic cameras, but they should imitate our eyes, which will never blow out whites outside in the sun.
Our eyes absolutely blow out whites in the sun. Doubly so when looking at the sun, or even at reflections immediately after being in the dark for a while, and when looking at something bright that sits right next to something dark in your visual field.
I’m not necessarily arguing games should imitate cameras, I really only think over-compressing the dynamic range is bad, and I don’t understand why the author is arguing for that.
> Our eyes absolutely blow out whites in the sun.
Do you have a new technique to decode eye-brain perception in terms of how we perceive visual signals? Do you have a paper indicating how you make this claim for everyone?
Do you really need a paper? It’s well known that looking at the sun does damage to rods and cones, because it far exceeds their response range, long before perception gets involved.
'In the sun' != 'at the sun'
And you completely miss what I'm asking too.
Chemical reactions in the rods and cones are only a small portion of vision processing. The rest is in the brain, with a great deal of various processing happening, that eventually comes to cognition and understanding what you see.
And parts of the visual cognition system also synthesize and hallucinate visual content, like filling in the blind spot where the optic nerve meets the eye. But cognitively, the data is there, smeared across time and space (as in a SLAM algo putting the data where it should go, not just what is measured).
What, exactly, is relevant about the perception and cognition systems if the signal from rods and cones is clipped or distorted? By ‘blown out’ we are talking about the rods and cones being saturated and unable to respond meaningfully. Your question doesn’t make sense, and I’m neither making claims about nor arguing over what happens in the perceptual system to bad/saturated inputs.
I don’t know what you mean by ‘in the sun’ != ‘at the sun’. I’m the one who said ‘in the sun’ and I was talking about staring at the sun. I’m not sure what your point is, but if you’re trying to say that a game render of looking at the sun is different than the experience of actually looking at the sun, then I wholly agree. A game will (rightly and thankfully) never fully recreate the experience of looking at the sun. If you’re trying to defend &carlosjobim’s claim that human vision doesn’t have an absolute upper luminance limit, then I think you need to back that claim up with some evidence.
Woah, the sun is bright? How do you know this is true for everyone? Do you have a peer reviewed RCT paper posted in a high impact journal confirming this?
It's a cloudy day here and I'm in my rather dimly lit office. If I look out the window, it is no problem to see the clouds in all their detail, and I don't lose any detail in the darker environment of the room. A camera will either blow out the entire sky outside the window to capture the details in the room - or make the room entirely black to capture the details of the sky through the window.
I mean, most people reading our comment thread here have their smart phone by their side and can instantly verify that eyes do not blow out whites or compress blacks like a camera. The dynamic range of our eyes is vastly superior to cameras. So aiming to imitate cameras is a mistake by game developers.
Of course, staring straight into the sun or a very bright light or reflection is a different matter.
The first three pictures in the article have direct sun visible in the sky and not clipping. I was referring to that. The sun itself does blow out when you look directly at it, but please don’t spend time staring at the sun as it will damage your eyes.
The dynamic range of human eyes is not vastly superior to cameras. Look it up, or measure. It’s easy to feel like eyes have more range because of adaptation, foveation, iris, etc.
Again, I didn’t argue that games should imitate cameras. But that would be better than what we have in games; movies look way better than the game screenshots in this article.
One big issue I've never understood is why we need photorealism in games at all. It seems to benefit card manufacturers and graphics programmers, but other than that I feel it adds nothing — and in fact may have a negative impact on game quality.
Photorealism is a bad idea if your movement engine isn't good enough to handle the character walking around on uneven terrain. For racing games or flight simulators or such it is less of a problem, but seeing a regular person being absolutely flummoxed by a knee-high wall is massively immersion breaking.
It's something I really noticed when playing Disaster Report 4, where the people look amazingly realistic but some restrictions are clearly just 'developers didn't make this bit walkable'.
> For racing games or flight simulators or such it is less of a problem,
Cars are also easier to make photorealistic. Less uncanny valley effect, lots of flat shiny surfaces.
What absolutely breaks immersion for me in most AAA car games is the absolute lack of crash, scratch, and dirt mechanics. Cars racing around the track for 2 hours don’t look like showroom pieces! Make ‘em dirty darn it. And when I crash into a wall …
I’m really excited to try Wreckfest 2 when I get around to it. Arcade-ish driving, not super photorealistic, they put it all on realistic soft body collision physics instead.
I seem to recall hearing that car manufacturers only allow their vehicles to be licensed for use in games if they won't really get visually damaged. Kinda funny to see cars just bounce off each other in Gran Turismo. But rally games tend to be better at that (I may have lost a door or two (or a few dozen, but who's counting) in WRC).
You might like BeamNG.drive. It has soft-body physics simulation (also for driving dynamics, so it's not arcadey) and decent graphics. It's more like a sandbox with half-done "actual game" mods AFAIU, but happens to be quite popular and very highly rated anyway. I'm on the fence about buying it myself.
Colin McRae Rally 2 and 2005 did it fine for their era. What CMR2 did was incredible; the damage was very real.
I had a great time recently on my first trip to a racetrack, and the most surprising thing to me was how all the cars were utterly beat to shit. Not like in a bad way, but in like... a sports gear way? They were all working (well, mostly, one guy had a real bad time on his second lap and I'm pretty sure his engine was DONE) but the panels were quite battered, and a number had full on body damage I'm assuming from track contact.
And granted this was an amateur race day, just weekenders having a good time, but it makes sense when you think about it: if the body panels aren't like falling off and are just a bit beat up... why replace them? Especially on some of these cars (late model Corvettes and Mustangs) they don't come cheap at all, and they'll require refinishing and you have to do your livery over again too.
Like a hockey player doesn't buy a new helmet every time they get hit, they/the team would be broke before the season was out.
I think it's like porn. Not sure about you guys, but for me soft-core always looks better than HD hardcore. Soft-core encourages imagination and conveniently covers any body part that is a bit far from perfect.
And that's why I always think ladies who wear just enough clothes are way more sexy than nude ladies.
Hopefully this doesn't offend anyone.
I get it, I prefer seeing two bears be tender and affectionate rather than just 'bend over and spell run'.
This is true in Wukong too, which is otherwise a very good-looking game. There are various points where rocks and scaffolds look just as climbable as those in the game area, yet the game engine places an invisible wall in your way. It breaks immersion instantly.
I think it's more that they didn't have the design language to mark those inaccessible parts of the world as "boring" and prevent the player from wanting to walk into that invisible wall in the first place. Or they place the invisible wall 1m in front of a real wall for NO REASON.
While also expecting you to go around searching for hidden goodies and secret paths.
I swear, the invisible walls are the only thing pushing it to a 9/10 from a 10/10 for me.
We don't need photorealism in games, but it does help with immersion. Many people, like me, feel like they are inside the game world, rather than playing a game with a TV/monitor in front of them. Photorealism is essential for this feeling - at least for me .
The most amazing gaming experience I've ever had was walking around the city at night in Cyberpunk 2077. For the first time in my life, I felt I was actually in the future. Zelda can't pull that off with me, despite being a great game from other perspectives.
I find this an interesting argument. I wonder if it's a generational thing.
If we define immersion as "your vision focuses on what's inside the screen and you ignore the world around the screen, and you mostly ignore that your control of the player character is through a keyboard and mouse", then I've experienced immersion with every first person game ever, including Minecraft. I never considered that some people might need photorealism for that at all. There was another commenter that mentioned being unable to walk over a short wall due to character controller limitations as being immersion-breaking. I agree this is annoying but the qualia of it is more like a physical confusion rather than being something that actually breaks my experience of the game.
I'm also thinking this might be related to why I find VR to be, while very cool, not some revolutionary new technology that will fundamentally change the world.
> VR to be, while very cool, not some revolutionary new technology
VR despite its limitations is the one thing I’ve ever achieved “presence” in, as in feeling if for a brief moment, I was actually there.
Elite dangerous, OLED Unit, HOTAS. For a brief moment in time my brain believed it was in the cockpit of a spaceship.
I had a similar experience in a game meant to simulate regular city car driving.
Most relevant to this comment thread, however, is the fact that the graphics were very crude, and not in a good way. I absolutely dispute the claim that realism equals immersion/presence (I'm not getting involved in the debate about the distinction between the two).
I’d argue that immersion has little to do with graphics, even for FPS games. Actually I had more immersion in some text adventure games than in some AAA games — and not out of nostalgia, because I had never played said text adventure games before.
I’d agree that a certain degree of graphical fidelity helps with immersion, but photorealistic graphics only offer a cheap immersion which turns off the immersion centre in the brain — OK, this is just my babble, so 100% a guess.
Agreed. Immersion in a game world, at least for me, is less about how accurately it visually reflects reality and more about how detailed the overall world feels -- whether the designers have crafted worlds that feel like they live and breathe without you, that you could imagine inhabiting as someone other than the protagonist. For instance, I can imagine what it would be like to live in Cyberpunk 2077's Night City, whether I was a merc like V or just one of the nobodies trying to get by that you pass on the street; I can imagine living in Dishonored's Dunwall (or the sequel's Karnaca) in the chaos and uncertainty of their plagues; I can put myself in the shoes of one of the faceless, downtrodden members of the proletariat of Coalition-occupied Revachol in Disco Elysium; a lot of AAA games, on the other hand, feel like theme park rides--well-crafted experiences that are enjoyable but don't stick with you and discourage you from thinking too deeply about them because they don't withstand much scrutiny. But Cyberpunk 2077 is evidence that they don't have to be that way, and Dishonored and Disco Elysium are equally evidence that you don't need a half-billion-dollar budget and photorealistic graphics to create immersive worlds.
(edited to clarify that I'm not laboring under the misapprehension that Cyberpunk 2077 isn't a AAA game)
I recall a paper from GDC many years back that studied the perception of immersion and they measured and ranked maybe a dozen different factors. Graphics and visuals were surprisingly low on the list. The number one thing was the player’s sense of identity and clear understanding of their goals. Players tended to correlate realism with high immersion too.
Oh that sounds really interesting, I’d like to track it down.
Was it this one? https://www.gdcvault.com/play/1015464/Attention-Not-Immersio...
That’s definitely in the same realm, but not the one I was thinking of. I believe I’m thinking of something maybe 10 years earlier; it had multiple authors, at least one woman, and some of the authors were psychology researchers who were into games. I wouldn’t be surprised if this is a theme and avenue of research that has come up at GDC over many years.
I bought Cyberpunk when it released, I may have even pre-ordered, I don't remember. I played about 20 minutes after the title drop, you know the one. It was buggy, and didn't really look that good to me on my Samsung 4K monitor.
I then played it again, on the same monitor, last year, and I was pleased with the gameplay, but again, I didn't find anything that remarkable about the overall graphics. The fidelity was great, especially at distance, due to 4K.
I'm 50 hours deep in, literally as I type this (about to launch the game), and this time, this time it is completely different. I have an LG 2K HDR screen with "Smart HDR" and I finally - finally - get it. Your eyes have to adjust just like in real life, going from dark indoors to bright outdoors. You can see the tail-lights and headlights of NPCs driving around in the mountains. Lasers sweeping over you are menacing.
Even Fallout 4, which was the first game I played in 4K 10 years ago, looks easily 10 times better in HDR. And I only have the "vanilla+" mod set, 5GB of mods, not the 105GB modset.
I coined a phrase 4 or 5 years ago, that HDR stood for: Hot Damn, Reds! And really, reds are still my least favorite part, they burn too deeply, but from watching several movies on an HDR 4K TV and being really unimpressed, to just these two games, my entire viewpoint has drastically changed.
I didn't know you could put arbitrary people into photo mode in CP2077, and also pose them and move them around, so I was just entering photo mode as best I could and lighting and fiddling with the curves; however, these all took over 4 seconds to "render" to the final image, which I found interesting: https://imgur.com/a/DTesuhF
You're not alone, Cyberpunk's blend of near-future with realism whilst maintaining a clear art style that is not total realism is very immersive. I have spent countless hours wandering around Night City, not even playing the main gameplay.
CP2077 was the game I drove most carefully in when not on a mission, just coz it felt right that V wouldn't be hooning around his home turf. The immersion was incredible.
There's something about the image quality of Cyberpunk that looks off to me, and I can't quite put my finger on it. Maybe the hair rendering? Shadowing?
It's clearly going for photo realism, but it somehow looks worse to me than older, lower-fidelity games.
DLSS really messes with the realism; however, for actual gameplay it's less annoying to me than I thought it would be, based on such games as Diablo IV and others in that cohort. If you want maximum quality, don't let an AI draw what the developers (artists) intended, just draw what the developers intended. I replied to a sibling comment with 4 photo mode screenshots, and you can see that there's a lot of variation in environment lighting, and all of the ambient light is pre-arranged by the design team and developers. In CP2077 a lot of quests are "go to <location> at dusk/dawn/night/noon, or between x and y time", because they want the scene to be cinematic, and it shows. Harsh fluorescent lighting on scenes with a doctor; muted, hazy interactions with a shady character or a scene with emotional turmoil; long shadows and lots of reds at the end of a story arc.
It really feels like they put so much work into how everything looks in the primary and secondary stories.
I can agree, though, that just "jobbing" it looks more like a run-of-the-mill shooter.
> We don't need photorealism in games, but it does help with immersion.
This is a blanket statement I would disagree with.
> Many people, like me, feel like they are inside the game world, rather than playing a game with a TV/monitor in front of them
I can't disagree with a statement about personal preference.
So which is it?
Out of curiosity do you not get immersed in books?
I do, but not like cyberpunk. I like to both read and watch movies, but I feel a lot more immersed with images than I do with words. It's not a binary rating (immersed vs not immersed), it's a gradient that makes things resonate more strongly with photorealism.
This is one reason, I believe, why some people can't stand animated cartoons. I like them but I know many people who won't even consider watching animation.
You can get immersed in anything. With games or VR, realism adds an extra depth of immersion when your brain switches to thinking the same way it does in the real world, rather than adapting to the physics or terrain of a fake world.
I think, like polygon count, resolution, FPS, etc, realism is very easy to objectively assess and compare even with no artistic background, which makes it a target both for gamers (who want to explain why they like a game, or debate which game is better) and studios who want something they can point to.
IMO it leads to really stilted experiences, like where now you have some photorealistic person with their foot hovering slightly in space, or all that but you still see leaves clipping through each other, or the uncanny valley of a super realistic human whose eyes have a robotic lock on your face, etc.
Physical interaction with game worlds (WASD and a single pivot, or maybe a joystick and a couple of buttons) hasn't increased in depth in 20 years, which only emphasizes the disjointedness.
I totally agree with your last paragraph except to add: there has actually been some great advances in interaction, but people vote with their playtime, and I think the reality is that the "median gamer" is totally content with WASD + mouse/the typical controller thumbstick movement. In the same way that so many are content that many game mechanics boil down to combat and health bars.
I am personally not content with that and I explore all I can, and am trying to make games that skirt the trends a little bit.
But that stark contrast between visual fidelity but a lack of interactivity has been a pet peeve of mine for a while. You can even do so much more with just mouse and keyboard interactions, but I think it's overshadowed by the much lower risk visual fidelity goals.
A large section of the gaming public sees photo-realistic games as serious, and prefers them for high-budget games. It's a rat race for devs though - it's just incredibly expensive to create high quality models, textures, and maps.
I've been playing Cyberpunk 2077, and while the graphics are great, it's clear they could do more in the visual realm. It doesn't use current-gen hardware to the maximum, in every way, because they also targeted last-gen consoles. I'm thinking in particular of the PS5's incredibly fast I/O engine with specialized decompression hardware. In a game like Ratchet & Clank: Rift Apart, that hardware is used to jump you through multiple worlds incredibly quickly, loading a miraculous amount of assets. In Cyberpunk, you still have to wait around in elevators, which seem like diegetic loading screens.
And also the general clunkiness of the animations, the way there's only like two or three body shapes that everyone conforms to - these things would go farther in creating a living/breathing world, in the visual realm.
In other realms, the way you can't talk to everyone or go into every building is a bit of a bummer.
I think chasing photorealism also hurts the modding community, which hurts the players. No ordinary modding community can push out photorealistic content in a realistic span of time. I think that's why we are seeing fewer and fewer mods nowadays compared to the late 90s and early 2000s.
For FPS, HL2/Doom3 is probably the last generation that enjoys a huge modding community. Anything above it pushes ordinary modders away. I believe it is still quite possible to make mods for say UE4, but it just took such a long time that the projects never got finished.
In a certain way, I very much wish graphics had frozen around 2005.
HL2/Doom3 have built in mod support, so I don't think it's fair to compare it to games that don't have mod support.
Let's see how GTA VI will change this and the industry.
I personally like Cyberpunk 2077's style; it looks great maxed out with HDR. Yes, the models aren't the best, but the overall look/vibe is spectacular at times.
> In Cyberpunk, you still have to wait around in elevators, which seem like diegetic loading screens.
Cyberpunk has vanishingly few elevators. While it may be a loading hide in some spots, it's certainly not indicative of the game which otherwise has ~zero loading screens as you free roam the city including going in & out of highly detailed buildings and environments.
> I've been playing Cyberpunk 2077, and while the graphics are great, it's clear they could do more in the visual realm. It doesn't use current gen hardware to the maximum
I'm not sure how you can reach this conclusion to be honest. Cyberpunk 2077 continues to be the poster child of cutting edge effects - there's a reason Nvidia is constantly using it for every new rendering tech they come out with.
This, in a nutshell, is why Nintendo is doing so well.
Their hardware is underpowered and their games look like cheap cartoons, but the effort spent on gameplay more than compensates.
I don't agree here.
Nintendo games don't look like cheap cartoons at all. They are absolutely not photorealistic, but they put a lot of work into the aesthetics/art, and the result is most of the time really impressive once you take the hardware limitations into account.
Mario 64 ran on the same console that was known for its 3D blur.
Mario Galaxy 1&2 (which are still totally modern in terms of aesthetics) ran on what was basically an overclocked gamecube.
Mario Kart 8 which is still more beautiful than a lot of modern games ran on the Switch, which is itself based on a 2015 mid-range smartphone hardware.
I think it's more that Nintendo's choice of hardware (and its relative lack of horsepower) force them into more stylized visuals because it means photo-realism is basically off the table to start with. We the audience tend not to care, because Nintendo has capable artists who can create something aesthetically pleasing outside of "realistic" graphics.
There are tens (if not hundreds) of indie and B-games that offer the same experience as most current Nintendo titles. Nintendo is doing well more because of nostalgia - it's the parents buying those consoles for their kids because they have very fond memories with Nintendo from their own childhoods.
I don't suffer from that particular nostalgia, not having had a Nintendo console (C64/Amiga diehard here), but I bought Wii and Switch, and a couple of first-party games for each.
I considered, and passed on, the other consoles.
Nintendo is playing a different game than other console/game makers (excuse the pun), IMHO.
I would disagree. And I know many adults in the PC gaming space like myself who would disagree.
I like my indie games, but not many are putting out what Nintendo is.
I mean it’s all subjective though.
This. To me one of the reasons why Coffee Stain Studios is such a successful publisher is that its games typically don't push for visual realism for the sake of it (hardly possible anyway when they feature dwarves, alien species and the like).
My take is that video game devs learn to aspire to cinema, since they're both making "entertainment art that exists on a screen" and cinema is more widely accepted as art among the intelligentsia (not that I agree).
Some games are sold just so the end user can enjoy exercising their new GPU and monitor. Crysis and Control come to mind.
>Control
Did we play the same game? Some of the best lore-building and environmental theming around, paired with some cool mechanics?
Sure, the combat got repetitive but this was hardly something to "just sell GPUs"
That's not the game I played.
The lore was annoying to listen to; whenever I wanted to listen to an audio log, I had to stop playing the game and watch the exact same video of a man smoking and being mysterious.
The cool game mechanics were basically just the gravity gun from Half Life 2, which came out over 20 years ago.
It did have some cool environmental set pieces, but overall I just found the game too pretentious for something that was basically a rip off of the SCP wiki.
I was a bit confused by this aspect of Control. It was lauded as an example of top-tier graphics. I liked the game, but its graphics felt mid to me. Maybe due to the grey indoor environments?
Nevertheless, it was commonly used for showing off (cloudy, particle-y) raytracing.
Yes but Control isn't sold "just so the end user can enjoy exercising their new GPU and monitor", it is sold for gamers to play a great game. And IMO it is Remedy's best game since Max Payne 2 (i haven't played Alan Wake 2 though) because of its gameplay and atmosphere, not because of its visuals (which, do not get me wrong, are great, but that is largely because of the art direction and visual design, not because of raytracing -- in fact personally i first played and finished the game on an RX 5700 XT which has no raytracing at all and had to tone down a few visual effects, but still found the visuals great).
I don't really see your point. It was used by benchmarking youtubers for that benchmarking, so it at least sold to them for that reason. It's also the reason I bought it: any later enjoyment is unrelated.
Lots of things are used for benchmarking. Very few are made with it in mind.
Crysis' system requirements at launch were so far above what most people had that I'll give you that. Control wasn't that way at all.
I don’t really see your point because you appear to be moving the goalposts.
> Some games are sold just so the end user can enjoy exercising their new GPU and monitor.
Being used “for benchmarking” and “being sold just” for that purpose are two very different things.
Control was one of the first big games to come out after Nvidia’s first line of GPUs with raytracing hardware (RTX 20xx) and one of the first games to use those hardware features. That’s why it was used as a showcase (there was probably a deal between Remedy and nvidia to make this happen, not sure).
It was a good looking game at the time, but remember it originally came out on PS4/Xbox One and that version did NOT have raytracing.
I've wondered whether photorealism creates its own demand. Players spend hours in high-realism game worlds, their eyes adjust, and game worlds from ten years ago suddenly feel wrong; not just old-fashioned, but fake.
This is also true for non-photorealistic 3D games. They benefit from high-tech effects like outline shaders, sharp shadows, anti-aliasing and LoD blending - but all of that tech is improving over time, so older efforts don't look quite right any more, and today's efforts won't look quite right in 2045.
When a game developer decides to step off this treadmill, they usually make a retro game. I'd like to see more deliberately low-tech games which aren't retro games. If modern players think your game looks good on downlevel hardware, then it will continue to look good as hardware continues to improve - I think this is one reason why Nintendo games have so much staying power.
This has been the norm in 2D game development for ages, but it's much more difficult in 3D. For example, if the player is ever allowed to step outdoors, you'll struggle to meet modern expectations for draw distance and pop-in - and even if your game manages to have cutting-edge draw distance for 2025, who can say whether future players will still find it convincing? The solution is to only put things in the camera frustum when you know you can draw them with full fidelity; everything in the game needs to look as good as it's ever going to look.
> One big issue I never understood is why do we need photorealism in games at all
Because wow factor sells, especially if it's a new IP. You can see most trailers full of comments like "this looks bad".
Completely agree. People lament the death of the RTS genre for all kinds of reasons but I think the biggest one was the early-2000s switch to 3D. Performance considerations meant you have way fewer units. The only exception was that Supreme Commander was somehow able to get around this, but suffered heavily from the second big problem with 3D RTSes: the tiny unit models are so much harder to tell apart in 3D compared to 2D.
The RTS switch to 3D was a mistake and I think RTSes will continue to fail until their developers realize what actually makes them fun is actively hindered by this technology.
I'm on the gameplay > graphics bandwagon too but StarCraft II and Age of Empires IV are proof that 3D is not the problem.
It's a lot easier to get a large team of artists to follow the same artstyle when that artstyle is just "realism". Also, photoscans are convenient.
There is no "we". Some people like it some of the time.
Yeah that’s fair.
Solar Ash is a good example of a non photorealistic 3d game
I agree it doesn't benefit most games, but it's still genuinely amazing to see sometimes.
I suspect part of the challenge with making a hit game with last-gen graphics (like Breath of the Wild) is that you need actual artists to make it look good.
Sales of games say otherwise. 2D pixel games have occasional hits, but the bulk of games that make money go for more realism.
How do you understand other human desires, then? That is, how is the desire to match reality, which shows up in other mediums too, any different from other, more understandable desires?
For the same reason it was sought after in painting for so long, and for the same reason movies and plays often meticulously recreate (or film in) real locales and use period-appropriate attire: people, by and large, love looking at reality far more than at stylized images.
There are exceptions, but the general public will almost always prefer a photo-realistic renaissance painting to a Picasso portrait, a lavish period piece like Titanic to an experimental set design like Dogville.
> This is [...] a series examining techniques used in game graphics and how those techniques fail to deliver a visually appealing end result
All I see is opinions though. And the internet is full of them. You just have to Google "why does this game look so ...". At least if the author had compared the search stats of "good/bad/beautiful/washed out" it would've carried some weight.
The GTA 5 screenshot is a terrible example. It looks like a cheap, dead, video game environment, reminding me how far we've come.
I think the author's list of "ugly" games is missing Witcher III, Hellblade, God of War (2018), Elden Ring, Baldur's Gate...
And we need some examples of good, cinematic, artful tone mapping, like any scene of a Hollywood movie set in Mexico...
> tone mapping, like any scene of a Hollywood movie set in Mexico
That's not tone mapping, but color grading
I'm not sure if you were being ironic; I find The Witcher 3 and Elden Ring beautiful.
I think Elden Ring is a little ugly but still a world I want to experience.
While the graphics aren't as good as some other modern titles, the world and art design make up for it ten times over. There are a bunch of locations that could be paintings, especially:
- The first steps in Limveld
- Liurnia of the Lakes (from Stormveil)
- Leyndell
- The first look at the Scadutree
- Cerulean Coast
- Stone Coffin Fissure
- Enir Ilim
I can't remember another property with a similar diversity of incredibly beautiful and imposing areas.
In my experience Elden Ring looks better when you turn the graphics quality down. Baldur's Gate isn't particularly ugly for a '98 game.
And I agree that it would be nice to have some positive examples. I think there were a bunch of SNES games which did it well, but that may just be nostalgia.
Given the rest of the games listed, I assume he means Baldur's Gate 3. Many younger people out themselves by just calling it Baldur's Gate.
> Baldur's Gate isn't particularly ugly for a '98 game.
I remember it looked beautiful. Especially comparing to early 3D games of that era.
I found this video to visualise what tone mapping is trying to achieve, and why "photorealism" is hard to achieve in computer graphics: https://www.youtube.com/watch?v=m9AT7H4GGrA
And it indirectly taught me how to use the exposure feature in my iPhone camera (when you tap a point in the picture). It's so that you choose the "middle gray" point of the picture for the tone mapping process, using your eyes, which have a much greater dynamic range than a CCD sensor. TIL.
> the exposure feature in my iPhone camera…choose the "middle gray" point of the picture for the tone mapping process
No, it uses that to set the physical exposure via the shutter speed and ISO (iPhones have a fixed aperture, so that cannot be changed). It literally says this in the video you linked. This is not tone mapping. Tone mapping in a way may also happen afterwards to convert from the wider dynamic range of the sensor if the output format has a more limited dynamic range.
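To make the exposure step concrete: it amounts to scaling the scene so the metered point lands on middle grey before any curve runs. A sketch, with made-up names and the conventional 18% grey anchor assumed:

    #include <algorithm>

    // Scale factor that moves the metered luminance (the tapped point, or a frame
    // average) onto ~18% "middle grey". Multiply every pixel's linear value by this
    // before handing the image to whatever tone curve comes next.
    float exposureScale(float meteredLuminance, float middleGrey = 0.18f) {
        return middleGrey / std::max(meteredLuminance, 1e-4f);
    }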
I've heard a good point that our eyes have, in fact, a boring 1:100 range of brightness. Eyes can rapidly adjust, but the real game changer is our ability to create an image in our video memory, which has an unlimited brightness range. Eyes give us maybe a 2d uint8 framebuffer, but our mind creates and updates a float32 3d buffer. This is why this experience cannot be reproduced on a screen.
If our eyes can only see 100:1, why is OLED taking off? LCD has been claiming 1000:1 for decades
Because of fast & per-pixel level light control. Though this is true even if we completely ignore whether human eyes actually manifest a 100:1 auto-adapting dynamic range window.
In my case it's because the motion clarity of OLED is excellent.
Why does everything (in big-budget video games) look shiny and wet?
If it is an attempt at realism, reality is not constantly shiny and wet.
If it is a subjective artistic choice, it is objectively wrong and ugly.
Is there an expectation that everything look shiny and wet to make it seem more "dynamic"?
Is it an artists' meme, like the Wilhelm Scream in cinematic sound design?
Overuse of reflective surfaces is the same kind of fad we saw with bloom in the mid 2000s and early 2010s. Now that SSR everywhere is technically feasible, gamedevs want to use it everywhere. I think this started 5-10 years ago, and RTX has renewed the meme, unfortunately.
Isn’t Unreal Engine guilty of this? That’s how I often recognize it’s an Unreal Engine game.
Specular highlights are cheap (frame time and artist time) and beautiful when done right, so everyone tries to do them and they get overcooked.
There is a secondary problem in big budget games where modeling work gets farmed out leading to selection for "what looks good in the preview pic." In the preview pic, the asset artist gets to choose background/scene/lighting, and it's an easy trick to choose them to make the specular highlights pop. The person doing integration buys the asset, drops it in wildly different background/scene/lighting, and now the specular highlights are overcooked because the final scene wasn't chosen for the specific purpose of leveraging specular highlights.
tl;dr artists ship the org chart too
Recently, some of it seems to be just to highlight raytracing hardware. Cyberpunk uses a lot of metal reflective surfaces to give a futuristic/tech vibe. But that's one sort of futurism. There'll be plenty of use of natural stone, wood, and tile far far into the future.
I thought this was going to be the subject of the article. For years now, everything looks weirdly shiny.
The common wisdom is that it's more difficult to make sunny and dry environments look pretty than it is overcast and wet ones. I tend to agree with this based on the end results I've seen over the many years.
That's what I used to think too.. but Spec Ops: The Line is entirely based in desert, even has a shot of sarin horror and while 'pretty' isn't the word I'd use, it is stunning.
It is amusing now that you point it out. There are always trends that come and go in these large-scale industrial art forms. As others point out, in this case it's likely a response to technical advancements and a desire to emphasize them. Another example that comes to mind is the orangey-sunlit ears that seemed to show up everywhere to show off subsurface scattering.
Thinking back - films also are always doing some new exciting thing all at once. That wild colored lighting aesthetic of the past decade comes to mind. That's a result of refined color correction software and awesome low-cost LED lights. Or drone shots. So many drone shots.
It's usually a group-think phenomenon where everyone was previously unable to do something and now they can, and everyone wants to try it. Then there are successes, management points at those and yells 'we want that, do that!', distribution follows, and it becomes mandatory. Until everyone is rolling their eyes and getting excited about another new thing.
It's a silly phenomenon when you think about it - any true artist-director would likely push back on that with a coherent vision.
Michael Mann starting with Thief (1981).
"Mann sprayed down the city’s nocturnal streets with tens of thousands of gallons of water, so that they took on an unreal, painterly glow." - New York Times
I feel like this is very much a personal preference thing. They even called out Horizon Zero Dawn for looking very bad, and Zelda for looking very good.. while in my opinion the exact opposite is true.
I do see the point of the author: HZD goes for a "realistic", high-fidelity 3D fantasy world, yet the lighting makes no sense in physical terms. The contrast and brightness shown in the picture are all over the place, and can only be an artifact of visualising a world through a computer screen which has a very limited dynamic range - it is immersion-breaking. The Resident Evil 7 picture below looks much better. The video I linked in another comment explains why: in the physical world, the stronger the light, the more washed-out the colour will become. HZD is a saturated, high-contrast mess with too much compression in the low light, because of a bad colour mapper in their pipeline.
One can claim HZD's look is an "artistic choice" and that's inarguable, but the author believes it's simply not enough attention to the tone mapping process, which is a very complicated topic that's not usually taken seriously in game dev compared to film production.
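On the wash-out point: one cheap way a game tonemapper can get that film-like behaviour is simply to apply its curve per channel rather than on luminance. A sketch (the names are mine, not anything from the article):

    struct Colour { float r, g, b; };

    // Each channel saturates toward 1.0 as it gets brighter, so strongly lit colours
    // drift toward white, roughly the desaturation you see in film and in real life.
    // A luminance-only curve would keep them fully saturated instead.
    Colour reinhardPerChannel(Colour c) {
        return { c.r / (1.0f + c.r), c.g / (1.0f + c.g), c.b / (1.0f + c.b) };
    }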
The author is more pointing out that these games don't look realistic. Look at the foreground of the HZD shot - why is it almost black in daylight?
To be fair - if I remember the location correctly - that screenshot is somewhat misleading because it's camera position is from the inside of a large ruin, with the ceiling and right wall of the "cave entrance" being just outside the frame.
Zelda looks realistic to them?
No, the author posits that Zelda explicitly goes for artistry and ignores any pretense of realism (which then falls flat on its face when using an over-contrasting tone-map like in the HZD screenshot).
The problem I personally have with the Zelda example given is that it looks really bland to me - the landscape looks really washed out - the author says "Somebody would paint this. It’s artistic.", but I don't think anyone would paint with such bleached-out colours.
yeah, no one would ever do that https://artlogic-res.cloudinary.com/w_2000,h_1600,c_limit,f_...
Oh, I see. I disagree that the original HZD had a pretense of realism though. The remastered version does and well illustrates the uncanny-ness https://www.youtube.com/watch?v=IlWK_ELBW08 . The outrageous god rays, bloom and lens flare in the remaster compensate for that because you can't actually see anything due to them blinding you...
I think with enough exposure to the overdone contrast ratios, you start to get tired of it. It sacrifices a lot of clarity. I agree it does look good in some cases, for example I enjoy the look of Battlefield 1 a lot, but when playing it I often noticed I had issues seeing detail in darker areas.
One game that actually puts a lot of effort into this is X-plane. They use physics-based rendering, and with recent updates they have done quite a bit of work on this (clouds, atmosphere, natural-looking colors and shadows, HDR, etc.).
There's a stark contrast here with MS Flight Simulator which looks great but maybe a bit too pretty. It's certainly very pleasing to look at but not necessarily realistic.
One thing with flying is that visibility isn't necessarily that good and a big part of using flight simulators professionally is actually learning to fly when the visibility is absolutely terrible. What's the relevance of scenery if visibility is at the legal minimums? You see the ground shortly before you land, a few feet in front of you.
And even under better conditions, things are hazy and flat (both in color and depth). A crisp, high contrast, saturated view is pretty but not what a pilot deals with. A real problem for pilots is actually spotting where the airport is. Which is surprisingly hard even when the weather is nice and sunny.
An interesting HDR challenge with cockpits is that the light level inside and outside are miles apart. When flying in the real world, your eyes compensate for this when you focus on the instruments or look outside. But technically any screenshot that features a bright outside and clearly legible instruments at the same time is not very realistic but also kind of necessary. You need to do some HDR trickery to make that work. Poor readability of instruments is something X-plane addressed in one of their recent updates. It was technically correct but not that readable.
X-plane rendering has made some big improvements with all this during the v12 release over the last three years.
I suspect contrast in a lot of the games he's skewering is high because they are shootery type games where players need to see things, understand them, and react to them quickly.
Also I don't necessarily see a need to make everything look like physical film.
This seems pretty irrelevant now. This article is from 2017, before we had proper real HDR support in Windows 10 and the much better HDR support we now have in Windows 11.
And before we had OLED gaming monitors which can actually now display good HDR at 1000+ nits.
This was definitely during a transitional phase with mostly fake HDR techniques that needed tone-mapping. Now we have real HDR that doesn't need tone-mapping, or only a small amount of tone-mapping above the display peak nits point.
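For illustration, that "small amount above the peak" is usually just a shoulder that passes values through untouched below some knee and compresses everything above it so it never exceeds the display's peak. A minimal sketch - the knee and peak numbers here are arbitrary examples, not what any real pipeline uses:

    import numpy as np

    def rolloff_above_peak(nits, peak_nits=1000.0, knee=0.8):
        # Pass scene luminance through unchanged below the knee, then
        # compress everything above it asymptotically so it never
        # exceeds the display's peak. peak_nits and knee are example values.
        nits = np.asarray(nits, dtype=np.float64)
        knee_nits = knee * peak_nits
        out = nits.copy()
        over = nits > knee_nits
        excess = nits[over] - knee_nits
        headroom = peak_nits - knee_nits
        out[over] = knee_nits + headroom * excess / (excess + headroom)
        return out

    # A 4000-nit specular highlight lands just under the 1000-nit peak,
    # while everything at or below 800 nits passes through untouched.
    print(rolloff_above_peak([100.0, 800.0, 4000.0]))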
> And before we had OLED gaming monitors which can actually now display good HDR at 1000+ nits.
It’s worth pointing out that these monitors, for the most part, cannot sustain that brightness, or achieve it at anything other than the smallest window sizes - 1-3% windows at best.
> Now we have real HDR that doesn't need tone-mapping, or only a small amount of tone-mapping above the display peak nits point.
For the reasons outlined above (and other) tone mapping is still heavily required.
It’s worth noting that OLED TVs do a significantly better job at displaying high nits in both percentage of the display and in sustaining it. It’s my hope the monitors eventually catch up because I waited a long time for it to become monitor sized.
> It’s worth pointing out these monitors for the most part can not sustain it or achieve it at anything other than the smallest possible window sizes, such as the 1-3% window sizes at best.
Sure, but the parts of the image that are anywhere near 1000 nits are usually quite small and are things like muzzle flashes or light fixtures or centers of explosions, or magic effects etc.
https://www.rtings.com/monitor/reviews/asus/rog-swift-oled-p...
This OLED gaming monitor, which came out two years ago, measures 904 nits on a sustained 10% white window.
> Sure, but the parts of the image that are anywhere near 1000 nits are usually quite small and are things like muzzle flashes or light fixtures or centers of explosions, or magic effects etc.
Sure, but plenty of things are bright enough, in combination and at varying window sizes, that the panel has to drop down significantly. So you might get 1000 nits for a muzzle flash but ~200 nits at best for a “bright sunny day.”
The problem is way too many people (I’m not suggesting you) don’t realise this and just think they are “getting 1000 nits!”
>https://www.rtings.com/monitor/reviews/asus/rog-swift-oled-p...
Yes, I own this display and it’s one of the better ones for brightness which is why I grabbed it.
However, even on the latest firmware it has a bunch of issues, including with colours in HDR, unfortunately. It also has incredibly aggressive ABL. Still a great display, but with more limitations compared to the TVs than I’d like. They’ll get there, hopefully, in a few more generations.
When HDR is implemented properly, and you have a proper HDR display, it's such a transformative experience! Most games, however, don't have good HDR implementations. And for whatever reason HDR on Windows is still awful in 2025.
Any examples of good HDR in games?
I liked The Talos Principle 2 inside the pyramid, after reducing the gamma a bit on a WOLED display.
Here's my list from a couple of months ago, along with some related commentary: <https://news.ycombinator.com/item?id=43986463>
HDR is GREAT! Everyone trying to implement HDR + tone mapping excessively just for the sake of it and exaggerating it to show-off (just like those oversaturated Samsung phone screens) is not.
Yeah. There's been a laundry list of innovations over the years: someone invents a technique, shows how it improves a scene, and then for the next few years everyone turns it up to 11 and it looks like shit. Bloom, SSAO, lens flare, film grain, vignetting, DoF.
After a while people turn it back down to like a 4 and it improves things.
Yes, like everything: Nylon might be my favourite example of us never being able to use innovation in moderation.
So were 3d movies until they stopped filming in 3d and started adding pointless effects in postprocessing :)
From an interview with legendary Nintendo designer Gunpei Yokoi and Yukihito Morikawa of MuuMuu:
"Do these playworlds really need to be that photorealistic, I wonder? I actually consider it more of a minus if the graphics are too realistic."
https://shmuplations.com/yokoi/
For Horizon Zero Dawn I'd argue that the colors are clearly an artistic choice. They're not going for realistic colors at all. And the original game and its sequel do look very, very good.
There do seem to be plenty of issues around HDR for sure, in some games I had to intentionally disable HDR on my PS5 because it just looked bad on my setup.
I was excited when I first heard about HDR but when I saw the implementation I thought: gee, they're going to screw up both the SDR and the HDR and that seems to be the case quite often. Going from SD -> HD your picture got better although it often got stretched out, but it's not so clear the HDR version of a movie is really going to be an improvement.
Note that this post is of course about high internal dynamic range specifically, and the tonemapping that then necessarily follows to present an SDR image - not about how modern games do actual HDR output (though, to the extent I understand it, that should be pretty similar at a high level anyway).
> In the real world, the total contrast ratio between the brightest highlights and darkest shadows during a sunny day is on the order of 1,000,000:1.
And this is of course silly. In the real world you can have complete darkness, at which point dynamic range shoots up to infinity.
> A typical screen can show 8 (curved to 600:1 or so).
Not entirely sure about this either; monitors have been pulling 1000:1 and 2000:1 dynamic ranges since forever, even back in 2017 when this article was written, but maybe I just never looked too deep into it.
The static contrast ratio (1000:1+) you mention is different from effective perceived contrast after tone mapping - manufacturers' specs measure black-to-white in ideal conditions, while tone mapping algorithms must compress real-world luminance ranges (millions:1) into that limited display range while preserving perceptual detail.
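As a concrete illustration of that compression, here's the textbook extended Reinhard operator in a few lines - just the generic idea, not the curve any of these games actually ship:

    import numpy as np

    def reinhard_extended(L, white_point=10_000.0):
        # Textbook extended Reinhard: scene luminance in [0, inf) is
        # squeezed into [0, 1], with 'white_point' mapping exactly to 1.
        L = np.asarray(L, dtype=np.float64)
        return L * (1.0 + L / white_point**2) / (1.0 + L)

    # A roughly 1,000,000:1 scene, in arbitrary scene-referred units:
    scene = np.array([0.01, 1.0, 100.0, 10_000.0])
    print(reinhard_extended(scene))  # everything ends up in 0..1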
I'm not entirely sure how that's relevant to what I was saying.
Hmm. I like the author's main point about many video games doing this unrealistically, but there are a few sticking points that are relevant from the past few years:
Another interesting example of lighting done well is Kingdom Come Deliverance 2. The details don't look nearly as nice as in, e.g., a UE5 game, and it unfortunately doesn't support monitor HDR, but it has very realistic-looking lighting and scenes. The article is from 2017.
Oh wow. I wonder what the author thinks of the new trends! I bet he or she would be pleasantly surprised by some of them.
Yeah, since then there were some games with very natural looking contrast and colors, perhaps most notably Red Dead Redemption 2 (2018). Or, years later, Kingdom Come Deliverance 2 (2025), which you already mentioned. As a negative example: as far as I can tell, Horizon Forbidden West (2022) mostly doubled down on the exaggerated color contrast he criticized in the predecessor.
Concur on Forbidden West having the same problem as HZD that the author mentioned. I remember thinking about the dark/indoor areas with accent lighting, and vegetation in particular compared to similar scenes in Talos Principle 2. (Similar release dates) HZD wasn't in the same league as Talos.
I don't want realistic looking games, I want pretty looking games.
Look at movies that go all in on realism, can't see anything, can't hear anything. That's terrible.
The fact you can't hear anything has nothing to do with realism. It's laziness.
https://www.slashfilm.com/673162/heres-why-movie-dialogue-ha...
Please watch this for yet another take on the issue: https://www.youtube.com/watch?v=j68UW21Nx6g
The points are: game graphics are indeed suffering, but the problem is not that they're unlike films and photos - it's the opposite. Games should stop using tone mapping curves produced by the film industry and instead create their own, making a clean break.
Personally, I agree with the video.
Path of Exile 2 is a good recent example of a game that does a pretty good job with contrast and tone (staying true to the dark, gritty theme it is going for). I think it was smart of the devs to keep all the high contrast to effects and lighting.
[2017]
I really don't know what to think of HDR.
I have yet to get any benefit out of it.
I disable it everywhere I can. In Instagram for example. When it is turned on (the default) every now and then I get some crazy glaring image in my feed that hurts.
Maybe it is because I don't play games? Is HDR useful anywhere outside of games?
> ...I get some crazy glaring image in my feed that hurts.
Are you using an Apple machine to do your browsing? I have heard that Apple has (for some damn reason) decided to do this sort of crap with HDR-pictures-in-an-otherwise-SDR-document. It's nuts. This doesn't happen to me on Windows, and -because I use xorg- I've no idea what happens on Linux.
The opposite extreme: physics-based rendering that models the entire optical chain of a film camera, including the emulsion layers.
https://youtu.be/YE9rEQAGpLw?feature=shared
The screenshot of "Zelda: Breath of the Wild " the author holds up as an exemplar looks unrealistically tone mapped to me. The "bad" screenshots in the lede look more natural and pleasing. Not realistic -- they're too stylized for my taste -- but the Zelda screenshot is simply unrealistic in a different direction.
This is an example of a phenomenon I've seen many times on the internet:
> The exposure level is also noticeably lower, which actually leaves room for better mid-tone saturation.
Decades ago, when I shot film, I remember discovering that I really liked how photos looked when underexposed by half a stop or so. I never knew why (and I wasn’t developing my own film, so I’ve no idea what the processor may have been doing), but I wonder if this was a contributing factor.
This is apparently an unpopular opinion, but in many games (fantasy RPGs come to mind), I like the fake look. It helps it look other-worldly, IMO. I think for something like Flight Sim, I’d prefer photorealism, but otherwise I’m fine with it looking like, well, a video game.
It might be a generational thing, too; I was born in the late 80s, and my formative years were spent playing cartoonish games like Commander Keen, Command & Conquer, etc.
Past discussion: https://news.ycombinator.com/item?id=15534622
> But all of them feel videogamey and none of them would pass for a film or a photograph. Or even a reasonably good offline render. Or a painting. They are instantly recognizable as video games, because only video games try to pass off these trashy contrast curves as aesthetically pleasing.
Author is fumbling the difference between aesthetics and realism. Videogames feeling videogamey? What a travesty.
All these massive studios with their grand budgets can't make a game that doesn't look like a cartoon.
A real masterpiece of modern graphics is a game made by two brothers, called "Bodycam".
Previously posted in 2017
https://news.ycombinator.com/item?id=15534622
Am I taking crazy pills, or are the blacks actually crushed in Zelda but not crushed in Zero Dawn (the opposite of what's stated)?
It apparently took Mozilla a couple decades to allow displays to present #ff0000 as sRGB red correctly mapped into the display’s LUT, rather than as (100%, 0%, 0%) in the display’s native LUT, which is why for several years anyone using Firefox on a ProPhoto or Adobe RGB or, later, DCI-P3 or BT.2020 display would get eye-searing colors from the web that made you flinch and develop a migraine. It was, I assume, decided that the improper tone mapping curve gave their version of the web more lifelike color saturation than other browsers — at least on their majority platform Windows, which lacked simple and reasonable color management for non-professional users until Windows 11. So Firefox looked brighter, flashier on every shitty Windows display in the world, and since displays were barely capable of better than sRGB, that was good.
Unfortunately, this also meant that Firefox gave eyestrain headaches to every design professional in the world, because our pro color displays had so much more eye-stabbing color and brightness capability than everyone else’s. It sucked, we looked up the hidden preference that could have been flipped to render color correctly at any time, and it was tolerable.
Then Apple standardized DCI-P3 displays on their phones and tablets, where WebKit did the right thing — and on laptops and desktops, where Firefox did not. Safari wasn’t yet good enough back then to earn conversions, though certainly it is now, and when people tried to switch from Firefox the colors looked washed out and bland next to that native display punch. So everyone who surfed the web through that bad LUT experience concluded that Apple’s displays were too bright — literally, Firefox was jamming 100% phosphor brightness into monitors well in excess of sRGB’s specified luminosity range — and they responded by dimming their displays and complaining about Apple.
And one day, Chrome showed up; faster, lighter, and most critically, not migraine inducing. The first two advantages drew people in; the third made them feel better physically.
Designers, professionals, everyone who already had wide-color monitors, and then also students, would have eventually discovered (perhaps without ever realizing it!) that with Chrome (and with Safari, if they’d put up with it) they didn’t have to dim their monitors, because color wasn’t forcibly oversaturated on phosphors that could, at minimum, emit 50% higher nits than the old sRGB-era displays. The web didn’t cause eye strain and headaches anymore.
Firefox must have lost an entire generation of students in a year flat — along with everyone in web design, photography, and marketing who could possibly switch. Sure, Chrome was only slightly better at the time; but once people got used to normal sRGB colors again, they couldn’t switch back to Firefox without everything being garish and bright, and so if they wished to leave Chrome they’d exit to Safari or Opera instead.
I assume that the only reason Firefox finally fixed this was that the CSS color v3 specification, a few years ago, forcibly engraved that, unless otherwise hinted, #ff0000 is in the sRGB color space and must be rendered as such. That left them no room to argue, and so Firefox finally — far too late to regain its lost web designer proponents — switched the default.
As the article describes, Nintendo understands this lesson fully, and chose to ship Zelda with artistic color that renders beautifully assuming any crap TV display, rather than going for the contrast- and saturation-maximizing overtones of the paired combination of brighter- and more-saturated- than sRGB that TV manufacturers call HDR. One need only look to a Best Buy TV wall to understand: every TV is blowing out the maximum saturation and brightness possible, all peacocks with their plumage flashing as brightly as possible, in the hopes of attracting another purchase. Nintendo’s behaviors suck in a lot of ways, but their artistic output understands perfectly how to be beautiful and compelling without resorting to the Firefox approach.
(Incidentally, this is also why any site using #rrggbb looks last-century when embedded in, or shown next to, one designed using CSS color(..) clauses. It isn’t anything obvious, but once you know how to see it, it’s like the difference between 18-bit 256color ANSI and 24-bit truecolor ANSI. They’re not RGB hex codes; they’re sRGB hex codes.)
The article lacks examples of what "done right" means. It points to some videogames that "do it terribly" (they look OK to me? Not photorealistic, but not every videogame has to be like that?) but it doesn't show what "a correct" version of each image would look like. It just says "it looks obviously bad". Sorry, but I don't see it. I'm fine with videogames looking videogamey.
The article shows 5 images from games showing it done poorly, and 4 images from games showing it done well.
A fifth image of it done well was added in an edit.
It doesn’t matter if the images are different. To the author it might be super obvious why some are good and others aren’t. To me they all look good.
I cannot be the only one who barely notices this in games.
Slightly related, but a lot of newer games have weird colour filtering and post-processing effects now that drive me nuts.
After reading this article I feel like I learned nothing about what makes HDR good or bad.
This article is just misinformed. Source: I’ve been working with color space conversion, HDR tone mapping, gamut mapping and “film look” for 20 years.
It’s clear from their critique of the first screenshots that their problem is not with HDR, but contrast levels. Contrast is a color grading decision totally separate from HDR tonemapping.
There’s then a digression about RED and Arri that is incorrect. Even their earliest cameras shot RAW and could be color matched against each other.
Then they assert that tone mapping is hampered by being a 1D curve, but this is more or less exactly how film works. AAA games often come up with their own curves rather than using stock curves like Hable or ACES, and I would assume that they’re often combined with 3D LUTs for “look” in order to reduce lookups.
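For reference, the Hable ("Uncharted 2 filmic") curve mentioned above is public and really is just a 1D rational function applied per channel. A rough sketch - the exposure value here is an arbitrary example:

    import numpy as np

    def hable_partial(x):
        # Constants as published in John Hable's Uncharted 2 talk.
        A, B, C, D, E, F = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30
        return ((x * (A * x + C * B) + D * E) /
                (x * (A * x + B) + D * F)) - E / F

    def hable_filmic(color, exposure=2.0, white=11.2):
        # 1D curve applied per channel, normalized so 'white' maps to 1.
        x = np.asarray(color, dtype=np.float64) * exposure
        return hable_partial(x) / hable_partial(white)

    print(hable_filmic([0.05, 0.18, 1.0, 8.0]))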
The author is right about digital still cameras doing a very good job mapping the HDR sensor data to SDR images like JPEGs. The big camera companies have to balance “accuracy” and making the image “pleasing,” and that’s what photographers commonly call their “color science.” Really good gamut mapping is part of that secret sauce. However, part of what looks pleasing is that these are high contrast transforms, which is exactly what the author seems to not like.
They say “we don’t have the technical capability to run real film industry LUTs in the correct color spaces,” which is just factually incorrect. Color grading software and AAA games use the same GPUs and shader languages. A full ACES workflow would be overkill (not too heavy, just unnecessarily flexible) for a game, because you can do your full-on cinema color grading on your game and then bake it into a 3D LUT that very accurately captures the look.
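A sketch of what "bake it into a 3D LUT" means in practice - the grade here is a made-up toy, and 33 is just a commonly used lattice size, not a requirement:

    import numpy as np

    def bake_3d_lut(grade_fn, size=33):
        # Evaluate an arbitrary grading function on a size^3 identity
        # lattice; a shader then samples the result with hardware
        # trilinear filtering.
        axis = np.linspace(0.0, 1.0, size)
        r, g, b = np.meshgrid(axis, axis, axis, indexing="ij")
        return grade_fn(np.stack([r, g, b], axis=-1))

    def toy_grade(rgb):
        # Hypothetical "look": nudge red up and blue down per channel.
        return np.clip(rgb ** np.array([0.95, 1.0, 1.05]), 0.0, 1.0)

    lut = bake_3d_lut(toy_grade)
    print(lut.shape)  # (33, 33, 33, 3)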
The author then shows a screenshot of Breath of the Wild, which I’m nearly positive uses a global tonemap—it just might not do a lot of dynamic exposure adjustment.
Then they evaluate a few more images before praising a Forza image for being low contrast, which again, has nothing to do with HDR and everything to do with color grading.
Ultimately, the author is right that this is about aesthetics. Unfortunately, there’s no accounting for taste. But a game’s “look” is far more involved than just the use of HDR or tone mapping.
It's not just games, it's regular day-to-day UI too. I'm using an Acer 185Hz VRR HDR10 Gaming monitor.. on Eco mode with HDR disabled. Everything just looks better with HDR turned off for some reason I can't explain.
That's normal. For HDR to look good, you need a monitor that hits approximately 1000 nits in brightness. Your monitor only hits 250, which is completely insufficient to display HDR content.
This is one of the stupid things with many monitors, showing HDR at 250 nits is worse than showing no HDR at all. So no matter what you do, 99% of HDR content will look bad on your screen.
I agree that 250 nits is too low, but my monitor clocks in at 400 and HDR already looks better, if only thanks to the increased colour channel resolution - particularly visible in highlights, clouds, etc. Where there was previously just a single colour blob, I can now observe details that are impossible to display with just eight bits per channel.
Interestingly my laptop's display reaches 500 nits and that is already painfully high outside of midday hours. My phone goes to 875 and I find that only to be useful outside in the summer sun.
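A crude way to see the "more detail in the highlights" point - deliberately ignoring the different transfer functions (gamma vs PQ), which matter a lot in practice - is to count how many distinct code values a subtle highlight gradient gets at 8 vs 10 bits:

    import numpy as np

    # A subtle highlight gradient occupying the top 5% of the signal
    # range, e.g. the inside of a bright cloud.
    gradient = np.linspace(0.95, 1.0, 1000)

    steps_8bit = np.unique(np.round(gradient * 255)).size
    steps_10bit = np.unique(np.round(gradient * 1023)).size
    print(steps_8bit, steps_10bit)  # 14 vs 52 distinct code values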
The difference is between SDR and HDR. Going full blast with a full image at 500 nits and having an image averaging 200 nits with only peaks at 500 are two vastly different things.
I have a C3 OLED and everything also looks better with HDR off.
Games are just truly awful about this, making scenes completely unviewable even when the HDR areas - the blacks and whites - have interactive elements in them that you need to see and know about.
I have a C4 OLED and I thought what you said was also true for me until I figured out what settings I needed to change on my TV to match my console (Nintendo Switch 2). Had to turn on HGiG, manually adjust the peak brightness level on the console itself, and suddenly things looked great.
Not that many games on the console that take advantage of it, mind you. More testing needed.
> For HDR to look good, you need a monitor that hits approximately 1000 nits in brightness.
I disagree. The wide color gamut is -for me- a huge thing about HDR. My VA monitor provides ~300 nits of brightness and I've been quite happy with the games that didn't phone in their HDR implementation.
Plus, any non-trash HDR monitor will tell the computer it's attached to what its maximum possible brightness is, so the software running on that computer can adjust its renderer accordingly.
> Plus, any non-trash HDR monitor will tell the computer it's attached to what its maximum possible brightness is, so the software running on that computer can adjust its renderer accordingly.
My monitor does do that, but alas the software itself (Windows 10) wasn't good enough to adjust things correctly. It did make the decision to switch to ArchLinux easier, by being one less thing I'll be missing.
Unless it's a mini-LED or OLED display, it simply doesn't have the contrast to properly render a lot of what makes HDR... HDR.
Calibrate the display with HDR enabled for a better SDR response.
VA screens have pretty damn good contrast, and OLED monitors tend to have low peak (and sometimes even spot!) brightness.
A while back, I tried an OLED gaming monitor that was widely reviewed as being very good. While it was somewhat better than the VA monitor that I've been using for years, it was nowhere near 1,500 USD good. I could see someone coming from an IPS or TN screen being very impressed with it, though.
VA screens have terrible black smearing, though. I also bought an OLED display and returned it because it was just very dim. I own a mini-LED display that peaks at 1000 cd/m² full screen (it has a fan to handle the heat), and I'm still looking for an OLED replacement.
Something is poorly implemented in the Windows UI with HDR on. On MacBooks it all looks fine, and HDR content just appears brighter (I think the rest of the UI becomes a bit duller at that point too), but on Windows it feels like running HDR on the desktop makes the whole screen look dull - at least it does on my 5K HDR Dell.
Not sure if I'm missing a setting, but I end up having to manually turn HDR on before playing a game and off after.
As someone who worked a lot in realistic VFX I concur with the observation that nearly no game is doing tone mapping right and my guess to why that is always has been the fact that doing it right is just very complex.
There are many, many things artists need to do correctly, and many of those artists have no idea of the whole pipeline. Let's say someone creates a scene with a tree in it. What is the correct brightness, saturation and gamma of that tree's texture? And if that isn't correct, how could the lighting artist correctly set the light? And if the texture and the light are wrong, the correct tone map will look like shit.
My experience is that you need to do everything right for a good tonemap to look realistic, and that means working like a scientist and having an idea of the underlying physical formulae and the way they have been implemented digitally. And that is sadly something not many productions appear to pull off. But if you do pull it off, everything pops into place.
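A toy version of that pipeline, just to make the order of operations concrete (the light intensity and the Reinhard stand-in tone curve are arbitrary placeholders, not anyone's actual numbers):

    import numpy as np

    def srgb_to_linear(c):
        # Standard piecewise sRGB decode (IEC 61966-2-1).
        c = np.asarray(c, dtype=np.float64)
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    def linear_to_srgb(c):
        c = np.asarray(c, dtype=np.float64)
        return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

    # Decode the albedo texture to linear light, light it, tone map, then
    # re-encode for the display. Skip or botch any step and the "correct"
    # tone curve is being applied to wrong inputs.
    albedo_srgb = np.array([0.5, 0.4, 0.3])
    lit = srgb_to_linear(albedo_srgb) * 4.0   # hypothetical light intensity
    tonemapped = lit / (1.0 + lit)            # simple Reinhard stand-in
    print(linear_to_srgb(np.clip(tonemapped, 0.0, 1.0)))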
The added complication with games is of course that you can't just optimize the light for one money shot; it needs to look good from all directions. And that makes it hard to look as good as a film shot, because pushing it that far risks making it look like crap from other directions, which studios aren't willing to accept.
The dragon in The Hobbit isn't just about the tonemapping, it is at least as much (if not more so) a lighting issue. But the two can influence each other in a bad way.
All of the "bad" examples look like they're being played on a PC with poorly set gamma curves. Play on a TV where the curves are set up properly, because TV people actually care about color reproduction.