
HDR in Gaming


CarloOos

I figured we were overdue a separate thread for this, as it's a hot button topic for those who care about it but must be an interminable thread clog for those who don't.

 

We're about four years into HDR being supported by the consoles, and implementation seems as hit-and-miss now as it's ever been. Loads of high-profile titles still have incorrectly mapped black levels, broken HDR calibration screens and non-mapped SDR cutscenes played back in HDR (Cyberpunk being a recent offender that managed all three).

 

On the other hand, there's also an element of taste and expectations. Lots of people have come to expect eye-searing contrast from their HDR images, despite that not being what the format was designed for. Just because something doesn't always hit 'true black' doesn't necessarily mean it's broken; it's perfectly valid to make those choices creatively as well (I'd use AC Valhalla as a recent example of this).

 

Traditionally, HDR was designed so you'd have your normal SDR image in 0-100 nits, with 100-1000+ nits reserved for specular highlights, which allows for a really deep, nuanced image. This is mostly true in film, but it's not really how many games have used it so far: most given examples of 'good' HDR often look the same as the SDR version but mapped across the full 0-1000 nit range, creating extreme brightness and contrast without really adding any extra depth (I'd say Doom Eternal looks like this, for example). Don't get me wrong, I'd rather have that than a knackered, washed-out grey image, but I'd also like to see a bit more subtlety going forward.
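
To put rough numbers on the difference (a toy Python sketch - the function names and values are my own, not any real engine's tone mapper):

```python
# Toy sketch contrasting the two approaches above. Input v is the
# SDR image signal normalised to 0-1.

def hdr_as_designed(v, sdr_white=100.0, peak_nits=1000.0, highlight=0.0):
    """The 'film' model: the diffuse image stays within 0-100 nits,
    and only separately authored highlight energy (speculars,
    emissives) pushes above it towards the panel's peak."""
    return min(v * sdr_white + highlight * (peak_nits - sdr_white), peak_nits)

def stretched_sdr(v, peak_nits=1000.0):
    """The 'stretched' model: the same SDR image simply remapped
    across the full 0-1000 nit range."""
    return v * peak_nits

# A mid-grey surface (v = 0.5) sits at a comfortable 50 nits in the
# first model but a searing 500 nits in the second - extra brightness
# and contrast, no extra depth.
for v in (0.18, 0.5, 1.0):
    print(f"v={v:.2f}  designed: {hdr_as_designed(v):6.1f} nits   "
          f"stretched: {stretched_sdr(v):6.1f} nits")
```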

 

Dolby Vision for gaming is supposedly around the corner, and if the uptake is strong it could potentially change things. I don't know exactly how they'll apply it in games, but when you're working in Dolby you generally only have to focus on the brightest version of the image (the 4000 nit version, for example) and the 1000, 600 and 100 nit versions are generated mostly automatically from that. If developers actually go for this (and more TVs actually support it) it could free them up from having to worry about how their engine works across HDR and SDR, and allow them to focus on a single set of values.
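
I don't know Dolby's actual maths (their trim passes and metadata are proprietary), but the principle behind those auto-generated versions is just tone-mapping the bright master down to each target peak. A toy sketch, with the knee position and rolloff curve entirely made up:

```python
def derive_trim(nits, master_peak=4000.0, target_peak=1000.0, knee=0.75):
    """Derive a lower-peak version of a bright master: values below
    the knee pass through untouched, and everything above rolls off
    so that master_peak lands exactly on target_peak."""
    k = knee * target_peak
    if nits <= k:
        return nits
    x = (nits - k) / (master_peak - k)   # 0..1 above the knee
    shoulder = 1.0 - (1.0 - x) ** 2      # crude ease-out, hits 1 at x = 1
    return k + (target_peak - k) * shoulder

# Midtones (300 nits) come through unchanged; the 4000-nit master's
# brightest highlight compresses down to exactly 1000 nits.
for nits in (300, 1000, 2000, 4000):
    print(f"{nits:6.0f} nit master -> {derive_trim(nits):6.1f} nit trim")
```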

 

That said, I imagine one of the main roadblocks to this, and one of the reasons HDR has been so hit-and-miss so far, is that HDR screens are expensive (reference-quality ones eye-wateringly so), and the vast majority of developers do their work on normal SDR monitors.

 

Anyway, that's my ramble over. HDR games: the good, the bad and the ugly. Have at it.


To be fair, Forza Horizon 3 was given as an early example of strong HDR but I thought that had issues too. It looked amazing when you were driving into the sun, but when you were driving away from it the image went all dark and overcast, which looked ridiculous considering the big bright blue sky. Horizon 4 looks phenomenal though.


Done well it’s incredible, but often it’s badly implemented... However, it must be hard when most people have horrible LCD screens set to stupid brightness levels, washing out all the colours, to then set accurate colours with HDR doing the heavy lifting.

 

Fortnite looks stunning without HDR, better than some HDR games, but then you see the really good stuff and fall in love with HDR again.

 

 


Yeah, the weather effects in FH4 are exceptional. Generally though, I'd be hard pushed to tell what HDR is doing. And if that's the fault of me having a mid-range LCD, instead of a cutting edge OLED, then Average Joe isn't seeing the benefit, either. 

 

Edit - you can't really blame consumers here, if it doesn't work then it isn't designed well enough. HD just worked, everyone saw the benefit. 


I can’t say I ever notice bad HDR. The only game that’s ever really bothered me is RDR2, because the HDR version looks significantly worse to my eyes. 
 

I have an OLED TV, but don’t really cross examine the black levels or whatever. I just leave HDR on and play.

 

HDR on PC seems like a complete disaster, though. Doesn’t seem like much has improved in the past 4 years, and it’s one of the reasons I prefer consoles these days. 


I find I constantly have to adjust HDR on the telly or whatever console.

 

Currently it looks great on the TV’s apps but shite on the same apps on the Xbox, despite setting it up on that too several times. Games look ok, but sometimes washed out or more commonly a bit yellow.


3 minutes ago, TehStu said:

Yeah, the weather effects in FH4 are exceptional. Generally though, I'd be hard pushed to tell what HDR is doing. And if that's the fault of me having a mid-range LCD, instead of a cutting edge OLED, then Average Joe isn't seeing the benefit, either. 

 

Edit - you can't really blame consumers here, if it doesn't work then it isn't designed well enough. HD just worked, everyone saw the benefit. 


They had their shitty LCDs set up wrong then too ;) 

 

HDR is an absolute game changer when you have the TV set up properly, so your colours are accurate and the black levels are correct - add HDR into the mix and the jump is way above any resolution jump of recent times. But get a poorly set-up TV with low-end HDR capabilities and it will often look worse than a decently calibrated SDR TV for any input, let alone games. Hence giving devs some slack - just a little, as there is loads of appalling HDR in gaming.


19 minutes ago, TehStu said:

Yeah, the weather effects in FH4 are exceptional. Generally though, I'd be hard pushed to tell what HDR is doing. And if that's the fault of me having a mid-range LCD, instead of a cutting edge OLED, then Average Joe isn't seeing the benefit, either. 

 

Edit - you can't really blame consumers here, if it doesn't work then it isn't designed well enough. HD just worked, everyone saw the benefit. 

 

It's overly complicated, no doubt about it. Four different HDR standards, with various TV manufacturers picking and choosing those standards and implementing them differently in different models. We're still in the wild west period.

 

It's absolutely no surprise you see people on forums saying things like 'to make this game look good, set it to 4000 nits and turn the gamma down to -100' - insane advice, but between the game and their screen there are a million different variables, and if I didn't deal with this stuff at work I wouldn't know any better either.


20 minutes ago, HarryBizzle said:

I can’t say I ever notice bad HDR. The only game that’s ever really bothered me is RDR2, because the HDR version looks significantly worse to my eyes. 
 

I have an OLED TV, but don’t really cross examine the black levels or whatever. I just leave HDR on and play.

 

HDR on PC seems like a complete disaster, though. Doesn’t seem like much has improved in the past 4 years, and it’s one of the reasons I prefer consoles these days. 


Yes, I don’t really know what HDR is supposed to do.

 

Cyberpunk looks amazing to me in HDR, and its HDR is supposed to be broken, but I haven’t played any full fat console games other than Switch since the 360, so it just looks how I expect a modern game to look.


Biggest issue for me is how the PS5 forces everything into HDR, so even games which weren’t designed for it are displayed in HDR. It makes a lot of PS4 games look noticeably worse than when running them on a PS4 Pro, where the auto mode would actually disable HDR on games which didn’t use it. 
 

Not sure if this is something Sony have acknowledged and plan to fix or not. 


19 minutes ago, Shimmyhill said:

They had their shitty LCDs set up wrong then too ;) 

 

Nope. If it doesn't just benefit the consumer, whether automatically or by walking them through a wizard or whatever, then nope.

 

I know you're winking, but everything has to just work. Now, whether the exact shade of red matches what the director intended because I have my color temp wonky and the room is lit with yellow bulbs and awkwardly reflecting on the screen? That's fine. But technology just needs to work. Imagine having to delve into firewall settings to make the most of your internet connection, lest you only get 40% performance or whatever.

 

It's probably fine that HDR doesn't work easily without tinkering because most people probably don't notice, or care what it is. It's lousy for anyone who is interested in it, though.


 

3 minutes ago, Jazz Glands said:

Biggest issue for me is how the PS5 forces everything into HDR, so even games which weren’t designed for it are displayed in HDR. It makes a lot of PS4 games look noticeably worse than when running them on a PS4 Pro, where the auto mode would actually disable HDR on games which didn’t use it. 
 

Not sure if this is something Sony have acknowledged and plan to fix or not. 

 

Yeah I have HDR turned off on PS5 for this reason, and because my TV doesn't display HDR very well in Game Mode.


The irony being that I can't really tell if that's good or not on my non-HDR PC screen. I mean, the scene's lighting is lovely, but I wouldn't also say "yeah, some nice HDR". 

 

Maybe suitably equipped TVs should have some before/after in their calibration, to showcase the difference. Mine came with a Dolby Vision app and, yeah, it looks fine. I tweaked a couple of settings based on what a TV review site recommended (nothing absurd, mostly turning off frame interpolation and other guff). But I don't know if it's good.

 

Argh :)

 


I’ve never understood why films and TV shows just work with HDR while games need multiple poorly explained sliders and menus to activate it. Telling me to set the white point between 0-1000 doesn’t mean anything to the average user, and setting the image so you can ‘barely see it’ fails because I don’t know what they mean by ‘barely’.
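
(For reference, the nit numbers in those sliders aren't arbitrary: HDR10 encodes absolute luminance via the SMPTE ST 2084 'PQ' curve. A rough Python sketch of the standard encode function - the constants are from the spec, the framing around them is mine:)

```python
# SMPTE ST 2084 (PQ) constants, as defined in the spec.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map absolute luminance (0-10000 nits) to a 0-1 PQ signal value."""
    y = max(nits, 0.0) / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1.0 + C3 * yp)) ** M2

# 'Set peak brightness to 1000' literally tells the game which signal
# level your panel tops out at: pq_encode(1000) ~= 0.752, i.e. a
# 10-bit limited-range code of roughly 723.
print(round(pq_encode(1000), 3))
```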


Rarely had any issues with HDR on either PS4 Pro or PS5, other than the games where it's known to be rubbish (i.e. RDR2).

 

What disappoints me most about HDR recently is that on the Series S it's locked to 4K displays rather than being its own option, so while I have a 1080p HDR TV for that console, I can't take advantage of HDR in Series S games or the Auto HDR on backwards-compatible ones :(  Poor, given that a bog-standard PS4 would allow HDR at 1080p.


1 hour ago, HarryBizzle said:

I can’t say I ever notice bad HDR. The only game that’s ever really bothered me is RDR2, because the HDR version looks significantly worse to my eyes. 
 

I have an OLED TV, but don’t really cross examine the black levels or whatever. I just leave HDR on and play.

 

HDR on PC seems like a complete disaster, though. Doesn’t seem like much has improved in the past 4 years, and it’s one of the reasons I prefer consoles these days. 

 

I take it you haven't played Resident Evil 2 in HDR.

 

As if by magic:

 

45 minutes ago, Down by Law said:

The best example of HDR i've seen is the Gun Shop Kendo sign in Resident Evil 2, the neon effect is so lifelike.

 

Here is a redundant picture of it 

 

[image: the Gun Shop Kendo sign in Resident Evil 2, HDR on]

 

This is a perfect example of why the HDR in Resident Evil 2 is fucking terrible.

 

Look how grey everything looks. It's at night and there's hardly any black in that picture. You should not be able to read the "RPD" on Leon's back.

 

The Resident Evil 2 HDR is a perfect example of how bad it can be. It just makes it look like the gamma settings on your screen are fucked.

 

Here is the same scene with HDR off:

 

 

[image: the same scene with HDR off]

 

You actually get blacks and not greys, and so have the horror atmosphere the original developers intended.

 


46 minutes ago, TehStu said:

Nope. If it doesn't just benefit the consumer, whether automatically or by walking them through a wizard or whatever, then nope.

 

I know you're winking, but everything has to just work. Now, whether the exact shade of red matches what the director intended because I have my color temp wonky and the room is lit with yellow bulbs and awkwardly reflecting on the screen? That's fine. But technology just needs to work. Imagine having to delve into firewall settings to make the most of your internet connection, lest you only get 40% performance or whatever.

 

It's probably fine that HDR doesn't work easily without tinkering because most people probably don't notice, or care what it is. It's lousy for anyone who is interested in it, though.


HDR looking worse on a current TV is no different from SD content looking worse when consumers moved from CRT to LCD; outside of resolution, you can argue (quite correctly) that we are still getting a worse picture from our tellies now than we were 20 years ago...
 

My point is that if you care about having a decent picture on any current TV, you need to spend a few hours tweaking it per input, per device, and for day/night - so HDR is no different and no less consumer-friendly than any TV of the past 20+ years.

 

If you buy a telly and leave it in default, dynamic or Samsung mode then you deserve crap HDR.

 

 


5 minutes ago, Shimmyhill said:

My point is that if you care about having a decent picture on any current TV, you need to spend a few hours tweaking it per input, per device, and for day/night - so HDR is no different and no less consumer-friendly than any TV of the past 20+ years.

 

 

Absolute rubbish. OLEDs have a fantastic picture straight out of the box with no tweaking.

 

You may want to turn off motion blur if you don't like the soap opera effect. That's it.

 

Most top-end TVs will be similar.

 

And the issue is that cheap TVs just cannot do good HDR. They lack the brightness and contrast, regardless of how many settings you fiddle with. Most sub-£500 TVs will be sold as 'HDR' TVs with absolutely garbage HDR picture quality - in many cases it's barely any different from their SDR picture.

 

That doesn't mean that they don't get sold as 'HDR' screens though.


1 minute ago, Isaac said:

 

Absolute rubbish. OLEDs have a fantastic picture straight out of the box with no tweaking.

 

You may want to turn off motion blur if you don't like the soap opera effect. That's it.


I own OLEDs and tweaked both - they have a decent picture out of the box, I’ll agree, and I’d argue HDR is equally decent on defaults, but tweaked you can make decent improvements. If you’ve only turned motion blur off on your OLED, head over to RTINGS and try their base setups to truly get your OLED to sing!

 

However, I’m talking about the vast majority of TVs consumers buy, which need way more time spent on them to look anything other than nasty, let alone to get HDR looking good - and lots of them will never do well here, as the tech is poor :( 


25 minutes ago, Shimmyhill said:

If you buy a telly and leave it in default, dynamic or Samsung mode then you deserve crap HDR.

 

13 minutes ago, Shimmyhill said:

However, I’m talking about the vast majority of TVs consumers buy, which need way more time spent on them to look anything other than nasty, let alone to get HDR looking good - and lots of them will never do well here, as the tech is poor :( 

 

It's consumers' fault. It's not the fault of consumers.

 

Roger that.


1 hour ago, Down by Law said:

Resident Evil 2

messed up its HDR implementation. Blacks are greys - surely the worst genre to do that in! (EDIT: just saw the other post lol.)

 

Thankfully it looks like Village has a better implementation, going by the Maiden technical demo.

 

 

So yeah HDR.

The issue is greed at the end of the day - several entities trying to control the oil well by coming up with their own standards and getting everyone onto them. Of course that wasn't going to work! HGIG is a step forward but there's a ways to go.

The next layer of shit under that is that devs don't put enough effort into proper grading/mastering.

 


HDR isn't just about searing brightness. One of the major benefits of HDR that isn't promoted so well is WCG: Wide Colour Gamut. Let's go back to the '80s. Look at the colours on the C64, and then on the Amiga. The Amiga clearly has a wider range of colours to play with, right? I've simplified it, but it's essentially the same thing with WCG available when HDR is active. This results in a more natural picture - provided what you're viewing takes advantage of the wider colour gamut.

 

Most HDTVs use a colour standard called Rec.709. That's the C64 of colour standards, but still looks fantastic (see Cyberpunk). Most 4K TVs now have the ability to process Rec.2020, the Amiga of colour standards. Switching to HDR also switches to Rec.2020 if it's encoded in the source material. 
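
For the curious, the Rec.709-to-Rec.2020 jump is easy to see in code. A small sketch using the published ITU-R BT.2087 matrix for linear-light (not gamma-encoded) RGB - the demo around it is just illustrative:

```python
import numpy as np

# Published BT.2087 conversion matrix: linear RGB with Rec.709
# primaries re-expressed against Rec.2020 primaries.
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

# A pure Rec.709 red, in Rec.2020 coordinates, needs contributions
# from all three 2020 primaries - i.e. it sits well inside the bigger
# gamut, which is the C64-vs-Amiga point above.
rec709_red = np.array([1.0, 0.0, 0.0])
print(M_709_TO_2020 @ rec709_red)   # ~[0.627, 0.069, 0.016]
```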

 

As for the perception of HDR, I can spot bad HDR straight away. I saw it immediately with Cyberpunk, and also spotted it immediately with Red Dead Redemption 2 when the HDR update came out. At the opposite end I can spot good HDR too: Doom Eternal on PC was a very pleasant surprise, and more recently there's Miles Morales and Demon's Souls on PS5 as standouts.

 

I love HDR ... when it's done right. On PS4 and PS5 it's mostly been done right, especially in first-party titles. Third-party games are more hit and miss - Resident Evil 2 remake, for example: what a shit show with HDR on.


I don’t really know how much use most games make of the expanded gamut, considering the ‘source’ of the image in a game comes from the textures, and those textures are presumably authored in sRGB (which shares its primaries with Rec.709). It’s a bit different to film, where the potential for extra colour information is inherent in the raw footage.

 

Post-shaders, lighting effects etc are perhaps a different story, but even then they’re only manipulating the colour values of what’s already there.


5 minutes ago, CarloOos said:

I don’t really know how much use most games make of the expanded gamut

Yeah, this is something I'd like to know too. I've got Miles Morales on right now; if I go into the service menu of my LG C9 it tells me it's displaying Rec.2020, but I have no idea if it's the game using it or just the PS5 doing it at a system level.

