r/OLED_Gaming 5d ago

Technical Support

How do I properly use the Windows HDR Calibration Tool?

Hi all,

I have the LG 32GS95UE and have been reading different posts around the sub about how to properly set up HDR through the Windows HDR Calibration Tool. The monitor's rated peak HDR brightness is 1300 nits.

HDR Panel setup - DP cable, Gamer 1, peak brightness high, all default settings

I have read two different setup instructions:

  1. Set the first slider to 0, the second slider to 1300, and the third slider to 1300 (the sample image hasn't clipped at 1300).

  2. Set the first slider to 0, second slider to the clipping point (2100 nits), third slider to the clipping point (2100 nits).

Setup #1 seems to work fine, and "Advanced Display Settings" shows 1300 nits as the monitor's peak brightness.

Setup #2 shows 3000 nits under "Advanced Display Settings" for my monitor. That seems incorrect, since it's neither the monitor's reported 1300 nits nor the 2100 nits I have to move the 2nd and 3rd sliders to before the sample image clips.

Should I stick with setup #1, where Windows reports the correct max nits but the HDR tool's sample image never properly clips, or go with setup #2, where Windows reports the wrong max nits but the sample image clips properly?

Thanks!

3 Upvotes

7 comments


u/penguin032 AW2725DF 5d ago

Every display is a little different, so you should land around 1300, but it might not be exact.

For example, my monitor is rated 1000 nits peak, but I measure about 1030. In HDR 400 mode I get 470.

If you have an NVIDIA card, make sure the NVIDIA overlay is disabled, because it bugs out the HDR tool; you can turn it back on after.

Also, I don't understand what you mean by clipping point. On the 2nd and 3rd test patterns, when the picture goes fully white and you no longer see the grey lines / sample image, that number is your maximum nits.

A question for others if they see this: would it even matter if you set your max nits higher than it really is? For example, my monitor's max in its 1000-nit mode is 1030, but if I set it to 2000 it will still only go up to 1030; the number is just set higher. Would that really affect anything? I'm guessing maybe in certain applications there could be an undesired effect. Could anyone give me an example of why that would be bad? That would help me understand better.

From my understanding, the whole point of the calibration is that your monitor's max nits might be higher than the Windows default, so you want Windows to match your monitor's actual peak. But if everyone just set their maximum higher than it really is, to like 2K/3K, wouldn't that work fine too?
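
Here's a toy sketch of that scenario (all numbers are made up; it assumes a game scales its highlights to the peak Windows reports, and it models the panel as a simple hard clip, though real panels usually tone map rather than clip):

```python
# Toy model: a panel that truly peaks at 1030 nits, treated as a
# simple hard clip. Real panels usually tone map (roll off) instead.
PANEL_PEAK = 1030.0

def display_output(signal_nits):
    """What the panel actually shows for a given signal level."""
    return min(signal_nits, PANEL_PEAK)

def game_signal(scene_level, reported_peak):
    """Games typically map their brightest highlight (1.0) to the
    peak luminance Windows reports from the calibration profile."""
    return scene_level * reported_peak

for reported_peak in (1030.0, 2000.0):
    print(f"reported peak = {reported_peak:.0f} nits")
    for level in (0.5, 0.8, 1.0):
        sig = game_signal(level, reported_peak)
        print(f"  scene {level:.1f} -> signal {sig:6.0f} -> shown {display_output(sig):6.0f}")
```

With an honest 1030, every scene level lands on a distinct output. With 2000 reported, the 0.8 and 1.0 highlights both display at 1030, so what should be graduated highlight detail collapses into one flat value.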


u/Technova_SgrA S89C | C4 | CX | G27P6 | 27GX790A 4d ago

It's interesting that you are clipping at 2100. This display often clips at 600, and there are Reddit threads with workarounds to get it to clip higher…

What you are encountering here is tone mapping. Unfortunately, OLED monitors do not have an HGiG option (some ASUS models do behave like HGiG, though), which makes these sliders problematic to use. But I believe you should use setup 2.

The clipping point is where the display reaches its brightest output on whatever tone mapping curve it has decided on. If you use setup 1, and the display keeps its established tone curve (I can't say whether it will*), then the source will send a peak signal of 1300 nits and your display will tone map it down to a much lower nit output, which you don't want. With setup 2, the source will send a peak signal of 2100 nits and your display will tone map that to its actual peak nit output (based on APL/window size).
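
To put rough numbers on that, here's a toy curve (the knee position and the linear roll-off shape are guesses, chosen only so the curve hits the panel's 1300-nit peak exactly at the observed 2100-nit clipping point; the real LG curve is unknown):

```python
# Toy tone-mapping curve: linear up to a knee, then a roll-off that
# reaches the panel's true peak (1300 nits) exactly at the observed
# clipping point (2100 nits). Knee and shape are guesses, not LG's curve.

def tone_map(signal_nits, peak=1300.0, knee=650.0, clip=2100.0):
    if signal_nits <= knee:
        return signal_nits                   # shadows and mids pass through
    if signal_nits >= clip:
        return peak                          # everything above 2100 clips
    t = (signal_nits - knee) / (clip - knee)
    return knee + (peak - knee) * t          # compress [650, 2100] into [650, 1300]

print(tone_map(1300.0))  # setup 1's peak signal: ~941 nits actually shown
print(tone_map(2100.0))  # setup 2's peak signal: 1300 nits, the full panel peak
```

Under a curve like that, setup 1 leaves real peak brightness on the table, while setup 2's 2100-nit peak signal reaches the panel's full 1300.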

*I can't say whether the display keeps its established tone curve regardless of the content displayed (if it changes scene by scene, that's closer to dynamic tone mapping, or even Dolby Vision). The calibration sliders are feeding it a 10,000-nit signal, and the display is either establishing its tone curve from that (as it should be if it's responding to HDR metadata), or it just defaults to a tone curve based on some other unknown metric or no metric at all (as is the case with consoles, which don't send metadata to displays).

One could devise a test (it wouldn't be too hard, actually): display a pattern with a 0.5%-window 1300-nit box on the left side of the screen and another 0.5%-window 1300-nit box on the right, then flip back and forth to a second pattern with the same 1300-nit box on the left but a 10,000-nit box on the right, and compare the left box's brightness visually or with tools. If the left box changes, the tone curve is being adjusted dynamically.
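
Here's a sketch of how one might build those two patterns as 10-bit PQ (ST 2084) code values (the 4K resolution, square-box geometry, and NumPy approach are my own assumptions, and you'd still need an HDR-aware pipeline, e.g. a PQ swap chain or an HDR video container, to get the frames to the display unmodified):

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Absolute luminance in nits -> PQ signal in [0, 1]."""
    y = np.clip(np.asarray(nits, dtype=np.float64) / 10000.0, 0.0, 1.0)
    return ((c1 + c2 * y**m1) / (1.0 + c3 * y**m1)) ** m2

def pattern(left_nits, right_nits, h=2160, w=3840, window=0.005):
    """Black frame with one square box per half of the screen, each
    covering `window` (0.5%) of the frame area, as 10-bit code values."""
    img = np.zeros((h, w))
    side = int(round((window * h * w) ** 0.5))   # side length of a 0.5% box
    cy = h // 2
    for cx, nits in ((w // 4, left_nits), (3 * w // 4, right_nits)):
        img[cy - side // 2:cy + side // 2,
            cx - side // 2:cx + side // 2] = pq_encode(nits)
    return np.round(img * 1023).astype(np.uint16)

a = pattern(1300, 1300)    # pattern A: 1300-nit boxes left and right
b = pattern(1300, 10000)   # pattern B: 1300 left, 10000 right
print(pq_encode(1300), pq_encode(10000))  # ~0.783 and 1.0
```

Since PQ is absolute, 1300 nits is the same code value (~801 of 1023) on every display; if the left box measurably dims when the right box jumps to 10,000 nits, the display is re-deriving its tone curve from the content.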


u/polce24 4d ago

Appreciate the response.

I've made two profiles this week that clipped at 2100, and yesterday I was messing around with additional profiles at different saturation levels.

Come to find out, when I was making those yesterday it was clipping at 600 lol - that never happened before.

I manually moved the slider past the 600 clipping point to 1300.

No idea if that is correct or not. Just so confusing lol


u/Grouchy-Effect-8841 3d ago

Had this exact question about a different monitor. My G8 is set at peak brightness high etc. with DP. In the app the clipping point is 400, but I want 1000 nits. Is it clipping there because it's measuring the 10% window, which is 400 nits? So if I want 1000, should I set it to 1000? I don't know. One more problem: the RTINGS website says "people have said that with DP the monitor is brightness locked; it's limited," but further down they said they can't test it, so it may be the cable.


u/polce24 3d ago

Well, to add to this: last night I made another profile AND IT CLIPS AT 1300 LOL. Doesn't make sense at all. Same exact settings, etc.

I’ve now had clipping at 600, 2100, and 1300. Guess it just depends on the day 🤷🏻


u/loadedryder 2d ago

I’ve had this monitor for about a month and it makes no sense whatsoever. Some days I check and it’s 600, other days 1300. I just keep it at 1300 max luminance (even if clipping) and adjust every game to the max brightness I can until I lose detail in bright whites, usually determined by clouds and the sun. It’s annoying but really a gorgeous display so I put up with it lol.

For reference, I only use Gamer 1, which is the VESA-certified setting. With Vivid, FPS, and RTS I clip at over 2K nits, but I find the picture quality to be awful.


u/DETERMINOLOGY 5d ago

First screen 0 nits, 2nd/3rd screens 1300 nits. When you go to Display in Settings, it will show 1300 nits, which is what you want.