How do you all calibrate modern screens for SDR and HDR?

We have the 240p test suite for CRTs, but what do you guys use for modern stuff? I tend to use random test patterns from YouTube but I’m not at all scientific about it.

Are there any resources you have used that you could recommend? I find color saturation and temperature the hardest to get right. I usually end up with perfect visuals and one oversaturated color that I can’t dial down without ruining the rest of the picture.
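
For what it’s worth, rendering patterns locally at least takes YouTube compression out of the equation. A rough sketch using numpy and Pillow (the ramp and 75% bar values are just example choices, not any broadcast standard):

```python
# Rough sketch: write lossless PNG test patterns instead of relying on
# compressed video. Assumes numpy and Pillow are installed; the exact
# pattern choices are illustrative.
import numpy as np
from PIL import Image

W, H = 1920, 1080

# Horizontal grayscale ramp: useful for spotting black crush, white
# clipping and banding.
ramp = np.tile(np.linspace(0, 255, W, dtype=np.uint8), (H, 1))
Image.fromarray(np.stack([ramp] * 3, axis=-1)).save("ramp.png")

# Simple 75% color bars: lets you judge saturation one primary at a
# time instead of eyeballing mixed content.
colors = [
    (191, 191, 191), (191, 191, 0), (0, 191, 191), (0, 191, 0),
    (191, 0, 191), (191, 0, 0), (0, 0, 191), (0, 0, 0),
]
bars = np.zeros((H, W, 3), dtype=np.uint8)
bar_w = W // len(colors)
for i, color in enumerate(colors):
    bars[:, i * bar_w:(i + 1) * bar_w] = color
Image.fromarray(bars).save("bars.png")
```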

Then HDR and Dolby Vision present their own issues, since many sets that support those formats don’t truly have the nits to do them correctly.
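
The nits point is easy to put rough numbers on, since PQ (the ST 2084 curve that HDR10 and Dolby Vision are built on) encodes absolute luminance. A minimal sketch of the EOTF, with the published ST 2084 constants and a few example code values:

```python
# Minimal sketch: the PQ (SMPTE ST 2084) EOTF maps a code value to an
# absolute luminance, which is why a panel's peak nits matters so much
# for HDR. The constants are the published ST 2084 values.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(code, bit_depth=10):
    """Convert an integer PQ code value to luminance in cd/m^2 (nits)."""
    e = code / (2 ** bit_depth - 1)           # normalised signal, 0..1
    p = e ** (1 / m2)
    y = (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
    return 10000 * y                          # PQ tops out at 10,000 nits

# Highlights mastered near the top of the 10-bit range ask for far more
# light than a ~600-nit panel can produce, so they end up tone mapped.
for code in (512, 769, 1023):
    print(code, round(pq_to_nits(code), 1), "nits")
```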

They still look good on my OLED though.

Also, I find other people’s numbers for screen calibration to always look terrible and either too vibrant or too washed out. So I always rely on my own judgment.

But I know, for sure, there has to be a more scientific way of doing this.


If sRGB is an option I’ll always go with that when displaying SDR. The vast majority of content is still made around that colour space.

HDR content is generally mastered to the DCI-P3 colour space, so ideally you’d want your display, TV, or operating system to be colour-aware so you don’t have to worry about switching.
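
For anyone curious what colour-aware actually means in practice, it boils down to a gamut conversion along these lines. A rough numpy sketch using the commonly published D65 sRGB and Display P3 matrices (treat the exact numbers as illustrative); sending sRGB values straight to a wide-gamut panel without this step is one common source of that oversaturated look:

```python
# Rough sketch of what a colour-aware OS/app does when showing sRGB
# content on a Display P3 screen. Matrices are the commonly published
# D65 values.
import numpy as np

SRGB_TO_XYZ = np.array([
    [0.4124564, 0.3575761, 0.1804375],
    [0.2126729, 0.7151522, 0.0721750],
    [0.0193339, 0.1191920, 0.9503041],
])
XYZ_TO_P3 = np.array([
    [ 2.4934969, -0.9313836, -0.4027108],
    [-0.8294890,  1.7626641,  0.0236247],
    [ 0.0358458, -0.0761724,  0.9568845],
])

def srgb_decode(c):
    """Piecewise sRGB transfer function -> linear light (input 0..1)."""
    c = np.asarray(c, dtype=float)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def srgb_encode(c):
    """Inverse of srgb_decode (Display P3 reuses the sRGB curve)."""
    c = np.asarray(c, dtype=float)
    return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

def srgb_to_display_p3(rgb):
    linear = srgb_decode(rgb)
    p3_linear = XYZ_TO_P3 @ (SRGB_TO_XYZ @ linear)
    return srgb_encode(np.clip(p3_linear, 0.0, 1.0))

# Pure sRGB red lands well inside the P3 gamut; sending (1, 0, 0) to
# the panel unconverted is what makes reds look nuclear.
print(srgb_to_display_p3([1.0, 0.0, 0.0]))
```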

I noticed Windows’ HDR mode handles this properly - if you turn on HDR but are displaying SDR content you’ll still get the sRGB colour space, with the wider P3 gamut only being used for HDR content to avoid oversaturating everything.
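
Roughly speaking, what that looks like under the hood is the compositor linearising the sRGB content and placing it in the PQ signal at a chosen reference white, rather than stretching it to the panel’s peak. A sketch of the idea - the 200-nit reference white is just an example, not necessarily what Windows uses:

```python
# Sketch of how SDR content can be composed into an HDR (PQ) signal:
# linearise the sRGB value, scale it to a chosen reference white in
# nits, then encode with the inverse PQ EOTF. The 200-nit reference
# white is an example value, not what Windows necessarily uses.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def srgb_decode(c):
    """Piecewise sRGB transfer function -> linear light, input 0..1."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def nits_to_pq(nits):
    """Inverse PQ EOTF: absolute luminance -> normalised 0..1 signal."""
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def sdr_in_hdr(srgb_value, ref_white_nits=200):
    """Map one sRGB channel value (0..1) into a 10-bit PQ code value."""
    nits = srgb_decode(srgb_value) * ref_white_nits
    return round(nits_to_pq(nits) * 1023)

# sRGB white at a 200-nit reference lands well below the 1023 ceiling,
# leaving headroom in the signal for genuine HDR highlights.
print(sdr_in_hdr(1.0))
```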