Almost thirty years ago the PNG image format was developed as a significantly better and more modern alternative for lossless image storage than any of the existing formats (particularly GIF). It did many things right, combining into one format all the various image features that almost none of the existing formats supported simultaneously (e.g. GIF only supports 256 colors, which is horrendous, and only single-color transparency; TIFF and several other formats support almost all image features but have extremely poor compression; and so on.)
However, there was one thing that the PNG format tried to do "right" but which ended up causing a ton of problems and becoming a huge pain in the ass for years and years to come, particularly once support for the format became widespread, including in web browsers. That feature was support for a gamma correction setting.
Without going into the details of what gamma correction is (as this can be easily found online), it's an absolute swamp of complications, and with the standardization and widespread support for PNG it became a nightmare for at least a decade.
In the vast, vast majority of cases, particularly when using image files in web pages, people just want unmodified pixels: Black is black, white is white, 50% gray is 50% gray, and everything in between. Period. If a pixel has RGB values (10, 100, 200), then they want those values used as-is (e.g. in a web page), not modified in any way. In particular, if you specify a background or text color of RGB (10, 100, 200) in your web page, you definitely want an image containing that same value to match it visually, exactly.
When PNG became widely supported and popular in web pages, its gamma specification caused a lot of problems. That's because when a gamma value is specified in the file, conforming viewer software (such as a web browser) will change the pixel values accordingly, thus making them look different. And the problem is that not only do different systems use different gamma values (most famously, Windows and macOS used, and maybe still use, different values), but support for gamma correction also varied among browsers, some of them supporting it and others not.
"What's the problem? PNG supports a 'no gamma' setting. Just leave the gamma setting out. Problem solved." Except that the PNG standard, at least back then, specifically said that if gamma wasn't specified in the file, for the viewing software to assume a particular gamma (I think it was 2.2). This, too, caused a lot of problems because some web browser were standard-conformant in this regard, while others didn't apply any default gamma value at all. This meant that even if you left the gamma setting out of the PNG file, the image would still look different in different browsers.
This was a nightmare because many web pages assumed that images would be shown unmodified and thus that colors in the image would match those used elsewhere on the page.
I think the situation has stabilized somewhat in the decades since, but it can still rear its ugly head.
I feel that HDR is currently in a similar situation to the gamma issue in PNG files, in that it, too, causes a lot of problems and is a real pain in the ass to deal with.
If you buy a "4k HDR" BluRay, most (if not all) of them assume that you have an HDR-capable BluRay player and an HDR-capable television. In other words, the BluRay will only contain an HDR version of the video. Most of them will not have a regular non-HDR version.
What happens if your TV does not support HDR (or the support is extremely poor), and your BluRay player does not support "un-HDR'ing" the video content? What happens is that the video will be much darker than intended, with completely wrong color tones, and will simply look wrong.
This is the exact situation, at least currently, with the PlayStation 5: It can act as a BluRay player and supports 4k HDR BluRay discs, but (at least as of this writing) it has no support for converting HDR video material to non-HDR video (e.g. by clamping the ultra-bright pixels to the maximum non-HDR brightness) and will just send the HDR material to the TV as-is. If your TV does not support HDR (or it has been turned off because it makes the picture look like ass), the video will look horrendous, with completely wrong color tones, and much darker than it should be.
(It's a complete pain in the ass. If I want to watch such a BluRay disc, I need to go and turn HDR support on, after which the video will look acceptable, even though my TV has very poor-quality HDR support. But at least the color tones will be ok. Afterwards I need to go back and turn HDR off to make games look ok once again. This is such a nuisance that I have stopped buying 4k HDR BluRays completely, after the first three.)
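For what it's worth, even the crude "clamp the highlights" fallback mentioned above is conceptually simple. Here's a rough Python sketch of the idea, purely as an illustration and under simplifying assumptions (real HDR video is encoded with the PQ or HLG transfer functions and a real player would use proper tone mapping, not this):

```python
# Crude sketch of an HDR-to-SDR fallback: take linear-light pixel values where
# 1.0 is SDR reference white and HDR highlights go above it, clamp the
# highlights, and gamma-encode the result for an SDR display. Illustrative
# only; not how a real player converts HDR video.

def hdr_to_sdr(linear_pixels, sdr_gamma=2.2):
    """Map linear-light HDR values (1.0 = SDR white) to 8-bit SDR values."""
    out = []
    for value in linear_pixels:
        clamped = min(value, 1.0)               # throw away the extra highlight range
        encoded = clamped ** (1.0 / sdr_gamma)  # gamma-encode for an SDR display
        out.append(round(255 * encoded))
    return out

# Mid-gray, SDR white, and an HDR highlight four times brighter than SDR white:
print(hdr_to_sdr([0.2, 1.0, 4.0]))   # [123, 255, 255]
```

It would lose all the highlight detail above SDR white, but the result would at least have roughly correct brightness and color tones on a non-HDR TV.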
More recently, I ran into another instance of the pain in the ass that is HDR, this time with the recently released Nintendo Switch 2.
Said console supports HDR. As it turns out, if your TV supports HDR, the console will turn HDR on, and there's no option to turn it off, anywhere. (There's one setting that claims to turn it off for games, but it does nothing.)
I didn't realize this and wondered for some weeks why the picture looked like ass. It was a bit too bright, everything a bit too washed-out, with the brightest pixels just looking... I don't know... glitched somehow. The thing is that I had several points of direct comparison: the console's own display, as well as the original Switch. On those, the picture looks just fine. However, on my TV the picture from the Switch 2 looked too washed-out, too low-contrast, too bright. And this even though I had adjusted the HDR brightness in the console's settings.
One day this was bothering me so much that I started browsing the TV's own menus, until, buried deep within layers of settings, I found one that turns HDR support completely off for that particular HDMI input.
And what do you know, the picture from the Switch 2 immediately looked perfect! Rich colors, rich saturation, good contrast, just like on the console's own screen.
The most annoying part of all of this is that, as mentioned, it's literally not possible to reach this state using the console's own system settings. If your TV tells the console that it supports HDR, the console will use HDR, and there's nothing you can do on the console itself to avoid that. You literally have to turn HDR support off in your TV's settings to make the console stop using it.
The PlayStation 5 does have a system setting that turns HDR completely off from the console itself. The Switch 2 does not have such a setting. It's really annoying.
The entire HDR thing is a real pain in the ass. So far it has only caused problems without any benefits, at least to me.