Sunday, August 31, 2025

HDR is a pain in the ass

Almost thirty years ago the PNG image format was developed as a significantly better and more modern alternative for lossless image storage than any of the existing ones (particularly GIF). It did many things right, and it combined into one format all the various features that almost none of the other existing formats supported simultaneously (eg. GIF only supports 256 colors, which is horrendous, and only single-color transparency. TIFF and several other formats support almost all image features but have extremely poor compression. And so on.)

However, there was one thing that the PNG format tried to do "right" but which ended up causing a ton of problems and becoming a huge pain in the ass for years and years to come, particularly when support for the format became widespread, including by web browsers. And that feature was support for a gamma correction setting.

Without going into the details of what gamma correction is (as this can be easily found online), it's an absolute swamp of complications, and with the standardization and widespread support for PNG it became a nightmare for at least a decade.

In the vast, vast majority of cases, particularly when using image files in web pages, people just want unmodified pixels: Black is black, white is white, 50% gray is 50% gray, and everything in between. Period. If a pixel has RGB values (10, 100, 200), then they want those values used as-is (eg. in a web page), not modified in any way. In particular, if you eg. specify a background or text color of RGB (10, 100, 200) in your web page, you definitely want an image that uses that same value to visually match it exactly.

When PNG became widely supported and popular in web pages, its gamma specification caused a lot of problems. That's because when a gamma value is specified in the file, conforming viewer software (such as a web browser) will change the pixel values accordingly, thus making them look different. And the problem is that not only do different systems use different gamma values (most famously, Windows and macOS used, and maybe even still use, different values), but support for gamma correction also varied among browsers, some of them supporting it, others not.
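
For the curious, here's roughly what that adjustment looks like. This is a minimal sketch, assuming 8-bit samples, a file gamma taken from the PNG's gamma setting, and a display exponent such as 2.2 (Windows) or 1.8 (old Macs); it illustrates the idea, not any actual browser's code:

    # A toy sketch of the kind of gamma adjustment a conforming PNG decoder
    # performs. Assumptions for this illustration: 8-bit samples, file_gamma
    # is the value from the PNG's gamma setting (eg. 1/2.2 = 0.45455), and
    # display_exponent is the display's own gamma (eg. 2.2 for Windows,
    # 1.8 for old Macs). Not any real decoder's actual code.

    def gamma_correct(sample: int, file_gamma: float, display_exponent: float) -> int:
        """Map one 8-bit sample to the value that gets sent to the display."""
        exponent = 1.0 / (file_gamma * display_exponent)
        corrected = (sample / 255.0) ** exponent
        return round(corrected * 255.0)

    # When the file gamma and the display exponent cancel out, pixels pass
    # through unchanged; otherwise the very same pixel value gets shifted:
    print(gamma_correct(100, 1 / 2.2, 2.2))  # 100 (no visible change)
    print(gamma_correct(100, 1 / 2.2, 1.8))  # about 81 (visibly darker)

The point is simply that the same pixel value ends up looking different depending on what the viewer assumes about the file and the display, which is exactly why the same PNG could look different in different browsers.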

"What's the problem? PNG supports a 'no gamma' setting. Just leave the gamma setting out. Problem solved." Except that the PNG standard, at least back then, specifically said that if gamma wasn't specified in the file, for the viewing software to assume a particular gamma (I think it was 2.2). This, too, caused a lot of problems because some web browser were standard-conformant in this regard, while others didn't apply any default gamma value at all. This meant that even if you left the gamma setting out of the PNG file, the image would still look different in different browsers.

This was a nightmare because many web pages assumed that images would be shown unmodified and thus that the colors in an image would match those used elsewhere in the page.

I think that in later decades the situation has stabilized somewhat, but it can still raise its ugly head.

I feel that HDR currently is similar to the gamma issue in PNG files in that it, too, causes a lot of problems and is a real pain in the ass to deal with.

If you buy a "4k HDR" BluRay, most (if not all) of them assume that you have a HDR-capable BluRay player and television display. In other words, the BluRay will only contain an HDR version of the video. Most of them will not have a regular non-HDR version.

What happens if your TV does not support HDR (or the support is extremely poor), and your BluRay player does not support "un-HDR'ing" the video content? What happens is that the video will be much darker and with very wrong color tones, and look completely wrong.

This is the exact situation, at least currently, with the PlayStation 5: It can act as a BluRay player, and supports 4k HDR BluRay discs, but (at least as of writing this) it does not have any support for converting HDR video material to non-HDR video (eg. by clamping the ultra-bright pixels to the maximum non-HDR brightness) and will just send the HDR material to the TV as-is. If your TV does not support HDR (or it has been turned off because it makes the picture look like ass), the video will look horrendous, with completely wrong color tones, and much darker than it should be.
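
For what it's worth, the "clamping" idea mentioned above is conceptually very simple. Here is a toy sketch of it, assuming pixel brightness is given as linear luminance in nits and that the SDR ceiling sits at 100 nits; real players and TVs that do this conversion use far more sophisticated tone mapping:

    # A toy illustration of "un-HDR'ing" by clamping, as mentioned above.
    # Assumptions: pixel brightness is given as linear luminance in nits,
    # SDR reference white is taken to be 100 nits, and the output is an
    # 8-bit SDR value with a simple 1/2.2 gamma encoding. Real HDR-to-SDR
    # conversion (proper tone mapping, gamut mapping) is far more involved.

    SDR_PEAK_NITS = 100.0  # assumed SDR ceiling for this sketch

    def hdr_to_sdr_clamp(luminance_nits: float) -> int:
        """Clamp an HDR luminance value to the SDR range and encode it as 8-bit."""
        clamped = min(luminance_nits, SDR_PEAK_NITS)  # throw away the "ultra-bright" part
        normalized = clamped / SDR_PEAK_NITS          # 0.0 .. 1.0
        encoded = normalized ** (1.0 / 2.2)           # simple gamma encoding
        return round(encoded * 255.0)

    # A 1000-nit HDR highlight and a 100-nit one both end up at full SDR white:
    print(hdr_to_sdr_clamp(1000.0))  # 255
    print(hdr_to_sdr_clamp(100.0))   # 255
    # A dimmer 20-nit pixel keeps a sensible relative level:
    print(hdr_to_sdr_clamp(20.0))    # about 123

The dark, off-color picture described above is what you get when no conversion at all happens and the TV simply misinterprets the HDR-encoded signal.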

(It's a complete pain in the ass. If I want to watch such a BluRay disc, I need to go and turn HDR support on, after which the video will look acceptable, even though my TV has very poor-quality HDR support. But at least the color tones will be ok. Afterwards I need to go back and turn HDR off to make games look ok once again. This is such a nuisance that I have stopped buying 4k HDR BluRays completely, after the first three.)

More recently, I got another instance of the pain-in-the-ass that's HDR, and it happened with the recently released Nintendo Switch 2.

Said console supports HDR. As it turns out, if your TV supports HDR, the console will turn HDR on, and there's no option to turn it off, anywhere. (There's one setting that claims to turn it off for games, but it does nothing.)

I didn't realize this and wondered for some weeks why the picture looked like ass. It's a bit too bright, everything is a bit too washed-out, with the brightest pixels just looking... I don't know... glitched somehow. The thing is that I had several points of direct comparison: the console's own display, as well as the original Switch. On those, the picture looks just fine. However, on my TV the picture of the Switch 2 looks too washed-out, too low-contrast, too bright. And this even though I had adjusted the HDR brightness in the console's settings.

One day this was bothering me so much that I started browsing the TV's own menus, until I found buried deep within layers of settings one that turned HDR support completely off for that particular HDMI input.

And what do you know, the picture from the Switch 2 immediately looked perfect! Rich colors, rich saturation, good contrast, just like on the console's own screen.

The most annoying part of all of this is that, as mentioned, it's literally not possible to reach this state using the console's own system settings. If your TV tells it that it supports HDR, it will use HDR, and there's nothing you can do in the console itself to avoid that. You have to literally turn HDR support off in the settings of your TV to make the console stop using it.

The PlayStation 5 does have a system setting that turns HDR completely off from the console itself. The Switch 2 does not have such a setting. It's really annoying.

The entire HDR thing is a real pain in the ass. So far it has only caused problems without any benefits, at least to me. 

Thursday, August 14, 2025

The complex nature of video game ports/rereleases/remasters

Sometimes video game developers/publishers will take a very popular game of theirs that was released many years prior, and re-release it, often for a next-gen platform (especially if talking about consoles).

Sometimes the game will be completely identical to the original, simply ported to the newer hardware. This can be particularly relevant if a newer version of a console isn't compatible with the previous versions, allowing people who own the newer console but have never owned the older one to experience the game. (On the PC side this can be the case with very old games that are difficult, if not impossible, to run in modern Windows, at least without emulation, and thus probably not using a copy that has been legally purchased.)

Other times the developers will also take the opportunity to enhance the game in some manner, improving the graphics and framerate, perhaps remaking the menus, and perhaps polishing some details (such as the controls).

Sometimes these re-releases can be absolutely awesome. Other times not so much, and they feel more like cheap cash grabs.

Ironically, there's at least one game that's actually an example of both: The 2013 game The Last of Us.

The game was originally released for the PlayStation 3 in June of 2013, and was an exclusive for that console. Only owners of that particular console could play it.

This was, perhaps, a bit poorly timed because it was just a few months before the release of the PlayStation 4 (which happened in November of that same year).

However, the developers announced that an enhanced PlayStation 4 version would be made as well, and it was published in July of 2014, with the name "The Last of Us Remastered".

Rather than just going the lazy way of releasing the exact same game for both platforms, the PlayStation 4 version was indeed remastered with a higher resolution, better graphics, and higher framerate, and it arguably looked really good on that console.

From the players' point of view this was fantastic: Even people who never owned a PS3 but did buy the PS4 could experience the highly acclaimed game, rather than it being relegated to being an exclusive of an old console (which is way too common). This is, arguably, one of the best re-releases/remasters ever made, not just in terms of the improvements but more importantly in terms of allowing gamers who otherwise wouldn't have experienced the game to do so.

Well, quite ironically, the developers later decided to make the same game also one of the worst examples of useless or even predatory "re-releases". From one of the most fantastic examples, to one of the worst.

How? By re-releasing a somewhat "enhanced" version exclusively for the PlayStation 5 and Windows in 2022, with the name "The Last of Us Part I". The exact same game, again, with somewhat enhanced graphics for the next generation of consoles and PC.

Ok, but what makes that "one of the worst" examples of bad re-releases? The fact that it was sold at the full price of US$70, even for those who already own the PS4 version.

Mind you: "The Last of Us Remastered" for the PS4 is still perfectly playable on the PS5. It's not like PS5 owners who don't own the PS4 cannot play and thus experience it.

It was not published as some kind of "upgrade pack" for $10, as is somewhat common. It was released as its own separate game, for full price, on a platform that's still completely capable of running the PS4 version. And this was, in fact, a common criticism among reviewers (both journalists and players).

Of course this is not even the worst example, just one of the worst. There are other games that could be argued to be even worse, such as the game "Until Dawn", originally for the PS4 and later re-released for the PS5 with somewhat enhanced graphics, at full price, while, once again, the original is still completely playable on the PS5.

Wednesday, August 6, 2025

"Dollars" vs "cents" notation confusion in America

There's a rather infamous recorded phone call, from maybe 20 years or so ago, where a Verizon customer calls customer support to complain that their material advertised a certain cellphone internet connectivity plan as costing ".002 cents per kilobyte", but he was charged 0.002 dollars (ie. 0.2 cents) per kilobyte.

It's quite clear that the ad meant to say "$0.002 per kilobyte", but whoever had written the ad had instead written ".002c per kilobyte" (or ".002 cents per kilobyte", I'm not sure as I have not seen the ad). (It's also evident from the context that the caller knew this but wanted to deliberately challenge Verizon for their mistake in the ad, as false advertising is potentially illegal.)

I got reminded of this when I recently watched a video by someone who, among other things, explained how much money one can get in ad revenue from YouTube videos. He explains that his best-earning long-form video has earned him "6.33 dollars per thousand views", while his best-earning shorts video has earned him "about 20 cents per thousand views". Crucially, while saying this he is writing these numbers, and what does he write? This:


In other words, he says "twenty cents", but rather than write "$0.20" or, alternatively, "20 c", he writes "0.20 c".

Obviously anybody who understands the basics of arithmetic knows that "0.20 c" is not "20 cents". After all, you can literally read what it says: "zero point two zero cents", which rather obviously is not the same thing as "twenty cents". It should be obvious to anybody that "0.20 c" is a fraction of a cent, not twenty entire cents (in particular, it's one fifth of a cent). The correct notation would be "$0.20", ie. a fraction of a dollar (one fifth).
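
Just to spell out the arithmetic in a few lines of code (the names here are purely for illustration):

    # Spelling out the arithmetic: "0.20 cents" and "$0.20" differ by a factor of 100.

    CENT = 0.01  # one cent, expressed in dollars

    amount_written = 0.20 * CENT  # what "0.20 c" literally says: 0.20 of one cent
    amount_meant   = 0.20         # what was meant: $0.20, ie. 20 cents

    print(amount_written)                 # 0.002 dollars, ie. one fifth of a cent
    print(amount_meant)                   # 0.20 dollars, ie. twenty cents
    print(amount_meant / amount_written)  # 100.0 -- off by a factor of a hundred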

This confusion seems surprisingly common in the United States in particular, even among people who are otherwise quite smart and should know better. But what causes this?

Lack of education, sure, but what exactly makes them believe this? Why do they believe this rather peculiar thing?

I think that we can get a hint from that phone call to Verizon. During that phone call the customer support person, when explicitly asked, very clearly stated that ".002 cents" and ".002 dollars" mean the same thing. When a manager later took over the call, he said the exact same thing.

Part of this confusion seems to indeed be the belief that, for example, "20 cents", "0.20 cents" and "0.20 dollars" all mean the same thing. What I believe is happening is that these people, for some reason, think that these are some kind of alternative notations to express the same thing. They might not be able to explain why there are so many notations to express the same thing, but I would imagine that if asked they would guess that it's just a custom, a tradition, or something like that. After all, there are many other quantities that can be expressed in different ways, yet mean the same thing.

Lending credibility to this hypothesis is the fact that, in that same phone call to Verizon, the customer support person repeatedly says that the plan costs "point zero zero two per kilobyte", without mentioning the unit. Every time she says that, the customer explicitly asks "point zero zero two what?", and she clearly hesitates and then says "cents". Which, of course, is the wrong answer, as it should be "dollars". But she doesn't seem to understand the difference.

What I believe happened there (and is happening with most Americans who have this same confusion) is that they indeed believe that something like "0.002", or ".002", in the context of money, is just a notation for "cents", all by itself. That if you want to write an amount of "cents", you use a dot and then the cents amount. Like, for example, if you wanted to write "20 cents", you would write a dot (perhaps preceded by a zero) and then the "20", thus "0.20" all in itself meaning "20 cents". And if you wanted to clarify that it indeed is cents, you just add the "¢" at the end.

They seem to have a fundamental misunderstanding of what the decimal point notation means and signifies, and appear to believe that it's just a special notation to indicate cents (and, thus, that "20 cents" and "0.20 cents" are just two alternative ways to write the same thing.)

Of course the critics are right that this ultimately stems from a lack of education: The education system has not taught these people the decimal system and how to use it well enough. Most Americans have learned it properly, but then there are those who have fallen through the cracks and never got a proper education on the decimal system and arithmetic in general.

Sunday, August 3, 2025

How the Voldemort vs. Harry final fight should have actually been depicted in the movie

The movie adaptation of the final book in the Harry Potter series, Deathly Hallows: Part 2, makes the final fight between Harry and Voldemort flashy but confusing, leaving the viewers completely unclear about what exactly is happening and why, and does not convey at all the lore in the source material.

How the end to the final fight is depicted in the movie is as follows:

1) Voldemort and Harry cast some unspecified spells at each other, in what is pretty much a stalemate.


2) Meanwhile elsewhere Neville kills Nagini, which is the last of Voldemort's horcruxes.


3) Voldemort appears to be greatly weakened by this, so much so that his spell just fizzles out, at the same time as Harry's.

 

4) Voldemort is shown as greatly weakened, but he still casts another unspecified spell, and Harry responds, also with an unspecified spell.


5) However, Voldemort's spell quickly fades out, and he looks completely powerless, looking at his Elder Wand with a puzzled or perhaps defeated look, maybe not understanding why it's not working, maybe realizing that it has abandoned him, or maybe just horrified at having just lost all of his powers. Harry's spell also fizzles out; it doesn't touch Voldemort.

6) Harry takes the opportunity to cast a new spell. He doesn't say anything but from its effect it's clear it's an expelliarmus, the disarming spell. 

7) Voldemort gets disarmed and he looks completely powerless. The Elder Wand flies to Harry.

8) Voldemort starts disintegrating.


So from what is depicted in the movie, it looks like Neville destroying Nagini, Voldemort's last horcrux, completely sapped him of all power, and despite making one last, very feeble effort, he gets easily disarmed by Harry and then just disintegrates, all of his power and life force having been destroyed.

In other words, it was, in fact, Neville who killed Voldemort (even if a bit indirectly) by destroying his last source of power, and Harry did nothing but just disarm him right before he disintegrated.

However, that's not at all what happened in the books.

What actually happened in the books is that, while Neville did kill Nagini, making Voldemort completely mortal, that's not what destroyed him. What destroyed him was that he cast the killing curse at Harry, who in turn immediately cast the disarming spell, and because the Elder Wand refused to destroy its own master (who via a contrived set of circumstances happened to be Harry Potter), Voldemort's killing curse rebounded off Harry's spell and hit Voldemort himself, killing him.

In other words, Voldemort destroyed himself with his own killing curse spell, by having it reflected back, because the Elder Wand refused to kill Harry (its master at that point).

This isn't conveyed at all in the movie.

One way this could have been depicted better and more clearly in the movie would be, for example:

When Neville destroys Nagini, Voldemort (who isn't at that very moment casting anything) looks shocked and distraught for a few seconds, then his shock turns into anger and extreme rage, and he casts the killing curse at Harry, saying it out loud (for dramatic effect the movie could show this in slow motion or in another similar manner), and Harry immediately responds with the disarming spell (also spelling it out explicitly, to make it clear which spell he is casting.)

Maybe after a second or two of the two spell effects colliding with each other, the movie clearly depicts Voldemort's spell rebounding and reflecting from Harry's spell, going back to Voldemort and very visibly hitting him. Voldemort looks at the Elder Wand in dismay, then at Harry, then his expression changes to shock when he realizes and understands, at least at some level, what just happened. He looks again at his wand and shows an expression of despair and rage, but now Harry's new disarming spell knocks it out of his hand, and he starts disintegrating.

Later, in the movie's epilogue, perhaps Harry himself could give a brief explanation of what happened: that the Elder Wand refused to kill its own master, Harry himself, and that Voldemort's killing curse therefore rebounded and killed its caster.