Monday, November 20, 2023

The one thing that Unreal Engine did right that Unity did wrong

When it comes to game engines used in triple-A games, the competition was quite fierce in the 1990s and early 2000s. Ultimately, however, two engines became so immensely popular, among big triple-A studios and indie studios alike, that they are now almost ubiquitous: between them they hold the vast, vast majority of the market share, leaving other engines almost completely in the dust.

These game engines are, of course, Unreal Engine and Unity.

In terms of features and visual quality these engines are extremely similar, and they are in constant competition over which one can outdo the other in visuals and fancy "next gen" features.

However, for the longest time (more so in the past than today, but still to a large extent even to this day) these engines have had a rather different reputation among gamers.

Unity has always been considered a kind of "cheaper", "smaller" and, in a way, "worse" engine, while Unreal Engine has got this image of being a serious heavy-hitter for the truly massive and impressive multi-million-dollar budget triple-A games.

In fact, Unity has for quite a long time had a negative association with scammy asset-flip trash that gets dumped onto Steam and other digital stores just for a quick buck. It seems almost too easy to make games (especially mediocre ones) with Unity, allowing scammers and opportunists to quickly throw together something that superficially looks like a game (and may even have fancy-looking screenshots) to fool people into buying it, essentially defrauding them with an unstable, trashy asset flip that is a game in name only.

Likewise Unity has for the longest time had this image of being associated with free-to-play and very cheap small indie games, made by either individual people or extremely small indie studios.

In contrast, as mentioned earlier, Unreal Engine has always had this strong association with really big-budget massive high-quality triple-A titles. Almost all of the really big and famous game franchises seem to use it. Unreal Engine isn't really associated with small indie games nor scammy asset flips.

Is this because Unity is free to use and Unreal Engine is very expensive (and thus only affordable by big game studios)? No. Both have extremely generous usage licenses that allow using them for completely free up to surprisingly large amounts of revenue (and even after that you pay from your revenue, not upfront). Anybody can use either engine completely for free, no strings attached.

Is it because Unity is a very small and simple engine suitable only for small and simple games, while Unreal Engine is a massive engine supporting all the bells and whistles? Again, no. As also mentioned earlier, both are very modern and very complete in their support for modern gaming features, and are both capable of very similar graphics and other video game technologies. It is perfectly possible to do a full-on hundred-million-dollar-budget huge triple-A game with Unity, and a very small and cheap indie game with Unreal Engine.

So given how similar both engines are in terms of features, size, scope and usage licenses, how come Unity has got this reputation of being for small indie games and scammy asset flips, while Unreal Engine has got this image of being a "big boys" engine for the massive triple-A games?

This is because of one perhaps somewhat surprising policy that the creators of these engines have held for the longest time. More precisely, it's because the two engines are polar opposites in terms of this particular policy.

You see, from pretty much the beginning (up until a recent policy change, if I remember correctly), the free version of Unity had the requirement that any game made with it must display the Unity splash screen on launch. Only a paid license for the engine allowed disabling the splash screen.

The developers of Unity probably thought that this would work as advertisement for their engine, in return for it being free to use. A bit like a small form of paid-by-ads (or, more precisely, by one ad in this case).

In contrast, Unreal Engine has the exact opposite policy: you are not allowed to show the Unreal Engine splash screen in your game unless you get a particular paid license for the engine. In other words, you need permission to show the Unreal Engine splash screen; without it, you can't.

Well, turns out that Unreal Engine, perhaps serendipitously or because of amazing foresight, had the better idea.

The reason Unity is often associated with small crappy games and scammy asset flips is precisely that they all show the Unity splash screen when launched (it can't be disabled in the free version). Bigger triple-A games, however, usually disable the splash screen because it may not fit the aesthetics of the game.

Thus there's a strong association between the Unity splash screen and the small crappy games.

In contrast, you almost exclusively see the Unreal Engine splash screen in huge triple-A games, and never in small indie games (where it's outright forbidden from being used, even if the game uses the engine), which is why the name is often associated with the former.

Thursday, August 17, 2023

How to browse the internet as safely and anonymously as possible

It doesn't really matter why one would want to browse the internet as anonymously and safely as possible; it is within everybody's rights to do so, if they so wish. The motivations behind it are nobody else's business, and there can be completely legitimate reasons for wanting to browse the internet with complete anonymity, leaving no trace behind, while keeping your computer safe from any malicious software you might encounter online.

Important note: No method can ever be 100% foolproof, with zero chance of malicious actors, hackers, governments or other people getting hold of your PC and/or seeing what you are doing. If you connect your PC to the internet (and sometimes even if you don't, if it has any sort of wireless capabilities), you always take some risk.

That being said, following all these steps will significantly reduce such risks: it will make it extremely hard for any malicious actors or software to see what you are doing or to gain any kind of access to your computer, and much harder for malware to invade it.

Another important note: Employing only one or two of these steps, while it already may add some safety, will not be sufficient. The more of these steps you use, the safer and more secure it will be.

1: Use a VPN

By this point it almost sounds like a cliche, but it does help: using a VPN makes it significantly harder (although not 100% impossible) for anybody to link what you are browsing back to your computer. It will (at least ostensibly) stop your internet service provider from seeing what you are browsing, because your ISP will only see an encrypted connection to some VPN server somewhere, not what you are actually connecting to at the end of the chain.

Note that using a VPN will introduce significant lag to your internet connection (which is something VPN service providers will often lie about), so you might not want to have it constantly on, but only when you want to go private.

Also note that, as far as I know, there exist no good free-of-charge VPN services out there, so if you want to use one you'll have to buy a subscription. There's probably no (legal) way around this, but depending on your needs it may be worth it.

2: Use virtual machine software

Way too few people know and understand how incredibly handy and versatile virtual machines are.

A virtual machine (such as VirtualBox or VMware) allows installing and running a second operating system in such a manner that it's completely encapsulated in its own hardware sandbox (with all of its files in their own directory in the host operating system). Modern processor architectures allow running a guest OS at effectively the same efficiency as a natively-installed OS.

There are many advantages to a virtual machine: whatever you do inside the virtual machine stays within the virtual machine, and has no effect on your natively-installed host operating system. (There may exist "escape" exploits for some virtual machines, but these are rare. And, as said earlier, no system can ever be 100% safe; you can only try to increase safety as much as you can.)

Additionally, a virtual machine allows effectively taking "snapshots" of the entire guest system, and later restoring the entire thing to what it was at the time of the "snapshot". In other words, it's effectively a perfect 100% backup that will move time back and restore the system to exactly what it was before, bit by bit. If you ever want to undo something you have done inside the virtual machine, you can just restore this backup snapshot, and everything done after that point will be gone. (The easiest way to take such a "snapshot" is to simply copy the directory where the virtual machine's files are located somewhere else. You can then later copy it back, which will restore the guest system to what it was.)
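The copy-the-directory approach above can be sketched as a small shell script. The paths here are stand-ins (a temporary demo directory rather than a real VM folder), purely to illustrate the copy/restore cycle; with a real VM you would copy the actual VM directory (e.g. a folder under "VirtualBox VMs") while the VM is powered off.

```shell
#!/bin/sh
set -e

# Stand-in for the real VM directory (in practice, your VM's folder).
VM_DIR="$(mktemp -d)/PrivacyVM"
SNAP_DIR="${VM_DIR}-snapshot"

mkdir -p "$VM_DIR"
echo "pristine disk image" > "$VM_DIR/disk.vdi"

# 1. Take a "snapshot": copy the entire VM directory somewhere else
#    (with a real VM, do this while the VM is powered off).
cp -a "$VM_DIR" "$SNAP_DIR"

# 2. Do things inside the VM; here, the disk image changes.
echo "modified disk image" > "$VM_DIR/disk.vdi"

# 3. Restore: throw away the current directory and copy the snapshot back.
rm -rf "$VM_DIR"
cp -a "$SNAP_DIR" "$VM_DIR"

cat "$VM_DIR/disk.vdi"   # prints "pristine disk image"
```

(VirtualBox also has a built-in snapshot feature in its GUI and command-line tools, which is more convenient for frequent snapshots; the manual directory copy shown here is simply the most transparent, software-agnostic way to do it.)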

Also, a virtual machine allows running Linux inside it, even if your natively-installed host OS is Windows. Linux in itself adds a layer of protection as it's less targeted and less vulnerable to attacks (eg. by trojans, viruses, etc.)

3: Use an encrypted partition in the virtual machine

When installing the guest operating system into a virtual machine, choose in the installer to use an encrypted partition. Most Linux distros offer this possibility in their installers (and if one doesn't, either choose a distro that does, or look up tutorials on how to make the partition encrypted.)

When the guest operating system has been installed in an encrypted partition inside the virtual machine, whatever you do inside the virtual machine will leave no recoverable trace anywhere in your hard drives / SSDs. Anything that saves anything to disk inside the virtual machine will be encrypted, leaving no recoverable trace behind. (Remember that simply deleting a file does not necessarily remove its bits from the storage device. Not even if you use some kind of "file shredder" application that tries to completely eliminate the original data by overwriting the file: In modern SSDs these overwrites may be written to a different location in the physical storage device. When the partition is encrypted to begin with, nothing will be written to the storage device unencrypted, and thus there will be no unencrypted trace of it anywhere.)

For the extra paranoid: you might want to use an encrypted partition for your natively-installed host OS as well (be it Linux or Windows), and this too will add an extra layer of security, but it's up to you whether you want to go through with it. Doing it inside the virtual machine is hassle-free.

4: (Optionally) use a Tor browser inside the virtual machine

While the Tor network is often associated with the "dark web" and all kinds of illicit and illegal activities, it's not in principle designed for that; it's a legitimate way to browse the internet anonymously, and can be used to browse the regular, normal internet.

It shouldn't really be relied on on its own, without anything else, but in addition to all the above, it will add yet another layer of protection.

Note that Tor is essentially an alternative to a VPN as a form of anonymized communication, so using both at the same time might not add one form of protection on top of the other. However, it may still be useful to use both, especially if you are going to use a normal web browser in addition to the Tor browser.

If you are going to use a regular web browser inside the virtual machine, it's recommended to use the "incognito mode" provided by the browser. This is not because it would add any security or anonymity (because it doesn't), but because it's a convenient way of erasing whatever your browsing left behind on your disk, like tracking cookies, scripts, etc. If any dubious website attempts to do something to your system (even if it's just the guest system running inside the virtual machine), this adds a layer of safety in that the browser will remove all of what that website did when the browser window is closed. This is a very mild form of security, but it still doesn't hurt to use it. This is much more convenient than doing a full virtual machine snapshot restore.

Even among regular web browsers, not all are equal. Some browsers have been specifically fine-tuned to make things like fingerprinting and tracking by websites as difficult as possible. An example of such a web browser (and one widely preferred by privacy-conscious people) is LibreWolf, a fork of Firefox.

Monday, June 26, 2023

Was the QWERTY keyboard layout designed to slow typists down?

The QWERTY layout was devised for mechanical typewriters in the 1870s. Of all possible keyboard layouts, it happened to become the universal standard (with a few very minor local variants in some countries, such as the AZERTY layout in France and some other French-speaking countries.)

One of the most persistent urban legends about the layout is that it was devised to slow down typists: as typists became proficient and typed too fast, the hammers of mechanical typewriters would hit each other and jam, hindering typing. This factoid is usually cited as evidence of why the QWERTY layout is (deliberately) inefficient and slow. (Unsurprisingly, it's a factoid often repeated by advocates of the Dvorak keyboard layout, who allege their layout to be significantly more efficient and faster to type with.)

The myth is based on truth, but only partially.

It is true that when testing different keyboard layouts, with some layouts the hammers, especially those physically close to each other, were hitting each other and getting stuck when typing too fast. After all, each hammer, controlled by its own key, has to hit the same spot, so if one hammer doesn't get out of the way before the next one comes in, it will block it, or the two could get stuck together, hindering or stopping further typing.

The QWERTY layout was indeed primarily designed to address this problem, by spreading out the most commonly used letters in English as far apart from each other as possible.

However, this was not done to slow down typing. It was done so that the corresponding hammers would likewise be as far apart from each other as possible. When two hammers in a mechanical typewriter are farther apart, each gets out of the other's way more quickly: since the hammers are physically located far apart in the mechanism, their trajectories diverge more quickly.

In other words, the QWERTY layout was designed to physically spread out the hammers for the most commonly used English letters, minimizing jams. It was not designed to slow down typing.

This is once again a factoid that's partially based on reality but gets the details wrong.

Saturday, April 1, 2023

The Final Fantasy series, a personal retrospective

I have been playing video games since the early 1980s. Of the literally thousands of games that I have played during my life, the Final Fantasy series by Square (later Square Enix) holds a special place in my nostalgia.

I have played every single game in the mainline series, with the exception of the two MMORPGs, as well as some of the spinoffs and side games (such as Crisis Core, Final Fantasy Tactics, FF7 Remake, and a few others). This blog post will focus only on the mainline series.

The series could, perhaps, be divided into three distinctive "eras", based on style, game mechanics, and content. There are no hard lines dividing the games between these three eras, as there is a bit of overlap between them, but nevertheless, I would classify the three eras as follows (note that I'm using the Japanese numbering for the earlier games, which is now the official numbering as well):

  1. The "classic" or "golden" era, consisting of the Final Fantasy games from I to VI, plus IX.
  2. The "interim" era, consisting of the Final Fantasy games VII and VIII.
  3. The "modern" era, consisting of all Final Fantasy games from X forwards.

The "classic" era consisted of games using a 2D top-down view and 2D sprites, largely designed around that graphical style. (Rather obviously this style was largely dictated by the limitations of the hardware of the time, namely the NES and the SNES.) While Final Fantasy IX does use full 3D graphics, I still count it as belonging to the "classic" era because it was explicitly designed by Square to be a homage to the 2D classics, and follows many of the same game mechanics to a T.

While all six (plus one) games are masterpieces in their own right, I consider Final Fantasy VI to be pretty much the culmination of the classic era. The magnum opus. The game I would choose if I had to pick one to represent what "Final Fantasy" is about. This game just transcends everything else, and in fact I consider it not only the best Final Fantasy game, but also one of the best video games ever made, period, regardless of genre or time period. I have extreme nostalgia for this game in particular.

In the "interim" era, consisting of Final Fantasy VII and VIII, Square radically shifted the style of the games, moving from 2D graphics to 3D graphics. But the shift went farther than just the visuals: the content, the gameplay, and the storytelling experienced a significant change as well, compared to the previous six games.

One of the most significant changes in tone, started by FF VII, is the move from high fantasy to "magitech fantasy" (where the "medieval Europe setting" is replaced with a sort of steampunk or dieselpunk fantasy setting). Final Fantasy VI, the last game of the "classic era" (not counting IX), already delved a bit into this setting, so it showed the early signs of this shift in tone. However, it was VII that embraced the full-on magitech-steampunk setting.

Final Fantasy VII is considered by many to be the best game in the series, and its culmination and magnum opus. Personally, I don't see it. I played the original PlayStation version, and I wasn't really all that enthralled by it. I have also played Final Fantasy VII Remake, and I still don't really get all that excited about it. (The Remake isn't just the original game with updated graphics and added content. It's actually an independent game of its own, but I will not spoil the idea any further here. However, for a significant part it does replicate the original game, just with improved graphics, so it kind of works as a modern version of it.)

It may be just me, but I just cannot understand what's so special about VII. For me the previous game, VI, is the one. The absolute best, the absolute culmination, everything that Final Fantasy is, the magnum opus. (And we are talking about the original SNES version. It has absolutely nothing to be ashamed of compared to modern games. It still stands as one of the best games ever made, even using the SNES hardware.)

Then there's the "modern" era. Oh boy, is it a doozy...

This is the era, starting with Final Fantasy X, where Square Enix took the series in a completely different direction. They threw almost everything from the previous games away, and only kept a very general theme, with vague allusions to some previous games (such as chocobos appearing in every game). The general gameplay and game mechanics are completely different and unrelated to all the previous games, and there are very few traces of classic 8-bit and 16-bit JRPG mechanics and tropes.

Personally I do not consider Final Fantasy X nor any of the subsequent games in the series to be actual Final Fantasy games. They are a completely different and unrelated game series that just carries the same game series name (soiling the originals).

Most people consider Final Fantasy XIII to be the worst game in the entire series, by a wide margin. I disagree with that sentiment. I consider it the second-worst. Far, far below it is Final Fantasy X, which I not only consider the worst Final Fantasy game, but in fact one of the worst video games ever made. Everything that FF XIII did wrong, FF X did years earlier, and much worse at that.

If FF XIII is extremely and utterly linear, with extremely short and linear levels that use a very abstract, nondescript graphical design, and has pretty much nothing of what makes a JRPG a JRPG (such as towns, NPCs, shops, a wide open overworld and so on and so forth), FF X already did all those same things, but even worse. Its levels were even shorter, even more linear, and even more abstract, and the game lacked even more of the JRPG game mechanics and tropes.

On top of that, FF X is just infuriating to play due to the sheer amount of cutscenes. It has been a joke for quite a long time that modern games are nothing but cutscenes occasionally interrupted by short segments of gameplay. Well, with FF X that's not a joke, it's reality. That's exactly what the game is. And the worst part is that the cutscenes are overlong, boring, uninteresting and badly acted. I'm not even kidding: there are parts of the game where you watch a long cutscene, gameplay continues, and less than 10 seconds later another long cutscene starts, interrupting your gameplay. There are large segments of the game where gameplay is literally a small fraction of the time.

FF XIII is slightly better than this, but not by a whole lot. XII is boring and uninspired (I couldn't even bother playing it to its halfway point, not even close), and XV, while better than any of those, is bland and uninspired, and also lacks most of what makes a JRPG a JRPG. (The two remaining games in this list are MMORPGs, which I have not tried.)

Thursday, March 2, 2023

Jules Verne was a bad writer

While the title might sound provocative and controversial, I do seriously think that the writing skills of Jules Verne have always been greatly exaggerated. (Or, perhaps more precisely, not his skills in expressing himself in textual form, but his skills as a storyteller, as a writer.)

For about a century Jules Verne has always been hailed as one of the greatest authors of fiction in human history, even going so far as being sometimes called "the father of science fiction", among other fancy accolades. At least in past decades his books were a staple reading of school children and adults alike, and he has always been praised for his imagination, inventiveness and great science fiction writing.

Imaginative and inventive he might have been, but I would posit that he wasn't really all that good of a writer. Most of his stories are full of silly details that are illogical or nonsensical. Not only are his books full of scientific inaccuracies (even taking into account the time they were written, as any person of the time even slightly educated in science and physics could have attested), but there are also many weak plot points and illogical details.

The list of scientific inaccuracies (even by the knowledge of the time) and illogical or weak plot points from all of his books would be quite extensive, but let me present one particular example that struck me as odd even back when I first read the book as a child.

The protagonist and narrator of Twenty Thousand Leagues Under the Seas (one of Verne's most famous books) is one Professor Pierre Aronnax, a marine biologist and journalist, who ends up captured aboard the Nautilus, the submarine of Captain Nemo, along with a couple of other people from the ship he was traveling on.

At one point, well into the story, while these outsiders have been drugged to sleep and later wake up, Captain Nemo is very distressed and outright desperate, and he asks Aronnax if he might perhaps be a doctor. It is implied later that the Nautilus had sunk yet another ship by ramming into it, and a crew member had suffered a bad head injury because of the collision.

Completely out of the blue Aronnax confirms that yes, he's a Doctor of Medicine.

At no point prior to this was it established that this marine biologist and journalist was also a Doctor of Medicine, or any kind of medical practitioner. It comes up completely out of the blue, and is never mentioned again after this small segment of the book.

It makes little sense. While of course it's not impossible, it's unlikely that someone would go through the 6+ years of medical school, plus all the extra years of training needed to get a doctorate in medicine, and then also become a marine biologist. It makes especially little sense because this quite important aspect of his career was never mentioned nor alluded to in any way prior to this (and is completely forgotten after it).

It also makes very little sense that Nemo would have a submarine with a crew of dozens of people, and no doctor aboard. (It is never established in the book that there was a doctor but he eg. died or something along those lines. No mention of any kind is made about why there is no doctor among the crew of the Nautilus. It's left completely unexplained and unmentioned.)

The impetus for writing this in the book, completely out of the blue, is so that Aronnax would be taken to parts of the ship that he was normally completely banned from entering, so that he could get a look at more of the ship and get an estimate of the crew size (this is specifically alluded to in the subsequent narrative).

And this is precisely why I consider this a clear example of bad writing: this "I'm a Doctor of Medicine" is pretty much a deus ex machina, a completely illogical, out-of-the-blue plot point that was never established and is never used again, all for such a small thing as getting the protagonist taken to parts of the ship that were off limits to him before.

There would be a million ways to achieve the same thing in a more logical manner, without resorting to such a stretch and illogical detail.

And this is not an isolated case of such bad writing. Verne's books are full of such examples. This is just one of them.