Sunday, February 11, 2024

How do some (software) projects become vaporware?

"Vaporware" is a term generally used for a project that's announced by a company, which then takes an inordinate amount of time to complete, and might in fact never be completed even though years and years, sometimes decades, are spent on developing it. Sometimes the project does eventually end up being released; however, most often than not it turns out to be less than stellar, sub-par to various degrees, and gets critically panned and disliked. Very often this is because, probably due to its extremely long development time, its technology is antiquated by the time it gets published and may still contain a lot of missing or buggy features (no doubt having not been completely finished for the exact same reasons it was so immensely delayed in the first place.)

Hidden behind the layer of publicly announced "vaporware" is, however, an immensely bigger layer of what could be called "eternal projects". These are projects that have blown past any reasonable timeline by an order of magnitude, but which have never been announced publicly by the company (most likely because the company only wants to announce finished products, not future projects.) These are, essentially, "company-internal vaporware": normal vaporware, just never announced to the public in advance.

Projects (software and otherwise) that go nowhere are extremely commonplace in all kinds of technology companies. Someone presents an idea, development is started, but it turns out that for one reason or another the idea is actually infeasible, or would take way too much time and resources, or in practice isn't as good as it sounded on paper, so it's abandoned. Usually in the timespan of a few months, a year at most.

However, these "eternal projects", "company-internal vaporware" projects, may be developed for years and years and years, sometimes even for over a decade, with no end in sight, even though it never seems to be even close to be finished, nor usable in practice, or something that actually feels like a good full video game (if the project is one). Rather than being cancelled after a few months, they are just stubbornly continued for many years, even though it never seems to get even close to finished.

But why? And how does a project become this kind of "eternal project" (publicly announced or not)? Why don't companies just cancel such projects once it has become clear that they aren't really going anywhere?

Having had some personal experience with these things, I can speculate from my perspective about some of the reasons.

One of the major, perhaps most obvious, reasons is the sunk cost fallacy: The people managing and running the project (and, in my experience, it most usually tends to be them, rather than the developers and engineers themselves) get into the strong mindset that the project actually is going somewhere, that a finished product is just around the corner: it just needs a little bit more work, the final breakthrough will be achieved, and the finishing touches will be routine and smooth sailing from that point forward. And, of course, where the sunk cost fallacy kicks in is the strong feeling that because so many years and so much money have already been spent on the project, it would be a huge loss and an enormous waste to just stop and cancel it. The team has spent all this time, all this effort, all this work, there are years and years worth of material produced by the team, and they are so close to finally getting a working result, that it would be unthinkable to throw all that work, effort and money into the trash.

These managers, however, often become blinded and can't see that the project actually isn't going anywhere, that the existing work is just outright bad, and that the project should have been cancelled literally years prior, with all the developers and engineers moved to other, more productive projects. The developers themselves can often see this, but they either don't really want to say anything, or if they do, their objections are dismissed.

But how does a project, like a software project, end up in this state? Effectively, eternally "stuck": seemingly in constant development, yet getting nowhere, never actually advancing towards a final product?

There may be many reasons for this, but one common one is the lack of a plan: an exact vision of what the final product must be like, laid out before development even starts. In other words, the project started as a more generic idea and took the approach of "let's plan the details as we go". For example, if the project is a video game, the plan might have been to create a game of a particular genre, with certain features... but that's it. All the actual concrete details are made up during development, "as we go." There is no precise and exact picture of the end goal.

This usually results in a form of development where the project itself is used as a testing bed for ideas and features. The person or people running the project may come up with new features to try, new ideas, new mechanics, to be tested on the project currently in development, to see if they work. These new ideas might literally come on an almost daily basis. "Let's try this. Let's try that. Let's change this into that. Let's add this to that. Let's remove this. Let's re-add this previously removed feature." An endless stream of new features and changes, which just accumulate and accumulate over the years, without any concrete vision or plan of the final product.

One reason why a project manager may engage in this is that he has completely misunderstood what the "agile development" paradigm actually means and entails. (Yes, an actual case from personal experience.)

Such a project manager might become completely blinded and, ironically, completely unable to see the bigger picture: the fact that the project isn't actually going anywhere, is nowhere even near finished, and is in complete shambles, a complete mess, when it comes to actual good design; if it's a video game, for example, it's not anything that anybody would play or enjoy.

Bosses, CEOs and other higher-ups might also be misled by the project. Perhaps they are presented with previews of the product that make it look way better than it actually is. Company-internal deceptive marketing of sorts. The higher-ups might genuinely get the wrong impression that the product is better and more finished than it really is, and thus not stop it.

Or, in many cases, it's precisely the bosses and CEOs who are the ones engaging in the endless cycle of trying new features without a clear plan or goal for the end product, unable to see that the project is a complete disaster and should have been cancelled years prior.

Sunday, January 14, 2024

Why too much exposition ruins movies and games

Many years ago I went a couple of times to an event organized by some university student group where you could be introduced to and play all kinds of tabletop board games. I thought it would be a good way to have fun and socialize, and perhaps even find interesting tabletop games.

One of the organizers there, however, pretty much ruined the entire thing for me. It seemed like he just loved the sound of his own voice: when he started introducing some new board game to a small group of interested players, he would just explain... and explain... and explain... and explain... and explain... endlessly. He would literally take like 15 minutes explaining some board game (which wasn't actually even all that hugely complicated; it's not like it was Warhammer or some other enormously complex game.) Rather than, you know, actually allowing people to learn by playing.

The problem was, of course, that such a huge info dump is impossible to follow and remember. It's completely useless to explain a complex board game for fifteen minutes because no person in existence can remember all of that at once, especially when they have absolutely no experience with the game itself, no context, and all they hear are words and more words disconnected from any actual hands-on playing experience. Thus I would just tune out after a minute or two and endure the huge stream of meaningless word salad for 10+ minutes, completely bored out of my skull. Those 15 minutes could have been used to actually play the game and learn the concepts that way, one by one as they come up during play. After what felt like an absolute eternity the game would finally be started, and almost nothing of what he explained helped at all, because nobody can remember all of that. Those 15 minutes were completely wasted for absolutely no benefit. We would learn way more from the first 5 minutes of actually playing the game than from that 15-minute verbal-diarrhea info dump.

Way too many video games, especially nowadays, commit this exact same mistake: Quite often, during the very beginning of the game, before the player has had any chance of getting any hands-on experience with it, the game will throw textbox tutorial after textbox tutorial at the player, usually interrupting gameplay, and way too often either explaining complete trivialities or dumping so much information at once that the player has zero chance of learning it, completely disconnected from any actual hands-on playing experience (and thus the player has no way of connecting what the tutorial is saying to the actual gameplay, making it harder to remember.)

(When it comes to explaining completely trivial things, which is way too common especially in a certain type of Japanese RPG, it almost feels like the developers have the mentality of "we went through all this trouble to implement a tutorial system, let's use it to the fullest, dammit!" and start throwing the most trivial things at the player, like how to click a button or exit a menu, which would be completely obvious to anybody without having to be explained.)

Sometimes, however, this kind of needless exposition and explanation can also extend to the storytelling itself, and thus can affect not only video games but also movies, TV series and even books.

Movies, particularly those based on stories originally told in another medium (usually a book or a game), tend to be especially egregious in this regard.

One particularly notorious and aggravating example is the 2021 film adaptation of Dune.

The opening scene of the book doesn't happen until about 20 minutes into the movie. The first 20 minutes are nothing but boring exposition.

This is not how you tell a story! Frank Herbert, when he wrote the book, understood how to tell a story in an interesting manner, in a way that immediately engages the audience. You start with something that grabs your attention, piques your curiosity, excites your imagination.

You don't start with 20 minutes of exposition!

Clearly the scriptwriters of the movie did not understand this at all, and felt that the average audience is so stupid that they need 20 minutes of exposition before they can "understand" what's going on. They apparently felt that if they just did what the book did, the audience would be confused. They clearly didn't understand good writing at all.

And this is far from the only example, just one of the most egregious recent ones.

Monday, November 20, 2023

The one thing that Unreal Engine did right that Unity did wrong

When it comes to game engines used in triple-A games, the competition was quite fierce in the 1990's and early 2000's, but ultimately two engines became so immensely popular, both among big triple-A studios as well as many indie studios, that they have become almost ubiquitous, holding the vast, vast majority of the market share between themselves and leaving other engines almost completely in the dust.

These game engines are, of course, Unreal Engine and Unity.

In terms of features and visual quality these engines are extremely similar, and they are in constant competition over which one can out-compete the other in visuals and fancy "next gen" features.

However, for the longest time (more so in the past than today, but still to a large extent even to this day) these engines have had a rather different reputation among gamers.

Unity has always been considered a kind of "cheaper", "smaller" and, in a way, "worse" engine, while Unreal Engine has gained the image of being a serious heavy-hitter for truly massive and impressive multi-million-dollar-budget triple-A games.

In fact, Unity has for quite a long time had a negative association with scammy asset-flip trash that gets dumped onto Steam and other digital stores just for a quick buck. It seems it's almost too easy to make games (especially mediocre ones) with Unity, allowing scammers and opportunists to quickly make something that superficially looks like a game (and may even have fancy-looking screenshots) to try to fool people into buying it, essentially defrauding them with an unstable, complete-trash asset flip that's a game in name only.

Likewise, Unity has for the longest time been associated with free-to-play and very cheap small indie games, made by either individual people or extremely small indie studios.

In contrast, as mentioned earlier, Unreal Engine has always had this strong association with really big-budget massive high-quality triple-A titles. Almost all of the really big and famous game franchises seem to use it. Unreal Engine isn't really associated with small indie games nor scammy asset flips.

Is this because Unity is free to use and Unreal Engine is very expensive (and thus only affordable by big game studios)? No. Both have extremely generous usage licenses that allow using them completely free up to surprisingly large amounts of revenue (and even after that you pay from your revenue, not upfront). Anybody can use either engine completely for free, no strings attached.

Is it because Unity is a very small and simple engine suitable only for small and simple games, while Unreal Engine is a massive engine supporting all the bells and whistles? Again, no. As also mentioned earlier, both are very modern and very complete in their support for modern gaming features, and both are capable of very similar graphics and other video game technologies. It is perfectly possible to make a full-on hundred-million-dollar-budget triple-A game with Unity, and a very small and cheap indie game with Unreal Engine.

So given how similar both engines are in terms of features, size, scope and usage licenses, how come Unity has gotten this reputation of being for small indie games and scammy asset flips, while Unreal Engine has the image of being a "big boys" engine for massive triple-A games?

This is because of one perhaps surprising policy that the creators of the two engines have had for the longest time. And, more precisely, because the engines are polar opposites in terms of this particular policy.

You see, from pretty much the beginning until very recently (the policy was changed not long ago, if I remember correctly), the free version of Unity had the requirement that any game made with it had to display the Unity splash screen when launching. Only a paid license of the engine would allow disabling the splash screen.

The developers of Unity probably thought that this would work as advertisement for their engine, in return for it being free to use. A bit like a small form of ad-supported free use (or, more precisely, supported by a single ad in this case).

In contrast, Unreal Engine has the exact opposite policy: you are not allowed to show the Unreal Engine splash screen in your game unless you get a particular paid license for the engine. In other words, you need permission to show the Unreal Engine splash screen, otherwise you can't show it.

Well, it turns out that the makers of Unreal Engine, perhaps serendipitously or thanks to amazing foresight, had the better idea.

The reason why Unity is often associated with small crappy games and scammy asset flips is precisely because they all show the Unity splash screen when launched (it can't be disabled in the free version). Bigger triple-A games, however, usually disable the splash screen because it may not fit the aesthetics of the game.

Thus there's a strong association between the Unity splash screen and the small crappy games.

In contrast, you see the Unreal Engine splash screen almost exclusively in huge triple-A games, and never in small indie games (where showing it is outright forbidden, even if the game uses the engine), which is why the name is so often associated with the former.

Thursday, August 17, 2023

How to browse the internet as safely and anonymously as possible

It doesn't really matter why one would want to browse the internet as anonymously and safely as possible; it is within everybody's rights to do so, if they so wish. The motivations behind it don't really matter, and it's not anybody's business. There can be completely legitimate reasons for wanting to browse the internet with complete anonymity, leaving no trace behind, and keeping your computer completely safe from any malicious software you might encounter online.

Important note: No method can ever be 100% foolproof, with zero chance of malicious actors, hackers, governments or other parties getting hold of your PC and/or seeing what you are doing. If you connect your PC to the internet (and sometimes even if you don't, if it has any sort of wireless capabilities), you always take some risk.

That being said, following all these steps will significantly reduce such risks: it will make it extremely hard for any malicious actor or software to see what you are doing or gain any kind of access to your computer, and much harder for any malware to invade it.

Another important note: Employing only one or two of these steps, while it may already add some safety, will not be sufficient. The more of these steps you use, the safer and more secure your setup will be.

1: Use a VPN

By this point it almost sounds like a cliche, but it does help: Using a VPN makes it significantly harder (although not 100% impossible) for anybody to trace what you are browsing back to your computer. It will (at least ostensibly) stop your internet service provider from seeing what you are browsing (because your ISP will only see an encrypted connection to some VPN server somewhere, not what you are actually connecting to at the end of the chain.)

Note that using a VPN will introduce significant lag to your internet connection (which is something VPN service providers will often lie about), so you might not want to have it constantly on, but only when you want to go private.

Also note that, as far as I know, there exist no good free-of-charge VPN services out there, so if you want to use one you'll have to buy a subscription. There's probably no (legal) way around this, but depending on your needs it may be worth it.

2: Use virtual machine software

Way too few people know and understand how incredibly handy and versatile virtual machines are.

A virtual machine (such as VirtualBox or VMware) allows installing and running a second operating system in such a manner that it's completely encapsulated in its own hardware sandbox (with all of its files in its own directory in the host operating system). Modern processor architectures allow running a guest OS at effectively the same efficiency as a natively-installed OS.

There are many advantages to a virtual machine: Whatever you do inside the virtual machine stays within the virtual machine, and has no effect on your natively-installed host operating system. (There may exist "jailbreak" exploits for some virtual machines, but these are rare. And, as said earlier, no system can ever be 100% safe; you can only try to increase safety as much as you can.)

Additionally, a virtual machine effectively allows taking "snapshots" of the entire guest system and later restoring the entire thing to what it was at the time of the "snapshot". In other words, it's effectively a perfect 100% backup that will turn back time and restore the system to exactly what it was before, bit by bit. If you ever want to undo something you have done inside the virtual machine, you can just restore this backup snapshot, and everything done after it will be gone. (The easiest way to take such a "snapshot" is to simply copy the directory where the virtual machine files are located somewhere else. You can then later copy it back, which will restore the guest system to what it was.)
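
As a concrete illustration of the "copy the directory" approach just described, here is a minimal sketch in Python. The directory paths and the VM name are hypothetical placeholders; adjust them to wherever your virtualization software actually keeps its files, and make sure the virtual machine is completely shut down before copying:

    import shutil
    from pathlib import Path

    # Hypothetical example locations; adjust to your own setup.
    VM_DIR = Path.home() / "VirtualBox VMs" / "PrivacyVM"
    BACKUP_DIR = Path.home() / "vm-snapshots" / "PrivacyVM-backup"

    def take_snapshot():
        # Copy the entire VM directory; this is the bit-by-bit "backup"
        # described above. The VM must not be running while copying.
        shutil.copytree(VM_DIR, BACKUP_DIR)

    def restore_snapshot():
        # Throw away the current state of the VM and put the backup
        # copy back, restoring the guest to exactly what it was.
        shutil.rmtree(VM_DIR)
        shutil.copytree(BACKUP_DIR, VM_DIR)

(VirtualBox and VMware also have built-in snapshot features that are faster and more space-efficient than a full copy, but the plain directory copy has the advantage of being a complete, self-contained backup that you fully control.)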

Also, a virtual machine allows running Linux inside it, even if your natively-installed host OS is Windows. Linux in itself adds a layer of protection, as it's less targeted by and less vulnerable to attacks (e.g. by trojans, viruses, etc.)

3: Use an encrypted partition in the virtual machine

When installing the guest operating system into a virtual machine, choose in the installer to use an encrypted partition. Most Linux distros offer this possibility in their installers (and if one doesn't, either choose a distro that does, or look up tutorials on how to encrypt the partition.)

When the guest operating system has been installed on an encrypted partition inside the virtual machine, whatever you do inside the virtual machine will leave no recoverable trace anywhere on your hard drives / SSDs: anything saved to disk inside the virtual machine will be encrypted. (Remember that simply deleting a file does not necessarily remove its bits from the storage device. Not even if you use some kind of "file shredder" application that tries to eliminate the original data by overwriting the file: in modern SSDs these overwrites may be written to a different location on the physical storage device. When the partition is encrypted to begin with, nothing is ever written to the storage device unencrypted, and thus there will be no unencrypted trace of anything anywhere.)

For the extra paranoid, you might want to use an encrypted partition for your natively-installed host OS as well (be it Linux or Windows); this too will add an extra layer of security, but it's up to you whether you want to go through the trouble. Doing it inside the virtual machine is hassle-free.

4: (Optionally) use a Tor browser inside the virtual machine

While the Tor network is often associated with the "dark web" and all kinds of illicit and illegal activities, it's not in principle designed for that: it's a legitimate way to browse the internet anonymously, and can be used to browse the regular, normal internet.

It shouldn't really be relied on on its own, without anything else, but in addition to all the above it will add yet another layer of protection.

Note that Tor is in some ways an alternative to a VPN, so using both at the same time might not stack one form of protection on top of the other. However, it may still be useful to use both, especially if you are going to use a normal web browser in addition to a Tor browser.

If you are going to use a regular web browser inside the virtual machine, it's recommended to use the "incognito mode" provided by the browser. This is not because it adds any security or anonymity (it doesn't), but because it's a convenient way of erasing whatever your browsing leaves behind on your disk, like tracking cookies, scripts, etc. If any dubious website attempts to do something to your system (even if it's just the guest system running inside the virtual machine), this adds a layer of safety in that the browser will remove everything that website did when the browser window is closed. This is a very mild form of security, but it still doesn't hurt to use it, and it's much more convenient than doing a full virtual machine snapshot restore.

Even among regular web browsers, not all browsers are equal. Some browsers have been specifically fine-tuned to make things like fingerprinting and tracking by websites as difficult as possible. An example of such a browser (widely preferred by privacy-conscious people) is LibreWolf, a fork of Firefox.

Monday, June 26, 2023

Was the QWERTY keyboard layout designed to slow typists down?

The QWERTY mechanical keyboard layout was devised in the 1870's. Of all possible keyboard layouts, it happened to become the universal standard (with a few very minor local variants in some countries, such as the AZERTY layout in France and some other French-speaking countries.)

One of the most persistent urban legends about the layout is that it was devised to slow typists down, because as typists became proficient and typed too fast, the hammers of the mechanical typewriters would hit each other and hinder typing. This factoid is usually cited as evidence of the QWERTY layout being (deliberately) inefficient and slow. (Unsurprisingly, it's a factoid often repeated by advocates of the Dvorak keyboard layout, who allege it to be significantly more efficient and faster to type with.)

The myth is based on truth, but only partially.

It is true that when testing different keyboard layouts, with some layouts the hammers, especially those physically close to each other, would hit each other and get stuck when typing too fast. After all, each hammer, controlled by its own key, has to hit the same spot, so if one hammer doesn't get out of the way before the next one comes in, it will be in the way and block it, or the two could get stuck together, hindering or stopping further typing.

The QWERTY layout was indeed primarily designed to address this problem, by spreading out the most commonly used letters in English as far apart from each other as possible.

However, this was not done to slow down typing. It was done so that the corresponding hammers would likewise be as far apart from each other as possible. When two hammers in a mechanical typewriter are farther apart, they get out of each other's way quicker: since the hammers are physically located far apart in the mechanism, their trajectories diverge more quickly.

In other words, the QWERTY layout was designed to physically spread out the hammers for the most commonly used English letters, minimizing jams. It was not designed to slow down typing.

This is once again a factoid that's partially based on reality but gets the details wrong.

Saturday, April 1, 2023

The Final Fantasy series, a personal retrospective

I have been playing video games since the early 1980's. Of the literally thousands of games that I have played during my life, the Final Fantasy series by Square (later Square Enix) holds a special place in my nostalgia.

I have played every single game in the mainline series, with the exception of the two MMORPGs, as well as some of the spinoffs and side games (such as Crisis Core, Final Fantasy Tactics, FF7 Remake, and a few others). This blog post will focus only on the mainline series.

The series could, perhaps, be divided into three distinctive "eras", based on style, game mechanics and content. There are no hard lines dividing the games between these three eras, as there is a bit of overlap between them, but nevertheless, I would classify the three eras as follows (note that I'm using the Japanese numbering for the earlier games, which is now the official numbering as well):

  1. The "classic" or "golden" era, consisting of the Final Fantasy games from I to VI, plus IX.
  2. The "interim" era, consisting of the Final Fantasy games VII and VIII.
  3. The "modern" era, consisting of all Final Fantasy games from X forwards.

The "classic" games consisted of games using a 2D top view and 2D sprites, and were largely designed around that graphical style. (Rather obviously this style was largely dictated by the limitations of the hardware of the time, namely the NES and the SNES.) While Final Fantasy IX does use full 3D graphics, I still count it as belonging to the "classic" era because it was explicitly designed by Square to be a homage to the 2D classics, and follows many of the same game mechanics to a T.

While all six (plus one) games are masterpieces in their own right, I consider Final Fantasy VI to be pretty much the culmination of the classic era. The magnum opus. The game I would choose if I had to pick one to represent what "Final Fantasy" is about. This game just transcends everything else, and in fact I consider it not only the best Final Fantasy game, but one of the best video games ever made, period, regardless of genre or time period. I have extreme nostalgia for this game in particular.

In the "interim" era, consisting of Final Fantasy VII and VIII, Square radically shifted the style of the games, moving from 2D graphics to 3D graphics. But the shift went farther than just the visuals, as the contents, the gameplay, the storytelling, experienced a significant change as well, compared to all the previous six games.

One of the most significant changes in tone, started by FF VII, is the move from high fantasy to "magitech fantasy" (where the "medieval Europe setting" is essentially replaced with a sort of steampunk or dieselpunk fantasy setting). Final Fantasy VI, the last game of the "classic era" (not counting IX), already delved a bit into this, so it showed the early signs of this shift in tone and setting. However, it was VII that embraced the full-on magitech-steampunk style.

Final Fantasy VII is considered by many to be the best game in the series, its culmination and magnum opus. Personally, I don't see it. I played the original PlayStation version, and I wasn't really all that enthralled by it. I have also played Final Fantasy VII Remake, and I still don't get all that excited about it. (The Remake isn't just the original game with updated graphics and added content; it's actually an independent game of its own, but I will not spoil the idea any further here. However, for a significant part it does replicate the original game, just with improved graphics, so it kind of works as a modern version of it.)

It may be just me, but I just cannot understand what's so special about VII. For me the previous game, VI, is the one. The absolute best, the absolute culmination, everything that Final Fantasy is, the magnum opus. (And we are talking about the original SNES version. It has absolutely nothing to be ashamed of compared to modern games. It still stands as one of the best games ever made, even using the SNES hardware.)

Then there's the "modern" era. Oh boy, is a doozy...

This is the era, starting with Final Fantasy X, where Square Enix took the series in a completely different direction. They threw almost everything from the previous games away, keeping only a very general theme and vague allusions to the previous games (such as chocobos appearing in every game). In general, the gameplay and game mechanics are completely different and unrelated to all the previous games, and there are very few traces left of the classic 8-bit and 16-bit JRPG mechanics and tropes.

Personally, I do not consider Final Fantasy X or any of the subsequent games in the series to be actual Final Fantasy games. They are a completely different and unrelated game series that just carries the same name (soiling the originals).

Most people consider Final Fantasy XIII to be the worst game in the entire series, by a wide margin. I disagree with that sentiment. I consider it the second-worst. Far, far below it is Final Fantasy X, which I consider not only the worst Final Fantasy game, but in fact one of the worst video games ever made. Everything that FF XIII did wrong, FF X did years earlier, and much worse at that.

If FF XIII is extremely and utterly linear, with extremely short and linear levels using a very abstract, nondescript graphical design, and with pretty much nothing of what makes a JRPG a JRPG (such as towns, NPCs, shops, a wide open overworld and so on and so forth), FF X had already done all those same things, but even worse. Its levels were even shorter, even more linear, even more abstract, and the game lacked even more of the JRPG game mechanics and tropes.

On top of that, FF X is just infuriating to play due to its sheer amount of cutscenes. It has been a joke for quite a long time that modern games are nothing but cutscenes occasionally interrupted by short segments of gameplay. Well, with FF X that's not a joke, it's reality. That's exactly what the game is. And the worst part is that the cutscenes are overly long, boring, uninteresting and badly acted. I'm not even kidding: there are parts of the game where you watch a long cutscene, gameplay continues, and less than 10 seconds later another long cutscene starts, interrupting your gameplay again. There are large segments of the game where gameplay is literally a small fraction of the time.

FF XIII is slightly better than this, but not by a whole lot. XII is boring and uninspired (I couldn't even bother playing it to its halfway point, not even close), and XV, while better than either of those, is bland and uninspired, and also lacks most of what makes a JRPG a JRPG. (The two remaining games are the MMORPGs, which I have not tried.)

Thursday, March 2, 2023

Jules Verne was a bad writer

While the title might sound provocative and controversial, I do seriously think that the writing skills of Jules Verne have always been greatly exaggerated. (Or, perhaps more precisely, not his skills in expressing himself in textual form, but his skills as a storyteller, as a writer.)

For about a century Jules Verne has been hailed as one of the greatest authors of fiction in human history, sometimes even going so far as being called "the father of science fiction", among other fancy accolades. At least in past decades his books were a staple reading of schoolchildren and adults alike, and he has always been praised for his imagination, inventiveness and great science fiction writing.

Imaginative and inventive he might have been, but I would posit that he wasn't really all that good of a writer. Most of his stories are full of silly details that are illogical or nonsensical. Not only are his books full of scientific inaccuracies (even taking into account the time they were written, as any person of the time even slightly educated in science and physics could have attested), but there are also many weak plot points and illogical details.

The list of scientific inaccuracies (even by the knowledge of the time) and illogical or weak plot points from all of his books would be quite extensive, but let me present one particular example that struck me as odd even back when I first read the book as a child.

The protagonist and narrator of Twenty Thousand Leagues Under the Seas (one of Verne's most famous books) is one Professor Pierre Aronnax, a marine biologist and journalist, who ends up accidentally captured aboard the Nautilus, the submarine of Captain Nemo, along with a couple of other people from the ship he was traveling on.

At one point, well into the story, after these outsiders have been drugged to sleep and have woken up again, Captain Nemo, very distressed and outright desperate, asks Aronnax if he might perhaps be a doctor. It is implied later that the Nautilus had sunk yet another ship by ramming into it, and a crew member had suffered a bad head injury in the collision.

Completely out of the blue Aronnax confirms that yes, he's a Doctor of Medicine.

At no point prior was it established that this marine biologist and journalist was also a Doctor of Medicine, or any kind of medical practitioner. Nor is it ever mentioned again after this small segment of the book. It comes up completely out of the blue and is then completely forgotten.

It makes little sense. While of course it's not impossible, it's unlikely that someone would go through the 6+ years of medical school, plus all the extra years of training needed for a doctorate in medicine, and then also become a marine biologist. It makes especially little sense because this quite important aspect of his career is never mentioned nor alluded to in any way before this point (and is completely forgotten after it).

It also makes very little sense that Nemo would have a submarine with a crew of dozens of people, and no doctor aboard. (It is never established in the book that there was a doctor who, e.g., died or something along those lines. No mention of any kind is made about why there is no doctor among the crew of the Nautilus. It's left completely unexplained.)

The impetus for writing this into the book, completely out of the blue, is so that Aronnax can be taken to parts of the ship that he was normally completely banned from entering, letting him see more of the ship and estimate the size of the crew (this is specifically alluded to in the subsequent narrative).

And this is precisely why I consider this a clear example of bad writing: This "I'm a Doctor of Medicine" is pretty much a deus ex machina, a completely illogical, out-of-the-blue plot point that was never established and is never used again, all for such a small thing as the protagonist being taken to parts of the ship that were previously off limits to him.

There would be a million ways to achieve the same thing in a more logical manner, without resorting to such a stretched and illogical detail.

And this is not an isolated case of such bad writing. Verne's books are full of such examples. This is just one of them.