Friday, April 17, 2026

Is the "gaming" label in PC peripherals just a marketing gimmick?

For quite a while now, probably 15 to 20 years, a lot of PC peripherals have been marketed with the label "gaming". Heck, even things like chairs have been marked with that label.

But does that label actually mean anything? Does it make any real difference, or is it just a meaningless marketing gimmick?

With some peripherals it may well be completely meaningless, and the device is just a completely normal product, no different from the non-"gaming" versions from the same manufacturer.

With some peripherals, such as mice, SSDs, GPUs and RAM, the "gaming" label might just be slapped onto higher-end products, such as high-DPI mice and faster SSDs, GPUs and RAM. So it's essentially a marketing gimmick in that it replaces some technical term (like "high-DPI") with a term that sells better (ie. "gaming"). So whether that label, "gaming", is actually meaningful is a bit of a matter of definition. In general, not really.

However, there is one type of peripheral where "gaming" might actually be meaningful, where it genuinely affects how the device has been designed and manufactured, rather than being either meaningless marketing drivel or just a synonym for "higher-end product".

And that's "gaming" keyboards. At least in some cases.

How so?

The vast majority of keyboards do not support every single possible combination of simultaneous key presses. For example, even a simple 104-key keyboard has 2^104 possible keypress combinations, which is an absolutely humongous number. Even a 64-bit value wouldn't be able to represent all of them.
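
To put that number into perspective, here's a quick back-of-the-envelope check (a Python snippet, purely illustrative):

    # Each of the 104 keys is independently either pressed or not pressed,
    # so the number of distinct key-press combinations is:
    print(f"{2 ** 104:.3e}")   # ~2.028e+31 combinations

    # For comparison, the largest count a 64-bit value can represent:
    print(f"{2 ** 64:.3e}")    # ~1.845e+19 -- twelve orders of magnitude short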

Instead, most if not all keyboards have an internal circuitry design that supports only some combinations of simultaneous keys, but not nearly all of them. Typically the keys are, essentially, internally wired in a sort of grid pattern, where the controller drives one row of the grid at a time and reads back the columns. Any two keys can be distinguished this way, but certain combinations of three or more keys that share grid rows and columns become ambiguous, so they cannot all be registered. (In reality it's a bit more complicated than this, but that's the essential idea.) This saves an enormous amount of circuitry and electronic components, and thus it's much more cost-effective.

This "grid" doesn't need to follow the physical layout of the keyboard, though. The designers can route the connections however they want, thus shuffling the grid elements around to cover pretty much whatever keys they want (so, for example, one "row" of keys might consists of completely and seemingly randomly placed keys on the physical keyboard.)

Thus, the designers of the keyboard circuitry have a choice to make when it comes to which key presses are supported simultaneously and which aren't.

And here's where the "gaming" aspect of keyboard design kicks in, quite literally: On a "gaming" keyboard, the upper-left cluster of the letter and number keys will usually support significantly more simultaneous key presses than the rest of the keyboard, and this is precisely for better support in video games.

For example, I myself own a "gaming" keyboard, and it supports pressing all ten of the keys QWERT plus ASDFG simultaneously without problems. However, if I press merely three keys, V, B and N, simultaneously, only two of them will register (the two that I press first).

An "office" keyboard would not need this peculiar choice of where the "densest" concentration of multiple supported key presses are located, but a "gaming" keyboard most definitely benefits from it.

This might be one of the best examples of where the "gaming" label is not a mere marketing gimmick, but actually indicates a hardware design choice for the explicit support of video games.

Friday, April 10, 2026

Explanation for the astonishingly large "minimum livable" wage in the United States

For quite a while now I have been astonished by what is considered a "minimum livable" wage in the United States. In other words, what is generally considered the absolute minimum yearly income that allows you to survive, barely, on your own, without having to rely on charity or governmental welfare.

One of the most common numbers cited for this is 50000 USD a year, ie. about 42000 €. That would be about 4170 USD or 3500 € per month.

That number always makes my jaw drop, because 42000 €/year is a very good salary in most European countries, even in the richest and most expensive-to-live-in ones. It's a decent salary for an engineer in the tech industry, and way, way higher than what most low-level jobs pay.

Even in the richest and most expensive European countries (eg. the Nordic countries), a "minimum livable" wage would be about 1500 €/month, ie. 18000 €/year (about 21000 USD/year), although many people manage to live independently on salaries as low as 1000 €/month (about 1200 USD/month). It's not great, but it's livable if you don't have huge expenses.

And that's before taxes. On top of that, taxes are much lower in the US than in Europe (particularly in the expensive countries), which means that your net income in the US is even larger in comparison.

So if we put all of that in USD for our American friends, that would be:

  • Generally considered "minimum livable" income:
    • US: 50000 USD/year
    • Europe: 21000 USD/year
  • "Barely survivable" extremely low income:
    • US: 30000 USD/year
    • Europe: 14000 USD/year
  • Decent income for a senior tech engineer:
    • US: 150000+ USD/year
    • Europe: 70000+ USD/year

And that is, as mentioned, before taxes. After taxes the difference is even bigger (because taxes are so much lower in the United States.)
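
Running the list above through a quick sanity check (a Python snippet using only the rough figures quoted in this post, not official statistics) shows that the gap is roughly the same factor in every bracket:

    # Rough before-tax figures from this post, in USD per year.
    figures = {
        "minimum livable":      {"US": 50_000, "Europe": 21_000},
        "barely survivable":    {"US": 30_000, "Europe": 14_000},
        "senior tech engineer": {"US": 150_000, "Europe": 70_000},
    }

    for label, f in figures.items():
        print(f"{label}: the US figure is {f['US'] / f['Europe']:.1f}x the European one")
    # -> between 2.1x and 2.4x across the board

In other words, the question is essentially why everything in the United States costs a bit over twice as much.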

How is this even possible? Well, I did a bit of research about this, and here are a few reasons for the disparity:

* Firstly, typical rent is significantly higher in the United States. Where the monthly rent for a small apartment in a small-to-medium-sized city in Europe is typically somewhere around 450 to 600 USD, in the United States an equivalent small apartment in an equivalent city typically rents for 1200 to 1500 USD, and often even higher. That's roughly triple. With larger apartments the difference can be even bigger.

There are many economic reasons for this disparity. Also, from the perspective of a "senior tech engineer", apartment rents skyrocket in the cities with the densest concentration of tech companies where those engineers work. Even a very small apartment can have a monthly rent well in excess of 2500 USD. That's about five times the typical small-apartment rent in Europe (even in equivalent tech-heavy cities.) It's all about supply and demand.

* Secondly, health insurance is all but mandatory, unless you plan on never getting sick or injured. The costs of health insurance vary a lot, but the absolute minimum is somewhere in the ballpark of 8000 to 10000 USD per year (unless the employer covers part of this expense as a job benefit, which some do, but many of the smallest/cheapest companies don't.)

That's the "barely survivable" European income almost on its own (especially after taxes).

In Europe, of course, there are pretty much no out-of-pocket expenses for health services (and even where there are, they tend to be extremely small in comparison.)

* Thirdly, unlike in Europe, public transport in the United States is absolutely abysmal. In the biggest cities it can be decent, and in some places you can actually get by without owning a car, but in most places owning a car is pretty much mandatory in practice, or else you'll have a really hard time getting anywhere (including to work.)

Cars and fuel are significantly cheaper in the United States than in Europe (especially the Nordic countries), but they nevertheless eat a good chunk of your yearly income, easily as much as the health insurance, if not more. In most of Europe, however, if you can't afford a car, it's perfectly possible to get by on public transport alone (services tend to be excellent, even in small cities and towns.)

Wednesday, April 1, 2026

False myths: Subliminal ads inserted into movie frames

Since at least the early 1980s, and even much earlier, there was, in many parts of the world, a widespread notion that some movie producers had at least considered inserting a form of subliminal advertising into the film reels they were sending to movie theaters: an advertisement picture (eg. for a brand of soda, or whatever) would be shown for a single frame of the movie, eg. every 24 frames, ie. once per second.

The widely believed claim was that since the picture was only shown for one single frame, it would go by too fast for anybody to consciously notice, but the subconscious would register it, especially since it was shown repeatedly, once per second, during the entire movie, and thus it would create a subconscious craving for that particular product in the viewers.

This notion was so widely believed that, in fact, many countries outright passed laws banning this from being done.

The funny thing is that many people believed that claim, ie. that you wouldn't notice the advertisement picture if it was shown for one single frame, without ever having tested it. This factoid was just repeated over and over and over. I, in fact, heard it from my primary school teacher, who repeated it in all seriousness, without criticism or doubt. In fact, many people to this day, in 2026, still believe it.

This is particularly funny because of how obviously false it is. Just try it: Create a video at a framerate of 24 frames per second (which was, and still is, the standard framerate for theatrical films), replace every 24th frame with a static picture that has nothing to do with the rest of the video, and then play it at that speed: The picture flashing once per second will be extremely obvious. Completely impossible to miss. Even someone who has no idea what's going on will clearly see it.
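
If you want to try it yourself without any video editing software, here's a minimal sketch in Python (assuming the opencv-python and numpy packages are installed; the file name, resolution and "ad" text are of course arbitrary):

    import cv2
    import numpy as np

    FPS, W, H, SECONDS = 24, 640, 360, 10

    writer = cv2.VideoWriter("flash_test.mp4",
                             cv2.VideoWriter_fourcc(*"mp4v"), FPS, (W, H))

    for n in range(FPS * SECONDS):
        if n % FPS == 0:
            # Every 24th frame: a completely unrelated "ad" frame.
            frame = np.full((H, W, 3), (0, 0, 255), dtype=np.uint8)  # red (BGR)
            cv2.putText(frame, "DRINK SODA", (140, 190),
                        cv2.FONT_HERSHEY_SIMPLEX, 1.5, (255, 255, 255), 3)
        else:
            # Stand-in for the "movie": a gray frame with a moving square.
            frame = np.full((H, W, 3), 128, dtype=np.uint8)
            x = (n * 5) % (W - 40)
            cv2.rectangle(frame, (x, 160), (x + 40, 200), (255, 255, 255), -1)
        writer.write(frame)

    writer.release()

Play the resulting file, and the once-per-second red flash is exactly as impossible to miss as described above.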

Even if you don't replace the entire frame of the original video with the ad picture, but instead embed the ad (say, the Coca Cola logo) into the original frame, for example in a corner with a transparent background, you will still very clearly see it flashing every second (or at whatever interval you used). You'll likely even be able to read what it says.

1/24th of a second, about 42 milliseconds, is not even nearly fast enough to go unnoticed. Neither is the 1/30th of a second (about 33 ms) used in NTSC (and in most online videos, eg. on YouTube). Not even 1/60th of a second (under 17 ms), if you were to create a 60-fps video, would be enough. The flash might be less obvious and you might be less able to read what it says, but it would still be quite noticeable.

That original myth was just repeated blindly, and people just believed it, without ever raising any doubts or ever actually testing it.