When Nvidia first introduced its smart upscaling technology, DLSS (which uses neural networks to upscale a lower-resolution image to a higher resolution with a much better end result than naive upscaling), the idea was simple: to allow even older hardware to run newer games at a decent framerate.
After all, this has been the bane of PC gaming since the beginning of time: newer games require faster hardware, so a 10-year-old PC usually has a hard time running them at an acceptable resolution and framerate.
Smart upscaling offers a good solution to that problem: the 10-year-old PC can now render the game at a much lower resolution (thus raising the framerate to acceptable levels), and DLSS upscales it to a higher resolution, with the end result looking almost as good as if the game had been rendered at that higher resolution to begin with.
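To make the contrast concrete, here is what "naive upscaling" means: plain nearest-neighbour pixel duplication. This is just an illustrative sketch of the baseline, not anything DLSS actually does; a neural upscaler reconstructs detail using far more information (the rendered frame, motion vectors, and previous frames).

```python
def nearest_neighbor_upscale(image, factor):
    """Naively upscale a 2D grid of pixel values by an integer factor,
    duplicating each pixel into a factor x factor block.
    (Illustrative baseline only; real upscalers interpolate or, like
    DLSS, reconstruct detail with a neural network.)"""
    out = []
    for row in image:
        # Repeat each pixel 'factor' times horizontally...
        scaled_row = [px for px in row for _ in range(factor)]
        # ...and repeat the whole row 'factor' times vertically.
        out.extend([scaled_row[:] for _ in range(factor)])
    return out

# A tiny 2x2 "image" doubled to 4x4:
small = [[1, 2],
         [3, 4]]
big = nearest_neighbor_upscale(small, 2)
# big == [[1, 1, 2, 2],
#         [1, 1, 2, 2],
#         [3, 3, 4, 4],
#         [3, 3, 4, 4]]
```

The blocky result is exactly why naive upscaling looks bad: no new detail is created, existing pixels are just made bigger. DLSS's selling point is producing an image that looks close to a native high-resolution render instead.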
(Of course, DLSS requires at least an RTX 20 series card, but the idea is that, 10+ years after that generation's release, even newer and more demanding games could still be run on that now-10-year-old hardware at an acceptable resolution and framerate.)
Some years later Nvidia developed the next step in this idea: frame generation. In other words, not only could the PC render at a lower resolution (with the end result still looking high-resolution), but it could, for example, render only every second frame, with the in-between frames generated by a neural network, thus effectively almost doubling the framerate (at a very small cost in latency).
So, for example, an older PC could render the game at a resolution of, say, 960x540 pixels at 30 frames per second, and DLSS would convert it to 1920x1080 pixels at 60 frames per second. Thus, even older hardware could get a full 1080p experience with newer games.
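The back-of-the-envelope arithmetic for that hypothetical scenario is worth spelling out, because it shows how dramatic the savings are. (The factor-of-2 framerate multiplier assumes single-frame generation, i.e. one generated frame per rendered frame.)

```python
def pixels(width, height):
    """Total pixels per frame."""
    return width * height

render_px = pixels(960, 540)      # 518,400 pixels actually rendered
output_px = pixels(1920, 1080)    # 2,073,600 pixels presented

# Upscaling: only 1/4 of the output pixels are rendered;
# the upscaler reconstructs the rest.
upscale_factor = output_px / render_px   # 4.0

# Frame generation: every second presented frame is generated,
# so 30 rendered FPS becomes ~60 presented FPS.
rendered_fps = 30
presented_fps = rendered_fps * 2         # 60

# Combined: the GPU renders only ~1/8 of the pixels per second
# that a native 1080p60 presentation would require.
workload_ratio = (render_px * rendered_fps) / (output_px * presented_fps)
print(workload_ratio)  # 0.125
```

In other words, the GPU does roughly one eighth of the raw pixel-pushing work of native 1080p at 60 FPS, which is exactly why this looked like a lifeline for aging hardware.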
In other words, you wouldn't necessarily need to upgrade your RTX 2070 system to an RTX 6070 (or whatever) in order to run newer games acceptably.
Nvidia, in other words, quite clearly envisioned game developers continuing to develop and optimize their games as normal, with DLSS helping those games run even on older hardware.
However, that's not what happened.
What instead happened is that many game developers are now using DLSS as a crutch, as an excuse not to optimize their games properly. More and more games require DLSS to run at acceptable resolutions and framerates even on the latest hardware, while not looking any better, for the simple reason that the developers are saving time and money by skipping optimization.
In other words, more and more developers are taking the lazy route of "why spend months optimizing our game to run at 60 FPS when we can just use DLSS?"
So what's happening is that many new games require DLSS even on the newest hardware and, thus, will not run well on older hardware (including the RTX 20 series), even though that was supposed to be the entire core idea of DLSS! Even with DLSS, these games run like crap on older hardware.
It doesn't exactly help that more and more developer studios are, for some reason, jumping onto the Unreal Engine 5 bandwagon, and said engine is, for some reason, astonishingly inefficient (much more so than Unreal Engine 4, even for content that looks the same).
More and more people are noticing how utterly inefficient Unreal Engine 5 games are, especially compared to Unreal Engine 4 games. And what's worse, the former do not look particularly better than the latter. (In fact, sometimes it's even the opposite: Many current Unreal Engine 5 games actually look visually worse than many Unreal Engine 4 games from 10 years ago.)
Why are so many game studios jumping to Unreal Engine 5, rather than Unreal Engine 4? I have no idea.
The situation has become so bad that several recently published games are actually listing DLSS and frame generation in their recommended specs. Astonishingly, some are even listing them in their minimum required specs!
And some of those games don't even visually justify that requirement. The most infamous recent example is the latest Lego Batman game, which lists DLSS as a minimum requirement even though it has the visual quality of many games from over 15 years ago.
Can you guess which game engine that Lego Batman game uses? (If you guessed "Unreal Engine 5", you would be absolutely correct.)
This entire thing is getting completely out of control. We are already getting games in the Lego series that require DLSS to run properly.