It's a trope as old as humanity: Everything was better in the past, nowadays everything sucks. Music 25 years ago was awesome, modern music is trash. Movies 25 years ago were great, nowadays they are nothing but CGI slop. And, of course, video games in the past were better than today: They might look prettier (well, at least sometimes), but they are worse in most other ways.
However, at least when it comes to video games, particularly certain long-running franchises, this is not just the nostalgia filter speaking. There are many objectively measurable ways in which newer triple-A games are worse than equivalent triple-A games of 15 or even just 10 years ago. There are, for example, myriads of compilation and commentary videos on YouTube making direct comparisons between such games.
Some examples include:
- In a war game from the early 2010s you could shoot a building with a tank and its walls would crumble, and if you kept shooting at it, the entire building would collapse. In a modern game in the same franchise, shooting a building with a tank does nothing to it.
- Likewise, shooting at a wooden fence with a firearm would destroy it much more realistically in many war games from 15 or so years ago.
- Water effects tended to be much more realistic in many triple-A games 15 or so years ago than in equivalent games (even within the same franchise) today: how the water reacts when you wade through it or shoot at it, how transparent it is, how it distorts the ground beneath it, and so on.
- Many other effects, such as explosions, smoke effects, the effect that projectiles had on walls and so on and so forth, often (and perhaps a bit surprisingly) looked significantly better and more realistic in the older games than today, even within the same franchise.
- Many games put significantly more effort into making grand-scale physics look realistic, such as how it looks when an entire high-rise building collapses, or when a big element (such as a huge antenna) falls off the top of a building.
- Overall, and somewhat ironically, game physics tended to be more polished and look more realistic 15 years ago than they look in many similar games today.
- Many visual effects, such as lighting and reflections, looked better 10 years ago than they look today in some games that have RTX support, when RTX is turned off (in other words, when they have to rely on the same rendering techniques as the games from 10 years ago).
- Many modern games are much heavier to run than games from 10-15 years ago, even when the graphical and visual quality are set to be very similar (i.e. the comparison is on even and comparable ground). In fact, many modern triple-A games look objectively worse than games from 10-15 years ago when their graphics settings are tuned so that they run at about the same framerate, at the same native resolution (i.e. no upscaling), on the same PC.
And all this even though the budgets of these triple-A games are much larger today than they were 15 years ago, even within the same game franchise.
This is not to say that every single video game published in 2025 looks worse and has worse visual and physics effects than the best games published in 2010-2015. However, there is a clear trend that can be seen with many triple-A games, especially when it comes to long-running franchises.
What has caused this?
There are probably myriads of reasons for this, but here are some of the likely ones:
1) There is less talent and passion today in big game studios
Many of the game developers in big game studios in the era from about 2000 to 2015 were "old-timey" demo coders and hackers of the 1990s: computer nerds who coded graphically impressive demos and games out of sheer passion, and who had great talent, deep knowledge, and serious coding skills. Many of these people could code in one day something a thousand times more impressive than a modern university graduate could code in a month, and that's no exaggeration.
These "demo coders" and hackers grew up and many of them went to work in the gaming industry, for these big game studios such as EA, Ubisoft and so on. And they brought their talent and passion with them. They would, for example, spend a week implementing very detailed and accurate building crumbling mechanics and physics so that buildings could be destroyed with tank fire, just out of sheer passion and accomplishment.
This is, in fact, one of the reasons why game mechanics advanced in leaps and bounds during that era of about 2000-2015, and why many games, particularly towards the end of that era, are so impressive even by today's standards.
However, starting from about that 2010-2015 time period onward, many of these big game studios started changing. They grew bigger and bigger, budgets grew bigger and bigger, and they became more and more what could be called "industrialized". Many of these big game studios stopped making games out of sheer passion; instead, games became just a means of making money. Deadlines became tighter, overtime and crunch became the norm, and management became less and less tolerant of time being "wasted" by these "demo coders" spending a week or two polishing some irrelevant detail in the physics engine. On top of that, the politics of both western society at large and the video game industry itself were changing, and these studios started prioritizing things other than talent and expertise.
Many of these '90s "demo coders", who were in the industry out of passion and love for their craft, got fed up and left. Indeed, there has been quite a massive exodus of "old-timey" coders from many of these big studios, such as EA, Ubisoft and several others. Some of them have created their own smaller studios, and others have moved on to something else entirely, fed up with an industry that no longer suits them and their passions.
Thus, these big game studios have been replacing the old "demo coders" with new recruits who have less talent, less knowledge, fewer skills, and significantly less passion for low-level game development. These are people who will not spend a week polishing some particular mechanic or effect simply because they love doing it and have the knowledge and passion for it. And even the few old-timer holdouts who still cling to their jobs in these studios are often held back and hampered by internal company politics and by new, self-entitled recruits who are not there to make great games but to boss others around.
2) Scrum may be hindering polish and innovation
Software development companies, big and small, just love Scrum, and have been adopting it over the last 15 or so years. Scrum is an "agile development" framework that is supposed to make software development more efficient and effective by imposing a clear process: the project is divided into clearly planned and defined tasks and sub-tasks, which are placed on a timeline and in a kind of priority list; every programmer takes or is assigned tasks; and weekly and daily meetings are held to figure out where everybody is with their current tasks and to decide what to do next.
Among the hundreds and hundreds of similar software development frameworks, Scrum has become a clear favorite and is almost universally used. It has become the de facto standard, and it is often contrasted with its exact opposite: a complete "wild west" form of "cowboy programming" where everybody does whatever they feel like, with little to no supervision, planning, or testing.
There are many good things about Scrum, and when well implemented (which isn't actually easy) it can improve software development. There are also bad things about Scrum which can hinder polish and innovation, particularly in large video games.
One of the major problems with it is that, when tightly implemented, it ties the developers' hands and puts an extremely bright spotlight on everything they are doing. Developers are not free to do whatever they want, and have essentially no leeway in what they do (other than, at some level, choosing which tasks to take for the next Scrum sprint). There are no "side projects", no "hobby projects", no "experimentation", no "let's try this to see if it works", no "let's polish this feature a bit, even though nobody asked for it." Every single task, down to the most minute level, is clearly defined and assigned to a specific developer: "Person X does task Y, person A does task B. Period."
Sure, developers are free to suggest and even create new tasks that they come up with, like "research and implement a way for buildings to be destructible by tank fire." However, in a tight Scrum framework they usually are not free to just start doing those tasks: The tasks need to be approved in a planning meeting, and added to the next Scrum "sprint" before anybody can start doing them.
And what happens when the higher-ups see such a task? They start asking "do we really need this? Is this really necessary? There are more important and urgent things to finish first." Such "unnecessary" side tasks rarely get selected for the next sprint, and thus are shoved aside and forgotten. The developer who had the inspiration and passion for that kind of task never gets to do it: their hands are tied, because they have to go through the process, reveal what they were planning to do, and have it approved. And, thus, "unnecessary" development often does not get approved.
And thus Scrum often kills polish and innovation in video games. It takes away developers' freedom to pursue new ideas and the things they are passionate and talented about. Suddenly higher-ups start scrutinizing what they are doing and denying them these "unnecessary" side projects, because there are "more important" tasks to do first.
3) Technological innovation is making games worse
Many people have noticed and commented on the fact that technological innovation, particularly in graphics hardware, is actually, and very ironically, making games worse.
One of the most prominent examples of this is smart upscaling: a technique that lets a game render at a lower resolution and then has a smart upscaler scale the result up to the display's native resolution in a way that looks better than naive upscaling (in other words, the picture doesn't become blurry or pixelated, but retains small details as much as possible).
The original intent of this feature was, of course, to allow somewhat weaker hardware to play games at higher resolutions with decent framerates. After all, weak hardware has always been the bane of every PC gamer who can't afford a top-of-the-line gaming rig: they always need to lower the graphical quality, the resolution, or both, of new games in order to play at a reasonable framerate. Well, no longer! Now they can play at their display's native resolution with pretty much the highest graphical quality, even on weaker hardware. This allows even weaker gaming PCs to produce visuals that are almost indistinguishable from those of top-of-the-line PCs. The trick is that behind the scenes the game is actually rendering at a significantly lower resolution, which is much faster, and then the smart upscaler makes it look almost as if it had been rendered at native resolution in the first place.
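To give a rough idea of why this shortcut is so tempting, here is a minimal sketch of the render-scale arithmetic involved. The resolutions, the 50% scale factor, and the `Reconstruct` placeholder are purely illustrative assumptions, not the API of any real upscaler; actual upscalers (DLSS, FSR and the like) do far more sophisticated temporal and machine-learning-based reconstruction.

```cpp
// Minimal sketch of the render-scale idea behind smart upscaling.
// All numbers and names here are illustrative, not any real upscaler's API.
#include <cstdio>

int main() {
    const int nativeW = 3840, nativeH = 2160; // the display's native 4K resolution
    const double renderScale = 0.5;           // render at 50% of native, per axis

    // The game actually renders into a smaller internal buffer...
    const int internalW = static_cast<int>(nativeW * renderScale);
    const int internalH = static_cast<int>(nativeH * renderScale);

    // ...which means it only has to shade a fraction of the pixels per frame.
    const double pixelRatio = (double(internalW) * internalH)
                            / (double(nativeW) * nativeH);

    std::printf("Internal render target: %dx%d (%.0f%% of the native pixel count)\n",
                internalW, internalH, pixelRatio * 100.0);

    // A hypothetical upscaler would then reconstruct a native-resolution image
    // from the smaller one (in practice also using motion vectors and previous
    // frames), and that reconstructed image is what the player actually sees:
    // upscaler.Reconstruct(internalBuffer, nativeBuffer); // placeholder, not a real call
    return 0;
}
```

Rendering at 50% scale per axis means shading only about a quarter of the pixels each frame, which is where the performance headroom comes from; how close the final image gets to true native rendering depends entirely on how good the reconstruction step is.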
However, this technological innovation had a huge negative side effect: many game developers started taking it as an excuse not to optimize their games the way they had to in the past. Why optimize the game to hit that golden 60 frames per second at native 4K, even on high-end PCs, when you can just use the smart upscaler and make it look as if it does? Why spend time optimizing the game when you have this wonderful tool that lets you skip all that?
The end result has been, of course, that new games still run like crap on weaker gaming PCs. The smart upscaling technology didn't help them one bit; with only a few exceptions, not much changed. Well, except that games now look worse than before, because the smart upscaler isn't perfect: it does a decent job of adding missing detail, but it can't beat the game actually being rendered at the display's native resolution in the first place. (Ok, to be fair, there are a few situations where the smart upscaler actually produces a better-looking result than rendering at native resolution. But this is very hit-or-miss: most things look ok, a few things actually look better, but many things look worse.)
RTX is, of course, the other technological innovation that's causing games to look worse than they did 10 years ago when RTX is turned off (i.e. when they have to rely on the same rendering techniques as in the past). The developers just can't be bothered to make non-RTX graphics look as good as they did back then.
In other words, technological innovation has made game developers lazy, and the end result is often worse than what it was 15 years ago.