It’s not that I don’t like realistic graphics. But I’m not gonna pay 100 bucks per game + micro transactions and / or live service shenanigans to get it. Nowadays it’s not even that hard to have good looking games, thanks to all the work that went into modern engines. Obviously cutting edge graphics still need talented artists who create all the textures and high poly models but at some point the graphical fidelity gained becomes minuscule, compared to the effort put into it (and the performance it eats, since this bleeds into the absurd GPU topic too).
There’s also plenty of creative stylization options that can be explored that aren’t your typical WoW cartoon look that everyone goes for nowadays. Hell, I still love pixel art games too and they’re often considered to be on the bottom end of the graphical quality (which I’d heavily disagree with, but that’s also another topic).
What gamers want are good games that don’t feel like they get constantly milked or prioritize graphics over gameplay or story.
I agree 100%. I’d love a AAA game that uses the studio’s clout not for cutting edge graphics but a stellar, polished story and gameplay. The story doesn’t even need to be DEEP, just solid.
I always wonder how some big-ass studio announcing a title that uses (high quality) 2D or 2.5D graphics would go. Like, pump it full of many hours of great gameplay and a gut- and/or heart-wrenching story, with lovely, beautiful art, in 2025+. No online account requirements, no Denuvo, no micro or macro transactions, just a solid buy-to-play title that’s a blast to get immersed in. The problem is that suits would not dare to even try this, just like they don’t dare to try anything else that’s not your standard customer-milking formula. And that’s how you get the twentieth iteration of generic graphical bliss with hundreds or even thousands of bucks to spend on macro transactions and other pain-in-the-ass bullshit. Innovation at the big companies is dead, which is why I focus so much on indie studios and smaller developers now. At least there’s still some honest passion behind those games.
This article’s reasoning is faith based. The cornerstone assumption is that industry profits and layoffs obey the preferences of the market.
To those who follow the industry, this is demonstrably false. What follows is the lack of awareness on full display:
and even though Spider-Man 2 sold more than 11 million copies, several members of Insomniac lost their jobs when Sony announced 900 layoffs in February.
Overall a good article with some inaccuracies, but the answer to the article’s question is, to me, an easy no. The whole industry won’t recover because it’s an industry. It follows the rules of capitalism, and it’s a constant race to the bottom; while good games by good people happen on the side, they happen in spite of the system. Everything else is working as expected and will continue until you pay per minute to stream games you rent, with intermittent forced ads and paid level unlocks.
It’s nice to see gaming covered in NYT at all. The article generally rings hollow to me. I’m not an industry expert, but:
- It’s easy to be profitable when you’re just making a sandbox and your players make the games, but at that point are you a game developer? (Roblox)
- High end graphics cards have become so expensive that people can’t afford gaming with good graphics
- AAA developers aren’t optimizing games as well as they used to, so only high end hardware would even run them
- AAA is more focused on loot boxes, microtransactions, season passes, and cinematics all wrapped up in great visuals. That’s at the expense of innovative gameplay and interesting stories. Making the graphics worse won’t get execs to greenlight better games, just uglier ones. And they’ll still be $70.
- Even when games are huge successes and profitable, studios are getting bought and shut down (EA, Microsoft, Sony?), so it’s hard to say the corps are hurting.
High end graphics cards have become so expensive that people can’t afford gaming with good graphics
Not only that, but midrange cards just haven’t moved that much in terms of performance. The ultra high end used to be a terrible value, only for people who wanted the best and didn’t care about money. Now it almost makes sense from a performance-per-dollar standpoint to go ultra high end. At launch the 4090 offered almost twice the performance of the 4080, but cost only about 1.5x as much. And somehow the value gets worse the lower down the stack you go.
Meanwhile mid-high end cards like the 4060 and 7600 (which used to be some of the best values) are barely outperforming their predecessors.
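The value inversion described above can be made concrete with a quick back-of-the-envelope calculation. The ratios below are the commenter’s own rough launch-era estimates (~1.5x the price for ~2x the performance), not measured benchmarks:

```python
# Rough performance-per-dollar comparison using the comment's launch-era
# figures (illustrative estimates, not benchmarks): the 4090 at roughly
# 1.5x the price of the 4080 for roughly 2x the performance.
price_4080, perf_4080 = 1.0, 1.0   # normalize the 4080 to 1.0 on both axes
price_4090, perf_4090 = 1.5, 2.0   # ~1.5x the price, ~2x the performance

value_4080 = perf_4080 / price_4080   # performance per unit of price
value_4090 = perf_4090 / price_4090

print(f"4080: {value_4080:.2f} perf per price unit")   # 1.00
print(f"4090: {value_4090:.2f} perf per price unit")   # 1.33
```

Under those assumptions the flagship actually delivers about a third more performance per dollar than the card below it, which is the opposite of how the top of the stack has historically been priced.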
Well, everyone has their priorities. The problem is that even the people who value realistic graphics the most aren’t being captured by new AAA games.
Story goes somewhere below replay value, and controls go to number one. Gameplay and controls are pretty much interchangeable unless you want a cinema simulator.
Add sound design as number two above music as number three and then the list is done.
Disagree, story is definitely more important than replayability for me.
All the time I spent playing Dota, Starcraft, Battlefield and Smash Melee says nope. But to even have replayability as a category is pretty pointless.
All other factors lead to replayability if you go by the order I suggested. Having a set story limits replayability, so that tracks for you.
All the time I spent playing Dota, Starcraft, Battlefield and Smash Melee says nope.
Sure, if your metric is hours of gameplay per dollar spent. But that’s no way to rate a video game if you ask me.
For instance, I would rate The Talos Principle or Disco Elysium as much better games than, say, World of Warcraft, despite the fact that I played WoW much more than the former two. But the stories of those two games are just far more interesting, and they’ve left a much more impactful, lasting impression on me even though I don’t play them any more.
It is hard for me to take seriously a hand-wringing industry that makes more money than most entertainment industries. Capitalism is the primary cause of articles like this. Investors simply demand moar each year, otherwise it is somehow a sign of stagnation or poor performance.
AAA studios could be different, but they choose to play the same game as every other sector. Small studios and independents suffer much more because of the downstream effects of the greedy AAAs establishing market norms.
We need unionization, folks. Broad unionization across sectors to fight against ownership/investor greed. It won’t solve everything but it will certainly stem the worst of it.
There are a number of theories why gamers have turned their backs on realism. One hypothesis is that players got tired of seeing the same artistic style in major releases. Others speculate that cinematic graphics require so much time and money to develop that gameplay suffers, leaving customers with a hollow experience.
Whoosh.
We learned all the way back in the Team Fortress 2 and Psychonauts days that hyper-realistic graphics will always age poorly, whereas stylized art ages well. (Psychonauts aged so well that its sequel, 16 years later, kept and refined the style, which evolved from a product of hardware limitations into straight-up muppets.)
There’s a reason Overwatch followed the stylized art path that TF2 had already trod: the art style will age well as technology progresses.
Anyway, I thought this phenomenon was well known. Working within the limitations of the technology you have available can push you toward brilliant design. It’s like when Twitter first appeared: I had comedy-writing friends who used the 140-character limit as a tool for writing tighter comedy, forcing themselves to fit each joke into that constraint.
Working within your limitations can actually make your art better, which just complements the fact that stylized art lasts longer before it looks ugly.
Borderlands 1 and 2 still look great in comparison to a lot of games that came out around the same time. The stylized cel-shaded textures help hide the lower-poly environments and really make the world stand out. Most games at the time were trying to go for a “realistic” look that just resulted in bland brown and gray environments that look terrible.
Shout out to Borderlands 1, one of the last games to have some of its best comedy delivered through text instead of audio.
I actually am in the minority of preferring 1 over 2 because 2 is just so fucking loud. Handsome Jack in my fucking ear for hours on end, refusing to shut the fuck up and let me play the game.
I much much much preferred the quiet reading of Borderlands 1.
Just wanna throw Wind Waker into the examples of highly stylized games that aged great.
Unfortunately, Cyberpunk is exactly the kind of product that is going to keep driving the realistic approach. It’s four years later now and the game’s visuals are still state-of-the-art in many areas. Even after earning as much backlash on release as any game in recent memory, it was a massively profitable project in the end.
This is why Sony, Microsoft, and the big third parties like Ubisoft keep taking shots in this realm.
I honestly feel like this with Genshin Impact. It looks absolutely breathtaking and in 20 years it will still be beautiful. It runs on a damn potato. I personally like the lighting in a lot of scenes way better than the lighting in some titles that have path tracing.
I have always liked art styles in games better than realism.
Sure, but I’m still going to say “fuck mihoyo”
In what world does Genshin run well on a potato? Unless you have a different definition of potato than I do. My Galaxy S10e can barely play the game, and it’s not even slow enough to be called a potato.
Might be talking within the context of PC gaming, where even a relative potato will beat the performance of a flagship phone.
Probably is, but I get why the other fella was confused by this.
Until right this moment I was under the impression that Genshin was literally just a phone game. Looks like I was wrong.
Haha, opposite experience for me! I don’t play it but know some people that do, and I only ever heard about them playing it on their PCs, so it was their comment that made me realize it was also available on phones :P
Yeah, I was talking about PC, haha. I don’t keep very up to date with phones and don’t know much about their performance.
The game of the year was a cutesy cartoon game about a robot. I don’t think there’s a problem here.
Read the Article pleasw
Yeah, I did read the article. That’s why I know what the article is about, and that he’s complaining about graphical fidelity in games not paying off in profits. Clearly AAA studios aren’t actually having this issue because, like I said, the winner of the Game Awards this year was a cartoony game, so clearly they’re well aware that graphics aren’t everything.
Didn’t he tell you to read the article??
They did say pleasw.
“Hyperrealistic” weirdly means “more almost realistic”.
Yeah, that frustrates me a lot, too. They almost had it right: they need to go beyond realism to make truly good-looking games. But in practice, they say that only to show you the most boring-ass graphics known to humanity. I don’t need your pebbles to cast shadows. I can walk outside and find a pebble that casts shadows in a minute, tops. Make the pebbles cast light instead; that could look cool. Or make them cast a basketball game. That’s at least something I haven’t seen yet.
I like the way you think. The logic of video games and what they display don’t have to be limited by anything in the real world. They can invent entirely new forms of perception even (like that Devil Daggers sequel that lets you see behind yourself using colour overlays).
Is there a way to actually read the article without having to be exposed to whatever the drug fueled hellscape that website is?
Maybe it’s just me, but I like the style it’s presented in, and I have major adblockers in service so I’m not sure how it’s a drug fueled hellscape. It basically becomes a normal NYT article after a half-page of scrolling. Not all their readers are familiar with these games, so the NYT is doing its diligence by trying to show what they’re talking about, so their readers have a frame of reference. (Remember the NYT is actually aimed at an investor class who owns a second house in the Hamptons and may not be gamers at all. Go look at their Lifestyle section sometime.)
I think it’s fine but I guess I’m in the minority, but also maybe it’s less worse for me because of uBlock/Pihole/Bypass Paywalls Clean.
I use Firefox’s “reader mode”
Edit: NYT managed to enshittify even that. Will wonders never cease.
I can’t be bothered to visit any mainstream news site anymore. They’ve made the process of accessing the content so adversarial that there’s no point.
I’d recommend trying RSS and if they don’t support it just quit reading them.
Unfortunately, RSS doesn’t do anything for the links to the NYT in my Lemmy feed.
In a lot of cases, I find I’ve already read the underlying content or skipped it with my reader and therefore can go right to the comments. But ymmv of course.
One way to understand the video game industry’s current crisis is by looking closely at Spider-Man’s spandex.
For decades, companies like Sony and Microsoft have bet that realistic graphics were the key to attracting bigger audiences. By investing in technology, they have elevated flat pixelated worlds into experiences that often feel like stepping into a movie.
Designers of last year’s Marvel’s Spider-Man 2 used the processing power of the PlayStation 5 so Peter Parker’s outfits would be rendered with realistic textures and skyscraper windows could reflect rays of sunlight.
That level of detail did not come cheap.
Insomniac Games, which is owned by Sony, spent about $300 million to develop Spider-Man 2, according to leaked documents, more than triple the budget of the first game in the series, which was released five years earlier. Chasing Hollywood realism requires Hollywood budgets, and even though Spider-Man 2 sold more than 11 million copies, several members of Insomniac lost their jobs when Sony announced 900 layoffs in February.
Cinematic games are getting so expensive and time-consuming to make that the video game industry has started to acknowledge that investing in graphics is providing diminished financial returns.
“It’s very clear that high-fidelity visuals are only moving the needle for a vocal class of gamers in their 40s and 50s,” said Jacob Navok, a former executive at Square Enix who left that studio, known for the Final Fantasy series, in 2016 to start his own media company. “But what does my 7-year-old son play? Minecraft. Roblox. Fortnite.”
Joost van Dreunen, a market analyst and professor at New York University, said it was clear what younger generations value in their video games: “Playing is an excuse for hanging out with other people.”
When millions are happy to play old games with outdated graphics — including Roblox (2006), Minecraft (2009) and Fortnite (2017) — it creates challenges for studios that make blockbuster single-player titles. The industry’s audience has slightly shrunk for the first time in decades. Studios are rapidly closing and sweeping layoffs have affected more than 20,000 employees in the past two years, including more than 2,500 Microsoft workers.
Many video game developers built their careers during an era that glorified graphical fidelity. They marveled at a scene from The Last of Us: Part II in which Ellie, the protagonist, removes a shirt over her head to reveal bruises and scrapes on her back without any technical glitches.
But a few years later, costly graphical upgrades are often barely noticeable.
When the studio Naughty Dog released a remastered version of The Last of Us: Part II this year, light bounced off lakes and puddles with a more realistic shimmer. In a November ad for the PlayStation 5 Pro, an enhanced version of the Sony console that retails for almost $700, the billboards in Spider-Man 2’s Manhattan featured crisper letters.
Optimizing cinematic games for a narrow group of consumers who have spent hundreds of dollars on a console or computer may no longer make financial sense. Studios are increasingly prioritizing games with basic graphics that can be played on the smartphones already in everyone’s pocket.
“They essentially run on toasters,” said Matthew Ball, an entrepreneur and video game analyst, talking about games like Roblox and League of Legends. “The developers aren’t chasing graphics but the social connections that players have built over time.”

Going Hollywood
Developers had long taught players to equate realism with excellence, but this new toaster generation of gamers is upsetting industry orthodoxies. The developer behind Animal Well, which received extensive praise this year, said the game’s file size was smaller than many of the screenshots used to promote it.
A company like Nintendo was once the exception that proved the rule, telling its audiences over the past 40 years that graphics were not a priority.
That strategy had shown weaknesses through the 1990s and 2000s, when the Nintendo 64 and GameCube had weaker visuals and sold fewer copies than Sony consoles. But now the tables have turned. Industry figures joke about how a cartoony game like Luigi’s Mansion 3 on the Nintendo Switch considerably outsells gorgeous cinematic narratives on the PlayStation 5 like Final Fantasy VII Rebirth.
There are a number of theories why gamers have turned their backs on realism. One hypothesis is that players got tired of seeing the same artistic style in major releases. Others speculate that cinematic graphics require so much time and money to develop that gameplay suffers, leaving customers with a hollow experience.
Another theory is that major studios have spent recent years reshaping themselves in Hollywood’s image, pursuing crossover deals that have given audiences “The Super Mario Bros. Movie” and “The Last of Us” on HBO. Not only have companies like Ubisoft opened divisions to produce films, but their games include an astonishing amount of scenes where players watch the story unfold.
In 2007, the first Assassin’s Creed provided more than 2.5 hours of footage for a fan edit of the game’s narrative. As the series progressed, so did Ubisoft’s taste for cinema. Like many studios, it increasingly leaned on motion-capture animators who could create scenes using human actors on soundstages. A fan edit of Assassin’s Creed: Valhalla, which was released in 2020, lasted about 23 hours — longer than two seasons of “Game of Thrones.”
Gamers and journalists began talking about how the franchise’s entries had gotten too bloated and expensive. Ubisoft developers advertised last year’s Assassin’s Creed Mirage, which had about five hours of cut scenes, as “more intimate.”
The immersive graphics of virtual reality can also be prohibitive for gamers; the Meta Quest Pro sells for $1,000 and the Apple Vision Pro for $3,500. This year, the chief executive of Ubisoft, Yves Guillemot, told the company’s investors that because the virtual reality version of Assassin’s Creed did not meet sales expectations, the company was not increasing its investment in the technology.

[Image: Live service games that are playable on mobile devices, like Genshin Impact, can generate large amounts of revenue.]
Many studios have instead turned to the live service model, where graphics are less important than a regular drip of new content that keeps players engaged. Genshin Impact, by the studio Hoyoverse, makes roughly $2 billion every year on mobile platforms alone, according to the data tracker Sensor Tower.

Going Broke?
It was clear this year, however, that the live service strategy carries its own risks. Warner Bros. Discovery took a $200 million loss on Suicide Squad: Kill the Justice League, according to Bloomberg. Sony closed the studio behind Concord, its attempt to compete with team-based shooters like Overwatch and Apex Legends, one month after the game released to a minuscule player base.
“We have a market that has been in growth mode for decades,” Ball said. “Now we are in a mature market where instead of making bets on growth, companies need to try and steal shares from each other.”
Some industry professionals believe there is a path for superb-looking games to survive the cost crunch.
“I used to be a high-fidelity guy; I would log into games and if it didn’t look hyperrealistic, then it was not so interesting,” said David Reitman, a managing director at PricewaterhouseCoopers, where he leads the consulting firm’s games division. “There was a race to hyperrealism, and it’s tough to pivot away. You have set expectations.”
Reitman sees a future where most of the heavy costs associated with cutting-edge graphics are handled by artificial intelligence. He said that manufacturers were working on creating A.I. chips for consoles that would facilitate those changes, and that some game studios were already using smart algorithms to improve graphics further than anything previously seen.
He expects that sports games will be the first genre to see considerable improvements because developers have access to hundreds of hours of game footage. “They can take feeds from leagues and transpose them into graphical renderings,” Reitman said, “leveraging language models to generate the incremental movements and facial expressions of players.”
Some independent developers are less convinced. “The idea that there will be content from A.I. before we figure out how it works and where it will source data from is really hard,” said Rami Ismail, a game developer in the Netherlands.
Ismail is worried that major studios are in a tight spot where traditional games have become too expensive but live service games have become too risky. He pointed to recent games that had both jaw-dropping realism — Avatar: Frontiers of Pandora (individual pebbles of gravel cast shadows) and Senua’s Saga: Hellblade II (rays of sunlight flicker through the trees) — and lackluster sales.
He recalled a question that emerged early in the coronavirus pandemic and has become something of an unofficial motto in the video game industry.
“How can we as an industry make sho
I linked the gift article. This link shouldn’t be necessary, right?
The archive link:
- Doesn’t have a tracker.
- Works for people who restrict scripts in their browsers (good security practice).
- Will still be useful when nytimes.com eventually disables your gift ID or takes the article down.
Fair enough.
Oh man… I still can’t read it because of the atrocious background. I was hoping this link would have just been normal text.
You can select the text that’s over that background to make reading easier. Most of the article is below it, so you should be fine after a couple taps of Page Down.
Or use Firefox reader view, which cleans it right up. :)
Firefox reader mode fixes that background.
GSC, in my opinion, ruined Stalker 2 in the chase for “next gen” graphics. And modern graphics are now so dependent on upscaling and frame gen. Sad to see, but trailers sell.
My favourite games don’t look nearly as good as in my memory. Graphics don’t matter, they might even hurt, because there is less left to imagination.
I’d say it’s less about imagination than gameplay. I’m reminded of old action figures. Some of them were articulated at the knees, elbows, feet, wrists, and head. Very posable, but you could see all the joints. Then you had the bigger and more detailed figures, but they were barely more than statues. Looked great but you couldn’t really do anything with them.
And then you had themed Lego sets. Only a vague passing resemblance to the IP, but your imagination is the limit on what you do with them.
I may be an outsider, but lower-fidelity horror games actually work better for me, because imagination fills the gaps better than an engine rendering plastic-looking tentacles can.
Unpopular opinion, but I’d prefer a game’s graphics be absolute trash as long as the OST is awesome. I can easily forget how many individual hairs are in a 3D model, but a good OST will live in my mind and heart forever.
And of course gameplay comes first.
This is why so many indie games are awesome. The graphics don’t need to be great when the soundtracks and gameplay more than make up for it. Those are what actually matter. I have most of Undertale’s OST committed to memory at this point lol
The Wii was a fantastic example of this: less capable hardware used in very imaginative ways, and it had the capacity to bring older people into gaming.
NieR: Automata had pretty good graphics, but nothing groundbreaking.
The soundtrack is fucking phenomenal.
How hard is it for them to realize this? Graphics are a nice-to-have, they’re great, but they do not hold up an entire game. Star Wars Outlaws looked great, but the story was boring. If they took just a fraction of the money they spent on realism, gave it to writers, and then let the writers do their job freely without getting in their way, they could make some truly great games.
It’s hard for them to realize because good graphics used to effectively sell lots of copies of games. If they spent their graphics budget on writers, they’d have spent way too much on writing.
Yep, it’s a byproduct of the “bit wars” in the gaming culture of the ’80s and ’90s, where each successive console generation brought a much bigger visual upgrade without sacrificing too much in other technical aspects like framerate/performance. Nowadays if you want that kind of upgrade you’re better off making a big investment in a beefy gaming rig, because consoles have a realistic price point to consider, and even then we’re getting to a point of diminishing returns when it comes to real, noticeable graphical differences. Even back in the ’80s/’90s the most powerful consoles of the time (such as the Neo Geo) were prohibitively expensive for most people.

Either way, the most lauded games of the past few years have been the ones that put the biggest focus on aspects like engaging gameplay and/or immersive story and setting. One of the strongest candidates for this year’s Game of the Year could probably run on a potato and was basically poker with some interesting twists: essentially the opposite of a big-studio AAA game. Baldur’s Gate 3 showed studios that gamers are looking for an actual complete game for their $60, and indie hits such as the aforementioned Balatro are showing them that you can make games look and play great without super realistic graphics or an immense budget, as long as you have solid gameplay, story/setting and art style. Call of Duty Black Ops 48393 with the only real “innovation” being more realistic sun glare on your rifle is just asking for failure.
Baldur’s Gate 3 showed studios that gamers are looking for an actual complete game for their $60
This language always misses me. Every game I buy is complete. Adding an expansion to it later doesn’t make it less complete, and it’s not like BG3 was without major bugs.
I think we’ve landed in a situation where some people don’t understand the difference between graphical style and graphical quality. You can have high-quality graphics that are still very simplistic. The important part is that they serve their purpose for the title you’re making.

Obviously some games benefit from more realistic graphics, like TLoU Part 2, depicted in the thumbnail and briefly mentioned. The graphics help convey a lot of what the game tries to tell you. You can see the brutality of the world the characters are forced to live in through the realistic depiction of gore. But you can also see the raw emotion, the trauma on the characters’ faces, which tells you what the reality of this world truly looks like.

But there’s plenty of games with VERY simplistic graphic styles that are still high quality. CrossCode was one of the surprise hits for me a couple years ago and became one of my favorite RPGs, probably only topped by the old SNES title Terranigma. They both have simple yet beautiful graphics that serve them just as well as the realistic graphics serve TLoU.

Especially the suits / publishers will make this mistake, since they are very detached from the actual gaming community and just look at numbers instead, getting trapped in various fallacies and then wondering why things don’t go as well as they calculated.
Look, I’m gonna be real with you, the pool of writers who are exceptionally good at specifically writing for games is really damn small.
Everyone is trained on novels and movies, and so many games try to hamfist in a three-act arc because they haven’t figured out that this is an entirely different medium and needs its own set of rules for how art plays out.
Traditional filmmaking ideas include stuff like the direction a character is moving on the screen impacting what the scene “means.” Stuff like that is basically impossible to cultivate in, say, a first- or third-person game where you can’t be sure what direction characters will be seen moving. Thus, games need their own narrative rules.
I think the first person to really crack those rules was Yoko Taro, that guy knows how to write for a game specifically.
Yoko Taro is also a pretty cool dude. I dig his dedication to living life as he wants in a society/species that strongly pushes conformity.
For anyone not in the know, he doesn’t like being photographed or videoed, so he wears a big freaky smile mask for interviews. I also learned he has a leprechaun mask.
Yeah, but you can’t make a TV ad about good writing.
Sure you can, just do like “reviewers/players gush about ‘riveting plot’ and ‘characters that feel real’ and ‘a truly compelling story’” or whatever it is.