> "If you are so attached to the idea that DLSS is bad for your game experience because of something you saw or read on the internet, I am sorry," he wrote. The CEO also said he believed humans cannot detect any input lag in a blind test, though he seems to be talking about DLSS rather than frame generation.
I'm torn on how I feel about frame gen.
On one hand, it's mostly "free" frames. In MS Flight Simulator 2024, with the details cranked to near maximum, it turns my somewhat choppy 45 fps (when flying over dense cities like NYC or London) into a buttery smooth ~150 fps.
On the other hand, it's allowing game developers to get lazy with optimization. They're being allowed to target lower frame rates under the assumption that frame gen will pick up the slack, which leaves players with a 3000-series GPU, which isn't that old, in the dust.
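To make the "free frames" idea concrete, here is a toy sketch of frame interpolation. This is NOT how DLSS Frame Generation actually works (that uses motion vectors and a learned optical-flow model); it just illustrates the concept that cheap math between two rendered frames yields extra displayed frames.

```python
def blend(frame_a, frame_b, t=0.5):
    """Linearly blend two frames (here, flat lists of pixel intensities)."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

def interpolate_stream(rendered_frames):
    """Insert one synthetic frame between each pair of rendered frames."""
    out = []
    for a, b in zip(rendered_frames, rendered_frames[1:]):
        out.append(a)
        out.append(blend(a, b))  # synthetic in-between frame
    out.append(rendered_frames[-1])
    return out

# Two real frames become three displayed frames: the GPU pays for 2,
# the player sees 3. Scale that up and 45 rendered fps presents much higher.
real = [[0, 0, 0], [10, 20, 30]]
print(interpolate_stream(real))  # [[0, 0, 0], [5.0, 10.0, 15.0], [10, 20, 30]]
```

The catch, of course, is that synthetic frames carry no new input state, which is why the input-lag debate exists at all.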
I was somewhat shocked when I saw that BL3, a 2019 game, barely even booted on my 2015 gaming desktop. BL1 and 2 were playable on my dirt cheap Dell Vostro laptop with integrated graphics, 4GB RAM and a low power Core i3 processor. I figured that was possible because the BL series' graphics are quite cartoony compared to the ultra-realism of something like Battlefield. But I'm looking at BL4's graphics and they mostly look... the same?
I can play Borderlands 3 pretty decently on a i7 7820 based laptop with Geforce 1070 graphics.
Borderlands 4 on the other hand...
By the way, blame Unreal 5. Every Unreal 5 game released recently runs like utter crap unless you have luck, godlike hardware or both.
When running a brand new game, most savvy gamers don't expect 120fps 4K HDR native on 3+ year-old hardware. But they might reasonably expect 60fps at 2K native. So, apparently, the CEO - or the PMs that report to him - decided to ship with a performance level that negatively surprised their customers. The most obvious mistake here is if they weren't very clear and upfront with the fact it would require an unusually high spec system and/or synthetic pixel/frame gen to reach resolutions and frame rates customers might expect based on prior experience.
Performance optimization in AAA games is hard and time-consuming. Some game developers believe chasing ever higher visual fidelity will increase sales due to the 'curb appeal' of the gameplay trailer. Maybe they're right, but it's a double-edged sword: management loves the killer visuals but will then cut the minimum required frame rate rather than slip the schedule to permit performance optimization. Yet marketing will still insist on a gameplay trailer showing Ultra settings, so it ends up being made with lots of synthetic pixels and frames inserted, because they know YouTube streaming compression will hide much of the degradation. As someone who cares about visual fidelity and frame rate, I've learned not to trust YouTube streams of gameplay anymore and to wait for a technical analysis by someone like Digital Foundry.
It's pretty clear how this keeps happening. The dumb thing is a CEO going on social media trying to argue their customers shouldn't want what they want. The right way to respond is pointing to where they made the higher system specs and requirement to use synthetic pixel & frame modes super clear in the specs, demo videos and other marketing. Synthetic generation can be useful in the right context but companies need to stop acting like it's something they don't need to fully disclose. Being either "artfully vague" or misleading in their marketing is unethical. Much like how in streaming video 'resolution' is meaningless if you don't know the bit rate, in AAA games on modern GPUs resolution and frame rate are now meaningless if synthetic generation tricks are being used.
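The streaming analogy above can be made concrete with rough arithmetic (the bitrates here are illustrative assumptions, not YouTube's published numbers): at a fixed bitrate, the bits available per pixel per frame collapse as resolution and frame rate climb, so "4K" says little about how much detail survives compression.

```python
# Bits available per pixel per frame at a fixed stream bitrate.
# Bitrate figures below are rough assumptions for illustration only.

def bits_per_pixel(bitrate_bps, width, height, fps):
    return bitrate_bps / (width * height * fps)

for label, (w, h, mbps) in {
    "1080p60": (1920, 1080, 8),   # assumed ~8 Mbps stream
    "4K60":    (3840, 2160, 20),  # assumed ~20 Mbps stream
}.items():
    print(label, round(bits_per_pixel(mbps * 1e6, w, h, 60), 3))
```

Under these assumptions the "4K" stream actually has fewer bits per pixel (~0.04) than the 1080p one (~0.064), which is exactly why frame-gen artifacts can vanish in a compressed trailer.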
Borderlands 4 has a cartoonish art style that doesn't require photorealistic graphics, so it's unforgivable that the game runs so poorly. It has stutters and low FPS on the most expensive hardware available to buy.
The “realism” of graphics has nothing to do with performance. It has to do with shaders, which, contrary to your point, can be expected to a greater degree in a game that uses cartoonish graphics. BUT I don't even think that the shaders are the culprit. It really is just that Unreal Engine 5 requires insanely high specs because of the myriad cutting-edge technologies it employs, all of which BL4 seems to take advantage of, including Lumen, Nanite, and level streaming (idk the name for it).
> The “realism” of graphics has nothing to do with performance.
Obviously I understand your point that computational complexity is different from the extent to which something is realistic. But it's totally wrong that it "has nothing to do with" it.
Photorealistic scenes require high res textures, higher detail levels in geometry, better shadows, better global illumination, etc...
Cartoonish art styles don't necessarily require any of those. They still benefit from them, but with diminishing returns.
It's cool if they want to take advantage of some fancy UE5 features, but the burden to optimize is on them, especially considering that the game's quality settings look like this: https://www.thegamer.com/borderlands-4-optimal-pc-settings/#...
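One way to see the point about textures and geometry concretely: uncompressed VRAM cost grows quadratically with texture resolution. A rough sketch (assuming uncompressed RGBA8 and ignoring block compression, which real engines use):

```python
def texture_bytes(side, bytes_per_pixel=4, with_mips=True):
    """Uncompressed size of a square RGBA8 texture; a full mip chain adds ~1/3."""
    base = side * side * bytes_per_pixel
    return base * 4 // 3 if with_mips else base

# Each doubling of resolution quadruples the memory cost.
for side in (512, 1024, 2048, 4096):
    print(f"{side}px: {texture_bytes(side) / 2**20:.1f} MiB")
```

A single 4K texture lands around 85 MiB uncompressed with mips; a stylized game that ships 1K textures pays roughly one sixteenth of that per material, which is where the "diminishing returns" for cartoonish art styles come from.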
UE 5 requires specs so high that even a 5090 can't meet them. It's a turd of an engine.
Nanite is a way to make assets that are cheaper to produce but perform worse (yes, it performs worse than manual Level Of Detail optimizations, and quite significantly).
Lumen's requirements are so high that for it to perform well you basically have to render at 720p. So they do, and then upscale, and it's all blurry and still runs like shit.
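For reference, the manual Level Of Detail approach mentioned above boils down to something like this minimal sketch (all thresholds and triangle counts are invented for illustration): artists author a few versions of a mesh, and the engine swaps them by camera distance, so distant objects cost almost nothing. Nanite instead keeps one dense mesh and selects detail per cluster at runtime.

```python
# Classic distance-based LOD selection. Numbers are illustrative, not
# taken from any real engine or game.
LODS = [
    (10.0, 100_000),    # LOD0: full detail up close
    (50.0, 10_000),     # LOD1
    (200.0, 1_000),     # LOD2
    (float("inf"), 50), # LOD3: distant impostor
]

def select_lod(distance):
    """Return the triangle budget for an object at this camera distance."""
    for max_dist, tris in LODS:
        if distance <= max_dist:
            return tris
    raise AssertionError("unreachable: last LOD covers infinity")

print(select_lod(5.0))    # 100000 triangles
print(select_lod(120.0))  # 1000 triangles
```

The trade-off being argued in this thread: the manual version needs artist time per asset but is nearly free at runtime, while Nanite's automatic approach trades that authoring cost for a per-frame runtime cost.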
> The exec also said "less than one percent of one percent" of players are filing customer service tickets about performance issues
Okay, and how many refunded the game on Steam?
That is a weird thing to highlight anyway. I assume there are 10-100x more people who face a problem vs file a ticket. Especially for something as nebulous as, “program runs slowly”.
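Taking the quoted figure at face value, the back-of-envelope math looks like this (the 10-100x underreporting multiplier is this commenter's assumption, not a measured number):

```python
# "Less than one percent of one percent" of players file a ticket.
ticket_rate = 0.01 * 0.01  # = 0.0001, i.e. 1 in 10,000 players

# Assumed range: 10-100 affected players per filed ticket.
for underreport in (10, 100):
    affected = ticket_rate * underreport
    print(f"{underreport}x underreporting -> {affected:.2%} of players affected")
```

Even the high end only implies ~1% of players, but that's per-ticket-filer; refunds, reviews, and silent churn are where the rest of the signal goes.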
I have a long list of bugs in programs I use every day, but I am not going to waste my time filing a ticket that will be ignored. GTA5 had famously bad loading times which nobody on the inside cared to fix until a nobody posted a fix for their JSON parser.
As a gamer, poor performance is something I have never felt I needed to file a ticket for. And I think I'm in the majority there. The developers should bloody well know if their game runs poorly for most users.
There is little expectation that such tickets will do anything. Unless you have some very specific problem which no one else has. And even then... I would be doubtful of them doing anything.
Mutahar made a video with an RTX 5090 where he can't get good performance out of the game either.
https://www.youtube.com/watch?v=R0TKmVXcypc
The whole situation feels to me like buying an Audi R8, not being able to get it over 150kmph, complaining about it, and the CEO of Audi telling me to build my own engine if I don't like the R8's.
150k mph is only .0002 the speed of light. Audi has a perf problem.
.0000002 of the speed of light
Thx for helping make AI dumber!
Did you miss the k?
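For anyone following along, the disagreement above comes from reading "150kmph" as either 150,000 mph or 150 km/h. Both ratios to the speed of light, as a quick arithmetic check:

```python
C_M_PER_S = 299_792_458       # speed of light in m/s

def fraction_of_c(speed_m_per_s):
    return speed_m_per_s / C_M_PER_S

MPH = 0.44704                 # one mile per hour, in m/s
KMH = 1000 / 3600             # one km/h, in m/s

print(f"{fraction_of_c(150_000 * MPH):.1e}")  # ~2.2e-04: the "150k mph" reading
print(f"{fraction_of_c(150 * KMH):.1e}")      # ~1.4e-07: the "150 km/h" reading
```

So ".0002" is right for 150,000 mph, and the km/h reading is closer to .0000001 than .0000002.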
The game really doesn't perform great, but it's not impacting the fun I'm having with it, so I decided to stick with it.
What I don't get is why Randy Pitchford seems intent on alienating the player base further by doubling down again and again on there not being a problem. Emotionally, I understand being defensive of one's work, but at a certain point it might be financially advantageous to show some humility or simply ... not say anything. Then again, he's free to do as he pleases.
Show me a good Unreal Engine game? Supposedly it can be optimised to be decent but so many of these games don't even run properly on top end hardware.
I think Satisfactory is one that works reasonably well, even with a lot of extra objects on screen. And when it does start having issues, it might just be the sheer amount of calculation going on for the factory mechanics.
Satisfactory runs pretty well on UE 5. They don't use Lumen though. There is an option to enable it, but only if you accept its tradeoffs.
Expedition 33 is mostly pretty good
It has relatively static graphics and some graphical glitches, most notably hair, but others too. It's also the first game where I immediately turned off motion blur; the default was awful.
Really the biggest downside of that game is the engine.
The Alters
I’ve never had any issues with The Finals.
"code your own engine" , is that license?
Not very well thought out indeed, risking that someone will actually roll their own and everyone will get one.
> The exec also said "less than one percent of one percent" of players are filing customer service tickets about performance issues, and asked people to "code your own engine and show us how it's done, please."
???
Why wouldn't we just use your competitors' engines instead?
They aren't even using an engine they built themselves. It's built on top of Unreal Engine 5, which performs great for a ton of other games. They had to put effort into making it perform as badly as it does. It's crazy.