Can you spot the difference? I mean, Low to Medium, or Medium to High, sure, that's easy. But the higher we go, the less obvious the change is, outside of your frame rate biting the dust. So, if super-fancy-extreme-ultra graphics options are hardly noticeable, why are developers still adding them to new games, especially when your bangin' new PC can't even run them at playable frame rates? Are the GPU manufacturers to blame? I need an answer.

Even with the best modern hardware, brand-new triple-A games often struggle to run at maxed-out graphics settings. With the advent of ray tracing and HDR, this has become even more apparent, as systems with high-end 30-series cards can barely keep up once you output to a 4K monitor and flip a few in-game switches. DLSS and FSR are helping, but is it worth it to spend your life savings on a Bugatti so you can feel like a badass on your five-minute commute?

Look, we're not saying maxed-out settings do nothing. It's just that the visual upgrade you'll notice is minimal, comparatively speaking. After all, at MSRP a 3070 is $499, while a 3090 is basically triple that at $1,499. So, to see whether skipping a rent payment to buy a graphics card is worth it, we built two systems to compare different settings at similar frame rates and asked a few folks from around the office if they could tell the difference. We didn't tell them what was going on. We just asked them to play a section from "Metro Exodus: Enhanced Edition" and tell us which one looked nicer.

- Uh... Uh... Uh, I mean... They're the same.
- I would say this one's better.
- I want to say I like this one better. It's hard to tell, though. I don't know. They look the same.

And guess what? Only one person picked correctly.

- Yeah. Is this the 3090?
- Yeah, it's the 3090.
- Yeah. It's a little bit clearer, but like... I don't know, a tiny bit.

Our resident gamer god David was able to tell quickly which system was our RTX 3090 at extreme settings, while everyone else defaulted to the 3070. The results were so skewed, we started doubting the setup by the end. But, nope, it's just hard to tell for most people, particularly those who aren't already gaming on some of the most expensive hardware available. But why? The answer has to do with diminishing returns.
I mean, you're going to enjoy a $70 steak more than a buck-fifty hot dog, but will you enjoy it $68.50 more? I don't know. But fortunately, our friends at Crytek are meat experts, and by meat, I mean video game graphics. They're working on the "Crysis Remastered Trilogy," so we reached out for some answers.

They told us that they do have performance targets and are aware of modern hardware like the latest generation of consoles or graphics cards, but there are always hurdles in the way of smooth performance: large environments, 4K textures, particle effects, and AI are some of the bigger ones to get over. The first step is getting all of this into the game and running in the first place, while optimization typically comes later. Usually, the more time for optimization, the better the game will run on most hardware. When there's a lot of hardware to optimize for, things can get messy. And if your setup's cable management is messy, make sure to pick up some cable ties from lttstore.com. Like these. They're so good.

Targets have scaled up over the years, and with modern consoles, we're finally looking at 1440p or even 4K resolutions running at 60 frames per second, at least on Medium or Low settings. If your hardware is getting older, which it is for a lot of us according to the Steam hardware survey (and my back! Ooh!), some settings will have a much bigger impact than others, particularly shadows and particle effects. You've probably found yourself tinkering with settings one by one to try to get the best mix of frame rate to fidelity, only to have trouble even noticing the difference sometimes, especially if there are no example images or videos to show you in real time. Why do Ultra and Very High in many cases look basically identical, but suddenly you go from 70 FPS to 40?

The real problem often isn't developers failing to optimize. It's that as things look better and better, it's hard to notice, due to the 80/20 rule: getting 80% of the way there is easy, and it's not until we try to squeeze out that extra 20% of detail and depth that we run into problems. Take "Anno," for example. It looks great, but for the best play experience, you need to really zoom the camera out, and adding more objects and textures to render in-frame causes performance to chug significantly. We see similar results when improving draw distance in other games, but at a certain point it's not worth it, since we can barely make out distant objects anyway. But we're all dumb apes, and when there's a slider we can move all the way to the right, we just have to stick our grubby fingers in there, even though the Low, Medium, and High settings will probably look just fine. Ooh-ooh, ah-ah. Me like sliding.

So, here's the example Crytek gave us. At the lowest option, you've got low-res textures, a 1,000-meter draw distance, low amounts of particles for effects, and extra details like waves on water are turned off. Very High sounds drastically different, offering 4K textures, a 2,500-meter draw distance, lots of particles, and all the extras turned on. But when you compare High to Very High, the gap is much smaller, with the main difference being texture resolution going from 2K to 4K and a draw distance improvement. If you don't have a 4K monitor to begin with and the foliage is thick, good luck spotting that with the naked eye.
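To make that example easier to eyeball, here's the same information laid out as plain data with a quick diff, in a rough Python sketch. The layout and field names are our own invention, not actual CryEngine config; the values Crytek didn't quote are marked as assumptions.

```python
# Rough sketch of the presets Crytek described, as plain data. The field names
# and structure are hypothetical, not real engine config; only the figures
# Crytek quoted are filled in.
presets = {
    "Low": {
        "textures": "low-res",
        "draw_distance_m": 1000,
        "particles": "low",
        "water_waves": False,
    },
    "High": {
        "textures": "2K",
        "draw_distance_m": None,  # not quoted; Crytek only said Very High adds "a draw distance improvement"
        "particles": "high",      # assumption, since textures and draw distance were the stated gaps
        "water_waves": True,      # assumption, same reasoning
    },
    "Very High": {
        "textures": "4K",
        "draw_distance_m": 2500,
        "particles": "high",
        "water_waves": True,
    },
}

def diff(a: str, b: str) -> dict:
    """Show which fields actually change between two presets."""
    return {key: (presets[a][key], presets[b][key])
            for key in presets[a] if presets[a][key] != presets[b][key]}

print(diff("Low", "Very High"))   # every field changes
print(diff("High", "Very High"))  # basically just textures (2K -> 4K) and draw distance
```

Diffing Low against Very High touches every field; diffing High against Very High barely touches two. That's the whole "diminishing returns" story in a dozen lines.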
Another improvement that can be hard to see past a certain point is tessellation, or splitting polygons up into finer pieces. Adding more triangles to smooth out a curved surface is a huge improvement to 3D environments, but unless you start zooming in a lot, you likely won't be able to tell the difference.
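If you want a feel for why that slider gets expensive, here's a toy sketch of midpoint subdivision, the basic idea behind tessellation. It's nothing like a real GPU tessellator, but it shows the core problem: every pass quadruples the triangle count, while the on-screen smoothing shrinks.

```python
def midpoint(p, q):
    return tuple((a + b) / 2 for a, b in zip(p, q))

def subdivide(triangles):
    """One tessellation pass: split each triangle into four at its edge midpoints."""
    out = []
    for a, b, c in triangles:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return out

mesh = [((0.0, 0.0), (1.0, 0.0), (0.5, 1.0))]  # start with a single triangle
for level in range(5):
    print(f"level {level}: {len(mesh)} triangles")
    mesh = subdivide(mesh)
# 1 -> 4 -> 16 -> 64 -> 256: the cost quadruples every level,
# but past the first couple of levels the silhouette barely changes.
```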
However, some things are 100% worth turning on, even if your frame rate drops significantly. We mentioned draw distance already, but more realistic shadows and lighting can just make a dull environment, mmm, spring to life. Anti-aliasing solutions are also huge for smoothing ugly, jagged surfaces, and higher-resolution textures do look a lot better if you have the right screen for them. But these options are going to have different effects in every game, so whatever you're playing, try to test the waters and find the right balance of frame rate to visual quality for yourself.
And speaking of visual quality, make sure you're subscribed so you don't miss out on our upcoming Looking Glass Portrait video. Ooh. Now, if we know deep down that
going from Very High to Ultra is either indistinguishable or hardly better, why
do developers keep making these ridiculous settings that only a tiny fraction
of players can run, if anyone at all? Well, there are two big reasons. One: options
are always good to have. And two: it helps market the game. We all want to buy
the new hotness, especially when it's pretty, but many of us have been duped
into preordering based on what's shown at E3 versus the final product. And while
most people can't run Ultra or Very High settings now, they might be able to in
the future once they upgrade and want to revisit some older games. For now,
though, if you don't have the latest and greatest GPU, which is probably the
case given the hard, hard year we've had, don't worry. These days, Low settings
often look surprisingly good in modern games and in some cases can give you a
competitive advantage, with Medium and High offering a nicer experience if you can run them. There's no shame in keeping settings on the left, silly ape.