Nvidia RTX 4080 review: performance, for a price
Nvidia is back with another new 40-series card. The RTX 4080 can deliver impressive 4K gaming compared to the RTX 30-series.
The RTX 4090 stunned me with its 4K performance, delivering frame rates I’ve only ever seen at 1440p before. If the RTX 4090 was a beast at 4K, the RTX 4080 is far tamer and more of a direct replacement for the RTX 3080 Ti or RTX 3080 than something that can come close to the RTX 4090.
While the performance gap between the RTX 3080 Ti and RTX 3090 was small, there’s a much bigger disparity between the RTX 4080 and RTX 4090 this time around. Priced at $1,199, the RTX 4080 is $400 less than the RTX 4090 and yet can easily beat the RTX 3090. It delivers impressive performance gains for some of the most demanding games right now.
Despite that, I’m not sold on the pricing. $1,199 was an eye-watering price for the RTX 2080 Ti in 2018, it didn’t really make sense for the RTX 3080 Ti last year, and I think it’s still a stretch for most people’s wallets in 2022. Nvidia was planning to launch a 12GB model of the RTX 4080 with hugely different specs at a more reasonable $899, but after widespread criticism over confusing naming and specs, it has “unlaunched” that card and left just the 16GB RTX 4080 model.
With AMD’s next-gen RX 7900 XTX and XT GPUs starting at $899 and promising big performance gains, it’s an awkward time for a $1,199 RTX 4080 to arrive.
At first glance, it’s surprising how big the RTX 4080 is. While much attention has been paid to the size of some RTX 4090 cards, it turns out that the RTX 4090 and RTX 4080 Founders Edition cards are exactly the same size. That means it ships in the same comically huge box that the RTX 4090 comes in.
If you put the RTX 4080 and RTX 4090 side by side, you wouldn’t be able to tell the difference, aside from the names printed on them. They’re identical, including the redesigned fan that now includes seven blades instead of nine. The RTX 3080 Founders Edition was a 2-slot design, and even the RTX 3080 Ti managed to maintain that, but the RTX 4080 jumps up to a 3-slot design.
The saving grace of this size is that it’s still 10mm shorter than the RTX 3090, but it has an extra inch in height, making it super chunky. I didn’t have any problems fitting it into my case, but the power adapter could certainly cause issues getting side panels on in certain cases — particularly if you opt for a third-party RTX 4080 where the sizes will go beyond what’s available on this Founders Edition model.
Photo by Tom Warren / The Verge
Nvidia has switched up its power connector on the RTX 40-series cards, and it’s now using the 16-pin PCIe Gen 5 (12VHPWR) standard that can deliver up to 600 watts in total. In the box, there’s a 12VHPWR adapter cable so you can connect three eight-pin PCIe power cables.
I recommend against using Nvidia’s included power adapter. The company is still investigating a number of power adapters that have melted on the RTX 4090. While we’re in this transition to power supplies that natively support 12VHPWR, I think it’s worth investing in a single-cable solution from third parties like CableMod if you’re spending more than $1,000 on a graphics card alone.
The good news on the power side is the minimum power requirements have stayed the same as the RTX 3080. You’ll need at least a 750-watt power supply, and the RTX 4080 uses up to 320 watts of power (the same as the 10GB RTX 3080). You’ll want a beefier power supply if you’re interested in overclocking the RTX 4080 just for that extra headroom.
The Founders Edition RTX 4080 is a great-looking design, especially when it’s slotted into a case with a see-through side panel. There aren’t many third-party cards that can match this simple design, and I’m glad to see Nvidia hasn’t raised the power requirements here, even if the cable is more complicated.
1440p benchmarks
For 1440p testing, I paired the RTX 4080 with Intel’s new Core i9-13900K processor pushing a 32-inch Samsung Odyssey G7 monitor. This monitor supports refresh rates up to 240Hz as well as Nvidia’s G-Sync technology.
I’ve put the RTX 4080 head-to-head with the RTX 3080, RTX 3080 Ti, RTX 3090, and even the RTX 4090 to see exactly how this latest Ada Lovelace GPU compares to the previous Ampere generation. My testing routine included a variety of AAA games, such as Forza Horizon 5, Assassin’s Creed Valhalla, and Cyberpunk 2077. I’ve also been testing DLSS 3 on Cyberpunk 2077 and Microsoft Flight Simulator, both notoriously demanding games.
All games were tested at max or ultra settings on all of the GPUs tested, and every game apart from Microsoft Flight Simulator managed to deliver frame rates above 100fps. Shadow of the Tomb Raider managed a massive 259fps without any help from DLSS, and even Cyberpunk 2077 pushed out 127fps at 1440p without ray tracing or DLSS on the RTX 4080.
The RTX 4080 is around 50 percent faster than the previous RTX 3080 in most games at 1440p and around 30–40 percent faster than the RTX 3080 Ti. That’s impressive performance, but the RTX 4090 was still around 10–20 percent faster at 1440p.
I’ve also been testing new versions of Cyberpunk 2077 and Microsoft Flight Simulator, which both include DLSS 3. Nvidia’s latest upscaling technique uses the same AI from DLSS 2 alongside a new AI frame generation technology using the new Ada Lovelace architecture. This essentially generates two frames using existing rendering techniques; then, a third frame is inserted between them using the new frame generation tech.
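The frame generation idea described above can be sketched in a toy example: for every pair of traditionally rendered frames, one synthesized frame is slotted between them, so the displayed frame rate climbs well beyond what the GPU renders. To be clear, this is my own simplified illustration, not Nvidia’s method — DLSS 3 uses dedicated optical flow hardware and a neural network, not the naive per-pixel average below, and none of these function names come from any real API.

```python
# Toy sketch of frame generation: frames are flat lists of pixel values.
# DLSS 3 actually uses optical flow + AI; this naive blend only
# illustrates where the generated frame sits in the output stream.

def interpolate_frame(frame_a, frame_b):
    """Synthesize a middle frame by averaging corresponding pixels."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

def with_frame_generation(rendered_frames):
    """Insert one generated frame between each pair of rendered frames."""
    out = []
    for a, b in zip(rendered_frames, rendered_frames[1:]):
        out.append(a)                      # rendered frame
        out.append(interpolate_frame(a, b))  # generated frame
    out.append(rendered_frames[-1])        # final rendered frame
    return out

# Two rendered 4-pixel "frames" become three displayed frames.
rendered = [[0, 0, 0, 0], [10, 10, 10, 10]]
shown = with_frame_generation(rendered)
print(len(shown))   # 3
print(shown[1])     # [5.0, 5.0, 5.0, 5.0]
```

Since N rendered frames yield 2N−1 displayed frames, this also explains why the measured gains land below a clean doubling: the GPU still has to render the base frames, plus the generation work itself.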
While I’ve noticed the occasional image glitches in Cyberpunk 2077, overall, the result of DLSS 3 is a massive leap in performance that’s exclusive to RTX 40-series cards right now. I’m looking forward to more games getting DLSS 3 support, even if it does slightly bump up the latency in some cases. We haven’t seen DLSS 3 appear in a competitive esports title yet to really get a better understanding of the latency tradeoffs, but overall, the noticeable performance boost outweighs that latency in games like Cyberpunk 2077 and Microsoft Flight Simulator.
With maxed settings and psycho ray tracing and DLSS 2 quality enabled, Cyberpunk 2077 managed 99fps on average with the RTX 4080. Simply flicking the switch to enable frame generation with DLSS 3 made the performance jump around 50 percent to 149fps. You can even go further if you switch to performance DLSS mode, which pushes Cyberpunk 2077 to 190fps at 1440p. These are some big jumps in performance compared to the 60fps found on the RTX 3080 or the 68fps on the RTX 3080 Ti.
Microsoft Flight Simulator also benefits a lot from DLSS 3. In the special test build of Flight Simulator, DLSS 3 boosted performance to an average of 127fps, up from the 59fps I was seeing with DLSS 2. I did notice some strange results with Microsoft Flight Simulator with DLSS 2 across all the RTX cards I tested at 1440p, though, where performance was actually decreasing with DLSS 2 enabled. Nvidia is investigating, but Microsoft Flight Simulator can be particularly sensitive to certain system combinations of CPU and RAM, so this could be a bug that Microsoft and Asobo, Flight Simulator’s developer, need to address.
4K benchmarks
Over on the 4K side, the RTX 4080 continues to impress. For 4K testing, I paired this GPU with a 31.5-inch Acer Nitro XV2 monitor. This monitor supports refresh rates up to 144Hz, but the RTX 4080 couldn’t always deliver the types of frame rates to really take advantage of it like the RTX 4090 does without dropping settings down.
On average, the RTX 4080 is around 50 percent faster than the RTX 3080 at 4K. Every game I tested apart from Cyberpunk 2077 managed to deliver 60fps or above, and Shadow of the Tomb Raider even managed 143fps without any DLSS help.
Cyberpunk 2077 is really a great test of 4K performance in modern graphics cards. It’s still a massively demanding game, and it provides a good idea of the type of 4K performance you can expect in modern games. While the RTX 4080 couldn’t come close to the impressive 74fps result I saw on the RTX 4090 at 4K, DLSS 3 helps it comfortably deliver above 60fps with everything maxed out and ray tracing enabled.
With DLSS 3, I managed to hit 112fps at 4K on average with everything maxed out, psycho ray tracing, and DLSS 3 performance mode enabled. I once again got some bizarre results for Microsoft Flight Simulator, where the RTX 4080 and RTX 4090 both dropped frames with DLSS 2 enabled, but the RTX 30-series cards all benefited from DLSS 2, as you’d expect.
I also noticed lower-than-expected frame rates in Counter-Strike: Global Offensive across both 1440p and 4K. Nvidia was able to reproduce the results I saw in CS:GO and is currently investigating the performance dips on the RTX 4080 here.
Speaking of bugs, there was some odd behavior with my Samsung Odyssey G7 monitor with the RTX 4080. I couldn’t get any output from the card unless I used another one of my monitors to install the relevant drivers. The Samsung Odyssey G7 worked fine after the drivers were installed, but the display still refused to wake up during the initial BIOS phase. Nvidia says it has identified the issue and has a fix coming.
The RTX 4080 is a more than capable card at 4K and easily outperforms the previous RTX 30-series cards it’s here to replace, but if you’re willing to spend the extra $400, then the RTX 4090 really outshines it at 4K. The gap is much less at 1440p, which is where the 4080 really shines for high refresh rate gaming.
DLSS 3 is genuinely transformative for frame rates across 1440p and 4K, and I’m hoping to see a lot more than the 35 games and apps that have been confirmed so far. But the $1,199 pricing of the RTX 4080 means we still don’t have a reasonable entry point into this latest generation of cards.
The controversial 12GB RTX 4080 would have been $899 and is rumored to return as the RTX 4070 Ti, but if AMD can actually deliver performance that comes close to or beats the RTX 4080 for $899 or $999, then that’s going to put some huge pressure on Nvidia’s RTX 4080 pricing and beyond.
The RTX 4080 is a really tempting upgrade for performance, but I’m waiting to see if AMD can deliver a performance blow that will shake up Nvidia’s pricing.