It’s finally time to stop ignoring Intel GPUs

Intel is releasing its second generation of discrete graphics cards next week. Here's why you should keep an eye on them.

Intel is taking another swing at the graphics card market with the Arc B580, which is set to launch in a matter of days. It’s the first time we’re seeing Intel’s Battlemage architecture in a discrete desktop graphics card, and it’s arriving just weeks before AMD and Nvidia are set to launch new generations.

I’m sure you’ve heard about Intel’s first attempt at discrete GPUs, and all of the problems that ensued. Things have changed quite a bit over the past few years, though. I’ll need to wait until the Arc B580 arrives to fully put it through its paces, but based on what Intel has shared so far, it’s a card you should definitely keep an eye on.

Fulfilling AMD’s role

AMD RX 7600 on a pink background. Jacob Roach / Digital Trends

Intel is stepping into AMD’s traditional role in the GPU market, something that’s only clear if you’ve been following PC hardware for a while. Although AMD and Nvidia were on equal footing more than a decade ago, Nvidia has undoubtedly taken the lead in flagship performance over the last several generations. During that time, AMD focused more on undercutting Nvidia’s lower-end products, releasing GPUs that never neared flagship performance but delivered solid value for the money.

That’s been changing in recent generations. With the first RDNA generation, AMD pushed into the midrange with the RX 5700 XT, and in the RDNA 2 generation, it went after the flagship crown with the RX 6900 XT — a trend that AMD maintained with the RX 7900 XTX in the current generation. That’s great for PC gamers looking for alternative GPU options, but it’s led AMD to more closely align with Nvidia’s pricing structure.

You see that at the high end with GPUs like the RTX 4080 Super and RX 7900 XTX launching at $1,000, but also lower down the stack. In the past three generations, both AMD and Nvidia have pushed budget workhorse GPUs like Nvidia’s RTX XX60 series and AMD’s RX X600 series toward $300. Previously, budget options like the GTX 1660 and RX 580 sat closer to $200.

That context is important to understand where Intel currently fits in the market. Regardless of what you think about Intel’s GPUs or Intel as a company, there’s no doubt that Arc GPUs serve as a counterweight to the rising prices on budget GPUs from AMD and Nvidia. Intel is also offering GPUs that AMD and Nvidia have ignored. In the previous generation, Nvidia offered the RTX 3050 and AMD the RX 6500 XT at around the $250 mark. This generation, that class of GPU is completely absent.

On pricing alone, it’s worth looking at Intel’s options. Even if the upcoming B580 falls short of Intel’s performance claims, it could still be an impressive graphics card. At $250, it’s undercutting AMD and Nvidia, so even if it only performs on par with the RTX 4060 and RX 7600 — Intel claims it’s faster than both — it’s still worth considering based purely on price.

More VRAM, fewer problems

The board of the RTX 4090 Super graphics card. Teclab

Intel took another note from AMD’s playbook: higher VRAM capacity. For close to two years now, many other reviewers and I have been shouting from the rooftops that graphics cards with 8GB of VRAM aren’t suitable for a modern gaming experience, and we’re seeing that play out now. In games like Indiana Jones and the Great Circle, even the RTX 3080 struggles to maintain a playable frame rate at 1080p due to its limited VRAM capacity. And modern 8GB GPUs like the RTX 4060 and RX 7600 don’t have any hope of running the game without some serious limitations.

Intel has addressed the VRAM problem. In the previous generation, even the $300 Arc A770 packed 16GB of VRAM, and with the new B580 and B570, you’re getting 12GB and 10GB, respectively. VRAM capacity is like system memory capacity: more doesn’t help unless you’re actually using it. The problem now is that several games use more than 8GB, even at 1080p, and in this price class, Intel is the only one tackling that problem right now.

Buying a GPU with more VRAM simply because it has more VRAM isn’t a good idea. That’s a trap we’ve seen in the past, particularly with AMD’s budget options. Today, the situation is different. Intel is meeting the need for more VRAM with budget GPUs that, well, pack more VRAM. That extra capacity won’t boost performance across games, but you’ll be thankful when a game like Indiana Jones and the Great Circle comes along.

Drivers are better, but work is ongoing

Ryan Shrout plays Shadow of the Tomb Raider on a gaming PC. Intel

I wouldn’t blame you for writing off Intel GPUs when the Arc A770 and A750 first launched. They had a lot of problems. It didn’t matter how much cheaper they were than the competition, nor how well they performed in benchmark staples like Cyberpunk 2077. The drivers and software had a lot of issues.

DirectX 9 performance wasn’t there at all. You’d see around half the frame rate of something like the RTX 3060 in a game like Counter-Strike: Global Offensive or Payday 2. Outside of DirectX 9, the GPUs struggled with a wide range of games, and Intel had to play whack-a-mole to get everything up to snuff. Even earlier this year, we saw driver updates from Intel that claimed a 268% boost in games like Just Cause 4. You don’t get that kind of boost unless there are problems. Perhaps the most infamous example is Starfield, where Intel users had to wait several days before playing the game due to the lack of a driver — the game otherwise wouldn’t run on Intel GPUs.

Those issues are (mostly) behind Intel. For DirectX 9, Intel redesigned its driver package to boost performance. I interviewed Intel’s Tom Petersen around that time, and he reiterated that, “it is well understood within our organization that, you know, driver updates are what’s going to make the difference between our success and lack of success.” Today, DirectX 9 performance is on par with DirectX 11, which is great.

As for individual games, things are better. Although new drivers occasionally deliver performance gains in specific games, I haven’t seen a driver delivering triple-digit improvements since the beginning of the year. That suggests Intel’s whack-a-mole phase is close to over, if it isn’t over already. Since the release of the Arc A770 and A750, Intel has released 78 new drivers — I counted — sometimes within a matter of days of each other.

I’m not saying Intel’s GPUs are free of issues. Even Nvidia and AMD occasionally run into problems, and despite a ton of diligent driver work, discrete GPUs are still a relatively new venture for Intel. Those issues just aren’t as prevalent or severe as they once were. It’s hard to say Intel’s drivers are as iron-clad as AMD and Nvidia’s at this point — I can’t test every single game — but they’re a heck of a lot better than they were two years ago when the Arc A770 and A750 launched.

Something to keep in mind

An exploded view of Intel's Arc A580 GPU. Intel

The Arc B580 isn’t here yet, so I don’t want anyone to leave this article thinking I’m recommending you buy it. That’s a question I’ll answer in my review next week. But a lot has changed with Intel’s GPUs over the past two years, and given what leaks have suggested about Nvidia’s RTX 50-series GPUs and AMD’s RX 8000 options, the B580 might be a very compelling option.

Of course, make sure to read several reviews before making a decision when the card launches on December 13. The main thing I want to encourage you to do is not count the GPU out. It’s possible that we’ll see cards reaching down to $250 from AMD and Nvidia, but I suspect they won’t launch for several months, if not a year from now. That leaves the B580 in a spot that Nvidia and AMD have largely ignored.