The amount of dedicated graphics memory on graphics cards has become a topic of heated debate, and the release of NVIDIA's midrange Ada Lovelace GPUs has only fueled it further. The GeForce RTX 4070 ships with a 12GB VRAM buffer, while the RTX 4060 Ti is limited to just 8GB (a 16GB variant is yet to roll out). Many recent games want 10-12GB of graphics memory even at 1080p Ultra. Hogwarts Legacy is a good example: at 1080p Ultra on an RTX 4060 Ti, it crashes just minutes into a session.
Diablo 4 is a better-optimized game, but its Ultra texture preset calls for 16GB of dedicated graphics memory. At that preset, it looks comparable to the PlayStation 5 version of the game, with consistent frame pacing and a lag-free experience.
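To see why texture presets dominate the VRAM budget, a back-of-the-envelope calculation helps. The sketch below assumes generic texture formats (uncompressed RGBA8 at 4 bytes per texel versus BC7 block compression at 1 byte per texel); the actual formats and texture counts Diablo 4 uses are not public, so treat the numbers as illustrative only:

```python
def texture_vram_bytes(width, height, bytes_per_texel, mipmaps=True):
    """Approximate GPU memory for one 2D texture.

    A full mip chain adds roughly one third on top of the
    base level (1 + 1/4 + 1/16 + ... ~= 4/3).
    """
    base = width * height * bytes_per_texel
    return int(base * 4 / 3) if mipmaps else base

# A single 4K (4096x4096) texture, uncompressed RGBA8, with mips:
rgba8 = texture_vram_bytes(4096, 4096, 4)
# The same texture block-compressed (BC7, 1 byte/texel), with mips:
bc7 = texture_vram_bytes(4096, 4096, 1)

print(f"RGBA8: ~{rgba8 / 2**20:.0f} MiB")  # roughly 85 MiB
print(f"BC7:   ~{bc7 / 2**20:.0f} MiB")    # roughly 21 MiB
```

Even with block compression, a scene streaming a few hundred high-resolution material textures can plausibly consume several gigabytes on its own, before geometry, render targets, and frame buffers are counted.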
Digital Foundry put the game through its paces and found that the Ultra quality textures on PC look identical to those in the PS5 version. You can still run the game at Ultra on an 8GB graphics card, but you'll often face stutters and frame rate drops into the 20s.
On a Ryzen 5 3600 system with an RTX 2080 and 16GB of DDR4 memory, Digital Foundry saw frequent drops into sub-30 FPS territory, making the experience less than playable. A 12GB card will likely fare better with fewer drops, but 16GB of VRAM is required for consistently smooth gameplay.
For this very reason, the GeForce RTX 4070 12GB falls behind the Radeon RX 6800 XT in Diablo 4. Constant VRAM bottlenecks drag down both its average and low frame rates, leaving it trailing a two-year-old (and cheaper) card.