One question that’s been popping up with increasing frequency when we talk about high-end graphics cards is whether 4GB of RAM is enough to power current and next-generation gaming. When we initially covered AMD’s Fury X launch, I promised to return to this topic and cover it in more detail. Before we can hit the data, however, we need to talk about how VRAM management works and what tools are available to evaluate it in DirectX 11.
While it might seem straightforward to test whether or not any given title uses more than 4GB of RAM, the tools for doing this are rather inexact. The GPU itself does not control which data is loaded into memory. Instead, memory management is handled by the operating system and the GPU driver. The GPU tells the OS how much memory it has, but it doesn’t make any decisions about how data is loaded or which data is loaded first.
One way that game developers handle memory management in software is by creating game presets that assume a particular amount of VRAM is present on the card. Low detail might be tuned to run on 512MB cards, while ultra detail assumes you have at least 4GB of VRAM. If you choose a detail level that calls for more VRAM than is present on your card, you’ll likely see a heavy performance hit as the system is forced to load data from main memory.
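The preset mechanism described above amounts to a simple lookup from detected VRAM to a default detail tier. Here's a minimal sketch of that idea; the thresholds and preset names are purely illustrative assumptions, not taken from any real engine:

```python
# Hypothetical sketch of how an engine might pick a default detail preset
# from detected VRAM. Thresholds are illustrative, not from any real game.

PRESETS = [  # (minimum VRAM in MB, preset name), highest tier first
    (4096, "ultra"),
    (2048, "high"),
    (1024, "medium"),
    (512,  "low"),
]

def default_preset(vram_mb: int) -> str:
    """Return the highest preset whose VRAM floor the card meets."""
    for floor, name in PRESETS:
        if vram_mb >= floor:
            return name
    return "minimum"  # below 512MB: fall back to a bare-bones preset

print(default_preset(6144))  # a 6GB card defaults to "ultra"
print(default_preset(3072))  # a 3GB card drops to "high"
```

Nothing stops a user from overriding the default, of course, which is exactly how you end up at the over-budget scenario described above: a preset that assumes more VRAM than the card has forces the driver to page data in from main memory.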
Some games won’t use much VRAM, no matter how much you offer them, while others are more opportunistic. This is critically important for our purposes, because there’s no automatic link between the amount of VRAM a game is using and the amount of VRAM it actually requires to run. Our first article on the Fury X showed how Shadow of Mordor used dramatically more VRAM on the GTX Titan X than on the GTX 980 Ti, without offering a higher frame rate. Until we hit 8K, there was no performance advantage to the Titan X’s huge memory buffer — and the game ran so slowly at that resolution that it was impossible to play on any card.
GPU-Z: An imperfect tool
GPU-Z claims to report how much VRAM the GPU actually uses, but there’s a significant caveat to this metric: GPU-Z reports the amount of VRAM a game has requested, not the amount the GPU is actually using. We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: “None of the GPU tools on the market report memory usage correctly, whether it’s GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.”
Our own testing backed up this claim, and it showed that VRAM monitoring is subject to a number of constraints. Switching resolutions or visiting more than one area before beginning a test can significantly increase the total memory reported as “in use” without affecting performance at all. There’s also a moderate amount of variance between test runs: we can say that a GPU requested around 4.5GB of RAM, for example, but one run might top out at 4.3GB while the next showed peak consumption of 4.5GB. Reported VRAM consumption can also shift over the course of a play session, so logging and playthroughs must be carefully managed.
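Because of that run-to-run variance, each playthrough has to be logged and the maxima compared afterward. The bookkeeping is trivial; here's a minimal sketch assuming each run was captured as a simple CSV with a hypothetical `vram_mb` column (GPU-Z's actual sensor log uses different column names, so adapt the field name to your file):

```python
# Sketch: summarizing peak VRAM across logged test runs.
# The CSV layout and the "vram_mb" column name are assumptions for
# illustration; real sensor logs will need their own field names.
import csv
import io

def peak_vram(log_text: str, field: str = "vram_mb") -> float:
    """Return the maximum reported VRAM (in MB) from one run's log."""
    reader = csv.DictReader(io.StringIO(log_text))
    return max(float(row[field]) for row in reader)

# Two simulated runs of the same test, echoing the 4.3GB-vs-4.5GB spread
run1 = "vram_mb\n3900\n4300\n4100\n"
run2 = "vram_mb\n4000\n4500\n4400\n"

peaks = [peak_vram(r) for r in (run1, run2)]
print(peaks)                    # [4300.0, 4500.0]
print(max(peaks) - min(peaks))  # 200.0 (MB of run-to-run spread)
```

With numbers this noisy, a single run that brushes 4.5GB tells you much less than several runs that all clear it by a wide margin — which is why we insisted on games that unambiguously break the 4GB limit.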
Finding >4GB games
When we started this process, I assumed that a number of high-end titles could readily be provoked into using more than 4GB of VRAM. In reality, this proved a tough nut to crack. Plenty of titles top out around 4GB, but most don’t exceed it. Given the lack of precision in VRAM testing, we needed games that could unambiguously break the 4GB limit.
We tested Assassin’s Creed Unity, Battlefield 4, BioShock Infinite, Civilization: Beyond Earth, Company of Heroes 2, Crysis 3, Dragon Age: Inquisition, The Evil Within, Far Cry 4, Grand Theft Auto V, Metro Last Light (original), Rome: Total War 2, Shadow of Mordor, Tomb Raider, and The Witcher 3: Wild Hunt. Of those 15 titles, just four could be coaxed into significantly exceeding the 4GB limit: Shadow of Mordor, Assassin’s Creed Unity, Far Cry 4, and Grand Theft Auto V. Even in these games, we had to use extremely high detail settings to ensure that the GPUs would regularly report well over 4GB of RAM in use.
Our testbed for this project was an Intel Core i7-5960X with 16GB of DDR4-2667 running Windows 8.1 with all patches and updates installed. While Windows 10 has just recently launched, we began this project on Windows 8.1 and wanted to finish it there. Transitioning operating systems would have necessitated a complete retest of our titles. We tested AMD’s Radeon R9 Fury X, Nvidia’s GTX 980 Ti, and the top-end GTX Titan X. With Titan X, we were curious to see if we’d see any benefits to running with a 12GB RAM buffer over and above the 6GB buffer on the GTX 980 Ti.
In every case, the games in question were pushed to maximum detail levels. MSAA was not used, since it incurs its own performance penalty and could skew results, but the highest non-GameWorks settings were used in all standard menus. GameWorks-specific implementations available only on Nvidia hardware were left disabled to create a level playing field. The one exception was Grand Theft Auto V, where we used Nvidia’s PCSS shadows for its cards and AMD’s preferred CHS shadows for the Fury X.
GameWorks, performance, and 4GB VRAM
There’s one common factor that ties three of our four >4GB titles together: GameWorks. Three of the four games we tested (Far Cry 4, Assassin’s Creed Unity, and Grand Theft Auto V) are Nvidia GameWorks titles, meaning they use Nvidia-provided libraries for key DirectX 11 functions like ambient occlusion, soft shadows, and tessellation. AMD cannot optimize for these games to the same degree, and AMD GPUs tend to perform significantly worse in GameWorks titles than in other games. We’ve discussed GameWorks, its implications, and its impact on various titles multiple times over the past few years.
One thing I want to stress is that while we’ll be looking at performance data in this article, its primary purpose isn’t to compare how the Fury X stacks up, performance-wise, against Nvidia’s highest-end GPUs. Such comparisons are inevitable, to some extent, but this isn’t a standard review. We’ve created specialized test cases designed to test a theory and used settings that significantly depart from what we consider playable or appropriate for 4K testing. As such, the 4K performance results in this story should not be treated as typical results for Nvidia or AMD. The goal of these tests is to create a worst-case scenario for a GPU with 4GB of VRAM and see what happens as a result.
We covered Shadow of Mordor in our initial Fury X coverage, so this article will concern itself with the three new games we tested. Let’s kick things off with Far Cry 4.