Battlefield 4

Kicking off our benchmark suite is Battlefield 4, DICE's 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since become a challenging game in its own right and a showcase title for low-level graphics APIs. As these benchmarks are from single-player mode, keep in mind our rule of thumb based on past experience: multiplayer framerates will dip to roughly half of our single-player framerates, which means a card needs to average at least 60fps here if it's to hold up in multiplayer.
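That rule of thumb reduces to a simple calculation. The sketch below (the helper names are ours, purely illustrative) encodes the stated heuristic that multiplayer framerates dip to roughly half of the single-player average:

```python
# Sketch of the rule of thumb above: multiplayer framerates dip to roughly
# half of the single-player framerates measured here, so a card needs a
# ~60fps single-player average to hold a 30fps floor in multiplayer.

def estimated_multiplayer_fps(singleplayer_avg_fps: float) -> float:
    """Worst-case multiplayer estimate: half the single-player average."""
    return singleplayer_avg_fps / 2.0

def holds_up_in_multiplayer(singleplayer_avg_fps: float,
                            floor_fps: float = 30.0) -> bool:
    """True if the estimated multiplayer framerate stays at or above the floor."""
    return estimated_multiplayer_fps(singleplayer_avg_fps) >= floor_fps

print(holds_up_in_multiplayer(60.0))   # True: a 60fps average is the break-even point
print(holds_up_in_multiplayer(56.5))   # False: estimates to ~28fps in multiplayer
```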

Battlefield 4 - 3840x2160 - Ultra Quality - 0x MSAA

Battlefield 4 - 3840x2160 - Medium Quality

Battlefield 4 - 2560x1440 - Ultra Quality

Battlefield 4 is going to set the pace for the rest of this review. In our introduction we talked about how the GTX 980 Ti may as well be the GTX Titan X, and this is one example of why. With a framerate deficit of no more than 3% in this benchmark, the difference between the two cards is just outside the range of standard run-to-run experimental variation that we see in our benchmarking process. So yes, it really is that fast.

In any case, after stripping away the Frostbite engine’s expensive (and not wholly effective) MSAA, what we’re left with for BF4 at 4K with Ultra quality puts the 980 Ti in a pretty good light. At 56.5fps it’s not quite up to the 60fps mark, but it comes very close, close enough that the GTX 980 Ti should be able to stay above 30fps virtually the entire time, and never drop too far below 30fps in even the worst case scenario. Alternatively, dropping to Medium quality should give the card plenty of headroom, with an average framerate of 91.8fps meaning even the lowest framerate never drops below 45fps.

Meanwhile our other significant comparison here is the GTX 980, which just saw its price cut by $50 to $499 to make room for the GTX 980 Ti. At $649 the GTX 980 Ti ideally should be 30% faster to justify its 30% higher price tag; here it’s almost exactly on that mark, fluctuating between a 28% and 32% lead depending on the resolution and settings.
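As a quick sanity check on that 30% figure, the arithmetic works out as follows (prices taken from the paragraph above; the ratios are illustrative math, not additional benchmark data):

```python
# Price premium of the GTX 980 Ti ($649) over the price-cut GTX 980 ($499),
# and the relative performance-per-dollar implied by the measured 28-32% lead.
price_980_ti = 649.0
price_980 = 499.0

premium = price_980_ti / price_980 - 1.0
print(f"Price premium: {premium:.1%}")  # ~30.1%

# A performance lead equal to the price premium is break-even on
# performance per dollar; the measured 28-32% range brackets that point.
for lead in (0.28, 0.32):
    value_ratio = (1.0 + lead) / (1.0 + premium)
    print(f"{lead:.0%} lead -> relative perf-per-dollar: {value_ratio:.2f}")
```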

Finally, shifting gears for a moment, gamers looking for the ultimate 1440p card will not be disappointed. The GTX 980 Ti will not get to 120fps here (it won't even come close), but at 77.7fps it's well suited for driving 1440p144 displays. In fact, it and the GTX Titan X are the only single-GPU cards to do better than 60fps at this resolution.

290 Comments


  • Laststop311 - Monday, June 1, 2015 - link

    how is 6GB the minimum ram needed till finfet gpus? Even at 1440p with max settings, no game requires 6GB of ram. Even if a game can use 6GB of ram, the way some games are programmed they just use up extra ram if it's available, but that ram isn't crucial to the operation of the game. So it will show high ram usage when in reality it can use way less and be fine.

    You are overly paranoid. 4GB of ram should be just fine to hold you off for a year or two until finfet gpus come out for 1440p res. If you are smart you will skip these and just wait for 2H 2016, when 14/16nm finfet gpus are going to make a large leap in performance. That generation of gpus should be able to be kept long term with good results. This is when you would want an 8GB card to keep it running smooth for a good 3-4 years, since you should get good lifespan with the first finfet gpus.
  • chizow - Monday, June 1, 2015 - link

    Again, spoken from the perspective of someone who doesn't have the requisite hardware to test or know the difference. I've had both a 980 and a Titan X, and there are, without a doubt, games that run sluggishly, as if you are moving through molasses, as soon as you turn up bandwidth-intensive settings like MSAA, texture quality, and stereo 3D and hit your VRAM limits, even with the FRAPS meter saying you should be getting smooth frame rates.

    With the Titan X, none of these problems occur, and of course VRAM usage shoots over the 4GB ceiling I was hitting before.

    And why would I bother to keep running old cards that aren't good enough now and wait for FinFET cards that MIGHT be able to run for 3-4 years after that? I'll just upgrade to 14/16nm next year if the difference is big enough, it'll be a similar 18-24 month timeframe when I usually make my upgrades anyways. What am I supposed to do in this year while I wait for good enough GPUs? Not play any games? Deal with 2-3GB slow cards at 1440p? No thanks.
  • Refuge - Monday, June 1, 2015 - link

    So you are saying I shouldn't be asking questions about something I'm spending my hard earned money on? Not a small sum of which at that?

    You sir should buy my car, it is a great deal, just don't ask me about it. Because that would be stupid!
  • Yojimbo - Monday, June 1, 2015 - link

    He's not questioning your concern, he's questioning your criteria.
  • Peichen - Sunday, May 31, 2015 - link

    Why is the most popular mid-high card, the GTX 970, not on the comparison list? It is exactly half the price of the 980 Ti, and it would be great to see if it is exactly 50% the speed and uses half the power as well.
  • dragonsqrrl - Sunday, May 31, 2015 - link

    It's definitely more than 50% the performance and power consumption, but yes it would've been nice to include in the charts.
  • PEJUman - Monday, June 1, 2015 - link

    Ryan's selection is not random. It seems he selects the likely upgrade candidates & nearest competitors. It's the same reasoning why there is no R9 290 here. Most 970 and R9 290 owners probably know how to infer their card's performance from the un-harvested versions (980 and 290X).

    Granted, it's odd to see the 580 here; a 970 would be more valuable technically.
  • mapesdhs - Wednesday, June 3, 2015 - link

    Plus, most requests I've seen on forums have been for 970 SLI results rather than a 970 on its own, as 970 SLI is the more likely config to come anywhere near a 980 Ti, assuming VRAM isn't an issue. Data for 970 SLI would thus show where in the various resolution/detail space one sees performance tail off because it needs more than 4GB.
  • bloodypulp - Monday, June 1, 2015 - link

    The 295X2 still crushes it. But blind Nvidia fanboys will claim it doesn't matter because it is either a) not a single GPU or b) AMD (and therefore sucks).
  • PEJUman - Monday, June 1, 2015 - link

    I own a 290 Crossfire setup currently, previously a single 780 Ti. Witcher 3 still sucks on my 290 CF, as well as on the 295X2. So... it depends on your game selection. I also have to spend more time customizing most of my games to get optimal settings on my 290 CF than on my 780 Ti.
