Battlefield 4

Kicking off our benchmark suite is Battlefield 4, DICE’s 2013 multiplayer military shooter. After a rocky start, Battlefield 4 has since become a challenging game in its own right and a showcase title for low-level graphics APIs. These benchmarks are taken from single-player mode; based on our experience, our rule of thumb is that multiplayer framerates will dip to roughly half of our single-player framerates, which means a card needs to average at least 60fps here if it’s to hold up in multiplayer.

Battlefield 4 - 3840x2160 - Ultra Quality - 0x MSAA

Battlefield 4 - 3840x2160 - Medium Quality

Battlefield 4 - 2560x1440 - Ultra Quality

Battlefield 4 is going to set the pace for the rest of this review. In our introduction we noted that the GTX 980 Ti may as well be the GTX Titan X, and this is one example of why. With a framerate deficit of no more than 3% in this benchmark, the difference between the two cards is barely outside the run-to-run variation we see in our benchmarking process. So yes, it really is that fast.

In any case, after stripping away the Frostbite engine’s expensive (and not wholly effective) MSAA, what we’re left with for BF4 at 4K with Ultra quality puts the 980 Ti in a pretty good light. At 56.5fps it’s not quite up to the 60fps mark, but it comes very close, close enough that the GTX 980 Ti should be able to stay above 30fps virtually the entire time, and never drop too far below 30fps even in the worst-case scenario. Alternatively, dropping to Medium quality should give the card plenty of headroom, with an average framerate of 91.8fps meaning even the minimum framerate never drops below 45fps.

Meanwhile our other significant comparison here is the GTX 980, which just saw its price cut by $50 to $499 to make room for the GTX 980 Ti. At $649, the GTX 980 Ti ideally needs to be 30% faster to justify its 30% higher price tag; here it’s almost exactly on that mark, fluctuating between a 28% and 32% lead depending on the resolution and settings.
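As a quick sanity check on the price math above (the prices are from the review; the snippet itself is purely illustrative):

```python
# Price premium of the GTX 980 Ti over the post-price-cut GTX 980.
price_980_ti = 649  # USD
price_980 = 499     # USD, after the $50 price cut

premium = price_980_ti / price_980 - 1
print(f"{premium:.1%}")  # prints "30.1%"
```

A 28-32% performance lead thus brackets the roughly 30% price premium almost exactly.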

Finally, shifting gears for a moment, gamers looking for the ultimate 1440p card will not be disappointed. The GTX 980 Ti will not get to 120fps here (it won’t even come close), but at 77.7fps it’s well suited for driving 1440p144 displays. In fact, it and the GTX Titan X are the only single-GPU cards to do better than 60fps at this resolution.


290 Comments


  • kyuu - Monday, June 1, 2015 - link

    Witcher 3 runs just fine on my single 290. Is it just the xfire profile? Do you have the new driver and latest patches? Also, have you turned down tessellation or turned off hairworks?
  • PEJUman - Monday, June 1, 2015 - link

    4K... was hoping my U28D590D will have freesync, but alas... no such luck. I am very sensitive to stutter, it gives me motion sickness, to the point I have to stop playing :(

    limiting hairworks to 8x does help, but I really dislike the hair without it. I'd rather wait for 15.5.1 or 15.6. I have other games to keep me busy for a while.

    I can get 45 avg if I drop to 21:9 ratio using 3840 x 1646, but even then I still get motion sickness from the occasional drops.
  • chizow - Monday, June 1, 2015 - link

    Yes, CrossFire support in TW3 has been broken from day one; it's a well-known issue. AMD hastily released a driver last week with a CF profile, but it's virtually unusable as it introduces a number of other issues with AA and flickering icons.
  • PEJUman - Monday, June 1, 2015 - link

    15.5 no longer flickers, with or without AA. Still slow, though.
  • chizow - Monday, June 1, 2015 - link

    Are you sure? Did they release a follow-up to the 15.5 Beta? Because the notes and independent user feedback stated there was still flickering:

    *The Witcher 3: Wild Hunt - To enable the best performance and experience in Crossfire, users must disable Anti-Aliasing from the games video-post processing options. Some random flickering may occur when using Crossfire. If the issue is affecting the game experience, as a work around we suggest disabling Crossfire while we continue to work with CD Projekt Red to resolve this issue
  • Peichen - Monday, June 1, 2015 - link

    295X2 is indeed faster, but it also uses twice as much power. You have to take the 1000W PSU into account, as well as the one or two additional 120mm fans needed to get the heat out of the case. When you add up all the extra cost for the PSU, fans, electricity, noise and stutter against an overclocked 980 Ti (last few pages of the review), the slight speed advantage isn't going to be worth it.

    Also, Maxwell 2 supports DirectX 12; I am not so sure about any of the current AMD/ATI cards, since they were designed in 2013.
  • xthetenth - Monday, June 1, 2015 - link

    You don't have to buy a new PSU every time you buy a high TDP card, but otherwise a valid point. Going multi-GPU for the same performance requires a much bigger price difference to be worth it vs. a single card.
  • Kutark - Monday, June 1, 2015 - link

    Basically you're going to spend an extra $5/mo on electricity with that card, or $60/yr vs. a 980 Ti. That's actually pretty huge, and that's at 4hrs/day of gaming at an average of 12¢/kWh. If you game 6 or 7 hours a day, it's even worse.

    These high-power cards are a little ridiculous. 600W just for one video card?!
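The back-of-the-envelope figures in the comment above check out under reasonable assumptions. The ~350W number below is my own assumption for the extra wall draw of a dual-GPU card over a GTX 980 Ti under gaming load; it is not from the review:

```python
# Back-of-the-envelope monthly electricity cost of extra GPU power draw.
def monthly_cost(extra_watts, hours_per_day, cents_per_kwh, days=30):
    """Extra electricity cost per month, in dollars."""
    kwh = extra_watts / 1000 * hours_per_day * days
    return kwh * cents_per_kwh / 100

# Assumed: ~350 W extra draw, 4 hrs/day of gaming, 12 cents/kWh.
print(round(monthly_cost(350, 4, 12), 2))    # ~5 dollars/month
print(round(monthly_cost(350, 4, 12) * 12))  # ~60 dollars/year
```

At those assumed numbers the monthly figure lands almost exactly on the $5/mo and $60/yr cited in the comment.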
  • Daroller - Monday, June 1, 2015 - link

    I had a GTX690, and I run SLI TITAN X. Dual GPU IS a hindrance. You'd have to be blind, stupid, or a rabid fanboy to claim otherwise. The 295x2 isn't exempt from that just because you dislike NV and harbor a not so secret love for AMD.
