NVIDIA's Computex Announcements & The Test

Alongside the launch of the GTX 980 Ti, NVIDIA is also taking advantage of Computex to make a couple of other major technology announcements. Given the scope of these announcements we’re covering these in separate articles, but we’ll quickly go over the high points here as they pertain to the GTX 980 Ti.

G-Sync Variable Overdrive & Windowed Mode G-Sync

NVIDIA is announcing a slew of G-Sync products/technologies today, the most important of which is Mobile G-Sync for laptops. However, as part of that launch, NVIDIA is also finally confirming that all G-Sync products, including existing desktop G-Sync products, feature support for G-Sync variable overdrive. As the name implies, this is the ability to vary the amount of overdrive applied to a pixel based on a best-effort guess of when the next frame will arrive. This allows NVIDIA to continue to use pixel overdrive on G-Sync monitors to improve pixel response times and reduce ghosting, at a slight cost to color accuracy in motion due to errors in the frame time predictions.
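
The basic idea can be sketched as follows. This is purely an illustrative model, not NVIDIA's implementation: the function names, the weighting scheme, and the interval bounds are all assumptions. Under a variable refresh rate the panel controller cannot know the next frame's arrival time, so it must estimate it from recent frame intervals and pick an overdrive strength to match.

```python
# Hypothetical sketch of variable overdrive. All names and constants here are
# illustrative assumptions, not NVIDIA's actual algorithm: the driver predicts
# the next frame interval from recent history, then scales the overdrive.

def predict_next_interval(recent_intervals_ms, weight=0.7):
    """Exponentially weighted estimate of the next frame interval."""
    estimate = recent_intervals_ms[0]
    for interval in recent_intervals_ms[1:]:
        estimate = weight * interval + (1 - weight) * estimate
    return estimate

def overdrive_strength(predicted_interval_ms, min_ms=6.94, max_ms=33.3):
    """Map the predicted interval (144Hz..30Hz here) onto an overdrive factor.

    Short intervals (high frame rates) need aggressive overdrive so pixels
    settle before the next refresh; long intervals can use gentler drive to
    limit overshoot artifacts. A misprediction means the wrong strength is
    applied, which is the source of the slight color error the article notes.
    """
    clamped = max(min_ms, min(max_ms, predicted_interval_ms))
    # Linearly interpolate between a gentle (1.0x) and aggressive (2.0x) factor.
    t = (max_ms - clamped) / (max_ms - min_ms)
    return 1.0 + t

# Example: a game hovering around 90 fps (~11.1 ms per frame).
intervals = [11.2, 10.9, 11.4, 11.0]
predicted = predict_next_interval(intervals)
print(f"predicted interval: {predicted:.1f} ms, "
      f"overdrive: {overdrive_strength(predicted):.2f}x")
```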

Variable overdrive has been part of G-Sync since the start; however, until now NVIDIA had never confirmed its existence, presumably keeping quiet about it for trade secret purposes. Now that displays supporting AMD’s Freesync implementation of DisplayPort Adaptive-Sync are on the market, NVIDIA is further clarifying how G-Sync works.

Meanwhile being freshly rolled out in NVIDIA’s latest drivers is support for Windowed Mode G-Sync. Before now, running a game in Windowed mode could cause stutters and tearing because once you are in Windowed mode, the image being output is composited by the Desktop Window Manager (DWM) in Windows. Even though a game might be outputting 200 frames per second, DWM will only refresh the image with its own timings. The off-screen buffer for applications can be updated many times before DWM updates the actual image on the display.
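
Simple arithmetic shows why this matters, using the article's own numbers. Since DWM can display at most one new game frame per composite, everything the game renders between two composites is simply overwritten (the function name below is ours, for illustration):

```python
# Why Windowed mode stutters without Windowed G-Sync: DWM composites the
# desktop on its own fixed cadence, so at most one game frame per composite
# ever reaches the screen. Illustrative helper, not an actual DWM API.

def frames_discarded_per_second(game_fps, compositor_hz):
    displayed = min(game_fps, compositor_hz)  # one frame per composite, at most
    return game_fps - displayed

# The article's example: 200 fps game output against a 60 Hz compositor.
print(frames_discarded_per_second(200, 60))  # 140 frames/sec never shown
```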

NVIDIA will now change this using their display driver, and when Windowed G-Sync is enabled, whichever window is the current active window will be the one that determines the refresh rate. That means if you have a game open, G-Sync can be leveraged to reduce screen tearing and stuttering, but if you then click on your email application, the refresh rate will switch back to whatever rate that application is using. Since this is not always going to be a perfect solution - without a fixed refresh rate, it's impossible to make every application perfectly line up with every other application - Windowed G-Sync can be enabled or disabled on a per-application basis, or just globally turned on or off.

GameWorks VR & Multi-Res Shading

Also being announced at Computex is a combination of new functionality and an overall rebranding for NVIDIA’s suite of VR technologies. First introduced alongside the GeForce GTX 980 in September as VR Direct, NVIDIA will be bringing their VR technologies in under the GameWorks umbrella of developer tools. The collection of technologies will now be called GameWorks VR, adding to the already significant collection of GameWorks tools and libraries.

On the feature front, the newly minted GameWorks VR will be getting a new feature dubbed Multi-Resolution Shading, or Multi-Res Shading for short. With multi-res shading, NVIDIA is looking to leverage the Maxwell 2 architecture’s Multi-Projection Acceleration in order to increase rendering efficiency and ultimately the overall performance of their GPUs in VR situations.

By reducing the resolution of video frames at the edges where there is already the most optical distortion/compression and the human eye is less sensitive, NVIDIA says that using multi-res shading can result in a 1.3x to 2x increase in pixel shader performance without noticeably compromising the image quality. Like many of the other technologies in the GameWorks VR toolkit this is an implementation of a suggested VR practice, however in NVIDIA’s case the company believes they have a significant technological advantage in implementing it thanks to multi-projection acceleration. With MPA to bring down the rendering cost of this feature, NVIDIA’s hardware can better take advantage of the performance advantages of this rendering approach, essentially making it an even more efficient method of VR rendering.
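
A back-of-the-envelope check shows how the claimed savings fall out of the geometry. Multi-res shading divides the viewport into a grid and renders the outer regions at reduced resolution; the split point and scale factor below are our assumptions for illustration (the real values are tunable per title), and for simplicity we scale the entire outer ring on both axes:

```python
# Rough model of multi-res shading savings. center_frac and edge_scale are
# illustrative assumptions; actual values are chosen per application.

def shaded_pixel_ratio(center_frac=0.6, edge_scale=0.5):
    """Fraction of pixels shaded versus a full-resolution render.

    center_frac: width/height fraction kept at full resolution in the center.
    edge_scale:  resolution scale applied (per axis) to the outer regions.
    """
    center = center_frac ** 2                          # full-res center region
    edges = (1 - center_frac ** 2) * edge_scale ** 2   # scaled outer ring
    return center + edges

ratio = shaded_pixel_ratio()
print(f"pixels shaded: {ratio:.0%} of full res -> {1 / ratio:.2f}x speedup")
```

With a 60% full-resolution center and half-resolution edges, only about half the pixels are shaded, landing within NVIDIA's quoted 1.3x to 2x range.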

Getting Behind DirectX Feature Level 12_1

Finally, though not an outright announcement per se, from a marketing perspective we should expect to see NVIDIA further promote their current technological lead in rendering features. The Maxwell 2 architecture is currently the only architecture to support DirectX feature level 12_1, and with DirectX 12 games due a bit later this year, NVIDIA sees that as an advantage to press.

For promotional purposes NVIDIA has put together a chart listing the different tiers of feature levels for DirectX 12, and to their credit this is a simple but elegant layout of the current feature level situation. The bulk of the advanced DirectX 12 features we saw Microsoft present at the GTX 980 launch are part of feature level 12_1, while the rest, along with other functionality not fully exploited under DirectX 11, falls under the 12_0 feature level. The one exception is volume tiled resources, which is not part of either feature level; instead it belongs to a separate feature list for tiled resources that can be implemented at either feature level.
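
The chart's grouping can be restated as data. This reflects the commonly published DirectX 12 feature level requirements at the time; treat it as a summary for orientation, not an exhaustive spec, and the helper function is ours:

```python
# Summary of the DirectX 12 feature level split the article describes.
# Grouping per commonly published requirements; not an exhaustive list.

feature_levels = {
    "12_1": ["Conservative Rasterization", "Rasterizer Ordered Views"],
    "12_0": ["Tiled Resources Tier 2", "Typed UAV Loads"],
}

# Volume tiled resources sit outside the feature levels entirely: they are
# Tiled Resources Tier 3, an optional capability at either feature level.
optional = {"Tiled Resources Tier 3": "Volume Tiled Resources"}

def required_level(feature):
    """Return the feature level that mandates a feature, if any."""
    for level, features in feature_levels.items():
        if feature in features:
            return level
    return "optional" if feature in optional else "unknown"

print(required_level("Conservative Rasterization"))  # 12_1
```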

The Test

The press drivers for the launch of the GTX 980 Ti are release 352.90, which other than formally adding support for the new card is otherwise identical to the standing 352.86 drivers.

CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)
Case: NZXT Phantom 630 Windowed Edition
Monitor: Asus PQ321
Video Cards: AMD Radeon R9 295X2
AMD Radeon R9 290X
AMD Radeon HD 7970
NVIDIA GeForce GTX Titan X
NVIDIA GeForce GTX 980 Ti
NVIDIA GeForce GTX 980
NVIDIA GeForce GTX 780 Ti
NVIDIA GeForce GTX 780
NVIDIA GeForce GTX 680
NVIDIA GeForce GTX 580
Video Drivers: NVIDIA Release 352.90 Beta
AMD Catalyst Cat 15.5 Beta
OS: Windows 8.1 Pro
290 Comments

  • FlushedBubblyJock - Wednesday, June 10, 2015 - link

    Thumbs up daroller, 1920x1200 and only 980TI is capable of driving it properly without eye candy loss and fps failures.

    People claim I'm crazy but then I never have to worry about my settings and I can seamlessly choose and change and view and analyze and I'm never frustrated "having to turn down the settings" to make things playable.

    The rest of the world stretches livability to the limit and loves stressing everything to the max and grinding it all down to pathetic perf, all the while claiming "it's awesome !"

    In other words, stupidity is absolutely rampant.
  • Yojimbo - Monday, June 1, 2015 - link

    Value is not performance per price. Value is what benefit is achieved by the purchase of the product. I'll repeat my previous post by asking how can you assume that purchasing a card has "unreasonable value"? If I, as someone who is in the market for a video card, have a range of options to choose from for myself, how can you off-the-cuff judge how much I should be willing to spend to get a better experience (higher resolution, more detailed graphics, smoother game play, etc) from a higher-priced offering compared with a lower-priced offering? You have no idea what my valuation of those experiences is, so how can you judge whether the cards offer good value or not?

    The people buying those high priced cards are buying them because in their minds they are getting more value from them at the higher price than they would be getting from the lower-priced alternatives. Now people don't always make the most accurate decisions. They can be fooled, or they can have misconceived notions of what they are going to be getting, but the point is that they THINK they are getting more value at the time of purchase.
  • mapesdhs - Wednesday, June 3, 2015 - link

    Or they can simply afford it and want the *best* for other reasons such as being able to max out visuals without ever worrying about VRAM issues. It's wrong to assume such purchases are down to incorrect value judgements. You're imposing your own value perception on someone else.
  • chizow - Sunday, May 31, 2015 - link

    Yeah, unfortunately anyone who actually buys high-end GPUs understands price and performance goes out the window the higher you go up the product chain. Nvidia made their value play to the masses with the 970 at an amazing $330 price point, memory snafu notwithstanding, and the card has sold incredibly well.

    There was no reason for them to drop prices further, and I think most observers will recognize the $650 price point of the 980Ti is actually very aggressive, given there was absolutely no pressure from AMD to price it that low.
  • Kjella - Sunday, May 31, 2015 - link

    If AMD gets to launch first and nVidia must respond, it seems like they're trading blows. If nVidia makes a preemptive strike now, they make AMD's launch seem late and weak. They know AMD is betting on Win10 driving sales, so they could read their launch plan like an open book and torpedo it out of the gate. I think this will be a miserable month for AMD, it's hard to see how GCN-based cards are going to compete with Maxwell, HBM or not.
  • chizow - Sunday, May 31, 2015 - link

    Yep, Nvidia just pre-emptively torpedoed AMD's product launch and set pricing again. All very impressive how Nvidia has gone about 28nm product launches despite the uncertainty over whether we'd see anything new before 14/16nm once word came that 20nm was cancelled.
  • PEJUman - Monday, June 1, 2015 - link

    I don't think Nvidia is dumb enough to launch 980TI without knowing where FIJI would lie in their stack. I think this is more of a powerplay from them saying, 'here's your price point AMD, good luck'

    like you said, the fact is they have no competitive pressure on titan X, why ruin its pricing now if you don't know where FIJI will land.

    here's my guess:
    Nvidia just torpedoed their titan X, mainly because FIJI is probably around 97% of titan X, and AMD was about to ask 850~1000 USD for it. now Nvidia will launch this 980TI at 650 to control the price. (which I bet they have been readying for quite some time, simply waiting for the right time/price point)
  • Peichen - Monday, June 1, 2015 - link

    I think you are right. Fuji was estimated to be close to Titan but cheaper by $200. Now that Nvidia has delivered a Fuji-like card for $650, Fuji cannot go above $650. In fact, considering Fuji is limited to 4GB and runs hot enough to need a watercooled version, Fuji might have to go below $600 with a bundled game, or $550 without a bundle, to make any sense. With the big chip and expensive memory Fuji is using, AMD/ATI's margins on those cards are going to be slim compared with Nvidia's.
  • PEJUman - Monday, June 1, 2015 - link

    yeah... time to buy AMD stock options :)
    In all honesty though, I really would like to have them around, if only for the two-horse race...
  • bloodypulp - Monday, June 1, 2015 - link

    For christsake... it's Fiji. NOT Fuji.
