Meet The GeForce GTX Titan X

Now that we’ve had a chance to look at the GM200 GPU at the heart of GTX Titan X, let’s take a look at the card itself.

From a design standpoint, NVIDIA put together a very strong card with the original GTX Titan, combining a revised, magnesium-less version of their all-metal shroud with a high-performance blower and vapor chamber assembly. The end result was a high-performance 250W card that was quieter than some open-air cards, much quieter than most other blowers, and shiny to look at to boot. This design was further carried forward for the reference GTX 780 series, its stylings copied for the GTX Titan Z, and used with a cheaper cooling apparatus for the reference GTX 980.

For GTX Titan X, NVIDIA has opted to leave well enough alone, having made virtually no changes to the shroud or cooling apparatus. And truth be told it’s hard to fault NVIDIA right now, as this design remains the gold (well, aluminum) standard for a blower. Looks aside, after years of blowers that rattled, or were too loud, or didn’t cool discrete components very well, NVIDIA is sitting on a very solid design, and I’m not really sure how anyone would top it (but I’d love to see them try).

In any case, our favorite metal shroud is back once again. Composed of a cast aluminum housing and held together using a combination of rivets and screws, it’s as physically solid a shroud as we’ve ever seen. Meanwhile, having already done a partial black dye job for GTX Titan Black and GTX 780 Ti – using black lettering and a black-tinted polycarbonate window – NVIDIA has more or less completed the dye job by making the metal shroud itself almost completely black. What remains are the aluminum accents and the Titan lettering (Titan, not Titan X, curiously enough), both of which are left as unpainted aluminum. The card measures 10.5” long overall, which at this point is NVIDIA’s standard size for high-end GTX cards.

Drilling down, we have the card’s primary cooling apparatus, composed of a nickel-tipped, wedge-shaped heatsink and a ringed radial fan. The heatsink itself is attached to the GPU via a copper vapor chamber, something that has been exclusive to GTX 780/Titan cards and provides the best possible heat transfer between the GPU and heatsink. Meanwhile the rest of the card is covered with a black aluminum baseplate, providing basic heatsink functionality for the VRMs and other components while also protecting them.

Finally, at the bottom of the stack we have the card itself, complete with the GM200 GPU, VRAM chips, and various discrete components. Unlike the shroud and cooler, GTX Titan X’s PCB isn’t a complete carry-over from the GK110 boards, but it is nonetheless very similar, with only a handful of changes made. This means we’re looking at the GPU and VRAM chips towards the front of the card, while the VRMs and other discrete components occupy the back. New specifically to GTX Titan X, NVIDIA has done some minor reworking to improve airflow to the discrete components and reduce temperatures, along with employing molded inductors.

As with GK110, NVIDIA still employs a 6+2 phase VRM design, with 6 phases for the GPU and another 2 for the VRAM. This means that GTX Titan X has a bit of power delivery headroom – NVIDIA allows the power limit to be increased by 10%, to 275W – but hardcore overclockers will find that there isn’t an extreme amount of additional headroom to play with. Based on our sample, the actual shipping voltage at the max boost clock is fairly low at 1.162v, so in non-TDP-constrained scenarios there is some additional headroom through overvolting, up to 1.237v in the case of our sample.
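
For reference, the relationship between the stock TDP, the raised power target, and the voltage headroom quoted above works out as in the short sketch below; this is purely illustrative arithmetic using the numbers measured on our sample.

```python
# Rough arithmetic for GTX Titan X power/voltage headroom (numbers from our sample).
STOCK_TDP_W = 250            # reference 250W TDP
POWER_LIMIT_INCREASE = 0.10  # maximum power target increase NVIDIA exposes

max_power_w = STOCK_TDP_W * (1 + POWER_LIMIT_INCREASE)
print(f"Maximum power target: {max_power_w:.0f} W")   # 275 W

ship_voltage_v = 1.162   # observed voltage at the max boost clock (our sample)
max_overvolt_v = 1.237   # maximum overvoltage offered on our sample

headroom_mv = (max_overvolt_v - ship_voltage_v) * 1000
print(f"Overvolting headroom: {headroom_mv:.0f} mV")  # ~75 mV
```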

In terms of overall design, the need to house 24 VRAM chips to get 12GB of VRAM means that the GTX Titan X has memory chips on the back of the board as well as the front. For this reason, unlike the GTX 980, NVIDIA is once again skipping the backplate, leaving the back side of the card bare just as with the previous GTX Titan cards.

Moving on, in accordance with GTX Titan X’s 250W TDP and the reuse of the GTX Titan cooler, power delivery for the GTX Titan X is identical to its predecessors. This means a 6-pin and an 8-pin power connector at the top of the card, to provide up to 225W, with the final 75W coming from the PCIe slot. Interestingly the board does have another 8-pin PCIe connector position facing the rear of the card, but that goes unused for this specific GM200 card.
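
As a quick sanity check, the connector arrangement lines up with the card's power targets. The per-connector figures in the sketch below are the standard PCIe specification limits rather than anything NVIDIA has published for this specific board:

```python
# PCIe power budget sketch for the GTX Titan X's connector layout.
# Per-connector figures are the standard PCIe specification limits.
PCIE_SLOT_W = 75   # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin PCIe power connector
EIGHT_PIN_W = 150  # 8-pin PCIe power connector

connectors_w = SIX_PIN_W + EIGHT_PIN_W           # 225 W from external connectors
total_available_w = connectors_w + PCIE_SLOT_W   # 300 W in total

print(f"External connectors: {connectors_w} W")
print(f"Total deliverable:   {total_available_w} W")
print(f"Margin over the 275 W maximum power target: {total_available_w - 275} W")
```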

Meanwhile display I/O follows the same configuration we saw on GTX 980: 1x DL-DVI-I, 3x DisplayPort 1.2, and 1x HDMI 2.0, with a limit of 4 simultaneous displays. In the case of GTX Titan X the DVI port is somewhat antiquated at this point – the card is generally overpowered for the relatively low maximum resolutions of DL-DVI – but on the other hand the HDMI 2.0 port is actually going to be of some value here, since it means GTX Titan X can drive a 4K TV. Meanwhile, if you have money to spare and need to drive more than a single 4K display, GTX Titan X also features a pair of SLI connectors for even more power.
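
To illustrate why DL-DVI is the odd port out while HDMI 2.0 earns its keep, here is a rough pixel-clock comparison. The link limits are the published DVI and HDMI 2.0 figures, while the flat 10% blanking allowance is a simplifying assumption of ours rather than a real timing standard:

```python
# Back-of-the-envelope pixel-rate check: DL-DVI tops out well below 4K60,
# while HDMI 2.0 has the headroom to drive a 4K TV.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.10):
    """Approximate required pixel clock, with a flat ~10% blanking allowance."""
    return width * height * refresh_hz * blanking_overhead / 1e6

DL_DVI_LIMIT_MHZ = 330    # two TMDS links at 165 MHz each
HDMI_2_0_LIMIT_MHZ = 600  # HDMI 2.0 maximum TMDS character rate

for name, w, h in [("2560x1600 @ 60Hz", 2560, 1600),
                   ("3840x2160 @ 60Hz (4K TV)", 3840, 2160)]:
    clk = pixel_clock_mhz(w, h, 60)
    print(f"{name}: ~{clk:.0f} MHz needed | "
          f"DL-DVI ok: {clk <= DL_DVI_LIMIT_MHZ} | HDMI 2.0 ok: {clk <= HDMI_2_0_LIMIT_MHZ}")
```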

In fact 4K will be a repeating theme for GTX Titan X, as this is one of the primary markets/use cases NVIDIA will be going after with the card. With GTX 980 generally good up to 2560x1440, the even more powerful GTX Titan X is best suited for 4K and VR, the two areas where GTX 980 came up short. In the case of 4K even a single GTX Titan X is going to struggle at times – we’re not at 60fps at 4K with a single GPU quite yet – but GTX Titan X should be good enough for framerates between 30fps and 60fps at high quality settings. To fill the rest of the gap NVIDIA is also going to be promoting 4Kp60 G-Sync monitors alongside the GTX Titan X, as the 30-60fps range is where G-Sync excels. And while G-Sync can’t make up for lost frames, it can take some of the bite out of sub-60fps framerates, making it a smoother/cleaner experience than it would otherwise be.
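
Putting that 30-60fps window in frame-time terms makes the G-Sync argument easier to see; the sketch below is nothing more than a unit conversion:

```python
# Frame-time view of the 30-60 fps window where variable refresh helps the most.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 45, 60):
    print(f"{fps:>2} fps -> {frame_time_ms(fps):.1f} ms per frame")

# On a fixed 60Hz display, any frame slower than 16.7 ms either tears or waits
# for the next refresh (stutter). With G-Sync the display refreshes when the
# frame is ready, which is what softens the sub-60fps experience.
```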

Longer term, NVIDIA also sees the GTX Titan X as their most potent card for VR headsets, and they made sure that GTX Titan X was on the show floor at GDC to drive a few of the major VR demos. Certainly VR will take just about whatever rendering power you can throw at it, if only in the name of reducing rendering latency. But overall we’re still very early in the game, especially with commercial VR headsets still in development.

Finally, speaking of the long term, I wanted to touch upon the subject of the GTX Titan X’s 12GB of VRAM. With most other Maxwell cards already using 4Gb VRAM chips, the inclusion of 12GB of VRAM in NVIDIA’s flagship card was practically a given, especially since it doubles the 6GB of VRAM the original GTX Titan came with. At the same time, however, I’m curious to see just how long it takes for games to grow into this space. The original GTX Titan was fortunate enough to come out with 6GB right before the current-generation consoles launched, and with them their 8GB memory configurations, leading to a rather sudden jump in VRAM requirements that the GTX Titan was well positioned to handle. Much like 6GB in 2013, 12GB is overkill in 2015, but unlike the original GTX Titan I suspect 12GB will remain overkill for a much longer period of time, especially without a significant technology bump like the consoles to drive up VRAM requirements.
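
For what it’s worth, the 12GB figure falls straight out of the chip count and density mentioned above; a trivial sketch:

```python
# How the 12GB total comes together from the parts described above.
CHIP_DENSITY_GBIT = 4   # 4Gb GDDR5 chips, as on most other Maxwell cards
CHIP_COUNT = 24         # chips split across the front and back of the board

total_gb = CHIP_COUNT * CHIP_DENSITY_GBIT / 8   # gigabits -> gigabytes
print(f"Total VRAM: {total_gb:.0f} GB")                           # 12 GB
print(f"Relative to the original GTX Titan's 6 GB: {total_gb / 6:.0f}x")
```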

276 Comments

  • Refuge - Thursday, March 19, 2015 - link

    Honestly this looks more like a Ti than a Titan.
  • D. Lister - Tuesday, March 17, 2015 - link

    Nice performance/watt, but at $1000, I find the performance/dollar to be unacceptable. Without a double-precision edge, this GPU is essentially a 980Ti, and Nvidia seems to want to get away with slapping on a Titan decal (and the consequent $1K price tag) by just adding a useless amount of graphics memory.

    Take out about 4 gigs of VRAM, hold the "Titan" brand, add maybe 5-10% core clock, with an MSRP of at least $300 less, and I'll be interested. But I guess, for Nvidia to feel the need to do something like that, we'll have to wait for the next Radeon launch.
  • chizow - Tuesday, March 17, 2015 - link

    It's 980Ti with double the VRAM, a year earlier, if you are going off previous timelines. Don't undervalue the fact this is the first big Maxwell only 6 months after #2 Maxwell.

    I agree the pricing has gotten ridiculous on these graphics cards, but this is the market we live and play in now. I typically spent $800-$1000 every 2 years on graphics cards, but I would get 2 flagship cards. After the whole 7970/680 debacle where mid-range became flagship, I can now get 2 high-end midrange for that much, or 1 super premium flagship. Going with the flagship, and I'm happy! :D
  • D. Lister - Tuesday, March 17, 2015 - link

    @chizow
    "It's 980Ti with double the VRAM"
    Yes, pretty much - a Ti GPU, with more VRAM than necessary, with the price tag of a Titan.
    "I agree the pricing has gotten ridiculous on these graphics cards, but this is the market we live and play in now."
    The market is the way it is because we, consumers, let it be that way, through our choices. For us to obediently accept, at any time, overpricing as an acceptable trend of the market, is basically like agreeing with the fox who wants to be a guard for our henhouse.
  • chizow - Wednesday, March 18, 2015 - link

    Except the 780Ti came much later; it was the 3rd GK110 chip to be released, so there is a premium on that time and money. While this is the 1st GM200 based chip, no need to look any further beyond it. Also, how many 780Ti owners complained about not enough VRAM? Looks like Nvidia addressed that. There's just no compromises with this card, it's Nvidia's best foot forward for this chip and only 6 months after GTX 980. No complaints here, and I had plenty when Titan launched.

    Sure the market is this way partially because we allow it, but the reality is, the demand is overwhelmingly there. I was thoroughly against paying $1000 for what I used to get for $500-$650 for Nvidia's big chip flagship card with the original Titan, but the reality is, Nvidia has raised the bar on all fronts (and AMD has done well also) and they are looking to be rewarded for doing so. I used to buy 2x cards before because 1 just wasn't good enough. Now, 1 is good enough, so I don't mind paying the same amount for that relative level of performance and enjoyment.
  • D. Lister - Wednesday, March 18, 2015 - link

    @chizow
    "Except the 780Ti came much later, ...... plenty when Titan launched."
    Both the 780Ti and the Titan X were released exactly when Nvidia needed them in the market. For the 780Ti, the reason was to challenge the 290X for the top spot. The Titan X was made available sooner because a) Nvidia needed the positive press after the 970 VRAM fiasco and b) Nvidia wanted to take some attention away from the recent 3xx announcements by AMD.

    Hence I really can't find any logical reason to agree with your spin that the Nvidia staff was doing overtime as some sort of a public service, and so deserves some reward for their noble sacrifices.

    "Sure the market is this way partially because we allow it, but the reality is, the demand is overwhelmingly there. I was thoroughly against paying $1000 for what I used to get for $500-$650 for Nvidia's big chip flagship card with the original Titan, but the reality is, Nvidia has raised the bar on all fronts (and AMD has done well also) and they are looking to be rewarded for doing so. I used to buy 2x cards before because 1 just wasn't good enough. Now, 1 is good enough, so I don't mind paying the same amount for that relative level of performance and enjoyment."
    http://media2.giphy.com/media/13ayyyRnHJKrug/giphy...
  • chizow - Monday, March 23, 2015 - link

    Uh, you make a lot of assumptions while trying to dismiss the fact there is a huge difference in time to market and relative geography on Nvidia's release timeline for Titan X, and that difference carries a premium to anyone who observed or felt burned by how Titan and Kepler launches played out over 2012, 2013, 2014.

    Fact remains, Titan X is the full chip very close to the front of Maxwell's line-up release, while the 780Ti came near the end of Kepler's life cycle. The correct comparison is if Nvidia launched Titan Black in 2013 instead of the original Titan, because that's what Titan X is.

    The bolded portion should be pretty easy to digest, not sure why you are having trouble with it. Nvidia's advancement on the 28nm node has been so good (someone showed a 4x increase from the 40nm GTX 480 to the Titan X, which is damn amazing on the same node) and the relatively slow advancement in game requirements mean I no longer need 2 GPUs to push the game resolutions and settings I need. A single, super flagship card is all I need, and Nvidia has provided just that with the Titan X.

    For those who don't think it is worth it, you can always wait for something cheaper and faster to come along, but for me, I'm good until Pascal in 2016 (maybe? Oh wait, don't need to worry about that).
  • chizow - Tuesday, March 17, 2015 - link

    Bit of a sidenote, but wow looks like 980 SLI scaling has REALLY improved in the last few months. I don't recall it being that good at launch, but that's not a huge surprise given Maxwell was a new architecture and has gone through a number of big (on paper) driver improvements. Looks really good though, made it harder to go with the Titan X over a 2nd 980 for SLI, but I think I'll be happier this way for now.
  • mdriftmeyer - Tuesday, March 17, 2015 - link

    Buy these like hotcakes. And when the R9 390/390X arrives in June, I'll pick either up and laugh at all that used hardware being dumped on eBay.
  • TheJian - Tuesday, March 17, 2015 - link

    You're assuming they'll beat this card, and I doubt you'll see them in June as the channel is stuffed with AMD's current stuff. I say Q3, and it won't be as good as you think. HBM will cause pricing issues and won't net any perf (it isn't needed, bandwidth isn't a problem, so it's wasted extra cost here), so the GPU will have to win on its own vs. NV. You'd better hope AMD's is good enough to sell like hotcakes, as they really need the profits finally. This quarter is already wasted and will most likely result in a loss, and NV is good for the next 3 months at least until something competitive arrives, at which point NV just drops pricing, eating any chance of AMD profits anyway. AMD has a very tough road ahead, and console sales drop due to mobile closing the gap at 16/14nm for xmas (good enough, that is, to have some say screw a console this gen, and screw $60 game pricing - go android instead).
