Introducing the NVIDIA GeForce 700M Family

With spring now well under way and the launch of Intel’s Haswell chips pending, OEMs always like to have “new” parts across the board, and so once more we’re getting a new series of chips from NVIDIA, the 700M parts. We’ve already seen a few laptops shipping with the 710M and GT 730M; today NVIDIA is filling out the rest of the 700M family. Last year saw NVIDIA’s very successful launch of mobile Kepler, and since then the balance of laptops shipping with NVIDIA dGPUs versus AMD dGPUs appears to have shifted even further in NVIDIA’s favor.

Not surprisingly, with TSMC still on 28nm, NVIDIA isn’t launching a new architecture; instead, they’ll be tweaking Kepler to keep it going through 2013. Today's launch of the various 700M GPUs is thus very similar to what we saw with the 500M launch: everything in general gets a bit faster than the previous generation. To improve Kepler, NVIDIA is making a few moderate tweaks to the existing architecture, improving their drivers (which will also benefit existing GPUs), and as usual continuing to proclaim the advantages of Optimus Technology.

Starting on the software side of things, we don’t really have anything new to add on the Optimus front, other than to say that in our experience it continues to work well on Windows platforms (Linux users may feel otherwise, naturally). On the bright side, projects like Bumblebee appear to be helping the situation, so it's at least possible to use both the dGPU and iGPU under Linux. As far as OEMs go, Optimus has now matured to the point where I can't immediately come up with a new laptop that has an NVIDIA GPU and doesn't support Optimus; at this point, an NVIDIA-equipped laptop inherently implies Optimus support.
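For Linux users who want to experiment with the Bumblebee approach mentioned above, the usual pattern is to launch individual programs through Bumblebee's optirun wrapper so they render on the dGPU. Here's a minimal sketch of that idea; it assumes Bumblebee (and thus optirun) is installed, and the run_on_dgpu helper is our own illustration, not part of Bumblebee itself:

```python
import shutil
import subprocess

def run_on_dgpu(command):
    """Launch a program on the NVIDIA dGPU via Bumblebee's optirun
    wrapper when it's available; otherwise run it normally on the iGPU."""
    if shutil.which("optirun"):  # is Bumblebee's wrapper installed?
        command = ["optirun"] + command
    return subprocess.call(command)

# Example: glxgears renders on the dGPU if optirun is present.
run_on_dgpu(["glxgears"])
```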

The second software aspect is NVIDIA’s GeForce Experience software, which allows for automatic game configuration based on your hardware. You can see the full slide deck in the gallery at the end with a few additional details, but in short GeForce Experience is a new software tool designed to automatically adjust supported games to the “best quality for your hardware” settings. This may not seem like a big deal for enthusiasts, but for the average Joe who doesn’t know what all the technical names mean (e.g. antialiasing, anisotropic filtering, specular highlighting, etc.), it’s a step towards making PCs more gamer friendly: more like a console experience, only with faster hardware. ;-) GeForce Experience is already in open beta with over 1.5 million downloads and counting, so it’s definitely something people are using.
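Conceptually, the tool maps detected hardware to per-game quality presets. The toy sketch below illustrates the lookup idea only; the presets and settings here are invented for this example, and NVIDIA's real values come from their own per-game testing:

```python
# Toy illustration of hardware-based preset selection; these presets
# are invented for this example, not NVIDIA's actual per-game data.
PRESETS = {
    "GeForce GT 730M":  {"resolution": "1366x768",  "aa": "FXAA",    "aniso": 4},
    "GeForce GTX 680M": {"resolution": "1920x1080", "aa": "4x MSAA", "aniso": 16},
}

SAFE_DEFAULT = {"resolution": "1280x720", "aa": "off", "aniso": 2}

def optimal_settings(gpu_name):
    """Return the quality preset for a detected GPU, or a safe default."""
    return PRESETS.get(gpu_name, SAFE_DEFAULT)

print(optimal_settings("GeForce GT 730M"))
```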

Finally, NVIDIA has added GPU Boost 2.0 to the 700M family. This is basically the same feature found in the GeForce Titan, though with some tuning specific to mobile platforms as opposed to desktops. We’re told GPU Boost 2.0 uses the same core hardware as GPU Boost 1.0, with software refinements allowing for more fine-grained control of the clocks. Ryan has already covered GPU Boost 2.0 extensively, so we won’t spend much more time on it other than to say that over a range of titles, NVIDIA is seeing a 10-15% performance improvement relative to GPU Boost 1.0.
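There's no user-facing knob for Boost itself, but the clocks it produces are easy to observe. As a minimal sketch (assuming the pynvml Python bindings for NVIDIA's NVML management library are installed), the following polls the core and memory clocks; under load, Boost behavior shows up as the graphics clock shifting between samples:

```python
import time
import pynvml  # Python bindings for NVIDIA's NVML management library

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first NVIDIA GPU

# Sample the clocks a few times; as thermal and power headroom change,
# the graphics clock moves around while Boost does its thing.
for _ in range(5):
    core = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
    mem = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_MEM)
    temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
    print("core %d MHz, memory %d MHz, %d C" % (core, mem, temp))
    time.sleep(1)

pynvml.nvmlShutdown()
```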

Moving to the hardware side of things, the changes only apply to one of the chips. GK104 will continue as the highest performing option in the GTX 675MX and GTX 680M (as well as the GTX 680MX in the 27-inch iMac), and GK106 will likewise continue in the GTX 670MX (though it appears some 670MX chips also use GK104). In fact, for now NVIDIA isn’t announcing any new high-end mobile GPUs, so the GTX 600M parts will continue to fill that niche. The changes come for everything in the GT family, with some of the chips apparently continuing to use GK107 while a couple of options will utilize a new GK208 part.

While NVIDIA won’t confirm which parts use GK208, the latest drivers do refer to that part number, so we know it exists. GK208 looks to be largely the same as GK107, and we’re not sure if there are any real differences other than the fact that GK208 will be available as a 64-bit part. Given the similarity in appearance, it may serve as a 128-bit part as well. Basically, GK107 was never available in a 64-bit configuration, and GK208 remedies that (which actually makes it a lower-end chip relative to GK107).

Comments

  • Kevin G - Monday, April 1, 2013 - link

    What I'd like to see is an ExpressCard version of the low-end parts. I've been working with numerous business-class laptops with this expansion slot, and I've run into situations where I could use an additional display. I've used USB adapters, but they've been less than ideal. I fathom a low-clocked GK208 chip and a 64-bit wide memory bus could be squeezed into an ExpressCard form factor. I'd expect it to perform around the level of Intel's HD 4000, but that'd still be far superior to USB solutions.
  • arthur449 - Monday, April 1, 2013 - link

    While some ExpressCard slots give access to the PCI-E bus, the problem is that the laptop's BIOS/UEFI has to support the device in its whitelist. In almost every situation where people have modded their laptops and attached them to external GPUs, they had to flash a custom ROM to remove compatibility restrictions put in place to limit the amount of compatibility testing the vendor had to conduct.
  • rhx123 - Monday, April 1, 2013 - link

    A surprisingly small number of laptops needed modification to remove the whitelist on the ExpressCard slot, and it is possible to work around it with software pre-Windows if there is whitelisting.
    I did not have to deal with a whitelist on my Lenovo X220T.
  • JarredWalton - Monday, April 1, 2013 - link

    Cooling would require the majority of the GPU to exist outside of the slot if you go this route. I don't think you could properly route heat-pipes through the relatively thin slot opening with a radiator/fan on the outside. Once you go external, the number of people really interested in the product drops quite a bit, and you'd still need to power the device, so on most laptops without a dGPU I expect the external ExpressCard option would also require external power. At that point, the only real value is that you could have an external GPU hooked up to a display and connect your laptop to it for a semi-portable workstation.
  • Kevin G - Monday, April 1, 2013 - link

    It would be crazy to put any of these chips into an ExpressCard form factor without reducing power consumption. I was thinking of dropping the clock down to 400MHz and cutting power consumption further with a corresponding drop in voltage. It wouldn't have to break any performance records, just provide full acceleration and drive an external display.

    In hindsight, the GK208 may be too power hungry. The 28 nm Fermi parts (GF117?) should be able to hit the power and thermal allocations for ExpressCard without resorting to an external chassis.
  • Wolfpup - Tuesday, April 2, 2013 - link

    I like the IDEA of a connection to an external dock that allows ANY video card to be used (heck, why not go for SLI?), but notebooks would have to be able to support it, and it sounds like lots don't; plus, tons of notebooks don't have ExpressCard slots anymore (and I'm not sure whether the bandwidth would start being a bottleneck). (Or obviously Thunderbolt could theoretically pull this off too... IF you could just boot with any GPU installed and have the external GPU active by the time Windows boots, at least.)
  • rhx123 - Monday, April 1, 2013 - link

    You can make an external graphics card if you want; I have a 650 Ti desktop card attached through ExpressCard.

    It's powered by an Xbox PSU.

    http://imgur.com/239skMP
  • rhx123 - Monday, April 1, 2013 - link

    It can drive the internal laptop display through Optimus.
  • Flunk - Monday, April 1, 2013 - link

    Disappointing, this is a really small bump. Mostly a re-labelling of existing parts. Although I suppose it is to be expected seeing as almost all Geforce GT 640m LE-650ms can be clocked up to 1100Ghz with a little bit of bios hacking.
  • JarredWalton - Monday, April 1, 2013 - link

    Besides the fact that nothing runs at 1100GHz (or Ghz, whatever those are), I dare say you've exaggerated quite a bit. Many laptops with even moderate dGPUs run quite warm, and that's with the dGPUs hitting a max clock of around 900MHz (GT 650M with DDR3 and a higher clocked core as opposed to GDDR5 with a lower clocked core). If you manage to hack the VBIOS for a laptop to run what is supposed to be a 500MHz part at 1GHz or more, you're going to overload the cooling system on virtually every laptop I've encountered.

    In fact, I'll go a step further and say that with very few exceptions, overclocking of laptops in general is just asking for trouble, even when the CPU supports it. I tested old Dell XPS laptops with Core 2 Extreme CPUs that could be overclocked, and the fans would almost always be at 100% under any sort of load as soon as you started overclocking. Long-term, that sort of thing is going to cause component failures far more quickly, and on laptops that cost well over $2000 I think most would be quite angry if it failed after a couple years.

    If you understand the risks and don't really care about ruining a laptop, by all means have at it. But the number of laptops I've seen running stock that have heat dissipation issues urges extreme caution.
