Introducing the NVIDIA GeForce 700M Family

With spring now well under way and Intel’s Haswell chips pending, OEMs like to have “new” parts across the board, and so once more we’re getting a new series of chips from NVIDIA: the 700M parts. We’ve already seen a few laptops shipping with the 710M and GT 730M; today NVIDIA is filling out the rest of the 700M family. Last year saw NVIDIA’s very successful launch of mobile Kepler, and since then the ratio of laptops shipping with NVIDIA dGPUs versus AMD dGPUs appears to have shifted even further in NVIDIA’s favor.

Not surprisingly, with TSMC still on 28nm, NVIDIA isn’t launching a new architecture; instead they’ll be tweaking Kepler to keep it going through 2013. Today’s launch of the various 700M GPUs is thus very similar to what we saw with the 500M launch: everything generally gets a bit faster than the previous generation. To improve Kepler, NVIDIA is taking the existing architecture and making a few moderate tweaks, improving their drivers (changes that will also apply to existing GPUs), and as usual they’re continuing to proclaim the advantages of Optimus Technology.

Starting on the software side of things, we don’t really have anything new to add on the Optimus front, other than to say that in our experience it continues to work well on Windows platforms; Linux users may feel otherwise, naturally. On the bright side, projects like Bumblebee appear to be helping the situation, so it’s now at least possible to utilize both the dGPU and iGPU under Linux. As far as OEMs go, Optimus has matured to the point where I can’t immediately come up with any new laptop that has an NVIDIA GPU and doesn’t support Optimus; we’re now at the point where an NVIDIA-equipped laptop inherently implies Optimus support.

The second software aspect is NVIDIA’s GeForce Experience, a new tool designed to automatically configure supported games to the “best quality for your hardware” settings. You can see the full slide deck in the gallery at the end with a few additional details. This may not seem like a big deal for enthusiasts, but for the average Joe who doesn’t know what all the technical names mean (e.g. antialiasing, anisotropic filtering, specular highlighting, etc.) it’s a step towards making PCs more gamer friendly, more like a console experience, only with faster hardware. ;-) GeForce Experience is already in open beta, with over 1.5 million downloads and counting, so it’s definitely something people are using.

Finally, NVIDIA has added GPU Boost 2.0 to the 700M family. This is basically the same as what’s found in GeForce Titan, though with some tuning specific to mobile platforms as opposed to desktops. We’re told GPU Boost 2.0 is the same core hardware as GPU Boost 1.0, with software refinements allowing for more fine-grained control of the clocks. Ryan has already covered GPU Boost 2.0 extensively, so we won’t spend much more time on it other than to say that over a range of titles, NVIDIA is getting a 10-15% performance improvement relative to GPU Boost 1.0.

Moving to the hardware, the changes only apply to some of the chips. GK104 will continue as the highest performing option in the GTX 675MX and GTX 680M (as well as the GTX 680MX in the 27-inch iMac), and GK106 will likewise continue in the GTX 670MX (though it appears some 670MX chips also use GK104). In fact, for now NVIDIA isn’t announcing any new high-end mobile GPUs, so the GTX 600M parts will continue to fill that niche. The changes come in the GT family, where some of the chips apparently continue to use GK107 while a couple of options will utilize a new GK208 part.

While NVIDIA won’t confirm which parts use GK208, the latest drivers do refer to that part number, so we know it exists. GK208 looks to be largely the same as GK107, and we’re not sure there are any real differences other than the fact that GK208 will be available in a 64-bit memory configuration. Given the similarity, it may serve as a 128-bit part as well. Basically, GK107 was never offered with a 64-bit memory interface, and GK208 remedies that (which effectively positions it as a lower end chip relative to GK107).

GeForce 700M Models and Specifications
Comments

  • JarredWalton - Monday, April 1, 2013 - link

    *Crickets* (again)

    And this is why I don't buy into such claims; I had some issues when Optimus first launched, but I haven't had any I can specifically pinpoint in the last year or more. If anyone has something under Windows that doesn't play right with Optimus, please post it here so I can verify the issue. Otherwise, with no concrete evidence, it's just FUD.
  • Wolfpup - Tuesday, April 2, 2013 - link

    VLC, for starters. I seriously have no idea how you can be unaware of issues with it; any big notebook forum has threads about it all the time, with people trying to disable Optimus.
  • mrhumble1 - Monday, April 1, 2013 - link

    Speaking of dishonest, you are not sticking to the facts. You obviously don't own a nice laptop with a 680M inside. I do, and it performs amazingly well. I output games from my laptop to my TV via HDMI and they look spectacular.

    You also obviously don't understand that desktop PCs (and their components) cannot be directly compared to laptops. I also highly doubt that a "dirt cheap" PC can run The Witcher 2 at almost Ultra settings at a playable framerate.
  • tviceman - Monday, April 1, 2013 - link

    You're missing out then if you like to game. I've got a 560M that still performs admirably, running many of today's games at max settings (no AA) at 1600x900 and 60fps. I'm spoiled by fast frame rates and decent graphics settings; I can't imagine playing games like BioShock Infinite on even the upcoming Haswell.
  • xTRICKYxx - Monday, April 1, 2013 - link

    I will never ever buy a laptop without a discrete card. Video cards from the 7770M/650M on up can play any game at 1920x1080 on high if there is a good enough CPU as well. Mobile graphics are starting to become powerful enough.

    Look at Intel CPUs. My i7-2600K at home is slightly slower than my i7-3720QM clock for clock.
  • jsilverstreak - Monday, April 1, 2013 - link

    680M SLI is about 10% faster than a 680,
    and the 780M's leaked benchmark was on par with a 660 Ti.
  • jsilverstreak - Monday, April 1, 2013 - link

    Oops, it's actually the 680 that the 780M is on par with,
    although that's a leaked benchmark.
  • nerd1 - Saturday, April 13, 2013 - link

    You obviously don't game, do you?
  • Mr. Bub - Wednesday, April 17, 2013 - link

    Sure, maybe most mainstream users who facebook and email all day on laptops don't need anything more than integrated graphics, but guys like me (engineering students) who actually do stuff on computers will still rely on discrete GPUs. I can't properly run any of my CAD software without a discrete GPU.
  • nerdstaz - Monday, April 22, 2013 - link

    http://www.notebookcheck.net/NVIDIA-GeForce-GTX-68...

    Your comparison is a bit off base <3
