Late last week we pulled back the covers on Intel's next-generation Core architecture update: Sandy Bridge. Due out in Q1 2011, Sandy Bridge will be the first high performance monolithic CPU/GPU from Intel, and we learned a lot about its performance in our preview: it was generally noticeably faster than the present generation of processors, on both the CPU and GPU side. If you haven't read the preview yet, I'd encourage you to do so.

One of the questions we got in response to the article was: what about Sandy Bridge for notebooks? While Sandy Bridge is pretty significant for mainstream quad-core desktops, it's even more tailored to the notebook space. I've put together some spec and roadmap information for those of you who might be looking for a new notebook early next year.

Mobile Sandy Bridge

Like the desktop offering, mobile Sandy Bridge will arrive sometime in Q1 of next year. If 2010 was any indication of what's to come, we'll see both mobile and desktop parts launch at the same time around CES.

The mobile Sandy Bridge parts are a little more straightforward in some areas but more confusing in others. The biggest problem is that both dual and quad-core parts share the same brand; in fact, the letter Q is the only indication that the Core i7 2720QM is a quad-core and the Core i7 2620M isn't. Given AMD's Bulldozer strategy, I'm sure Intel doesn't want folks worrying about how many cores they have - just that higher numbers mean better things.

Mobile Sandy Bridge CPU Comparison
| Processor | Base Frequency | L3 Cache | Cores / Threads | Max Single Core Turbo | Memory Support | Intel Graphics EUs | Intel HD Graphics Frequency / Max Turbo | TDP |
|---|---|---|---|---|---|---|---|---|
| Core i7 2920XM | 2.5GHz | 8MB | 4 / 8 | 3.5GHz | DDR3-1600 | 12 | 650 / 1300MHz | 55W |
| Core i7 2820QM | 2.3GHz | 8MB | 4 / 8 | 3.4GHz | DDR3-1600 | 12 | 650 / 1300MHz | 45W |
| Core i7 2720QM | 2.2GHz | 6MB | 4 / 8 | 3.3GHz | DDR3-1600 | 12 | 650 / 1300MHz | 45W |
| Core i7 2620M | 2.7GHz | 4MB | 2 / 4 | 3.4GHz | DDR3-1600 | 12 | 650 / 1300MHz | 35W |
| Core i5 2540M | 2.6GHz | 3MB | 2 / 4 | 3.3GHz | DDR3-1333 | 12 | 650 / 1150MHz | 35W |
| Core i5 2520M | 2.5GHz | 3MB | 2 / 4 | 3.2GHz | DDR3-1333 | 12 | 650 / 1150MHz | 35W |

You'll notice a few changes compared to the desktop lineup. Clock speeds are understandably lower, and all launch parts have Hyper Threading enabled. Mobile Sandy Bridge also officially supports up to DDR3-1600 while the desktop CPUs top out at DDR3-1333 (though running them at 1600 shouldn't be a problem assuming you have a P67 board).

The major difference between mobile Sandy Bridge and its desktop counterpart is that all mobile SB launch SKUs have two graphics cores (12 EUs), while only some desktop parts have 12 EUs (it looks like the high-end K SKUs will). The base GPU clock is lower, but it can turbo up to 1.3GHz, higher than most desktop Sandy Bridge CPUs. Note that the GPU we tested in Friday's preview had 6 EUs, so mobile Sandy Bridge should be noticeably quicker as long as we don't run into memory bandwidth issues. Update: Our preview article may have actually used a 12 EU part; we're still trying to confirm!

Even if we only get 50% more performance out of the 12 EU GPU, that'd be enough for me to say that there's no need for discrete graphics in a notebook - as long as you don't use it for high-end gaming.
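As a back-of-envelope illustration (not a benchmark), here's what the on-paper math looks like, assuming GPU throughput scales linearly with EU count times clock and ignoring memory bandwidth entirely. The 6 EU baseline comes from the previewed part; the 1350MHz baseline turbo clock is an assumption, not a confirmed spec:

```python
# Hypothetical peak-throughput scaling: EUs x clock, relative to a baseline.
# This deliberately ignores memory bandwidth and driver efficiency.
def relative_gpu_throughput(eus, clock_mhz, base_eus, base_clock_mhz):
    """Return peak throughput relative to the baseline configuration."""
    return (eus * clock_mhz) / (base_eus * base_clock_mhz)

# Mobile SNB launch SKUs: 12 EUs at up to 1300MHz (from the table above).
# Baseline: the previewed 6 EU part; 1350MHz max turbo is an assumption.
speedup = relative_gpu_throughput(12, 1300, 6, 1350)
print(round(speedup, 2))  # ~1.93x on paper, before bandwidth limits kick in
```

If even half of that paper gain survives the memory bandwidth wall, the 50% figure above is comfortably met.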

While Arrandale boosted multithreaded performance significantly, Sandy Bridge is going to offer an across the board increase in CPU performance and a dramatic increase in GPU performance. And from what I've heard, NVIDIA's Optimus technology will work with the platform in case you want to do some serious gaming on your notebook.

The Roadmap

  • IntelUser2000 - Monday, August 30, 2010 - link

    Holy crap. It doesn't make sense, but 650/1300MHz with 12 EUs seems amazing on mobile. I'd WAG the 650/1300MHz IGP is clock speed equivalent to a 1.1GHz "stable" frequency GPU, while the 850/1350MHz on desktop is equal to 1.25GHz.

    Still, the 12 EUs are a big thing. So the question is, 1 vs. 2 GPU cores: does that really just mean EU count, or more than EUs? If it's just EUs, the difference might be much smaller than 50%.
  • CharonPDX - Monday, August 30, 2010 - link

    The difference is that the mobile space has a lower power threshold to hit; and Intel would like to muscle discrete GPUs out of the equation completely in both the low end and midrange.

    So by offering such aggressive turbo (plus all SKUs having more EUs), they make it more likely that midrange systems will forgo discrete GPUs. For example, we may very well see Intel integrated graphics become the standard even on the highest-end Apple products, with a discrete GPU solely as an additional-cost option.

    In the mobile space, you can trade off CPU power usage for GPU power usage more readily. On the desktop, you don't need to do that nearly as much, with the greater power envelope, combined with the reasonably common acceptance of a low-end discrete GPU on mid-range systems.

    For the majority of games, even two CPU cores aren't running at full load, so the CPU half could run at stock speeds, possibly even idling one or two cores on the quad-core parts, while the GPU could take up all the power headroom to boost itself up all the way. Once you're back at your desktop doing some video compression, though, the GPU can drop to idle, and all (potentially four) cores of the CPU can take all the power they can muster to get that over with.

    Now, if you play a game that fully stresses all your CPU cores *AND* your GPU, though... you end up with a mess.

    I can't wait to see benchmarks of CPU-heavy games, tweaked with different settings. You'll probably find that settings that should be purely CPU-intensive will drop your framerate noticeably, as the GPU doesn't have the headroom to clock up when the CPU is drawing the power.
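The power-sharing scenario described in the comment above can be sketched as a toy TDP allocator. This is purely illustrative: the real turbo policy lives in firmware and isn't public, and the watt figures below are made up for the example:

```python
# Toy model of sharing one package TDP between CPU and GPU.
# Policy here: grant the CPU its demand first, then give the GPU
# whatever headroom remains. (Real hardware is far more dynamic.)
def split_tdp(tdp_w, cpu_demand_w, gpu_demand_w):
    """Return (cpu_grant, gpu_grant) in watts under a shared budget."""
    cpu_grant = min(cpu_demand_w, tdp_w)
    gpu_grant = min(gpu_demand_w, tdp_w - cpu_grant)
    return cpu_grant, gpu_grant

# Lightly threaded game: idle CPU cores leave headroom for GPU turbo.
print(split_tdp(45, 15, 30))  # -> (15, 30): GPU turbos fully
# Video encode: the CPU grabs the budget and the GPU is starved.
print(split_tdp(45, 40, 30))  # -> (40, 5)
```

The second case is exactly the "mess" scenario: when both sides demand full power at once, something has to lose.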
  • IntelUser2000 - Monday, August 30, 2010 - link

    You don't see a full-fledged GeForce GTX 480 in a laptop because the thermals don't allow it. If you give the mobile variant a faster product, it just means your desktop part is crippled and not showing its full potential.

    They can't take such a chance when Llano's GPU is rumored to be substantial. They need all the advantages they can get.

    About Turbo: I think it's too early to judge how it'll do in CPU-intensive games. If implemented well it can work pretty well. The Turbo driver for graphics on Arrandale has a frame rate counter. If they can see that boosting the GPU works better than boosting the CPU, it can be programmed so GPU Turbo kicks in more often.

    It's not "oh we'll let the software decide what to do", but rather "hmm most games require GPU power so we'll make the software Turbo more on GPU".
  • IntelUser2000 - Monday, August 30, 2010 - link

    Sorry for the double post.

    Everything else is good for Anandtech except for the retarded posting system.
  • iwod - Tuesday, August 31, 2010 - link

    Unless Apple and Intel develop OpenCL drivers for this new Intel IGP, Apple using it as the sole GPU in their laptops will be out of the question.
  • Calin - Tuesday, August 31, 2010 - link

    A game that fully stresses both the CPU and the GPU belongs on a gaming rig (or a gaming laptop), not a regular laptop.
    This "powerful integrated graphics" will also greatly simplify the cooling (one heat sink, one fan) and the power delivery (probably cutting the number of components in half). Along with genuinely "fast enough" graphics, it looks like NVIDIA is out of the market for midrange laptop graphics (and AMD/ATI is also out of midrange laptop graphics for everything outside its own processors/chipsets).
  • bennyg - Thursday, September 2, 2010 - link

    lol. My G51J has only 1 fan for 100W+ worth of i7-720 CPU and GTX 260M GPU and manages to keep it below boiling point. Not by much, mind you...

    Midrange graphics won't be touched. We're only talking about 2-3x the power of current integrated graphics, which only puts the low-end dedicated GPUs under threat, where arguably the feature set is more of a selling point than grunt anyway. If these IGPs can match on feature set, then we'll see the "6400"/"6500"/"410M" et al. have a tough time justifying their existence.
  • SteelCity1981 - Monday, August 30, 2010 - link

    I'm surprised Intel didn't name the 2720QM or 2820QM something along the lines of 2750QM or 2850QM, considering the clock speeds are a bit faster than the original 720QM's and 820QM's, which sounds like a more evolutionary step to the next level beyond the 740QM and 840QM. Also, I take it that Q4 2011 will probably see a refresh of the 2nd gen mobile Core i-series lineup, much like the current mobile Core i-series saw this year.
  • bennyg - Thursday, September 2, 2010 - link

    740/840/940 wasn't a refresh. It was a rebadging of the kind we hate Nvidia for.

    It was just a redrawing of the lines between speed bins. They are EXACTLY the same chips as were used in the 720/820/920, just with a one-step higher multiplier across the board.

    I also see the confusion everywhere... go onto any forum where people ask "what notebook should I buy" and you'll see an "i5 vs i7" or "dual vs quad" thread on a daily basis with a totally confused OP, and sometimes the replies are even more wrong.
