Intel's Ivy Bridge Architecture Exposed
by Anand Lal Shimpi on September 17, 2011 2:00 AM EST
Five years ago Intel announced its ambitious tick-tock release cadence. We were doubtful that Intel could pull off such an aggressive schedule, but with the exception of missing a few months here or there, tick-tock has been a success. On years marked by a tick, Intel introduces a new manufacturing process; tock years keep the manufacturing process the same and introduce a new microprocessor architecture. To date we've had three tocks (Conroe, Nehalem, Sandy Bridge) and two ticks (Penryn, Westmere). Sampling by the end of this year and shipping in the first half of next year will be Intel's third tick: Ivy Bridge.
Ivy Bridge (IVB) is the first chip to use Intel's 22nm tri-gate transistors, which will help scale frequency and reduce power consumption. As we already mentioned, mobile Ivy Bridge will be the first Intel CPU to bring four cores into a 35W TDP.
At a high level Ivy Bridge looks a lot like Sandy Bridge. It's still a monolithic die that features an integrated GPU. The entire die is built at 22nm, continuing Intel's march towards truly addressing integrated graphics performance. Ivy Bridge won't get rid of the need for a discrete GPU but, like Sandy Bridge, it is a step in the right direction.
Intel hasn't announced die size, but transistor count has increased to approximately 1.4 billion (layout), up from 1.16 billion in Sandy Bridge: a 20.7% increase. With perfect scaling, a 22nm Sandy Bridge die would be 47.3% the area of its 32nm counterpart. Even with the increase in transistor count, it's a good bet that Ivy Bridge will be noticeably smaller than Sandy Bridge.
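Those percentages are easy to sanity-check; here's a quick back-of-the-envelope calculation (an illustrative sketch, not from the original article):

```python
# Back-of-the-envelope check of the transistor count and die area claims.
snb_transistors = 1.16e9   # Sandy Bridge (layout)
ivb_transistors = 1.40e9   # Ivy Bridge (layout, approximate)

growth = ivb_transistors / snb_transistors - 1
print(f"Transistor count increase: {growth:.1%}")            # ~20.7%

# Ideal area scaling goes with the square of the feature size ratio.
area_ratio = (22 / 32) ** 2
print(f"Ideal 32nm -> 22nm area scaling: {area_ratio:.1%}")  # ~47.3%
```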
Motherboard & Chipset Support
Ivy Bridge is backwards compatible with existing LGA-1155 motherboards, although there will be a new chipset for Ivy Bridge and new motherboards to enable some features (e.g. PCI Express 3.0, native USB 3.0). The new chipset family falls under the 7-series banner. We'll see Z77, Z75, H77, Q77, Q75 and B75 available at or around launch.
Chipset Comparison

| | Z77 | Z75 | H77 | Z68 | P67 | H67 |
|---|---|---|---|---|---|---|
| CPU Support | IVB LGA-1155 | IVB LGA-1155 | IVB LGA-1155 | SNB/IVB LGA-1155 | SNB/IVB LGA-1155 | SNB/IVB LGA-1155 |
| CPU Overclocking | Yes | Yes | No | Yes | Yes | No |
| CPU PCIe Config | 1 x16, 2 x8, or 1 x8 + 2 x4 (PCIe 3.0) | 1 x16 or 2 x8 (PCIe 3.0) | 1 x16 (PCIe 3.0) | 1 x16, 2 x8, or 1 x8 + 2 x4 (PCIe 3.0) | 1 x16 or 2 x8 (PCIe 3.0) | 1 x16 (PCIe 3.0) |
| Processor Graphics Support | Yes | Yes | Yes | Yes | No | Yes |
| Intel SRT (SSD Caching) | Yes | No | Yes | Yes | No | No |
| RAID Support | Yes | Yes | Yes | Yes | Yes | Yes |
| USB Ports (USB 3.0 Capable) | 14 (4) | 14 (4) | 14 (4) | 14 | 14 | 14 |
| SATA Ports (6Gbps Capable) | 6 (2) | 6 (2) | 6 (2) | 6 (2) | 6 (2) | 6 (2) |
| Chipset PCIe Lanes | 8 (5 GT/s) | 8 (5 GT/s) | 8 (5 GT/s) | 8 (5 GT/s) | 8 (5 GT/s) | 8 (5 GT/s) |
As I mentioned above, the Ivy Bridge platform finally supports USB 3.0 natively. The consumer 7-series chipsets feature 14 total USB ports, 4 of which are USB 3.0 capable. The CPU itself features 16 PCIe Gen 3 lanes (configurable as 1 x16, 2 x8, or 1 x8 + 2 x4) to be used for graphics and/or high performance IO. You will only see Gen 3 speeds on qualified motherboards; it's technically possible on 6-series boards but only guaranteed on 7-series boards. The Z77 and H77 chipsets will support Intel's Smart Response Technology (SRT, aka SSD caching), which is a Z68 exclusive today.
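To put the Gen 3 jump in perspective, per-lane bandwidth roughly doubles: PCIe 3.0 runs at 8 GT/s with 128b/130b encoding, versus 5 GT/s with 8b/10b encoding for PCIe 2.0. A quick illustrative calculation using those standard spec figures (not data from the article):

```python
# Rough per-lane PCIe bandwidth, from line rate and encoding overhead.
def lane_bandwidth_gbytes(gt_per_s: float, payload_bits: int, total_bits: int) -> float:
    """Usable bandwidth of a single lane in GB/s."""
    return gt_per_s * payload_bits / total_bits / 8  # bits -> bytes

gen2 = lane_bandwidth_gbytes(5.0, 8, 10)      # PCIe 2.0: 8b/10b encoding
gen3 = lane_bandwidth_gbytes(8.0, 128, 130)   # PCIe 3.0: 128b/130b encoding

print(f"PCIe 2.0: {gen2:.3f} GB/s per lane, {16 * gen2:.1f} GB/s for x16")
print(f"PCIe 3.0: {gen3:.3f} GB/s per lane, {16 * gen3:.1f} GB/s for x16")
```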
SATA and chipset-attached PCIe lane counts haven't changed. Overclocking is supported on all Z-chipsets, while the H-chipsets don't allow it. All 7-series chipsets support Intel's processor graphics, a departure from the Sandy Bridge lineup where P67 didn't.
97 Comments
Arnulf - Sunday, September 18, 2011
"Voltage changes have a cubic affect on power, so even a small reduction here can have a tangible impact."P = V^2/R
Quadratic relationship, rather than cubic ?
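Both readings can be defended. P = V^2/R describes a static load and is quadratic in V, but dynamic CMOS power is P ≈ α·C·V²·f, and since attainable clock frequency scales roughly linearly with voltage, lowering V (and f with it) gives a roughly cubic power reduction. A minimal sketch, assuming the first-order f ∝ V approximation:

```python
# Dynamic CMOS power: P = alpha * C * V^2 * f. If attainable frequency
# scales roughly linearly with voltage (a first-order approximation),
# scaling V and f together makes power fall off roughly with V^3.
def dynamic_power(v: float, f: float, alpha_c: float = 1.0) -> float:
    """Relative dynamic power; alpha and C folded into one constant."""
    return alpha_c * v**2 * f

base = dynamic_power(v=1.0, f=1.0)
# 10% lower voltage, frequency scaled with it (f ~ V): the cubic case.
print(f"V and f scaled: {dynamic_power(0.9, 0.9) / base:.3f}")  # 0.729 = 0.9^3
# 10% lower voltage at fixed frequency: the quadratic (P = V^2/R-like) case.
print(f"V only scaled:  {dynamic_power(0.9, 1.0) / base:.3f}")  # 0.810 = 0.9^2
```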
damianrobertjones - Sunday, September 18, 2011
" As we've already seen, introducing a 35W quad-core part could enable Apple (and other OEMs) to ship a quad-core IVB in a 13-inch system."Is Apple the only company that can release a 13" system?
medi01 - Monday, September 19, 2011
No. But it's the only one that absolutely needs to be commented on in an orgasmic tone in the US press (and a big chunk of the EU press too).

JonnyDough - Monday, September 19, 2011
They're the only ones who will market it with a flashy Apple logo light on a pretty aluminum case. Everyone knows that lightweight, pretty aluminum cases are a great investment on a system that is outdated after just a few years. I wish Apple would make cars instead of PCs so we could bring the DeLorean back. Something about that stainless steel body just gets me so hot. Sure, it would get horrible gas mileage and be less safe in an accident. But it's just so pretty! Plus, although it would use a standard engine made by Ford or GM under the hood, its drivers would SWEAR that Apple builds its own superior hardware!

cldudley - Sunday, September 18, 2011
Am I the only one who thinks Intel is really wasting a lot of time and money on improvements to their on-die GPU? They keep adding features and improvements to the onboard video, right up to including DirectX 11 support, but isn't this really all an exercise in futility?

Ultimately a GPU integrated with the CPU is going to be bottlenecked by the simple fact that it does not have access to any local memory of its own. Every time it rasterizes a triangle or performs a texture operation, it is doing it through the same memory bus the CPU is using to fetch instructions, read and write data, etc.
I read that the GPU is taking a larger proportion of the die space in Ivy Bridge, and all I see is a tragic waste of space that would have been better put into another core (or pair of cores?) or more L1/L2 cache.
I can see the purpose of integrated graphics in the lowest-end SKUs for budget builds, and there are certainly power and TDP advantages, and things like Quick-Sync are a great idea, but why stuff a GPU in a high-end processor that will be blown away by a comparatively middle-of-the-road discrete GPU?
Death666Angel - Sunday, September 18, 2011
I disagree. AMD has shown that on-die GPUs can already compete with middle-of-the-road discrete graphics in notebooks. Trinity will probably take on middle-of-the-road discrete cards in the current desktop space.

Your memory bandwidth argument doesn't seem to be correct, either. Except for some AMD mainboard graphics with dedicated sideport memory, all IGPs use system RAM, and a lot of them are doing fine. It is also nice to finally see higher-clocked RAM being taken advantage of (see Llano with DDR3-1600 vs. DDR3-1866). DDR4 will add bandwidth as well.
Once the bandwidth becomes a bottleneck, you can address that, but at the moment Intel doesn't seem to be there yet, so they keep addressing their other GPU issues. What is wrong with that?
Also, how many people who buy high-end CPUs end up gaming 90% of the time on them? A lot of people need high-end CPUs for work-related stuff: coding, CAD, etc. Why should they have to buy a discrete graphics card?
Overall, you are doing a lot of generalization and you don't take into account quite a few things. :-)
cldudley - Sunday, September 18, 2011
Ironically, I spend lots of time in AutoCAD, and a discrete graphics board makes a tremendous difference. Gamer-grade stuff is usually not the best thing in that arena, though; it needs to be the special "workstation" cards (Quadro or FireGL), which have very different drivers.

I agree with you on the work usage, and gaming workloads not being 90% of the time, but on the other hand, workstations tend to have Xeons in them, with discrete graphics cards.
platedslicer - Sunday, September 18, 2011
As a fraction of the computer market, buyers who want power over everything else have plunged. Mobility is so important for OEMs now that fitting already-existent performance levels into smaller, cheaper devices becomes more important than pushing the envelope. I still remember a time when hardly anybody gave a rat's ass about how much power a CPU consumed as long as it didn't melt down. Today, power consumption is a crucial factor due to battery life and heat.

Personally, these developments make me rather sad, partly because I like ever-shinier games, and (more importantly) because seeing the unwashed masses talk about computers as if they were clothing brands makes me want to rip out their throats. That's how the world works, though. Hopefully the chip makers will realize that there's still a market for power over fluff.
Looking at it on the bright side, CPU power stagnation might make game designers pay more attention to content. Hey, you have to look on the bright side of life.
KPOM - Monday, September 19, 2011
I think that's largely because, for the average consumer, PCs have reached the point where CPU capabilities are no longer the bottleneck. Look at the success of the 2010 MacBook Air, which had a slow C2D but a speedy SSD, and sold well enough to last into mid-2011. Games are the next major hurdle, but that's the GPU rather than the CPU, and hence the reason it receives a bigger focus in Ivy Bridge (as it also did in Sandy Bridge compared to Westmere).

The emphasis now is on having the power we have last longer and be available in smaller, more portable devices.
JonnyDough - Monday, September 19, 2011
You're missing the point. They aren't trying to beef up the power of the CPU; CPUs are already quite powerful for most tasks. They are trying to lower energy usage and sell en masse to businesses that use thousands of computers.