129 Comments
Meteor2 - Wednesday, August 31, 2016 - link
D'oh, Apollo Lake, not Broxton.

ianmills - Tuesday, August 30, 2016 - link
It is interesting that the desktop CPUs will be released at the same time as Zen. Maybe Intel thinks Zen might be competitive and is waiting to set the final clocks until Zen's performance is known?

CaedenV - Tuesday, August 30, 2016 - link
More likely, Intel has been afraid to release anything that would put AMD out of business entirely... but at the same time, they are not about to let AMD get a foothold in the market ever again.

witeken - Tuesday, August 30, 2016 - link
Here is a more up-to-date graph of Intel's 14nm yield than the one shown on page 1: http://www.fudzilla.com/images/stories/2016/Januar...

As you can see, Intel didn't meet the forecast from the AnandTech graph. They were quite close in H2'14, but then the yield learning slowed down considerably. In the AnandTech image they forecasted parity in H1'15, but in their latest graph (mine is from November '15), they forecasted mid-2016 for yield parity with 22nm. From what I've heard from them, my guess is that this is fairly accurate. Yields should be pretty healthy by now.
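As a side note to the yield discussion above: defect-limited yield is often ballparked with a simple Poisson model, Yield = exp(-A * D0). The defect densities and die area below are made-up illustrative numbers, not Intel's actual data:

```python
import math

def die_yield(defect_density, die_area):
    """Poisson yield model: fraction of defect-free dies.

    defect_density: D0, defects per cm^2
    die_area: A, die area in cm^2
    """
    return math.exp(-die_area * defect_density)

# Hypothetical numbers for a ~1.2 cm^2 die as the process matures:
early = die_yield(0.3, 1.2)   # early ramp, higher defect density
mature = die_yield(0.1, 1.2)  # after yield learning
print(f"early yield:  {early:.1%}")
print(f"mature yield: {mature:.1%}")
```

Even a modest drop in defect density moves yield a lot, which is why "yield parity" with the previous node is worth tracking on a graph.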
lilmoe - Tuesday, August 30, 2016 - link
Ridiculous MSRPs as usual.

rahvin - Tuesday, August 30, 2016 - link
As always, but I would consider the lack of built-in USB 3.1 a stunning failure. KL was under design when the 3.1 standard was finalized. Mark my words: this time next year every single phone and tablet is going to be using Type-C. Though you can use the port without the 3.1 spec, it's a stunning failure on Intel's part not to integrate it when they are a key member of the USB forum.

Personally, I think they are doing it to push people toward Thunderbolt, but it makes their default product a failure that needs another chip to provide what will be default functionality in 12 months.
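For context on the USB 3.1 point above, the usable line rate can be estimated from the raw signaling rate minus the line-code overhead (8b/10b for USB 3.0, 128b/132b for USB 3.1 Gen 2, 128b/130b for PCIe 3.0). A rough back-of-the-envelope sketch:

```python
def usable_gbps(raw_gbps, payload_bits, total_bits):
    """Usable line rate after subtracting line-code overhead."""
    return raw_gbps * payload_bits / total_bits

usb30 = usable_gbps(5.0, 8, 10)         # USB 3.0: 5 Gb/s, 8b/10b -> 4.0 Gb/s
usb31_g2 = usable_gbps(10.0, 128, 132)  # USB 3.1 Gen 2: 10 Gb/s, 128b/132b -> ~9.7 Gb/s
pcie3_x4 = 4 * usable_gbps(8.0, 128, 130)  # PCIe 3.0, four lanes -> ~31.5 Gb/s

print(f"USB 3.0:       {usb30:.1f} Gb/s")
print(f"USB 3.1 Gen 2: {usb31_g2:.1f} Gb/s")
print(f"PCIe 3.0 x4:   {pcie3_x4:.1f} Gb/s")
```

So a single Gen 2 port needs roughly a quarter of a PCIe 3.0 x4 uplink's bandwidth, which puts some numbers behind the "bandwidth hungry" complaint.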
beginner99 - Wednesday, August 31, 2016 - link
USB-C and USB 3.1 are not the same thing. You can use the C connector with USB 3.0, or the old connector with USB 3.1.

doggface - Wednesday, August 31, 2016 - link
They could have supported USB 3.1, but it is a bandwidth-hungry monster that takes up die space. Instead they support PCIe 3.0 x4, and OEMs can pay for the extra USB 3.1 chip out of their own money. USB-C, on the other hand, can be hung off pretty much anything because it is just a connector.

someonesomewherelse - Thursday, September 1, 2016 - link
The only way you can expect lower prices/better chips on the desktop is if Zen is actually faster than AMD marketing claims. Think: i3 becoming 4c/8t, i5 8c/16t, i7 12c/24t, plus the GPU-less chip with extra cores/cache that PC enthusiasts want (call it i8 and give it 18c/36t plus as much extra L3 cache as fits in the space freed by removing the GPU). You could even have i6 (14c/28t) and i4 (6c/12t) parts using the same dies with defects in some of the cores/L3, and an i2 with 3c/6t or 4c/8t but an extremely high clock rate (5GHz+ base). Add to that: no more random removal of instructions (K-class chips missing some of the virtualization features, for example), a turbo limited only by actual heat and power, mandatory ECC, 4-channel memory (which means the high end now gets 8 channels), extra PCIe lanes, unlocked multipliers, and so on.

So if you are a religious person, start praying....
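The defect-harvesting idea above (selling the same die with faulty cores as lower SKUs) can be sketched with a simple binomial model. The 18-core die and the 97% per-core good probability are made-up illustrative numbers:

```python
import math

def binom_pmf(n, k, p):
    """Probability of exactly k successes in n independent trials."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def sku_split(cores, p_good, bins):
    """Probability a die lands in each SKU bin.

    bins: list of (name, min_good_cores), sorted best-first;
    a die goes into the best bin it qualifies for.
    """
    pmf = [binom_pmf(cores, k, p_good) for k in range(cores + 1)]
    out, assigned = {}, 0.0
    for name, need in bins:
        p = sum(pmf[need:]) - assigned  # qualifies here, not in a better bin
        out[name] = p
        assigned += p
    out["scrap"] = 1.0 - assigned
    return out

# Hypothetical 18-core die, each core defect-free with probability 0.97
split = sku_split(18, 0.97, [("i8 (18c)", 18), ("i6 (14c)", 14)])
for sku, p in split.items():
    print(f"{sku}: {p:.1%}")
```

Even with a few percent of cores failing, most dies are still sellable in some bin, which is exactly why this kind of lineup is attractive to a fab.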
someonesomewherelse - Thursday, September 1, 2016 - link
Obviously at the same/lower price as current i3/i5/i7 chips.