lazarpandar - Thursday, May 8, 2014 - link
Am I missing something here?
nandnandnand - Thursday, May 8, 2014 - link
Welcome to the woooorld of tomoooorrrrooowww!!!
ddriver - Friday, May 9, 2014 - link
You are 40% welcome.
tipoo - Friday, May 9, 2014 - link
Fiscal quarter, not normal people quarter. Nothing to miss.
hamiltenor - Thursday, May 8, 2014 - link
Did I miss 2014, or did someone break an NDA?
Brett Howse - Thursday, May 8, 2014 - link
Funny you say that. Apparently NVIDIA did accidentally release their results early for this quarter. But FY 2014 is over for them :)
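For readers puzzled by the fiscal vs. calendar mismatch discussed here, a minimal sketch of the mapping, assuming NVIDIA's fiscal year starts on February 1 and is labeled one year ahead (the real fiscal periods end on the last Sunday of January, so exact boundaries shift by a few days each year):

```python
from datetime import date

def nvidia_fiscal_quarter(d: date) -> str:
    """Rough calendar-to-fiscal mapping, assuming the fiscal year starts Feb 1
    and is labeled one year ahead. (The actual fiscal year ends on the last
    Sunday of January, so real boundaries shift by a few days each year.)"""
    months_since_feb_2000 = (d.year - 2000) * 12 + (d.month - 2)
    fiscal_year = 2000 + months_since_feb_2000 // 12 + 1   # label runs ahead
    fiscal_quarter = (months_since_feb_2000 % 12) // 3 + 1
    return f"FY{fiscal_year} Q{fiscal_quarter}"

# The quarter ending in late April 2014 (reported in May 2014) is FY2015 Q1,
# while anything in January 2014 or earlier still falls in FY2014.
print(nvidia_fiscal_quarter(date(2014, 4, 27)))   # FY2015 Q1
print(nvidia_fiscal_quarter(date(2014, 1, 26)))   # FY2014 Q4
```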
tipoo - Thursday, May 8, 2014 - link
Fiscal quarter does run ahead of real life, but I'm not sure this is right.
Brett Howse - Friday, May 9, 2014 - link
It's correct. There's a link at the bottom of the article to the news release if you would like to confirm.
melgross - Monday, May 12, 2014 - link
Fiscal quarters can be anywhere in the year. They don't have to follow any calendar schedule. It depends on when the company incorporated. If they incorporated on January 1, then that's convenient. But if they incorporated on June 6, then that's the beginning of their fiscal year.
Morawka - Thursday, May 8, 2014 - link
Besides the date discrepancy, let's hear some good discussion on how this data compares to AMD's earnings, and what products you think will help Nvidia's sales grow.
I'm surprised Tegra 4 is doing so well, considering it's not in many devices. Surface 2, Shield, Tegra Note 7, and a few other AIOs is about it. Surface and Shield must be pushing those numbers up. Remember, K1 is not in these earnings.
Nvidia is showing perf per watt is important, not just price vs perf.
testbug00 - Friday, May 9, 2014 - link
NVidia is probably selling loads of Tegra chips due to a deal with a Chinese manufacturer plus the MS/HP deals.
Tegra SOLD a lot, but they don't talk about how much money it made... for good reasons. $139 million in sales does not mean anything if you spent $150-200 million+ on the division. Nvidia would never be willing to release profits for its Tegra division... mostly because it has none. Tegra has always been a par chip that Nvidia overinflated to OEMs/consumers. Doing that to OEMs cost them every large mobile OEM (except that Chinese one, but those chips were probably sold with very low margins, and that OEM had not used Tegra before).
Now that Tegra K1 has a nice GPU, they have a part they really can sell as beating the competition, but sadly, they have driven away most of their customers... I hope they make a SHIELD 2 (with K1); I would love to pick up one of those. The SHIELD's screen res is too low for me to justify (I pick out pixels really easily), but the device is really cool.
ams23 - Friday, May 9, 2014 - link
It is not just the GPU that is impressive in Tegra K1. The CPU is also impressive, with reportedly superior performance and superior performance per watt compared to Tegra 4 and Snapdragon 800.
FYI, NVIDIA does release data on Tegra profitability on an annual basis. NVIDIA does not release data on profitability by line of business on a quarterly basis. Anyway, that is now largely a moot point because R&D costs will be shared and leveraged between Tegra and other lines of business moving forward.
testbug00 - Friday, May 9, 2014 - link
No, Tegra 4 was awful in perf/watt compared to SD 800... this part is no different.
According to Nvidia themselves, it draws over 10W (well, they say <11) when "working hard".
it is probably 20% faster (CPU at that speed) and uses WAY more power (not sure how much is GPU, which is certainly faster than the 800)
ams23 - Friday, May 9, 2014 - link
LOL, you have no idea what you are talking about. See the measured application processor + DRAM perf per watt of Tegra K1 vs. Snapdragon 800: http://cdn.pcper.com/files/imagecache/article_max_... . Tegra K1's CPU is >1.3x more power efficient in this task!
Your <11W number is very misleading too. That is CPU+GPU+mem+I/O for a dev platform that is not mobile optimized. With very compute-heavy tasks, the Jetson TK1 dev kit consumes slightly north of 6W for application processor + mem. For a mobile-optimized TK1 variant, this would be closer to ~5W. Remember that this is >300 GFLOPS of GPU throughput!!!
At the end of the day, based on what we have seen so far, it is pretty clear that TK1 has superior CPU and GPU perf. per watt compared to S800, on the same fab. process node to boot.
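As a rough check of the numbers quoted in this exchange, a back-of-the-envelope GFLOPS-per-watt calculation; the ~6 W and ~5 W figures are the commenter's estimates (Jetson TK1 under compute-heavy load, and an assumed mobile-optimized envelope), not independently measured tablet numbers:

```python
# Back-of-the-envelope perf/watt from the figures quoted above. The wattages
# are the estimates stated in the comment, not measured tablet numbers.
gpu_gflops = 300.0        # ">300 GFLOPS" claimed GPU throughput for Tegra K1
jetson_tk1_watts = 6.0    # "slightly north of 6W" for AP + mem on the dev kit
mobile_tk1_watts = 5.0    # assumed mobile-optimized power envelope

print(f"Jetson TK1 dev kit:  ~{gpu_gflops / jetson_tk1_watts:.0f} GFLOPS/W")   # ~50
print(f"Mobile-optimized K1: ~{gpu_gflops / mobile_tk1_watts:.0f} GFLOPS/W")   # ~60
```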
tuxRoller - Friday, May 9, 2014 - link
Well, if Nvidia's slides tell us that...
testbug00 - Friday, May 9, 2014 - link
How many phones is Nvidia in with Tegra 4? How many is Qualcomm in with SD800?
NVidia has proven itself inept at making a low-power chip... Note that 11W is not the max power draw, that is "working hard"; the max power draw for the whole thing is probably closer to 15W... with at least 10W going to the SoC.
Why do you seem to think Nvidia managed to magically change from losing on perf/watt with Tegra, Tegra 2, Tegra 3 and Tegra 4 into slaughtering the competition with Tegra K1 (when compared to products that have been out and IN PRODUCTS for over a year, mind you)?
Remember, Tegra 4 was faster than Apple's 8-month-old iPad chip... Then Apple released a new chip... Tegra 4 was slower. It is not enough to beat competitors' chips.
Also, you do realize Tegra K1 will be competing with 20nm chips... right?
melgross - Monday, May 12, 2014 - link
Don't believe anything Nvidia says, or their controlled laboratory testing. None of their stuff ever meets their inflated numbers.
tuxRoller - Friday, May 9, 2014 - link
http://semiaccurate.com/2014/03/05/nvidias-tegra-k...
Also, the dev board has a frickin fan attached.
For comparison, the DragonBoards (the dev boards for Snapdragon) don't seem to have fans attached: http://mydragonboard.org/wp-content/uploads/2014/0...
So, yeah, it's easy to have great performance if you don't mind blowing your power budget.
melgross - Monday, May 12, 2014 - link
The problem with the Tegras, and Nvidia in general, is that their chips never meet the overinflated specs Nvidia releases for them before they hit the market. By the time they arrive, they are no longer what Nvidia has gotten people to expect from them, and other manufacturers, such as Apple and Qualcomm, have passed them by, except for a few specs.
That's why they aren't in a lot of devices; the OEMs are very well aware of all this.
phoenix_rizzen - Friday, May 9, 2014 - link
According to the article, the Tegra sales are up due to use in car infotainment packages, not phones/tablets.
jjj - Thursday, May 8, 2014 - link
In the call they said:
"It's about to be Tegra K1 season. I'm pretty excited about Tegra K1, and... we don't have anything to announce today, but hopefully as we go into the second half we'll see some pretty exciting products coming out from TK1."
So I wouldn't expect much from TK1 in Q2.
They really need to get a lot more aggressive with Tegra. Too low a volume is just rendering them irrelevant and can't really support all the costs of a custom core, baseband and so on. If they can get to $300 million per quarter and keep growing, it would most likely be okay-ish, but without that it seems futile to keep going without tuning the strategy.
grahaman27 - Thursday, May 8, 2014 - link
Where did you see that?
jjj - Thursday, May 8, 2014 - link
http://investor.nvidia.com/events.cfm
testbug00 - Friday, May 9, 2014 - link
Tegra is being killed off except perhaps in cars. They managed to drive away all large mobile OEMs, except the one Chinese OEM they kept by resorting to selling their chips at low margins.
Hard to be profitable when you have no way to sell high volume, and the pricing on mobile chips is so low.
Morawka - Friday, May 9, 2014 - link
Last time I heard, Nvidia is going to start selling their graphics IP, like PowerVR, ARM, etc... There would be a licensing cost, sure, but it would be up to device makers how to implement it into their products. We'll have to see if Nvidia is price competitive on their licensing agreements with the other ARM/x86 GPU makers.
tuxRoller - Friday, May 9, 2014 - link
Who would buy it? Vivante? I'm not sure how much of their IP is applicable to these very power-constrained environments. What I'd expect to happen is that new IP is going to be generated by Imagination, Qualcomm, ARM, Nvidia, and possibly Apple (if they do decide to go that route). What is trying to be done with mobile GPUs hasn't been attempted before, and I'd expect different approaches to reign.
Morawka - Friday, May 9, 2014 - link
Their architecture is leading the market in performance per watt, especially when you look at Maxwell's numbers, which is said to also be available for license. So yeah, Apple, Qualcomm, or even Samsung would have reason to buy it.
testbug00 - Friday, May 9, 2014 - link
Oh, really? How many licensees has Nvidia announced? Zero.
There are many reasons for that... One likely reason is that Nvidia probably does not hold the perf/watt crown. Tegra K1, not maxed out on CPU/GPU clocks, already uses "<11 W", which means over 10W. Source: https://pbs.twimg.com/media/Bl8_-GCCcAABW8r.jpg
Another one is that the terms of the license probably only let others start integrating the GPU once NVidia has taped out a SoC with it... Meaning, if NVidia's chip first taped out in early-to-mid 2013, your own SoC with that GPU would be coming late 2014 to mid 2015. By that time, if Nvidia had a perf/watt lead with said GPU, it would have lost it.
It also would require any company licensing it to share their roadmap and chip plans with Nvidia... Which is a big competitive no-no.
There is a reason few companies that make hardware manage to license non-essential things to competitors. It is a bad deal for the competitor in most cases.
Oh, and Maxwell tapes out on an ARM chip in late 2014 (at best), which means any launch for it in anyone's products but Nvidia's would be in 2016...
testbug00 - Friday, May 9, 2014 - link
I would also like to point out that Apple has a GPU design team. They are likely to try (and likely succeed at) designing a custom GPU at some point soon. So they have no reason to license from NVidia.
Neither does Samsung (same reason) or Qualcomm (who has WAY better low-power tech than NVidia, btw). Given the terms of the license are cheap, and NVidia is willing to optimize SMX for area, Rockchip or some low-end cheapo manufacturer (Rockchip sells 28nm quad cores for under $5 a pop) will license NVidia's mobile graphics IP.
ams23 - Friday, May 9, 2014 - link
Actually Apple has every reason to license NVIDIA's GPU technology if they desire to expand the reach of their in-house GPUs. NVIDIA probably has more fundamental graphics patents than any other company in the world. And with Kepler (including the Tegra K1 GPU, which is ~1.5x more power efficient than the best ultra-mobile GPUs available today) and especially Maxwell, NVIDIA arguably makes the most power efficient and highest performance GPUs in the world too.
iamlilysdad - Friday, May 9, 2014 - link
Curious about the source of your information.
grahaman27 - Friday, May 9, 2014 - link
Testbug00,
They have sold zero, mostly because they are not seeking licensees yet. The K1 is at least a year ahead of the competition in GPU performance and performance per watt. They will start licensing with next year's Maxwell architecture, which will only have better performance and crazy good perf per watt. It's the only mobile GPU that supports full OpenGL 4.4 and all the other Nvidia gaming optimizations.
Why would anyone not want that?
testbug00 - Friday, May 9, 2014 - link
Because:
1. You have to give Nvidia your roadmaps and a look into your tech.
2. Apple clearly wants to make a fully custom chip, and Nvidia likely does not want to allow licensees to change how the architecture works. This is based on how NVidia's management acts. __THIS COULD BE WRONG__
3. There is no proof that Nvidia's GPU is best in perf/watt (except what Nvidia says...), and while it might have beaten everything when it was announced, so did Tegra 2 (which ended up not being the fastest or best in perf/watt), as did Tegra 3 (see Tegra 2), and so did Tegra 4 (see Tegra 2/3). Nvidia has a history of hyping its products in the mobile space only for them to be average at best (although the K1 should be above average).
4. Kepler would only work on the 28nm node. Why license a 28nm tech when you are working on 20nm? Gains from shrinking it could easily be obtained by licensing a PowerVR design much like they did with ARM... AKA, licensing the architecture and designing a fully custom GPU.
grahaman27 - Friday, May 9, 2014 - link
A) OK, if you're not going to listen, then please don't reply. Kepler is the most efficient architecture in mobile, but Nvidia WILL NOT BE LICENSING it until they unveil their Maxwell mobile architecture next year.
B) Apple uses PowerVR by Imagination right now, and they have no plans of changing that; that's not custom.
C) Kepler is not based on one lithography. There are two versions of the Tegra K1: one is 28nm, the other (Denver) has not yet been determined. But it doesn't matter, the Maxwell architecture will be independent from the process used; it would work on 3nm chips if a third-party company wanted it to.
4) go home you're drunk.
testbug00 - Friday, May 9, 2014 - link
A. You have proof? NVidia has not managed to make a system that wins in perf/watt in mobile in REAL DEVICES compared to Qualcomm, Apple, and a few others. When NVidia's K1 gets into a phone and we can test the power/perf, talk to me. Until then, I must go by the fact that Tegra, Tegra 2, Tegra 3 and Tegra 4 were all uncompetitive on perf/watt and say that K1 is not competitive either.
2. They can do custom PowerVR the same way they do custom ARM... It is not that complicated.
3. Oh? Show me any roadmap suggesting a 20nm Kepler is coming. Don't you think JHH would have mentioned "Nvidia has the first 20nm SoC taped out for mobile chips" when he presented Tegra K1 if he could have? JHH is GREAT at seizing PR moments (and always wants to have the FASTEST chip).
ams23 - Friday, May 9, 2014 - link
If Apple wants to create their own fully custom GPUs that reach beyond simply phones and small tablets, they will likely need to turn to NVIDIA in order to license NVIDIA's world-leading GPU patent and IP portfolio.
Kepler and Maxwell GPU power efficiency has been proven time and time again. The top 10 most green supercomputers in the world are powered by Kepler GPUs. The most power efficient high end gaming computers in the world are powered by Kepler GPUs. The most power efficient gaming laptops in the world are powered by Kepler (and now Maxwell) GPUs. And Tegra K1's GPU on 28nm HPM is ~1.5x more power efficient than the very best ultra-mobile GPUs available today. Previous generation Tegra GPUs are so different in architecture and design that there is really no comparison.
testbug00 - Friday, May 9, 2014 - link
No, not really. Most of Nvidia's patents don't have anything to do with mobile GPUs.
End of story. Besides, until we can see an independent review test K1, we cannot say it is better.
Tegra 1, 2, 3 and 4 all lost in perf/watt. Nvidia said they were all amazing, the best chips, that they beat all the competitors as of the date they were announced. Come launch, they did not beat competing chips. I hope Tegra K1 is different, but putting hope in Nvidia's Tegra division is like hoping VIA becomes the world's largest x86 manufacturer.
Ghost0420 - Wednesday, May 28, 2014 - link
28nm and 20nm TSMC are pretty much the same node, except shrunk down... so any issues seen in 28nm will be worse on 20nm... so shrinking Kepler to 20nm is not a huge issue since it's mostly a shrink in node... same materials used. Why would Kepler NOT work with 20nm? Maxwell is on 28nm, but I'm sure it'll go 20nm by sometime next year...
Morawka - Friday, May 9, 2014 - link
Tegra K1 is <5W. It's even in Anand's own articles. He explains exactly how they got there. Your speculation about license stipulations is pure guesswork; you shouldn't have even used that in your argument.
They don't have to share their roadmap or chip plans. The only time a company would want to do that is if they wanted guidance on how to make their SoC. This is the same thing ARM does. You can license the IP and make a chip yourself, or you can license and work with their engineers to optimize your SoC.
Morawka - Friday, May 9, 2014 - link
Brian Klug talking Tegra K1 power: http://www.anandtech.com/show/7622/nvidia-tegra-k1...
testbug00 - Saturday, May 10, 2014 - link
Gee, NVidia's claims sure are pretty. Now, here is what Nvidia has talked about: "the GPU is over 900MHz, over 350 GFLOPS (or whatever it is), and the CPU is 2.3GHz."
What clockspeed did they give for 5W? Considering Kepler was designed to be an approx 25-250W architecture from the start, the fact they scaled it so low IS AMAZING; however, I would guess it runs at around 500MHz MAX at "5W".
Also, 5W is still too much for most phones.
My information about WHEN you can use the IP is fine (although it is not good)... About how you can change it, I am not sure. HOWEVER, 28nm will be cheaper per transistor than 20nm for quite some time... Nvidia is not shrinking Kepler to 20nm; you would have to do it yourself... that costs lots of money, if Nvidia would even allow you to (I do not see why they would; they are the ones who said that 20nm costs will not be lower than 28nm...).
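To put the clock-speed guess above in perspective, a rough sketch of how FP32 throughput scales with GPU clock, assuming the widely reported 192-core Kepler configuration in Tegra K1 (two FLOPs per core per cycle via FMA):

```python
# How FP32 throughput scales with GPU clock, assuming the widely reported
# 192 Kepler CUDA cores in Tegra K1 and 2 FLOPs per core per cycle (FMA).
CORES = 192
FLOPS_PER_CORE_PER_CYCLE = 2

def peak_gflops(clock_mhz: float) -> float:
    return CORES * FLOPS_PER_CORE_PER_CYCLE * clock_mhz / 1000.0

print(peak_gflops(950))  # ~365 GFLOPS, consistent with the "over 350 GFLOPS" claim
print(peak_gflops(500))  # ~192 GFLOPS at the guessed ~500 MHz "5W" clock
```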
tuxRoller - Friday, May 9, 2014 - link
At the desktop, sure, but this is a different environment. That's my point. If performance has to keep improving, NEW solutions are going to have to be created, and I'm not sure that Maxwell is going to fix that at the 1-3W level. For one thing, Nvidia is running their ultra-mobile GPUs at relatively high speeds compared to most of the others (the highest clocked Adreno part runs at 578MHz compared to the TK1's 850MHz).
ams23 - Friday, May 9, 2014 - link
That is not true at all. Tegra is a growing business and will continue to grow in various areas. The Tegra consumer revenue and Tegra gaming revenue will keep going up this year and next year and will be larger than Tegra automotive revenue, even as Tegra automotive revenue goes way up too.
The Tegra 4/4i family was used not just by Xiaomi but also by very reputable OEMs such as Asus, Toshiba, LG, ZTE, Microsoft, HP, EVGA, and Wiko (among others). Considering how good Tegra K1 appears to be, I'd say that this list will keep expanding to include other significant OEMs who are looking to differentiate their devices.
beck2050 - Friday, May 9, 2014 - link
Nvidia continues to make good profits in a challenging environment and they have established very promising new revenue streams in supercomputing, autos, and cloud. K1 has tested very well and could be a big winner going forward. The GRID products for data centers are looking good. The next 20nm generation should be awesome.
testbug00 - Friday, May 9, 2014 - link
Find me an LG with Tegra 4 or higher.
Find me a Samsung with Tegra 4 or higher.
Find me an HTC with Tegra 4 or higher.
Find me almost any large mobile OEM that uses Tegra 4.
Xiaomi is the only large mobile phone company to use Tegra 4. That is because NVidia blatantly lied about Tegra 2 and Tegra 3 to mobile OEMs, so no one buys it anymore. I also doubt that Xiaomi is paying full price for Tegra 4... I could be wrong, however.
Also, once more, Tegra K1 uses over 10W (nvidia says <11, see: https://pbs.twimg.com/media/Bl8_-GCCcAABW8r.jpg) without reaching full load. How far do you think it has to downclock to be in a phone without killing the battery life?
testbug00 - Friday, May 9, 2014 - link
Note: the image is full of typos, but it is legit: https://twitter.com/ProfMatsuoka/status/4591574225...
michael2k - Friday, May 9, 2014 - link
Um, the LG G2 mini LTE is being released with the Tegra 4i.
The problem with your assertion is that there are very few large mobile OEMs right now: LG, Samsung, Huawei, and Apple, with Samsung and Apple making over 45% of the phones out there with their own SoCs. The #3 OEM is Huawei at just shy of 5%.
phoenix_rizzen - Friday, May 9, 2014 - link
Note: Tegra 4i uses Cortex-A9 CPUs, not the Cortex-A15s in the full-fledged Tegra 4. The GPU is also fairly different. The Tegra 4i is closer to a speed-boosted Tegra 3 than a full-fledged Tegra 4.
michael2k - Friday, May 9, 2014 - link
The A15 is power constrained, so I'm not sure it's at much of an advantage to the A9r4 seen in the Tegra 4i, especially in a smartphone.testbug00 - Friday, May 9, 2014 - link
Tegra 4i is a great chip. Best Tegra chip yet... too bad so few phones have it :(
Still, Tegra 4i IS NOT Tegra 4.
iamlilysdad - Friday, May 9, 2014 - link
HP has many different products using Tegra 4. LG also has the T4i in the G2 Mini for certain markets.
Are those not large enough mobile OEMs for you?
You seem to continue to want to push that slide that was shown at GTC. You do understand that the quoted power draw in that slide isn't indicative of power draw in all form factors, right?
testbug00 - Friday, May 9, 2014 - link
HP is a large mobile OEM? Compared to who? Microsoft?
Anyhow, I know it is not for all form factors... however, if your chip draws over 10W when not running at full speed (even with I/O included), then it will not run anywhere NEAR full speed in mobile devices... Well, it could, it would just last about 1 hour.
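The "about 1 hour" figure can be sanity-checked with simple arithmetic, assuming a typical 2014 flagship battery of roughly 3000 mAh at 3.8 V and ignoring display and radio power:

```python
# Sanity check of the "about 1 hour" claim, assuming a typical 2014 flagship
# battery (~3000 mAh at 3.8 V) and ignoring display, radio, and other draw.
battery_wh = 3000 / 1000 * 3.8        # ~11.4 Wh of stored energy
soc_draw_watts = 10.0                 # the >10 W figure under discussion

print(f"~{battery_wh / soc_draw_watts:.1f} hours")  # ~1.1 hours at a sustained 10 W
```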
Morawka - Saturday, May 10, 2014 - link
Hell, Apple's A7 pulls 11W on the iPad Air and 8W on the phone. The numbers are crazy off. The phone K1 devices will run at a lower clockspeed, but the performance per watt is still there; it's the constant.
Morawka - Saturday, May 10, 2014 - link
Plus, testbug, your first post on this article says "they finally have a gpu that's beating the competition"; you contradict yourself in your following post.
testbug00 - Saturday, May 10, 2014 - link
I believe that the GPU should be able to eke out a lead (at its max clockspeed), given you want to go by that. Only because their competitors aim at the market as a whole, not at some "max clockspeed we can get out of the part", because the money is in perf/watt, not absolute perf.
I do not believe it will be beating the competition in phones and small tablets (large tablets, it __MIGHT__ be able to).
testbug00 - Saturday, May 10, 2014 - link
Tegra K1 can pull over 10W just on the SoC; I would guess between 12 and 15W max. iPhone and iPad consume those with all parts running at max.
grahaman27 - Sunday, May 11, 2014 - link
The Tegra K1 pulled 10 watts powering the TK1 board, which had desktop components. I don't think you have much room to make that claim. You are right that the iPad draws 11 watts at max, but so does the K1 on a development board with desktop components and a fan; we cannot judge what it would be like in a tablet. Quit blowing smoke, we get that you don't like Tegra, but don't make crap up.
ams23 - Friday, May 9, 2014 - link
You are cherry-picking phone-centric OEMs now. LG uses Tegra 4i in one of their products, and Xiaomi uses Tegra 4 in one of their products, but Tegra's emphasis and focus is not on phones. Tegra's emphasis is on tablets, micro-gaming consoles, automotive, and embedded. Tegra may get some high end smartphone design wins in the future with Tegra K1 and beyond for those OEMs looking to differentiate their products, but the lion's share of smartphone SoCs outside of Apple and Samsung will use Qualcomm and Mediatek.
testbug00 - Friday, May 9, 2014 - link
Tegra 4i is a great product.
Now, we were talking about Tegra 4.
How many tablet wins do they have? MS, HP, their own tablet.
How many phone wins do they have? One Chinese manufacturer.
How many "micro-gaming console" wins do they have? Two(?): SHIELD and Ouya. I don't think you can even truly call the SHIELD a "win"; it was made by Nvidia.
Automotive... Well, NVidia has a decent chunk of the market; sadly, that market is quite small... and those systems are moving toward using very low-end SoCs for the stuff the car needs, and running the multimedia/etc. off the phone's power.
Embedded? Perhaps.
grahaman27 - Sunday, May 11, 2014 - link
Nvidia was talking about the K1 only MONTHS after they announced the Tegra 4. They were showing Battlefield 3 demos running on the next gen (what came to be Tegra K1) before the Tegra 4 was even released. Nvidia is hyped about the K1 and they should be; it's revolutionary, truly it is.
Also I just want to clarify that there was no Tegra 1 SoC, so stop referencing that (there was an original Tegra, but it was a separate chip, as it was not an SoC). Also, the Tegra 2 was the most advanced chip at the time, very competitive.
melgross - Monday, May 12, 2014 - link
I don't expect much from this, period. We've heard this before from them. They sound more like AMD every day.
Da W - Friday, May 9, 2014 - link
AMD did post good results too, with the GPU division doing extremely well also.
#1 My take on this is that if you bought a Core i7-2600K years ago, it's still good enough. GPU performance is now the important factor in computing performance, and if your CPU can hold up longer than the usual 18 months it used to, you have spare money to spend on a beefy GPU upgrade.
Intel is the loser here.
#2 With game streaming from your powerful desktop coming (Shield / Steam OS / I'm sure AMD will follow), notebook GPUs will begin to be less important. And the rise of 500+W GPU cards shows that mobile GPUs can't give you all the power you need due to power constraints, contrary to CPUs. All you need is a potent SoC (Tegra K1 / Mullins) and a powerful desktop.
CiccioB - Friday, May 9, 2014 - link
AMD's results were not that good at all.
All the profits they advertise are offset by an almost equal increase in long-term debt.
CiccioB - Friday, May 9, 2014 - link
And by a constant reduction in research investments.
And this was likely the best of the coming quarters, as demand for console APUs was at its maximum.
Still they didn't turn a profit. Nvidia gets $66 million x 4 (about $264 million a year) just by licensing their old IP to Intel. And that's without investing anything in silicon, future shrinks and so on. You can see who made the better deal when Nvidia said that doing SoCs for the console market was not profitable at all (or not as profitable as investing the same money and brains in other things).
theqnology - Saturday, May 10, 2014 - link
Why has nobody pointed out that Tegra 4 was based on the over-a-decade-old GeForce ULP? And that TK1 is NVidia's first product with a modern GPU architecture? Why has nobody pointed out that OEMs were also drawn to Snapdragons because of on-chip LTE? Really, previous Tegras were incremental changes; that's why I'm excited to see where K1 can take us. Stop the hate, sit back and enjoy... It's not like you lost large sums of money on Nvidia shares.
theqnology - Saturday, May 10, 2014 - link
Oops, meant for everyone, not on your comment.. :P Sorry, hard to distinguish using a phone.
Hrel - Monday, May 12, 2014 - link
OK Nvidia, you made plenty of money, now lower the prices of your desktop GPUs significantly. The GTX 760 should be <$200.
CiccioB - Monday, May 12, 2014 - link
As long as they can sell them at >$200 they will keep that price.
People do not understand the basic business law of supply and demand: you have to sell your product at a low price only if it cannot be sold at a higher one.
nvidia sells tons of their GPUs at high prices.
AMD fanboys cannot understand this and keep on speaking perf/dollar nonsense. They just cannot understand that that criterion is not the only one, and for those who really understand something, it is also not the most important.
If perf/cost were the only criterion used when buying something, we would all drive Skoda and Kia cars. Or have Samsung electronics devices.
nvidia can sell a lot at a higher price and make money (well, not plenty, but enough), while AMD, in order not to fall to a single-digit market share, must keep its prices so low that they cannot make a gain even in what was probably the best quarter of the year.