Micron Confirms Mass Production of GDDR5X Memory
by Anton Shilov on May 12, 2016 3:00 PM EST

Micron Technology this week confirmed that it had begun mass production of GDDR5X memory. As revealed last week, the first graphics card to use the new type of graphics DRAM will be NVIDIA’s upcoming GeForce GTX 1080 graphics adapter powered by the company’s new high-performance GPU based on its Pascal architecture.
Micron’s first production GDDR5X chips (or G5X, as NVIDIA calls them) will operate at 10 Gbps and will enable memory bandwidth of up to 320 GB/s for the GeForce GTX 1080, which is only a little less than the memory bandwidth of NVIDIA’s current-gen flagship GeForce GTX Titan X/980 Ti, despite their much wider memory buses. NVIDIA’s GeForce GTX 1080 video cards are expected to hit the market on May 27, 2016, and presumably Micron has been helping NVIDIA stockpile memory chips for the launch for some time now.
NVIDIA GPU Specification Comparison

| | GTX 1080 | GTX 1070 | GTX 980 Ti | GTX 980 | GTX 780 |
|---|---|---|---|---|---|
| TFLOPs (FMA) | 9 TFLOPs | 6.5 TFLOPs | 5.6 TFLOPs | 5 TFLOPs | 4.1 TFLOPs |
| Memory Clock | 10Gbps GDDR5X | GDDR5 | 7Gbps GDDR5 | 7Gbps GDDR5 | 6Gbps GDDR5 |
| Memory Bus Width | 256-bit | ? | 384-bit | 256-bit | 384-bit |
| VRAM | 8 GB | 8 GB | 6 GB | 4 GB | 3 GB |
| VRAM Bandwidth | 320 GB/s | ? | 336 GB/s | 224 GB/s | 288 GB/s |
| Est. VRAM Power Consumption | ~20 W | ? | ~31.5 W | ~20 W | ? |
| TDP | 180 W | ? | 250 W | 165 W | 250 W |
| GPU | "GP104" | "GP104" | GM200 | GM204 | GK110 |
| Manufacturing Process | TSMC 16nm | TSMC 16nm | TSMC 28nm | TSMC 28nm | TSMC 28nm |
| Launch Date | 05/27/2016 | 06/10/2016 | 05/31/2015 | 09/18/2014 | 05/23/2013 |
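As a quick sanity check on the bandwidth figures in the table above, peak memory bandwidth follows directly from bus width and per-pin data rate. A minimal Python sketch of that arithmetic (our illustration, not vendor tooling):

```python
# Peak bandwidth = (bus width in bits / 8) * per-pin data rate in Gb/s.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(256, 10))  # GTX 1080:   256-bit, 10 Gb/s GDDR5X -> 320.0 GB/s
print(peak_bandwidth_gbs(384, 7))   # GTX 980 Ti: 384-bit, 7 Gb/s GDDR5  -> 336.0 GB/s
print(peak_bandwidth_gbs(256, 7))   # GTX 980:    256-bit, 7 Gb/s GDDR5  -> 224.0 GB/s
```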
Earlier this year, Micron began to sample GDDR5X chips rated to operate at 10 Gb/s, 11 Gb/s, and 12 Gb/s in quad data rate (QDR) mode with a 16n prefetch. However, it looks like NVIDIA decided to be conservative and run the chips only at the lowest of those speeds.
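To make the prefetch figure concrete: GDDR5X doubles GDDR5’s 8n prefetch to 16n, so each access on a 32-bit chip moves 64 bytes rather than 32, and at 10 Gb/s per pin a single chip delivers 40 GB/s. A small sketch of that arithmetic (the helper names are ours, not a JEDEC or Micron API):

```python
def burst_bytes(prefetch_n: int, interface_bits: int) -> int:
    """Bytes transferred per access: prefetch depth x interface width."""
    return prefetch_n * interface_bits // 8

def chip_bandwidth_gbs(data_rate_gbps: float, interface_bits: int) -> float:
    """Peak bandwidth of a single memory chip in GB/s."""
    return data_rate_gbps * interface_bits / 8

print(burst_bytes(8, 32))          # GDDR5,  8n prefetch  -> 32 bytes per access
print(burst_bytes(16, 32))         # GDDR5X, 16n prefetch -> 64 bytes per access
print(chip_bandwidth_gbs(7, 32))   # 7 Gb/s GDDR5 chip    -> 28.0 GB/s
print(chip_bandwidth_gbs(10, 32))  # 10 Gb/s GDDR5X chip  -> 40.0 GB/s
```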
As reported, Micron’s first GDDR5X memory ICs (integrated circuits) feature 8 Gb (1 GB) of capacity, sport a 32-bit interface, and use a 1.35 V supply and I/O voltage as well as a 1.8 V pump voltage (Vpp). The chips come in 190-ball BGA packages measuring 14×10 mm, so they will take up a little less space on graphics cards than GDDR5 ICs.
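Putting the per-chip specs together with the card-level numbers above, eight of these 8 Gb, 32-bit ICs account for the GTX 1080’s 8 GB of memory, 256-bit bus, and 320 GB/s of bandwidth. A minimal sketch of the arithmetic (an illustration, not a board design from NVIDIA):

```python
CHIP_CAPACITY_GB = 1      # 8 Gb per IC
CHIP_INTERFACE_BITS = 32  # 32-bit interface per IC
CHIP_DATA_RATE_GBPS = 10  # 10 Gb/s per pin

num_chips = 8             # one IC per 32-bit channel of a 256-bit bus

total_capacity_gb = num_chips * CHIP_CAPACITY_GB                                  # 8 GB
total_bus_width_bits = num_chips * CHIP_INTERFACE_BITS                            # 256-bit
total_bandwidth_gbs = num_chips * CHIP_INTERFACE_BITS / 8 * CHIP_DATA_RATE_GBPS   # 320.0 GB/s

print(total_capacity_gb, total_bus_width_bits, total_bandwidth_gbs)
```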
The announcement by Micron indicates that the company will be the only supplier of GDDR5X memory for NVIDIA’s GeForce GTX 1080 graphics adapters, at least initially. Another important takeaway is that GDDR5X is real: it is in mass production now, and it can indeed replace GDDR5 as a cost-efficient solution for gaming graphics cards. How affordable is GDDR5X? It should not be too expensive, particularly as it's designed as an alternative to more complex technologies such as HBM, but this early in the game it's definitely a premium product over tried-and-true (and widely available) GDDR5.
Source: Micron
59 Comments
FlorisR - Monday, May 16, 2016 - link
Yes, that was true in the past, but not with Polaris. They implemented memory compression to save on required memory bandwidth.

BurntMyBacon - Friday, May 13, 2016 - link
@Kalelovil: "Polaris 10 is only 2/3rds the size of GP104. The fastest Polaris 10 card will probably be slightly slower than the GTX 1070, which makes do with GDDR5."I agree that Polaris probably won't need the kind of bandwidth that a 256bit 10Gbps GDDR5X array can provide. Hoever, if GDDR5X ends up being as cost effective as Micron seems to suggest, then AMD could save significant die area and silicon cost by reducing the bus width even on mid-high end parts. Some of that savings would be eaten by the initial premium for GDDR5X, but it could make sense. Reducing the bus width generally means a reduction in the overall memory capacity of the cards which wouldn't bode well for high end cards like the GTX1080/GTX1070, but in the mainstream that Polaris is targeting, this wouldn't be much of a problem.
fanofanand - Friday, May 13, 2016 - link
AMD is desperately working on reducing their TDP; shaving off 10+ watts simply by using a different memory type must be appealing to them.

SunLord - Thursday, May 12, 2016 - link
We really don't know what stage Samsung and Hynix are at when it comes to mass production of GDDR5X. When GDDR5X was announced, it was assumed all the manufacturers were 6 months out, though Samsung might have focused on HBM2 first.

extide - Friday, May 13, 2016 - link
Lol, I love nvidiots. I mean wow, just the world some of you guys live in is absolutely ridiculous.

Michael Bay - Monday, May 16, 2016 - link
Not worrying about what the next driver will break is certainly nice. Then again, you guys save on heating big time.
xthetenth - Tuesday, May 17, 2016 - link
Agreed on not worrying about driver issues. I'm so glad I got rid of my 970. I'd already had two different driver issues, and getting the joys of the 364 drivers would've driven me nuts. My 290's been vastly better in terms of drivers. It's also been nice not having to run NV Inspector to keep my card from heating the room on the desktop, because apparently clocking down from full 3D clocks when someone's running two screens is hard, you guys. The only reason NV has mindshare is because of a reputation that people like you don't care to check.

mfearer - Thursday, May 12, 2016 - link
Is this happening at Micron's Manassas, Virginia, USA location?casperes1996 - Thursday, May 12, 2016 - link
Why isn't it called GQDR? If it runs at quad data rate, why do we name it double data rate?

LukaP - Thursday, May 12, 2016 - link
Because it slips off the tongue nicer? Plus, DDR is already known, so GDDR makes the logical connection of it being memory. GQDR is just some weird acronym no one would connect to memory at first.