I was in Brussels a couple of weeks ago to attend imec's annual technology forum. One of the keynotes on the first morning was from Wally Rhines, Mentor's CEO, entitled Extending Semiconductor Cost Reduction Another 20 Years. As Gordon Moore himself said, "no exponential is forever" and, whatever your view on whether Moore's Law is slowing down at the moment, the future is certainly murky when you look out on a 20-year timeframe.

I talked to Wally at the social event that evening, in the Magritte Museum, just a couple of minutes from the conference center. Probably two of Magritte's most famous paintings are the man with a bowler hat and an apple in front of his face, and the picture of a pipe with "ceci n'est pas une pipe" underneath ("this is not a pipe," if your French isn't up to that). So the staff were all wearing bowler hats, and one of the desserts consisted of candy apples and chocolate pipes.

Wally told me that whenever he does one of these keynotes, he always learns something he didn't know before, and this time it was the amazing statistic I used for the title of this blog. It really is true that 99.7% of transistors manufactured are memory.

Moore's Law is amazing in some ways, since at the time he had only four data points and the maximum transistor count on then-current chips was about 64. Wally pointed out that Moore's Law is actually a special case of a more general law, the learning curve: as cumulative unit volume increases, the cost decreases in a very predictable way, with each doubling of volume producing the same percentage decline in unit price. Wally didn't use it in this presentation, but I've seen a slide he has used before showing the cost of Japanese beer production falling in just such a linear way (when plotted on logarithmic scales). You have to adjust for inflation, of course, and assume reasonably free markets. This can happen across different technologies.
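A minimal sketch of that learning-curve relationship, assuming an illustrative 30% decline per doubling (the rate here is my assumption, not a figure from Wally's slides):

```python
import math

def unit_cost(first_unit_cost, cumulative_volume, learning_rate=0.30):
    """Learning curve: each doubling of cumulative volume cuts unit
    cost by the same fraction (an assumed 30% for illustration)."""
    doublings = math.log2(cumulative_volume)
    return first_unit_cost * (1 - learning_rate) ** doublings

# Starting at $100 per unit: after one doubling of cumulative volume
# the cost is $70, after two doublings $49, and so on.
```

On log-log axes this comes out as a straight line, which is why the Japanese beer data plots linearly.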
There is a sense in which Moore's Law is a fifty-year period in the middle of a much longer decline in the cost of computing, going back through electromechanical relays and vacuum tubes, and probably continuing in the future with something other than just scaling silicon transistors. It would be interesting to know where Babbage's difference engine would fall on the graph. When something falls off the curve, it serves as an incentive for innovation. For example, test costs were getting out of control fifteen years ago, with the cost of testing a chip threatening to exceed the cost of manufacturing it. Then test compression was invented and everything got back on track.

But there are challenges to staying on the learning curve. Everyone knows that 20nm, where we first needed double patterning, and 14/16nm with FinFET didn't see the traditional 15-25% cost increase per wafer; the increase was much larger, making the move to the new process less compelling. The delay of EUV and the need for SADP and penta patterning are making it harder to keep costs in line.

But the learning curve doesn't care whether a transistor is logic or memory. So Wally went on a quest to get some data on how many transistors are memory versus logic. It turns out that the answer is almost all of them. There is so much memory that transistor cost reduction is carrying on at around 33% per year. One of the big drivers is NAND flash, which is growing (in bit volume) at about 40% per year. The other big memory technology is DRAM, of course. Both DRAM and flash remain on the transistor learning curve, declining in cost by 39% per year, while the rest of semiconductor is off that curve, declining by only 21% per year. The conclusion of all of this is that it is memory, not logic, that is keeping semiconductors on the learning curve. If you add logic and memory together, memory is so overwhelming that the numbers all come out right: DRAM and flash are declining in price per transistor at 39% per year, and total semiconductor at 37%.
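To see how far apart those annual rates end up once they compound, here's a quick check using the 39% and 21% figures above (the ten-year horizon is just for illustration):

```python
def cost_multiple(annual_decline, years):
    """How many times cheaper a transistor gets when an annual
    percentage cost decline compounds over the given number of years."""
    return 1 / (1 - annual_decline) ** years

# Memory (39%/year) versus the rest of semiconductor (21%/year), over a decade:
memory_gain = cost_multiple(0.39, 10)  # roughly 140x cheaper
logic_gain = cost_multiple(0.21, 10)   # roughly 10x cheaper
```

A couple of decades of compounding at different rates is what makes memory dominate the blended 37% figure so completely.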
That 21% number for logic applies to so little of the market. What is all this memory being used for? Video, for one. YouTube users upload 400 hours of video every minute, which is a daily increase of about a petabyte. Solid-state disks will get cheaper than hard drives in a year or two. There are new memory technologies on the way too, like 3D NAND and XPoint. Ten years ago there was very little memory in a car; there is already a lot in any vehicle with any form of ADAS, and that is going to explode with truly autonomous vehicles. Automotive memory is growing at 26% per year.*

So Wally's conclusions:
- Moore's Law is a special case of the learning curve, and while Moore's Law itself will end, the learning curve will not
- continued growth in transistor unit volume reduces the cost per transistor
- growth of demand for memory, especially for visual processing and video, has caused memory to dominate unit volume
- vertical memory structures will keep us on the learning curve for 10-20 years

His final point was one that Lou Scheffer would discuss in his keynote at DAC a week or two later: we need better computing architectures, not just more horsepower in the silicon, what are sometimes called neuromorphic computing platforms, ones architected similarly to the brain.

Wally didn't mention it, but here's a statistic I came across somewhere else: one megabyte of memory cost $2.6 billion in 1957. It's worth less than a cent these days. Those transistors don't cost much.

* You know the rule of 70, right? Everyone should know this. If an investment with compound interest grows at, let's say, 10% (to keep the math simple), then it doubles in 70/10 = 7 years. So if automotive memory is growing at 26% per year, it doubles in 70/26 = 2.7 years. Economic growth rates of, say, 3.5% sound like nothing changes much from year to year, but that means the economy doubles every 20 years. OK, it's not Moore's Law, but it compounds in the same way.
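The rule-of-70 arithmetic in the footnote is easy to check against the exact doubling time:

```python
import math

def doubling_time_rule_of_70(growth_pct):
    """Rule-of-70 approximation: years to double at a compound growth rate."""
    return 70 / growth_pct

def doubling_time_exact(growth_pct):
    """Exact doubling time: solve (1 + r)^t = 2 for t."""
    return math.log(2) / math.log(1 + growth_pct / 100)
```

For 10% growth the rule gives 70/10 = 7 years against an exact 7.27; for 26% it gives about 2.7 years against an exact 3.0, so the approximation is tight at low rates and drifts a little at higher ones.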