Every year Breakfast Bytes makes some predictions for the year. At the end of the year, I review how I did, in as objective a manner as I can manage. If you want to see my assessment of how I did in 2017, see my post 2017: A Year in Breakfasts.

Security

I picked security as one of the big trends for last year, and I certainly nailed that. Arm TechCon, in particular, seemed to be all about security. Equifax, Yahoo, WannaCry, Mirai, KRACK, ShadowBrokers. Do you remember those names? Here we are just a couple of weeks into 2018 and we already have two new names: Meltdown and Spectre, exploits so advanced they have their own logos. I wrote about them in Spectre and Meltdown: an Update just last week. I won't go over the whole story, but the amazing thing about these two exploits is that they have been hiding in plain sight in almost every advanced microprocessor in the world for twenty years.

For 2018, security will be in the news. For sure, Spectre and Meltdown, and the implications of how they work, will affect future microprocessor designs, and perhaps cache-memory design. As the security teams in places like academia, Google, and RSA direct their attention to hardware, I am sure that these will not be the last weaknesses discovered. After all, within a space of a few months, two independent groups discovered these two exploits despite that twenty-year window of opportunity (and went through a professional notification process; the problems were identified around June last year).

IoT security is a disaster waiting to happen. Even companies like Apple, regarded as a paragon in the area, managed to put out a macOS update that allowed anyone to log in as root with a blank password. If Apple can get it that wrong, what hope have little companies with no security expertise?
In psychology, there is something called the Dunning-Kruger effect, in which people of limited ability think that they are really good (the people who really are good understand their limitations). In computer security, there is a special version: people with limited ability think that they must be good, because they can create security systems so good that they can't break them themselves. The general public is even more clueless. The experiment of dropping USB flash drives in parking lots and seeing how many get plugged in inside the firewall is depressing (it is over 50%, and at, I think it was, GOMAC last year, a military security researcher pointed out that the number goes up to more like 85% if you put an NSA or DoD logo on the drive).

Every so often, a dump of passwords comes to light, so security researchers get to see what passwords people really pick. The first amazing thing about this is that even when I was studying computer science back in the Stone Age, before PCs and smartphones and the internet, Roger Needham, my lecturer on operating systems, told us that passwords should always, always, always be encrypted with a "one-way function" and not stored in the clear. It shouldn't even be possible to generate a list of passwords for security researchers to study. The second is that the passwords are often "12345" or "password". To be fair, these are often annoyance passwords, and I don't worry much myself. If you want to break into my account on The Economist and read it for free, be my guest. My bank account, not so much.

Autonomous Cars

I don't think I'm going out on a limb by saying that autonomous cars will be big again this year, with some real advances rolled out into at least limited production. One battle to watch will be whether Tesla can master volume automobile manufacturing (they are still struggling with the Model 3) before the car companies, who have that part down, can master the software and electronics (along with their partners).
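Roger Needham's one-way-function rule from the security discussion above is still the right one, and it fits in a few lines. This is a minimal sketch using Python's standard library PBKDF2; the iteration count and salt size are illustrative choices of mine, and a production system would use a maintained password-hashing library rather than rolling its own.

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative; slows down offline guessing


def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a one-way hash; only the (salt, digest) pair is ever stored."""
    salt = os.urandom(16)  # fresh random salt per user defeats precomputed tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest


def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    """Re-derive the digest and compare; the plaintext is never stored anywhere."""
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(digest, stored)


salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("12345", salt, stored))                         # False
```

The point is that even if the stored (salt, digest) pairs leak, there is no list of passwords to publish, only hashes that have to be guessed one expensive iteration batch at a time.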
The old OEM/tier-1/tier-2 structure of the industry is breaking down, and it is going to be interesting to see what replaces it. Reliability, in various forms, will be a big part of this. For semiconductors, things like aging and temperature sensitivity in advanced processes need to be mastered. For SoCs, the whole functional safety approach to analysis needs to be comprehended. There is obviously a big potential market opportunity. But there is a big human opportunity, too. The latest numbers I saw were that 1.2M people (nearly 40,000 just in the US) are killed in automobiles each year, and well over 90% of those deaths are caused by some sort of human error. Unfortunately, I don't think that society (especially one as litigious as the US) will allow an incremental approach to this ("only" killing 100,000 people would be more than a 90% reduction), so we have to go from 1.2M a year to almost perfection on the vehicles. Of course, the number won't drop to zero; drunk pedestrians will continue to step out in front of cars and get killed. But, perhaps surprisingly, over half of fatalities are in single-vehicle crashes (73% in Montana, Vermont, and New Hampshire, the highest; you might think it is snow and ice, until you find out that the lowest percentage is Minnesota's 44%).

Artificial Intelligence

I wondered whether to leave this under automotive, since a lot of AI is being driven by vision and other aspects of autonomous vehicle design. Alexa, Siri, Google Voice, and Cortana were all over CES. I think this will be the year that voice starts to get a lot more natural. I forget where I saw this (or even precisely how it went), but it amused me:

Q: What do we want?
A: Voice recognition
Q: When do we want it?
A: I'm sorry, what were you talking about?

5nm

This will be the year of 5nm. Not in volume production, of course; we are only just getting started with 7nm this year.
But decisions are going to need to be made on the type of transistor architecture to be used, and the type of metallization. The answer is almost certain to be silicon nanosheet transistors, and a metallization system that continues to be based on damascene copper, but with some incremental use of cobalt and ruthenium to address the extreme resistance challenges, especially in vias and contacts. More of the increase in transistor density will come from design technology co-optimization (DTCO), whereby incremental features at the process level that don't contribute directly to density allow denser cell libraries and memories to be created. I'm thinking of things like contact over active gate (COAG) and buried power rails. The days when the process people produced design rules and SPICE decks and the library guys ran with whatever they got are long gone. Of course, Cadence tools will need to support whatever comes out, and Cadence IP will be required early for the early adopters.

EUV

This is the year that some limited use of EUV will start in manufacturing. All the 7nm (Intel 10nm) processes seem to have been designed with the capability for later EUV introduction. EUV will be used first for contacts and cut masks, since these can tolerate a lot more than more sensitive layers can: the lack of a pellicle, the lack of defect-free mask blanks, limited metrology. Even so, this will reduce the number of masks, improve the quality of lithography, and perhaps reduce costs (though I don't expect cost savings to be the major driver for the introduction of EUV, at least for a couple of years).

5G

This is the year of 5G. There will be trial installations during the year, and 2019 and 2020 will be the real era of 5G. But applications and approaches to services that depend on always-on, low-latency, high-bandwidth connections will be jockeying for position this year.
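The cobalt and ruthenium point in the 5nm discussion above is easiest to see with some rough geometry. Copper needs a barrier/liner stack that does not scale with the via, so as the via shrinks, the conducting copper core shrinks even faster; cobalt (and ruthenium) tolerate much thinner barriers, so their higher resistivity can still win at small dimensions. The resistivities and barrier thicknesses below are illustrative assumptions of mine, not process data, and the model ignores liner conduction and interface scattering entirely.

```python
def via_resistance(cd_nm: float, height_nm: float,
                   rho_uohm_cm: float, barrier_nm: float) -> float:
    """Rough resistance (ohms) of a square via, assuming only the metal
    core (CD minus the barrier on each side) conducts."""
    core_nm = cd_nm - 2 * barrier_nm
    if core_nm <= 0:
        return float("inf")  # the barrier has consumed the whole via
    area_cm2 = (core_nm * 1e-7) ** 2          # nm -> cm, squared
    rho_ohm_cm = rho_uohm_cm * 1e-6           # micro-ohm-cm -> ohm-cm
    return rho_ohm_cm * (height_nm * 1e-7) / area_cm2


# Illustrative numbers: thin-film copper (~5 uohm-cm) with a 2 nm barrier
# versus cobalt (~9 uohm-cm) with a 0.5 nm liner, for a 50 nm tall via.
# Copper wins at larger CDs; cobalt takes over as the via shrinks.
for cd in (30, 20, 14, 10):
    r_cu = via_resistance(cd, 50, rho_uohm_cm=5.0, barrier_nm=2.0)
    r_co = via_resistance(cd, 50, rho_uohm_cm=9.0, barrier_nm=0.5)
    print(f"CD {cd:2d} nm: Cu ~{r_cu:6.1f} ohm, Co ~{r_co:6.1f} ohm")
```

With these (assumed) numbers the crossover lands somewhere in the mid-teens of nanometers, which is why the incremental cobalt use shows up first in the smallest structures: contacts and vias.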
For a good summary of the current state of play, see my CES post Mobile World Congress in One Keynote from last week.

China

At SEMI last year, the main sessions were all about China. All over China, money is being sunk into fabs and businesses, with the aim of increasing China's self-sufficiency in semiconductors. I continue to be amazed by the statistic that China spends more on importing semiconductors than on importing oil. Of course, a lot of those chips are re-exported inside things like iPhones (I guess that a lot less oil is indirectly re-exported inside products, although it is a major feedstock for plastics). The big question is whether China can build a successful semiconductor industry that can compete globally, in the same way Japan, Korea, and Taiwan (and to a lesser extent, Singapore) did in previous decades. I am going to SEMICON China in March (nearly three times the size of SEMICON West), so I promise that late March will have a lot of China semiconductor content.

Sign up for Sunday Brunch, the weekly Breakfast Bytes email.