Monday is always a hectic start to DAC. If you have been in the industry for decades, as I have, it is hard to walk more than twenty yards without meeting old friends and colleagues who have been in the industry for years, too.

NXP Keynote: Revolution Ahead: Enabling Securely Connected, Self-Driving Cars

The opening keynote was by Lars Reger from NXP Hamburg, where he is CTO of the Automotive Division. With the merger at the end of last year between NXP and Freescale, one thing I hadn't realized is that NXP is now the #1 semiconductor supplier into automotive, with about 15% market share. (They are also the biggest supplier of crypto chips, such as those in credit cards and machine-readable passports.) This, of course, positions them well for the future as we go through the transition to increasingly autonomous vehicles. As he pointed out, there are 1.3M vehicle fatalities worldwide per year, and the transition from horseless carriages to robots on wheels has the potential to make a huge dent in that. Most accidents are caused by some form of driver error (as opposed to, say, a tire blowout). For years the rates were gradually declining, but now they are inching up, which is generally attributed to distracted driving (texting, looking at our smartphones, even (do people do this any more?) making calls).
Lars said that his goals were:

- Functional safety (0 accidents due to system failures: ISO 26262)
- Functional security (0 accidents due to system hacks)
- Device reliability (0 component failures, such as power steering)
- Road safety (0 accidents due to human error)

He talked about five particular areas:

- Radar sensing, and creating smaller, cheaper radar "patches"
- V2X: vehicle-to-vehicle and vehicle-to-infrastructure communication, the "virtual towbar"
- Sensor fusion, supplying a lot of sensor data to a high-performance processor to make decisions
- Ethernet in cars
- Data security

On the last one, security, he pointed out depressingly that, "All the hacking cases you have seen in the last two years would have been impossible with the right application of the silicon that is already in place."

I won't cover the whole presentation in detail. Christine will post something later this week, so watch out for that over on Design Chronicles.

Verification Lunch

For several years, Cadence has held a panel session about verification, so I suppose we can now call it the annual verification lunch. Despite Brian Fuller having moved from Cadence to ARM, he has kept his emeritus moderator position. The panelists were:

- Jim Hogan of Vista Ventures (and recently married)
- Alex Starr, a fellow at AMD
- Narendra Konda of NVIDIA, manager of the world's largest emulation lab
- Mike Stellfox, Cadence Fellow

If there was a theme to the discussion, it was that one engine is never enough and the engines are never fast enough. Simulation, emulation, formal verification, virtual platforms, FPGA prototyping: these all have a part to play, but there are holes. Narendra said it took a year and a half to get virtual platforms and emulation working well together, for example. Another theme was that "shift left" is very disruptive, especially to the software engineers, who are not used to working on unstable platforms and really don't want to get involved in finding hardware bugs.
(Let me point out my blog post about a recent video in which Narendra Konda appeared: Ten Months of Emulation on Palladium, Hours to Bring-Up.)

I will write about this panel session in more detail, probably some time next week; I think it is worth a whole post. Functional verification in the context of heavyweight software loads is one of the biggest challenges in designing these huge systems.

Lucio Lanza Panel on Open Source

The participants on the panel were (from left to right):

- Warren Savage (IPextreme, now Silvaco as of 72 hours ago)
- Mark Templeton (Scientific Ventures, was CEO of Artisan)
- Michael Wishart (CEO of efabless corporation, was at Goldman Sachs)

Lucio set the stage: Moore's Law is slowing down, so we should just accept that and think about what is going to happen. What will the new requirements be with a different evolution of the nodes? It used to be impossible to have sophisticated tools for the new nodes; things were moving so fast that designers were lucky to get something that almost worked. Now that scaling is slowing, is there opportunity in different directions?

Warren, who started in the system world before moving to the IP world, said that it is not a technology problem, it is a content problem. It is like a record company: the differentiation in records is not about optimizing the vinyl or the CD plastic, but about what hit music gets put on it. He thinks there is an explosion coming, with some of these older nodes coming alive and allowing new applications. There is going to be a diversity of IP like we've never seen before. Expertise is going to come from system specialists, not semiconductor design specialists, because that's where the knowledge is.

Mark pointed out that people need a way to differentiate their products. They used to grab the next node and hope to get something out ahead of the competition, but if nodes hang around longer, they need different differentiation.
Each node used to need a few sets of IP, but in the future there will be demand for a wider variety of IP. The big problem is fragmentation, since it won't be possible to sell the same piece of IP 1000 times, maybe only three, so these will become more like service businesses.

Michael is running a crowd-sourcing company, connecting chip designers with companies that need (and lack) semiconductor expertise. They are trying to take the cost of design as close to zero as possible, and believe that creating a true community is part of what is needed.

The discussion was, to me, a bit schizophrenic, since on the one hand everyone was talking about an imminent explosion of innovation. The foundries would love to manufacture all this. Just go to the Maker Faire to see the future today. But, as I pointed out in my question at the end, semiconductor manufacturing is a mass-production process. The Maker Faire would look very different if everyone had to manufacture 100,000 gizmos or none, which is how semiconductor manufacturing works today. So then everyone backtracked and said systems might contain an FPGA and a couple of other supporting devices; it was designing the systems that was important.

Another problem that I regularly point out about open source is that it is great for software engineers to scratch a really important itch. They can all get together and solve that problem, including pulling in a lot of corporations for whom it is important (think of how gcc or Linux is maintained). But you could get a large group of designers together and conclude that, say, on-chip variation is a really big problem, and none of those people would have a clue how to solve it: how to write an EDA tool, how to get the data they need from the foundries, and a myriad of other things that don't come up with pure software.
HOT Party

Jim Hogan is back as the driving force behind HOT, the Heart of Technology, his charity incubator that ropes in event experts from all over the EDA and IP industries to host events benefiting local charities. This year's party benefited CASA of Travis County (the county where Austin is). By the time you read this, you will have missed it, but you can read more about it in my post It's HOT in Austin in June. Since DAC is in Austin again in 2017, I assume there will be another HOT party next year.

On the subject of parties, if you are reading this the morning it is published (you do read Breakfast Bytes every morning, don't you?), then don't forget to pick up your Denali Party wristband by noon. I can tell you from personal experience that it is not an idle threat that if you don't, they will give it to someone else (and you won't get in without a wristband).

Previous: DAC News, Sunday