The Book That Changed Everything

Academics have had a big influence on semiconductor design and design automation. I wrote earlier this week about Rob Rutenbar, the recipient of this year's Kaufman Award, an academic as well as the founder of a couple of startups. Just look at the list of past winners of the Kaufman Award to see how influential academia has been: Richard Newton, Alberto (does he need a last name?), Chenming Hu, Randy Bryant, Andrzej Strojwas, Bob Brayton... and Carver Mead. Probably the most influential of them all is that last name, Carver Mead of Caltech. Along with Lynn Conway (then at Xerox Palo Alto Research Center), he co-authored what I call "the book that changed everything." It was one of the first things that Rob mentioned when I talked to him about receiving the Kaufman Award himself. This book ushered in what came to be known as the "Mead and Conway revolution."

Mead & Conway

Until 1979, IC design was done by specialists who understood every aspect of the design, from semiconductor fabrication and transistor characteristics all the way up to small blocks of maybe a thousand gates, which were the limit of chip fabrication in that era. In the late 1970s, this "tall thin man" approach started to break down. Design was getting too complex for the people who understood the process to do it, and the process was getting sufficiently complex to become the realm of its own specialists. In time, things would stratify further into architects, front-end designers, verification engineers, physical layout experts, FEOL, BEOL, and more.

Everything changed nearly forty years ago with the publication of Mead and Conway's book, Introduction to VLSI Systems. It is out of print now, but it was the most influential book in semiconductor design and design automation ever. Mead and Conway separated design from manufacturing by creating simplified design rules for layout and a simplified timing model suitable for digital design. No longer was it necessary to understand every nuance of the fabrication process; no longer was it necessary to treat every transistor as an analog device. The most important consequence was that computer scientists could design digital chips, because they no longer needed deep electrical engineering knowledge.

I was at Edinburgh University at the time, finishing up my Ph.D. (in computer science, not electrical engineering). John Gray, who ran Carver Mead's silicon structures project at Caltech while on sabbatical from Edinburgh, returned carrying galley proofs of the yet-to-be-published Mead and Conway book. He ran a course based on them, one of the first IC design courses in the UK, I presume, heavily attended not just by the final-year students who were meant to be on it, but by many of us faculty too. Design became the province of computer scientists who understood enough about layout, enough about timing, enough about architecture, and enough about test to successfully create state-of-the-art chips. Indeed, they could do so more effectively than the electrical engineers, because chips were getting too large to do entirely by hand, and computer scientists already knew how to deal with complexity: abstraction, hierarchy, automation. They started to create the first EDA tools, simple layout editors, simple simulators, rudimentary design rule checkers, because their natural instincts were never to do anything by hand if you could write a program to automate it.
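Those simplified layout rules, famously expressed in terms of a scalable unit lambda, are exactly the kind of thing a rudimentary design rule checker could automate. Here is a minimal sketch, in Python rather than anything from that era, of what such a checker might look like; the layer names and the rule values are illustrative assumptions for this sketch, not the book's actual rule set.

```python
from dataclasses import dataclass

LAMBDA = 1.0  # scalable length unit; the actual micron value comes from the process

# Illustrative per-layer minimums in units of lambda (assumed values for this sketch)
MIN_WIDTH = {"poly": 2, "metal": 3}
MIN_SPACING = {"poly": 2, "metal": 3}

@dataclass
class Rect:
    """An axis-aligned rectangle of mask geometry on a single layer."""
    layer: str
    x1: float
    y1: float
    x2: float
    y2: float

def width_violations(rects):
    """Yield rectangles narrower (in either dimension) than the layer's minimum width."""
    for r in rects:
        if min(r.x2 - r.x1, r.y2 - r.y1) < MIN_WIDTH[r.layer] * LAMBDA:
            yield ("width", r)

def spacing_violations(rects):
    """Yield pairs of same-layer rectangles separated by less than the minimum spacing."""
    for i, a in enumerate(rects):
        for b in rects[i + 1:]:
            if a.layer != b.layer:
                continue
            # Gap between the two rectangles (0 if they touch or overlap, i.e. merge)
            dx = max(b.x1 - a.x2, a.x1 - b.x2, 0.0)
            dy = max(b.y1 - a.y2, a.y1 - b.y2, 0.0)
            gap = (dx * dx + dy * dy) ** 0.5
            if 0.0 < gap < MIN_SPACING[a.layer] * LAMBDA:
                yield ("spacing", a, b)

if __name__ == "__main__":
    layout = [
        Rect("metal", 0, 0, 3, 10),    # OK: 3 lambda wide
        Rect("metal", 7, 0, 9, 10),    # too narrow: only 2 lambda wide
        Rect("poly", 0, 12, 10, 14),
        Rect("poly", 0, 15, 10, 17),   # too close: only 1 lambda from the strip above
    ]
    for violation in list(width_violations(layout)) + list(spacing_violations(layout)):
        print(violation)
```

The point of the lambda abstraction is visible even in this toy: the checker (and the layout) only ever reasons in lambda, and retargeting to a new process is, to first order, a change to a single scaling constant rather than a redesign.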
Mead and Conway's book created a cohort of IC-literate computer scientists who went on to populate the CAD groups of the semiconductor companies and, eventually, the EDA industry when it got going in the first half of the 1980s. To see how big a difference it made, look at analog design versus digital design today. Analog design is largely done today the way digital design was done before Mead and Conway: deeply expert designers working with the raw process models, the raw design rules, and polygons.

The next big change would be the invention of Verilog and RTL synthesis, which meant that computer scientists could design complex chips with almost no knowledge of how chips worked, what a transistor was, or how a chip was made. This new layer meant that front-end designers and back-end designers were different people with different skill sets.

A decade ago, it was clear that design would need to move up to a new level of abstraction to close the "design gap," the difference in productivity between the design process and semiconductor manufacturing. I think we all expected that this would be some sort of high-level synthesis. But it turned out to be IP-based design, with groups and companies that created the IP blocks, and separate groups that assembled them into SoCs. High-level synthesis (such as Stratus) has its place, but it is largely restricted to very complex algorithms for vision and wireless protocol processing. Much the same happened in software, where structured programming has its place, but most programs are implemented with stacks of APIs and libraries that do most of the heavy lifting.

Carver Mead

An interview with Carver Mead, done by Doug Fairbairn (another name from that era, who worked at PARC and later hired me into VLSI Technology and brought me to the US), is on the Computer History Museum website. Here's a fun story from that interview, about Carver's first chip:

I actually didn't believe deep down that the whole thing would work until I had made my first chip work, and that was actually a fun story. I had made my first design of this PLA state machine and it was programmed up to be a digital clock, so it was something that would be fun. It would actually drive the little seven-segment fluorescent tube display, because at that time the Intel process was a 20-some-odd-volt process. So I'd programmed this thing up and ran it through the Intel line and took it back. I had a bonder by then, so I went in for the weekend and bonded up my chips. I bonded up I think six of them and had a setup on my desk. I had a bench in my office, so I had a little setup there, all set up to test it. So I plugged the thing in and all the row lines were high and all the column lines were low. Ooh, that was a big disappointment. So I thought it must've been a bad chip. I went through all six of them and they all acted exactly the same way... I was feeling really low. This was a big thing for me because I'd sort of committed the next part of my career to doing this and here it didn't work... So I thought I'll go get something to eat and then I'll decide what to do. So I went out the door of my office, flipped off the light, and just as the door was closing, out of the corner of my eye, I caught a trace on a scope. So I went back in, turned the light on, and the trace went away. Oops. Dynamic logic. Should've thought of that. So I took a penny out of my pocket, put it on top of the chip, and it worked fine. So I had a much better dinner that night. First chip worked fine.
Lynn Conway

But Lynn Conway also had a deep secret that in those days she wasn't ready to reveal: she started life as a guy. She had a hugely successful research career as a young researcher at IBM working on supercomputing, on ideas that even today are part of the architecture of the most modern microprocessors: dynamic instruction scheduling, or out-of-order execution. But she was a woman trapped in a man's body, and eventually she decided she had to do the whole thing and have sex reassignment surgery. This was too much for the IBM of the day, so they fired her. So she was basically screwed, with no family, friends, or job. She got some positions as a contract programmer. OK, I'm sure she was an incredibly good contract programmer, but how do you get from A to B? The guys at PARC, which was just starting up, noticed her at Memorex, where she was working, and recruited her. PARC in that era was the most innovative computer science location in the world, blowing away Bell Labs and places like that, as well as every academic department from Stanford to MIT. A huge proportion of the top computer scientists in the world worked there. Including Lynn Conway.

Carver Mead and Lynn Conway were just in the right place at the right time to lead people like me into what became the VLSI world. I met her a few years ago at Dick Lyon's house in Los Altos. He is another name from those early days of VLSI, famous for the first optical mouse. Lynn's reminiscences about this period are in IEEE Solid-State Circuits Magazine. A (long) article about her entire career appeared in the LA Times in 2000.

Sign up for Sunday Brunch, the weekly Breakfast Bytes email.
