Clayton Christensen and the Innovator's Dilemma

One of the most readable and influential business books of the last 20 years is Clayton Christensen's The Innovator's Dilemma, originally published in 1997. I know that "readable" is not an adjective commonly applied to business books, but it is both beautifully written and fascinating. I don't seem to have a paper copy anymore, since I lent it to someone and they didn't give it back. It's that kind of book.

When I was at Cadence in around 2000, we used to have a worldwide engineering conference, a sort of internal DAC attended by about 30% of the company's engineering staff. Papers were presented, typically ones that could not be presented externally for confidentiality reasons. The conference had some catchy name, but not catchy enough that I can remember it nearly two decades later. There were keynotes, too. At the one in Phoenix, Clayton Christensen was one of the keynote speakers. He was scheduled for an hour but talked for nearly two, and he was so spellbinding that nobody considered giving him the hook.

The book covers a number of industries. Most relevant for the semiconductor ecosystem are probably the parts about disk drives, but the same basic message works for hydraulic excavators, for mainframes and PCs, and for the steel industry. The basic premise of the book is that there are two types of innovation: sustaining innovation and disruptive innovation. Big companies are really good at sustaining innovation, even when it is really hard to do. The problem with disruptive innovation is that it doesn't do anything for the existing customers of the big company. So even if the company pursues it, it doesn't get the resources. Even if management tells everyone to focus on it, they don't, because the company's money stream comes from the sustaining innovation.

Kodak and Digital Photography

A good example is Kodak (which wasn't "doomed" in 1997, and so is not in the book). When digital photography first came along, it had useless resolution and wasn't competitive with film in any way. But it only got better. By the time it was so good that even professional photographers went digital, it was too late for Kodak: other people were the market leaders. Kodak went bankrupt.

But the high-fives at Canon, Minolta, Pentax, Nikon, and others were short-lived. Smartphones came along. At first, this had no effect, since the resolution was too low—tell me if you've heard this before—to be a replacement even for a cheap point-and-shoot camera. But it only got better. Now there isn't really a point-and-shoot market anymore, and some manufacturers, such as Minolta, have exited the market completely. At the high end, there is still a market for professional digital SLRs and the like (and for the high-end hobbyist). But the most important thing about a camera turns out not to be the pixel resolution, but whether you have a camera with you when you want to take a picture. And we all have our smartphones with us all the time. Kodak didn't last long enough to be killed by smartphones; it was killed in the first digital photography wave. But that wave, in turn, got killed by smartphones.

People often criticise Kodak for "not seeing digital coming". But a big company in imaging is not that stupid. Kodak even built the first digital camera in 1975, put together by its engineer Steve Sasson. It was less than compelling:

The 8-pound camera that Sasson put together shot 0.01-megapixel black-and-white photos and recorded them to cassette tapes.
Each photo took 23 seconds to create, and the only way to view the photos was to read the data from the tape and display it on a standard television screen.

But that's the point of the Innovator's Dilemma. The initial products that are eventually going to slay the giants are never that compelling, or the giants would be selling them already. But the rate of improvement of the digital camera was fast. People would put up with 35mm film that they had to take to Photo Drive-Up to get developed and printed only as long as digital was really bad. Once digital got close, it was a no-brainer to switch.

I can't find a copy of it, but I've read an internal Kodak white paper on digital photography that laid out, with prescient accuracy, just how the digital market would unfold. First, it would be used for satellite photography. Almost any cost could be justified to avoid having to recover film dropped from satellites using jet fighters with nets, so it didn't matter if a camera cost $10M. Next, it would be used by professional users like architects, who could justify the high cost of the camera by the convenience in their workflow. This was still the height of Polaroid's instant photography, which was even more clearly vulnerable to digital photography. Kodak forecast that digital would only become a consumer product many years later, once the costs came down to be competitive with film cameras. And so it was.

But even having an accurate roadmap was not enough. The skills required to be successful in digital photography were not chemistry and an insane distribution network that covered every supermarket, drugstore, gift shop, and park concession in the entire world. Those all became stranded assets, since chemistry was not involved and there was nothing to be distributed.

The same trajectory exists in all the industries in the book. A disruptive technology comes along, but it is of no use to the people who are using the current products. The disruptive technology gets better and better, and eventually starts to take the low end of the mainstream market. The incumbents get driven up to the high end, where people still value the premium capability. In the end, either the high end vanishes completely, or it becomes a small niche like professional cameras.

PCs

Digital Equipment (DEC) eventually disappeared (it was acquired by Compaq in 1998), and the main reason is that the PC stole its market and nobody wanted to buy Vaxes anymore. Unlike the Kodak case, the capabilities required to build a PC and a Vax were broadly similar; it was more of a market issue. When PCs first came out, nobody who wanted a Vax could use a PC. A lot of IC design was done on Vaxes back in the day, for example. PCs found their own niches, especially running spreadsheets (Lotus 1-2-3) and doing basic word processing, but those were not the people that DEC sold to. DEC's stronghold was scientific and engineering computing, and it had started to expand into the mainframe market, disrupting the big mainframe suppliers like IBM by winning sales in finance and government. For example, the British Value Added Tax system apparently still runs on a (virtual) Vax, written in Cobol.

DEC wasn't oblivious and made a couple of attempts to enter the PC market with its own lines. The trouble was, their PCs were not really differentiated from anyone else's. That was the point of a PC: the "clones" were all the same. Most of the innovation in PCs came in form factor (Compaq with the portable) and channel (especially Dell). The basic PCs were all pretty much identical.

However, the more fundamental problem was that not many of DEC's existing customers could make use of a PC. Scientific and engineering computing required significant compute power and memory. In that era, Gordon Bell's book on engineering startups said that all business plans consist of (1) raise money, (2) buy a Vax, (3) do something. When you are so invested in a particular market that you are a joke in business books, a product that you might be able to sell to a non-core customer base is not that interesting.

Of course, the PC got better and better. Eventually, the scientific and engineering people didn't need Vaxes because they went to engineering workstations from Sun and Apollo. But that was an interlude, and the PC drove everyone from the market. Now only the highest-end mainframe market exists outside of PCs (I'm counting a typical server-farm server as a PC).

A similar trajectory happened in operating systems, where Linux was initially regarded as a joke. Even Linus Torvalds described it as "just a hobby, won't be big and professional" when he started. This was the era of Solaris, HP-UX, whatever IBM called its OS in that era, and so on. Now, there is Windows for desktop PCs, and everything else is Linux (iOS/macOS is not really Linux but has the same Unix roots). I remarked when writing about supercomputers (see my post Supercomputers) that there was a drop-down menu to allow you to look at the supercomputers by operating system—but the menu only contained a single item, Linux.

Education

One area that Clayton predicted would be disrupted is education. In the Cadence presentation 15 years ago, he said that he had terrified the management of Harvard, where he is a professor, with his predictions. If education followed these other industries, then an institution like Harvard would be disrupted by lower-cost education, especially online education.

I was reminded of this since I just finished reading Bryan Caplan's book The Case Against Education. His thesis is that education is mostly about signaling. Apart from a few courses like engineering and computer science, almost nobody uses anything of what they were taught in high school, never mind university, beyond basic literacy and numeracy, which most of us were done with by middle school. In that sense, a Harvard degree is more like a Rolex watch. Casio will sell you a more accurate watch. A Rolex signals that you are rich and sophisticated enough to choose one. In the same way, a Harvard degree mostly shows that you were smart enough to get into Harvard.

Online education assumes that education is about what you learn, not the signaling. As Bryan points out, anyone can have a Princeton education. Just wander into the lecture halls and attend. They don't check at the door that you are enrolled. If you ask the professor whether it is fine to sit in, they say yes, since professors love motivated students. The one thing you won't get is the one thing that matters: a Princeton degree. So the disruption that Clayton predicted 15 years ago has not happened. A Harvard education is not about what you learn, any more than a Rolex watch is about telling the time, or a Gucci bag is about having somewhere to carry stuff.

EDA

Is EDA vulnerable to this sort of Innovator's Dilemma disruption? So far, the sheer investment required to keep on the treadmill of Moore's Law has meant that there hasn't really been an opportunity to undercut the main tool suites with something that is "good enough" for the mainstream.

However, that could change as Moore's Law slows and more and more designs are done in non-leading-edge processes. There is perhaps a niche for a tool suite that can't be used for 7nm but is good enough for 90nm. In the past, that would have been too small a market to be interesting.

If I were foolish enough to start a full-line EDA company, then that is what I would do: produce a suite of tools using mature technology that is good enough for non-leading-edge design, and that is easy and cheap to use. Obviously it would use a SaaS model. If I were successful, I could use the money to push the technology closer and closer to the leading edge, if that proved profitable, and undercut Cadence and its premium competitors. I have no idea if that niche even exists, and it would be a huge challenge to produce a full flow from a standing start even for 90nm. But there is clearly no way that a head-on attack on the most advanced process nodes has any chance at all. Digital was not killed by Data General and Prime; it was killed by Dell and its competitors.

Sign up for Sunday Brunch, the weekly Breakfast Bytes email.
