The innovator’s dilemma

The Innovator’s Dilemma is a book by Harvard Business School professor Clayton Christensen. I highly recommend it as one of the most stimulating and best-written business books (I know that sounds like an oxymoron, but this is a book you will really enjoy reading). The basic thesis is that there are two kinds of innovation: sustaining (giving high-end customers what they want) and disruptive (giving a new set of less demanding customers something less than state-of-the-art). Sustaining innovation eventually gives people more than they want at a premium price point, while disruptive innovation often improves faster and eventually steals the main market from below once its basic capability addresses the mainstream at a lower price.

Here’s an example: Digital Equipment Corporation (DEC) built Vax computers in the 80s. Customers wanted more and more powerful Vaxen and had no use for the IBM PC when it came out: a low-powered machine that didn’t even have a floppy disk as standard, let alone a hard disk. But eventually the PC destroyed DEC’s business (and it will almost certainly destroy Sun’s) as it got more powerful. The dilemma is that it is unclear what a company like DEC (or Sun today) should have done. They were not stupid, they could see the PC coming, and they even made several attempts to enter the PC market themselves. But the PC was of no use initially to their primary customers, and they didn’t really have the capability to sell to the people who could make use of early PCs. By the time the PC was powerful enough to be of interest to the scientific and business computing segments, where DEC sold most of its kit, it was too late. Other companies (Compaq, Dell, etc.) were already established as the leaders, and DEC was eventually dismembered, with part going to Intel and the rest to Compaq, so ending up inside HP. It is not that it was, or is, impossible to build a computer more powerful than a high-end PC; it is that the cost differential is so large that very few applications justify paying such an enormous premium.

Clayton’s book has some other lovely examples: cable-driven earth-moving equipment being driven out by hydraulics; steel mini-mills starting with rebar and gradually working up to high-grade sheet steel; and so on.

When I was at Cadence we had an annual engineering conference: a mixture of presentations of papers that could not be presented externally due to confidentiality, a social get-together for engineers from dispersed sites, and an opportunity to address a lot of engineering in one place (I think about a third of all Cadence engineers attended). Professor Christensen was one of the keynote speakers at one meeting, and he was a fascinating speaker too.

One thing he discussed a bit was the end of Moore’s law. He predicted that Moore’s law would end because it would deliver more capability than the mainstream required, at a price higher than the mainstream wanted to pay. This was already happening in the PC marketplace, where microprocessors have for some time been “fast enough” for almost all applications (whereas through most of the 1980s and early 1990s people would upgrade their PCs regularly simply because the old ones lacked oomph).

I think it is clear now that the mainstream PC market is, in its own turn, going to be disrupted from below by iPhone-like devices. iPhones will get more powerful until most of what a PC is used for can be done on an iPhone (or a Google Android-based phone or a Nokia one; I’m just using iPhone as shorthand). Of course they don’t have big screens or keyboards, but if my office and home had those, then my powerful future iPhone would simply work from my pocket when I was nearby. Or maybe it will project onto my retina or sense the muscles in my fingers or something. Who knows?

For many systems, FPGAs are disrupting ASICs from below in traditional innovator’s dilemma style. Nobody does an ASIC unless they absolutely have to, which means an enormous amount of integration, enormous volumes, or low-power requirements (low power being the Achilles’ heel of FPGAs). If you can use an FPGA then you will.
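To make the volume side of that tradeoff concrete, here is a minimal back-of-the-envelope sketch. All of the figures (NRE, unit costs) are invented for illustration, not real quotes; the point is just the break-even structure: an ASIC’s large up-front NRE only pays off once the per-unit saving is multiplied across enough volume.

    # Back-of-the-envelope FPGA vs ASIC break-even (illustrative numbers only).

    def breakeven_volume(asic_nre, asic_unit_cost, fpga_unit_cost):
        """Volume at which total ASIC cost drops below total FPGA cost."""
        saving_per_unit = fpga_unit_cost - asic_unit_cost
        if saving_per_unit <= 0:
            return None  # the ASIC never pays off on unit cost alone
        return asic_nre / saving_per_unit

    # Hypothetical figures: $3M of ASIC NRE (masks, tools, verification),
    # $8/unit for the ASIC versus $40/unit for an equivalent FPGA.
    volume = breakeven_volume(asic_nre=3_000_000,
                              asic_unit_cost=8.0,
                              fpga_unit_cost=40.0)
    print(f"ASIC pays off above ~{volume:,.0f} units")  # ~93,750 units

Below that (invented) break-even volume the FPGA wins on total cost, and it also avoids mask charges, respin risk and months of schedule, which is why the default answer keeps drifting toward the FPGA.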

Moore’s law has been driven for decades by semiconductor economics. It was always 30% or more cheaper to use the new process generation than to stick with the old one. But it is not clear whether 28nm (and 22nm, or whatever comes next) will deliver such a cost reduction. Maybe 22nm is going to be the mainframe of semiconductor processing: very expensive, and delivering more capability than the mainstream market can take advantage of. The mainstream will hold back on older processes and use clever software to get what they want; after all, most chips these days are just substrates for running the software of the system.
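That 30% number falls out of simple arithmetic: cost per transistor scales as wafer cost divided by transistor density, so it only falls if density improves faster than wafer cost rises. A toy model with invented ratios shows both the historical case and the worry at 28nm and below.

    # Toy model of cost per transistor across process nodes.
    # The wafer-cost and density ratios are illustrative assumptions only.

    def cost_per_transistor_change(wafer_cost_ratio, density_ratio):
        """Fractional change in cost/transistor moving to the new node.

        Cost per transistor ~ wafer cost / transistor density, so the
        new/old cost ratio is wafer_cost_ratio / density_ratio.
        """
        return wafer_cost_ratio / density_ratio - 1.0

    # Historical Moore's-law economics: roughly 2x density for roughly
    # 1.4x wafer cost gives the classic ~30% reduction per generation.
    print(f"{cost_per_transistor_change(1.4, 2.0):+.0%}")   # -30%

    # The worry going forward: smaller density gains (double patterning,
    # yield) against a bigger wafer-cost jump, and the saving vanishes.
    print(f"{cost_per_transistor_change(1.9, 1.8):+.0%}")   # +6%

Once that second case holds, moving to the new node makes chips more expensive per transistor, and only products that desperately need the density or the power savings will follow it.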
