Guest blog: Sandeep Srinivasan

Sandeep is currently a consultant at Mskribe. Most recently he was Vice President of West Coast Operations for CLK-Design Automation. Prior to CLK-DA, he was CEO and founder of Synchronous-DA, which merged with CLK-DA, and his history goes back through Cadence and HLD Systems. He began his career as a CAD engineer at AMD in the x86 group.

EDA and the 50 picosecond problem

There has been a lot of introspection and analysis recently by EDA executives and analysts as to why we are where we are as an industry. There seem to be no easy answers as to why EDA is at the bottom of the economic food chain, in spite of the stellar growth in demand for electronic devices. We can blame the recent economic crisis, declining ASIC design starts, rising mask costs, etc., but the writing was on the wall well before the market meltdown.

EDA ecosystem revisited

One can point to many issues with the EDA industry and attempt to root-cause why we have not been able to get a bigger piece of the semiconductor pie. A few things that come to mind are the following:

The venture capital community seems to look at EDA as a broken business model, with little or no upside, due to the lack of recent exits. An investor recently told me, and I quote, “The smallest ROI per Ph.D. is in the EDA industry.”

Large EDA vendors are not feeding the ecosystem. There has been little or no funding for startups or academic research from the large EDA vendors. In addition, there is less and less collaboration from the large EDA vendors. While this approach is conceivably good for protecting one's franchise, it may be a flawed strategy for the long term in a technology-driven industry such as ours.

Funding for research institutions has fallen out of favor with EDA companies. This is a disturbing trend, considering that much of the foundation of EDA software comes from university research.

Large EDA vendors want to mimic the monolithic enterprise software (Oracle, SAP) model. This sounds very attractive to the CEO of an enterprise, as compared to a heterogeneous ‘best-in-class’ model, which would entail internal support and development. The one difference between EDA and enterprise software is perhaps the fact that FASB (Financial Accounting Standards Board) or GAAP (Generally Accepted Accounting Principles) rules don’t change at the rate at which the semiconductor process or design requirements change.

EDA startups have always relied on angel investors who were as passionate about EDA technology as the founders. This is another very important source of capital that is drying up, and it is perhaps the one with the most serious implications for us as a technology industry. If successful EDA “angels” don’t feel comfortable investing in our industry, that says we have a serious problem on our hands.

The semiconductor industry (our customer base) has been slow to adopt new technologies, due to cost pressures. In addition, customers have gotten used to getting products for a fraction of what they used to spend five years ago. This trend, compounded by a lack of pricing discipline from the EDA vendors, has led to significant price and value erosion.

The EDA industry’s disaggregated software ecosystem is clearly hurting the industry. If companies differentiate themselves based on a file format (CPF versus UPF, for instance), we have some critical thinking to do as an industry. Efforts such as Open-Access have not gained the traction they should have to move our industry away from competing on issues that add little or no value, such as proprietary file formats. Perhaps ‘Open-Access’ needs to be more ‘open’.

The 50 picosecond problem

The final problem that comes to mind is what can be termed the ‘50 picosecond problem’: innovation seems to stall when an industry segment nears maturity.

To highlight the ‘50 picosecond problem’, consider the IC physical design tool segment of the EDA industry as an example.

IC physical design tools from Company S, Company C, Company M, and Company M can take the same design and produce results within 50 picoseconds (a figure of speech rather than a literal number) of each other.

What this highlights is a lack of differentiation amongst physical design tools. In addition, we see new startups in the physical design space that develop a tool from the ground up, only to be marginally better (50 picoseconds?) than the incumbent tools.

What could be the reason for such incremental differentiation in products that are developed from the ground up with the premise of displacing incumbent tools? Perhaps all the engineers are reading the same books and implementing the same algorithms again and again? Could it be that the semiconductor process is scaling so well that there are no new disruptive physical effects?

No easy answers

We will attempt to address some of the issues highlighted above in a later blog entry.
