Guest blog: Steve Schulz

Today’s guest blog is by Steve Schulz. These days Steve runs the Silicon Integration Initiative (Si2). Prior to joining Si2, Steve was VP of corporate marketing for BOPS, and before that he had a tenure of nearly 20 years at Texas Instruments in a wide variety of positions. He has been heavily involved in many EDA standards efforts.

The law of unmet needs: embedded software and EDA

Over the years, I have developed many “Belief Bricks”. What are these, you ask? They are the bricks that form my foundation for explaining the world around me. We all have them – a filter through which we accept input and shape our ideas. I’d like to pick out one of them here and connect it to a trend in our industry: the “Law of Unmet Needs”.

I like this one – it is the root of business opportunity. The recipe: find something that is painful, getting worse, and that no one (seems to be) addressing. Then figure out a (market-feasible) approach to solving it, use good business skills to manage the solution, add water – and voilà!

Of course it’s not quite that simple. Yet it remains the basic recipe behind the business plans of new startups, and of new products within existing companies. It is also the basis for how we approach new standardization efforts at Si2. Part of the trick is truly understanding what the market is telling you about the unmet need; part is deftly navigating the hazards along the way so you don’t fall into the abyss.

One benefit of my role at Si2 is the opportunity to listen to a wealth of input from across our member companies. Recently, there has been a noticeable increase in the pain associated with designing complex silicon that runs embedded software. The trend itself is not new – SoCs keep adding processors, and those processors keep running more software. What has changed is the added risk this embedded-software “variable” brings to meeting the necessary parameters of the hardware design task.

One working group in Si2’s Low-Power Coalition, while addressing power at the architecture / ESL level (where 80% of the potential savings are hidden), concluded that the lack of standards for higher-level power modeling was a barrier to industry progress. Even without an embedded software component, estimating and managing power consumption during a product’s operation is hard enough. Yet many products today carry multiple processors, and that trend will continue. Your smartphone’s silicon burns power dictated by the software that owns the bulk of the digital functionality. The energy dissipated by switching transistors is a direct consequence of the software’s operation… but EDA flows lack a means to factor that into the design trade-off space. Which operations must be concurrent? What impact will switching power / frequency modes have on critical response times as timing fluctuates? Which architecture is best suited to the combined (software + hardware) time-varying functionality? How do we work more cooperatively with the software team?
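To make the software dependence concrete, recall the textbook first-order model of dynamic power, P ≈ α·C·V²·f: the effective switched capacitance C, the supply voltage V, and the clock frequency f belong to the hardware, but the activity factor α is set largely by what the software is doing. The short Python sketch below is only an illustration – every number in it is an assumed, hypothetical value, and in a real flow the per-workload activity factors would come from instruction traces or ESL simulation rather than being invented:

    # Minimal sketch: how software activity enters the hardware power budget.
    # All numbers below are illustrative assumptions, not measured silicon data.

    def dynamic_power_watts(activity_factor, switched_cap_farads, vdd_volts, freq_hz):
        """First-order dynamic power estimate: P = alpha * C * V^2 * f."""
        return activity_factor * switched_cap_farads * vdd_volts ** 2 * freq_hz

    # Hypothetical operating modes for the same core.
    modes = {
        "high_perf": {"vdd": 1.1, "freq": 1.2e9},   # volts, hertz
        "low_power": {"vdd": 0.9, "freq": 600e6},
    }

    # Hypothetical per-workload switching activity; in practice this would be
    # extracted from the embedded software's execution profile.
    workloads = {
        "video_decode": 0.25,
        "idle_polling": 0.05,
    }

    SWITCHED_CAP = 2e-9  # effective switched capacitance in farads (assumed)

    for workload, alpha in workloads.items():
        for mode, params in modes.items():
            p = dynamic_power_watts(alpha, SWITCHED_CAP, params["vdd"], params["freq"])
            print(f"{workload:>13} in {mode:>9}: {p * 1e3:7.1f} mW")

Run either workload through either operating mode and the resulting power points diverge by factors of several – exactly the trade-off space that today’s hardware-only flows cannot see, because α lives on the software side of the wall.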

This problem does not lend itself to simple in-house solutions. It’s no wonder we hear so much about the rising cost and complexity of designing silicon – to the point that the venture capital community has “moved on” to other (more attractive) areas. There is clearly a large unmet need here, and the trend has nowhere to go but up.

In the past, established EDA vendors have said they rejected this growing aspect of design because there is no money in the “software world” (think free compilers and $995 development kits). That logic is flawed. Competing in the software development world would mean addressing a different problem – one that already has a plethora of solutions. The unmet need sits squarely within the problem EDA has already scoped: effective design of silicon to requirements under a host of complex constraints. EDA adapted to adjacent manufacturing issues and integrated DFM concerns; perhaps software is the next adjacency. How much would companies pay for genuine improvement on this problem, in a new world order that puts embedded software onto nearly every chip? And how will we design to even more stringent requirements five years from now if the trend continues?

Perhaps this problem area is not being addressed because we lack a clear vision of any feasible approach for connecting our world of silicon design and the world “on the other side”. Perhaps no single company can deliver a useful solution without more enabling infrastructure to support it. Perhaps we haven’t really tried yet.

I see the trend continuing: more embedded processors, and more complex silicon design parameters that depend on what the software driving the chip actually does. What do you see? Is this really an unmet need? If so, how would you propose the industry tackle it? I would be interested to hear your comments.
