Presentations books

A few days ago I was asked by a friend what books I would recommend on putting together presentations. There are lots out there and I don’t claim to have looked at all of them. But here are five that I think are especially good.

The first book isn't actually about presentations specifically but about one aspect of a few slides in many presentations. It is Edward Tufte's book "The Visual Display of Quantitative Information." He has two more books which are also worth reading, but they cover much of the material in this first book in more detail. Anyone who has ever put a graph or a table in a presentation (or in an article or white paper, for that matter) should read this book. It is full of wonderful examples of appalling presentation of data as well as some exemplary ones. Too many books on presentations show you some good examples without being brave enough to call out presentations that don't work.

The next book is very analytical and contains a lot of data about what works and what does not in presentations. It is Andrew Abela's "Advanced Presentations by Design." One key finding is that putting the points you want to make as bullets on your slides and then speaking while showing them is actually less effective than either showing the slides and shutting up, or giving the speech and not showing the slides at all.

Next, two books that really are about putting together presentations. Garr Reynolds has a book called "Presentation Zen" and Nancy Duarte has one called "slide:ology." These two books cover much the same material from slightly different perspectives. In fact the blurb on the back of each book is written by the other author. You probably don't need both of them, but you'll need to look at both to decide which one you feel most comfortable with. Both books carry on from the analysis I mentioned above, emphasizing that a presentation should be designed to reinforce visually what you are saying, not repeat it textually. A presentation is not a crutch for the presenter, nor a sort of teleprompter to compensate for not having rehearsed enough.

Finally there is Jerry Weissman’s “Presenting to Win.” This is complementary to the other books in that it focuses much less on the visual aspect of a presentation and much more on how to make a presentation tell a story. His track record and focus is putting together presentations for IPO roadshows, which are probably a type of presentation that has more money riding on it than anything else. But most of what he says is appropriate for other types of presentations.

Between these books you get instruction on how to create a compelling narrative in a presentation, how to maximize its visual impact, how to display quantitative information compellingly, and more analysis than you probably care to read about what works and what doesn't in presentations.

Two other resources that I think are good for presentations: any Steve Jobs keynote speech (look at the iPhone announcement if you only look at one) and many of the speakers at TED, which has a 20 minute time-limit and so forces speakers to maximize their impact and focus on the most important messages.

Posted in book review | Comments Off

EDAC’s origins

Because I do some consulting for Oasys, I crossed paths with Rick Carlson who is an investor and advisor to them. I hadn’t realized that he was one of the people responsible for getting EDAC going.

In fact, when EDAC started it wasn’t for the whole industry, it was for the smaller EDA companies and actually was called IDAC, for Independent Design Automation Consortium (I’m not quite sure in what way the big companies were “dependent” but it wasn’t a bad name).

It started back in 1987 when Rick was VP sales at EDA Systems (remember them, they did frameworks back when frameworks weren’t called things like OpenAccess) and Dave Millman was at Epic. Almost jokingly they discussed creating an industry consortium before realizing that it was not just a way of passing time over a couple of beers but something that they should actually go out and do.

So they pulled together ten or so small EDA companies into a conference room in San Jose and in 3 or 4 meetings over 6 months started to create IDAC. But they rapidly realized that an industry consortium without the elephants in the living room wasn’t really an industry consortium.

So they pulled together a meeting at the San Jose Jet Center (which Rick's wife managed, so I bet they didn't pay for the conference room) that was attended by Cadence, Dazix (already drowning), Zycad and others. The first hour didn't go very well, with nobody really prepared to say anything. Luckily, another of Rick's investments had been Pete's Brewing Company and he'd brought along a case of Pete's Wicked Ale. Time to drink the Kool-Aid, er…beer.

Soon Joe Costello, as usual, was taking over, laughing and getting everybody rolling. A couple of months later Joe had everyone on board and they were starting to put some substance on what the issues were for EDA as an industry.

But that early promise never really went anywhere. EDAC got hijacked by the marketing managers of each company, who took their eyes off the big issues. EDAC declined into what it is today: it collects industry data and organizes a few events, but doesn't serve in any real way to raise consciousness about the big issues affecting EDA as an industry, let alone do anything about them.

If you’ve read much of this blog you know that I am very critical of the way that EDA failed to become a strategic partner to its main customers and allowed itself to be pushed down into an expense line in the CAD department budget. The ERP companies like Oracle and SAP manage to extract more dollars from semiconductor companies than does EDA, despite being far less strategically critical. EDAC unfortunately never managed to have any real effect on that.

Posted in eda industry | Comments Off

Tekton

I met with Bob Smith, VP of product marketing for Magma, earlier this week, over a glass of wine and lunch in a nice French bistro, to talk about the new product they are introducing today: a completely new, from-the-ground-up timing engine called Tekton. I happen to know that Tekton is also a typeface (font) designed by David Siegel, one of the first designers to take the web seriously.

Magma have kept this development insulated from their general turmoil over the last couple of years. A year ago Magma seemed like it was coughing up blood and might not recover. Now they are clearly back. Rajeev apparently asked lots of CEOs of other companies how to cope with the downturn. The answer: "Slash R&D until the recovery comes and milk the existing products." I'm not sure that ever works in EDA, where the next process node is coming down the track regardless, and Rajeev wisely ignored the advice, if anything doubling down on R&D since the stock market couldn't punish him any more anyway.

It is always interesting watching established EDA companies bring completely new products to market. Far fewer of them succeed than you would expect. I think this is primarily due to the risk aversion of their sales forces, who make most of their money re-selling the same tools as last time and don't want to risk delaying the renewal by introducing untried new products into the mix. That makes it impossible to mature the product, and it dies on the vine. In fact the only completely new products I can think of that you would call successful are Calibre from Mentor, FineSim from Magma and, perhaps, PrimeTime from Synopsys (although they cheated there by buying Viewlogic and then shutting down Motive, the then market leader). All other products were either developed when the company was a startup (most notably Design Compiler and Magma's original BlastFusion), came through acquisition, or were/are only marginally successful.

PrimeTime is now about 15 years old and so is perhaps vulnerable. Its basic architecture seems to make it hard to keep its performance in the top rank. Back then a full analysis was about 1 million instances and four process corners. Now a full analysis is about 50-100 million instances and 400 process ‘corners.’ The only way to do this with PrimeTime is to set up a huge farm and run lots of jobs in parallel on dozens of licenses. For example, one run of just 1.3M cells in 3 modes with 9 process corners (so 27 scenarios) required 27 machines and 65 minutes to do the analysis.

Tekton can apparently (and remember I was having lunch with the marketing guy not a hands-on user) do everything on a single 4 CPU machine in less than 30 minutes, and do a crosstalk analysis on the side. If these sorts of numbers hold up across the board, and if the correlation with PrimeTime (still the signoff standard) is nearly perfect then this will be an interesting battle to watch. Even though the product is only just being announced, Magma have already closed some customers and have several other high-profile evaluations in progress.
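To put those numbers side by side, here is a back-of-the-envelope sketch (a hypothetical Python illustration of the arithmetic above; the farm figures are the example quoted to me and the Tekton figures are Magma's claims, not my own measurements):

```python
# Rough comparison of the multi-scenario timing arithmetic described above.
# The farm numbers are the quoted PrimeTime example; the Tekton numbers are
# Magma's marketing claims, not independent measurements.

def scenario_count(modes: int, corners: int) -> int:
    """Each mode/corner combination is a separate analysis scenario."""
    return modes * corners

def machine_minutes(machines: int, minutes: int) -> int:
    """Crude cost metric: how much hardware (and how many licenses) you tie up."""
    return machines * minutes

scenarios = scenario_count(modes=3, corners=9)      # 27 scenarios
farm = machine_minutes(machines=27, minutes=65)     # 1755 machine-minutes, 27 licenses
tekton = machine_minutes(machines=1, minutes=30)    # 30 machine-minutes, 1 license (claimed)

print(f"{scenarios} scenarios")
print(f"PrimeTime farm: {farm} machine-minutes on 27 licenses")
print(f"Tekton (claimed): {tekton} machine-minutes on one 4-CPU machine")
```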

John Cooley managed to find some rumors about Tekton, including the product name, and correctly surmised that Magma would announce it at Music, their users conference, taking place this week.

Going forward, of course, Magma will be integrating the Tekton timing engine into their other timing-driven tools, such as Talus, to replace the older engine. This should speed up their other tools and also keep them on the capacity ramp necessary to handle sub 30nm designs.

The takeaway is that Tekton can time any design on a single machine in under an hour with just a single license. I think PrimeTime does over $100M in business so there is plenty of upside available for Magma to steal if the technology is as good as described. Let the battle begin.

Posted in eda industry, marketing | Comments Off

Altium: EDA Oz-style

One theme of this blog is that IC EDA is increasingly inspecting its own navel while the bulk of design is going on in the PCB and FPGA worlds (not to mention software). One company focused on this market is Altium. They used to be called Protel and were famous for a couple of things. Firstly, they were Australian. And secondly, they sold their software at a very low price, partly because it wasn't really state-of-the-art compared with the more expensive packages. Today they are around $50M in revenue, with about 300 people worldwide: that intermediate size where they are no longer small but still not the market-share leader (in dollars). Worldwide they have about 40,000 individual licenses.

Over the last few years they revamped their product into a single integrated platform which was first released in 2004. I talked with Jeff Hardison and Bob Potock last week about their business strategy and their results.

The money quote is that over 500 US companies (companies, not individual engineers) switched to Altium Designer last calendar year. And they are strict about what counts as a switch: the company was not already a customer of the older product, and it was using a competitor's tools (Mentor, Cadence or Zuken, who I thought was no longer around). In a few cases they counted separate divisions of large companies as separate companies, but that is not material to the 500 number.

This is obviously a very nice result to have. Anyone in EDA would love to have 500 new customers switch from a competitor to their product. So they decided to survey them to find out why.

Since Altium Designer sells for $4,595 including the entire platform, incorporating tools for PCB design, FPGA design, embedded software, IP delivery, verification, change control and more, price would be one likely reason. A few people identified price but, by and large, people switched because Altium Designer is better in some dimension: easy to switch to, easy to use, a unified platform. Once these people had switched, a whopping 84% reckoned that their productivity had improved by at least a factor of 2, and nearly a quarter of them reckoned it had improved by a factor of 4 or more. One thing that seems to help a lot in ramping up on the new tool is the 300-400 "how-to" videos that Altium has available.

By any measure these are impressive results. If a "company" averages (I'm guessing) 5-10 engineers, then 500 companies is 2,500 to 5,000 new individual users (out of 40,000). And remember, this is in the US alone. Most EDA companies are struggling to maintain their revenues and to open new accounts in the current downturn, which makes these numbers all the more remarkable.

Their competition is basically Mentor and Cadence with tool-chains that have been put together over the years by acquisitions. By contrast, Altium bit the bullet and built a completely new fully integrated architecture where everything is held consistent. For example, if you re-assign a pin on an FPGA you’d like that to automatically update things inside the FPGA (so that when the FPGA is routed it uses the new pin assignment) and outside on the board (so that the PCB traces correctly go to the right pins). In Altium this indeed happens. In other tool chains, not so much. You can even buy add-on tools to take care of this deficiency.
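To make the consistency argument concrete, here is a minimal sketch of the idea (my own illustration in Python, not Altium's actual data model or file formats): a single authoritative pin map from which both the FPGA constraints and the board-side connections are derived, so a re-assignment can never leave the two views out of sync.

```python
# Illustrative only: a single source of truth for FPGA pin assignments.
# Both the FPGA-side constraints and the PCB-side netlist are *derived* from
# the same mapping, so re-assigning a pin updates both views by construction.

class PinMap:
    def __init__(self):
        self._signal_to_pin = {}                 # the one authoritative mapping

    def assign(self, signal: str, pin: str) -> None:
        self._signal_to_pin[signal] = pin        # re-assignment just overwrites

    def fpga_constraints(self) -> list:
        # e.g. lines for a (hypothetical) place-and-route constraint file
        return [f"place {sig} at {pin}" for sig, pin in self._signal_to_pin.items()]

    def pcb_connections(self) -> dict:
        # what the board layout tool sees when routing traces to the FPGA
        return dict(self._signal_to_pin)

pins = PinMap()
pins.assign("uart_tx", "A7")
pins.assign("uart_tx", "B3")     # move the signal to a different pin
print(pins.fpga_constraints())   # ['place uart_tx at B3']
print(pins.pcb_connections())    # {'uart_tx': 'B3'}
```

In a tool chain stitched together from separate acquisitions, each view has its own database and keeping them synchronized is a separate, and fallible, step.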

It reminds me of the tools we built from the ground up at VLSI which had many of the same attributes although with less automation (computers and databases were much more primitive 20 years ago). I’ve argued before that for most people, a Ferrari is better than a Formula-1 racecar; a fully integrated turnkey solution that just works and is an order of magnitude cheaper. Altium are delivering on this but with one big difference: the Ferrari goes faster than the F-1 racecar.

One thing that I immediately thought of is that if Altium sells their software for $5,000, and if almost everyone is listing better rather than cheaper as the reason they switched, then aren't they leaving money on the table? It reminds me of Model Technology in the early days, when it sold a VHDL simulator for about a third of the cost that Cadence and Synopsys were charging for Verilog simulators. They had the best VHDL simulator out there and I'm convinced they'd have sold almost as many at twice the price.

Well, I’m not going to do any more of Altium’s marketing for them but it is nice to see a company focusing on the 95% of designs that are not SoCs and disrupting an established market both from below, Innovator’s Dilemma style with a lower price point, and from above, delivering a premium product.

Posted in eda industry, sales | Comments Off

EDAC forecast panel

The annual EDAC CEO forecast meeting was last week. For a change, Synopsys had just reported their results earlier and so were not in their quiet period. Unfortunately, Wally had just entered his quiet period so, as he put it, he was only allowed to talk about the past.

First up was Lip-Bu of Cadence, who started out wearing his venture-capitalist hat. He said that VCs were not finding any semiconductor companies to invest in right now. The economics are that it takes $100M to get a state-of-the-art SoC into production, you need to sell 80M units to amortize all the costs, and the chance of profit is low. Not exactly a compelling elevator pitch. It's all about the profit was his message: the 90s was the decade of point tools, the 2000s the decade of platforms, and the 2010s are the decade of profitability.

Wally was up next and instead of his usual slides predicting the future, we only had slides predicting the past. In particular, pointing out how accurate he had been in his predictions from last year.

John Kibarian of PDF Solutions was up next. His slides said explicitly "Confidential. Any reproduction prohibited," which was funny in a public forum. I once made the same error presenting to IBM. They stopped the presentation and said they couldn't look at it if it was company confidential. John's connection to fabs gave him a perspective on how bad the downturn had been. In early 2009 fab utilization averaged 30%, with customers cutting back orders by a factor of 10. However, utilization has popped back up quickly. TSMC is planning a capital investment of $4.8B in 2010, its largest ever.

To him the big interesting question in his space is which flavor of 28nm will win, especially in terms of controlling variability. In the blue corner, gate-first; in the red corner, gate-last; and, mixing my metaphors, this is a three-horse race in which polysilicon is still in the running. Gate-first, which seems to be the IBM approach, is the most like current processes, but the gate has to be able to withstand the 1000°C processing needed after ion implantation. Gate-last, which is what Intel does, is actually gate first and last: a conventional self-aligned sacrificial polysilicon gate is built (although with a totally different dielectric) and then finally removed and replaced with metal. John reckons that by the time these processes actually get to volume manufacturing they will be much more similar than they currently seem.

Aart was next up and he thinks we’re at an inflexion point where we are moving from scale complexity, basically Moore’s law, to systemic complexity (IP, chip, board, software etc).

Then it was question time. Jay Vleeshhouwer, the moderator, asked a lot of questions. So many that by the time he threw them open to the floor, about 10 minutes after everything was meant to have finished, we were all ready to go home. The most interesting question was about how semiconductor consolidation would affect EDA. The cost of developing a process (TD in the lingo of semiconductor companies) is $1-1.5B, and then $3-5B for a fab to run it in. So there has been a lot of consolidation around a small number of processes, and a lot of companies moving to fabless or fab-lite models (except for memory). Fewer processes is good for EDA since each one is expensive to support.

Wally pointed out that semiconductor is not consolidating despite things like Renesas and NEC. The market share of the #1 semiconductor company is the same today as 35 years ago (when it was TI, before passing the baton to NEC, who passed it to Intel). Market share of the top 5 is less than it was 35 years ago; the top 10 is the same. And 2008 was the first year that a fabless semiconductor company, Qualcomm, was in the top 10. EDA is not consolidating either, despite all the little mergers. The market share of the big 3 has been unchanged for 10 years.

Despite Lip-Bu's early comment about not funding fabless companies, it seems that fabless investment in 2008 was down, but only to 80% of the long-run average, which seems surprisingly high. Wally also pointed out that churn is good for EDA: when a fabless company goes out of business the engineers pop up again soon at other companies, but the licenses die and the new companies need to re-invest.

Lip-Bu pointed out that this year 10-12 semiconductor companies should go public and that he thinks VC will swing back. I’m getting a mixed message here. Spinoff activity is also increasing.

Jay next asked where the next big $100M market would come from, in the same way that RET went from $0 in 1999 to close to $200M now. Aart pointed out that if it were obvious, everyone would be investing heavily in it and there would be 30 startups in the area. There is a major aspect of serendipity in these things.

Finally, acquisitions. Aart said that the best acquisitions are things that are neither too close to nor too far from what you do already: too far and there is little synergy, too close and there is too much overlap. Wally agreed and pointed out that recessions are a time of opportunity, since you can do deals that might not get done in sunnier times. But he warned that if you take the #3 company in a space and merge it with the #5 company, you are much more likely to end up with the #7 company than the #2 company.

And with that it was time to fill in our predictions for the stock prices for next year. I forgot to hand my card in but I guess 8 for Cadence, that Magma would no longer be independent, 11 for Mentor, MIPS would no longer be independent, and 27 for Synopsys. But if I could do this stuff well then I’d be much richer than I am.

Posted in eda industry | Comments Off

Guest blog: Sandeep Srinivasan

This is the second part of a piece by Sandeep Srinivasan. The first part is here. This follows on from my piece yesterday and sets up a different view from mine (you’ll have to wait until next Tuesday, or read lots of this blog, to find out my opinion). Be at the San Jose Doubletree at 6.30 on February 23rd in the Oak ballroom.

EDA and the 50 picosecond problem, part 2

There has been a lot of introspection and analysis recently by EDA executives and analysts as to why we are where we are as an industry. There seem to be no easy answers as to why EDA is at the bottom of the economic food chain, in spite of the stellar growth and demand for electronic devices. We can blame the recent economic crisis, declining ASIC design starts, rising mask costs and so on, but the writing was on the wall well before the market meltdown.

There may be a variety of reasons why there is a lack of significant differentiation among physical design tools, as discussed in the previous segment, some of which are highlighted below.

The cost of EDA software development is extremely high compared to any other segment of the software industry. For a startup to shine and differentiate itself, it has to build the mundane first (foundation, database, file format support) before attempting to show its value proposition. By the time the foundation software layer is done, there are significant cost pressures for the startup to scale back its differentiation and compete head-on with incumbent products on features.

The cost of sales in EDA is high, not in the traditional accounting sense, but when we look at a typical sales cycle. An EDA tool ‘benchmark’ can last for months, extending the sales cycle to a degree where it ceases to make financial sense for a startup company.

The large EDA vendors do innovate significantly but tend to be encumbered by their existing customer base, and choose to focus on incremental rather than disruptive innovation.

So what is the answer for us as an industry to get out of this slump?

Reduce the cost of EDA software development. This will require EDA companies to devote significant effort to an 'open-source'-like paradigm.

Engage with universities to encourage the next generation of EDA developers with fresh ideas. This may lead to us finding the 100 picosecond or 1 nanosecond differentiator.

Short-circuit the long benchmark cycles by enabling web-deployable tools and focusing on ease of use. This is perhaps easier said than done, due to the complexity of EDA software tools, but it is a necessary step for the industry to jettison an archaic business model. The burden lies on the creativity of EDA developers to make tools easier to use and deploy.

Large EDA companies need to step up and encourage disruptive innovation, either through funding or by feeding the software ecosystem. Companies like Cisco Systems have mastered the art of spin-offs and spin-ins, while companies like Google and Yahoo have contributed significantly to the 'open source' software ecosystem. The EDA industry should follow these models to accelerate innovation.

There are significant technical challenges for EDA developers to solve and capitalize on. Some that come to mind: power distribution, fine-grain on-chip voltage control, semi-synchronous logic, and system-level compilers for hardware-software partitioning.

Merging EDA with circuit IP can expand the market segment while adding significant additional value to our customer base. This notion has been tried in the past with marginal success. Perhaps the timing is right for these activities to accelerate and morph into a new business model for EDA.

The next generation of EDA entrepreneurs needs to build companies without an overwhelming focus on an 'exit strategy,' and should not be shy about building profitable 'lifestyle' companies. Some of the most successful companies in the electronics industry were built on the premise of adding value to the engineering community, not with a focus on how to 'exit.'

Some of the answers to our industry’s problems lie in our ability to efficiently innovate out of the economic slump and to pay attention to creating differentiated products.

Posted in guest blog | Comments Off

So you want to start an EDA company?

As I have said repeatedly, the old model for innovation in EDA has died. The old model was largely that venture capitalists would fund teams of engineers, they would produce products to solve some problem that was looming on the horizon, one or two of them would turn out to be the market leaders, the big EDA companies would buy them for significant money and everyone was happy.

This model is broken for all sorts of reasons. The big EDA companies just don’t have the stock valuations and the cash to make acquisitions in the $100Ms. As a result, almost no VC will fund a new EDA company or even put much more money into one that they already have on their hands. Plus, the slow adoption of the most advanced technology and falling design starts make it impossible to justify that type of valuation.

On the other hand, it remains really easy to start an EDA company. Find a problem, get together a few engineers who really understand it, and write some code. If you successfully solve a key problem for the leading edge semiconductor companies, they will buy your product.

Some things work in your favor. Part of the reason that the VC model isn't working well for EDA is that it isn't working well in a lot of segments: it just doesn't cost enough to require VC-level investment. Computers are cheap. Cloud computing means that large capital investment is not required; you can have all the peak compute power and storage you need for almost nothing. If you are lucky, you may find enough friends, fools and family to invest in your company. Probably you won't be able to pay anyone until you get your first product far enough along that you have a chance to raise some investment. The key is to keep that investment small, keep the burn rate low and run the company so that a $20M acquisition is attractive. Or follow the old Metasoft (HSPICE) model of a small, highly profitable EDA company that throws off a lot of cash every year in bonuses, even if you don't get acquired (although they did in the end, when Gerry Hsu sent them an offer they had 24 hours to accept). VCs disparagingly call these "lifestyle" companies, but if you have 15 people in the company and $10M in revenue, that could be quite a lifestyle.

So why am I reiterating all this? Because Jim Hogan and I are leading a discussion on just this topic next Tuesday evening. “So you want to start an EDA company…” at the San Jose Doubletree on February 23 at 6:30-7:30 in the Oak Ballroom (bars are open in the area!). This is during DVcon but you don’t need to be registered to come along.

Posted in investment, management | Comments Off

CoWare

It must be something in the water in Silicon Valley right now. No sooner had I written about VaST and Virtutech being acquired than Synopsys acquired CoWare as well. Since I never worked there, I don't have any sort of inside track on what is going on. CoWare started life many years ago with a product called n2c, which stood for napkin to C, an attempt at turning high-level system ideas into models quickly. They then did a deal with Cadence whereby they took over the old Comdisco SPW product line and the engineering team that supported it, in return for equity and royalties I think. More recently they decided to develop their own virtual platform technology which, I believe, is the heart of their business today. For a long time a lot of their revenue was service revenue in Japan (remember, Japan is the most advanced in system-level thinking, so it tended to be the first place where there was any revenue to be had), although I presume that is no longer the case.

To sum up, a number of different instruction set simulation technologies have been developed: Axys (which ARM acquired and then spat out again to Carbon when they decided they didn't want to do all that modeling any more), Virtio (which Synopsys acquired a couple of years ago), VaST (which Synopsys acquired two weeks ago), Virtutech (which Intel/Wind River acquired one week ago) and CoWare (which Synopsys acquired today). Phew.

Simon Davidmann and Brian Bailey both pointed out, correctly, that I'd forgotten about Imperas when I talked earlier about which technologies were left out there. There is also, of course, Qemu, which is an open-source instruction set simulator. The base technology of Imperas is also open source, the open virtual platform (www.OVPworld.org), although that is not how they started out. So in an act of contrition for forgetting Imperas, let me tell you where they are: they have 2000 registered users, adding about 150 per month, spread around about 400 companies. They have over 40 processor models ranging from the bizarre ones used in automotive up to quad-core MIPS processors running at 200-1200 MIPS. They are still self-funded but are growing and adding staff.

So going forward there is Imperas/OVP, still independent; Synopsys with the Virtio, VaST and CoWare technologies, which they will presumably try to pull together into a single environment; and Wind River/Virtutech. One distributes basically over the web, and the other two now have much bigger distribution channels than they had before, although less focused ones. Let the games begin.

Posted in eda industry, embedded software | Comments Off

Virtutech

I’d heard rumors that Intel was acquiring Virtutech and I presumed that the purpose was to put it together with Wind River that they acquired last year. I mentioned this in my post about VaST earlier this week but I wasn’t expecting to come back quite so soon to write about Virtutech. Anyway, it is now official, Intel is indeed acquiring Virtutech. The price hasn’t been disclosed (and I haven’t received any paperwork yet so I’m not even pretending to be clueless this time). Unlike with VaST, I do get my money back, and more.

I think I was probably the only person who worked for both VaST and Virtutech (and I even did a consulting contract for the two companies jointly a couple of years ago) so it is interesting to look at why one company was so much more successful than the other. I haven’t seen the finances of either company recently, so I don’t know the revenue levels, profitability etc.

The companies targeted different markets. VaST did most of its business in Japan, almost all of it when I first joined them. They had customers in automotive, wireless and consumer. Consumer is mostly ARM. Automotive is mostly processors you’ve never heard of (NEC V850 anyone?). Virtutech went for the big iron, modeling complete base-stations for Ericsson, whole routers for Cisco and Huawei, servers for Sun and IBM, and aerospace systems. Almost coincidentally, most Virtutech customers were PowerPC based which leveraged the models and led to a close relationship with Freescale. These types of projects are much bigger and so have much bigger budgets, meaning larger deals.

Another issue, as I mentioned in my post about VaST earlier in the week, is that VaST's cycle-accurate models were more complex to build. For about the same revenue, VaST had four times as many people doing modeling as Virtutech which, plainly, meant that they needed to sell four times as many copies of any given model to get the same return. The fact that the end markets VaST was targeting had different processor requirements aggravated this.

Virtutech was run much leaner than VaST. A couple of years ago, at a time when both companies' revenues were the same, VaST had a full-time CFO and four people in finance; Virtutech had one finance person. A company doesn't live or die by the size of its finance department, but G&A being too large is always a symptom of lax expense control.

One thing that both VaST and Virtutech did was have their engineering groups offshore. But it wasn't an expense play, with groups in India or China; it was more an accident of history. VaST started in Sydney, Australia and engineering remained based there. Virtutech started in Stockholm, Sweden and their engineering remained there. And I'll give VaST the edge here; Sydney is a nicer place to visit than Stockholm, especially in winter!

One story about Virtutech, from before I worked there, is that Microsoft used Simics to port Windows to the 64-bit AMD processor before AMD could deliver any silicon. This was in the days when Intel was all Itanium going forward and Microsoft was supposedly on board. In fact, at first AMD tried to conceal the fact that Microsoft was the end user, it was so sensitive. And on the first day AMD delivered silicon, Windows booted. It would have been the ultimate customer success story, but we weren't allowed to talk about it.

When I joined Virtutech, Peter Magnusson was the CEO. He’d started the company in Sweden, where the engineering remained, before moving himself and his family to the US. It was a joke inside Virtutech that Simics was Peter’s Ph.D. thesis gone out of control. He has a reminiscence about Virtutech on his own blog.

So that just leaves CoWare out there as the only remaining independent supplier of this type of technology, and Carbon with their RTL acceleration technology.

Posted in eda industry, embedded software | Comments Off

VaST

Finally it is public knowledge that Synopsys has acquired VaST Systems Technology. I was VP marketing there for a bit over a year, back when Graham was still CEO. Since I exercised my options when I left, I've been inundated with paperwork on the merger, although the acquiring company's name was redacted everywhere it appeared (even though everyone knew who it was). I don't suppose I'm giving away too many secrets to reveal that I'm not going to get my money back. The common stock is lucky to get anything at all, but they did need to bribe us a little to sign off on the merger.

By the time Alain Labatt came on board as CEO, I was at Virtutech which sorta competes with VaST in having similar technology and sorta doesn’t since they go after such different markets. We thought that this was great for us since Alain’s reputation was that he was good at two things: raising VC money, and then ramping up the expense run-rate to spend it all. He’d done that at Frequency/Sequence and then again at Tera Systems. And true to form he seems to have been successful again at both raising money and then spending it.

When I was at VaST we did most of our business in Japan. I opened up a couple of accounts in Europe and we had just one, Delphi, in the US. The dependence on Japan has apparently reduced, but still VaST is over-represented there. To be fair, the Japanese are ahead of Europe which is ahead of the US in terms of system level thinking, so Willie Sutton style, that’s where the money was. VaST was also over-represented in automotive. Japan and automotive have unfortunately been especially hard hit in the current downturn, so I’m guessing that VaST’s business declined dramatically. I assume the VCs didn’t want to put in more money since they couldn’t see a route to a fast growing profitable company and so they got Alain to shop the company around. Synopsys is its new home.

Synopsys already purchased Virtio a couple of years ago, which had technology similar to VaST's. VaST's models are cycle-accurate, which creates a number of issues. Since VaST had no non-cycle-accurate models, it couldn't really charge a premium for the cycle-accurate ones: there was no cheaper alternative for customers who didn't value cycle accuracy (when ARM was in the modeling business it sold cycle-accurate models for 6-8 times the price of non-cycle-accurate ones). Also, verifying cycle-accurate models is much harder, since not only do they have to be functionally correct, the cycle accounting has to be correct too, and there are many corner cases in a modern processor. Expensive models plus an inability to get a premium price for them made for an unattractive combination. Potentially, with VaST and Virtio now in the same stable, that problem goes away: non-cycle-accurate models for people who don't need more, and premium pricing for people who need the costly cycle-accurate models.
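To see why the cycle accounting is the hard part, here is a toy sketch (entirely my own illustration in Python, nothing to do with VaST's, Virtio's or ARM's actual model APIs): a functional model only has to get the architectural result right, while a cycle-accurate one also has to return a cycle count that matches the real pipeline for every corner case.

```python
# Toy illustration of functional vs. cycle-accurate modeling of one instruction.
# Not any vendor's API: it just shows the extra verification burden of cycle accuracy.

MASK32 = 0xFFFFFFFF

def functional_add(regs: dict, rd: str, rs1: str, rs2: str) -> None:
    # Instruction-accurate model: only the architectural state has to be right.
    regs[rd] = (regs[rs1] + regs[rs2]) & MASK32

def cycle_accurate_add(regs: dict, rd: str, rs1: str, rs2: str,
                       prev_dest: str = "") -> int:
    # Same architectural result, plus a cycle count that must match hardware.
    # Here a made-up one-cycle interlock when we depend on the previous result;
    # a real model must get every such corner case (stalls, caches, ...) right.
    functional_add(regs, rd, rs1, rs2)
    cycles = 1
    if prev_dest and prev_dest in (rs1, rs2):
        cycles += 1
    return cycles

regs = {"r1": 40, "r2": 2, "r3": 0}
cycles = cycle_accurate_add(regs, "r3", "r1", "r2", prev_dest="r1")
print(regs["r3"], cycles)   # both the value (42) and the count (2) must be verified
```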

There are plenty of rumors that Virtutech is also in the throes of being acquired, but I’ve not seen any paperwork for that one yet. But if that is true it leaves only CoWare out there of the companies with virtual platform technology. Axys was acquired by ARM. Virtio and VaST by Synopsys. Virtutech and CoWare, for now, are still out there.

It is interesting to look at why these companies were not as successful as I thought they should have been. In the end, I think, you could get so much done with cross-compilation to your workstation that the market for people who valued the model accuracy was too small. Look at iPhone programming. Despite a litany of complaints about the simulator, which works by compiling your source code to run directly on the Intel processor in your Mac, you don't really need a platform able to run the ARM binary to get development done. An inaccurate simulator and the actual phone are enough. There isn't a lot you can do with a more accurate platform squeezed into the gap between these two other solutions.

Posted in eda industry, embedded software | Comments Off