Friday puzzle: bananas

Last week’s puzzle was to estimate when the 20th record year of rainfall will occur in New York. The answer is in about 272 million years’ time. In 1835 that year was a record year by definition. In 1836 there were two equally likely outcomes: records in both 1835 and 1836, or a record in 1835 but not in 1836. Since those two outcomes contain 3 records between them, the expected number of records is 3/2 or 1.5. The chance that 1837 is a record year is 1/3 (one of the three years has to be the biggest and it is equally likely to be any of them), so by similar reasoning the expected number of record years at that point is 1 + 1/2 + 1/3. In fact the expected number of record years after N years is just 1 + 1/2 + 1/3 + … + 1/N. You might remember from math class that this is called the Nth harmonic number, and you might further remember that although it grows unboundedly large it grows incredibly slowly. So slowly, in fact, that it takes hundreds of millions of years to reach 20.
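For anyone who wants to check that figure, here is a small illustrative sketch (in Python, not part of the original puzzle write-up) using the standard approximation that the Nth harmonic number is roughly ln N plus the Euler-Mascheroni constant:

```python
import math

GAMMA = 0.5772156649  # Euler-Mascheroni constant

def expected_records(n_years):
    """Expected number of record years after n_years: the n-th harmonic number."""
    return sum(1.0 / k for k in range(1, n_years + 1))

# Sanity check against the reasoning above: after 3 years the expectation is 1 + 1/2 + 1/3.
print(expected_records(3))  # 1.8333...

# For large N, H(N) ~= ln(N) + gamma, so 20 expected records needs roughly
# exp(20 - gamma) years of measurements.
n_for_20 = math.exp(20 - GAMMA)
print(f"{n_for_20:,.0f} years")  # roughly 272 million years after 1835
```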

Now for today’s problem. A banana plantation is located next to a desert. The plantation owner has 3000 bananas. The market is 1000 kilometers away. He has one camel which can carry a maximum of 1000 bananas at a time but eats one banana for every kilometer it travels. How do you get the maximum number of bananas to market?

Answer next week.

Posted in puzzles | Comments Off

Public affluence, private squalor

Somebody commenting earlier this week on a post about California said that I seemed especially turned off by unions. Unions actually behave just the way that you would expect them to, maximizing their own power. The thing they are most interested in is not actually their members’ welfare but their own: they would always prefer increases in the prison and education budgets to be taken as more people rather than higher salaries or higher productivity for the existing members, and we’ve certainly had a lot of that. California’s Democratic politicians, meanwhile, have gerrymandered themselves a permanent incumbent majority and are supported by those same unions. So they go along with expansion of spending year after year, and expansion of salaries and, especially, benefits. Everyone gains except the private sector taxpayers, who get screwed.

I’m not especially anti-union, but I think that the political-union complex is completely out of control, and is a major threat to Silicon Valley’s continued long-term success. We are on a path to private squalor and public affluence in California. The private sector will have to fund all the promises that the politicians made over the years, in a massive transfer of wealth from the poor (retirees living on their savings and people making average salaries) to the rich (public sector retirees).

The city of Vallejo went bankrupt recently, entirely due to firefighter and police salaries and benefits, especially for retirees: it has lots of retired employees on six-figure pensions with unlimited medical benefits for life. Vallejo has 120,000 residents but $850M of unfunded retiree commitments to the police and firefighters. That’s around $25,000 per household. Those households probably owe at least as much again in unfunded public sector retiree benefits at the state level too. Many other cities are predicted to go bankrupt in the current downturn, since it’s the only way they have a chance to re-negotiate those gold-plated contracts.

A friend of a friend works in finance for the Santa Clara school district. Every teacher they employ costs $180,000 per year. About a third of that is salary and medical benefits; the rest is retirement benefits, which Santa Clara is smart enough not to leave unfunded to create a future disaster. I wish someone were putting away over $100,000 per year for my retirement.

There’s not really any good way to measure prison guard productivity, but in education there is. Over the last 20 years, adjusted for inflation, California’s spending on education has doubled. But results on the state’s own tests are exactly where they were 20 years ago. In what other industry has productivity halved? The best part of California’s education system is actually the universities and community colleges. But that part is now threatened because all the money is going to the inefficient K-12 segment where, in principle, we could cut spending by 50% with no effect on outcomes. Stanford is a private university, but Berkeley, UCLA and so on are not. If they lose their stars then it will certainly affect Silicon Valley.

Since Arnold Schwarzenegger was elected governor in 2003, California’s spending has increased by 40%, because so much of the budget is on autopilot, driven by various propositions or by existing contracts. What that means is that we could reduce California’s budget by about a third and it would be something like 2001 again. It didn’t seem so bad back then, after all.

Posted in silicon valley | Comments Off

DAC in review

So now that a couple of weeks have passed to allow a bit of introspection (also known as reading everyone else’s blogs), what were the big themes?

On the technical side, I think this is the fifth, or whatever, year in a row that was supposed to be the year of ESL at DAC. ESL has been the gallium arsenide of EDA for a long time, just about mature enough to finally take off "real soon now". But this time it feels like it might be starting to be real. There are several established players in ESL synthesis, aka HLS (Mentor, Forte, Cadence, Synfora, Bluespec, AutoESL), and one, Calypto, in formal verification. I think Calypto are actually very well-positioned in the market since they have the space to themselves right now, and because formal verification is CPU intensive, license demand grows naturally as the tool is adopted and used on larger designs. There are clearly too many players in the synthesis space for them all to be successful, and several of them have been around so long, with so many rounds of investment, that it is hard to see how they could be successes for their investors whatever happens.

There definitely seemed to be a feeling that handoff of designs was moving up to the level of C (or C++ or SystemC) from RTL, although the precise way this would happen is still murky. And the software ties in there, maybe, but most people in EDA don’t really understand the software component of systems.

Another technical theme was the number of companies attacking Cadence’s Virtuoso walled city for analog design. Whether they’ll succeed in bringing down the walls or once again blunt their lances on the brickwork remains to be seen. Synopsys now has a serious entry in the space, SpringSoft continues to be strong, especially in Taiwan, Ciranova is trying to break open the SKILL portcullis, and lots of other companies are filling in various analog niches.

Lurking around under the surface was multicore, especially in the nVidia keynote. I’m less bullish than William Dally was on the notion that Amdahl’s law has been repealed, and there are simply enormous amounts of legacy code around. But electronic systems have an enormous software component, and in many cases the software is becoming a harder problem than the chip design.

At the risk of sounding self-serving, I think this was also the year of the blogger. I was described on the blogging panel as the “Lance Armstrong of blogging” by Sean Murphy, which I guess means I’m an old guy who came third, but I think it was meant as a compliment. Karen Bartleson of Synopsys won the Denali "next EDA blogger" competition; I guess she’s the Alberto Contador of bloggers.

Seriously though, for analysis of the industry I think it is clear that the bloggers are doing a better job than anyone else. True, we are not trained as professional journalists but I’m constantly impressed by the standard of writing and the deep technical knowledge of some of the bloggers. It’s not clear how it will all play out, in particular how useful many of the corporate blogs will turn out to be. They tend to stick too close to the party line in most cases to be all that interesting, and avoid controversial topics. I’d be interested to know how widely read they are.

And on a personal note, it’s my Dad’s birthday. Hi Dad.

Posted in eda industry | Comments Off

Being too early to market

Apple Lisa

Startups have a singular focus on getting their product to market as quickly as possible. Given that focus, you’d think that the primary mode of failure for a startup would be being too late to market, but it’s actually hard to think of startups that failed by being too late. Some startups fail because they never manage to get a product shipped at all, which I suppose is a sort of special case of being too late to market: you can’t be later than never. But try and think of a startup that failed because, by the time it got to market, a competitor had already vacuumed up all the opportunities. Monterey in place and route, I suppose, which was simply too far behind Magma and the big guys’ re-tooling.

On the other hand, many startups fail because they are too early to market. In EDA, technologies tend to be targeted at certain process nodes which we can see coming down the track. There’s little upside in developing technologies to retrofit old design methodologies that, by definition, already work. Instead, the EDA startup typically takes the Wayne Gretzky approach of skating to where the puck is going to be: develop a technology that is going to be needed and wait for Moore’s law to progress so that the world does need it. The trouble with this is that it often underestimates the amount of mileage that can be got out of the old technologies.

Since process nodes come along every couple of years, and even that cadence is slowing, getting the node wrong can be fatal. If you develop a technology that you believe everyone needs at 45nm but it turns out not to be needed until 32nm, then you are going to need an extra two years of money. And even then, it may turn out not to be really compelling until the 22nm node, after you’ve gone out of business. All the OPC (optical proximity correction) companies were too early to market, supplying technology that would eventually be needed but wasn’t at that point in time. Even companies that had good exits, like Clearshape, were basically running out of runway since they were a process generation ahead of when their technology became essential.

The windowed user-interface paradigm was really developed at Xerox PARC (yes, Doug Engelbart at SRI had a part to play too). Xerox is often criticised for not commercializing this, but in fact they did try. They had a computer, the Xerox Star, with all that good stuff in. But it was way too expensive and failed because it was too early. The next attempt was Apple. Not the Macintosh, the Lisa (pictured above). It failed. Too early and so too expensive. One can argue the extent to which the first Macs were too early, appealing only to hobbyists at first until the laser printer (also invented at PARC) came along. There are other dynamics in play than just timing, but Microsoft clearly made the most money out of commercializing those Xerox ideas, coming along after everyone else.

Another way of being too early is simply having an initial product that it turns out nobody needs because it’s not yet good enough. Semiconductor development processes are all about risk-aversion, and any change has to mean that the risk of changing is less than the risk of not changing. For a startup with an early product, in a process generation where the technology might be only nice-to-have, this is a high barrier to cross. The startup might just serve as a wakeup call to everyone else that a product is required in the space, and eventually another startup executes better (having seen the first company fail) or the big EDA companies copy the technology into their own product lines.

Overall, I think more startups fail by being too early to market than fail by being too late. Remember, it’s the second mouse that gets the cheese.

Posted in investment, management | Comments Off

Four Steps to the Epiphany

There’s a book on how to bring a product to market that is almost a samizdat document in the marketing world. It’s a privately published book originally intended to accompany a course at Berkeley and Stanford. It’s not the most readable of prose, so don’t expect the Innovator’s Dilemma or Crossing the Chasm. However, it is packed with good stuff for any startup, and especially for EDA startups, which embody all the problems that the book addresses. It’s called Four Steps to the Epiphany by Steve Blank. You can get it on Amazon for $40 or from CafePress for $29.99 plus shipping and tax, which comes to about the same thing.

The heart of the idea of the book is that you don’t know what the customer wants. So in addition to developing a product (preferably the minimum shippable product, since how do you know the customer even wants that?) you need to develop customers. You have a product development process. You need a customer development process. And hiring a VP of sales and a VP of business development and waiting around for engineering to ship doesn’t count.

A secondary idea is that the customer development process is very different if you are creating a brand new market, entering an existing market or re-segmenting an existing market (producing a product that only serves part of the market, usually but not always either creaming off the high-end or disrupting the low-end).

It is hard to summarize an entire book in one blog post and I don’t intend to try. You’ll have to invest in the book yourself and I guarantee that you will find plenty of thoughtful ideas that are immediately applicable to almost any product launch, whether in a startup or a large company.

If you take only one idea away from the book, it should be this: get out of the building. Startups don’t fail for lack of technology, they fail for lack of customers. Heed Steve’s words: “In a startup, no facts exist inside the building, only opinions.” You have to go and talk to potential customers, and even talking won’t be enough. You’ll have to ship them early product, burn them when it doesn’t do what they needed, and correct your course. If you scale the company before you have the product right, you’ll run out of money (and in the current climate you’re not getting any more).

The idea of listening to customers is not to find out everything that they want and build a laundry list. It is to attempt to narrow the product down to the minimum shippable product, one that at least a few customers can get value from even if it doesn’t do everything they want. Saint-Exupery’s quote that “A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away,” summarizes the goal for the earliest stage of customer development. Then start iterating as your understanding increases.

If you read newspaper articles on startups, you often get the idea that a couple of guys in a garage really understood something deep and took down some huge corporation that was too dumb to notice. The reality is that almost all successful startups end up doing something different from what they first intended when they were founded, sometimes dramatically so. Look at PayPal (originally doing beamed payments from Palm Pilots) or more recently Twitter (originally part of a podcasting company). Or even Google (originally just doing search without a clue about how to monetize it). In EDA the changes are less dramatic, but very few business plans survive contact with the market.

Posted in book review | Comments Off

Value propositions

I spent some time earlier this week giving someone a bit of free consulting about value propositions in EDA. If you take the high-level view then there seem to be three main value propositions in EDA: optimization, productivity and price.

Optimization means that your tool produces a better result than the alternatives: a place and route tool that produces smaller designs, a synthesis tool that produces less negative slack, a power-reduction tool that cuts more power. This is the most compelling value proposition you can have, since the result of using your tool rather than sticking with the status quo shows through in the final chip, affecting its price, performance or power. The higher the volume the chip is expected to ship in, the higher the value of optimizing it.

Productivity means that your tool produces an equivalent result to the alternatives but does it in less time. My experience is that this is an incredibly difficult value proposition to sell unless the productivity difference is so large that it is a qualitative change: 10X not just 50% better. Users are risk-averse and just won’t move if they have “predictable pain.” It may take an extra week or an extra engineer, but it is predictable and the problem is understood and well-controlled. A new tool might fail, causing unpredictable pain, and so the productivity gain needs to be enormous to get interest. Otherwise the least risky approach is to spend the extra money on schedule or manpower to buy predictability.

The third value proposition is that you get the same result in the same time but the tool is cheaper. For something mission-critical this is just not a very interesting value proposition, sort of like being a discount heart surgeon. Only in very mature product spaces where testing is easy is price really a driver: Verilog simulation, for example. The only product I can think of that strongly used price as its competitive edge was the original ModelSim VHDL simulator, and even then it was probably the best simulator anyway; the low price simply left money on the table.

Another dimension of value proposition is whether the tool is must-have or nice-to-have. By must-have I don’t mean that customers must buy your tool (nice work if you can get it) but that they must buy either from you or one of your competitors or roll their own. Nice-to-have means that a chip can be designed without a tool in that space, doing stuff by hand, creating custom scripts, having a longer schedule or whatever. It is almost impossible to build a big business on a nice-to-have tool.

Moore’s law makes must-have a moving target. Signal integrity analysis ten years ago was, perhaps, nice-to-have. Then for designers in leading edge processes it became must-have. Eventually the technology got rolled into place and route tools since everybody needed it.

That is actually a fairly typical route for technology. Some new wrinkle comes on the scene and somebody creates a verification tool to detect the handful of fatal wrinkles that can then be fixed by hand. A couple of process generations later, there are 100,000 fatal wrinkles being detected and so it is no longer adequate to have just a verification tool. It becomes necessary to build at least some wrinkle avoidance into the creation tools so that fatal wrinkles are not created, or are only created in manageable numbers again. So the tool goes from nice-to-have, to must-have to incorporated into the main flow.

Posted in marketing | Comments Off

Friday puzzle: rainfall records

Last week’s puzzle was the toenail cancer test. The correct answer is that your chance of having toenail cancer is just under 1.9%, obtained as follows. Of 20,000 people, 20 have toenail cancer (1 in 1,000). Of those 20, 19 test positive (the test is 95% effective), but of the remaining 19,980 there will be 999 that test positive (the test is only 95% effective here too, meaning 5% of people without the disease test positive). So 999 + 19 = 1,018 people test positive, and of them only 19 have toenail cancer. So your chance of having toenail cancer is 19/1018, which is 1.866%. This is much lower than people intuitively think (especially if they’ve just been told they’ve tested positive) and also much lower than physicians think, since Bayes’ theorem is not a part of medical training (but should be). Tests for rare events like this, even if seemingly effective, are overwhelmed by false positives (if toenail cancer occurred in only 1 in 100,000 people, then the chance you have toenail cancer given that you tested positive would be roughly 1 in 5,000).
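Here is a minimal sketch (Python, purely to illustrate the arithmetic, not part of the original answer) of the same Bayes calculation for both the 1-in-1,000 case above and the rarer 1-in-100,000 case:

```python
def p_disease_given_positive(prevalence, sensitivity=0.95, false_positive_rate=0.05):
    """Bayes' theorem: P(disease | positive test)."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * false_positive_rate
    return true_positives / (true_positives + false_positives)

print(p_disease_given_positive(1 / 1000))     # 0.0186... i.e. just under 1.9%
print(p_disease_given_positive(1 / 100_000))  # 0.00019... i.e. roughly 1 in 5,000
```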

In any given year the weather station in New York’s Central Park records a certain rainfall. Ignore any trends and just assume that rainfall in one year is independent of rainfall in the preceding years. A record year is one in which the rainfall exceeds that of every preceding year. Measurements started in 1835. In what year would you expect to get to the 20th record year? (As a hint, over the 160-year period from 1835 to 1994 there were 6 records.)

Answer next week

Posted in puzzles | Comments Off

Where does everyone come from?

Where does all the brainpower that drives Silicon Valley come from? The answer, by and large, is not from round here.

A good analogy I saw recently was with Hollywood. Where do all those pretty young actresses come from? By and large, not from Los Angeles. If you are pretty enough, with some acting talent, and living in a small town in the mid-West, Hollywood is potentially your route to advancement. The odds aren’t that great, of course, since pretty women aren’t a vanishingly small percentage of the population, and Hollywood doesn’t want all its actresses to look like supermodels anyway.

Silicon Valley draws in intellectual firepower in the same way. In fact in an even bigger way, since we don’t care what race you are or whether your English is perfect. We draw in many of the smartest people from all over the world, in many cases have them do Masters degrees or PhDs here, and then employ them, to the extent that our grandstanding politicians will allow (which is not the topic for today).

I remember studying a 240-person engineering group I was responsible for, and I estimated that over half of them were born outside the US: a lot of Indians, of course, plus Vietnamese and Chinese, but also French, English, South American, Egyptian. Pretty much everywhere. Of the people who were brought up in the US, a big percentage seemed to be from the mid-West, just as in the Hollywood example above. That was a surprise.

This isn’t meant to be a criticism of California’s K-12 education system, although there is certainly plenty of criticism to go round, especially for the bureaucrats and the venal teachers’ unions. But if you are brought up around here (or in New York, Boston and so on), you have lots of options, and working really hard in high school so that you can go to college and grind through a hard engineering degree might not be that attractive. But if you are in a small town in the middle of an agricultural state, or a large town in India, with little in common with your peers due to your geekiness, then this seems like a good way to escape. It’s probably not that deliberate a plan (teenagers are notorious for acting in the moment), but, as Sam Lewis and Joe Young put it: “How you going to keep them down on the farm after they’ve seen Paree?” Well, Mountain View isn’t quite Paris but it’s not Nowhereville either, and the weather is a lot nicer.

Politicians all over the world look at Silicon Valley and say “we want one too.” But Silicon Valley is really self-sustaining, sucking in intellectual talent from wherever it is found. Those politicians want a film industry too.

But Silicon Valley and Hollywood both got started in another era through a series of chance events, like Shockley preferring California to New Jersey, and the early film industry wanting to get as far away as possible from Edison and his patent lawyers. Getting a Silicon xxx or a film industry going in your state requires more than just a few adjustments to the state tax code. The best talent, even from your state, is going to California (so long as California’s appalling political leaders don’t sell the entire state to the public sector unions).

Silicon Valley, the Hollywood of the North.

Posted in silicon valley | Comments Off

DAC attendance

Kevin Morris does a great job of dissecting the DAC attendance numbers. Since he has his FPGA hat on while doing this, one of the points he emphasizes is that most people doing electronic design are building FPGA-plus-software systems and so, largely, are not targeted by DAC, which is almost completely focused on IC design in leading-edge process nodes. As we all know, and as I’ve discussed several times, the number of design starts and the speed of migration to new processes are both going in the wrong direction, which means that EDA as currently defined is a market in genteel decline.

The EDA industry has failed to expand effectively into the FPGA space or the embedded software space, the obvious areas for growth. Yes, both Mentor and Synopsys have FPGA synthesis, but it’s neither a significant part of their overall business nor a small business growing explosively.

But I think Kevin gets some things wrong in his analysis. The technical conference attendees are not the “hard-core customers of the EDA industry.” They are academics salted with some EDA development engineers and, perhaps, a few customers. DAC is really two almost independent events: a technical conference attended by technical people who don’t themselves buy EDA tools, and a trade show attended by people who buy or use EDA tools but aren’t that interested in all the stuff under the hood. There’s obviously some bleed-through, so this is an exaggeration, especially as more business sessions have been added.

The hard-core customers of the EDA industry, in my experience, typically have exhibit-only badges. They don’t come to attend the technical conference, apart perhaps from some keynotes. They come to see what is new in the industry, get a feel for trends, see if there are any new startups they should know about but don’t, and network with people that they really only see once a year at DAC.

I’m reminded of a story a purchasing guy at VLSI told me years ago. He was responsible for purchasing semiconductor equipment to outfit VLSI’s fab. These are multi-million dollar pieces of equipment. He got invited to a party given by one of the vendors. There were about 50 people there, the party was abuzz, there were lots of cute women being very friendly. Gradually he realized that everyone there except him was actually connected to the vendor. The entire party was put on for him.

Obviously DAC is not put on for one person. But the reality is that there are at most a few hundred key decision makers in the semiconductor companies, and only dozens who control big budgets. As long as those few people show up, DAC can be successful for the exhibiting companies, especially the startups who struggle for visibility. Admittedly, Japan and Europe sent far fewer people than usual this year, so some of those key people really weren’t there. But I think that is down to the economic downturn, which has made semiconductor companies very tight with discretionary spending.

Next year DAC is in Anaheim again. Attendance will probably be down. But if the guys who control the $5B in EDA spending show up, then the exhibitors will be happy. And that is true whether or not a bunch of junior engineers also show up, so you can’t read as much as you think into the attendance numbers.

Posted in eda industry | Comments Off

Integration and differentiation

EDA acquisitions are very tricky to manage in most cases. This is because most acquisitions are acquiring two things: a business and a technology.

In the long run the technology is usually the most important aspect of the acquisition but the business is important for two separate reasons. Firstly, the revenue associated with the standalone business, ramped up by some factor to account for the greater reach of the acquiring company’s sales channel, is the way that the purchase price is usually justified. It is too hard to value technology except as a business. That’s why the venture capital euphemism for selling a company for cents on the dollar is “technology sale.” But more importantly, the business is the validation of the technology. Nobody can tell whether a startup’s technology is any good except by looking to see if anyone is buying it.

However, once the acquisition is done there is an immediate conflict. There is a running business to be kept going. After all that was the justification for the purchase price. It took the whole company to do that before acquisition, so presumably it will take the whole company afterwards. In the short term, the differentiation of the technology rests on its continuing to sell well. But the real reason for the acquisition is often to acquire the base technology and incorporate it into the rest of the product line. The only people who know the technology well enough to do this are the acquired company’s engineering organization. Suddenly they are double booked, developing the product to the plans that underpinned the forward bookings forecast, and working with the acquiring company’s engineers to do the integration.

An example: when I was at Cadence we acquired CadMOS for their signal integrity product SeismIC, plus other stuff in development. Googling back to the press release, I see that a person whose name sounds strangely familiar said:

"Adding CadMOS signal integrity analysis engines to established Cadence analog and digital design solutions provides us with the best correct-by-design timing and signal integrity closure capabilities in the industry," said Paul McLellan, corporate vice-president of custom integrated circuit (IC) products at Cadence.

Except, of course, that realizing that vision required the CadMOS engineering team to work full-time on integration. Meanwhile, there was a business going full blast selling SeismIC standalone. I believe there was also an earnout (part of the acquisition price depended on how much was sold) based on the standalone business only. A difficult balancing act for the engineering managers and for me.

We had similar issues when Cadence acquired Ambit. We needed to integrate the Ambit timing engine (and later the underlying synthesis technology itself) into Cadence’s whole digital product line at the same time as we were trying to give Synopsys a run for their money in the standalone synthesis business. Both of those goals were really important strategically but there was only one set of engineers.

Balancing these two conflicting requirements is probably the hardest aspect of a typical EDA acquisition to manage. It is really important, not just for financial reasons, to maintain the technology’s leadership position in the marketplace. At the same time, you have to integrate that leadership technology so that it is available under the hood in other parts of the product line, which, in the end, is probably how it will mostly get into customers’ hands. Preserve the differentiation while doing the integration.

Posted in management | Comments Off