Why isn’t there a large, thriving FPGA software market? After all, something like 95% of semiconductor designs are FPGAs, so there should be scope for somebody to be successful in that market. If the big EDA companies have the wrong cost structure, then maybe a new entrant with a lower cost structure could succeed there.
In the early 1980s, if you wanted to get a design done then you got tools from your IC vendor. But gradually EDA came into being as a separate market, driven on the customer side by the fact that third-party tools were independent of any semiconductor vendor and so avoided the threat of paying excessive silicon prices through being locked into one vendor’s tool flow. Once the third-party EDA industry captured enough of the investment dollars, it could advance its tools much faster than any single company could, and the captive tool market was largely doomed.
For FPGAs that is not the case. If you want to do a design with Xilinx, then you get tools from Xilinx; with Altera, tools from Altera; and so on. Yes, there are some tools like Synplicity (now part of Synopsys) and Mentor’s FPGA suite, but they are focused only on the high end, and it is hard to make money only at the high end. When, over time, the high end goes mainstream, the FPGA vendors produce good-enough tools at free or low prices. So the R&D costs must be recovered from just those few high-end users; the tools never become cash cows the way IC design tools for a particular process node do as time passes. Like the Red Queen in Through the Looking-Glass, it takes all the running one can do to stay in the same place.
There may be change coming as more and more FPGA designs are actually prototypes for ASIC designs, or might eventually be cost-reduced into ASIC designs, and so on. This means that people want to use the same tools for ASIC and FPGA design, and on the surface that is one reason Synopsys acquired Synplicity.
One other issue is that FPGA architectures and their tools are more intimately bound together than is the case with IC designs. It is a dirty secret of synthesis (and maybe of place and route) that, despite the lower price point, FPGA synthesis is harder, not easier, than mainstream synthesis for standard-cell libraries. Solving the general problem for all the different varieties of FPGA architecture seems to be extremely costly. By contrast, Xilinx only has to build tools that work with Xilinx parts and can ignore the fact that its algorithms might not work with Altera’s arrays.
But probably the biggest factor is that there are not enough FPGA companies. If there were a dozen FPGA companies, then enough of them would compete by supporting a third-party FPGA tool industry, and eventually the investment there would overpower the companies that tried to keep tool development internal. This is just what happened when the Japanese and everyone else tried to get into ASIC. They had no internal tools, so they leveled the playing field by making Daisy/Mentor/Valid, and subsequently Cadence and then Synopsys, successful. Eventually companies like VLSI Technology and LSI Logic felt they should spin out their own tools and adopt EDA industry tools.
It is unclear whether this was good financially. When we spun out Compass, I told Al Stein, CEO of VLSI, that he shouldn’t expect his tool bill to go down: he would pay more to the EDA industry that Compass was entering (some of it to Compass) than he had been paying just to fund the division that became Compass. That prediction turned out to be true.
For ASIC designs today, IBM’s backend tools are the only ones still internally developed. But IBM is #1 in cell-based design, so it is hard to argue that the strategy of keeping that development internal is demonstrably bad.
And Xilinx and Altera are doing OK keeping their tools internal.