For the second year now I’m virtually attending DAC, the 47th annual EDA conference, held this week in Anaheim. As a web-oriented company we’ve yet to exhibit at the EDA industry’s biggest conference. Not being there physically, I enjoy all the info I can obtain remotely via Twitter, blogs, and other online media. This year there seems to be more chatter about cloud computing and EDA - a topic of particular interest to a web-oriented EDA company like PDTi.
First, I saw a Twitter post from James Colgan, CEO of EDA community provider Xuropa, indicating that Kevin Bushby claimed the cloud is the only way EDA can grow. I’m assuming this is the Kevin Bushby who is COO of FastScale Technology (acquired by EMC) and who formerly worked at Cadence. While I agree that the cloud can help EDA grow, I’m curious to understand how Kevin and others see it growing.
Here are some ways I can see EDA growing using the cloud:
Some of these are things that we have already realized with our SpectaReg web application for register management and automation, which is offered onsite, hosted by the customer, or online, hosted by us. Whether hosted by the customer or us, the application is essentially the same, except the online user has the opportunity for some additional customizations. Interestingly, some of our customers are using virtualization technologies to create their own private cloud where they deploy SpectaReg onsite.
The great thing about the cloud is the ability to scale compute resources, like RAM and CPUs, on demand, and to have failover/redundancy available should some piece of hardware fail. If one has a fairly static requirement for these, then cloud computing might not make sense. For example, a while back I ran the numbers on what the equivalent of a dedicated machine would cost on Amazon’s Elastic Compute Cloud (EC2). To have the equivalent compute resources available 24x7x365 via EC2 would cost more; however, a lot of machines are not used full-time, and the compute requirements are bursty. This burstiness of compute requirements is where cloud computing really adds value.
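To make that burstiness point concrete, here’s a toy back-of-the-envelope calculation in Python. The prices are made-up placeholders, not actual EC2 rates; the point is just how the break-even depends on utilization:

```python
# Illustrative break-even sketch: dedicated machine vs. on-demand cloud.
# All dollar figures are hypothetical placeholders, not real EC2 pricing.

HOURS_PER_MONTH = 730

def monthly_cloud_cost(hourly_rate, utilization):
    """Cost when instances run only while needed (utilization in 0.0-1.0)."""
    return hourly_rate * HOURS_PER_MONTH * utilization

def breakeven_utilization(dedicated_monthly, hourly_rate):
    """Utilization above which the dedicated machine becomes cheaper."""
    return dedicated_monthly / (hourly_rate * HOURS_PER_MONTH)

dedicated = 400.0   # hypothetical monthly cost of an owned/leased server
cloud_hr = 0.85     # hypothetical on-demand $/hour for a comparable instance

for util in (0.10, 0.50, 1.00):
    print(f"{util:>4.0%} utilization: cloud = ${monthly_cloud_cost(cloud_hr, util):8.2f}"
          f" vs dedicated = ${dedicated:.2f}")

print(f"break-even utilization ~ {breakeven_utilization(dedicated, cloud_hr):.0%}")
```

With these made-up numbers, running flat-out on the cloud costs more than the dedicated box, but a machine that’s busy only 10% of the time is far cheaper in the cloud - which is exactly the bursty EDA workload case.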
To really take advantage of cloud computing, the application must be able to monitor/predict its load and scale things up or down dynamically as needed. EDA applications or their wrapper scripts would need to get smarter to do this.
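As a sketch of what such a “smarter wrapper” might look like, here’s a minimal scaling policy in Python. The thresholds, jobs-per-worker figure, and action names are all invented for illustration; a real wrapper would plug the decision into its cloud provider’s provisioning API:

```python
# Hypothetical autoscaling policy for an EDA job queue: add workers when
# jobs back up, remove them when machines sit idle. The numbers and the
# action names are illustrative, not any real cloud provider's API.

def desired_workers(queued_jobs, jobs_per_worker=4,
                    min_workers=1, max_workers=32):
    """How many workers we *should* have for the current backlog."""
    needed = -(-queued_jobs // jobs_per_worker)  # ceiling division
    return max(min_workers, min(max_workers, needed))

def scaling_action(queued_jobs, current_workers):
    """Decide whether the wrapper script should add or remove capacity."""
    target = desired_workers(queued_jobs)
    if target > current_workers:
        return ("scale_up", target - current_workers)
    if target < current_workers:
        return ("scale_down", current_workers - target)
    return ("hold", 0)
```

A production version would also predict load (e.g. regressions kicked off nightly) rather than only reacting to the queue, but the shape of the loop is the same.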
Another obstacle that critics of EDA cloud computing often point out is the need to move files between tools and to script flows. Technically, I don’t think this is an issue, aside from perhaps the need for EDA users to increase the bandwidth of their network pipes. Web service APIs would allow people to script all sorts of operations in their flows and move info between different EDA tools in the cloud, perhaps hosted by different cloud providers.
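For example, a flow script might chain cloud tools by handing one service’s output URL to the next service’s job request. The tool names, endpoints, and JSON fields below are entirely hypothetical - real cloud EDA services would define their own - but they show the shape of the idea:

```python
# Sketch of scripting a tool flow across hypothetical web-service APIs.
# Every field name here is invented for illustration.

import json

def synthesis_request(rtl_url):
    """Build a job request for a (hypothetical) cloud synthesis service."""
    return {"tool": "synthesis", "input": rtl_url, "output_format": "netlist"}

def sta_request(netlist_url):
    """Build a follow-on job for a (hypothetical) timing-analysis service."""
    return {"tool": "sta", "input": netlist_url, "report": "timing_summary"}

def chain_flow(rtl_url, synthesis_result_url):
    """Wire one tool's output URL into the next tool's input; the tools
    themselves could be hosted by different cloud providers."""
    jobs = [synthesis_request(rtl_url), sta_request(synthesis_result_url)]
    return json.dumps(jobs, indent=2)
```

Because the data moves by URL rather than by shipping files through the user’s desktop, the inter-tool transfers stay inside the cloud providers’ fat pipes.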
There are many other angles to cloud computing and EDA, and I could likely write 10 more blog postings on the topic. In terms of an end market, the cloud is an electronic system, and there are opportunities for EDA to serve this growing market. Lori Kate Smith of ARM wrote up the 47DAC reception, mentioning how Mary Olsson of Gary Smith EDA cites cloud computing as an application driver for EDA.
Another opportunity for EDA and FPGA vendors would be to have a cloud of FPGAs that could be re-configured. This re-configurable cloud computing would be pretty cool. Perhaps we’d need FPGA virtualization first, if it doesn’t already exist. Wonder if the folks at Google are looking into stuff like that…
Of course there is also the issue of security when it comes to cloud computing. I see Harry the ASIC Guy was interviewing two cloud security experts at DAC, and I’ve yet to check that out. Knowing Harry, it will be worthwhile. One concern is that if the cloud infrastructure becomes compromised, then everything running on it can potentially become vulnerable. This is a bit different from a data center of isolated, dedicated machines, where each machine would need to be compromised individually.
Clearly there are a lot of opportunities and challenges for EDA with respect to cloud computing. It will be exciting to see how the future unfolds. Stay tuned.
Wow, what a spectacular run the equity markets have had since the low in March of 2009. Meanwhile, the jury is out on whether this is a sustainable recovery backed by fundamentals and precedent. Some people are calling for hyperinflation ahead, others for deflation. With such uncertainty, opinions vary widely on which way things will go, as illustrated by the following articles, which I found interesting (followed by my point-form summaries):
The Greenback Effect
NY Times Opinion Article by Warren Buffett
In engaging with companies about register-map automation, it amazes me how many engineers think that, because they have an in-house register solution or could build one, this route makes the most sense. Although there are some people who get it, engineers quite often fail to take opportunity cost into consideration (sunk costs too, but I’ll cover that in another posting). I’ll explain via an example…
Picture this: an engineering team needs a register automation tool for developing a new ASIC that is the highest-performance XYZ. Their options are:

1. License a commercial register automation tool.
2. Build an in-house tool.
3. Manage the registers manually.
The engineering manager estimates the costs of the different options using an hourly rate per engineer versus the licensing cost of the commercial tool. Because the commercial tool is shared among several companies that subsidize its ongoing development, it’s priced lower than the cost of creating an in-house solution. For argument’s sake, though, let’s assume that somehow the manager concludes that option 2 is cheaper, by some accounting magic. Forget about option 3: the ASIC has far too many registers to manage manually, so that’s out of the question.
The engineering manager chooses option 2 because, in this strange, imaginary, upside-down world of magical accounting, the cost of building an in-house tool is less than the cost of buying a commercial one. Reasonable, right?
Wait a minute, maybe not…
The company’s core competency is building high-performance XYZ products, not building, testing, and supporting tools. That tool building dilutes focus. What is the cost of forgoing the opportunity to focus as intensely as possible on building the best and highest-performance XYZ? That’s a bit harder to quantify, but it is a cost. That’s the opportunity cost!
What happens if the competition chooses option 1, and smokes the company in terms of performance? How much does that cost in the long run?
If the manager had decided to go with option 1, licensing a commercial register tool, the team could focus more effort on building a better, higher-performance XYZ product. That extra focus would make the company more differentiated from its competition, supporting its dominance and its ability to command higher profit margins. It’s this differentiation that provides competitive advantage in the marketplace.
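To put toy numbers on the argument: the naive estimate compares only direct labor to the license cost, while the true comparison adds the value of the forgone product work. Every figure below is invented purely for illustration:

```python
# Toy build-vs-buy comparison with opportunity cost included.
# All numbers are made up to illustrate the reasoning, not real data.

def in_house_total(engineer_hours, hourly_rate, opportunity_cost):
    """Direct labor to build/maintain the tool, plus the value of the
    differentiating product work those hours displace."""
    return engineer_hours * hourly_rate + opportunity_cost

license_cost = 50_000    # hypothetical commercial tool license
build_hours = 400        # hypothetical effort to build in-house
rate = 100               # hypothetical loaded $/hour per engineer
forgone_value = 75_000   # hypothetical value of the lost product focus

naive_build = build_hours * rate   # ignores opportunity cost entirely
true_build = in_house_total(build_hours, rate, forgone_value)

print(f"naive in-house estimate: ${naive_build:,}")    # looks cheaper than the license
print(f"with opportunity cost:   ${true_build:,}")     # clearly more expensive
print(f"commercial license:      ${license_cost:,}")
```

With these made-up inputs, the in-house route “wins” the naive comparison yet loses badly once the displaced differentiating work is priced in - which is exactly the accounting magic in the story above.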
One other way to think about it is that differentiating value-added efforts have a greater return on investment (ROI) than non-differentiating efforts.
Something to think about next time you have an option that would enable more intense focus on differentiating core competencies.
Stay tuned for more postings on the economics of build vs. buy engineering decisions.