Should It Take a Village to Get a Design Tool to Work?

Source: EDACafé

What happens when a project team on a tight schedule takes delivery of a tool promised to change the way it does chip design and verification, and the tool doesn’t live up to that promise?

Why, the vendor sends in a village of AEs, R&D engineers and PhDs, of course, to work onsite with the lead designer and his or her designs. Yes, it’s common for a design automation company to send AEs into an account as “super” users of a specific software tool, such as formal verification, because it’s specialized technology not everyone has mastered. And yes, old-style hardware emulators came with AEs because early generations of the tool were difficult to deploy. They required expertise and plenty of manual effort to get operational, hence the refrain “time to emulation.”

Some more optimistic project teams might take this gesture as a sign of commitment. It’s not. Not even close. It’s a sign of the tool’s weakness, because the tool doesn’t work.

Sadly, this scenario continues today, even with across-the-board advances in design automation methodologies, tools and techniques. Most design automation tools are on their umpteenth generation and should be almost foolproof.

It’s especially egregious as chip design continues to scale up. Hardware emulation is an example and my area of expertise. Emulators are purpose-built, turnkey solutions that let project teams effectively perform advanced development tasks on leading technology nodes and at the largest design scales. Hardware emulation has become a foundational tool for pre-silicon hardware and software design verification and validation, and is used for post-silicon debug. It cannot and should not fail, not when the stakes are so high.

When the stakes are high and the chips are down (that is, the chip designs are not getting debugged), hardware emulation vendors have been known to send in the village. Villagers then work with the lead designer, a scarce resource on any project team, to get the emulator up, running and usable. It’s a proposition no one wants and no project can afford.

Said project team has vowed that, in the future, it will not take a leap of faith. Instead, it will perform careful due diligence on all design automation tools and keep tighter control of the evaluations. Evaluations will be far more rigorous and will include benchmarks for comparison. Vendors will have minimal involvement, so the verification engineers know for sure the hardware emulator works as advertised. Members of the team will be assigned to check message boards, LinkedIn Groups and user testimonials.

The upcoming Design Automation Conference is a good place to start that due diligence. Vendors should have continuous demonstrations and user presentations with benchmark information and results. The applications shown may differ from yours, but they will offer hints of how effective the hardware emulator really is.