Hardware Emulation Goes Mainstream

Source: Chip Design

These days, you’d need to be buried under a pile of verification reports not to know that hardware emulation has gone mainstream, moving out of a dusty back room and onto your cubemate’s desktop.

The metamorphosis did not happen overnight; it was more like a decade-long journey. But once it happened, the rush to adoption was on. It took a dedicated effort by more than one marketing department to tout the benefits of hardware emulation as an answer to mounting time-to-market pressure. The obvious message was to promote emulation as the panacea for the escalating complexity of hardware design and the explosion of embedded software, which now encompasses validation programs, drivers, operating systems, applications and diagnostics. That was not the only driver, however. New use modes accelerated the adoption process.

For example, the traditional in-circuit emulation (ICE) mode was forced to take a back seat in favor of the transaction-based emulation mode, often called acceleration mode. In turn, the transaction-based approach opened the path to integrating emulation with virtual environments.
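To make the contrast with pin-level ICE concrete, here is a minimal, self-contained C++ sketch of the idea behind transaction-based acceleration: the host-side testbench exchanges whole transactions with a transactor rather than driving individual design pins every clock cycle. The `EmulatorChannel` class and the message format are hypothetical stand-ins, not any vendor's or standard's API; an in-process queue plays the role of the host-to-emulator link.

```cpp
// Sketch of transaction-based acceleration (hypothetical API).
// A real flow would use a co-modeling channel to a synthesizable
// transactor inside the emulator; here an in-process queue stands in
// for that link so the example is runnable on its own.
#include <cstdint>
#include <iostream>
#include <queue>
#include <vector>

// One bus write transaction: the unit of exchange with the emulator,
// instead of per-cycle pin values.
struct BusWrite {
    uint32_t address;
    std::vector<uint8_t> payload;
};

class EmulatorChannel {
public:
    void send(const BusWrite& tx) { pending_.push(tx); }

    // The "emulator side" drains transactions and expands each one into
    // however many DUT clock cycles the bus protocol requires.
    void drain() {
        while (!pending_.empty()) {
            const BusWrite& tx = pending_.front();
            std::cout << "transactor: write of " << tx.payload.size()
                      << " bytes to 0x" << std::hex << tx.address
                      << std::dec << "\n";
            pending_.pop();
        }
    }

private:
    std::queue<BusWrite> pending_;
};

int main() {
    EmulatorChannel channel;
    // The host testbench thinks in transactions, not clock edges.
    channel.send({0x1000, {0xDE, 0xAD, 0xBE, 0xEF}});
    channel.send({0x2000, {0x01, 0x02}});
    channel.drain();
    return 0;
}
```

Because the host and the emulator exchange compact transactions instead of per-cycle signal values, the communication overhead drops sharply, which is what makes the acceleration mode fast enough to pair with virtual environments.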

Emulation’s prime time was made possible by a few dedicated engineering teams who toiled to turn the formerly hard-to-use hardware into a cost-effective, easy-to-use and efficient verification solution.

The revival of the field programmable gate array (FPGA)-based approach for emulation put an end to the dominance of the processor-based method that ruled the first decade of the new millennium. Specifically, the custom FPGA technique promoted by one supplier as “emulator-on-chip” removed the drawbacks that troubled the original commercial FPGA-based emulators, namely, excruciating setup time, long compile time and problematic design visibility.

Regardless of the hardware implementation, design capacity expanded dramatically to handle even the largest and most esoteric designs ever created, at a scale the revered hardware description language (HDL) simulator could not keep up with. Greater capacity, along with multi-user operation, improved the return on investment of this expensive verification technology. New features were added for push-button compilation and simulator-like debug capabilities such as waveform generation, assertion checking and coverage collection. Other features enabled power analysis and switching-activity tracking to support power estimation. Execution speed improved, and so did hardware reliability.
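As one illustration of how tracked switching activity feeds power estimation, the sketch below applies the standard dynamic-power relation P = α·C·V²·f per net and sums the contributions. The `NetActivity` record and the numbers are hypothetical; a real flow hands the recorded activity data to a dedicated power-analysis tool rather than computing the estimate by hand.

```cpp
// Illustrative dynamic power estimate from per-net switching activity,
// using P = alpha * C * V^2 * f summed over nets (hypothetical data).
#include <iostream>
#include <vector>

struct NetActivity {
    double toggle_rate;       // alpha: average toggles per clock cycle
    double capacitance_farad; // C: effective switched capacitance
};

double dynamic_power_watts(const std::vector<NetActivity>& nets,
                           double supply_volts, double clock_hz) {
    double total = 0.0;
    for (const NetActivity& net : nets) {
        total += net.toggle_rate * net.capacitance_farad
                 * supply_volts * supply_volts * clock_hz;
    }
    return total;
}

int main() {
    // Two made-up nets with 10 fF and 25 fF effective capacitance.
    std::vector<NetActivity> nets = {{0.15, 10e-15}, {0.40, 25e-15}};
    std::cout << dynamic_power_watts(nets, 0.9, 1.0e9) * 1e6
              << " microwatts\n";  // ~9.3 microwatts for this toy example
    return 0;
}
```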

For about two decades, emulation had been a niche player within the verification space. Emulators were reputed to be expensive and difficult to use, and only the most difficult designs at semiconductor companies with the biggest budgets had any use for them. Today, hardware emulation is a staple of any verification strategy because it can identify even the most difficult-to-find bugs in the most challenging designs, especially when a bug’s effects cross the border between the embedded software and the underlying hardware. Time-to-emulation has never been shorter (setup can sometimes be done in a day), and design iteration time has dropped enough to allow multiple iterations per day.

Yes, usage is growing as companies invest in hardware emulation. Emulation is the only way to fully verify a design’s hardware and software together prior to tapeout. It has morphed from a tool used by eccentric project teams doing esoteric designs into a mainstream, “must have” tool that’s now viewed as flexible, scalable, multipurpose and invaluable.