The emulator thrives as verification models mushroom

Source: Tech Design Forum

Emulation has evolved from infancy at the end of the 1980s to adolescence in the 1990s and finally to maturity in the 21st century. Yes, emulation has been around for close to 30 years. Who remembers Supersim, the first custom field-programmable gate array (FPGA)-based emulator, launched in 1988 and pre-dating the rise of Zycad, Quickturn and IKOS, three icons of the 1990s?

During the 1990s, significant engineering effort went into reducing the work needed to make a design emulation-ready and into enhancing debugging features. Until then, in the absence of native visibility into the mapped design, emulators were used as ‘black boxes’. Deployed in in-circuit-emulation (ICE) mode, they verified the design-under-test (DUT) with real traffic data, albeit at lower than actual speed. In those early days, the deployment of an emulation system came with a “team of field application engineers in the box” to compensate for these drawbacks.

These limitations drove the introduction of custom-chip emulators to replace standard FPGA-based solutions. They eased the task of mapping the DUT, opened up full visibility into its inner workings, and supported an acceleration mode driven by software testbenches.

In the first decade of the new millennium, the acceleration mode evolved into a virtualization mode, made possible by transaction-based communication between the emulator and the testbench running on the workstation.
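To illustrate the idea, the sketch below contrasts transaction-level communication with cycle-by-cycle signal driving: the workstation testbench sends a whole burst in a single call rather than toggling interface signals every clock. This is a conceptual Python model only; `EmulatorProxy`, `BusTransaction` and their methods are hypothetical stand-ins for the transactor infrastructure a real emulator would provide, not any vendor's API.

```python
# Minimal sketch of transaction-based (virtualized) co-emulation.
# EmulatorProxy and BusTransaction are hypothetical illustrations, not a real API.

from dataclasses import dataclass
from typing import List


@dataclass
class BusTransaction:
    """One high-level operation sent to the emulated DUT."""
    kind: str          # e.g. "WRITE_BURST" or "READ_BURST"
    address: int
    data: List[int]    # payload for writes; length sets the burst size for reads


class EmulatorProxy:
    """Stand-in for the transactor that bridges workstation and emulator.

    In a real flow this side would talk to a hardware transactor inside the
    emulator; here it simply models the DUT's memory so the sketch runs.
    """

    def __init__(self):
        self._memory = {}

    def send(self, txn: BusTransaction) -> List[int]:
        # One call carries a whole burst, instead of thousands of per-cycle
        # signal updates crossing the workstation/emulator link.
        if txn.kind == "WRITE_BURST":
            for offset, word in enumerate(txn.data):
                self._memory[txn.address + offset] = word
            return []
        if txn.kind == "READ_BURST":
            return [self._memory.get(txn.address + i, 0)
                    for i in range(len(txn.data))]
        raise ValueError(f"unknown transaction kind: {txn.kind}")


# Testbench side: drive the DUT with transactions, not clock-by-clock signals.
dut = EmulatorProxy()
dut.send(BusTransaction("WRITE_BURST", address=0x1000, data=[1, 2, 3, 4]))
readback = dut.send(BusTransaction("READ_BURST", address=0x1000, data=[0] * 4))
assert readback == [1, 2, 3, 4]
```

Because far fewer, far larger messages cross the link, the emulator spends its time running the DUT at hardware speed instead of waiting on the workstation, which is what makes the virtualization mode practical.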

The emulator achieves greater cost efficiency

Throughout hardware emulation’s existence, design capacity has continued to increase while the typical selling price has consistently dropped. Specifically, capacity has grown from less than 100,000 ASIC gates to more than one billion gates today. Designs now include multi-core processors with integrated graphics, as well as networking switches and routers with 256 or more ports. Initially priced at more than $10 per gate, emulation deals for less than one cent per gate are now not uncommon.

From those early days, hardware emulation has moved into the verification mainstream and is considered the foundation of many verification strategies. Its use models are growing as well because of its versatility and value as a resource. In fact, several new use models address tasks previously deemed inefficient or unachievable with this verification tool, giving hardware verification engineers and software development teams more options to increase their productivity and reduce verification risk.

For example, verification engineers can perform low-power verification and accelerate power estimation through tight integration between the emulator and power estimation tools. Another use model is design for test (DFT), which lets test engineers verify the DFT circuitry and DFT patterns in a complex system-on-chip (SoC) design. Because the DFT logic is synthesized automatically and inserted at the gate level, the insertion may disrupt the functionality of the design, making verification of the DUT mandatory to ensure there are no isolated scan-chain or vector errors.
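As a rough illustration of why scan-chain verification matters, the sketch below models shifting a known pattern through a scan chain and shows how a single mis-wired flop, the kind of defect automatic insertion can introduce, corrupts the result. This is a conceptual Python model under assumed simplifications (one chain, one fault type); it is not how an emulator or DFT tool actually performs the check.

```python
# Minimal sketch of a scan-chain continuity check (illustrative only;
# real DFT verification runs the actual gate-level netlist on the emulator).

def shift_through_chain(chain_length, pattern, broken_at=None):
    """Shift `pattern` through a scan chain of `chain_length` flops.

    `broken_at` optionally models a mis-wired flop that drops its input,
    the kind of defect gate-level scan insertion can introduce.
    """
    flops = [0] * chain_length
    out = []
    for bit in pattern + [0] * chain_length:  # extra zeros flush the chain
        shifted_out = flops[-1]
        # Shift every flop toward scan-out; a broken flop loses its input.
        for i in range(chain_length - 1, 0, -1):
            flops[i] = 0 if broken_at == i else flops[i - 1]
        flops[0] = 0 if broken_at == 0 else bit
        out.append(shifted_out)
    return out[chain_length:]  # the bits that correspond to the input pattern


pattern = [1, 0, 1, 1, 0, 0, 1, 0]
assert shift_through_chain(8, pattern) == pattern              # intact chain
assert shift_through_chain(8, pattern, broken_at=3) != pattern # fault detected
```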

Since the complexity of most designs defeats the hardware description language (HDL) simulator, only emulators can perform thorough debug. Indeed, DFT verification can be completed before tape-out, ensuring that any errors are fixed prior to the creation of masks.

The emulator beyond ICE

Another deployment mode overcomes one of the main drawbacks of ICE-based emulation: the non-deterministic nature of real-world traffic. Real-world traffic is not repeatable, making design debug a frustrating and time-consuming experience. If a particular traffic pattern pinpoints a fault, repeating the scenario is virtually impossible. But by capturing the design activity from the initial ICE run and replaying it over and over without a connection to the real world, it is possible to create a repeatable and deterministic environment. If a bug is detected, instead of trying to repeat that traffic pattern on the original ICE setup, the verification team can use the replay database to re-run the exact test that exposed the problem. This shortens the time needed to unearth and fix bugs.
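A minimal sketch of the capture-and-replay idea, assuming a toy DUT and a JSON capture file rather than a real emulator database: the first run records the non-repeatable stimulus at the DUT boundary, and the replay run re-drives the DUT from that recording, producing identical behavior every time.

```python
# Minimal sketch of ICE capture-and-replay (illustrative; a real emulator
# records activity at the DUT interface with cycle-accurate timestamps).

import json
import random

responses = []


def toy_dut(cycle, stimulus):
    """Stand-in for the emulated design: records its response to each stimulus."""
    responses.append((cycle, stimulus ^ 0xFF))


def live_run(dut, capture_path):
    """Drive the DUT with 'real-world' (non-repeatable) traffic and record it."""
    captured = []
    for cycle in range(100):
        stimulus = random.randint(0, 255)        # stand-in for live ICE traffic
        captured.append({"cycle": cycle, "stimulus": stimulus})
        dut(cycle, stimulus)
    with open(capture_path, "w") as f:
        json.dump(captured, f)


def replay_run(dut, capture_path):
    """Re-drive the DUT from the capture database: same stimulus, every time."""
    with open(capture_path) as f:
        captured = json.load(f)
    for entry in captured:
        dut(entry["cycle"], entry["stimulus"])


live_run(toy_dut, "capture.json")
first = list(responses)

responses.clear()
replay_run(toy_dut, "capture.json")
assert responses == first   # deterministic: the bug-triggering run reproduces exactly
```

The point of the sketch is simply that once the stimulus is stored, every replay is bit-for-bit identical, so a bug seen once can be reproduced and debugged at will.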

New use models are cropping up for hardware emulation at an accelerated pace, offering verification engineers many more options and helping them solve tough debug challenges. Several verification tasks can be carried out faster and more thoroughly, opening up new verification scenarios that address new vertical markets.

The emulation market has been growing steadily since 2001, from $120m to more than $300m in 2015. To say that it is thriving is almost an understatement.