A Match Made in Chip Verification Heaven: Simulation and Emulation

Given the complexity of chip designs and the various verification stages along the way, project teams use both simulation and emulation in their verification flows.

Source: EETimes

Simulators — for both analog and digital design — have been around for so long (~40 years now) that most project teams take them for granted. Hardware emulation has been around for almost as long (30 years, not 40); however, it has only recently become a go-to verification tool used by just about every project team verifying a chip design.

While pundits may predict the departure of simulation from the verification flow and the rise of hardware emulation, they actually co-exist nicely… thank you very much.


In the 1980s, the logic simulator became mainstream, promoted by the Computer-Aided Engineering (CAE) industry as the vehicle for design verification. At the time, all logic simulators operated at the gate level (by contrast, analog simulators of the era operated at the transistor level).

In the 1990s, simulation moved up the abstraction ladder to the register transfer level (RTL), supporting the two most popular hardware description languages (HDLs) — Verilog and VHDL. Toward the end of that decade, the vendors — by then called EDA (Electronic Design Automation) companies, reflecting the merger of CAE and CAD — supported both languages in the same tool. Today, all three major EDA players — Cadence, Mentor, and Synopsys — offer their own HDL simulators, with each company holding roughly one third of the market.

Accurate timing modeling, four logic states (0, 1, X, and Z), and multiple logic strengths give the HDL software simulator the breadth to verify both the functionality and timing behavior of any digital design in any industry segment… up to a point. Execution speed drops off for designs in excess of about 100 million ASIC-equivalent gates, largely due to cache misses and memory swapping. That size is not a hard limit; larger designs can be simulated, but execution becomes unbearably slow.

To simulate one second of real data on a 100-million ASIC-gate design clocked at 500 MHz, for example, a fast simulator running at 10 Hz (10 design cycles per second) would take 50 million seconds. That is almost 600 days for a large, complex design, unacceptable under any realistic schedule, and one second of real data is far more than hardware debugging requires anyway. Typical testbenches aimed at hardware debug on a software simulator therefore generate data equivalent to one millisecond or less of real time, which brings the execution time down to a day or less on a state-of-the-art PC configured with plenty of memory — a reasonable target.
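As a quick sanity check on these numbers, here is a minimal back-of-the-envelope sketch in Python, assuming the 500 MHz design clock and 10 Hz simulation speed quoted above:

    # Rough wall-clock estimate for HDL simulation, using the figures above.
    design_clock_hz = 500e6   # design clocked at 500 MHz
    sim_speed_hz = 10         # fast software simulator: ~10 design cycles per second

    def sim_wall_clock_seconds(real_seconds):
        cycles = real_seconds * design_clock_hz
        return cycles / sim_speed_hz

    print(sim_wall_clock_seconds(1.0) / 86_400)   # ~579 days for one second of real data
    print(sim_wall_clock_seconds(1e-3) / 3_600)   # ~14 hours for one millisecond, i.e. a day or less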

To increase throughput, HDL software simulators can be run in parallel on PC farms, with each PC processing a self-contained testbench. In the above example, a farm of 1,000 PCs would process close to one billion cycles per day. This processing power is well suited to running many small block-level tests in parallel, as is typical of large regression test suites.

Still, this approach does not suit single, monolithic tests that are sequential in nature and that require advancing the design to a “state of interest” for testing, such as the point following OS boot-up. In other words, simulation farms are simply not adequate for executing embedded software. Processing embedded software requires executing several billion cycles in sequence; software programs cannot be split into subsets and run in parallel, because the task is inherently sequential. For this, hardware emulation is the perfect choice.

Hardware emulation became popular in the 1990s to verify the largest designs of the time. Processor and graphics designs to this day demand long sequences of test cycles.

Using the same example discussed above, an emulator running at 1 MHz would take 500 seconds to execute one second of real time and — along the way — process 500 million cycles. Thus, an emulator is capable of booting an operating system in one to two hours.
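Extending the same sketch to emulation (assuming the 1 MHz emulation speed cited above and, as a purely hypothetical figure, an operating-system boot on the order of five billion cycles):

    # Same design mapped onto an emulator running at 1 MHz.
    design_clock_hz = 500e6   # design clocked at 500 MHz
    emu_speed_hz = 1e6        # emulator executing one million design cycles per second

    print(design_clock_hz / emu_speed_hz)          # 500 s of emulator time per second of real data

    os_boot_cycles = 5e9      # hypothetical: an OS boot on the order of five billion cycles
    print(os_boot_cycles / emu_speed_hz / 3_600)   # ~1.4 hours, in line with "one to two hours"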

Of course, in those days the advantages came with strings attached. Hardware emulators were unfriendly to use and required several months to set up the design-under-test (DUT), which often pushed their deployment past the delivery of first silicon from the foundry, defeating much of the value they provided. Their reliability was abysmal, with a mean time between failures (MTBF) of less than one day; their cost was high; and the return on investment (ROI) was low.

The limitations of these early emulators, built on commercial FPGAs, drove new thinking and the introduction of custom-chip emulators. Over time, two schools of thought became established. One promoted custom FPGA-based emulators, also called emulator-on-chip, as offered by Mentor Graphics; the other fostered processor-based emulators, as offered by Cadence Design Systems. Meanwhile, significant advances in commercial FPGAs reopened the door to emulators based on off-the-shelf FPGAs; Synopsys became an advocate of this approach after it acquired EVE in 2012.

Regardless of the technological foundation, all modern emulators have removed or significantly alleviated these early drawbacks. Unlike HDL simulators, they are undefeated by any design size, but they still require a relatively long setup time (from a day to a week) and are relatively slow to compile a design (from a few minutes to a few hours).

Emulation's move toward the mainstream has also been helped by support for a single-source flow that can switch between software simulation and hardware emulation and produce identical results.

Conclusions
As long as the DUT’s size is manageable — that is, as long as a simulation run lasts a day or less — HDL software simulators are the best choice for hardware debug. They are easy to use, quick to set up, extremely fast at compiling the DUT, and superbly flexible for debugging a hardware design. They are also reasonably priced.

Typically, they are used in the early stages of the verification process, at the IP and subsystem level. They become challenging at the system level, however, when the DUT reaches several tens of millions of gates, and they are of no use for embedded software integration and validation.

Conversely, hardware emulators are undefeated by any design size. They can unearth difficult-to-find hardware bugs that take many millions of verification cycles to expose, and they are the only choice for integrating embedded software with the underlying hardware, a task that requires billions of clock cycles.

Given the complexity of chip designs and the various verification stages along the way, project teams use both software simulation and hardware emulation in their verification flows, and they wouldn’t want it any other way.

Dr. Lauro Rizzatti is a verification consultant and industry expert on hardware emulation (www.rizzatti.com). Previously, Dr. Rizzatti held positions in management, product marketing, technical marketing, and engineering. He can be reached at lauro@rizzatti.com.