What’s Behind Hardware Emulation’s Rising Status?

https://www.eeweb.com/whats-behind-hardware-emulations-rising-status/

Five questions commonly come up when chip designers and verification engineers ask me about hardware emulation. All are well considered, and the answers are worth sharing widely.

Why is emulation mandatory in the design verification toolbox?

For two independent reasons: the ever-increasing demand for performance and throughput from verification tools, and the remarkable progress in hardware emulation technology. The convergence of the two has propelled hardware emulation to a position of prominence in any verification flow.

Today, SoC designs span two fast-growing domains: staggering hardware complexity and escalating software content. Only hardware emulation can handle the demanding task of verifying the integration of the two and tracing design bugs across their boundary.

The invention of virtualization technology in support of hardware emulation, pioneered by IKOS Design Systems in the late 1990s, opened the path to new deployment modes and led to the creation of emulation data centers. (Note: IKOS was acquired by Mentor Graphics, now Siemens EDA, in 2002.)

What is emulation’s value proposition?

Whether we like it or not, market dynamics are a significant force in our lives. They can generate wealth and destroy fortunes. Miss a market window for a new product in a highly competitive market at your own risk: the slip could kill the product and take down the company.

In the electronic design world, missing a market window is typically due to a silicon re-spin. More generally, it results from a poorly scheduled roadmap with inadequate manpower and design-tool resources.

The more advanced the process node, the higher the cost of a re-spin. Yet no matter how costly the re-spin, late market entry is vastly more expensive: a product that is three months late forfeits roughly one-third of its total potential revenue.
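
To see where a figure like that can come from, here is a minimal back-of-envelope sketch. It assumes a symmetric triangular market-window model, with demand ramping linearly to a peak at mid-window and declining linearly to zero; the model and the 18-month window length are illustrative assumptions, not industry data:

```python
# Back-of-envelope model of revenue lost to late market entry.
# Assumption: a symmetric triangular market window -- demand ramps
# linearly to a peak at mid-window, then falls linearly to zero.
# A late entrant ramps at the same slope and rides the same decline,
# so it captures a smaller, geometrically similar triangle.

def lost_revenue_fraction(delay: float, window: float) -> float:
    """Fraction of total potential revenue forfeited by entering
    `delay` time units into a market window of length `window`."""
    if not 0 <= delay <= window:
        raise ValueError("delay must fall within the market window")
    captured = (1 - delay / window) ** 2  # area ratio of the two triangles
    return 1 - captured

# Illustrative numbers (assumed): 18-month market window, 3 months late.
print(f"{lost_revenue_fraction(3, 18):.0%}")  # -> 31%, roughly one-third
```

Under those assumptions, a three-month slip in an 18-month window forfeits about 31% of lifetime revenue, in line with the one-third rule of thumb.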

The bottom line is crystal clear: It is mandatory to eliminate the risk of missing a market window. Hardware emulation is the best verification tool for risk avoidance. By virtue of its thorough and fast hardware/software verification capabilities, it can eliminate re-spins, accelerate the roadmap schedule, and simultaneously increase the quality of the product.

From a user perspective, what are the differences between HDL simulators and emulators?

The differences come down to design size and verification workload size. As long as the design under test (DUT) is in the ballpark of 100 million gates or less, and the workload executes in no more than a day, HDL simulators are the preferred choice for hardware debug. They are easy to use, quick to set up, fast at compiling the DUT, and flexible for debugging a hardware design. And, rather important, they are inexpensive to acquire.

All of this makes the case that HDL simulators are the ideal choice for verification at the IP and block level in the early stages of the hardware design cycle.

When design and workload sizes exceed those limits and hardware/software testing is necessary, HDL simulators become ineffective, leaving hardware emulation as the only choice.

Today, hardware emulators handle any design size, even the multi-billion-gate designs found in AI/ML, 5G, and automotive applications. They can pinpoint difficult-to-find hardware bugs that may take many billions of verification cycles to uncover, as required for integrating embedded software with the underlying hardware. They support multiple concurrent users and can be accessed remotely from anywhere in the world. And, rather important, despite their perceived high acquisition cost, their return on investment is remarkably high.
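
A quick order-of-magnitude calculation shows why billions of cycles put such workloads out of simulation's reach. The throughput figures below are assumptions for a full-SoC DUT, not benchmarks:

```python
# Runtime of a multi-billion-cycle workload at assumed tool throughputs.
# Both speed figures are illustrative assumptions, not benchmarks.

CYCLES = 2_000_000_000  # e.g., cycles to boot an OS on a large SoC (assumed)

speeds_hz = {
    "HDL simulator (full-SoC DUT)": 100,        # ~100 cycles/s, assumed
    "hardware emulator":            1_000_000,  # ~1 MHz, assumed
}

for tool, hz in speeds_hz.items():
    hours = CYCLES / hz / 3600
    print(f"{tool:30s} ~{hours:,.1f} hours")

# HDL simulator (full-SoC DUT)   ~5,555.6 hours  (about 7.5 months)
# hardware emulator              ~0.6 hours
```

Under these assumptions, a workload that would tie up a simulator for months finishes on an emulator in about half an hour.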

From a user perspective, what are the differences between emulators and FPGA prototypes?

In principle, FPGA prototypes share the same technology foundation as hardware emulators: both use dedicated, reprogrammable hardware to accelerate the verification cycle. The hardware in emulators is typically designed from the ground up and customized for design verification; in prototypes, it is based on an array of commercial FPGAs.

Taking a closer look, prototypes give up the fast, easy design setup and compilation and the powerful DUT debug of emulators in exchange for significantly faster execution. Specifically, on the same DUT, a prototype may run 10× faster than an emulator.
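
Continuing the assumed numbers from the earlier sketch (again illustrative, not measured), the 10× gap is what makes prototypes attractive for long software runs:

```python
# Same assumed 2-billion-cycle OS boot, emulator vs. FPGA prototype.
CYCLES = 2_000_000_000
emulator_hz  = 1_000_000    # ~1 MHz, assumed
prototype_hz = 10_000_000   # ~10 MHz, assumed (the 10x figure above)

print(f"emulator:  {CYCLES / emulator_hz / 60:5.1f} minutes")   # ~33.3
print(f"prototype: {CYCLES / prototype_hz / 60:5.1f} minutes")  # ~ 3.3
```

For a software team rerunning boots and regressions all day, that gap compounds quickly.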

FPGA prototypes are the better choice for software validation, while emulators excel at system-level hardware verification and hardware/software integration.

Can emulators and FPGA prototypes be integrated in a common verification/validation flow?

Definitely. They can and should be integrated.

First, they should share the compilation front end, while the back end remains tool-dependent. The benefit would be easier and faster DUT compilation: if a DUT compiles for emulation, it will likely compile for prototyping.

Second, they should share the same DUT database so that execution can be offloaded from one engine to the other at runtime. For example, booting an OS and executing a software workload could run on the prototype until a bug is hit; saving the design state on the prototype and restoring it into the emulator would then make accurate debug tracing dramatically faster.
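
A minimal orchestration sketch of both points follows. Every name in it, including the hav_flow module and all of its functions, is hypothetical and invented for illustration; actual emulators and prototypes expose their own proprietary compile and runtime interfaces:

```python
# Hypothetical sketch: shared compile front end + runtime offload.
# The hav_flow module and every call below are invented for illustration;
# no real tool exposes this API.
from hav_flow import (
    elaborate_rtl,      # shared front end: parse and elaborate the RTL once
    compile_for,        # tool-specific back ends (emulator, FPGA prototype)
    run_until_failure,  # execute a software workload on a compiled image
    checkpoint,         # save the DUT state database
    restore,            # reload that state into the other engine
    trace_window,       # replay a span of cycles with full visibility
)

# 1. One shared DUT database feeds both back ends.
dut_db = elaborate_rtl(sources=["soc_top.sv"], top="soc_top")
proto_image = compile_for(dut_db, target="fpga_prototype")
emu_image = compile_for(dut_db, target="emulator")

# 2. Run the long workload (e.g., an OS boot) at prototype speed.
result = run_until_failure(proto_image, workload="boot_linux.elf")

if result.failed:
    # 3. Checkpoint the prototype, restore into the emulator, and replay
    #    the final stretch with full signal tracing for accurate debug.
    ckpt = checkpoint(result, cycles_before_failure=1_000_000)
    session = restore(emu_image, ckpt)
    trace_window(session, cycles=1_000_000, waves="failure_trace")
```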

One step further along the integration roadmap is to add a virtual prototype platform based on hybrid emulation.

By tightly coupling best-in-class emulators, virtual prototypes, and FPGA prototypes, a verification team can implement a state-of-the-art and effective “shift left” strategy.

Earlier this year, several announcements touted next-generation hardware-assisted verification platforms that tie together hardware emulation and FPGA prototyping, deployed in virtual mode, with a comprehensive software test environment. All are foundational tools in any chip design verification flow.