The Versatility of Hardware Emulation Magnifies its Return on Investment

Hardware emulation is widely considered the universal verification tool that can be used throughout the SoC development cycle

Source: Electronic Design

The Design Automation Conference — that veritable shopping bazaar for design automation tools — is fast approaching, and verification teams are starting to make their “must see” lists of vendors. Chief among them have to be the hardware emulation providers, for emulation is the tool dominating verification strategies these days.

And, little wonder, because hardware emulation is widely considered the universal verification tool that can be used throughout the SoC development cycle. It can map any design size, even in excess of a billion gates, and it gives users full design visibility for thorough hardware debugging. By virtue of its six-orders-of-magnitude speedup over RTL simulators, it can validate embedded software, including drivers, operating systems, diagnostics, and applications. In fact, it is used extensively to debug processors, graphics, multimedia, networking, storage, automotive designs, and practically any other digital design.
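
To put six orders of magnitude in perspective, here is a back-of-the-envelope sketch; the clock rates and cycle count below are illustrative assumptions, not benchmarks from any vendor.

```python
# Back-of-the-envelope comparison; all rates below are assumed for
# illustration, not measured or vendor-published figures.

SIM_HZ = 2          # assumed effective RTL-simulation speed (design clocks/sec)
EMU_HZ = 2_000_000  # assumed emulation speed (design clocks/sec)

CYCLES_TO_BOOT_OS = 50_000_000_000  # assumed ~50 billion cycles to boot an OS

sim_years = CYCLES_TO_BOOT_OS / SIM_HZ / (3600 * 24 * 365)
emu_hours = CYCLES_TO_BOOT_OS / EMU_HZ / 3600

print(f"Simulation: ~{sim_years:,.0f} years")  # ~793 years
print(f"Emulation:  ~{emu_hours:.1f} hours")   # ~6.9 hours
```

Under those assumptions, a workload that is flatly impossible in simulation finishes on an emulator in a working day.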

The return on investment (ROI) has to be compelling for engineering managers and accountants alike.


Let’s consider why. Today’s engineering teams are composed of both hardware engineers and software developers because hardware and software are developed simultaneously. Both groups have come to value hardware emulation’s ability to verify hardware and software concurrently. Because the emulated design is based on the actual silicon implementation, albeit not a timing-accurate one, it offers an accurate functional representation of the design before silicon is available for testing. This is critical for tracing bugs, even software bugs that propagate through the hardware. Its value is quantifiable and justifiable.

Over time, the deployment of hardware emulators has changed. The change has been accelerated by the need for a more accommodating test environment than the one offered by the long-relied-upon in-circuit emulation (ICE) mode. In ICE mode, which is still in use but waning in popularity, a physical target system — where the design under test (DUT) will reside once it’s taped out — provides the stimulus and processes the response.

The change has come in the form of virtual target systems driving the DUT via interfaces implemented by transactors — synthesizable models of protocol interfaces. These transactors communicate with the virtual target system through untimed packets of information, and with the DUT through bit-level signals. Since transactors are mapped inside the hardware emulator, they execute at the maximum speed of the emulator. In transaction-based verification, users describe the virtual test environment, or testbench, at a higher level of abstraction, using at least one order of magnitude less code than in conventional hardware verification language (HVL) testbenches.
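
To make the abstraction gap concrete, the following Python sketch mimics what a transactor does conceptually: it expands one untimed packet into many cycle-by-cycle, bit-level bus events. The class names, the byte-wide bus, and the drive_pins callback are all invented for illustration; production transactors are written as synthesizable RTL paired with a host-side software front end.

```python
# Conceptual sketch of a transactor, for illustration only.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Transaction:
    """Untimed packet exchanged with the virtual testbench."""
    payload: bytes

class Transactor:
    """Expands untimed transactions into cycle-by-cycle pin activity."""

    def __init__(self, drive_pins: Callable[[int, int, int], None]):
        # drive_pins(cycle, valid, data) stands in for the bit-level
        # interface the transactor presents to the DUT.
        self.drive_pins = drive_pins
        self.cycle = 0

    def send(self, txn: Transaction) -> None:
        # One untimed packet becomes many timed, bit-level bus cycles.
        for byte in txn.payload:
            self.drive_pins(self.cycle, 1, byte)  # valid=1, one byte/cycle
            self.cycle += 1
        self.drive_pins(self.cycle, 0, 0)         # deassert valid
        self.cycle += 1

# Usage: the testbench thinks in packets; the DUT sees pin wiggles.
log: List[str] = []
tx = Transactor(lambda c, v, d: log.append(f"cycle {c}: valid={v} data=0x{d:02x}"))
tx.send(Transaction(payload=b"\xde\xad\xbe\xef"))
print("\n".join(log))
```

The testbench author writes the four-byte packet; the emulator-side half of the transactor handles the per-cycle detail, which is where the code savings come from.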

Transaction-based emulation is attractive to many engineering teams and semiconductor companies because it doesn’t require an on-call technician. This is handy when a remote user logs in, or if another user needs to swap designs, because no manual intervention is needed to plug or unplug speed adapters.

While ICE mode verifies the DUT with real traffic, new developments like VirtuaLAB effectively replace the physical ICE testbench with a functionally equivalent virtual testbench, thereby removing one of the few remaining obstacles to adopting transaction-based emulation.

Engineers still wanting to use ICE now have a new capability, called “Deterministic ICE,” that removes the randomness inherent in the physical world. Such non-repeatable behavior makes design debug a royal pain in the neck. Think of a fault that shows up at clock cycle “N” on the first run, at cycle “M” on the second run, and not at all on the third. Confusing, isn’t it? The ability to record an ICE session and replay it at will, without a connection to the physical test environment, makes every subsequent emulation run repeatable and deterministic, thereby speeding up design debug.
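
The mechanism is easy to picture. The sketch below is a loose Python analogy of record-and-replay; the file format and function names are invented for illustration, whereas the real capability captures the ICE interface traffic inside the emulator itself.

```python
# Conceptual record-and-replay sketch; names and format are hypothetical.

import json

def record(live_stimulus, path="session.json"):
    """First run: capture every (cycle, signal, value) event coming from
    the physical target, so later runs need no connection to it."""
    events = [(cycle, sig, val) for cycle, sig, val in live_stimulus]
    with open(path, "w") as f:
        json.dump(events, f)
    return events

def replay(path="session.json"):
    """Subsequent runs: feed back the recorded events in cycle order,
    making every run bit- and cycle-identical to the captured one."""
    with open(path) as f:
        for cycle, sig, val in json.load(f):
            yield cycle, sig, val  # same event at the same cycle, every time

# A fault observed at cycle N during recording reappears at exactly
# cycle N on every replay, which is what makes the debug loop tractable.
```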

Just as important, transaction-based emulation paved the way for the creation of design datacenters, which are enabled by state-of-the-art resource management tools, and which can be accessed remotely by any number of users from anywhere in the world at any time. Global emulation enterprise servers with a design capacity of several billion gates are able to support multiple large designs or a combination of large and small designs.

Furthermore, the recent introduction of emulation apps is expanding the use model beyond traditional RTL verification. These include, to mention a few, verification of low-power designs with tens if not hundreds of power domains, tracking the design’s switching activity to estimate average and peak power consumption, and verification of design-for-test (DFT) logic.
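
As a flavor of what a power app computes, the sketch below applies the textbook dynamic-power formula, P = a·C·V²·f, to per-net switching activity; every net name and number here is made up for illustration, not taken from any tool.

```python
# Dynamic-power estimate from switching activity (illustrative values only).

def dynamic_power(activity, cap_farads, vdd, freq_hz):
    """activity: fraction of clock cycles on which the net toggles."""
    return activity * cap_farads * vdd**2 * freq_hz

# Per-net activity as an emulator power app might report it:
nets = {
    "cpu_core": (0.25, 2e-9),  # (activity factor, switched capacitance in F)
    "l2_cache": (0.05, 5e-9),
    "ddr_phy":  (0.40, 1e-9),
}

VDD, FREQ = 0.8, 1e9  # assumed 0.8 V supply, 1 GHz clock

per_net = {n: dynamic_power(a, c, VDD, FREQ) for n, (a, c) in nets.items()}
avg_watts = sum(per_net.values())
# Worst-case peak: every net toggling on the same cycle (activity = 1.0).
peak_watts = dynamic_power(1.0, sum(c for _, c in nets.values()), VDD, FREQ)

print(f"average ~{avg_watts:.2f} W, worst-case peak ~{peak_watts:.2f} W")
```

The emulator’s contribution is the activity data itself: only a run long enough to cover realistic software workloads yields toggle counts worth feeding into such an estimate.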

Each of the above contributes to improving ROI. Cumulatively, they dramatically magnify the ROI of hardware emulation, and that is what this year’s DAC attendees will expect to hear more about.

Dr. Lauro Rizzatti is a verification consultant and industry expert on hardware emulation (www.rizzatti.com). Previously, Dr. Rizzatti held positions in management, product marketing, technical marketing, and engineering. He can be reached at lauro@rizzatti.com.