Hardware-Assisted Verification Overtakes HDL Simulation

Jean-Marie Brunet, MKTG Director, Mentor, a Siemens Business, jm_brunet@mentor.com
Lauro Rizzatti, MKTG Consultant, lauro@rizzatti.com

Sept 4, 2020

Over the past two years, a remarkable, though virtually unnoticed, financial event occurred in the electronic design automation (EDA) space: revenue from hardware-assisted verification tools, chiefly hardware emulation and FPGA-based prototyping, surpassed revenue from hardware description language (HDL) or register-transfer level (RTL) simulation.

A glimpse into the Market Statistics Service (MSS) reported quarterly by the ESD Alliance reveals that from 1995 through 2018, HDL simulation revenue exceeded revenue from hardware-assisted tools by an average of $100 million. See figure 1.

The situation reversed in 2018. The gap widened in 2019, and, according to the latest ESD Alliance Q1/2020 quarterly report, the shift continues: $190 million versus $128 million. Figure 2 focuses on quarterly revenue over the last three years and highlights the trend with linear interpolations.

Why did this happen, and is there a sensible reason to assume that the trend will persist into the future?

A Bit of History

At the close of the last millennium, the most advanced process technology node shrank to 180 nm, enabling design sizes to grow beyond 10 million ASIC-equivalent gates. The fabric of the largest designs consisted of single processors, memory blocks, and hardwired logic in the form of a few commercial IP and custom blocks.

Hardware design verification was well served by running HDL simulators on RTL designs exercised by hardware verification language (HVL) testbenches. Hardware emulation was in its infancy, deployed only in in-circuit emulation (ICE) mode, processing real-world traffic to perform system-level validation of the largest processor and graphics designs.

Embedded software was not yet widespread. When developed, it was validated either on FPGA-based prototypes after the RTL reached stability, or on pre-production silicon samples.

Over the following two decades, process technology nodes continued to shrink and hardware design complexity continued to grow. The fabric, shaped into a multi-layer hierarchical structure, included a variety of processing cores, plenty of memory, and many IP and custom blocks. Today, the largest designs approach 10 billion ASIC-equivalent gates, an increase of three orders of magnitude.

While the hardware expanded, so did the embedded software, to the point that it now implements much of the design functionality. As with the hardware, the software assumed the shape of a multi-layer hierarchical stack.

This state of affairs dramatically affected the overall cost of designing chips, and profoundly changed the design verification methodology. See figure 3.

Figure 4 charts the dollars spent on hardware verification and software validation at each process technology node. The scenario is reflected in the composition of the design community, where the software team now outnumbers the hardware team by several multiples.

New Verification/Validation Requirements: Trillions of Cycles for Billions of Gates

The sweeping changes in the semiconductor design landscape upended design verification methodologies, above all HDL simulation, whose throughput hit a wall.

Consider what is needed to perform exhaustive verification of the hardware and validation of the embedded software in designs exceeding one billion gates.

Embedded software includes drivers, one or more operating systems, middleware, and applications. All interact with the hardware. Any anomaly in either domain affects the other, compounding the already demanding task facing the verification/validation team. Unearthing deep-seated bugs, whether in hardware or in software, requires propagating their effects to the design outputs.

Today, the validation team must cope with an additional task. To win in a highly competitive environment, a new design must outperform the competition by delivering high performance and low power consumption. Both goals must be verified during development, before taping out the design.

In practical terms, this translates into the need to process trillions of verification cycles on a billion-gate design under test (DUT). That objective cannot be met by any testbench, regardless of its complexity. It can only be met by processing real-world applications or realistic workload benchmarks.
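A back-of-the-envelope calculation shows why trillions of cycles put HDL simulation out of reach. The throughput figures in this sketch are illustrative assumptions for a billion-gate DUT (roughly 100 Hz for HDL simulation, 1 MHz for emulation, 10 MHz for an FPGA prototype); actual numbers vary by tool, design, and configuration.

```python
# Back-of-the-envelope runtime for one trillion verification cycles on a
# billion-gate DUT. The throughput figures are illustrative assumptions,
# not measured data; real numbers vary by tool, design, and configuration.

CYCLES = 1_000_000_000_000   # one trillion verification cycles

SIM_HZ  = 100         # assumed HDL simulation throughput (cycles/s)
EMU_HZ  = 1_000_000   # assumed hardware emulation throughput (cycles/s)
FPGA_HZ = 10_000_000  # assumed FPGA prototyping throughput (cycles/s)

SECONDS_PER_DAY  = 24 * 3600
SECONDS_PER_YEAR = 365 * SECONDS_PER_DAY

def wall_clock_seconds(cycles_per_second: float) -> float:
    """Seconds of wall-clock time to execute CYCLES at the given rate."""
    return CYCLES / cycles_per_second

print(f"HDL simulation: {wall_clock_seconds(SIM_HZ) / SECONDS_PER_YEAR:,.0f} years")
print(f"Emulation:      {wall_clock_seconds(EMU_HZ) / SECONDS_PER_DAY:,.1f} days")
print(f"FPGA prototype: {wall_clock_seconds(FPGA_HZ) / SECONDS_PER_DAY:,.1f} days")
```

Under these assumptions, simulation would need roughly 300 years, while emulation finishes in about 12 days and an FPGA prototype in about a day. Even granting the simulator an extra order of magnitude of speed does not change the conclusion.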

Conclusion

Only hardware-assisted verification tools such as hardware emulators and FPGA prototypes possess the throughput to boot operating systems and execute the entire software stacks necessary to verify, validate, and analyze a DUT before committing to silicon.

It should be noted that the cost of ownership (COO) of emulators and FPGA prototypes is significantly higher than the COO of HDL simulators, which helps explain why their revenue surpassed HDL simulation revenue in 2018. Revenue aside, chip design verification groups must rely on emulators and FPGA prototypes to accomplish their goals. As a result, the revenue trend will continue.