Hardware Emulation Plus FPGA Prototyping: A Perfect Fit for Today’s SoC Verification
- January 27, 2021
- Posted by: Lauro Rizzatti
- Category: 2021
A quick glance into today’s design verification toolbox reveals a variety of point tools supporting the latest system-on-chip (SoC) design development. Combined and reinforced by effective verification methodologies, these tools can trace even the hardest-to-find bugs, whether in software or in the target hardware.
Two of those tools stand out: hardware emulation and field programmable gate array (FPGA) prototyping.
Hardware emulators provide massive design capacity, performance in the low-megahertz range, and powerful debug capabilities. Since their inception, they have become easier to use, versatile in addressing multiple use models and verification tasks, and deployable in data centers supporting several concurrent users and designs.
The collective capabilities make emulators the perfect platform for integrating hardware and software and for thorough testing by running real-world software workloads and benchmarks before tape-out.
Below are key attributes of a powerful hardware emulator.
Speed of emulation
Fast execution is an emulator’s foundation. Running up to six orders of magnitude faster than software-based simulators and reaching a few megahertz, emulators can test a design under test (DUT) with real traffic in in-circuit emulation (ICE) mode and/or with simulated real-world workloads and industry-standard benchmarks in virtual mode.
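To put that speed gap in concrete terms, here is a back-of-the-envelope calculation. The throughput figures are illustrative assumptions chosen to match the orders of magnitude discussed above, not measured data from any specific product.

```python
# Back-of-the-envelope comparison of wall-clock time needed to execute
# a fixed number of DUT clock cycles on different verification engines.
# The throughput figures are illustrative assumptions, not measured data.

def wall_clock_hours(cycles: int, effective_hz: float) -> float:
    """Hours of wall-clock time to run `cycles` DUT cycles at `effective_hz`."""
    return cycles / effective_hz / 3600

# Assume a workload of 1 billion DUT cycles (e.g., a short bare-metal test).
CYCLES = 1_000_000_000

engines = {
    "RTL simulator (~10 Hz effective)": 10,
    "Hardware emulator (~1 MHz)": 1_000_000,
    "FPGA prototype (~10 MHz)": 10_000_000,
}

for name, hz in engines.items():
    print(f"{name}: {wall_clock_hours(CYCLES, hz):,.2f} hours")
```

At these assumed rates, the same billion cycles that take a simulator years of wall-clock time finish in well under an hour on hardware-assisted platforms, which is why OS boots and real workloads are only practical there.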
While the future of Moore’s Law is debated in the semiconductor industry, design size continues to grow at an exponential rate. Designs for artificial intelligence/machine learning (AI/ML), 5G, and autonomous vehicle applications, for example, may already reach 10 billion gates. Only hardware-assisted verification platforms can manage this capacity.
Significant progress in compilation technology has alleviated the pain of mapping billion-gate designs onto the emulator, now measured in days instead of months.
Modes of operation
The virtual deployment, with a software-based test environment driving the DUT via transaction-based protocol interfaces, heralded new verification opportunities. The ability to drive the DUT with virtual workloads and to execute industry-standard benchmarks made it possible for emulators to estimate power consumption within about 15% of real silicon when performed at the register transfer level and within about 5% at the gate level.
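To see what those accuracy figures mean in absolute terms, the short sketch below converts them into error bands around a hypothetical silicon measurement. The 10 W figure is an assumed example; only the 15% and 5% accuracies come from the text.

```python
# Illustrative error bands for pre-silicon power estimates, using the
# accuracy figures quoted in the text (~15% at RTL, ~5% at gate level).
# The 10 W silicon measurement is an assumed example value.

silicon_power_w = 10.0  # hypothetical measured silicon power

def estimate_band(actual: float, error_fraction: float) -> tuple[float, float]:
    """Range within which an estimate with the given relative error falls."""
    return actual * (1 - error_fraction), actual * (1 + error_fraction)

rtl_low, rtl_high = estimate_band(silicon_power_w, 0.15)
gate_low, gate_high = estimate_band(silicon_power_w, 0.05)

print(f"RTL estimate band:  {rtl_low:.1f} - {rtl_high:.1f} W")
print(f"Gate estimate band: {gate_low:.1f} - {gate_high:.1f} W")
```

For a 10 W chip, the RTL-level band spans roughly 8.5 to 11.5 W, while the gate-level band narrows to roughly 9.5 to 10.5 W, illustrating why gate-level runs are preferred for sign-off-grade power numbers.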
Data center deployment
Deployment in data centers appeals to engineering teams because it doesn’t require on-site supervision: remote users can log in, and designs can be swapped in and out without manual intervention.
The practice of periodically upgrading platforms has different implications depending on the emulator architecture. Custom architectures are under the control of emulation suppliers, who can enhance them and re-spin the emulation chip to smaller process technology nodes independently of third-party vendors. Upgrading FPGA-based emulators, by contrast, depends on the availability of new commercial FPGAs.
Running at about an order of magnitude faster than emulation for the same design size, FPGA prototyping is a must-have tool for system validation and software execution. The higher execution speed is achieved by trading off DUT debug capabilities and rapid compilation, features critical for fast design iteration but not essential for prototyping applications.
FPGA prototyping is also an ideal platform for demonstrating the functionality of a new design before silicon availability or for OEM developers to verify new cores embedded in end systems.
The success of commercial FPGA prototyping providers led to industry consolidation with mergers and acquisitions. Emulation vendors today also offer FPGA prototyping platforms, either designed internally or through OEM agreements with FPGA prototyping developers.
FPGA prototypes come as either desktop or enterprise platforms. The main differences concern capacity, scalability, and multi-user usage.
Desktop FPGA prototyping platforms
As the name implies, desktop versions are single-user resources that trade off scalability and capacity to achieve the highest execution speed. Typically serving as vehicles for system validation of small- to medium-sized designs well below 1 billion gates, they are implemented on single boards with limited scalability of one to four FPGAs. Their fast speed and small dimensions make them portable and well suited to deployment as demo platforms for OEM block/core providers.
They reach execution speeds ranging between 200 and 300 MHz on a single FPGA and roughly 50 MHz on four FPGAs. To achieve high performance, routing optimization between FPGAs is necessary, a manual task that stretches DUT mapping to several weeks. DUT compilation is assisted by commercial FPGA synthesis and partitioning tools, as well as place-and-route (P&R) tools from FPGA vendors.
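One common reason the DUT clock drops when a design spans multiple FPGAs is that signals cut by the partition must be time-division multiplexed (TDM) over a limited number of physical inter-FPGA wires. The sketch below illustrates that effect; the wire counts and I/O clock are illustrative assumptions, not figures from any specific platform.

```python
# Sketch of why multi-FPGA prototypes run slower than a single FPGA:
# signals crossing chip boundaries are time-division multiplexed (TDM)
# over a limited number of physical I/O wires, and the DUT clock must
# wait for every inter-FPGA transfer to complete each cycle.
# All numbers here are illustrative assumptions.

def effective_dut_clock_mhz(io_clock_mhz: float, cut_signals: int,
                            physical_wires: int) -> float:
    """Approximate DUT clock when cut signals are TDM'd over fewer wires."""
    # Each physical wire must carry ceil(cut_signals / physical_wires)
    # time slots per DUT cycle, dividing the achievable DUT clock.
    mux_ratio = -(-cut_signals // physical_wires)  # ceiling division
    return io_clock_mhz / mux_ratio

# Four-FPGA example: assume 4,000 cut signals over a 500-wire boundary
# with a 400 MHz I/O clock -> mux ratio of 8 -> 50 MHz DUT clock.
print(effective_dut_clock_mhz(400, 4000, 500))
```

This is why the manual routing optimization mentioned above pays off: every cut signal eliminated lowers the mux ratio and raises the achievable DUT clock.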
Desktop FPGA prototypes provide limited debug capabilities, implemented by compiling selected probes into the DUT. Commercial and proprietary tools may assist the user in tracing design bugs.
While all can be deployed in ICE mode, only a few desktop platforms support virtual mode via a transaction-based interface to a software test environment running on the host server.
Enterprise FPGA prototyping platforms
Enterprise FPGA prototype platforms are shared resources that extend the benefits and applications of desktop versions to entire teams of developers and support design sizes well in excess of a billion gates.
Enterprise platforms expand the DUT capacity to billions of gates via large arrays of devices mounted on multiple boards. Unlike emulators, they scale from single FPGAs to the maximum number of devices available. Their size limits the maximum speed achievable to low double-digit megahertz when configured for maximum capacity.
To achieve higher speed than emulation, the FPGA interconnect network mixes fixed and manually configurable wires, which extends the time needed to map a DUT. To help with the compilation task, emulation vendors share the front end of their emulation compiler with their prototype compiler: once a design has been compiled for emulation, switching to prototyping becomes easier and faster.
Their large dimensions prevent deployment as demo platforms for OEM block/core providers. As with the desktop versions, hardware debug capabilities are rather basic, limited to compiling selected probes into the DUT.
All support virtual mode in addition to ICE mode. Virtual mode is appealing, as it provides a deterministic environment that can be shared with emulation.
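The appeal of virtual mode is easiest to see in code. The minimal sketch below shows the idea of transaction-based stimulus: a software testbench exchanges high-level transactions with a DUT proxy instead of toggling pins cycle by cycle, and a deterministic run lets the same checks be replayed on either engine. All class and method names here are hypothetical; real flows use vendor transactor libraries built on standards such as Accellera's SCE-MI.

```python
# Minimal conceptual sketch of transaction-based (virtual mode) stimulus:
# a software testbench sends high-level transactions to a DUT proxy rather
# than driving pins cycle by cycle. All names are hypothetical; real
# emulators use vendor transactor libraries (e.g., built on SCE-MI).

from dataclasses import dataclass

@dataclass
class EthFrame:
    dst: str
    payload: bytes

class DutProxy:
    """Stands in for the transactor bridge to the emulated/prototyped DUT."""
    def send(self, frame: EthFrame) -> EthFrame:
        # In a real flow this call would cross the co-modeling channel to
        # hardware. Here we echo a reversed payload to keep the sketch
        # self-contained and runnable.
        return EthFrame(dst=frame.dst, payload=frame.payload[::-1])

def run_test(dut: DutProxy) -> bool:
    tx = EthFrame(dst="aa:bb:cc:dd:ee:ff", payload=b"hello")
    rx = dut.send(tx)
    # Because a virtual-mode run is deterministic, a check like this
    # produces identical results on the emulator and on the prototype.
    return rx.payload == b"olleh"

print(run_test(DutProxy()))
```

The same `run_test` routine works unmodified whichever engine sits behind the proxy, which is exactly the property that lets a DUT move between emulator and prototype mid-flow.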
Looking into the future
Emulation and FPGA prototyping are complementary verification technologies. Emulation excels in hardware debug and hardware/software integration, with quick design iterations made possible by fast compilation times. It also supports performance and power analysis driven by real-world workloads.
FPGA prototyping stands out in speed of execution. Combining the two in a unified flow that harnesses the strengths of each leads to a powerful verification environment ideal for conquering the challenges posed by state-of-the-art SoC designs in AI/ML, 5G, and automotive industry segments.
The perfect integration of emulation with prototyping is supported in virtual mode by virtue of its determinism and repeatable behavior. The DUT can be verified at every stage of the flow by either tool, taking advantage of their unique features for effective and faster design verification. For instance, after thoroughly performing hardware debug and verifying bare-metal software via emulation, the DUT could be offloaded to a prototype to accelerate booting an OS, freeing the emulator to carry out power and performance analysis. If a design bug manifests itself after booting an OS on a prototype, the DUT can be moved back to the emulator to trace the bug quickly and efficiently.
Both hardware emulation and FPGA prototyping have come a long way. Today, both are mandatory verification engines in any advanced verification toolbox. A tight integration lowers the cost per verification cycle, accelerates the time to identify verification issues, optimizes the utilization of both verification platforms, and, ultimately, boosts the return on investment.