Moving to Virtual Prototyping
- April 2, 2021
- Posted by: Lauro Rizzatti
- Category: 2021
A unified virtual-prototyping environment allows verification to progress with models early, gradually building the system as different pieces emerge from development, breaking dependencies between hardware design groups and software developers.
Electronic systems are no longer generic hardware blocks assembled into a system that executes the software. That time-honored hardware design approach proceeded without considering the application software because hardware was often designed to be generic, playing host to a variety of different software.
Design has evolved to a system-on-chip (SoC) methodology that implements a full hardware system on a single chip. The system is no longer assembled by a system integrator from discrete parts; instead, a self-contained silicon chip handles the bulk of the computing work.
The evolution changed traditional verification flows as well. Because an SoC processes specific data workloads for specific applications, full system SoC verification is required now to verify the connection between the underlying hardware and the software it will execute. Unlike earlier generic processors, the processing architecture must be well-matched to the data it will process. Instruction sets, bus architectures, and the memory/cache structure need to be adjusted to maximize performance and efficiency and minimize power.
Hardware engineers now need access to software workloads early in the design cycle because everything, including the architecture, must be validated. This dependency on the software affects the project schedule, a particular difficulty in the era of “shift left” priorities that add new functional and scheduling considerations.
Typical Design Flow Kicks Off at the Architectural Level
The current design and verification flow employs different tools at different phases, starting at the architectural level. System architects optimize a system’s key functional elements early in the project, including:
- An assessment of processor architectures
- Changes to instructions to better handle common operations
- System interconnect—buses and networks-on-chip (NoCs)
- Memory types, quantities, and hierarchy and other hardware blocks to accelerate common functions
During this phase, architects use high-level simulation environments that abstract design details, focusing on data, transactions, and block interactions.
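To make the idea of abstracting design details concrete, here is a minimal sketch of a transaction-level simulation in the spirit of what architects use at this phase. The names (`Transaction`, `Interconnect`, `Memory`) and the interface are illustrative assumptions, not the API of any particular tool; the point is that blocks exchange whole transactions with no cycle-level timing.

```python
from dataclasses import dataclass
from collections import deque

# Hypothetical sketch of a transaction-level model: blocks exchange
# transactions over an abstract interconnect. All names are illustrative.

@dataclass
class Transaction:
    src: str       # originating block
    dst: str       # destination block
    payload: bytes

class Interconnect:
    """Routes transactions between registered block models."""
    def __init__(self):
        self.blocks = {}
        self.queue = deque()

    def register(self, name, block):
        self.blocks[name] = block

    def send(self, txn):
        self.queue.append(txn)

    def run(self):
        # Deliver all queued transactions. No clocks or bus protocol here;
        # the abstraction focuses on data flow and block interactions.
        while self.queue:
            txn = self.queue.popleft()
            self.blocks[txn.dst].receive(txn)

class Memory:
    """Trivial memory model that records what each source wrote."""
    def __init__(self):
        self.data = {}

    def receive(self, txn):
        self.data[txn.src] = txn.payload

bus = Interconnect()
mem = Memory()
bus.register("mem", mem)
bus.send(Transaction(src="dma0", dst="mem", payload=b"\x01\x02"))
bus.run()
print(mem.data["dma0"])  # b'\x01\x02'
```

At this level of abstraction an architect can vary the interconnect topology or memory hierarchy and re-run workloads in seconds, long before any RTL exists.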
Designers then create hardware blocks, using and configuring third-party IP as well as employing their own custom hardware and interfaces. At these early design stages, they must ensure that their circuits handle data workloads correctly and efficiently at the block level, using logic simulation with debug capabilities.
Integration, Verification, and Implementation
In the system integration phase, hardware emulation provides the scale and performance needed for a wider range of tests. Full hardware-system interactions can be assessed, along with performance and power characteristics. The interface with software begins here, starting with drivers that connect the operating system to the hardware. From there, hardware emulators can execute the full operating system as well as some level of application software.
Once most of the verification is complete, the system can be implemented in prototypes that software developers use for early evaluation. The hardware must be mature at this stage, because prototypes are optimized for higher performance, a time-consuming process that makes further prototype changes impractical.
Finally, an extensive checkout process is needed once the silicon is built to ensure that the actual SoC performs as predicted by pre-silicon verification. The goal is to find no new hardware bugs at this late stage.
The complicated and disconnected design flow relies on many different tools across different domains, including hardware design, software development, software debug and validation, hardware/software correlation, power and performance analysis, and SoC validation.
Since these domains are often supported by different tools, in some cases from different vendors, each tool requires significant effort to prepare the SoC design for verification. Verification engineers, for example, cannot reuse previous work, whether implementing functional models, creating testbenches, or executing tests in a variety of design languages.
Early software implementations are needed when software interactions are tested. While booting an operating system may seem like a generic operation, most SoCs have customized versions of common operating systems like Linux and Android. As with application software, these customizations cannot be complete until much later in the product design cycle, leaving the hardware in limbo until enough software is available to complete verification, slowing the hardware verification cycle.
A Smoother Flow with Virtual Prototyping
Something needs to change. Breaking the software dependency begins with a software-enabled verification and validation environment used throughout the full design flow. A system like this, known as virtual prototyping, includes virtualization of blocks and workloads, and a streamlined flow integrating various tools.
Virtualization of blocks can apply to both hardware and software functions whose final designs are incomplete. For example, a customized operating system may not yet be available; a more generic version can run the system through the many tests that don’t need the customizations. This reduces the test burden later in the project, leaving only a small subset of verification tasks once the full custom version is available.
The same is true for application software. A 5G SoC can leverage abstract control-plane models to verify that the data plane can be correctly configured. An SoC for high-performance computing can run representative synthetic workloads in place of the final applications to prove out the data plane.
Streamlining the flow means that every tool follows the same basic formats, rather than each tool working independently with its own preparation requirements. Some tool-specific work is naturally still required, but designers can build on completed work rather than starting from scratch at each tool transfer, focusing only on the incremental work for the specific phase of verification.
With software configuring the verification environment, models and stubs can be implemented to allow targeted verification of critical elements without waiting for full availability of unrelated blocks. Data workloads can be virtualized to enable thorough vetting of processing efficiencies. The overall design evolves as a set of interconnected blocks that originate in the architectural phase. Those blocks are gradually refined as the design progresses.
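One way to picture the stub-and-model approach is below: a minimal sketch, assuming a shared block interface, in which a stand-in block lets a dependent data path be verified before the real block exists. The class and function names (`CryptoEngineStub`, `run_datapath`) and the placeholder transform are illustrative assumptions, not any framework’s API.

```python
# Hypothetical sketch: swapping a stub for an unavailable block so that
# the surrounding data path can be verified early. Names are illustrative.

class CryptoEngineStub:
    """Stand-in for the real crypto block: passes data through unmodified
    so the rest of the data path can be exercised before the block exists."""
    def process(self, data: bytes) -> bytes:
        return data

class RealCryptoEngine:
    """Arrives later in the project; presents the same interface as the stub."""
    def process(self, data: bytes) -> bytes:
        # Placeholder transform standing in for the real block's behavior.
        return bytes(b ^ 0x5A for b in data)

def run_datapath(engine, workload: bytes) -> bytes:
    # The data path under test is agnostic to which engine it receives.
    return engine.process(workload)

# Early in the project: verify the data path end to end with the stub.
assert run_datapath(CryptoEngineStub(), b"abc") == b"abc"

# Later, the real block drops in behind the same interface, and only the
# block-specific behavior remains to be verified.
print(run_datapath(RealCryptoEngine(), b"abc"))
```

Because both blocks honor the same interface, no testbench rework is needed when the real block replaces the stub; only the incremental, block-specific checks are new.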
The common nature of the environment means that different blocks or system embodiments can move back and forth between design groups and tools without extensive rework.
Software driver development is the use case normally associated with hardware emulation, though engineering groups typically need a full software stack and stable register-transfer-level (RTL) code to be successful. While these two pieces enable testing of the integration of drivers with Linux and Android, getting to stable RTL code is challenging.
The way forward is with a unified software-enabled verification and validation environment that accommodates frequent RTL updates (see figure). Development begins sooner, and IP is validated earlier in the context of the entire real-world software workload environment. Early driver validation builds confidence that drivers are properly programmed, allowing pre-silicon validation of the entire SoC to move forward quickly.
A unified virtual prototyping environment is a way for verification to progress with models early, gradually building the system as the different pieces emerge from development, breaking dependencies between hardware design groups and software developers. Verification begins early and final pre-silicon tests involve last-minute refinements and full system validation, thus speeding tapeout, reducing re-spin risk, and simplifying post-silicon verification. It accelerates time to market, provides more thorough checkout, and reduces risks and cost.