If chip design had a face, it would have a wrinkle or two, an especially deep one caused by the increasingly complex challenge of hardware and software verification.
Until recently, these two elements of a system design were handled separately and at different times, with hardware design often beginning well ahead of software development. Generic hardware blocks were assembled into a system by a system integrator without regard to the application software that would ultimately run on it. The generic hardware might host any number of possible types of software, but it was optimized for none of them. This arrangement challenged project teams, because hardware verification and validation must prove that the intended software works on the hardware, achieving adequate performance within a power-consumption budget.
Today, design has evolved into the system-on-chip (SoC), a self-contained customized silicon chip that handles the bulk of the computing work. Likewise, SoC verification has refocused on the interaction between the underlying hardware and the software running on it. Accordingly, verification and validation have morphed into a software-enabled methodology based on workload analysis and applied throughout the design flow, from early hardware verification through software integration all the way to system validation.
The key to smoothing the wrinkle caused by the software dependency is virtualization of design blocks and software workloads, as well as a streamlined flow integrating various tools. Virtualization of blocks including both hardware and software functions should be done prior to completion of final designs.
For example, if a customized operating system is not yet available, a more generic version can be used to run the bulk of the test suite against the system. This alleviates the test burden later in the project: only a small subset of verifications must be repeated once the fully customized operating system is available. The same holds true for application software. An SoC intended for high-performance computing can be validated with realistic workloads to prove out the data plane. Streamlining the flow ensures that each tool follows the same basic formats. Some tool-specific work is still required, but rather than starting from scratch at each tool transfer, teams can build on work already done, focusing only on the incremental effort related to each phase of verification.
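The generic-OS-first strategy can be pictured as a test-selection problem. In this minimal sketch (the test names, the `os_dependent` tag, and the phase logic are all invented for illustration, not part of any specific tool flow), the bulk of the regression suite runs early on the generic operating system, leaving only a small OS-dependent subset for when the custom OS arrives:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TestCase:
    name: str
    os_dependent: bool  # True if the test exercises OS customizations

# Hypothetical regression suite: most tests do not depend on OS customizations.
SUITE = [
    TestCase("ddr_stress", False),
    TestCase("pcie_enumeration", False),
    TestCase("interrupt_latency", False),
    TestCase("custom_scheduler_hooks", True),
    TestCase("vendor_power_states", True),
]

def select_tests(suite, custom_os_available):
    """Return the tests runnable in the current project phase.

    Early on (generic OS only), run everything that does not depend on OS
    customizations; once the custom OS is available, only the small
    OS-dependent subset remains to be verified.
    """
    if custom_os_available:
        return [t for t in suite if t.os_dependent]
    return [t for t in suite if not t.os_dependent]

early_phase = select_tests(SUITE, custom_os_available=False)
late_phase = select_tests(SUITE, custom_os_available=True)
print(len(early_phase), len(late_phase))  # bulk of the suite early, small subset late
```

The design choice is simply that the dependency on the custom OS is recorded per test, so the project schedule, not the test content, decides when each test runs.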
Using software to configure the verification environment makes it possible to substitute models and stubs, so critical elements can be verified in a targeted way before unrelated blocks are fully available. Data workloads can be virtualized for thorough vetting of processing efficiencies. The design evolves as a set of interconnected blocks that originate in the architectural phase. Those blocks are gradually refined as the design progresses, and the common nature of the environment facilitates moving different blocks back and forth between groups and tools without extensive rework.
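One way to picture software-configured stubbing, as a toy sketch only (the block names, the `process` interface, and the pass-through stub behavior are all assumptions made for illustration): a configuration maps each block to either its refined model or a lightweight stub, so the block under targeted verification gets real behavior while its unavailable neighbors are stand-ins.

```python
class CryptoModel:
    """Refined model of the block under targeted verification (toy behavior)."""
    def process(self, data):
        return bytes(b ^ 0xFF for b in data)  # stand-in for real transform logic

class PassThroughStub:
    """Lightweight stand-in for a block that is not yet available.

    It passes data through unchanged, which is enough to exercise the
    interconnect and the block under test.
    """
    def process(self, data):
        return data

def build_environment(config):
    """Assemble the verification environment from a software configuration.

    config maps block name -> 'model' or 'stub'. Blocks requested as models
    but lacking a refined model fall back to stubs.
    """
    models = {"crypto": CryptoModel}  # only the critical block is modeled so far
    env = {}
    for name, kind in config.items():
        if kind == "model" and name in models:
            env[name] = models[name]()
        else:
            env[name] = PassThroughStub()
    return env

# Target the crypto block; stub the blocks that are not yet available.
env = build_environment({"crypto": "model", "dma": "stub", "display": "stub"})

payload = b"\x00\x0f"
out = env["dma"].process(env["crypto"].process(payload))
```

As the real blocks emerge from development, each stub entry in the configuration flips to a model with no change to the tests themselves, which is the rework-saving property the common environment provides.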
Hardware-assisted verification is a perfect fit for a software-enabled verification and validation methodology. Development begins sooner, and validation of hardware in the context of the entire real-world software workload environment can progress with models early on, gradually building up the system as the different pieces emerge from development. While verification can begin immediately, final pre-silicon tests focus on last-minute refinements and full-system validation, accelerating the path to tape-out, increasing design quality, and reducing re-spin risk and cost. It also simplifies post-silicon verification.
A unified software-enabled verification and validation environment breaks the dependencies between hardware design groups and software developers. Such a methodology is required for the increasingly complex SoCs of the future. While chip design didn’t get a facelift, the deep wrinkles have been smoothed out.