By: Lauro Rizzatti, Verification Expert, www.rizzatti.com
April 17, 2020
A 5G system-on-chip (SoC) design is a multifaceted system with complex communications, computing and memory requirements, and many blocks that must work together seamlessly for it to effectively perform its duties. Verifying such an intricate design is not just a matter of running a long list of tests. It must push boundaries and stress systems to determine where they break. Tools must run through a variety of realistic verification tests to ensure that, once deployed, the chip can handle the many tasks it must perform.
Some of the challenges facing 5G SoC designers include:
- Reassembled and reinvented earlier technologies
- New applications with more standards and use cases
- Virtualized technology
- Massive data volumes due to new, ubiquitous applications and new technologies such as massive MIMO (mMIMO) and beamforming
- Multiple architectures depending upon the balance of cost and performance
- Demanding performance requirements: tighter latency, higher bandwidth and higher frequencies
The number of possible combinations of technologies and equipment configurations makes it impractical to build prototypes and then test out their capabilities and resilience.
As a result, verification must be done pre-silicon, and hardware emulation is the only verification tool capable of testing the full range of performance of 5G components pre-silicon. Simulation runs too slowly to yield useful data with such a diverse and exacting range of tests, while emulation performs on the order of 1,000 times faster than simulation. That enables emulation to test real-world scenarios involving both hardware and software.
Cellular Technology Reinvented
Significant changes to the 4G architecture are part of the 5G technology, including the radio access network (RAN) reimagined as Cloud RAN (sometimes called Centralized RAN) or C-RAN. The backhaul is split into two with a centralized unit (CU) handling baseband processing before sending signals to the distributed units (DUs) in base stations. The connection between the CU and DUs is now called fronthaul. Technically, backhaul now occurs between the CU and the core.
Higher-level networking layers are virtualized on standard computing equipment for greater flexibility and agility. This network-function virtualization (NFV) and software-defined networking (SDN) implementation makes it possible to configure and reconfigure networks as needed to make optimal use of available radio antennas and other resources.
These new configurations add flexibility and lower operating costs, while creating more configurations than would have been possible with 4G and previous wireless technologies.
Those configurations must be verified.
Smartphones have both voice and data capabilities. With 5G, the amount of data that phones can handle will be greater, increased by technologies like mMIMO and beamforming. Rather than discarding signals formerly considered inter-cell interference, those multiple signals carrying more data are leveraged for the directionality and greater sensitivity they provide.
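The directionality that beamforming provides comes from applying per-antenna phase weights so that signals from a chosen direction add coherently. The following is a minimal Python sketch of that idea for a uniform linear array; the function names and the half-wavelength spacing are illustrative assumptions, not part of any specific 5G design.

```python
import cmath
import math

def steering_weights(n_elems: int, d_over_lambda: float, theta_deg: float):
    """Conjugate steering-vector weights for a uniform linear array.

    Each element's phase shift compensates the path-length difference,
    so signals arriving from angle theta add coherently (illustrative
    sketch only -- real 5G beamforming is far more elaborate).
    """
    theta = math.radians(theta_deg)
    phase = 2 * math.pi * d_over_lambda * math.sin(theta)
    return [cmath.exp(-1j * phase * n) for n in range(n_elems)]

def array_gain(weights, theta_deg: float, d_over_lambda: float = 0.5):
    """Magnitude of the weighted array response toward theta_deg."""
    theta = math.radians(theta_deg)
    phase = 2 * math.pi * d_over_lambda * math.sin(theta)
    resp = sum(w * cmath.exp(1j * phase * n) for n, w in enumerate(weights))
    return abs(resp)

# An 8-element array steered at 30 degrees: the 8 signals sum
# coherently on-target (gain of 8), while off-axis arrivals
# partially cancel -- the basis of spatial selectivity.
w = steering_weights(8, 0.5, 30.0)
print(round(array_gain(w, 30.0), 2))   # 8.0 on the steered direction
```

The off-axis response (for example toward -10 degrees) falls well below the coherent gain, which is why signals once treated as interference can instead be separated spatially.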
Beyond more traditional voice and data, 5G takes on more responsibility than prior generations. First, it will be a channel for internet-of-things (IoT) devices transporting data to and from the cloud. While commercial IoT devices typically run low-bandwidth applications, they will still contribute to traffic levels. Industrial IoT devices could generate larger amounts of data.
5G will carry automotive vehicle-to-everything (V2X) traffic as cars are expected to transmit huge volumes of data as they drive. Autonomous vehicles will communicate with everything around them to keep them safe, and all vehicles –– driven or driverless –– will send volumes of operational data to the cloud to track performance and make improvements.
Yes, 5G will produce volumes of data, which adds to the number of use cases that must be covered when proving out new equipment. Stress-testing with real prototypes would require building hundreds of user-equipment units and tens of base stations just to do anything remotely useful. With multiple companies building multiple versions of equipment, prototyping real-world systems for multiple use cases means an entire cycle of building silicon, building systems to see how they work, then revising the silicon and building commercial products. That means an extensive project cycle and a large budget.
Testing systems before committing to silicon provides better coverage without a build-test-rebuild cycle and at least one less mask turn for each IC in the entire 5G infrastructure.
Thorough Testing a Must
Running a long list of tests is only part of the 5G chip design verification process. It also means pushing boundaries and stressing systems to see where they break. Each system will contain one or more SoCs, and each must be run through a variety of realistic verification tests to ensure that, once deployed, it will perform as needed.
Essential tests are:
- Power, including peak, average and minimum, and compliance with a comprehensive power intent specification
- Minimum and maximum latency
- Identifying critical paths if performance changes are needed
- Points of failure when stressed up to and beyond expected limits
- Code coverage for both high-level synthesis (HLS) and register-transfer-level (RTL) code, covering the hardware and software that will make their way into the overall systems
- Fault coverage metrics that determine whether all testing has been comprehensive enough
- Inclusion of a testing infrastructure, such as design-for-test (DFT)
- Physical verification known as design-for-manufacturing (DFM) to identify and modify any yield-limiting hot spots on the chip
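The latency tests above boil down to tracking when each transaction is issued and when it completes, then reporting minimum, maximum and average figures. Here is a minimal, hypothetical Python sketch of such a testbench monitor; the class and method names are illustrative, not from any particular verification tool.

```python
class LatencyMonitor:
    """Testbench-style monitor that records per-transaction latency
    (in clock cycles) and reports min/max/average -- the kind of
    statistic a latency test plan must collect.  Illustrative only."""

    def __init__(self):
        self._start = {}      # transaction id -> issue cycle
        self.latencies = []   # completed-transaction latencies

    def issue(self, txn_id, cycle):
        self._start[txn_id] = cycle

    def complete(self, txn_id, cycle):
        # Latency is completion cycle minus issue cycle.
        self.latencies.append(cycle - self._start.pop(txn_id))

    def report(self):
        return {
            "min": min(self.latencies),
            "max": max(self.latencies),
            "avg": sum(self.latencies) / len(self.latencies),
        }

mon = LatencyMonitor()
mon.issue("t0", cycle=100); mon.complete("t0", cycle=112)
mon.issue("t1", cycle=105); mon.complete("t1", cycle=140)
print(mon.report())   # {'min': 12, 'max': 35, 'avg': 23.5}
```

In an emulation flow, the same bookkeeping would be driven by cycle counts from the emulator rather than by hand-fed numbers.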
While some tests can be run on live systems, others, such as critical-path identification, DFT and DFM, can only be performed on a system that has visibility into the design itself.
AI in the Mix
Adding to the mix of 5G equipment is artificial intelligence (AI) and machine learning (ML). Architects view ML as a useful tool for numerous sophisticated uses to optimize the 5G infrastructure in real time. That includes automatic channel estimation for over-the-air (OTA) transmission; self-organizing networks (SON); automated multiple-access handover; and coordinated multi-point (CoMP) technology for improved MIMO and diversity.
Systems will operate with trained neural-network models subject to updates. The number of options available for processing neural nets implies that a selected option must be thoroughly vetted, and alternatives tested before the final option is decided.
Model training is another consideration. The model itself will depend on specific examples used for training and the order in which that training occurs. As more training examples become available, models can be further refined. While updates can improve models in equipment that’s already deployed, initial models must be rock solid. That means testing them against an enormous number of examples to ensure that they behave as expected under a wide range of circumstances.
5G Verification Is Impossible without Emulation
A 5G SoC is a complex system (see example baseband unit below). Communications, computing and memory requirements are substantial, and many blocks must work together seamlessly for the SoC to effectively perform its duties.
Given that the full range of performance of 5G components must be tested pre-silicon, the only option is hardware emulation. As noted previously, simulation runs too slowly to yield useful data with such a diverse and exacting range of tests. Emulation performs on the order of 1,000 times faster than simulation, making it possible to test real-world scenarios with both hardware and software.
Using real equipment to generate data streams for verification might seem like the correct approach, but this has three main weaknesses. First, the data rate coming from the cable doesn't naturally match the testing rate within the emulator, necessitating the use of rate adapters. Second, these cables and rate adapters must be connected manually, making a centralized data-center emulation model impossible.
Finally, and most important, that traffic input is not predictable or repeatable. If an error occurs, it’s difficult to go back and replay the event when trying to debug the problem. It’s better to have a data source that’s deterministic, reproducible and scalable –– meaning that connections can be made to scalable units in a data center without manual intervention. These are key features of an emulation system when accompanied by virtualized protocol test modules, such as the VirtuaLAB units (from Mentor, a Siemens Business) and other comprehensive pieces of verification intellectual property (VIP). They can be instantiated within or connected to an emulator. Each protocol used to connect 5G system components can be modeled and virtualized to drive and receive data from the design-under-test (DUT) and can be instantiated remotely in a data center.
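Determinism and reproducibility ultimately come down to driving the design from seeded pseudo-random stimulus rather than live traffic: log the seed, and any failing run can be replayed exactly. The sketch below illustrates the principle in Python; the packet fields and function name are hypothetical, not part of any VirtuaLAB or VIP interface.

```python
import random

def traffic_stream(seed: int, n_packets: int):
    """Deterministic, reproducible pseudo-random packet stream.

    Because the generator is seeded, re-running with the same seed
    replays exactly the same stimulus, so a failure seen once can be
    reproduced during debug.  Field names are illustrative.
    """
    rng = random.Random(seed)
    for _ in range(n_packets):
        yield {
            "length": rng.randint(64, 1500),   # payload size in bytes
            "gap": rng.randint(0, 32),         # idle cycles before send
            "crc_seed": rng.getrandbits(32),   # payload-pattern seed
        }

# Same seed -> identical streams: logging the seed is enough
# to replay an entire run, which live traffic cannot offer.
a = list(traffic_stream(2020, 5))
b = list(traffic_stream(2020, 5))
print(a == b)   # True
```

Scalability follows from the same property: because the stimulus is pure software, any number of data-center emulation jobs can regenerate it without manual cabling.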
Emulation is also more accurate because all pieces of the test system are clock-aligned. If a failure occurs, it can be tracked to the cycle it occurred on and correlated with the exact input data and system state in place when the error happened. This takes guesswork out of the debugging process, streamlining efforts to correct design flaws and deliver a fully verified design for mask generation.
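Clock alignment makes that correlation possible because every sample of inputs and state carries the cycle number it belongs to. A minimal Python sketch of such a rolling, cycle-stamped trace follows; the class name, fields and window depth are all illustrative assumptions about how one might model this bookkeeping, not an emulator API.

```python
from collections import deque

class CycleTrace:
    """Rolling trace of (cycle, inputs, state) samples so a failure
    can be correlated with the exact stimulus and design state on
    the cycle it occurred -- a sketch of the clock-aligned
    visibility the text describes.  Illustrative only."""

    def __init__(self, depth=1000):
        # Bounded window: old samples fall off automatically.
        self.buf = deque(maxlen=depth)

    def sample(self, cycle, inputs, state):
        self.buf.append((cycle, inputs, state))

    def on_failure(self, cycle):
        """Return the retained trace window up to the failing cycle."""
        return [s for s in self.buf if s[0] <= cycle]

trace = CycleTrace(depth=4)
for c in range(1, 7):
    trace.sample(c, inputs={"data": c * 2}, state={"fifo_level": c % 3})
print(len(trace.on_failure(6)))   # 4 samples retained in the window
```

With every sample cycle-stamped, a reported failure at cycle N maps directly to the input data and state captured at N, which is the guesswork removal the paragraph above refers to.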
Huge portions of the test plan and results from pre-silicon verification can be used directly for post-silicon verification, eliminating an enormous amount of test generation work and making debugging easier if issues arise in the silicon check-out. By replacing the virtual DUT with a physical silicon chip, all the infrastructure brought to the verification process can be re-used to confirm that the silicon works as expected. That includes the testing of the wide range of configurations and use cases needed for pre-silicon verification.
Delivery of 5G wireless communications technology is welcome news for consumers worldwide who want improved performance and reliability from their smartphones. Adding hardware emulation to the chip design verification flow will ensure that the 5G SoC design's full range of performance is tested.
About Lauro Rizzatti
Dr. Lauro Rizzatti is a verification consultant and industry expert on hardware emulation. Previously, Dr. Rizzatti held positions in management, product marketing, technical marketing and engineering.