Enthusiastic Hardware Emulation Audience at DVCon India

Source: EEWeb

Lauro Rizzatti (far right) moderated a DVCon India session with panelists (from left to right) Sundararajan Haran, Ashok Natarajan, Ravindra Babu and Hanns Windele (Source: Lauro Rizzatti)

Attendees at DVCon India in September were an enthusiastic and engaged group of verification engineers, as they were last year and the year before that. Once again, I had the opportunity to moderate a program panel. Titled “Hardware Emulation’s Starring Role in SoC Verification,” it featured animated panelists speaking to an attentive audience. The title makes a big claim, and it challenged the panelists to determine whether it holds true. All weighed in with insightful comments.

The panelists were: Ravindra Babu, design engineering director at Cypress India; Hanns Windele, vice president of Europe and India for Mentor, a Siemens Business; Sundararajan Haran, senior engineering manager, SoC Products Group, at Microsemi’s Hyderabad Development Centre; and Ashok Natarajan, director of engineering at Qualcomm.

We started with an introductory question: each panelist described his design and verification environment and the role emulation plays in it. Mentor’s Windele, noting that he represented an emulation vendor, began the discussion:

Windele: “After a drop in usage over a decade ago, emulation is once again a mainstream verification tool. This is happening for three main reasons. First, emulation has come a long way and no longer requires a dedicated expert to use it.”

“The second is the move of emulators from a single resource in a lab to a data center resource. Traditionally, emulation was used in in-circuit-emulation (ICE) mode, where cables and all sorts of things connected an emulator to a physical target system in a lab. The virtualization of the target system via transactional interfaces allowed the deployment of emulation in a data center, where virtual peripherals are connected to the emulator via transactors.”

“Basically, a data center deployment enables a community of engineers to use the emulator far more than any single user could with an emulator in a lab.”

“And, last but not least, the emergence of emulation apps. An app is an intelligent way to interact with an emulator and perform tasks other than just register transfer level (RTL) verification.”
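To help picture the virtualization Windele describes, here is a minimal, purely illustrative C++ sketch. Every name in it is hypothetical, not any vendor’s API; it only captures the idea that a host-side transactor flattens a high-level transaction into the raw stream driven into the emulated design, replacing the lab cables of ICE mode.

    #include <cstdint>
    #include <iostream>
    #include <vector>

    // Hypothetical sketch only: a host-side "virtual peripheral" with its
    // transactor. In ICE mode this role is played by physical equipment on
    // a cable; in a data center deployment it becomes software exchanging
    // transactions with the emulator.
    struct EthernetFrame {                  // a high-level transaction
        std::vector<uint8_t> payload;
    };

    class VirtualEthernetPort {             // stands in for lab equipment
    public:
        // The transactor: flatten the transaction into the raw bytes that
        // an emulator-side bus-functional model would drive cycle by cycle.
        std::vector<uint8_t> to_wire(const EthernetFrame& f) const {
            std::vector<uint8_t> wire = {0x55, 0x55, 0xD5}; // preamble + SFD
            wire.insert(wire.end(), f.payload.begin(), f.payload.end());
            return wire;
        }
    };

    // Placeholder for a vendor's channel into the emulator (illustrative).
    void send_to_emulator(const std::vector<uint8_t>& bytes) {
        std::cout << "sending " << bytes.size() << " bytes to emulator\n";
    }

    int main() {
        VirtualEthernetPort port;
        EthernetFrame frame{{0xDE, 0xAD, 0xBE, 0xEF}};
        send_to_emulator(port.to_wire(frame)); // no cables, no lab
    }

The point is not the code itself but the topology: once the peripheral is software, the emulator can sit anywhere in the data center.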

Babu: “I want to talk about two experiences. In my previous company, I was involved in the design of a massive chip of about 15 to 16 million gates, and emulation was an integral part of our entire verification process. We used to start with simulation for basic verification at the block/IP level. Then, as we moved to the system level, we switched to emulation because it was impossible to get the required speed to simulate all possible conditions that the chip would be exposed to in real life. Most of the chip functionality, at least 85%, was verified on an emulator.”

“We could perform protocol conformance checking and software development on the emulation platform. From emulation, we moved to FPGA prototyping and validated the design at real speed. All three steps were well integrated.”

“In my current company, we used to perform verification in the traditional manner, starting with simulation and moving to FPGA prototyping. Lately, we have actively adopted emulation as well. Wherever there is a complex IP subsystem, like a multicore CPU module, the number of combinations that must be verified is extremely high, and emulation is ideal for such tasks. We use emulation primarily for testbench acceleration and find it useful to cover a lot of use cases.”

Natarajan: “My team in Qualcomm is responsible for SoC-level verification of all designs created in Bangalore, encompassing several different chips, from mobile chipsets and modem chips to compute and IoT chips. In terms of technology, we are pushing the envelope; today, we are at 10 and 7 nanometers. In terms of complexity, it’s multifold. Some of these chips include more than 100 IP blocks, among them high-performance cores, high-speed peripherals, low-speed peripherals and multiple specialized cores, such as DSP cores. In terms of scheduling, many of these SoCs must be verified in eight weeks.”

“We use emulation primarily for two purposes. One is software development, starting a month or two after the design hardware is verified. The second is performance closure, power-performance correlation and concurrency verification. We use the emulation platform for signoff.”

Haran: “My team is an SoC business unit responsible for verification of both SoC and non-SoC based FPGAs. The complexity of our designs is constantly growing, and the IP we put into FPGAs is growing as well. On top of that, adding processor-based SoCs expands the number of protocols, further increasing design complexity.”

“For thorough verification, we must run system-level scenarios. Emulation is ideal for processing real-life scenarios at the system level. We use emulation in testbench acceleration and put the testbench’s backend, including the transactors, together with the design in the emulator and leave the testbench’s frontend in the host server.”
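Haran’s description of testbench acceleration maps to a simple partition, sketched below in hypothetical C++ (no real emulation API is implied): the untimed frontend stays on the host and exchanges transactions with the backend, for which a queue stands in here. In a real flow, the backend is the synthesizable transactors and design running inside the emulator.

    #include <cstdint>
    #include <iostream>
    #include <queue>

    // Hypothetical sketch of the testbench-acceleration partition: the
    // untimed frontend (stimulus and checking) runs on the host; the
    // cycle-accurate backend (transactors plus design) runs in the box.
    struct BusTransaction { uint32_t addr; uint32_t data; };

    std::queue<BusTransaction> to_emulator;    // host -> emulator channel
    std::queue<BusTransaction> from_emulator;  // emulator -> host channel

    // Host-side frontend: generates stimulus at the transaction level.
    void frontend_send(uint32_t addr, uint32_t data) {
        to_emulator.push({addr, data});
    }

    // Stand-in for the emulated backend: in a real flow, a synthesizable
    // transactor expands each transaction into many clocked bus cycles.
    void emulated_backend_step() {
        while (!to_emulator.empty()) {
            BusTransaction t = to_emulator.front();
            to_emulator.pop();
            from_emulator.push({t.addr, t.data ^ 0x1}); // fake DUT response
        }
    }

    int main() {
        frontend_send(0x1000, 42);
        emulated_backend_step();
        // Host-side checking, still at the transaction level.
        std::cout << "response data: " << from_emulator.front().data << "\n";
    }

The key design choice is that only transactions cross the host-to-box boundary, so the clocked side is free to run at emulation speed.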

Each panelist had been exposed to emulation either directly or indirectly and was asked to describe its benefits and drawbacks.

Windele: “Hardware/software co-verification is becoming an important task for emulation because only a trivial amount of software can be executed on a software simulator. Any application that requires executing a meaningful amount of software will have to use emulation. Of course, this was true in the past as well, but emulation had a big hurdle to overcome because it was not easy to use. It is far easier today.”

“Design analysis in emulation has become much more useful. For example, when you perform hardware/software co-verification with an emulator, you can point to a specific part of your software code and measure how much power that specific part consumes. You can do all sorts of tracing, with total visibility and accessibility, to find and trace bugs in your design, an otherwise non-trivial task. For tracing software bugs with an emulator, accessibility is not only important, it is mandatory.”

“Although speed is meaningful, accessibility and total visibility for tracing a design bug are more valuable. What matters is the total turnaround time for design debugging: analyzing the design, finding the bug and fixing it in the shortest amount of time.”

Babu: “From my perspective, there are two important benefits with an emulator. First, the sheer number of combinations of tests you can try out on a complex design as compared to what you can do in a simulator is of great value. That is the biggest benefit of an emulator when we are designing something like a complex CPU subsystem with multiple cores accessing the same set of common resources.”

“Emulation could completely and thoroughly verify a huge networking chip supporting 10 to 12 protocols and a large number of packet formats. We could run all the protocol conformance tests and then 80% of the final software could be verified on the emulator. Without an emulator, I don’t think we could have achieved this.”

“Regarding drawbacks, if your chip has hard IP, you will run into a roadblock unless you plan well in advance how to tackle it in an emulator.”

Natarajan: “I would like to highlight some of the benefits of emulation in our test cases. Some of the performance test cases we run need the entire SoC build, and running them in a simulation environment is impossible; it takes days just for basic testing to go through. To get any meaningful results in a short span of time, emulation is the only platform we can rely on, mainly for the concurrency scenarios we run as part of benchmarking mobile chips.”

“Emulation has a unique advantage in that it can run with real physical targets. In our verification environment we use verification IP, and even though the models are accurate, say 99% accurate, running the emulated design against the real physical target with real software eliminates any critical bug the models miss.”

“In terms of drawbacks, it takes some time for the first build to come up. Getting something working on an FPGA emulation platform takes a lot of effort, especially for a big chip.”

“The second thing is the overall debug approach. How do you debug a design that fails on an emulation platform after a one-day run, when you are limited in how much waveform dumping you can perform? You have to plan ahead for how you are going to debug your design if there is a failure on the emulation platform.”

Haran: “In terms of the benefits, I have seen three things. First, our complete processor compliance suite of tests, exercising all combinations, takes less than a day. By our estimates, simulation would take almost a month.”

“We also do FPGA prototyping. Unlike FPGA prototyping, where the focus is on software drivers and the like, in TBX (testbench acceleration) we port the real design into the emulator, and that gives good visibility and controllability for debugging the hardware.”

“We also have verification transaction libraries (VTL), the equivalents of our simulation verification libraries. For example, for a DDR protocol we have equivalent simulation verification and transaction libraries, so we can seamlessly move our simulation setup into the emulation testbench-acceleration mode.”

“In terms of drawbacks, our FPGAs currently contain analog-based and schematic-based models. In our simulation setup we model them behaviorally, but for emulation we now plan to make them synthesizable, and the emulator has its own requirements for synthesizing them. Lack of time and understanding is affecting our effectiveness; once we get a handle on it, I believe we will see the benefits.”

We moved on to design verification scenarios in which emulation solved a design problem, or found a bug and enabled a fix, that no other verification approach or tool could.

Babu: “By using an emulator, we were able to thoroughly verify a big chip design of about 15-16 million gates with combined sets of protocols and made sure the design was 100% correct. This would have been impossible to achieve with a simulator or FPGAs, because at that time no FPGA was large enough to hold such a design, and mapping the design across multiple FPGAs would have been a nightmare. The only solution was to use an emulator.”

Windele: “A major semiconductor company was designing an SoC with Arm cores and an AXI bus interconnecting peripherals such as SMS and GPS. Those peripherals had separate power domains, controlled by software to minimize the power consumption of the entire SoC. When the engineering sample came back from the foundry, power consumption was sporadically different and higher than expected. Software engineers were asked to look at their code to determine whether they had done something wrong. Hardware engineers were asked to check whether the power wasn’t turning off when it was supposed to shut off. Neither team could find anything wrong.”

“It was emulation that finally got them the answer. Ultimately, the problem was traced to an incorrect implementation of the AXI specification. Without emulation, the team would have had to live with sporadic excess power consumption.”

Natarajan: “About four years back, we came up with a new architecture targeting value-tier mobile chipsets to reduce area and save cost. To accommodate this idea, much of the IP was redesigned and the performance goals were revised. Before this new architecture, we would take a previous chip as a reference and do TLM modeling to project the expected performance of the new chip. In this case, since the architecture changed, the overall SoC infrastructure changed, including latency and bandwidth expectations.”

“We didn’t have a reference platform against which to carry out performance benchmarking and could not confidently sign off the design. That’s when we moved to emulation. We were able to quickly run the performance benchmarks used on previous platforms and correlate them with our TLM models to ensure that we met the chip’s performance targets for all standard benchmarks. I don’t think we could have solved this relying purely on simulation.”

“In my previous company, a post-silicon bug reported by a customer on a networking chip would show up only after millions of packets were processed, with no way for us to narrow it down on a simulation platform. We ran it on an emulation platform, processed many thousands of packets, narrowed it down to the point where we understood the issue, then took it back to simulation and recreated it with a smaller scenario.”

Haran: “I would like to share two case studies we ran at Microsemi. One has to do with an FPGA fabric, not the switching fabric but the programmable fabric that maps user designs.”

“In simulation, we could not completely validate the whole FPGA cluster because it was too big for simulation. When we moved to emulation, we could map the complete synthesized equivalent model of the fabric onto the emulation platform and send data in, receive it back, store it, and even verify whole clusters. This was not possible at the simulation level.”

“We also perform processor benchmark testing, such as CoreMark and Dhrystone, and cannot use simulation for such testing. Instead, we use emulation. We are able to port the complete processor into the emulator and run the full CoreMark and Dhrystone suites.”

Since everyone in business is asked at some point in their career for a recommendation, our panelists were asked to offer recommendations on emulation to users and potential users.

Babu: “Basically, what you should avoid is making emulation an afterthought. You cannot start with a pure simulation approach and, when your simulator runs out of steam, move to an emulator. What happens is that you push the same simulation environment into the emulator and don’t get the expected speed. Instead, you have to re-architect your verification infrastructure to make it suitable for an emulator, a huge amount of extra effort and time.”

“Emulation should be part of your flow from the beginning. You have to make sure that your testbench is architected so that it is reusable across simulation and emulation and is efficient for emulation. In emulation, a portion of the test environment executes on the host server and the remaining portion is mapped inside the emulator. Everything that is synchronous to the basic clock must go into the box, and only the transactional layer of the verification infrastructure should be kept on the host server to achieve a good level of acceleration.”

Windele: “Use emulation where emulation is helpful, but don’t use it where it’s not valuable. I would recommend using emulation beyond hardware verification and hardware/software co-verification. You can use emulation for power estimation and for other applications.”

“Emulation for big SoCs is almost mandatory, but I would not say simulation has no room there. For small pieces of IP, simulation is easier to set up and use, and perfectly okay. To get the best use out of emulation, you must plan in advance how to deploy it.”

Natarajan: “In terms of use models, emulation is not going to completely replace simulation. Simulation is still one of the primary tools for verification. Emulation is gradually catching up with features available in simulation, but it still has some limitations.”

“Because of the limitations, emulation has to be used only in areas where simulation is not effective. It’s important to identify what areas you want to target in emulation.”

“In my opinion, I wouldn’t suggest using emulation for block-level verification. This can be done effectively in simulation with a constrained-random, functional-coverage-based environment, where you can close all your coverage goals. Trying to move all of that into emulation to close verification may not be the most effective approach.”

“Don’t start emulation until your design is somewhat stable. The first build is quite an effort, and if you have to migrate repeatedly and keep redoing these builds, it’s going to be unproductive. Wait until the design is stable on your simulation platform, where you have run a good number of tests, and then move into emulation to run your complete regression suite, carry out performance testing or do hardware/software co-verification.”

“Plan ahead: decide where you want to use emulation and identify the areas where simulation is not effective for your chip.”

Haran: “Emulation should be planned ahead when you’re working on the overall verification strategy. We are doing that now for our current SoC FPGAs. Early on, we only had simulation and FPGA verification planning. Now, we have an emulation track and we are planning ahead. For example, we list what is required, what synthesizable models we need to implement, what needs to be synthesized inside the emulator and what should remain inside the simulator.”

“Emulation cannot completely replace simulation. We are still verifying IP at the simulation level. To support a seamless transition from simulation to TBX, we need to create a dual testbench approach. We need VIP simulation libraries and equivalent verification transaction libraries. The verification transaction libraries can be mapped inside an emulator to get more controllability and performance.”
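Haran’s dual-testbench approach comes down to one transaction-level interface with two interchangeable bindings. The hypothetical C++ sketch below (no real VIP or VTL API is implied) shows the shape: the test calls a single API, and only the binding decides whether a simulation VIP model or an emulator-side VTL proxy serves the call.

    #include <cstdint>
    #include <iostream>
    #include <memory>

    // Hypothetical sketch of a dual testbench: one transaction API, two
    // backing libraries, so the same test runs in simulation or in TBX.
    class DdrDriver {                          // common transaction API
    public:
        virtual ~DdrDriver() = default;
        virtual void write(uint64_t addr, uint64_t data) = 0;
    };

    class SimVipDriver : public DdrDriver {    // behavioral VIP, simulator
    public:
        void write(uint64_t addr, uint64_t data) override {
            std::cout << "VIP: simulated DDR write of " << data
                      << " at " << addr << "\n";
        }
    };

    class EmuVtlDriver : public DdrDriver {    // proxy to synthesizable VTL
    public:
        void write(uint64_t addr, uint64_t data) override {
            std::cout << "VTL: transaction of " << data
                      << " at " << addr << " sent to emulator\n";
        }
    };

    int main(int argc, char**) {
        // Same test either way; only the driver binding changes.
        std::unique_ptr<DdrDriver> drv =
            (argc > 1) ? std::unique_ptr<DdrDriver>(new EmuVtlDriver())
                       : std::unique_ptr<DdrDriver>(new SimVipDriver());
        drv->write(0x8000, 0xCAFE);
    }

Kept at the transaction level, the test itself never needs to know which platform is underneath, which is what makes the transition seamless.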

We concluded our discussion with predictions on whether emulation will continue to enjoy its current success, how it should evolve or change, and final recommendations.

Natarajan: “Make it cheaper, faster and smaller. I think if all three are taken care of, emulation is going to grow and find more usage.”

“Given the size of chips and shrinking verification timelines, the areas that are going to grow are emulation and static/formal verification. What used to be a six-month verification signoff timeline is now two months; in the current market, this is needed for a competitive edge. We have to look beyond simulation at alternate ways to verify and achieve signoff. Emulation supports several new features that help us find new use cases on every project. We are able to dump coverage and run power verification in emulation.”

“The area where I would really like to see more usage is power. We are only seeing the tip of the iceberg, because there are lots of problems to be solved in power verification. We expect emulation vendors to come up with easier debug to make power verification faster.”

Haran: “In the current verification scenario, simulation is used more than 80% of the time and emulation about 20% of the time. As design complexity grows, we can predict that these numbers will change. Maybe emulation goes all the way to 80% and simulation drops to 15-20%, used mostly at the IP level.”

“I believe analog support would be welcome. Currently, we only emulate the digital parts and black-box the analog sections, but we need to be able to emulate analog designs.”

Babu: “Emulation is going to expand its footprint between simulation and FPGA prototyping. As debugging on an emulator gets easier, it’s going to expand into the current simulation space. Simulation will be used to get the raw design working to some extent; then teams will move to emulation.”

“We are still struggling to use the emulator effectively when the design has hard IP. How do we embed these components into the emulator? Vendors need to give us a hook for putting them in.”

Windele: “Ease of use is going to be important, and we have made great progress. Streamlining the testbench across simulation and emulation is important. Moving a job from simulation to an emulator must not be a big effort.”

“The emulator is becoming part of the data center now. Everybody will be able to use emulation, but the prerequisite for that is ease of use. Speed is important, but it’s not everything; it matters only in connection with visibility and transparency.”

“One aspiration I will mention as well is analog. It’s a pretty difficult problem because you need to be able to synthesize something even if it’s analog, and this is a non-trivial problem to solve. It would be wonderful if we could emulate analog circuits. I think we will get there, but it will take some time.”

And, with this nod toward a future analog emulation use model, the panel concluded.