A Debate at DAC on Simulation, Emulation and Formal

Source: FormalWorld

DAC attendees this year were treated to an informative panel titled "The Great Simulation/Emulation Faceoff," moderated by Adapt-IP's John Sanguinetti, a lively debate about whether a simulation/formal verification flow works more effectively than a simulation/emulation flow. Panelists were Stephen Bailey of Mentor Graphics, Dave Kelf from OneSpin, IBM's Ronny Morad, Frank Schirrmeister at Cadence and Alex Starr of Advanced Micro Devices.

Right out of the chute, Steve Bailey, Mentor's director of emerging technologies and the first panelist to present his position, rejected the premise of the panel, noting that there is no faceoff between emulation and simulation, or between anything else in verification. The most important problem, he said, is the rising cost of verification, and he posited that functional verification is the most expensive aspect of creating a chip today. Design teams need every engine and every capability available to reach the required productivity.

The belief that hardware acceleration, which covers emulation and FPGA prototyping, is taking simulation cycles away from software simulators –– that there is a fixed pie to be divvied up –– is a fallacy, Steve continued. The pie of verification cycles is not fixed; it is growing exponentially. The number of software simulation cycles is still growing, though perhaps more slowly than the number of cycles going into hardware acceleration or formal. This growth must be addressed to bring the cost of verifying chips under control, he concluded.

Ronny Morad, manager of the post-silicon validation technologies and verification analytics group at the IBM Haifa Research Lab, remarked that simulation, emulation and formal are all important for verification, with proven methods and tools built up over many years. He believes there is synergy between emulation and silicon, and that emulation plays a bigger role today than it ever did in the past, a trend that will continue.

His position is that the silicon itself is essential for verification. He pointed out that a 2014 study by the Wilson Research Group and Mentor Graphics found that about 70 percent of designs fail to achieve first-silicon success and require a respin. Other sources indicate that at least one percent of bugs escape to silicon, providing further motivation for post-silicon validation. Silicon is not just part of a product, but a platform to leverage for verification and for finding the bugs that escaped pre-silicon. Post-silicon validation is a mature discipline that comes with its own set of tools and methodologies, Ronny added.

Next up, Frank Schirrmeister, senior group director of product management at Cadence, described the core engines –– formal and dynamic, with static pieces as well –– and the ongoing push to make them faster. That's where hardware acceleration, spanning simulation acceleration, emulation and FPGA prototyping, comes in. But there is another dimension, he said: getting smarter, which is not only about speed. Hardware acceleration is brute force –– put a rocket on the testbench and make it faster –– while verification metrics add the ability to make it smarter. As a result, different verification flows are emerging for IP, for subsystems and for the full SoC with its environment, each requiring different combinations of engines. Within the context of smart verification, design teams can combine engines and different abstractions, and use software as an instrument for verification.

Formal, simulation, acceleration, emulation and FPGA-based prototyping all have individual advantages, Frank observed, and there is no real faceoff. Each has its specific ability, and design teams need the combination to be successful.

Alex Starr, an AMD fellow, took a few minutes to describe how AMD uses simulation and emulation. He freely offered that AMD has done emulation for quite a long time, starting with post-silicon work. Echoing Ronny, he explained that AMD works in real silicon with JTAG debuggers and platform-level validation workloads, then moves debug pre-silicon using an emulator. The team puts the SoC in the emulator and, he stated, it is tremendously successful. AMD uses emulation on every product it makes.

What has been interesting, commented Alex, is seeing the evolution from post-silicon shifting left into more verification-style emulation, which has expanded along two dimensions. One is IP-level or subsystem-level emulation; the other is SoC-level emulation, where AMD runs difficult corner cases and long-running tests. A further angle is software and time-to-market pressure –– the faster the product is released, the faster the company makes money –– and all of the product features are effectively part of verification, he added.

Dave Kelf, OneSpin's vice president of marketing, began by saying that everyone knows about verification performance: simulation is running out of steam, and emulation is usually viewed from a performance standpoint. He questioned what is really going on in verification and whether the industry has identified the right resources to solve its problems. Simulation, emulation and FPGA prototyping form a nice continuum; adding formal to handle some of the complexity issues in verification could complete the story and make a difference on real problems.

Verification performance is an issue, but performance requirements seem to be slowing down, Dave stated. The real problem is the complexity of testing, and it's not just the chip: safety, power, security, cross-team issues and all the other problems associated with design development add to it. More performance is necessary, and emulation serves software/hardware co-verification well, but he appealed to the industry to get smarter and tackle these problems in a slightly different way with a slightly different technology. Formal is the answer, he confided, and will become the third leg of the emerging three-legged stool of verification.

While the discussion didn't turn out to be a faceoff among simulation, emulation and formal after all, John Sanguinetti and his panel agreed that all three are key pieces of any verification strategy and methodology. That's not to say the verification challenges have been solved; each panelist pointed to a challenge yet to be tackled. These include verification cost, performance and complexity, along with the recognition that silicon itself is essential for verification and post-silicon validation. The need to work smarter and to keep expanding deployment modes for emulation was highlighted as well, as was moving formal into the three-legged stool of verification. Certainly, attendees walked out of the meeting room at the Austin Convention Center with a host of ongoing verification challenges to think about.

###

About Lauro Rizzatti 
Lauro Rizzatti is a verification consultant and industry expert on hardware emulation. Previously, Dr. Rizzatti held positions in management, product marketing, technical marketing, and engineering.