ARM’s Bill Neifert on the early days of hardware emulation
Source: Embedded Computing Design
- November 29, 2017
- Posted by: Lauro Rizzatti
- Category: 2017
I recently had a chance to catch up with Bill Neifert, Senior Director of Market Development at ARM. Talking with Bill was like climbing into the TARDIS (Time And Relative Dimension In Space) with the time-traveling alien adventurer Doctor Who and being swept back to the 1990s.
Hardware emulation is a relative newcomer compared to the Doctor Who series: it arrived on the scene in the 1980s, while the BBC television staple began in 1963. Nonetheless, it’s as popular in some verification circles as the science-fiction TV show.
Bill worked as an applications engineer at Quickturn in the 1990s, when it was the emulation leader. There he was first exposed to engineering groups trying to solve software problems in a hardware design. Emulation, as he correctly noted, wasn’t nearly as advanced technologically then as it is today. Making full use of those early emulators was a massive project: engineers compiled the netlist, mapped it into the emulator, and then designed a custom interface board. After all, those were the days before testbench technology, when the physical target system had to be slowed way down before the emulator could be deployed.
Back then, emulation was a huge undertaking for both hardware design and software development groups. Engineers were attempting to build a platform on which to do bare-metal software development. Getting the system booted was the ultimate test, and the effort was considered a success if it worked. Millions of dollars and tens of engineer-years were spent accomplishing what would now be considered a modest feat. During those early days, Bill was struck by the importance of the interplay between hardware and software in the overall design process. His experience was similar to mine, and that insight came in handy a bit later in his career.
Jumping into the future, Bill co-founded a startup in 2003 called Carbon Design Systems, which was acquired by ARM in 2015. The founders took inspiration from emulation’s high cost of entry and painful use model in those early days, determining that there must be a software approach to hardware emulation. Their answer was cycle-accurate models. Carbon’s unique technology gave system designers a way to convert their register-transfer level (RTL) designs into a higher-level description so they could develop their code on a fast model or virtual model. While not as fast as emulation, the models were accurate.
As is often the case, Carbon was a bit ahead of its time. Although it had cycle-accurate virtual models, virtual prototyping platforms didn’t exist until CoWare and ARM introduced them in the mid-2000s. Carbon acquired a virtual prototyping platform from ARM in 2008, which is ironic since Carbon is now, once again, part of the ARM fold.
At ARM, Bill now works on integrating fast models and cycle-accurate models, a growing area in the semiconductor industry.
At the end of our talk, I thought about today’s engineering world and marveled at how hardware emulation has moved from inauspicious beginnings to become the foundation of many chip design verification strategies.
To hear my entire conversation with Bill Neifert, check out the Verification Perspectives podcast at http://bit.ly/2ozgQGT.