Catching up at DAC in Austin

Source: EDACafé

Recently, I spent a few enjoyable days in Austin at the Design Automation Conference, June 6-9, attending sessions, checking out verification vendors exhibiting on the show floor, and catching up with long-time friends and colleagues. Evenings were filled with dinner events and still more catching up.

The event was opened Monday morning by Chuck J. Alpert, general chair of the 53rd DAC, who delivered a motivational welcome supported by a few statistics summarizing the paper and poster submissions, the acceptance rate, and the territorial breakdown of accepted papers.

Attendance exceeded 6,000 visitors, slightly more than at the last Austin event in 2013. More remarkable was a 45% increase in technical conference attendees.

Following Chuck’s opening remarks, a Visionary Talk titled, “Learning from Life: Biologically Inspired Electronic Design,” was delivered by Lou Scheffer, principal scientist at Howard Hughes Medical Institute.

A Keynote titled, “Revolution Ahead – What It Takes To Enable Securely Connected, Self-driving Cars,” was presented by Lars Reger, CTO in the Automotive Business Unit at NXP.

Both talks were remarkable and memorable, and I highly recommend watching the video recordings.

No less intriguing was the Tuesday morning keynote titled, “Driving The Next Decade of Innovations in Visual and Accelerated Computing,” delivered by Sameer Halepete, VP of VLSI Engineering at NVIDIA. Again, the video recording is well worth watching.

A highlight of the event for me was moderating a panel on hardware emulation and its growing range of use models at the Mentor Graphics booth. Titled “What’s up with all those new use models on an emulator,” it aimed to assess the technology in 2016 by comparing pros and cons, including recently introduced deployment modes. To that end, three panelists known as emulation power users and articulate spokespersons joined me on stage: Alex Starr from AMD, Guy Hutchinson of Cavium Networks, and Rick Leatherman of Imagination Technologies.

We had a sizeable crowd listening to our discussion, and a few audience members asked intriguing questions. As the session wound down, after praising hardware emulation’s many attributes, we talked about what it cannot do. Alex mentioned analog testing, something that won’t happen any time soon; Guy agreed that analog testing is not on the horizon.

The audience also heard that debug in emulation is not as good as in simulation, which led to the point that verification engineers need to consider the cost/benefit tradeoff when choosing between emulation and simulation.

In a closing comment, Rick remarked that emulation is a finite resource because it is shared. While he’s right today, that is starting to change as emulation moves into design datacenters, where it can be accessed remotely by any number of users from anywhere, at any time. Rather than a finite resource, global emulation enterprise servers with design capacities of several billion gates can support multiple large designs, or a mix of large and small designs, making emulation a virtually unlimited resource.

Watch for an upcoming full transcript of the event.

I also enjoyed a panel titled, “The Great Simulation/Emulation Faceoff,” in the Design Track Tuesday afternoon. Moderated by Dr. John Sanguinetti, chairman of Adapt-IP, it included five panelists: Alex Starr, AMD; Frank Schirrmeister, Cadence Design Systems; Ronny Morad, IBM; Stephen Bailey, Mentor Graphics; and Dave Kelf, OneSpin Solutions. They debated the state of the art in simulation versus emulation, with frequent references to formal analysis. The bottom line was that all three verification approaches are here to stay and complement each other effectively. Watch for an upcoming full transcript of this event as well.

I was also invited to a roundtable moderated by Ed Sperling of Semiconductor Engineering titled, “Verification Continuum Or Individual Tools,” with Stephen Bailey; Frank Schirrmeister; Krzysztof Szczur of Aldec; Bill Neifert from ARM; and Sundari Mitra of NetSpeed. We discussed the spectrum of verification engines, from simulation and emulation to virtual and FPGA prototyping and formal, and whether their integration into a unified platform, as proposed by all three major EDA suppliers, is ready for prime time. Look for the three-part series on Semiconductor Engineering.

Regarding the exhibition, a pleasant surprise in my line of business, hardware emulation, was discovering a product from RunTime Design Automation (RTDA), founded and headed by Dr. Andrea Casotto. RTDA has been around for several years, offering solutions for high-performance job scheduling, workflow optimization, monitoring and execution, and efficient resource management.

RunTime DA is putting the final touches on a new product called HERO (Hardware Emulation Resource Optimizer), a vendor-agnostic solution designed to manage complex hardware emulation workloads and optimize emulator resource management. HERO addresses all aspects of a hardware emulation environment, from design compilation and synthesis to simulation and emulation.
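The article does not describe HERO’s internals, so purely to illustrate the kind of problem such a tool tackles, here is a minimal, hypothetical Python sketch of a vendor-agnostic scheduler that packs prioritized emulation jobs onto emulators by gate capacity. Every class, name, and policy below is my own assumption, not HERO’s actual design.

```python
# Illustrative sketch only (not RTDA's implementation): a toy vendor-agnostic
# scheduler that assigns emulation jobs to emulators by priority and by the
# gate capacity each job requires. All names and policies are hypothetical.
from dataclasses import dataclass, field
from typing import Optional
import heapq

@dataclass
class Emulator:
    name: str                # e.g. "box-a"; vendor does not matter here
    capacity_mgates: int     # total design capacity, millions of gates
    free_mgates: int = field(init=False)

    def __post_init__(self):
        self.free_mgates = self.capacity_mgates

@dataclass(order=True)
class Job:
    priority: int            # lower value = scheduled first
    name: str = field(compare=False)
    size_mgates: int = field(compare=False)

def schedule(jobs: list[Job], emulators: list[Emulator]) -> dict[str, Optional[str]]:
    """Greedy best-fit: each job goes to the emulator with the least free
    capacity that still fits it, keeping big boxes available for big designs.
    Returns a mapping of job name -> emulator name (or None if nothing fits)."""
    heapq.heapify(jobs)      # process highest-priority jobs first
    placement: dict[str, Optional[str]] = {}
    while jobs:
        job = heapq.heappop(jobs)
        candidates = [e for e in emulators if e.free_mgates >= job.size_mgates]
        if not candidates:
            placement[job.name] = None   # job must wait for capacity to free up
            continue
        best = min(candidates, key=lambda e: e.free_mgates)
        best.free_mgates -= job.size_mgates
        placement[job.name] = best.name
    return placement

if __name__ == "__main__":
    boxes = [Emulator("box-a", 2500), Emulator("box-b", 500)]
    work = [Job(0, "soc-top", 2000), Job(1, "gpu-block", 400), Job(2, "nic-block", 300)]
    # Prints each job's assigned emulator, or None if no box can fit it.
    print(schedule(work, boxes))
```

A real workload manager such as HERO would, of course, layer vendor adapters, job queueing, monitoring, and reporting on top of anything like this toy best-fit heuristic.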

I met Andrea for the first time at the yearly Needham & Company dinner Sunday night. Surprisingly, I found out that we come from two neighboring areas in northeastern Italy, only about 80 miles apart. Andrea comes from Treviso, in the land of the Serenissima Repubblica di Venezia; I come from Gorizia, called the “Nice of the Austrian Empire” under the Habsburgs for its mild weather and tranquil lifestyle… until WWI, when everything changed. We struck up a cordial conversation, and on Monday Andrea gave me a demo of the HERO scheduler. To say I was impressed is an understatement: it runs orders of magnitude faster than any competitive product and offers far more user control and configuration capability.

I admire companies able to develop and sustain a business by offering tools that augment or enhance design flows built around mainstream EDA products. Friendly with all EDA players, they step on no toes and maintain professional relationships with both vendors and end users. Companies that come to mind are Verific and Concept Engineering. I am pleased to add RunTime DA to this short list.

As a side note, I had the pleasure of attending “Case Study For Use of Hardware Platform in Algorithm Trading,” part of a design track titled, “Custom Hardware For Algorithm Trading.” It was presented by Luc Burgun, formerly CEO of Emulation Verification Engineering (EVE) and today CEO of NovaSparks, an emerging company in the fast-trading space. Luc described the NovaTick trading platform and the NovaLink FPGA-to-FPGA communication link. What stood out was NovaTick’s industry-leading latency of 1.1 µs average wire-to-wire and its ability to handle 8,000 securities per feed, putting it well ahead of pure FPGA-based competing solutions limited to a few tens of securities per feed.

All things considered, this year’s DAC offered a wealth of interesting sessions that left attendees with plenty to think about. The 54th DAC will be held in Austin again next year, June 18-22.