Visibility Enhancements – Novas

However, there is a bunch of advanced technology that we have developed in addition to what is in Debussy. We have put that into our current flagship debug product, which we call Verdi. Verdi is a superset of Debussy. It incorporates all of Debussy's capabilities plus a bunch of new capabilities, namely debug automation, which means using formal analysis techniques to automatically trace from some behavior, some effect that you observe, back to its cause. The idea of debugging, of trying to figure out design behavior, is really about starting from an effect and tracing back to its causes. We've automated a big part of that using formal analysis techniques.

In addition, we have incorporated testbench debug into Verdi, which means that the Vera and e languages are supported, along with the ability to trace across the boundary between HDL and HVL, which I believe is unique to Verdi. We have also incorporated assertion-level debug, supporting PSL, SVA and even OVA, and integrated the viewing of assertion results in the simulation. Say you have assertions and they fire: you can see that in the waveform, click on it, get the source code, and go back to the design source. It's naturally integrated. Verdi even has the ability to do post-simulation assertion checking based solely on the signal dump, without having to rerun the simulator, which means you can add new assertions and check them leveraging that valuable resource, your signal value dump.

Verdi is a superset. It is our flagship; it's what we currently sell the most of. There are lots of Debussy licenses out there, and many people are upgrading to Verdi as the problems get more complex.

Associated with Verdi are also two what I call helper modules. One is called nESL. Again, I apologize for latching onto an industry buzz phrase; ESL means a lot of different things to a lot of different people. I think there are three common elements in ESL: one is SystemC, one is transactions, and the other is hardware/software. The nESL module lets Verdi work at that higher level of abstraction by adding those three capabilities: basically treating SystemC as an HDL, transactional analysis (being able to view and analyze the transaction flow), and being able to work together with software debuggers. At the other end there is a module that works at the netlist level when you are doing static analysis, timing, power and clock analysis. That is called nAnalyzer. It helps Verdi work at that level of abstraction for those specific netlist closure tasks.

What is the price range for Verdi and Debussy?
There's a whole matrix of annual and perpetual prices, but in general Verdi has a list price of around $10K a year.

What can you tell me about Siloti, the newly announced product line?
The news is that we have announced a whole new product line, really a new space, that we are calling visibility enhancement. The key point is that you get full visibility from a partial signal dump; I will get into why that is important right away. It helps you debug the results from lower-level representations, such as netlists in FPGAs and emulators, at the familiar RTL level. There are new technologies, of course: to analyze the design and figure out which signals you need, to expand the data into full visibility, and to correlate between where you execute and where you want to visualize. We've packaged that into two application-specific products called SimVE and SilVE. They are tightly integrated with Verdi.

I want to say something a bit audacious: the verification methodology is fundamentally broken. Not so much that it is too slow or that you can't express things completely. You have great languages, great simulators, and all kinds of choices with respect to simulators and testbenches. All this stuff is really quite refined and mature. The problem comes when one of these testbenches detects that there is a mismatch. At that point the flow just breaks, because there is no data. In order to run at full speed you basically run without any dumping. That's true of simulation. It's true of emulation. It's true of prototypes. You just want to get to the point where you detect as quickly as possible, but then of course you have to debug. The debugger is just sitting there waiting. It's waiting to get fed, like a little bird in a nest. You have to feed it signal values.

You say, “How am I going to get out the signal values?” and “How do I know which ones?” People go through this manual process. Again, we are talking about accelerating engineers. What we picture is engineers sitting around asking themselves, “Which block should we dump?” Because if you turn on full dumping, dumping every signal on every value change, you will have multi-gigabyte files. You may be able to hold onto those, but it gets old fast. Not only that, the whole process just slows down because it is dominated by extracting the values and writing them to disk. There has been work done on compression and on trying to make dumping more efficient, and we have been at the forefront of that with our FSDB format, working with all the simulation vendors. But it has become untenable to do that simply by software engineering; we had to figure out a way to change the game. That's why we focused on visibility enhancement. The verification part runs fast. The debug part is in good shape. It's tying them together where we see a huge hole: it's really expensive to get that data out.

That causes many, many iterations. You say, “Let's dump that block,” then you find out that there is not enough data. Then you go back to get the rest of the data, and so on.

There are a bunch of benefits from these products because we address those problems: you get better comprehension, verification cost savings, and optimization of expensive resources such as emulators.

This fits right in, side by side with Verdi, because it is all in the name of comprehension. It is all based upon our open system platform, with its design database and its signal database. They work together. You can feed Verdi directly from the simulators, or you can feed it through Siloti in order to improve your verification methodology. The two products are application specific because there are some subtle differences between what you need in simulation and what you need in the silicon arena.

Inadequate visibility hampers verification. There is low impact to dumping everything when you are just simulating a module, but it gets worse and worse as you go through the flow. When you are doing full-chip regression you are generally running without dumping, and if there is an error flag and you have to go back, you have access to everything, but boy is it expensive to dump everything. So you generally try to be selective. This leads to a big thought process and a lot of iterations. In emulation, people are trying to run as many emulations through as they can. They have these multi-user boxes. If you start instrumenting a lot of stuff, the image grows; now it won't fit in the box, or only one will fit where five used to fit. It's that kind of thing. When you are in the final stages of silicon validation, we talked about design for debug. What that means is having to think about where you are going to put extra logic on the chip and how you are going to route it to the other side. That's quite an involved process.

How can we help with all that? Visibility enhancement optimizes the verification and validation process by reducing the impact of observing the things you need to figure out how the design works, or why it doesn't. You have to make this tradeoff: am I going to dump everything and slow it way down, or am I going to dump nothing and run real fast? How can I split the difference?

If we look at how it works: utilizing RTL or gate-level source files, it analyzes the design to figure out which signals are essential. What is essential is determined by what the expansion engine needs to give you full visibility; these things are tied tightly together. Again, it is a little different for simulation and for silicon. Once you have analyzed, you get what you need for wherever you are in that validation flow: a dump list for simulation, help figuring out what to instrument in your FPGA prototype or your DSP, or a probe list to use with your emulator.
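The essential-signal analysis can be sketched in miniature. This is an assumption-laden illustration, not Novas' algorithm: the netlist format and signal names are invented. The idea is that combinational outputs can always be recomputed from their inputs, so only primary inputs and state elements (register outputs) need to go on the dump list to recover full visibility.

```python
# Illustrative sketch only: pick an "essential" dump list from a toy
# gate-level netlist.  Format (assumed): signal -> (kind, input signals);
# primary inputs have no driver entry.

netlist = {
    "q":   ("dff", ["d"]),        # register output: state, must be dumped
    "d":   ("and", ["a", "q"]),   # combinational: recomputable
    "out": ("not", ["d"]),        # combinational: recomputable
}
primary_inputs = ["a"]

def essential_signals(netlist, primary_inputs):
    """Essentials = primary inputs + state elements; everything
    combinational can be re-derived by the expansion engine."""
    essentials = set(primary_inputs)
    for sig, (kind, _inputs) in netlist.items():
        if kind == "dff":  # state cannot be recomputed from the same cycle
            essentials.add(sig)
    return sorted(essentials)

print(essential_signals(netlist, primary_inputs))  # ['a', 'q']
```

In this toy design, dumping 2 of 4 signals is enough; on a real netlist the ratio between essential and total signals is what drives the dump-size savings.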
Basically, any of these flows is going to produce a partial dump file, a subset of the signals, but because you have Siloti you still get full visibility: when you want to look at a value at a particular time in Verdi, the expansion engine is invoked on the fly. You get on-demand expansion. There is never some sort of batch process to get back all the signals; you just expand the things you are actually going to look at. That's the other thing about these huge dump files: say you have dumped everything. You would actually look at only a tiny fraction of those signal values. We figured out a way to do it smart, to use an analytical approach to dumping so you can get back full visibility. That's going to transform the verification process, because you shouldn't have to think about what to dump or whether to dump. You should be able, most of the time, just to dump that essential file and be ready to debug immediately.
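On-demand expansion can be sketched the same way. Again this is only an illustration under assumed names, not the actual engine: when a combinational signal is requested at some cycle, its value is recomputed by walking its fan-in cone back to dumped essential signals, so it never needs to be stored in the dump file.

```python
# Illustrative sketch only: recompute a combinational signal's value at a
# given cycle from a partial (essentials-only) dump.

GATES = {
    "and": lambda a, b: a & b,
    "not": lambda a: 1 - a,
}

# signal -> (gate, input signals); essential values come from the dump.
netlist = {
    "d":   ("and", ["a", "q"]),
    "out": ("not", ["d"]),
}

def expand(signal, cycle, dump, netlist):
    """Return the value of `signal` at `cycle`, recursing through the
    combinational cone until a dumped essential signal is reached."""
    if signal in dump:                 # essential: read straight from dump
        return dump[signal][cycle]
    gate, inputs = netlist[signal]     # combinational: recompute on demand
    values = [expand(s, cycle, dump, netlist) for s in inputs]
    return GATES[gate](*values)

# Partial dump: only the essential signals, one value per cycle.
dump = {"a": [1, 1, 0], "q": [0, 1, 1]}
print(expand("out", 1, dump, netlist))  # d = 1 & 1 = 1, so out = 0
```

Only the values the user actually inspects are ever computed, which is why expansion stays cheap even when the full design would have produced a multi-gigabyte dump.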
