[Preface: the upcoming "Club Formal" on October 17 here at the Cadence San Jose campus will also touch on this topic - please join us!]
While it's now common knowledge that simulation technology brings many benefits to a metric-driven verification (MDV) flow, it turns out that formal analysis technology brings just as many benefits to such a flow. Even better, users can combine the resulting metrics from simulation and formal to take advantage of the best each technology has to offer. However, combining metrics of different types from completely different kinds of engines is not trivial without common semantics, methodologies, and technologies to harmonize the heterogeneous data into something meaningful to a metric-driven functional verification flow.
Recently Team Verify's Chris Komar (a Product Expert whom you may remember meeting at DVCon 2012) and our colleague John Brennan (an expert in coverage and metric-driven methodologies and tools) gave a webinar covering all these issues and solutions, entitled "Combining the Best of Both in an MDV Flow - Simulation and Formal". A recording of this free webinar is available at http://www.cadence.com/cadence/events/Pages/event.aspx?eventid=684 (registration is required).
What you will learn from this free presentation is detailed operational and technical information on how to combine verification metrics from both simulation and formal analysis, allowing you to substantially reduce the overall verification effort. A new metric methodology -- "enriched metrics" -- managed by Cadence® Incisive® Enterprise Verifier ("IEV") enables the engines to cooperate and, combined with higher-level management tools, delivers better visualization and a more refined verification flow. Consider the following example of enriched metrics in action:
On the left-hand side of the diagram are the results from simulation and dynamic assertions; on the right are formal cover and proof results. Here, the formal analysis delivers a mathematical proof that it's impossible to write a test to hit this cover point. Hence, you should halt your simulations or formal analysis and begin debugging why that is. The next diagram shows the happier case, where the formal results on the right prove with mathematical certainty that this case can never fail.
It's important to note that while the simulation and dynamic assertion results on the left-hand side of this diagram are positive, those results hold only for the relatively narrow cases that the user encoded in the given test(s). In contrast, the positive formal result on the right is proven true for all inputs, for all time -- truly a relief to know! Plus, you can safely stop developing new tests and/or testbenches to hit this cover point, often saving substantial amounts of time. (Many customers run formal on an IP block first. If verification of the block can be completed with formal alone, they use this results display to show management that it's safe to move on and/or skip block-level simulation.)
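To make the enriched-metrics idea concrete, here is a minimal, purely illustrative Python sketch of the merge rule described above: a cover point proven unreachable by formal trumps everything, and a point is considered covered if either engine resolves it. The status names and data model are hypothetical inventions for this example, not IEV's actual schema.

```python
# Illustrative status values (hypothetical; not Cadence's actual terminology)
SIM_HIT = "hit"                      # cover point reached in at least one simulation
SIM_MISS = "not_hit"
FORMAL_PROVEN = "proven_reachable"   # formal found a trace reaching the point
FORMAL_UNREACHABLE = "proven_unreachable"
FORMAL_INCONCLUSIVE = "explore"      # bounded/inconclusive formal result

def enrich(sim_status: str, formal_status: str) -> str:
    """Merge one cover point's simulation and formal results into a
    single actionable status, per the priority described in the text."""
    if formal_status == FORMAL_UNREACHABLE:
        # Mathematically impossible to hit: stop writing tests, debug the spec/RTL.
        return "unreachable_debug"
    if sim_status == SIM_HIT or formal_status == FORMAL_PROVEN:
        # Resolved by at least one engine; no further effort needed here.
        return "covered"
    # Neither engine has resolved it yet.
    return "open"

# Example: three cover points with results from both engines
coverpoints = {
    "fifo_full":  (SIM_MISS, FORMAL_UNREACHABLE),
    "ack_seen":   (SIM_HIT,  FORMAL_INCONCLUSIVE),
    "retry_path": (SIM_MISS, FORMAL_INCONCLUSIVE),
}

for name, (sim, formal) in coverpoints.items():
    print(f"{name}: {enrich(sim, formal)}")
```

The key design point is the asymmetry: a formal "unreachable" verdict is a global proof, so it overrides any amount of simulation effort, while a single simulation hit is enough to close a reachability goal.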
Putting all this in perspective: there was a time when formal was merely a point tool - albeit a very powerful one - used in isolation. So-called "hybrid" flows that mixed formal and simulation were certainly an improvement, but in most cases mapping the joint simulation+formal analysis results into the overall project database was quite painful, or only done verbally. (Engineer: "Hey Boss, we found X number of bugs, but have a few more 'Explores' left to run down." Boss: "That's great -- I think. When do you think you are going to be done, again?") Fortunately, in the webinar Chris shows how the results can be fed back into the main project verification plan and results database in a useful way, to wit:
* Checks and coverage are visibly linked to the human and machine readable verification plan
* The user can easily implement appropriate checks as assertions manually, or have the tool generate them automatically given certain specs, and/or leverage Assertion-Based Verification IP
* Assertions can be run on all available formal and simulation engines
* Contributions from all engines are shown in a unified view
All of these activities output data in a format that makes sense to simulation-centric management -- and thus, all of a sudden, the isolation of formal and multi-engine flows ends, and these tools and related solutions gain mainstream acceptance.
Again, the webinar recording is free (registration is required):
http://www.cadence.com/cadence/events/Pages/event.aspx?eventid=684
and, clocking in at well under an hour, it's perfect for informative lunchtime viewing.
Enjoy!
Team Verify
On Twitter: http://twitter.com/teamverify, @teamverify
And now you can "Like" us on Facebook too:
http://www.facebook.com/pages/Team-Verify/298008410248534