
Everything New is Old … Everything Old is New

The title of this post is taken from a fairly obscure 1982 record album (yes, vinyl) on which several classic doo-wop groups performed versions of then-current songs. It has achieved a bit of cult status because Joey Ramone contributed a song called "Doreen is Never Boring" that, as far as I know, the Ramones themselves never recorded. What the heck does this have to do with functional verification? Well, I was reminded of this title by a recent blog post from my colleague Adam Sherer about the persistence of gate-level simulation.

Adam's observations were triggered by a blog post from Ann Mutschler that specifically discussed verification of low-power design structures. She commented, "in a small sense, what's old is new," with regard to the use of gate-level simulation for low-power verification. It was Adam's quoting of her comment that reminded me of the album title, but it also got me thinking about some other "old" EDA tools and techniques that show no signs of going away and thus seem to be perpetually new.

There was certainly a time when many in the industry felt that gate-level simulation would go away. As I noted in my most recent post, early logic synthesis users ran lots of gate-level simulations to cross-check the synthesis process. Once RTL-to-gate logical equivalence checking (LEC) became available, the amount of gate-level simulation dropped significantly. However, it never went away entirely, since patterns destined for chip testers generally had to be simulated at the gate level with full back-annotated post-route timing.

As Adam and Ann noted, gate-level simulation is even enjoying something of a comeback. But it's not the only "old" technology still around. What about LEC? There was once talk of "correct-by-construction" logic synthesis that would always produce gates faithful to the RTL, so that cross-checking would no longer be needed. There was also a notion of correct-by-construction place and route, under which techniques such as design rule checking (DRC) and layout-versus-schematic (LVS) comparison would not be needed either. Well, none of those tools shows any sign of disappearing anytime soon.

Coming back up to functional verification, the advent of constrained-random stimulus generation seemingly spelled the death of manually written directed tests. In truth, many project teams still write some directed tests to exercise known corner cases that may be hard to hit automatically. What about simulation itself? As formal analysis gained in popularity, there was some speculation that it would replace simulation entirely. However, capacity limitations and the challenge of specifying all intended behavior in the form of assertions have kept formal in use mostly on portions of a design, and only rarely on an entire chip.
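To make the contrast concrete, here is a minimal SystemVerilog sketch; it is not taken from any of the posts mentioned above, and the transaction class, the drive() task, and the four-cycle grant rule are all hypothetical. The constrained-random class generates many legal transactions with weighting toward boundary values, the directed code pins down one specific corner by hand, and the concurrent assertion is the kind of property a formal tool would try to prove exhaustively.

    // A constrained-random transaction class (hypothetical names throughout)
    class bus_txn;
      rand bit [7:0] addr;
      rand bit [7:0] len;
      // Weight the distribution toward boundary lengths, where corner-case
      // bugs tend to hide
      constraint c_len { len dist { 8'd0 := 10, 8'd255 := 10, [1:254] :/ 80 }; }
    endclass

    module tb;
      bit clk, req, gnt;

      // The kind of property a formal tool would try to prove exhaustively:
      // every request is granted within four cycles (an assumed protocol rule)
      assert property (@(posedge clk) req |-> ##[1:4] gnt);

      initial begin
        bus_txn t = new();
        // Constrained-random: many legal transactions, corners hit by weighting
        repeat (1000) begin
          if (!t.randomize()) $error("randomize failed");
          // drive(t);  // hypothetical driver task
        end
        // Directed: explicitly force one known-hard corner case
        t.addr = 8'hFF;
        t.len  = 8'd0;
        // drive(t);
      end
    endmodule

The point of the sketch is that the two styles coexist naturally in one testbench: the random loop covers breadth, while the directed assignments at the end guarantee a specific corner gets hit, which is exactly why directed tests never fully went away.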

I'm sure that I can think of some more examples, but this seems like the ideal spot to stop and solicit comments from you, the readers. What EDA tools and technologies did you think would be obsolete by now? What other examples can you recall where the predictions of the "experts" turned out to be wrong? What's "old" but new again? What's advertised as "new" but is really just a rehash of old technology? I'd love to hear your thoughts!

Tom A.

The truth is out there...sometimes it's in a blog.

 

