Has formal verification technology stalled?
Dr. Olivier Coudert
There is certainly no lack of competition in formal verification. The big three public EDA companies, Synopsys, Cadence, and Mentor Graphics, all have their own formal verification offerings (Formality, Conformal, 0-In), and there are a number of startups, e.g., Jasper, Atrenta, Real Intent, OneSpin, and Blue Pearl Software, to name a few. Formal verification products cover a wide range of applications: SystemVerilog Assertions (SVA) and property checking; RTL static checks; equivalence checking (EC); some limited IP verification; clock-domain crossing (CDC) verification; and timing exception verification (false paths and multi-cycle paths).
Looking at the DAC submissions this year, though, I am puzzled by the overwhelming number of papers focused on increasing simulation speed and coverage, as opposed to the handful of papers discussing formal techniques. And this year is no different from last year. Or the year before that. Does that mean there is a lack of innovation in formal verification core techniques?
Improving simulation (higher coverage, fewer patterns, more automation) with formal techniques is a very active field, both in the academic and the industrial world. Some inject faults into the RTL to select the most discriminating patterns (e.g., Certess). Others use SAT and integer constraint solvers to reduce the number of patterns, or to automatically generate patterns for hard-to-cover code branches (e.g., NuSym). But success is all relative: Certess was quickly acquired last year, while NuSym is actively looking for a buyer. There are also semi-formal tools that mix simulation and state exploration techniques (e.g., Magellan), but they have limited usage.
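To make the idea concrete, here is a minimal sketch of constraint-based pattern generation, in the spirit of (though certainly not identical to) what such tools do. It uses the Z3 SMT solver's Python bindings, and the branch condition is a hypothetical example of mine:

    # pip install z3-solver
    from z3 import BitVec, Solver, sat

    # Hypothetical hard-to-cover branch:
    #   if (opcode == 0x3F && (addr & 0xFFF) == 0x800) { ... }
    # Random stimulus rarely satisfies both conditions at once;
    # a solver produces a witness directly from the branch guard.
    opcode = BitVec("opcode", 8)
    addr = BitVec("addr", 32)

    s = Solver()
    s.add(opcode == 0x3F)
    s.add((addr & 0xFFF) == 0x800)

    if s.check() == sat:
        m = s.model()
        print("stimulus: opcode=0x%02x addr=0x%08x"
              % (m[opcode].as_long(), m[addr].as_long()))
    else:
        print("branch is unreachable (dead code)")

In practice the solver must reason about the whole cone of logic feeding the branch, not just the guard, but the principle is the same: a coverage hole becomes a satisfiability query.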
What about the more fundamental formal verification technologies? The 80s were dominated by the development of rigorous semantic models (e.g., multi-valued logic, Verilog and VHDL operational semantics for synthesis and simulation, temporal logics, and synchronous languages like Esterel and Lustre) and the introduction of BDDs. The 90s saw EC tools spreading in the industry and the rise of model checking. The 00s were all about SAT and model abstraction to push the capacity of EC and bring property checking to the end user, as well as static code analysis, CDC, and timing verification. What are we going to see in this decade?
Verification still has a lot of challenging problems, with incomplete solutions or no solution at all. Here is my list:
• Merged arithmetic. There are robust methods to verify adders and multipliers of practically any size, but no one can verify merged arithmetic as small as 32 bits (see the sketch after this list).
• Low power. This leads to complex properties capturing the correctness of sequential clock gating and power gating. The former is becoming more common, and there are techniques to address most of it (e.g., Calypto and Conformal). But the latter is still waiting for a comprehensive and automated solution.
• RTL debugging. There are a number of static code checkers, but debugging support is still very poor.
• HW/SW verification. Can we leverage deductive methods (predicate logic, HOL, rewriting systems) to close the gap between software and RTL?
• Mixed-signal (analog/digital) devices. This is a very young area of research, but it should see a lot of focus given the increasing ubiquity of mixed-signal designs.
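To illustrate the merged arithmetic bullet, here is a minimal miter-style equivalence check using the Z3 SMT solver's Python bindings. The expressions and the 8-bit width are a toy example of mine, not a real merged datapath; the two designs are declared equivalent when no input can make their outputs differ:

    # pip install z3-solver
    from z3 import BitVec, Solver, sat, unsat

    W = 8  # toy width; as noted above, 32 bits defeats today's tools

    a, b, c = BitVec("a", W), BitVec("b", W), BitVec("c", W)

    spec = (a + b) * c      # reference expression
    impl = a * c + b * c    # refactored ("merged") implementation

    # Miter construction: the designs are equivalent iff no input
    # assignment makes the outputs differ, i.e., the query is unsat.
    s = Solver()
    s.add(spec != impl)

    r = s.check()
    if r == unsat:
        print("equivalent")
    elif r == sat:
        print("counterexample:", s.model())

The solver ends up bit-blasting the multipliers, and that cost grows steeply with the bit-width, which is exactly why 32-bit merged arithmetic remains out of reach.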
If formal verification core technology is to evolve, we should see original solutions to the problems listed above. What do you think should be added to this list? And which techniques will emerge as the most promising?
Dr. Olivier Coudert has 20 years of experience in software architecture and EDA product development, including 10 years in research. He received his PhD in computer science from École nationale supérieure des Télécommunications, Paris, France, in 1991. He has published 50+ papers and book chapters, and he holds several patents on combinatorial optimization and physical synthesis. He is a recognized expert in the fields of formal verification, logic synthesis, low power, and physical synthesis. He led the development of several EDA products, including three from scratch in a startup environment. You can follow Olivier on Twitter, meet him on LinkedIn, or read his blog.