The OVM shines a spotlight on automated metric-driven verification
Hardware Verification Languages (HVLs) have been around for over ten years now, with steadily growing adoption throughout this entire period. This adoption has been driven largely by the inability of manual directed verification approaches to keep pace with the complexity of the verification space for leading-edge designs. HVLs offer significant productivity gains because they can automatically generate tests using constrained random stimulus, enabling considerably higher levels of automation and far more robust verification.
HVLs also directly address project quality and schedule predictability concerns because they can enable the generation of test scenarios well beyond what the average test engineer could conceive and write manually. At the same time, these languages provide the capability to explicitly track concrete progress metrics, saving a huge amount of time and effort. Ever since the early success of the e verification language and the more recent introduction of SystemVerilog, well-trained verification engineers have been using these HVLs to take advantage of these advances in functional verification.
However, for various reasons not everyone designing hardware has taken the necessary leap into the world of hardware verification languages. Sometimes users do not have the right background or skill set, or they see various methodologies as proprietary or difficult to learn, or they just have not found an easy path to get started. To take full advantage of HVLs, your hardware engineers need to adopt skills much more common among software engineers for defining and constructing their hardware verification environments. Perhaps most importantly, you also need better ways to determine and express the “complete picture” or feature set of your design under test so that you can properly plan, build, and track these automated verification environments.
Coverage-driven approaches used in conjunction with earlier methodologies around e and SystemVerilog have been a major step toward improving the overall verification process and easing adoption. But it is a full metric-driven verification approach combined with a proven, open, standards-based methodology that will ultimately address the verification challenges of your most complex systems and chips and deliver the full return needed for your investment in the adoption of these HVLs. Newer solutions such as the Open Verification Methodology (OVM) are helping accelerate adoption of the latest high-power HVL-based verification approaches by allowing standards-based access to the building blocks and guidelines you need to plan, build, and track your automated, metric-driven verification environments.
From Directed Testing to Coverage-Driven Verification
When using directed testing, as many design and verification users do today, you write directed tests for each of the many items you need to test within a test plan. The problem is that this method can be very time consuming and lacks sufficient automation and reuse potential. It also misses aspects of your system or chip that are never tested, either because they were not anticipated or because you simply did not have time to write those tests, which can lead to costly missed bugs. In addition, due to the lack of automation when using only directed testing, you typically have to redo much of your testing environment in the event of even relatively small design changes. This is because with the directed testing approach the test cases themselves are the primary verification metric. Since the intent of each test case is tied to a specific combination of events on the inputs and outputs of your design under test (DUT), after any design change you must manually revisit the test cases to re-establish their integrity and confirm their intent has been preserved.
Coverage-driven verification was a huge step toward adding automation to your testing process. By combining automated test generation using constrained random stimulus with explicit capture of observed DUT behavior using functional coverage, coverage-driven verification shifts the focus from tracking specific test cases to a much more robust and reliable focus on reaching verification goals. Finally, by developing checking components that are decoupled from the stimulus creation, your verification process has the ability to explore goal areas and even go beyond them, ensuring proper DUT responses are met and measured, even for non-goal states. By combining these powerful elements, the coverage-driven approach provides you with accurate information on the effectiveness of each simulation run as well as that of all your simulation runs combined, even after design changes. So what you get with a coverage-driven approach is a high level of automation as well as the ability to scale and more readily adapt to changes. However, moving from directed testing to a coverage-driven approach necessarily requires more discipline up front to be effective, so Cadence has developed a structured approach known as metric-driven methodology to ease this transition.
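The coverage-driven loop described above is normally written in an HVL such as SystemVerilog or e, but the three ingredients (constrained random stimulus, a functional coverage model with explicit bins, and a checker decoupled from stimulus generation) can be sketched language-neutrally. The following Python model is purely illustrative; every name, constraint, and the trivial DUT stand-in are hypothetical and are not part of any OVM library.

```python
import random

# Hypothetical constrained-random generator: transactions are drawn only
# from the legal space ("constraints"): a valid opcode and 1 <= length <= 64.
LEGAL_OPCODES = ["READ", "WRITE", "NOP"]

def gen_stimulus(rng):
    return {"opcode": rng.choice(LEGAL_OPCODES),
            "length": rng.randint(1, 64)}

# Functional coverage model: explicit bins for behavior we must observe.
coverage_bins = {("READ", "short"): 0, ("READ", "long"): 0,
                 ("WRITE", "short"): 0, ("WRITE", "long"): 0}

def sample_coverage(txn):
    if txn["opcode"] in ("READ", "WRITE"):
        size = "short" if txn["length"] <= 8 else "long"
        coverage_bins[(txn["opcode"], size)] += 1

# Checker decoupled from stimulus: it validates every response it sees,
# whether or not that response belongs to a coverage goal.
def check_response(txn, response):
    assert response["ok"], f"DUT rejected legal transaction {txn}"

def dut_model(txn):
    # Stand-in for the real DUT: accepts any legal stimulus.
    return {"ok": True}

rng = random.Random(42)
for _ in range(200):
    txn = gen_stimulus(rng)
    check_response(txn, dut_model(txn))
    sample_coverage(txn)

hit = sum(1 for count in coverage_bins.values() if count > 0)
print(f"coverage: {hit}/{len(coverage_bins)} bins hit")
```

Note how the coverage report, not any individual test, is the progress metric: after a design change you rerun the same environment and re-read the bins, rather than manually re-validating a directed test list.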
The OVM: The Catalyst for Adoption of Advanced Verification
Various verification methodologies currently available all have their benefits; the OVM, which is based on the combination of the Advanced Verification Methodology (AVM) from Mentor and the Universal Reuse Methodology (URM) from Cadence, has significant momentum in the industry today. The OVM has been adopted in hundreds of projects and is industry tested, with the further advantage that it is completely open and interoperable with the majority of the commercial simulation platforms on the market. The OVM can accelerate your HVL adoption by providing guidelines and proven building blocks for creating advanced verification environments, including the ability to enable multi-language VIP reuse. It offers a “cookbook” of sorts to ease verification environment creation for those customers looking for an easy path to get started, and it leverages a robust class library that encapsulates object-oriented and aspect-oriented capabilities for building coverage-driven environments. There is a strong focus within the OVM on reusability, with guidelines for multi-language verification to maximize ROI and enable your design and verification teams to use their preferred HVL. What you get with the OVM is an accessible way to truly address your most critical quality, productivity, and predictability concerns.
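The reusable building blocks the OVM class library provides are SystemVerilog classes (sequencers, drivers, monitors, scoreboards, and the environments that connect them). To show the layering idea without reproducing the actual OVM API, here is a minimal Python sketch of that structure; all class names and methods are illustrative stand-ins, not OVM code.

```python
# Illustrative model of an OVM-style layered testbench: a sequencer produces
# transactions, a driver turns them into DUT activity, and an independent
# monitor broadcasts observed results to a scoreboard for checking.

class Sequencer:
    """Produces transactions; knows nothing about pin-level details."""
    def __init__(self, items):
        self.items = list(items)
    def next_item(self):
        return self.items.pop(0) if self.items else None

class Driver:
    """Converts abstract transactions into DUT activity."""
    def __init__(self, dut):
        self.dut = dut
    def drive(self, txn):
        return self.dut.apply(txn)

class Monitor:
    """Passively observes DUT output and broadcasts it to subscribers."""
    def __init__(self):
        self.subscribers = []
    def write(self, observed):
        for sub in self.subscribers:
            sub.check(observed)

class Scoreboard:
    """Checks observed behavior against expected behavior."""
    def __init__(self, expected):
        self.expected = list(expected)
        self.mismatches = 0
    def check(self, observed):
        if observed != self.expected.pop(0):
            self.mismatches += 1

class EchoDut:
    """Trivial DUT stand-in that simply echoes its input."""
    def apply(self, txn):
        return txn

# Build the environment, connect the monitor to the scoreboard, and run.
seqr = Sequencer([1, 2, 3])
drv, mon, sb = Driver(EchoDut()), Monitor(), Scoreboard([1, 2, 3])
mon.subscribers.append(sb)

txn = seqr.next_item()
while txn is not None:
    mon.write(drv.drive(txn))
    txn = seqr.next_item()

print("mismatches:", sb.mismatches)  # prints "mismatches: 0"
```

The key design point, which the OVM enforces through its base classes, is that the scoreboard receives data only from the monitor, never from the stimulus path, so checking remains valid no matter how the stimulus is generated or reused.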
So, that’s all you need, right? Not quite. As stated previously, the OVM provides you “access” to address your quality, productivity, and predictability concerns—this answers the very important question of “how” you will address your verification problem. However, the key question you must always answer first is “what” you need to verify—in other words, the reason you will build the verification environment in the first place. A true metric-driven approach incorporates specification requirements and the experience of your entire project team to capture a structured verification plan. Based on these requirements, your plan defines the stimulus, checking, and coverage capabilities that the verification environment must implement. Ultimately, a true automated metric-driven verification process is all about first planning the verification effort and then using that plan as the basis for tracking completion of the key project milestones, including the implementation of the OVM-based verification environment and meeting the coverage and checking goals defined by the plan.
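The "track against the plan" step amounts to merging coverage results from many simulation runs and reporting progress per plan section. The small Python sketch below illustrates that bookkeeping; the plan sections, bin names, and run data are all invented for the example and do not come from any real tool.

```python
# Hypothetical verification plan: each plan section lists the functional
# coverage bins that must be hit before the section counts as complete.
plan_goals = {
    "bus.read":  {"rd_short", "rd_long"},
    "bus.write": {"wr_short", "wr_long", "wr_burst"},
}

# Bins actually hit in each regression run (e.g. different seeds).
run_results = [
    {"rd_short", "wr_short"},
    {"rd_long", "wr_short", "wr_long"},
]

# Merge coverage across all runs, then score each plan section.
merged = set().union(*run_results)

report = {}
for section, bins in plan_goals.items():
    hit = bins & merged
    report[section] = f"{len(hit)}/{len(bins)} bins"

print(report)  # {'bus.read': '2/2 bins', 'bus.write': '2/3 bins'}
```

Because the metric is cumulative across runs, the report immediately shows which plan sections are closed and which (here, the missing burst-write bin) still need stimulus, which is exactly the visibility a metric-driven process is meant to give the project team.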
By adopting the OVM within a metric-driven methodology you gain constant visibility into overall quality, with the ability to monitor status, see which goals remain incomplete, and understand the nature of failing tests. Many of your everyday mundane tasks will become automated, including managing regressions, collecting data for analysis and reporting, and applying powerful analysis to best optimize your valuable resources, both compute and human. You will also be able to address your schedule demands with better predictability. You will be able to easily measure your progress against milestones, which will give you much higher visibility into the project team’s progress, allowing you to adjust accordingly to stay on schedule.
The Ideal Verification Match: The OVM + Metric-Driven Verification
Essentially, a well-planned metric-driven methodology is the best way to boost the overall effectiveness and ease of adoption of today’s powerful HVLs. It ensures you realize the full return on investment for making that adoption. When combined with the OVM, it delivers a full solution based on a solid overall plan that can be tracked and measured. The cookbook style provides structure to ease creation of your initial verification plan and provides customer-proven solutions for the challenges involved with adoption. You’ll be introduced to better ways to maximize verification completeness, with access to technology that improves throughput for large parallel simulation regressions and with ease of access to performance-based solutions such as hardware acceleration and emulation. You’ll also be shown how to merge, analyze, and correlate large amounts of coverage and checking data from multiple forms of measurement. The OVM used with a metric-driven approach will ultimately leave you and your team with the advanced verification solutions and know-how to efficiently verify today’s complex designs by identifying and eliminating bugs and helping you deliver high-quality products on schedule.
John Nehls is an architect and team lead in the Verification Core Competency group at Cadence Design Systems, living in Boulder, Colorado. John has spent the last 14 years working closely with industry-leading electronic companies to evolve and apply advanced verification methodologies across a wide range of target applications. In addition, John has played a key leadership role in helping to deliver these new verification technologies and methodologies into the marketplace to keep up with the toughest verification challenges.
Prior to his work in the Electronic Design Automation sector, John was a design and verification project lead at Harris Corporation in Melbourne, Florida, developing digital signal processing and communications systems. John graduated magna cum laude with a B.S. in Electrical Engineering from the University of Florida in Gainesville, Florida.