While everyone inherently knows what requirements and specifications are, we don’t often stop to think about them, or about how we could use them better in a flow. We tend to think of the specification as what we want to create and of the requirements as the things that get in our way – the constraints. Alternatively, when requirements tracking is mentioned, we tend to think of large mil/aero projects where all kinds of arduous documentation is involved and everything takes ten times as long. As an industry, we have focused on the specification, while the requirements are barely utilized in any kind of automation, and to me that is an opportunity lost.
Before I explain how I think they should be used, let me take a side step for a moment. I promise I will get back to the point quickly. Functional verification now takes up a significant portion of the development cycle. The verification team is responsible for many pieces: creating the verification plan; mapping it into coverage points; building the testbench, or another verification strategy such as formal, that fulfills each line item in the plan; and then executing the verification and analyzing the results. Verification is the act of ensuring that two descriptions of the same thing are the same. One of the descriptions is the design; the other is the testbench. If the two have been independently derived, there is a good chance that no common point of error has crept into both. We use coverage to provide guidance about the completeness of the verification process. Coverage items are derived from the verification plan, and it has been reported that this is one of the most difficult tasks a verification team has to perform. There is no direct correspondence between a plan line item and a coverage point, so the mapping is difficult. If it is not done well, the verification task will either take too long or be incomplete.
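To make the plan-to-coverage mapping and its completeness tracking concrete, here is a minimal sketch in Python. It is not any real tool's API; the `PlanItem` class, the requirement identifiers, and the coverage-point names are all illustrative assumptions. The point it shows is the gap the paragraph describes: a plan item is only as trackable as the coverage points someone has manually tied to it.

```python
# Hypothetical sketch: tracking how verification-plan line items map to
# coverage points, and reporting which plan items remain unproven.
# All names here (PlanItem, REQ-001, cov_fifo_full_write, ...) are
# invented for illustration, not taken from any real flow.
from dataclasses import dataclass, field

@dataclass
class PlanItem:
    ident: str
    description: str
    coverage_points: list = field(default_factory=list)  # points that witness this item

def plan_completeness(plan, coverage_hits):
    """Fraction of plan items whose every mapped coverage point was hit,
    plus the list of items still outstanding."""
    covered = [
        item for item in plan
        if item.coverage_points
        and all(cp in coverage_hits for cp in item.coverage_points)
    ]
    outstanding = [item.ident for item in plan if item not in covered]
    return len(covered) / len(plan), outstanding

plan = [
    PlanItem("REQ-001", "FIFO never overflows", ["cov_fifo_full_write"]),
    PlanItem("REQ-002", "reset clears all state", ["cov_reset_asserted", "cov_post_reset_read"]),
    PlanItem("REQ-003", "back-pressure honored", []),  # not yet mapped -- the hard part
]
hits = {"cov_fifo_full_write", "cov_reset_asserted"}  # what the regression actually exercised

score, missing = plan_completeness(plan, hits)
```

Here `score` is 1/3: REQ-002 is only partially witnessed and REQ-003 has no mapping at all, so neither counts as covered, which is exactly the "mapping is difficult" problem in miniature.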
We have long had the goal of an executable specification. That would enable more automation and, perhaps most importantly, provide a formal description of what we want to build, so that all traces of ambiguity are removed. But it has proven more elusive than expected. Part of the problem is that if we spend time creating one, it is another model that has to be verified. Without an automatic path from there to implementation, it is not seen as holding sufficient value. Yet if we do have an automated path from the executable spec to implementation, we can no longer use the spec as the independent model with which to verify that implementation. Verification then becomes an equivalence-checking exercise, which ensures only that if the input is wrong, the output will be equally wrong.
At this point, perhaps you can see where I am going with this. It would be natural to invest time and effort in building executable requirements. These would be what the testbench is derived from, and a real opportunity to add automation to the verification process. The specification is, in essence, the definition of the design from the inside looking out, while the requirements describe the same thing from the outside looking in. The two must match, and this is what we are trying to show when we perform verification. If this were coupled to a tool suite that directly attempted to prove, or at least demonstrate, equivalence between requirements and specification, we could add a lot of efficiency to the development process. It would also allow a top-down flow in which both the design and the testbench could be refined over time and even evolve incrementally. Today, we rely on a highly wasteful set of processes and procedures that provides little flexibility.
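The inside-out versus outside-in distinction can be sketched in a few lines of Python. The design fragment (`saturating_add`) and the requirement below are hypothetical examples of my own choosing: the designer's model says *how* the result is computed, while the executable requirement only constrains *what* must hold at the boundary, and never mentions the implementation.

```python
# Hedged sketch: a requirement expressed as an executable, black-box
# property, checked against an independently written design model.
# saturating_add and the "never wraps" requirement are invented examples.
import itertools

WIDTH = 4
MAX_VAL = (1 << WIDTH) - 1

def saturating_add(a, b):
    """The specification side: the designer's model, inside looking out."""
    return min(a + b, MAX_VAL)

def requirement_never_wraps(dut):
    """The requirements side: outside looking in. The output must never be
    smaller than either input, i.e. the adder must not wrap around."""
    for a, b in itertools.product(range(MAX_VAL + 1), repeat=2):
        out = dut(a, b)
        assert out >= a and out >= b, f"wrap detected: {a}+{b} -> {out}"

requirement_never_wraps(saturating_add)  # exhaustive for 4-bit operands
```

Because the requirement was written without reference to the implementation, a bug common to both (say, a misread of the saturation limit) is less likely to cancel out, which is the independence argument made above.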
Do you think we should be investing more in verification automation, especially when most of a design today comes from reuse?
Brian Bailey – keeping you covered