Verification plan review


Verification plan review is still a manual process. A verification plan is made up of three plans: the test plan, the checks plan, and the coverage plan. With today's technology, I am not aware of any tools that can do an automatic verification plan review based on machine learning.

The first challenge in verification plan review is to have a verification plan at all. The second challenge is to make it a traceable verification plan so that it can be trusted.

What homework is required from the verification plan reviewer?

Building a verification plan requires both specification expertise and verification expertise. The verification plan writer gets time to do a detailed scan of the requirement specification while writing the plan.

The challenge is that the verification plan reviewer will not get the same amount of time. The reviewer will still have to do some homework to make the review effective.

The reviewer should make a quick, independent, high-level verification plan containing sections, subsections, features, configurations, and combinations. This serves as a guideline for reviewing the verification plan.

If this review is conducted as part of a quality overhaul process, the list of bugs discovered so far should be analyzed. From this analysis, the pattern of areas where bugs are being discovered should be identified. Any coverage holes that were waived off, or that still remain to be covered, should also be analyzed to find the pattern of uncovered areas.

The bugs discovered point at the holes, but if there is a hole in one area, there might be holes in other areas as well. So a proactive review can be done rather than waiting for new bugs to be discovered. This proactive review can use the high-level plan created with a fresh perspective to scan for the holes.
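As a minimal sketch of this kind of analysis (the bug list, the `area` tag, and the waived-hole list below are made-up illustrations, not the format of any real tool), the snippet counts discovered bugs and waived coverage holes per area, so the areas that deserve deeper, proactive review stand out:

```python
from collections import Counter

# Hypothetical inputs: exported bug list and waived coverage holes,
# each tagged with the functional area it belongs to.
bugs = [
    {"id": "B101", "area": "link_training"},
    {"id": "B102", "area": "link_training"},
    {"id": "B103", "area": "error_reporting"},
]
waived_holes = [
    {"coverpoint": "cp_speed_change", "area": "link_training"},
]

# Count bugs and waived holes per area; areas ranking high on either
# count are candidates for a deeper, proactive review.
bug_pattern = Counter(b["area"] for b in bugs)
hole_pattern = Counter(h["area"] for h in waived_holes)

for area, count in bug_pattern.most_common():
    print(f"{area}: {count} bugs, {hole_pattern.get(area, 0)} waived holes")
```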

Spotting the holes in the verification plan is easier said than done. It requires both a specification expert and a verification expert. Review is the key tool for this process, and for any review to be fruitful and successful it has to be stimulating.

How to stimulate the verification plan review?

The key stimulation for the verification plan review is the presentation of the verification plan. The verification plan has to be looked at from multiple angles to spot the holes. Again, care should be taken not to duplicate the verification plan in order to create the different views; that would be a disaster. From a single source of the verification plan, it should be possible to generate the following indexed views (a minimal sketch of this single-source approach follows the lists below):

  • Specification indexed view
  • Feature category indexed view
  • End application use case indexed view

It should be possible to visualize all three views as:

  • Hierarchical views, like a mind map
  • Excel-like list views
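A minimal sketch of the single-source idea is shown below. Each plan item carries its specification section, feature, and use case tags (the field names and values here are assumptions chosen for illustration), and each indexed view is just a different grouping of the same items, so nothing is duplicated:

```python
from collections import defaultdict

# Single source of truth: every verification plan item is tagged with the
# spec section, feature, and use case it belongs to (hypothetical fields).
plan_items = [
    {"name": "basic_read",  "spec": "3.1", "feature": "read_path",      "use_case": "boot"},
    {"name": "burst_write", "spec": "3.2", "feature": "write_path",     "use_case": "dma_transfer"},
    {"name": "ecc_error",   "spec": "5.4", "feature": "error_handling", "use_case": "dma_transfer"},
]

def indexed_view(items, key):
    """Group the same plan items by a chosen tag to create a view."""
    view = defaultdict(list)
    for item in items:
        view[item[key]].append(item["name"])
    return dict(view)

spec_view    = indexed_view(plan_items, "spec")      # specification indexed view
feature_view = indexed_view(plan_items, "feature")   # feature category indexed view
usecase_view = indexed_view(plan_items, "use_case")  # end application use case indexed view

print(spec_view)
print(feature_view)
print(usecase_view)
```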

How to go about doing verification plan review?

One of the keys to reviewing verification plans is deciding where to invest the effort. Everything cannot be reviewed uniformly. Based on different criteria, a bit of prioritization and intensified review in certain areas can give good results. For example, areas where bugs are being discovered can be used as an initial starting point for the review. A quick check of the number of items versus the complexity of an area can also give hints about possible weaknesses. Features that were added late to the plan can be targets as well. Instead of reviewing the verification plan from start to end, use some of these insights to get to the weaker areas faster, as sketched below.
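The snippet below is a rough sketch of such prioritization, not a prescribed formula: the area summaries, fields, and weights are assumptions. Each plan area gets a review-priority score from its bug count, its item-count-to-complexity ratio, and whether it was added late.

```python
# Hypothetical per-area summary of the verification plan.
areas = [
    {"name": "link_training", "bugs": 5, "items": 12, "complexity": 8, "added_late": False},
    {"name": "power_mgmt",    "bugs": 1, "items": 3,  "complexity": 9, "added_late": True},
    {"name": "register_map",  "bugs": 0, "items": 40, "complexity": 4, "added_late": False},
]

def review_priority(area):
    """Higher score -> review this area earlier and more intensely."""
    score = 2.0 * area["bugs"]                 # bugs already point at weakness
    if area["items"] < area["complexity"]:     # few plan items for a complex area
        score += 3.0
    if area["added_late"]:                     # late additions got less scrutiny
        score += 2.0
    return score

for area in sorted(areas, key=review_priority, reverse=True):
    print(f'{area["name"]}: priority {review_priority(area):.1f}')
```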

The specification indexed view helps to see the coverage of the verification plan from the requirement specification's point of view. The scenarios, coverage, and checks belonging to each section and sub-section can be quickly checked. This provides the ability to identify holes in the requirement specification coverage.

This alone is not sufficient.

Sometimes a single feature is spread across different sections of the specification. The specification indexed view may not give a clear idea about the complete coverage of such a feature. In those cases, the feature indexed view helps group all the verification items related to a feature in one place. This helps spot holes in the verification of features and combinations of features.

The use case indexed view is also very important because it reflects how the end application uses the design. Although constrained random might already be providing sufficient coverage, it is important to tag those tests, or add specific tests, to cover the use case scenarios. These often cover combinations of features. The use case indexed view makes it possible to spot any critical use cases being missed; note that the specification indexed view or feature indexed view may not cover these specifically.

Hierarchical views provide an easy-to-read picture with the help of structured information, while the Excel-like list view presents the information in an easy-to-process form, using features like filtering or pivot tables to look for specific information or to extract numbers for analysis.
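As a small illustration, the same hypothetical plan items used in the earlier sketch can be flattened into rows and written out as CSV, which is the form that filtering and pivot tables work on:

```python
import csv

# Hypothetical plan items (same shape as in the earlier single-source sketch).
plan_items = [
    {"name": "basic_read", "spec": "3.1", "feature": "read_path",      "use_case": "boot"},
    {"name": "ecc_error",  "spec": "5.4", "feature": "error_handling", "use_case": "dma_transfer"},
]

# Flatten to an Excel-friendly CSV so reviewers can filter and pivot.
with open("verification_plan.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "spec", "feature", "use_case"])
    writer.writeheader()
    writer.writerows(plan_items)
```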

What are additional considerations for constrained random verification methodology?

For verification plans based on the constrained random verification methodology, additional information about what is being randomized should be listed: the variables randomized, the range of randomization, and the frequency of randomization. We cannot just say things are randomized; it should be possible to quickly answer specific questions about what exactly is randomized in the testbench, over what range, and how often.
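One lightweight way to make this answerable is to keep the randomization details as data alongside the plan. The sketch below (the variable names, ranges, and frequencies are illustrative assumptions) records, for each randomized variable, its range and how often it is re-randomized, so the questions "what, what range, how often" can be answered directly:

```python
# Hypothetical randomization summary kept with the verification plan.
randomization_spec = [
    {"variable": "packet_length", "range": (64, 1522), "frequency": "per packet"},
    {"variable": "inter_pkt_gap", "range": (0, 255),   "frequency": "per packet"},
    {"variable": "num_lanes",     "range": (1, 16),    "frequency": "per test (at build time)"},
]

def describe(variable):
    """Answer: what is randomized, over what range, and how often?"""
    for entry in randomization_spec:
        if entry["variable"] == variable:
            lo, hi = entry["range"]
            return f"{variable}: range [{lo}:{hi}], re-randomized {entry['frequency']}"
    return f"{variable}: not randomized"

print(describe("packet_length"))
print(describe("num_lanes"))
```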

An error injection verification plan can bloat very easily without really adding much value. Use some of these guidelines to formulate the error injection verification plan, and use these tips to prioritize its execution.

Sometimes constrained random may not really be required. Take PCIe Advanced Error Reporting (AER) verification as an example, and consider the problem of interrupt generation for different error types under different values of the masks and enables. All combinations have to be exercised if bugs are being discovered in this area, and constrained random may delay discovering the scenarios. Where the scenarios can be enumerated, all of them are of almost the same importance, and the state space is reasonable, there is no point in waiting for constrained random to hit them. Enumerate them and create programmable tests to sweep through the possible scenarios and combinations.
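A minimal sketch of the enumerate-and-sweep idea is below. The error types, mask, and enable values are placeholders rather than the actual PCIe AER register encodings, and `run_error_injection_test` is a hypothetical stand-in for launching one directed test; the point is that the full cross product is small enough to walk through directly instead of waiting for constrained random to hit it.

```python
from itertools import product

# Placeholder enumerations; real PCIe AER has its own error types and
# per-error mask/severity/enable bits.
error_types = ["poisoned_tlp", "completion_timeout", "ecrc_error"]
mask_values = [0, 1]      # error masked / unmasked
enable_vals = [0, 1]      # interrupt reporting disabled / enabled

def run_error_injection_test(error, mask, enable):
    """Stand-in for launching one directed test with this combination."""
    print(f"run: error={error} mask={mask} enable={enable}")

# Sweep every combination: 3 x 2 x 2 = 12 directed scenarios.
for error, mask, enable in product(error_types, mask_values, enable_vals):
    run_error_injection_test(error, mask, enable)
```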

Covering features under all configurations and combinations is another challenge. In some cases it may not be possible to cover all the features in all the legal scenarios by sweeping everything. Not all configurations and combinations are equally important. Here, weighted random distribution combined with functional coverage is the way to go.
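The sketch below shows the weighted-distribution idea in miniature; the configuration names and weights are assumptions, and the `covered` set is only a stand-in for a real functional coverage model. Important configurations are drawn more often, while the coverage record shows which ones remain as holes.

```python
import random

# Hypothetical configurations with relative importance as weights.
configs = ["x1_gen3", "x4_gen4", "x16_gen5"]
weights = [1, 3, 6]          # x16_gen5 is the most important, hit it most often

covered = set()              # stand-in for a functional coverage group
random.seed(0)

for _ in range(20):          # 20 randomized test runs
    cfg = random.choices(configs, weights=weights, k=1)[0]
    covered.add(cfg)         # in a real flow, the coverage model samples this

print("covered:", sorted(covered))
print("holes:  ", sorted(set(configs) - covered))
```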

The testbench architecture should be reviewed to ensure it allows combinations of features under different configurations. Spread the configurations across different test cases.

 
