Functional verification – Quality improvement


Design quality is highly cared for, but it is important to realize that design quality cannot be achieved without functional verification quality. When functional verification is done from scratch, the principles of the ZVM: Verification methodology for quality can be applied to achieve quality. What if it is too late for that? Yes, there is still a chance: a functional verification quality overhaul can restore the quality.

Before going deeper into the functional verification quality overhaul, it is important to understand what functional verification quality is and what leads to poor functional verification quality.

A functional verification quality overhaul requires a good understanding of the bigger picture of functional verification. Functional verification execution has three phases: the planning phase, the development phase and the regression phase. Poor functional verification quality is the net result of a poor planning phase, a poor development phase and a poor regression phase. Based on the symptoms of poor quality, the right proportions of the following guidelines have to be adopted to restore the quality.

Yes, a quality overhaul is going to take time and resources. Sorry, there are no shortcuts.


What are the signs that indicate a verification quality improvement is required?

Functional verification has mostly been signed off. Just a few days later, a burst of bugs is discovered. It is certainly a stressful situation.

The reflex response to this scenario is to jump in, add tests for the bugs discovered and patch up the functional coverage. This goes on for a few rounds, but the bursts are not stopping. There is a pause, but after a few peaceful days the next burst of bugs comes in.
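The recurring-burst symptom can be made measurable rather than anecdotal. Below is a minimal, purely illustrative Python sketch (function and data are hypothetical; in practice the counts would come from your bug-tracking system) that flags when the post-signoff bug discovery rate is not decaying:

```python
from statistics import mean

def burst_persists(bugs_per_week, window=3):
    """Heuristic: compare the mean bug rate of the most recent
    `window` weeks against the preceding `window` weeks.
    Returns True when the recent rate has not dropped, i.e. the
    post-signoff bug bursts are not dying down."""
    if len(bugs_per_week) < 2 * window:
        return False  # not enough history to judge
    recent = mean(bugs_per_week[-window:])
    earlier = mean(bugs_per_week[-2 * window:-window])
    return recent >= earlier

# Weekly bug counts that keep spiking instead of decaying after sign-off:
print(burst_persists([9, 4, 1, 0, 6, 5, 7]))   # → True (bursts persist)
# Weekly bug counts showing a healthy decay:
print(burst_persists([9, 6, 4, 2, 1, 0, 0]))   # → False
```

When this kind of trend stays flat or rising across several regression windows, point fixes are unlikely to be enough, which is exactly the signal discussed in the following paragraphs.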

More tests are added, existing constrained random tests are run with additional seeds, and some of the missed configurations discovered from the updated functional coverage are enabled. A flurry of new bugs starts showing up even in the internal regressions. Most of them also turn out to be test bench issues. Ah, we did not think of these scenarios. This configuration was not enabled in the constrained random tests. Oh! We were not even sure this combination was required; our test bench architecture does not support it. The scoreboards will not survive this error injection. The end-of-test checks will not withstand it. This is disturbing, especially at this point.
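One concrete way to surface missed configurations is to cross all the configuration dimensions and diff the result against what the regression actually exercised. A minimal Python sketch, with hypothetical dimension names (real flows would use the coverage database of a SystemVerilog covergroup cross):

```python
from itertools import product

def coverage_holes(dimensions, exercised):
    """Cross all configuration dimensions and report the
    combinations that no test in the regression ever exercised."""
    all_combos = set(product(*dimensions.values()))
    return sorted(all_combos - set(exercised))

# Hypothetical DUT configuration space (names are illustrative):
dims = {
    "data_width": (32, 64),
    "error_inject": (False, True),
}
# Combinations actually hit by the regression, in dimension order:
hit = [(32, False), (64, False), (32, True)]

print(coverage_holes(dims, hit))  # → [(64, True)]
```

Each reported hole is a candidate for a new test, and, as the text above notes, sometimes for the uncomfortable discovery that the test bench architecture cannot even generate it.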

Adding more tests for the point issues may not fix the problem. It may be a sign that the rabbit hole is deeper than it appears, and that a bigger picture of the problem is required to figure out the right solution. It is a sign that the functional verification quality has problems. A quality overhaul may be required.

What is functional verification quality improvement?

A functional verification quality overhaul can range from something as drastic as moving away completely from the current verification setup, to adding a batch of new tests and new functional coverage in different areas. A legacy test bench quality overhaul is one example where the drastic measure of moving away from the existing verification setup may be required.

The first step of a quality overhaul is accepting and acknowledging that it is required. This is very important. It will take time and resources. Quality is not cheap; good quality always comes with a price tag attached. The choice is: do you want it or not?

Functional verification is broadly a three-phase process: the planning phase, the development phase and the regression phase. Quality lapses in each of these phases contribute to the overall poor quality. A quality overhaul is needed when varied levels of process lapses have taken place in each of these phases and no attempt was made to recover in time.

A poor planning phase, poor development phase and poor regression phase each show up as distinct symptoms of poor execution; recognizing them helps in understanding which phase is causing the problems. The quality overhaul process should go bottom up: first fix the mistakes of the regression phase, then the development phase and finally the planning phase. After the new updates are identified, the normal flow of the development phase and regression phase is repeated.

Now the question may arise: should the reactive point-problem fixing that has already started be stopped altogether for the quality overhaul? Yes, if you can. But the answer will be no in most practical situations. It may not be possible to stop the reactive bug fighting, but make sure you set up parallel resources for the quality overhaul as well.

Is functional verification quality improvement a one-time process?

A quality overhaul is not a one-time process. Verification environments, like cars, need periodic servicing to stay in shape and deliver quality results. If it is done periodically, the time and resources required will reduce.

Why does need for functional verification quality improvement arise?

You cannot just keep focusing on extracting results out of the test bench all the time; sometimes you need to pay attention to the test bench itself as well. The need for a quality overhaul of functional verification arises primarily for two reasons: abuse during deadlines, and code fatigue due to aging.

Verification environments are subjected to rigorous execution and go through the demanding rough terrain of releases and tapeouts. This is when a lot of good practices and processes are overridden to get things done, and that causes wear and tear to the test bench code. It is understandable; it is the demand of the hour, and getting things done is why the test bench was put in place in the first place. But afterwards the test bench needs to be serviced, repaired and restored if you want it to last long. As the test bench goes through more and more of these events, it needs periodic servicing to stay fit.

Test bench aging should not be laughed off. After five years and three changes of hands, test bench code will start showing signs of fatigue. This is only natural, because the original intent keeps diluting as time passes and hands change. The result of this fatigue shows up as poor quality results.

This phenomenon is well accepted in other areas of engineering such as civil, mechanical or electrical engineering. The same needs to be accepted in the verification engineering world as well. Code fatigue should be recognized and accepted in the software world. We need code garages where code is refactored and restored.

How to go about doing functional verification quality improvement?

If you think it makes sense to do a quality overhaul, read on. Generic guidelines for the quality overhaul are captured in the following flow chart. Note that these are generic guidelines; the extent to which each needs to be applied for restoration will vary based on the assessment of each test bench.

The quality overhaul is a four-phase process:
