Author: admin

  • Effective randomization in Constrained Random Verification

    One of the key components of a coverage-driven constrained random verification environment is randomization. High-level verification languages provide various constructs to implement randomization; however, those constructs are not the focus of this article.

    In spite of rich randomization constructs, many constrained random verification environments fail to achieve optimum results due to either excessive or insufficient randomization. This article focuses on how to strike that balance.

    How do we use randomization effectively to meet the verification goals? Let’s find the answers by asking more questions and answering them below. We can call these the requirements for randomization. Next in the series we will consider how to meet these requirements in one of the popular HVLs.

    Why do we use randomization?

    Consider a design whose state space and feature combinations are so large that it is practically impossible to enumerate and cover all of them exhaustively. This is a scenario well suited to randomization. Randomizing the stimulus explores the state space and combinations that we cannot manually enumerate.
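
    Below is a minimal SystemVerilog sketch of this idea; the packet fields, constraint ranges and error-injection weights are hypothetical, chosen only to show how constraints keep the randomized stimulus inside the legal space while still exploring it.

      // Minimal sketch of constrained randomization in SystemVerilog.
      // Fields, ranges and weights are hypothetical.
      class packet;
        rand bit [7:0] payload_len;
        rand bit [3:0] channel;
        rand bit       is_error;

        // Keep stimulus inside the legal space instead of enumerating it by hand.
        constraint c_legal {
          payload_len inside {[4:64]};
          channel < 8;
        }

        // Bias error injection so it is rare but not absent.
        constraint c_error_rate {
          is_error dist {0 := 95, 1 := 5};
        }
      endclass

      module tb_packet;
        initial begin
          packet p = new();
          repeat (10) begin
            if (!p.randomize())
              $error("randomization failed");
            $display("payload_len=%0d channel=%0d is_error=%0b",
                     p.payload_len, p.channel, p.is_error);
          end
        end
      endmodule
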
    (more…)

  • How to improve UVM use effectiveness?

    UVM is primarily based on a framework reuse model.

    Effective use of the Universal Verification Methodology (UVM) requires an understanding of what effective framework reuse demands.

    Effective framework reuse requires the user to do a bit of deep diving. It cannot be achieved just by reading the documentation or playing around with the examples; those are necessary but not sufficient. For a framework reuse model to be effective, a certain level of familiarity with the framework source code is required.

    The user needs to clearly understand the verification problem to be solved. The framework is one of the possible solutions, but the solution offered by a framework is not complete on its own. It is a partial solution: a guiding skeleton that the user must customize and populate with application-specific code to make it a complete solution.
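
    A minimal sketch of what that customization can look like is shown below, assuming a hypothetical application-specific environment class my_env; the framework supplies the base classes, phasing and factory, and the user-written test fills in what gets built for this application.

      // Sketch of populating the UVM skeleton with application-specific code.
      // my_env and my_base_test are hypothetical names.
      import uvm_pkg::*;
      `include "uvm_macros.svh"

      // Hypothetical application-specific environment.
      class my_env extends uvm_env;
        `uvm_component_utils(my_env)
        function new(string name, uvm_component parent);
          super.new(name, parent);
        endfunction
      endclass

      class my_base_test extends uvm_test;
        `uvm_component_utils(my_base_test)

        my_env env;

        function new(string name, uvm_component parent);
          super.new(name, parent);
        endfunction

        // The framework supplies the phasing and factory; the user decides
        // what gets built and configured for this application.
        function void build_phase(uvm_phase phase);
          super.build_phase(phase);
          env = my_env::type_id::create("env", this);
        endfunction
      endclass
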
    (more…)

  • Verification quality improvement – Test bench

    Test bench quality overhaul involves three primary activities: cleaning, trimming and refactoring.

    The first step of a quality overhaul is cleaning and trimming. Verification solutions with quality problems have typically developed weeds: redundant and useless growth in the code. Weeds are a distraction, so in order to focus on the right problems the first step is to trim and clean them out.

    Trim the verification plans and the regression lists of test variants and seeds per test. Trim the tests. Trim the test benches. Be ruthless: clean, trim and cut it down. Trim every aspect of the functional verification.

    Cleaning test bench

    Cleaning the test bench consists of three steps: cleaning compile and run time errors, cleaning redundant files and cleaning dead code. More details are described in Cleaning test bench.

    Refactoring test bench

    According to Wikipedia, refactoring is the process of restructuring existing computer code (changing the factoring, or decomposition) without changing its external behavior.
    (more…)

  • Testbench quality improvement: Refactoring test bench

    The first step in refactoring the test bench code is to identify the code that requires refactoring.

    Poor code in the test bench, typically bloated or showing instability, limitations or partial implementation, should be identified. Identification can be based on areas flagged during debug, previous bug history and feedback from verification engineers.

    Let’s look at some of the test bench components that can have a major impact on quality and how to address them.

    Poor quality bus functional model (BFM)

    Bus functional models are the pillars of the test bench. The quality of the BFMs used can have a significant impact on overall quality. Poor quality BFMs can hide real bugs, raise false failures and destabilize the regression.
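
    As one illustration, below is a minimal sketch of a BFM access task that fails loudly rather than hiding problems; the interface signals and the timeout value are hypothetical.

      // Sketch of a BFM read task that fails loudly instead of hiding problems.
      // The interface signals and the 100-cycle timeout are hypothetical.
      interface simple_bus_if (input logic clk);
        logic        req;
        logic        ack;
        logic [31:0] rdata;

        // Bound the wait for ack so a missing response becomes a visible
        // error instead of a silent hang or a false pass.
        task automatic read(output logic [31:0] data);
          int unsigned cycles = 0;
          req <= 1'b1;
          @(posedge clk);
          while (!ack) begin
            if (cycles++ > 100) begin
              $error("BFM read: no ack within 100 cycles");
              data = 'x;
              req  <= 1'b0;
              return;
            end
            @(posedge clk);
          end
          data = rdata;
          req  <= 1'b0;
        endtask
      endinterface
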
    (more…)

  • Testbench quality improvement: Refactoring tests

    The first step in refactoring tests is to review them to identify possible reductions. Reduce the number of tests needed to provide the necessary coverage. This will not provide an immediate increase in verification coverage, but it is an opportunity to set things right for future maintenance. Any opportunity to group related tests to reduce code redundancy should be spotted.

    Sometimes, to add a single additional scenario, multiple related tests have to be updated individually because they are kept separate. If two tests share more than 60 % of their code, they should be merged. Tests that are no longer valid might still be running; eliminate them. Tests might be running for configurations and feature combinations that are no longer valid; eliminate them as well.
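    A minimal UVM-style sketch of such a merge is shown below; the class names and the num_packets knob are hypothetical, the point being that the shared code moves into a base test and each former standalone test keeps only what differed.

      // Sketch: two tests that shared most of their code, regrouped under
      // a common base so shared updates happen in one place.
      // Class names and the num_packets knob are hypothetical.
      import uvm_pkg::*;
      `include "uvm_macros.svh"

      class traffic_base_test extends uvm_test;
        `uvm_component_utils(traffic_base_test)

        // Shared setup, sequences and checks live here.
        int unsigned num_packets = 100;

        function new(string name, uvm_component parent);
          super.new(name, parent);
        endfunction
      endclass

      // Each former standalone test shrinks to the part that differed.
      class short_traffic_test extends traffic_base_test;
        `uvm_component_utils(short_traffic_test)
        function new(string name, uvm_component parent);
          super.new(name, parent);
          num_packets = 10;
        endfunction
      endclass

      class long_traffic_test extends traffic_base_test;
        `uvm_component_utils(long_traffic_test)
        function new(string name, uvm_component parent);
          super.new(name, parent);
          num_packets = 10000;
        endfunction
      endclass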

    Additionally, refactoring reviews should spot tests that are not meeting the intent called out for them in the test plan. For large verification projects this is a big challenge. Selective reviews should be conducted around areas where bugs are being discovered, tests that have frequent false failures, tests that have been passing for a very long time, and tests that designers and verification engineers have complained about. These are some of the ideas; essentially, the review process needs to come up with smart criteria to maximize the return on review.

    Maintaining regression history can be very helpful in identifying the tests to be reviewed. The bulk of the cleaning can be completed up front, but periodic audits and reviews have to be conducted to keep things that way.

  • Testbench quality improvement: Cleaning test bench

    The first step in the test bench quality overhaul is cleaning the test bench code.

    Cleaning compile and run time warnings

    Review the compile warnings in the compile log file. This should be done periodically, as some compile warnings can turn into bugs later.

    Review the run time warnings. It is possible that some constraints are failing and not getting caught because the result is not checked, that some checks were downgraded from error to warning and never re-enabled, or that simulator tool warnings are being ignored.
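
    For the constraint case, one common way a failure slips through silently is an unchecked randomize() call; a minimal sketch of turning that into a hard error is shown below, with a hypothetical transaction class.

      // Sketch: turn silent constraint failures into hard errors by checking
      // the result of randomize(). The txn class is hypothetical.
      class txn;
        rand bit [7:0] addr;
        constraint c_range { addr inside {[16:31]}; }
      endclass

      module tb_txn;
        initial begin
          txn t = new();
          repeat (5) begin
            // An unchecked randomize() failure is often only a simulator
            // warning; checking the return value makes it a fatal error.
            if (!t.randomize())
              $fatal(1, "randomization failed: conflicting or invalid constraints");
            $display("addr=%0d", t.addr);
          end
        end
      endmodule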

    In fact, it’s best to add a check on the compile and run time warnings as part of the check-in regression to keep them under control.

    Cleaning redundant files

    Temporary files created during development that are no longer useful can accumulate over time.
    (more…)

  • Verification quality improvement – Regression & Debug

    The regression phase is the climax of functional verification. This is the phase during which the real value of functional verification is realized. That’s why this phase has to be highly efficient to meet the planned verification objectives. By nature this phase is very hard to keep under a strict schedule. The only way to keep it under control is to have a highly efficient flow for managing regression and debugging.

    Improve the efficiency of the regression and debug process to support the reactive firefighting needed to meet the deadline.

    Regression productivity

    Full regression is a process that requires religious commitment to maintain quality. It contains elements that require repetition.

    Regressions are repetitive tasks. The efficiency of the overall regression process significantly affects the productivity of the verification effort and the team. Frequent repetition can only be effective when it is automated.
    (more…)

  • Functional verification – Quality improvement

    Design quality is highly cared for. But it’s important to realize that design quality cannot be achieved without functional verification quality. When the functional verification is done from scratch, the principles of ZVM: Verification methodology for quality can be applied to achieve quality. What if it’s too late for that? There is still a chance to do a functional verification quality overhaul to restore quality.

    Before we look deeper into the functional verification quality overhaul, it’s important to understand what functional verification quality is and what leads to poor functional verification quality.

    A functional verification quality overhaul requires a good understanding of the bigger picture of functional verification. There are three phases of functional verification execution: the planning phase, the development phase and the regression phase. Poor functional verification quality is the net result of a poor planning phase, poor development phase and poor regression phase. Based on the symptoms of poor quality, the right proportions of the following guidelines have to be adopted to restore quality.
    (more…)

  • Verification quality improvement – Legacy test benches

    Legacy test benches are the ones based on hardware description languages (HDLs) like VHDL or Verilog, with or without high-level languages such as C or C++. These test benches are not based on the coverage driven constrained random approach, although they may contain randomization in a rudimentary form.

    They were typically designed for the first generation of older designs. Such a test bench would have provided the necessary coverage at that point, but as newer revisions show up with increased complexity, a test bench based on legacy technologies may not be able to do a good job.

    In fact, beyond a certain point of design complexity, the effort to verify a feature to the same coverage in a legacy test bench can become significantly higher than in test benches driven by the latest HVLs and verification methodologies. This is because of the lack of built-in support for the constructs that aid constrained random verification. Finding engineers to maintain and update the legacy test benches can also become challenging.
    (more…)

  • Verification quality improvement: Verification plan

    The root of most functional verification quality problems lies in the quality of the verification plan and its management. So a key component of the verification quality overhaul is a quality overhaul of the verification plans.

    The verification plan is a seed. A bad seed grows into a tree with bitter fruits. Poor verification quality is the result of poor quality in the planning phase.

    The verification plan consists of three plans: the test plan, the checks plan and the coverage plan. In a coverage driven constrained random approach the test plan and checks plan tend to get ignored. This is dangerous. The test plan and checks plan are equally important, in fact more important than the functional coverage plan, because they describe what needs to be achieved and how it needs to be achieved.
    (more…)