Author: admin

  • Verification plan review

    Verification plan review is still a manual process. A verification plan is made up of three plans: the test plan, the checks plan, and the coverage plan. As of today’s technology, I am not aware of any tool that does automatic verification plan review based on machine learning.

    The first challenge in verification plan review is having a verification plan at all. The second challenge is making it a traceable verification plan, so that it can be trusted.

    What homework is required from the verification plan reviewer?

    Building a verification plan requires both specification expertise and verification expertise. The verification plan writer gets time to do a detailed scan of the requirements specification while writing the plan.

    The reviewer, however, will not get the same amount of time. The verification plan reviewer will still have to do the homework to make the review effective.
    (more…)

  • Traceable verification plan

    A traceable verification plan is one where the following are all stitched together:

    Test plan <-> Tests <-> Test variants <-> Regression status 

    There is accountability for every test in the test plan: yet to be written, passing, or failing. A test added to the test plan is guaranteed to be executed. A traceable verification plan can be trusted; it is the anchor for functional verification quality.
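
    A minimal sketch of this stitching, in Python with hypothetical names (the original does not prescribe any tool or format), could look like:

    ```python
    from dataclasses import dataclass, field
    from enum import Enum

    class Status(Enum):
        NOT_WRITTEN = "not written"   # accountability even before the test exists
        PASSING = "passing"
        FAILING = "failing"

    @dataclass
    class TestVariant:
        name: str                     # e.g. a seed or configuration variant
        regression_status: Status = Status.NOT_WRITTEN

    @dataclass
    class Test:
        name: str
        variants: list[TestVariant] = field(default_factory=list)

    @dataclass
    class TestPlanItem:
        requirement: str              # traces back to the specification
        tests: list[Test] = field(default_factory=list)

        def is_trusted(self) -> bool:
            """A plan item can be trusted only when every variant passes."""
            return bool(self.tests) and all(
                v.regression_status is Status.PASSING
                for t in self.tests for v in t.variants
            )
    ```

    With such a chain in place, every plan item can answer for itself whether it is still pending, passing, or failing in regression.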

    Right now the concept is called out in the context of the test plan, the one of the three plans of the verification plan where traceability matters most. Ideally it should be implemented for all three plans.

    Why do we need a traceable verification plan?

    The three phases of functional verification are: planning phase -> development phase <-> regression phase. Requirements specifications are not really written with functional verification in mind.
    (more…)

  • Verification plan review – First Challenge

    Review of verification plans can be a very challenging process. The first challenge is the presence of the verification plan itself; if present, the next is matching it with the latest status. In many cases verification plans are created initially but are not kept up to date.

    Three possible scenarios, the good, the bad, and the ugly, based on the status of the verification plan are captured below.

    • The verification plan itself does not exist – yes, this can happen in certain cases where verification has just evolved. This is the ugly.
    • Verification plans were created initially but never updated – the current state of execution is far different from what is in the plans, but it is agreed that the plans are not up to date. This is the good.
    • Verification plans are present but their update status is unclear – the plans were updated at times and are thought to be completely up to date, yet they cannot be correlated 1:1 to tests and regression results. This is the bad.
    Verification plan – good, bad & ugly

    The ugly is just plain ugly. It’s raw, and hence it’s clean to fix.
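
    To make the distinction concrete, here is a hypothetical sketch (not an existing tool) of the 1:1 correlation check that separates these scenarios:

    ```python
    def classify_plan(plan_exists: bool, agreed_out_of_date: bool,
                      planned_tests: set[str], regression_tests: set[str]) -> str:
        """Hypothetical classifier for the good/bad/ugly scenarios above."""
        if not plan_exists:
            return "ugly"                      # no plan at all
        if agreed_out_of_date:
            return "good"                      # stale, but everyone agrees it is
        never_run = planned_tests - regression_tests   # planned, never executed
        unplanned = regression_tests - planned_tests   # executed, never planned
        if never_run or unplanned:
            return "bad"       # believed current, yet cannot correlate 1:1
        return "traceable"     # plan and regression correlate 1:1
    ```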
    (more…)

  • Functional verification quality – Poor regression phase

    Poor quality in the regression phase is like spoiling the climax of functional verification. If the right things have been done in the planning and development phases, the good news is that it is easy to recover from quality lapses in the regression phase.

    If the planning and development phases also had poor execution, the regression phase can become hell. The effects of a poor planning phase and a poor development phase are not mixed in here; the focus is on the effects of a poor regression phase on functional verification quality.
    (more…)

  • Functional verification quality – Poor development phase

    Poor quality in the development phase results in a poor quality test bench and tests. It has the second biggest impact on functional verification quality, after a poor planning phase.

    One of the early symptoms of a poor quality test bench is schedule slips in writing tests and getting them to a passing state. Poorly architected test benches make the test writer’s job difficult: they do not provide adequate hooks and abstractions, which forces test writers to write a lot of additional test-specific code. Coupled with this, test bring-up delays caused by struggles with debugging test bench code, and test bench bugs themselves, are early symptoms.
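
    As an illustration of such hooks (a simplified Python sketch, since the post does not name a language or framework), a scoreboard that exposes override points lets tests customize behavior without duplicating code:

    ```python
    class ScoreboardBase:
        """Test bench component exposing hooks so tests avoid copy-paste code."""

        def check(self, expected, actual):
            if self.normalize(expected) != self.normalize(actual):
                self.on_mismatch(expected, actual)

        def normalize(self, items):
            """Hook: default is pass-through; tests override as needed."""
            return items

        def on_mismatch(self, expected, actual):
            """Hook: default is to fail hard."""
            raise AssertionError(f"expected {expected}, got {actual}")

    class OutOfOrderScoreboard(ScoreboardBase):
        """An interface that completes out of order overrides one hook,
        instead of re-implementing the whole comparison."""

        def normalize(self, items):
            return sorted(items)
    ```

    Without hooks like these, every test writer ends up re-implementing the comparison loop with test-specific tweaks.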

    A feature update to the test bench, especially during the middle of development, that shows longer than average bring-up delays and causes significant regression instability, with a large number of existing passing tests failing frequently, is also a clear sign of potentially bad architecture and implementation around that feature area.
    (more…)

  • Functional verification quality – Poor planning phase

    Poor quality in the planning phase leads to poor verification plans and poor test bench architecture. This is a seed: it grows into a tree as the project progresses and yields bitter fruits later. A poor planning phase has the biggest impact on functional verification quality.

    The bulk of the planning activity, about 60-80%, completes during the planning phase of the project; the rest is completed during the development phase. This provides an opportunity to recover from some of the mistakes of the planning phase during the development phase.

    If quality problems are found with the planning phase deliverables during the development phase, it’s best to fix them at that point rather than ignoring them or putting them off until later. It’s a well known fact that the cost of fixing mistakes increases exponentially as the project progresses.
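
    As a rough illustration (the 5x multiplier per phase is an assumption made for this example, not measured data):

    ```python
    # Illustrative only: assume each later phase multiplies the fix cost ~5x.
    phases = ["planning", "development", "regression"]
    for i, phase in enumerate(phases):
        print(f"Fixing a planning mistake in the {phase} phase: ~{5 ** i}x cost")
    # planning: ~1x, development: ~5x, regression: ~25x
    ```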
    (more…)

  • Functional verification quality

    Functional verification quality has a big impact on the quality of the design.

    Quality and functional verification are very closely related. The whole objective of functional verification is to build a high quality design. To do that, all aspects of functional verification itself should be of high quality.

    As a part of ZVM: Verification methodology for quality, we have already seen that the science, art, and religion of verification need to be applied together to achieve high quality verification, which in turn increases the chances of achieving a high quality design. This is all good, but what if, for some reason, the process was not followed and now we have quality issues with the functional verification? What to do now?

    Here is a series of articles to help restore the quality of functional verification. There can be varying levels of quality issues, and each needs a different application of these generic principles to restore quality. These are generic guidelines to help detect the causes, along with a series of steps to address them. A firm commitment to invest time and resources is required to restore quality. Quality always comes with its own price tag.
    (more…)

  • Tasks management tools

    We have seen that project execution is a very dynamic activity. Any planning tool that is static in nature will fail to meet these dynamic demands.

    What are static planning tools? Word documents for the test plans, coverage plans, and checks plans: they can be generated for review but not for execution. Excel sheets are likewise out for task tracking and planning. Both of these are very static in nature.

    Why do we consider them static? Note that plans are the property of the team and do not belong to any individual, so the plan management system should allow a multi-user interface. Microsoft Word and Excel are great tools, but consider their multi-user story: even version control tools cannot automatically merge them due to their binary formats. This makes them unfriendly for multi-user operation, and when the medium used for the plans does not provide a clean multi-user interface, it hinders teamwork during execution.

    The strong recommendation is to put good task management software in place. There are plenty out in the market. If you are a big corporation and want to shed some weight from your wallet, get a custom one built; if you are not, go with an off-the-shelf product. Task management tools provide an interface for multi-user operation, which means everyone can do updates, and it reduces unnecessary email traffic.
    (more…)

  • Planning – Milestones

    After detailed task lists are in place, it’s time to schedule them for execution by assigning real timelines and engineers. The first step before scheduling is setting up the milestones. A milestone is a grouping of a set of meaningful functionalities. Milestones should be ordered based on the priority of the functionality, and expected dates should be put on the milestones.

    Now tag each of the tasks with the corresponding milestone. The number of man-days of tasks adding up in each milestone gives an idea of the time it would require. If there is a big difference between the duration available and the number showing up from the task list, consider adding more engineers if possible, or redefining the milestones; a minimal roll-up is sketched below.
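
    A minimal roll-up of man-days per milestone (a Python sketch; the task names and numbers are made up) could look like:

    ```python
    from collections import defaultdict

    # Hypothetical task list: (task, milestone, man-days).
    tasks = [
        ("write smoke tests",   "M1", 5),
        ("bring up scoreboard", "M1", 8),
        ("interface B tests",   "M2", 12),
    ]
    available_days = {"M1": 10, "M2": 15}   # duration available per milestone

    effort = defaultdict(int)
    for _, milestone, man_days in tasks:
        effort[milestone] += man_days

    for milestone, needed in sorted(effort.items()):
        gap = needed - available_days[milestone]
        if gap > 0:
            # Options: add engineers (if possible) or redefine the milestone.
            print(f"{milestone}: over by {gap} man-days")
    ```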

    One simple and obvious rule for task assignment is to group related tasks and assign them to the same engineer. It takes effort for engineers to build context, and they should be allowed to make full use of it. Randomly spreading tasks across engineers causes unnecessary waste of the time invested in building context. Building new contexts and dealing with context switching increases the pressure on the engineers. It’s best avoided unless it’s a must for meeting a critical deadline. If such critical deadlines come up very often, consider it bad planning.
    (more…)

  • Testbench architecture

    According to MSDN, “Software application architecture is the process of defining a structured solution that meets all of the technical and operational requirements, while optimizing common quality attributes such as performance, security, and manageability. It involves a series of decisions based on a wide range of factors, and each of these decisions can have considerable impact on the quality, performance, maintainability, and overall success of the application”.

    The definition of test bench architecture is not radically different from this.

    Architecting a test bench is a four-step process.

    The first step is understanding the prerequisites. These prerequisites set the requirements for the architecture.

    The second step is dividing the functionality into various components that meet the requirements, and mapping them to classes.

    The third step is validation, to check if the architecture has any inconsistencies.

    The fourth and final step is delivering the architecture to execution by packaging it in documents and abstract classes.
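
    As an example of the final step, “packaging in abstract classes” can be read as delivering skeletons that fix the component boundaries while leaving the implementations open. A minimal Python sketch (the component names are illustrative, not from the original):

    ```python
    from abc import ABC, abstractmethod

    class Driver(ABC):
        """Architecture contract: how stimulus reaches the design."""

        @abstractmethod
        def send(self, transaction) -> None: ...

    class Monitor(ABC):
        """Architecture contract: how design activity is observed."""

        @abstractmethod
        def observe(self): ...

    class Testbench(ABC):
        """Top-level skeleton fixing the boundaries decided in step two."""

        def __init__(self, driver: Driver, monitor: Monitor):
            self.driver = driver
            self.monitor = monitor
    ```

    Implementers can then fill in concrete drivers and monitors without renegotiating the architecture.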

    Architecting testbench – 4 steps process

    Step 1: Testbench architecture: Prerequisites

    There are four key prerequisites that provide the requirements for the test bench architecture process.

    Test bench architecture: 4 key driving factors

    It might seem tempting to jump directly into test bench architecture with a partial understanding of the prerequisites, or even skipping some.
    (more…)