There are a couple of additional challenges to deal with while architecting the testbench: ambiguity in the requirements specification, and the occasional odd feature that does not fit the mostly completed testbench architecture. The following sections provide some ideas on how to deal with them.
Handling ambiguity
One of the challenges during architecture is handling ambiguity. New products, by their very nature, come bundled with a certain level of ambiguity. The specifications themselves are evolving, and it's not practically possible to have clarity on everything during the architecture phase. So it's inevitable that architects have to devise ways to deal with the ambiguity.
Testbench architecture validation is the process of checking whether the architecture meets the verification requirements. Additionally, check for inconsistencies and complexity. In some cases complexity may be a necessity, but validation should ensure that the complexity is manageable.
Review
It's important to validate the testbench architecture. Review is one of the most effective tools for architecture validation. Even when it's not possible to get inputs from reviewers, a self-review is equally valuable. Some tips for conducting the review follow.
One effective way to validate the architecture is to run through the key categories of features from the verification plan.
Divide the functionality into various blocks or layers. Map the blocks to classes. Designate the classes as data classes or processing classes. Clearly define the interfaces between the classes for communicating information and achieving synchronization.
Data classes do not have any threads for processing, while processing classes have one or more internal threads to process the data classes.
Data classes model transaction classes or helper classes. Transaction classes represent a unit of data that stays together. Examples of transaction data classes are the protocol data units (PDUs) of a bus protocol, or an image to be processed by an image processing engine. Helper classes come in a variety of forms, containing various related tasks and functions: forming different types of protocol data units from discrete information, implementing various encoding and decoding algorithms, managing sequence numbers, and so on.
For processing classes, the number of threads, the blocking and unblocking conditions for the threads, and pseudo code for the functionality of each thread should be specified.
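The split between data classes and processing classes can be sketched in a few lines. This is a hedged, language-agnostic illustration in Python rather than an HVL; the names `BusPdu` and `Driver` are hypothetical, and the queue stands in for the thread's blocking condition.

```python
import queue
import threading
from dataclasses import dataclass

# Data class: a unit of data that stays together. It owns no threads.
@dataclass
class BusPdu:
    addr: int
    data: int

# Processing class: owns a single internal thread that blocks on an
# input queue and unblocks whenever a PDU arrives.
class Driver:
    def __init__(self):
        self.in_q = queue.Queue()
        self.sent = []
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _run(self):
        while True:
            pdu = self.in_q.get()   # blocking condition: wait for a PDU
            if pdu is None:         # sentinel used to stop the thread
                break
            self.sent.append(pdu)   # stand-in for driving the bus

    def stop(self):
        self.in_q.put(None)
        self._thread.join()

driver = Driver()
driver.in_q.put(BusPdu(addr=0x10, data=0xAB))
driver.stop()
print(len(driver.sent))  # 1
```

The same pattern scales to multiple threads per processing class; the architecture document would then specify each thread's blocking and unblocking conditions separately.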
Architecture involves making choices. Knowledge of the key requirements is essential to make the right choices.
To get the architecture right, the specified prerequisites should be at the top of the architect's mind. Even if not in full detail, the key aspects of all four areas, namely the DUT, the verification plan, the application world abstraction strategy, and the reuse strategy, should be fresh in the testbench architect's mind. These key details help in making choices among alternatives by evaluating the pros and cons of each approach.
Holding all the key requirements in mind at once is a challenge for the human mind. To handle this effectively, the architect should prepare various specification summaries containing the key points. A mind map is one tool that is very helpful in capturing the various requirements for this purpose.
One of the keys to getting the functional division and reusability right in the testbench architecture is to think of it as a product to be sold, either as a full solution or in parts. For example, think of the bus functional models (BFMs), the scoreboard, and the functional coverage as individually licensable components. This thought process, although virtual in many cases, guides you in the right direction.
Let's borrow some concepts from the software world, which is older and wiser. The key to good architecture is "cohesion and coupling". Cohesion is about related things staying together; coupling is about the level of interdependency between software modules. High cohesion leads to low coupling. With tight coupling, a change in one module causes ripple effects of changes in other modules. Low coupling and high cohesion are desired.
Low cohesion and tight coupling lead to bad architecture. Bad architecture leads to unintended testbench bugs, harder debug, and maintenance pain, and makes every update a challenge. All this leads to schedule slips and holes in verification.
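The difference between tight and loose coupling can be made concrete with a small sketch. This is an illustrative Python example, not an HVL implementation; the monitor and scoreboard class names are hypothetical.

```python
# Tightly coupled: the scoreboard reaches into the monitor's internals,
# so any change to Monitor's internal representation ripples into it.
class Monitor:
    def __init__(self):
        self._observed = []          # internal detail

class TightScoreboard:
    def __init__(self, monitor):
        self.monitor = monitor
    def count(self):
        return len(self.monitor._observed)   # depends on internals

# Loosely coupled: the monitor publishes transactions through a narrow,
# explicitly defined interface; the scoreboard only implements `write`.
class LooseMonitor:
    def __init__(self):
        self._subscribers = []
    def subscribe(self, callback):
        self._subscribers.append(callback)
    def observe(self, txn):
        for cb in self._subscribers:
            cb(txn)

class LooseScoreboard:
    def __init__(self):
        self.received = []
    def write(self, txn):
        self.received.append(txn)

mon = LooseMonitor()
sb = LooseScoreboard()
mon.subscribe(sb.write)
mon.observe("txn0")
print(sb.received)  # ['txn0']
```

In the loose version, the monitor's internals can change freely as long as the `subscribe`/`write` interface holds, which is exactly the ripple-effect containment the text describes.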
Wear your verification goggles. They will help you keep the testbench light and focused on what is really needed.
The application is the reason for the existence of the DUT. An application is a combination of software and hardware; the hardware may be the design under test combined with other designs. Although the verification engineer's focus is the design under test (DUT), for verification purposes it's important to understand the logical interface of the design to the application and how the application uses the design. For verification, the entire application need not be modeled as-is. The application world details should be abstracted and only the relevant details modeled. This abstraction strategy is one of the key areas of testbench architecture.
That raises the question: which details of the application world are relevant for the purpose of verification?
The layered view of a testbench is a grouping of related functional components into five layers. In the following article, let's look at what these layers are and which components belong to each of them.
Testbench architecture – Layers
Layer #1: Test bench top – Connecting DUT to Test bench
The test bench top is a container for connecting the design under test (DUT) to the testbench. Typically, HVLs provide a way to encapsulate a related set of signals as an interface. This allows passing the related signals around as a single unit. Encapsulate the related sets of signals into different interfaces of the design, and parameterize the interfaces to suit various interface reuse needs.
In the test bench top module, connect the interfaces to the design's ports. These interfaces are passed to the testbench, and the testbench interacts with the design through them.
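The idea of bundling related signals into a single unit can be sketched outside any particular HVL. This is a hedged Python illustration; the `WriteIf` signal names are hypothetical and merely stand in for an interface bundle.

```python
from dataclasses import dataclass

# A bundle of related signals, analogous to an HVL interface:
# the signals travel together as one unit instead of one by one.
@dataclass
class WriteIf:
    valid: int = 0
    addr: int = 0
    data: int = 0

# The testbench receives the whole bundle and interacts with the
# design through it, not through loose individual signals.
class Testbench:
    def __init__(self, wr_if):
        self.wr_if = wr_if
    def drive_write(self, addr, data):
        self.wr_if.valid = 1
        self.wr_if.addr = addr
        self.wr_if.data = data

wr_if = WriteIf()          # created in the "test bench top"
tb = Testbench(wr_if)      # the bundle is passed as a single unit
tb.drive_write(0x20, 0xFF)
print(wr_if)
```

Parameterization in a real HVL interface (signal widths, for instance) would correspond to making the bundle's fields configurable, which is what enables the same interface to be reused across designs.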
We have seen that project execution is a very dynamic activity. Any planning tool that is static in nature will fail to meet the dynamic demands.
What are static planning tools? Word documents for the test plans, coverage plans, and checks plans: they can be generated for review, but not driven for execution. Excel sheets for task tracking and planning are out as well. Both of these are very static in nature.
Why do we consider them static? Note that plans are the property of the team and don't belong to any individual, so a plan management system should allow a multi-user interface. Microsoft Word and Excel are great tools, but consider their multi-user story: even version control tools cannot do automatic merges on them due to their binary formats. This makes them unfriendly for multi-user operation. When the medium used for these plans does not provide a clean multi-user interface, it hinders teamwork during execution. The strong recommendation is to put in place good task management software. There are plenty out in the market. If you are a big corporation and want to shed some weight from your wallet, get a custom one made; if not, go with an off-the-shelf product. Task management tools provide an interface for multi-user operation, which means everyone can do updates. It also reduces unnecessary email traffic.
After the detailed task lists are in place, it's time to schedule them for execution by assigning real timelines and engineers. The first step before scheduling is setting up the milestones. A milestone is a grouping of a set of meaningful functionality. Milestones should be ordered based on the priorities of the functionality. Put expected dates on the milestones.
Now tag each of the tasks with its corresponding milestone. The number of man-days of the tasks adding up in each milestone gives an idea of the time it will require. If there is a big difference between the duration available and the number showing up from the task list, consider adding more engineers if possible, or redefining the milestones.
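The milestone arithmetic above is simple enough to automate. Here is a minimal Python sketch; the task names, milestone labels, and effort numbers are made up for illustration.

```python
# Hypothetical task list: each task tagged with a milestone and an
# effort estimate in man-days.
tasks = [
    ("bfm_basic",      "M1", 5),
    ("scoreboard",     "M1", 8),
    ("coverage_model", "M2", 6),
    ("error_tests",    "M2", 10),
]

def man_days_per_milestone(tasks):
    # Sum the effort of all tasks tagged with each milestone.
    totals = {}
    for _name, milestone, days in tasks:
        totals[milestone] = totals.get(milestone, 0) + days
    return totals

def flag_overruns(totals, available_days, engineers=1):
    # A milestone is flagged when its total effort exceeds the calendar
    # duration times the number of engineers working in parallel.
    return [m for m, d in totals.items() if d > available_days[m] * engineers]

totals = man_days_per_milestone(tasks)
print(totals)                                        # {'M1': 13, 'M2': 16}
print(flag_overruns(totals, {"M1": 15, "M2": 10}))   # ['M2']
```

A flagged milestone is exactly the "big difference" case in the text: either add engineers (raise `engineers`) or move tasks between milestones until nothing is flagged.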
One simple and obvious rule for assigning tasks to engineers is to group related tasks and assign them to the same engineer. It takes effort for engineers to build context, and they should be allowed to make full use of it. Randomly spreading tasks across engineers wastes the time invested in building that context. Building new contexts and dealing with context switching increases the pressure on the engineers; it's best avoided unless it's a must for meeting a critical deadline. If such critical deadlines come up very often, consider it a sign of bad planning.
According to MSDN, “Software application architecture is the process of defining a structured solution that meets all of the technical and operational requirements, while optimizing common quality attributes such as performance, security, and manageability. It involves a series of decisions based on a wide range of factors, and each of these decisions can have considerable impact on the quality, performance, maintainability, and overall success of the application”.
The definition of test bench architecture is not radically different from this.
Architecting a test bench is a four-step process.
The first step is understanding the prerequisites. These prerequisites set the requirements for the architecture.
The second step is dividing the functionality into various components that meet the requirements, and mapping them to classes.
The third step is validation, to check if the architecture has any inconsistencies.
The fourth and final step is delivering the architecture to execution by packaging it in documents and abstract classes.
Architecting a testbench – a 4-step process
Step 1: Testbench architecture: Prerequisites
There are four key prerequisites that provide the requirements for the test bench architecture process.
It might seem tempting to jump directly into the testbench architecture with only a partial understanding of the prerequisites, or even skipping some of them.