5 tips to speed up regressions
Hardware design verification consumes more than 60% of project resources, and often much more than that. These resources are not just engineers, but also time, machines, and licenses.
Speeding up regression run-time not only reduces the time to achieve verification closure, but also reduces the need for additional EDA licenses and the pressure for additional compute power. So how do you go about doing that?
Here are a few tips to reduce regression run-times in any verification environment:
1. Are you in the right forest? It seems obvious, but nonetheless a good place to start is to make sure you have the right set of tests in your regression suite. This question takes you back to the verification plan, and all the way back to the functional specification of the design. There is no point verifying functionality that has been removed or modified. I recently spoke with a verification manager at a large semiconductor company, and he told me his team had been so focused on improving the coverage of their verification suite that they were shocked to find out the specs had changed and they were not in the loop.
2. A list for every occasion! Create multiple regression lists so that users can choose the most appropriate one to run. Share these lists with the design and verification teams. Here are a few ideas on which lists to create:
- Qualifying regression list: Contains the tests that every designer runs to make sure they are making forward progress and have not broken any functionality.
- Function-based regression lists: One list per functional group, so that the various aspects of the design can be verified independently.
- Comprehensive regression list: The integration regression list, the kitchen sink of all tests. This is run periodically, say every night or every weekend.
Having an appropriate list for each purpose ensures that only the necessary tests are run, with no wasted cycles. The qualifying regression is the one that will be run most often, so it had better be as succinct as possible while still covering the major functional areas. One simple way to organize and launch these lists is sketched below.
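As a concrete illustration, the lists can live in one place under version control and be selected by name at launch time. The sketch below is a minimal Python example; the test names and the ./run_test launch command are placeholders for whatever your own environment uses.

```python
# regression_lists.py -- a minimal sketch; test names and the launch
# command are placeholders for your own environment.
import subprocess
import sys

REGRESSION_LISTS = {
    # Short list run by every designer before check-in.
    "qualify":  ["reset_basic", "sanity_rw", "smoke_traffic"],
    # One list per functional group.
    "dma":      ["dma_single", "dma_burst", "dma_error_inject"],
    "lowpower": ["clock_gate_entry", "retention_wakeup"],
    # Kitchen-sink list, run nightly or over the weekend.
    "full":     [],  # generated below as the union of all other lists
}
REGRESSION_LISTS["full"] = sorted(
    {t for name, tests in REGRESSION_LISTS.items() if name != "full" for t in tests}
)

def run_list(name: str) -> int:
    """Launch every test in the named list; return the number of failures."""
    failures = 0
    for test in REGRESSION_LISTS[name]:
        # Replace with your simulator invocation (Makefile target, run script, etc.).
        result = subprocess.run(["./run_test", test])
        failures += result.returncode != 0
    return failures

if __name__ == "__main__":
    sys.exit(run_list(sys.argv[1] if len(sys.argv) > 1 else "qualify"))
```

Keeping the lists in a single file like this also makes it easy for designers to see exactly what the qualifying regression covers, and to propose additions when new functionality lands.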
3. Fine-tune the regressions: Remove redundant tests from the regression lists and reduce the run time of individual tests. This can be done in the following ways:
- Compare historic test results. If two tests have had the same “result signature” (the same pass/fail status) across past regressions, only one of them needs to be in the qualifying regression set. Which one to keep can be decided by their respective run times; obviously, the shorter the run time the better (see the sketch after this list).
- Tests can often be made faster by using techniques such as back-door loads for configuration registers, or special simulation-oriented configurations (e.g. shorter frame sizes, shorter time-out values, etc.).
- Use coverage-based test ranking. Almost all good simulators can rank tests by the coverage they contribute; this ranking can be used to choose the most effective subset.
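To make the result-signature idea concrete, here is a small sketch. It assumes you already have each test's recent pass/fail history and its typical run time; the history and run-time data shown here are invented purely for illustration, and in practice would come from your regression results database.

```python
# prune_by_signature.py -- a sketch of result-signature based pruning.
# The history and run-time data are invented for illustration only.
from collections import defaultdict

# test name -> (pass/fail over the last N regressions, run time in minutes)
history = {
    "dma_single":       (("P", "P", "P", "F"), 12),
    "dma_burst":        (("P", "P", "P", "F"), 45),  # same signature as dma_single
    "reset_basic":      (("P", "P", "P", "P"), 3),
    "sanity_rw":        (("P", "P", "P", "P"), 8),   # same signature as reset_basic
    "clock_gate_entry": (("P", "F", "P", "P"), 20),
}

def prune(history):
    """Keep only the fastest test for each distinct pass/fail signature."""
    by_signature = defaultdict(list)
    for test, (signature, runtime) in history.items():
        by_signature[signature].append((runtime, test))
    return sorted(min(tests)[1] for tests in by_signature.values())

if __name__ == "__main__":
    print("Qualifying set:", prune(history))
    # -> ['clock_gate_entry', 'dma_single', 'reset_basic']
```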
4. Run only if needed: Use the dependency management in load-sharing systems (e.g. LSF) to run tests based on the results of other tests. For example, if the basic reset tests fail, there is no need to run the other functional tests; if the functional tests fail, there is no need to run the low-power tests; and so on.
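As one possible way to wire this up, the sketch below submits each group of tests to LSF with bsub and uses its -w "done(...)" dependency condition so that a later group only starts if the earlier one finished cleanly. The job names, run script, and list names are placeholders; adjust queues and options to match your own LSF setup.

```python
# chained_regression.py -- a sketch of dependency-chained submission with LSF.
# Job names and the ./run_list script are placeholders for your environment.
import subprocess

def submit(job_name, command, depends_on=None):
    """Submit one job to LSF, optionally gated on an earlier job finishing cleanly."""
    cmd = ["bsub", "-J", job_name]
    if depends_on:
        # The job starts only if the named job completed with a zero exit status.
        cmd += ["-w", f"done({depends_on})"]
    cmd.append(command)
    subprocess.run(cmd, check=True)

# Reset tests gate the functional tests, which in turn gate the low-power tests.
submit("reset_regr",    "./run_list qualify")
submit("func_regr",     "./run_list dma",      depends_on="reset_regr")
submit("lowpower_regr", "./run_list lowpower", depends_on="func_regr")
```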
5. Vote for a systematic approach: Use a systematic method to manage all regressions and their results. Good analysis of current and historic regression results significantly reduces the amount of regression you have to run and leads to faster verification closure, the holy grail of verification teams.
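There are commercial and in-house tools for this; as a bare-bones illustration of the idea, the sketch below records each test result in a small SQLite database so that historic pass/fail data (the result signatures and run times used in tip 3) can be queried later. The schema and names are just examples, not a prescribed format.

```python
# results_db.py -- a bare-bones sketch of recording regression results
# in SQLite for later historic analysis; schema names are just examples.
import sqlite3

def open_db(path="regression_results.db"):
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS results (
                      regression TEXT, test TEXT, status TEXT, runtime_min REAL)""")
    return db

def record(db, regression, test, status, runtime_min):
    db.execute("INSERT INTO results VALUES (?, ?, ?, ?)",
               (regression, test, status, runtime_min))
    db.commit()

def history(db, test):
    """Return the pass/fail history of one test, oldest entry first."""
    return [row[0] for row in
            db.execute("SELECT status FROM results WHERE test = ? ORDER BY rowid",
                       (test,))]

if __name__ == "__main__":
    db = open_db(":memory:")  # in-memory database, for the example only
    record(db, "nightly_01", "dma_burst", "PASS", 45.0)
    record(db, "nightly_02", "dma_burst", "FAIL", 47.5)
    print(history(db, "dma_burst"))  # -> ['PASS', 'FAIL']
```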
These tips can be adopted, even in an ad hoc manner, in any regression and verification management flow. In our experience, these techniques can cut regression run times by anywhere from 25% to 50%.
What has your experience been? How have you reduced your regression run-times?