(ALM for Developer Series)
As a Testing Manager at Digité, I am perhaps best placed to share Digité Enterprise’s impact on my team’s work and my own.
Prior to using Digité, we managed our test inventory in Excel sheets – which is how many organizations and teams still manage their test repositories. These were multiple sheets organized module-wise. Test results were similarly tracked in Excel, and we depended on a combination of filters and pivot tables to identify failures, resubmit them for execution, and so on.
Simpler and More Accurate Manual Testing
Since we moved all our Manual Test Inventory from Excel to Digité, life has become simpler. Earlier, in any regression test cycle, it was difficult for a team lead to consolidate the test status when our test suite was divided across 20 testers. At the end of the day, they would have to chase everyone to mail back their testing status, find out how many defects had been identified, calculate the pending test cases, and then redistribute them so that testing could be completed on schedule.
With the test inventory automated in Digité Enterprise, the Team Lead can simply execute a report to gather the status. The report gives them the current testing status, the number of pending cases, the number of failed cases, the number of defects identified in the system, the status of those defects, and when they are expected in the next development cycle.
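The consolidation such a report performs is essentially a roll-up over test-case records. A minimal sketch in plain Python, where the record fields and status values (`passed`, `failed`, `pending`) are illustrative assumptions rather than Digité’s actual schema:

```python
from collections import Counter

def summarize_test_cycle(test_cases):
    """Roll up a list of test-case records into a cycle status summary.

    Each record is a dict with a hypothetical 'status' field holding
    'passed', 'failed', or 'pending'.
    """
    counts = Counter(tc["status"] for tc in test_cases)
    return {
        "total": len(test_cases),
        "passed": counts.get("passed", 0),
        "failed": counts.get("failed", 0),
        "pending": counts.get("pending", 0),
    }

cycle = [
    {"id": "TC-1", "status": "passed"},
    {"id": "TC-2", "status": "failed"},
    {"id": "TC-3", "status": "pending"},
    {"id": "TC-4", "status": "passed"},
]
print(summarize_test_cycle(cycle))
# → {'total': 4, 'passed': 2, 'failed': 1, 'pending': 1}
```

Because every tester updates the same inventory, this kind of summary is always computed from one current dataset instead of twenty mailed-in spreadsheets.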
Effective Test Automation
Further, we also track our automated test case inventory in Digité Enterprise. As the automation team makes progress automating manual test cases, those cases are automatically excluded from the manual testing cycle using a simple filter in the product. Earlier, this was tracked in Excel by my team leads for automation and manual testing. Keeping that information consistent by checking Excel files manually was complex and time-consuming, and the two views would never tally because they were updated at different times against constantly changing data. Now we have the ability to get all sorts of reports – module-wise failures, priority-wise failures, tester-wise defect identification, etc.
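Conceptually, that exclusion filter just selects the cases not yet automated. A sketch under the assumption that each test-case record carries a hypothetical boolean `automated` flag:

```python
def manual_cycle(test_cases):
    """Return the test cases still in the manual cycle.

    'automated' is an assumed boolean flag on each record; once the
    automation team automates a case and sets the flag, the case
    drops out of every subsequent manual cycle automatically.
    """
    return [tc for tc in test_cases if not tc.get("automated", False)]

inventory = [
    {"id": "TC-10", "automated": True},
    {"id": "TC-11", "automated": False},
    {"id": "TC-12"},  # flag not yet set; treated as manual
]
print([tc["id"] for tc in manual_cycle(inventory)])
# → ['TC-11', 'TC-12']
```

The point of keeping the flag in one shared inventory is that the manual and automation leads are reading the same field at the same moment, so the two views cannot drift apart.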
Leveraging Integrated Application Lifecycle Management
Managing the overall testing would have been very difficult without Digité Enterprise. Test case development starts early in each release we plan, right after Product Management delivers the Use Cases/ User Stories for the release. The QA team develops the Test Cases for each Use Case, automatically linking them via Traceability. The Test Cases are jointly reviewed by Product Management, Engineering and QA for each release and approved.
With 3,000 QTP scripts and around 1,500 business functions, we track over 7,000 manual test cases (sanity, smoke, and detailed test cases) in the automated test repository. Product changes and their impact on automation testing assets are tracked through CRs in the automation project. We track the product branches to which each change applies. With our Subversion integration, every change is reviewed directly in Digité Enterprise.
Automated tests run every night, and the results are available for analysis the next day. After completing root cause analysis of all failures, product defects are filed in Digité’s Defect Tracking module. A report gives us an assessment of how well the nightly automation ran. A number of metrics are calculated automatically: nightly test trends across a month, the last run’s defects (both product and automation-script defects), script failures (false alarms and genuine), the current status of product defects, and the number of script failures due to product defects, by module.
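One of those metrics – classifying each nightly failure as a product defect, a script defect, or a false alarm, grouped by module – can be sketched as below. The field names and cause labels are assumptions for illustration, not the product’s actual data model:

```python
from collections import defaultdict

def failure_breakdown(results):
    """Group nightly failures by module and failure cause.

    Each result is a dict with assumed fields 'module', 'passed'
    (bool), and, for failures, a 'cause' of 'product_defect',
    'script_defect', or 'false_alarm'.
    """
    breakdown = defaultdict(lambda: defaultdict(int))
    for r in results:
        if not r["passed"]:
            breakdown[r["module"]][r["cause"]] += 1
    # Convert nested defaultdicts to plain dicts for readable output.
    return {module: dict(causes) for module, causes in breakdown.items()}

nightly = [
    {"module": "login", "passed": False, "cause": "product_defect"},
    {"module": "login", "passed": True},
    {"module": "reports", "passed": False, "cause": "script_defect"},
    {"module": "reports", "passed": False, "cause": "false_alarm"},
]
print(failure_breakdown(nightly))
# → {'login': {'product_defect': 1}, 'reports': {'script_defect': 1, 'false_alarm': 1}}
```

Run over a month of nightly results, the same grouping yields the trend views described above without anyone tallying failures by hand.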
Significant Productivity and ROI
In the two years since we moved our entire testing process to Digité, our manual testing team has shrunk from 10 to 3, while our Test Automation team has stayed the same size. Our test repository has grown from 14,000 to 19,000 test cases. That is a huge jump in productivity and a significant ROI over the last two years.
Manager – Quality Engineering, Digité, Inc.