Assisted in developing a pioneering cross-platform sports utility product
Our client is a pioneer in sports performance analysis with more than 19 years in the field. They have worked with many leading clubs and sports organisations to deliver high-quality performance analysis services.
The client had multiple tools for player tracking and event coding and wanted to develop and publish a common set of tools that could be used across multiple camera setups. They had partnered with an external developer to build the tools and were looking for a vendor to provide QA services for the product.
An expert view of the tools and the process was necessary to test them successfully. The client wanted a centralized team providing QA services across multiple products at a price point that fit the budget.
A key challenge was that product documentation was non-existent; only a few use cases were available for gaining product knowledge and developing test cases.
- Scope of testing involved smoke, regression, system and integration tests.
- Week-long discovery and knowledge-transfer sessions were conducted onsite at different stages to understand the requirements and build a working relationship with the development team and other stakeholders from the client's management team.
- Leveraged the existing knowledge base of production resources to train the team on the processes.
- Conducted exploratory testing on the product to create test cases and build product knowledge, since documentation was scarce.
- Set up a flexible resource pool of dedicated resources working within agile sprints to provide QA services.
- Provided management oversight through a dedicated QA manager who was responsible for the entire QA lifecycle.
- Utilized the production team to conduct beta testing and performance tests before release to production. This flexible pool of resources was maintained and drawn on as and when required, resulting in efficient resource usage.
- Performance testing was conducted for both web and desktop applications: JMeter was used for the web applications, whereas load was applied manually to the desktop applications to gain insight into application performance.
- Performed integration tests across all tool components to verify that each component hands off correctly to the next.
- JIRA and Redmine were used as defect management tools.
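The web-side load tests above were driven by JMeter test plans; the same idea can be sketched in plain Python. The following is a minimal, illustrative load driver: the local stand-in server, request count, and worker count are assumptions for the example, not details from the engagement.

```python
import http.server
import statistics
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

# Illustrative stand-in for the application under test: a tiny local HTTP server.
class _Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # silence per-request logging
        pass

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

def one_request(_):
    """Issue a single GET and return its latency in milliseconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return (time.perf_counter() - start) * 1000

# Apply load: 50 requests across 10 concurrent workers (figures are illustrative).
with ThreadPoolExecutor(max_workers=10) as pool:
    latencies = list(pool.map(one_request, range(50)))

server.shutdown()
print(f"requests: {len(latencies)}, "
      f"median: {statistics.median(latencies):.1f} ms, "
      f"max: {max(latencies):.1f} ms")
```

In practice such a driver would point at a staging URL and track latency percentiles per sprint; for the web applications, JMeter's non-GUI mode (`jmeter -n -t plan.jmx -l results.jtl`) served the equivalent role.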
- Despite the lack of documentation, the team achieved the following:
- Defect rejection ratio of only 5%.
- Testing was completed on schedule with minimal effort variance.
- Achieved test coverage of 95%.
- Interacted directly with developers to apprise them of critical issues and managed defect triage.
- Authored and set up a new set of regression test suites to run each time a new build was ready.
- Increased productivity of test resources over a span of 9 sprints:
- Increased test case authoring productivity by 80% (from 1 TC/hr to 1.8 TCs/hr).
- Increased test case execution productivity by 66% (from 3 TCs/hr to 5 TCs/hr).
- Daily and weekly status reports provided to the management team to communicate and track automation status.
- Weekly status reports provided to management team to communicate and track product QA health.
- Performance was analysed and issues were identified before launch to production.
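The regression suites that ran against every new build can be illustrated with a small unittest example. The `classify_event` function is a hypothetical stand-in for product behaviour (the real suites exercised the client's tracking and event-coding tools); the structure, a fixed suite loaded and executed per build, is the point.

```python
import unittest

# Hypothetical stand-in for a function under regression test;
# the real suites covered the client's tracking and coding tools.
def classify_event(speed_kmh):
    """Toy event coder: label a tracked movement by speed band."""
    if speed_kmh < 0:
        raise ValueError("speed cannot be negative")
    if speed_kmh < 7:
        return "walk"
    if speed_kmh < 20:
        return "run"
    return "sprint"

class RegressionSuite(unittest.TestCase):
    """Run on every new build; any failure blocks the release candidate."""

    def test_walk_band(self):
        self.assertEqual(classify_event(5), "walk")

    def test_run_band(self):
        self.assertEqual(classify_event(12), "run")

    def test_sprint_band(self):
        self.assertEqual(classify_event(30), "sprint")

    def test_rejects_negative_speed(self):
        with self.assertRaises(ValueError):
            classify_event(-1)

# Load and run the suite explicitly, as a build pipeline step would.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(RegressionSuite))
```

Wiring such a runner into the build pipeline gives the "run on every new build" gate described above, with `result.wasSuccessful()` deciding whether the build proceeds.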