Choosing the right automated testing for your software and applications

Case Study

Quality overhaul of a leading consulting firm’s feedback application

Client

A leading player in the strategy consulting domain.

Problem Statement

End users accessing the application on different OS/browser combinations reported multiple product issues. This defect leakage into production, together with the failure to validate the app across multiple browser/OS combinations, was a critical miss attributed largely to inadequate test planning, a lack of QA processes, and insufficient time for testing before the production release.

Application

Partner Feedback Survey

Challenges

  • Inability to predict application behaviour across different OS and browser combinations due to inadequate testing and reporting.
  • Absence of collaborative effort among the various teams, leading to unplanned and insufficient testing.
  • Lack of a proper QA process throughout testing, leaving no transparency on test coverage, test efficiency, application health, etc.
  • BRDs for new or changed functionalities were missing, and no consistent documentation structure was followed.
  • Priority modules were not regression tested with every release.

Team Size

2 Sr. Test Engineers

What we did

We conducted a comprehensive review of the QA processes, project documents, and the roles and responsibilities of the various team members. All critical stakeholders were interviewed to identify and address quality pain points.

  • Reviewed existing defects and identified the most critical areas.
  • Plugged the gaps by designing critical scenarios and test cases for the different modules; some scenarios were designed explicitly with specific OS/browser combinations in mind (an illustrative sketch follows this list).
  • The team took ownership of basic performance testing before each production release to check application performance across multiple browsers and operating systems.
  • Performed regression testing of the complete application multiple times before every release.
  • In addition, QA performed periodic exploratory testing of the complete application (per OS/browser) to ensure no bug leakage into production.

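The validation on this engagement was manual (see Types of Validation below). Purely as an illustration of how such OS/browser checks could eventually be automated on the project's Java stack, the minimal sketch below runs the same smoke check against Chrome, Firefox and Edge using Selenium WebDriver. The URL, class name and title check are hypothetical placeholders, not the client's actual implementation.

    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;
    import org.openqa.selenium.edge.EdgeDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;

    public class CrossBrowserSmokeCheck {

        // Hypothetical URL for illustration only; the real survey URL is internal to the client.
        private static final String SURVEY_URL = "https://example.com/partner-feedback";

        public static void main(String[] args) {
            for (String browser : new String[] {"chrome", "firefox", "edge"}) {
                WebDriver driver = createDriver(browser);
                try {
                    driver.get(SURVEY_URL);
                    // Basic sanity check: the landing page should render with the expected title.
                    boolean ok = driver.getTitle().toLowerCase().contains("feedback");
                    System.out.println(browser + ": " + (ok ? "landing page OK"
                            : "unexpected title '" + driver.getTitle() + "'"));
                } finally {
                    driver.quit();
                }
            }
        }

        private static WebDriver createDriver(String browser) {
            switch (browser) {
                case "firefox": return new FirefoxDriver();
                case "edge":    return new EdgeDriver();
                default:        return new ChromeDriver();
            }
        }
    }

Running the same check on machines (or a Selenium Grid) with different operating systems would cover the full OS/browser matrix described above.
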
Industry

Internal Portal

Outcome

  • With the relevant documents available, the testing team's test design productivity increased by 27%.
  • Achieved 100% coverage through proper test planning for each release.
  • Overall efficiency of the test team (design and execution) increased by 43%, which allowed the testing phase to close in almost half the time.

Tech Stack
  1. Java
  2. .NET
  3. SQL Server

Resource Distribution

Offsite

Duration

3 Months

Types of Validation
  • Manual testing
  • Manual performance observation
  • Cross-browser testing

Recommendations

Replicate the QA processes, standards, and artefacts created for this application across other projects as well.
