Cross-Browser Testing for a Customer Review and Feedback Application

Case Study

Quality overhaul of a leading consulting firm’s feedback application

Client

A leading player in the strategy consulting domain.

Application

Partner Feedback Survey

Team Size

2 Sr. Test Engineers

Industry

Internal Portal

Tech Stack

  1. Java
  2. .NET
  3. SQL Server

Resource Distribution

Offsite

Duration

6 months

Types of Validation

  • Manual testing
  • Manual performance observation
  • Cross-browser testing

Recommendations

Replicate the QA processes, standards, and artefacts created for this application across other projects as well.

Problem Statement

Multiple product issues were reported by end users accessing the application on different OS/browser combinations. This defect leakage into production, and the failure to validate the application across multiple browser/OS combinations, was a critical miss attributed largely to inadequate test planning, a lack of QA processes, and a shortage of time for testing prior to the production release.

Challenges

  • Inability to predict the application's performance across different OS and browser combinations due to inadequate testing and reporting.
  • Absence of collaboration among the various teams, leading to unplanned and insufficient testing.
  • Lack of a proper QA process throughout testing, leaving no transparency into test coverage, test efficiency, application health, etc.
  • BRDs for new or changed functionality were missing, and no consistent documentation structure was followed.
  • Priority modules were not regression-tested with every release.

What we did

A comprehensive examination of the QA processes, project documents, and the roles and responsibilities of the various team members was carried out. All critical stakeholders were interviewed to identify quality pain points and address them.

  • Understood the existing defects and identified the most critical areas.
  • Plugged the gaps by designing critical scenarios and test cases for the different modules; some scenarios were explicitly designed with specific OS/browser combinations in mind.
  • Took ownership of basic performance testing before each production release to check the application's performance on multiple browsers and operating systems.
  • Performed regression testing of the complete application multiple times before each release.
  • In addition, QA performed periodic exploratory testing of the complete application across OS/browser combinations to ensure no bugs leaked into production.

Outcome

  • Redundant issues have been reduced significantly (by around 60%).
  • UAT/production defects dropped by close to 75% within the first 15 days of the production deployment, improving the application's performance and functionality across all browser/OS combinations.
  • The application now runs smoothly on multiple OS/browser combinations (4/13), with no major issues reported.
  • The customer can now gauge project health in real time, with better insight and visibility.
  • Collaboration among team members has improved significantly with the help of email notifications and comments in TFS; responses and query resolution are handled more efficiently over email and in TFS.