Allocation Levels of QA Activities

This article discusses the allocation levels of Quality Assurance (QA) activities based on the maturity of the project and provides recommendations for QA analyst involvement. It outlines the tasks and responsibilities of QA analysts at different allocation levels, ranging from full-time allocation for pilot or new projects to minimal allocation for existing projects with small enhancements. The article also highlights the risks associated with each allocation level.

QA Responsibilities

This section outlines the tasks and responsibilities of Quality Assurance (QA) analysts at the different allocation levels. Here is a breakdown of the terms and activities involved:

  1. Functional Testing based on User Stories: This involves testing the functionality of the software based on user stories, which are short, simple descriptions of a feature told from the perspective of the person who desires the new capability.

  2. Documenting Testing in X-Ray: X-Ray is a test management tool used to create and manage test cases, link bug reports, and report test results.

  3. Retesting Bugs after Bug Fixes: This involves retesting previously identified bugs after they have been fixed to ensure that the fix was successful and did not introduce any new issues.

  4. Regression Testing: This type of testing is performed to make sure that new code changes have not adversely affected existing features.

  5. Smoke Tests after Deployments: Smoke testing is a preliminary level of testing to check basic functionality and ensure that the critical features of a program work.

  6. Checking Risks Connected with Testing: This involves identifying and assessing potential risks associated with the testing process and the solution being tested.

  7. Prepare Test Strategy: This refers to creating a plan that outlines the approach to be used to test the solution.

  8. Functional Demo for the Customer: This involves preparing and presenting a functional demonstration of the software to the customer during feature sprints.

  9. Communication with PM, SA, CE, IE, BA, and the Customer: This involves engaging with project management, solution architects, customer engineers, implementation engineers, business analysts, and the customer to understand user stories and customer needs.

  10. Reviewing Acceptance Criteria: This entails reviewing the conditions that a software product must satisfy to be accepted by a user or customer.

  11. Test Automation in Cypress: Cypress is a test automation framework for web applications; this activity involves writing and maintaining automated end-to-end tests in it.

  12. API Testing in Postman: Postman is a popular tool for testing APIs. This activity involves testing the functionality and performance of APIs using Postman.

  13. Basic Performance Testing: This refers to conducting initial performance tests to assess the speed, responsiveness, and stability of the software under normal conditions.

  14. Helping with UAT Preparation: This involves assisting the customer with User Acceptance Testing (UAT) preparation, which is the final phase of testing in software development before the software is released to the market or to users.
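The smoke testing mentioned in item 5 can start as a very small script. A minimal sketch in JavaScript, assuming a hypothetical list of critical endpoint paths and an injected status checker (in practice `getStatus` would wrap an HTTP call against the deployed system):

```javascript
// Minimal smoke-check sketch. The endpoint paths are hypothetical examples;
// `getStatus` is injected so the same logic can run against any environment.
const CRITICAL_PATHS = ['/login', '/api/health', '/api/orders'];

function runSmokeCheck(paths, getStatus) {
  // A path fails the smoke check if it does not answer with HTTP 200.
  const failures = paths.filter((path) => getStatus(path) !== 200);
  return { passed: failures.length === 0, failures };
}
```

A deployment would pass the smoke check only when every critical path responds with 200; anything else ends up in `failures` for immediate triage.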
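To illustrate item 11, here is a minimal Cypress spec sketch. The route, the `data-test` selectors, and the error text are assumptions for illustration only; `describe`, `it`, and `cy` are globals provided by the Cypress runner, so the spec body is wrapped in a function that a real Cypress project would place directly in a `*.cy.js` file:

```javascript
// Hypothetical Cypress spec sketch: the route, selectors, and error message
// are assumptions, not taken from any real project. In a Cypress project the
// body of this function would live directly in a spec file.
const loginSmokeSpec = () => {
  describe('login page', () => {
    it('rejects a wrong password with a visible error', () => {
      cy.visit('/login'); // baseUrl comes from cypress.config
      cy.get('[data-test=username]').type('qa.analyst');
      cy.get('[data-test=password]').type('wrong-password');
      cy.get('[data-test=submit]').click();
      cy.contains('Invalid credentials').should('be.visible');
    });
  });
};
```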
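For item 12, a sketch of a Postman "Tests" script. Postman injects a `pm` object at runtime; factoring the checks into a function of `pm` keeps the sketch self-contained. The endpoint shape (a JSON array) is an assumption:

```javascript
// Hypothetical Postman test script for a GET endpoint returning a JSON list.
// In Postman this code goes in the request's "Tests" tab, where `pm` is
// provided by the runtime; here it is taken as a parameter instead.
function registerApiChecks(pm) {
  pm.test('status code is 200', function () {
    pm.response.to.have.status(200);
  });
  pm.test('response body is a non-empty array', function () {
    const body = pm.response.json();
    pm.expect(Array.isArray(body)).to.eql(true);
    pm.expect(body.length > 0).to.eql(true);
  });
}
```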
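The basic performance testing in item 13 can start as simply as timing repeated calls. A Node.js sketch, assuming the operation under test is passed in as a function; run counts and pass/fail thresholds would be agreed per project and are not hard-coded here:

```javascript
// Basic performance measurement sketch: time `runs` executions of `fn`
// and report min/avg/max in milliseconds.
function measure(fn, runs = 20) {
  const timesMs = [];
  for (let i = 0; i < runs; i++) {
    const start = process.hrtime.bigint();
    fn();
    const end = process.hrtime.bigint();
    timesMs.push(Number(end - start) / 1e6); // nanoseconds -> milliseconds
  }
  const avg = timesMs.reduce((a, b) => a + b, 0) / timesMs.length;
  return { min: Math.min(...timesMs), avg, max: Math.max(...timesMs) };
}
```

Comparing these numbers across builds gives an early warning of performance regressions before any dedicated load testing is in place.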

QA Allocation Levels

Each allocation level below states the project maturity and recommendation, followed by the QA Analyst's activities in three phases: Preparation and Setup, Feature Sprints, and UAT.

Full time allocation

(this should correspond to the expected 1:2 ratio: 1 QA Analyst for 2 CEs on the project)

Project maturity and recommendation: Pilot / new projects; full QA support for the customer is expected.

Preparation and Setup:

  • stand-up meetings
  • UAT intro with the customer
  • SA, CE discussions: scope and solutions
  • integration and data testing
  • documenting integration and data testing
  • X-Ray preparation
  • test plan and test strategy preparation

Feature Sprints:

  • stand-up meetings
  • SA, CE discussions: user stories and acceptance criteria (AC)
  • create test cases
  • X-Ray usage
  • testing according to test cases: positive cases
  • testing according to test cases: negative cases and exceptions
  • report bugs as sub-tasks
  • Pfx project team communication
  • communication with the customer about UAT preparation and testing-related issues
  • investigate bugs reported by the customer
  • software demo for the customer (prepare and deliver)
  • regression testing after bug fixes
  • smoke testing before customer testing
  • SPOC (single point of contact) for the customer for all QA-related issues

UAT:

  • stand-up meetings
  • SPOC for the customer for all QA-related issues
  • customer support during UAT testing
  • smoke testing
  • bug investigation and retesting

Half time allocation

(this should correspond to about 35-40% of CE allocation for the QA Analyst)

Project maturity and recommendation: Pilot / new projects; basic QA support for the customer is expected.

Risk: Since X-Ray is not expected to be used at this level, it can be difficult to share our test cases with the customer. The customer can get some examples, but should know that our support in this area will be limited.

Preparation and Setup:

  • stand-up meetings
  • UAT intro with the customer
  • integration and data testing
  • test plan and test strategy preparation

Feature Sprints:

  • stand-up meetings
  • create test cases (no X-Ray usage expected)
  • testing according to test cases: positive cases
  • testing according to test cases: negative cases and exceptions
  • report bugs as sub-tasks
  • Pfx project team communication
  • communication with the customer about UAT preparation and testing-related issues
  • regression testing after bug fixes
  • smoke testing before customer testing

UAT:

  • stand-up meetings
  • smoke testing
  • bug investigation and retesting for the internal team

12 hours a week

(this should correspond to about 25% of CE allocation for the QA Analyst)

Project maturity and recommendation: Smaller project; the QA Analyst supports the internal Pfx team only. NO SUPPORT FOR THE CUSTOMER.

Risk: If the customer's testing maturity is low, the lack of QA Analyst support can lead to unprepared UAT tests, misunderstandings reported as bugs, and project delays.

Preparation and Setup:

  • stand-up meetings
  • basic integration and data testing
  • high-level test strategy preparation

Feature Sprints:

  • stand-up meetings
  • create test cases (high level only, to be used by the internal team)
  • testing according to test cases: positive cases
  • report bugs as sub-tasks
  • Pfx project team communication
  • regression testing after bug fixes (partial)

UAT:

  • stand-up meetings
  • bug investigation and retesting for the internal team

8 hours a week

(this should correspond to about 20% of CE allocation for the QA Analyst)

Project maturity and recommendation: Smaller project; the QA Analyst supports the internal Pfx team only. NO SUPPORT FOR THE CUSTOMER.

Risk: If the customer's testing maturity is low, the lack of QA Analyst support can lead to unprepared UAT tests, misunderstandings reported as bugs, and project delays.

Preparation and Setup:

  • stand-up meetings
  • high-level test strategy preparation

Feature Sprints:

  • stand-up meetings (partial attendance)
  • testing according to user stories: positive tests only
  • report bugs as sub-tasks
  • Pfx project team communication

UAT:

  • bug investigation and retesting for the internal team

4 hours a week

Note: this allocation can only be used for the next stage of an existing project where the QA Analyst worked on the previous stage and therefore knows the project and the customer. DO NOT USE THIS ALLOCATION FOR A NEW PROJECT!

Project maturity and recommendation: Existing customer; ongoing project with small enhancements; no support for the customer.

Risk: For a new customer this option makes no sense: it effectively means no QA Analyst on the project, with all the resulting consequences, since there will be no QA support for the customer or for our internal Pfx team.

Activities (all phases):

  • for the next stage of an existing project, the QA Analyst can do basic exploratory testing with very limited documentation