This article provides a comprehensive list of important terms we use on our projects; it should simplify your communication with your project team and with the customer. Note that the descriptions reflect how we use these terms; they were not simply copied from some general methodology. The article covers common terms and definitions related to Quality Assurance (QA) in software development, such as unit test, functional test, regression test, user acceptance test (UAT), smoke test, bug/defect/issue, severity, priority, test case, test scenario, test automation, deployment, partition, acceptance criteria, and accelerators.

General terminology

Term

Description

Unit test

Test done by the CE during development in Feature Sprints. The unit test should be finished before the QA Analyst starts testing.

Functional test

Test done by the QA Analyst, mostly based on the functionality description or the acceptance criteria in user stories. This test is typically done during Feature Sprints.

Regression test

Test done by the QA Analyst to make sure a bug fix introduced no side effects elsewhere in the application. It is usually done during Feature Sprints or the UAT phase. The more development has been done on the project, the more important regression testing becomes. If the project can also cover test automation, regression tests are typically good candidates for it, as they are repeated often and rarely change.

User acceptance test (UAT)

Test done by the customer to make sure the application meets the agreed acceptance criteria. Testing during UAT should always be done based on E2E business scenarios, not just on particular user stories.

Smoke test

Test done by the QA Analyst after deployment to an environment; just a basic functionality check to make sure there are no obvious issues after the deployment and official testing can start.

Bug / Defect / Issue

We are typically not strict about terminology here: Bug / Defect / Issue simply means that the behavior of the application under test does not match the requirement description or our expectations. Bug is used more often in the UAT phase, because "Bug" is also an issue type in Jira, and the customer submits Bugs in Jira when incorrect behavior is found during UAT testing.

Severity

Describes how serious the found bug is from a functionality point of view; it is set by the tester (QA Analyst or the customer). The categories are described in the SoW.

Test report

Report with the results of tests. We do not create any special document for this; we use our test management tool (X-Ray), which contains the current project data.

Test management

Test management basically means documenting everything around testing on the project: creating test cases, keeping traceability to requirements, documenting the results of testing, and creating the final test report. Because test management on a project is quite complex, it definitely needs a specialized tool. At Pricefx we use X-Ray, a plug-in module for JIRA. There are other test management tools on the market which Partners or customers can use, but they should definitely use one for documenting testing on the project. Note: it may come as a surprise, but Excel is not a test management tool!

Priority

The ordering of defects by how urgently they need to be fixed; higher priority bugs should be fixed sooner. Priority is usually set by the Customer/Product Owner, or by the Project Manager if they cannot or will not set it.

Black box testing

Testing with no knowledge of the code or of how the functionality was developed; testing is based on the requirements. This approach leaves room to question whether the implemented solution matches what the requirement stated, because the tester is not influenced by the developer's thinking.

White box testing

Testing with some knowledge of the code or of the way the functionality was configured. It is often used on projects with incomplete or missing documentation, or when you join a running project and need to get involved in the process quickly.

Exploratory testing

The practice of verifying the functionality of a system without a predefined test case. You typically test based on your previous experience, trying to verify as many things and find as many bugs as possible. A big disadvantage of this approach is that there is usually no record of what was actually tested.

Retesting

When a bug has been fixed, it is necessary to verify not just the fixed functionality but also that the fix did not introduce another bug on the same screen. So do not focus only on the fixed functionality; test around it at least a bit.

Happy flow

The main scenario, which should pass when tested end to end. It is often used for a quick verification that the functionality is working. Also, when you start testing new functionality, you should start here.

Epic

An epic is a large user story that cannot be delivered as defined within a single iteration or is large enough that it can be split into smaller user stories.

User story

A specific task within an epic. We typically create test cases for testing particular user stories.

Root cause

The main reason why the functionality is not working. Sometimes it is easily visible during testing; sometimes it requires deep analysis of the code, done by a CE.

Test case

A set of steps used to determine whether the software works correctly and fulfills the defined requirements.

Test scenario

Some people say Test scenario when they mean Test case. We typically speak about a Test scenario in the context of test automation: a Test scenario is a piece of code, based on a Test case, that performs the testing.

Use case

The typical way the customer will use the system, usually based on their business needs.

Test automation

Test automation means automated testing using developed Test scenarios. It is optional on our projects, and it can really help, especially when we need to be sure that new development did not break previous behavior, or when we repeat the same test over and over again. The tool we use for test automation is Cypress.io.
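As an illustration only (real Test scenarios on our projects are written with Cypress.io and drive the actual application in a browser), the idea of turning a Test case into an automated Test scenario can be sketched framework-free. Everything below — the test case id, its steps, and the `loginAccepts` stand-in — is hypothetical:

```javascript
// Hypothetical stand-in for the application's login behavior; on a real
// project Cypress.io would interact with the deployed partition instead.
function loginAccepts(user, password) {
  return user === 'qa.analyst' && password === 'secret';
}

// A Test case expressed as data: each step has an action, a way to run it,
// and an expected result.
const testCase = {
  id: 'TC-1 (hypothetical)',
  steps: [
    { action: 'login with valid credentials',
      run: () => loginAccepts('qa.analyst', 'secret'), expected: true },
    { action: 'login with invalid credentials',
      run: () => loginAccepts('qa.analyst', 'wrong'), expected: false },
  ],
};

// The automated Test scenario: execute every step and record pass/fail,
// so there is a repeatable record of what was tested.
function runScenario(tc) {
  return tc.steps.map(step => ({
    action: step.action,
    passed: step.run() === step.expected,
  }));
}

runScenario(testCase).forEach(r =>
  console.log(`${r.passed ? 'PASS' : 'FAIL'}: ${r.action}`));
```

This is why regression tests are good automation candidates: once encoded like this, the same scenario can be re-run after every bug fix at no extra cost.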

Deployment

Deployment basically means getting your software ready for actual use. In its strictest sense, it refers to the release of a version of the software or an app to a specific partition (dev, qa, prod). Before you start testing on a partition, the code must be deployed there.

Performance

How a system performs in terms of responsiveness and stability under a particular workload. Projects should have performance requirements, especially when a large amount of data is expected.

Partition

Means an environment of the application. Typically we have these partitions:

  • dev - for development and testing by QA Analyst during Feature Sprints

  • qa - for final testing during Feature Sprints (by the QA Analyst and the customer) and for final acceptance during UAT

  • prod - for real work after Go-live, nothing should be tested here

There can be more partitions on some projects, for example for test automation, but usually we work just with the partitions above.

Acceptance Criteria

Acceptance criteria are a set of predefined requirements that must be met for a user story to be marked complete. They should be part of every user story unless agreed otherwise with the customer.

...

Accelerators

These are snippets of Groovy logic that are deployed to partitions and designed to set up frequently used functionality within a Pricefx partition, typically various dashboards. The snippets are pulled directly from a Git repository.

Accelerator Packages

These are the most powerful components available in the PlatformManager Marketplace. They are pieces of code (Groovy, JSON definitions, REST API calls) that allow you to configure a partition to a required pricing application state (e.g. handling rebates, approval workflows, sales insight dashboards). They provide the ability for user interaction, such as entering inputs and uploading data.

Tools used on Pricefx projects

Tool

Area

Description

JIRA

Project management

The tool we use for managing projects from the beginning.

X-Ray

Test management

Tool which allows us to manage everything about testing in JIRA.

Cypress.io

Test automation

The tool we use for test automation at Pricefx. Test automation can be used during development, or it can be done based on the customer's request.

Postman

API testing

The tool we use when API testing is necessary on the project.

JMeter

Performance testing

Tool for performance testing; it is used when performance requirements are defined on the project, or when large data loads are expected and performance issues are possible.

Excel

General tool

General helper tool on the project; note that Excel is not a test management tool.

Note for Partners: You can use different tools if you want, but you should always respect the general principles described in our methodology.