Using API Test Designer

API Test Designer is an automated testing tool that aims to make API testing faster and more efficient. This article explains how to access the tool and describes the features it offers.

The main goal of API Test Designer is to streamline the testing process through automation, reducing the time and effort needed to verify APIs. An API (Application Programming Interface) is a set of protocols, tools, and definitions that lets different software applications communicate with each other; the "test designer" part of the name refers to the tool's capability to design and create tests for those APIs.

NOTE: You can find more information about the API Test Designer tool, including release notes, in its GitLab repository.

USE: To use API Test Designer, click here.

Features

The key features of API Test Designer include:

  • User-friendly interface for creating and managing API tests: The tool provides an intuitive and easy-to-use interface for designing and organizing API tests, making it accessible to users with varying levels of technical expertise.

  • Ability to define test scenarios: Users can specify and define different test scenarios, allowing them to simulate various conditions and behaviors to thoroughly test the API.

  • Capability to set up test data: The tool enables users to configure and prepare the necessary test data for conducting comprehensive API tests.

  • Execution of tests without the need for coding: Users can execute tests without having to write complex code, making the testing process more accessible to non-programmers.

These features collectively aim to simplify the testing process and make it more efficient, allowing users to create and execute API tests with ease.

Usage

To use the API Test Designer, follow these steps:

  • Create a New API Test: Start by creating a new API test in the API Test Designer tool.

  • Define Test Inputs: Define the inputs for your API test, including the request parameters, headers, and body.

  • Set Assertions: Set up assertions to validate the response from the API, ensuring that it meets the expected criteria.

  • Run the Test: Execute the API test to send the request to the API and receive the response.

  • Review Results: Review the results of the API test to verify that the API is functioning as expected and that the response meets the defined criteria.

  • Refine and Iterate: Based on the results, refine and iterate on the test as needed to ensure thorough API testing.
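The six steps above can be sketched in plain Python to show what the tool automates. The endpoint, field names, and pricing rule below are illustrative stand-ins, with the HTTP call stubbed out so the whole workflow is visible end to end:

```python
# A minimal sketch of the test workflow; the API itself is stubbed locally
# (the endpoint and field names are assumptions, not the tool's real schema).

def call_api(method, url, body):
    """Stub standing in for the real HTTP call the tool would make."""
    # Pretend the pricing endpoint multiplies unit price by quantity.
    return {"status": 200, "body": {"price": body["unitPrice"] * body["quantity"]}}

# Steps 1-2: create the test and define its inputs
test = {
    "method": "POST",
    "url": "/api/price-calculation",  # hypothetical endpoint
    "body": {"unitPrice": 9.5, "quantity": 4},
    # Step 3: assertions the response must satisfy
    "assertions": [
        lambda r: r["status"] == 200,
        lambda r: r["body"]["price"] == 38.0,
    ],
}

# Step 4: run the test
response = call_api(test["method"], test["url"], test["body"])

# Steps 5-6: review results; a failing assertion would prompt refinement
results = [check(response) for check in test["assertions"]]
print("PASS" if all(results) else "FAIL")
```

In API Test Designer the request definition, assertions, and run/review loop are configured through the interface rather than written as code.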

For a QA professional using Pricefx API Test Designer, here is a step-by-step example of a typical use case:

  • Step 1: Create a New API Test

    • Log in to the Pricefx API Test Designer tool and create a new API test for a specific API endpoint.

  • Step 2: Define Test Inputs

    • Specify the details of the API request, such as the endpoint URL, request method (e.g., GET, POST), request headers, and request body if applicable. For example, you may want to test the functionality of the pricing calculation API endpoint.

  • Step 3: Set Assertions

    • Define the expected response from the API by setting assertions. This could include checking for specific response codes, response body content, or headers. For the pricing calculation API, you might set assertions to validate that the calculated price matches the expected result based on predefined input data.

  • Step 4: Run the Test

    • Execute the API test to send the request to the API and receive the response.

  • Step 5: Review Results

    • After running the test, review the results to check if the API response meets the defined criteria. Verify that the pricing calculation API returns accurate and expected results.

  • Step 6: Refine and Iterate

    • Based on the results, refine and iterate on the test as needed to ensure thorough API testing. This may involve modifying the test inputs or assertions based on the specific use case and requirements.

By following these steps, you can effectively use Pricefx API Test Designer to create and execute API tests for your use case.
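For Step 3 in particular, an assertion on the pricing calculation might look like the following Python sketch. The response fields and the discount formula are assumptions for illustration, not Pricefx's actual schema:

```python
import math

def expected_price(list_price, discount_pct):
    """Expected result that the predefined test data predicts for the pricing logic."""
    return list_price * (1 - discount_pct / 100)

# Stubbed response, standing in for the real pricing endpoint's output
api_response = {"resultPrice": 85.0, "currency": "EUR"}

expected = expected_price(list_price=100.0, discount_pct=15.0)

# The assertion as the test would evaluate it: tolerate float rounding
assert math.isclose(api_response["resultPrice"], expected, rel_tol=1e-9)
print("price assertion passed")
```

Comparing with a tolerance rather than strict equality avoids spurious failures from floating-point rounding in calculated prices.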

Sample Scenario

The following is a summary of the API Test Designer features exercised in a typical testing and configuration management scenario:

  1. User Session Management:

    • Users can log in and out of any partition.

    • The setup is stored in the browser's local storage specific to each username/partition.

  2. Test Case Management:

    • Users can create, rename, delete, and reorder test cases.

  3. Test Step Management:

    • Steps can be added into test cases with options to add, rename, delete, and reorder them.

  4. Supported Steps:

    • Wait: Pauses the test for a specified number of milliseconds.

    • Run Logic: Executes logic on the partition.

    • Verify Logic Outputs: Checks the results of previously run logic.

    • Modify Data: Allows creation, updating, or deletion of data (with identifiers like P, PX, C, CX, PP).

    • Verify Data: Confirms the existence and values of data.

    • Create Quote: Calculates a quote without saving it to the partition.

    • Import from Excel: Supports copying and pasting inputs and line items from Excel.

    • Verify Quote Outputs: Checks the results of an existing or previously calculated quote.

    • Import to Excel: Supports copying and pasting outputs and line items into Excel.

  5. Test Case Execution:

    • Each test case reverts any changes made to the partition once it is finished.

  6. Configuration Export/Import:

    • Allows exporting the test configuration in JSON format.

    • Enables importing of previously exported configurations from JSON.
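The configuration export/import feature amounts to a JSON round trip. A minimal Python sketch shows the idea; the configuration schema below is an illustrative assumption, not the tool's actual export format:

```python
import json

# Hypothetical test configuration mirroring the step types listed above
config = {
    "testCases": [
        {
            "name": "pricing smoke test",
            "steps": [
                {"type": "Wait", "ms": 500},
                {"type": "Run Logic", "logic": "PriceCalculation"},
                {"type": "Verify Logic Outputs", "expect": {"price": 38.0}},
            ],
        }
    ]
}

# Export: serialize the configuration to JSON text
exported = json.dumps(config, indent=2)

# Import: parse it back and confirm nothing was lost in the round trip
imported = json.loads(exported)
assert imported == config
print("round trip ok")
```

A lossless round trip like this is what lets teams version-control exported configurations and share them between partitions.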

NOTE: These features make the application well suited to creating and managing automated tests for systems built around logic processing, data manipulation, and quoting. The ability to import from and export to Excel and JSON lets it interact with common data formats across various workflows, and the reverting feature ensures that each test case leaves the partition in its original state, which helps maintain a consistent testing environment.
