Standards and Guidelines (Integration)

Below is a list of recommendations on Pricefx data integration best practices:

Before Integration:

Local-Based Development

When customizing the code generated by our IM templates or developing integration solutions from scratch, Pricefx recommends local-based development, using IntelliJ to test and debug integrations on your own machine.

LEARN MORE: For more information on local-based development, click here.

Boomi vs SAP vs IM Templates vs Custom

The recommended integration approach can be chosen as follows:

Dell Boomi Adapter

This option should be selected when the customer is either familiar with Boomi or requests that it be used for their integration. It is a good path toward customer integration self-sufficiency and smaller incremental data loads.

SAP Adapter

This option can be selected when the customer has data available in SAP S/4HANA AND has the SAP Integration Suite installed. If the customer has all SAP components, this approach is recommended over IM templates since it allows the customer to manage their own integration maintenance.

IM Templates

If neither the Boomi nor the SAP Adapter can be used, then the next recommendation is the IM templates approach. This allows project teams to quickly build and maintain integrations with a minimal level of technical expertise.

IM Customization

In some cases, based on a variety of factors (e.g., data complexity, volumes, requirements), none of the other approaches will suffice. In that scenario, it is recommended that a completely customized integration plan be developed.

Use of IM Templates

Pricefx recommends the use of IM templates via the PlatformManager Marketplace as a starting point for your integration workflows. These templates can be used to accomplish a high percentage of integrations for standard projects.

LEARN MORE: For more information on the use of IM templates and how to employ them for integration, click here.


Recommended IM Templates

For the most common integration scenarios, Pricefx recommends these IM templates. For inbound files, the primary option is Import CSV from File to Pricefx and the secondary option is Import CSV from FTP to Pricefx. For outbound files, use Export Pricefx Table to CSV via FTP.

LEARN MORE: For more information on inbound templates, click here or for outbound templates, click here.


Data Readiness Methodology

We recommend the use of the Data Readiness Methodology (DRM) process for all integration projects and recognize the value that ensuring quality pricing data represents for our application.


During Integration:

IM template customization

It is quite possible that the XML code generated for data integration may need to be customized to meet customer requirements. If staging tables combined with a Calculated Field Set (CFS) are not an option, then the generated code can be modified and managed.


Inbound Data Volumes

We recommend these limitations for inbound data files:

- Initial data loads of up to 20 million rows

- Incremental daily data loads of hundreds of thousands of rows

- A restriction of 1 million rows per file
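As a rough sketch of how an oversized inbound extract could be pre-split to respect the 1-million-row-per-file restriction (a generic Python illustration, not a Pricefx tool; the file name and default are only examples):

```python
import csv
import itertools
from pathlib import Path

def split_csv(source: str, max_rows: int = 1_000_000) -> list[str]:
    """Split a large CSV into parts of at most max_rows data rows each,
    repeating the header row in every part. Returns the part file names."""
    parts = []
    with open(source, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        for i in itertools.count():
            chunk = list(itertools.islice(reader, max_rows))
            if not chunk:
                break
            part = f"{Path(source).stem}_part{i:03d}.csv"
            with open(part, "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(header)
                writer.writerows(chunk)
            parts.append(part)
    return parts
```

Splitting at the source keeps each delivered file within the stated limit while preserving the header for every part.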


Outbound Data Volumes

Scheduled exports of CSV files should be based on a specific time window, with no guarantee that exported data will not overlap between windows. Files are limited to 100,000 rows each.


Data formatting

If there is a need for data formatting, which includes cleansing, transforming, and validating, it is recommended that this be performed after the data has been loaded into temporary staging tables, via a Calculated Field Set (CFS) operation.
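As a language-neutral illustration of the kind of row-level cleansing and validation a CFS might apply to staged data (a Python sketch with hypothetical field names; actual CFS logic is configured inside Pricefx, not written this way):

```python
def cleanse_row(row: dict) -> dict:
    """Illustrative row-level cleansing: trim and normalize the key field,
    normalize the decimal separator, and reject invalid values.
    Field names ("sku", "listPrice") are hypothetical."""
    cleaned = dict(row)
    cleaned["sku"] = cleaned["sku"].strip().upper()
    price = float(cleaned["listPrice"].replace(",", "."))
    if price < 0:
        raise ValueError(f"negative price for {cleaned['sku']}")
    cleaned["listPrice"] = round(price, 2)
    return cleaned
```

The point of running this after staging, rather than inside the route, is that rejected rows stay visible in the staging table instead of silently disappearing mid-transfer.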


Use of Staging Tables

We recommend the use of staging tables as an intermediate step during data integration to facilitate the transformation and validation of data. These staging tables serve as intermediary storage for imported data to allow for cleansing, manipulation, verification, and transformation processes.

The main functions and benefits of temporary staging tables are as follows:

Data Transformation: staging tables allow for data transformation and manipulation before data is loaded into the target system. This includes tasks such as data cleansing, formatting, aggregation, and applying business rules. Staging tables allow us to perform complex data transformations and ensure the data is in the desired format and structure.

Data Mapping and Enrichment: staging tables provide a central location for mapping and enriching data from multiple sources. Data from different systems or sources can be loaded into staging tables and then mapped to the corresponding fields in the target system. They also enable data enrichment by incorporating additional information or look-up values from reference tables.

Data Validation and Error Handling: staging tables are a convenient place to perform data validation and error handling before data is loaded into the target system. During this process, data can be validated against predefined business rules or data quality standards, and any errors or discrepancies can be identified and handled. This ensures only clean and accurate data is migrated to Pricefx master tables.

Data Integrity and Auditing: loading data into staging tables provides an opportunity to maintain data integrity; we can perform validation checks, record audit trails, and track the history of data changes. Staging tables can serve as a control mechanism to ensure the accuracy and reliability of our data.

ETL Operations: staging tables can act as a staging area for Extract, Transform, Load (ETL) operations. After data is extracted from source systems and loaded into staging tables, it can be transformed and processed. This allows for efficient handling of large volumes of data and complex data transformations.
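The validate-then-promote pattern described above can be sketched as follows, using an in-memory SQLite table purely as a stand-in for a staging table (the table layout and business rule are hypothetical; actual Pricefx staging is managed by IntegrationManager):

```python
import sqlite3

# Stand-in staging table: raw rows land here first, unvalidated.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_price (sku TEXT, price REAL, valid INTEGER)")
rows = [("A1", 10.0), ("A2", -5.0), ("A3", 7.5)]
conn.executemany("INSERT INTO staging_price (sku, price) VALUES (?, ?)", rows)

# Validation pass: flag rows that satisfy the (hypothetical) business rule.
conn.execute("UPDATE staging_price SET valid = (price >= 0)")

# Only flagged-clean rows would be promoted to the target master table;
# rejected rows remain in staging for inspection and error handling.
clean = conn.execute(
    "SELECT sku, price FROM staging_price WHERE valid = 1"
).fetchall()
```

Keeping the rejected row ("A2") in the staging table is what makes the auditing and error-handling benefits above possible.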


Use of CFS for Data Transformations

We recommend the use of a CFS as a powerful tool for transforming, validating, and cleansing data in staging tables. During this phase, a CFS provides the benefits of flexibility, automation, consistency, standardization, reusability, maintainability, auditability, and tracing.


Use of inbound CSV files

Pricefx recommends the use of CSV files for inbound data integration because of these benefits: simplicity and universality; an efficient, lightweight format; ease of manipulation; system compatibility; data integrity; and an auditable format.


Use of outbound CSV files

Pricefx recommends the use of CSV files for outbound data integration because of these benefits: simplicity and universality; a non-proprietary, efficient, and lightweight format; ease of manipulation; system compatibility; data integrity and transparency; and ease of use for batch processing.


Task & Job Orchestration

When performing data integration, it is recommended that integration tasks be organized and scheduled in a predefined workflow to ensure the efficient flow of data and orderly execution. Orchestration involves defining the dependencies between tasks and jobs, the order of execution, error handling, and the monitoring of job status.
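The dependency-and-ordering idea can be sketched with Python's standard graphlib module; the job names and dependencies below are hypothetical, not actual IntegrationManager jobs:

```python
from graphlib import TopologicalSorter

# Hypothetical integration jobs, each mapped to the jobs it depends on.
jobs = {
    "load_products": set(),
    "load_customers": set(),
    "enrich_staging": {"load_products", "load_customers"},
    "publish_to_pricefx": {"enrich_staging"},
}

# static_order() yields a valid execution order respecting all dependencies
# (and raises CycleError if the dependency graph contains a cycle).
order = list(TopologicalSorter(jobs).static_order())
```

A scheduler built on this ordering would also attach error handling and status monitoring to each step, as described above.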

Use of Parameter tables

During integration efforts, we recommend the use of Company Parameter tables for these types of activities:

Configuration Management: allows us to centrally manage and configure various settings and parameters related to the integration process (e.g., defining data mappings, transformation rules, validation criteria).

Data Mapping: can be used to store attribute mappings between source data fields and target data fields. This allows us to update and modify the mappings without modifying the integration code.

Data Transformation: stores transformation rules and logic that need to be applied to the data held in our temporary staging tables. These rules could include calculations, data conversions, conditional statements, or other transformations needed to harmonize data between different systems.

Validation and Filtering: can be leveraged to define validation rules and filters to ensure the quality and integrity of the integrated data in staging tables or other data structures. We can specify criteria for a range of data validations (e.g., data type checks, range checks, or consistency checks).

Enrichment: can act as a look-up table containing reference data that is used to enrich or enhance the integrated data in our staging tables.

Dynamic Configuration: provides a dynamic configuration mechanism that allows us to update the behavior of our integration without modifying the code. This enables quick adjustments to integration logic, mappings, or rules based on changing business needs.
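As a minimal sketch of parameter-table-driven field mapping (the mapping entries and field names are hypothetical, and in practice the mapping would be maintained in a Company Parameter table rather than hard-coded):

```python
# Hypothetical mapping as it might be stored in a parameter table:
# source field name -> target field name.
FIELD_MAPPING = {
    "MATNR": "sku",
    "KUNNR": "customerId",
    "NETPR": "listPrice",
}

def map_row(source_row: dict, mapping: dict) -> dict:
    """Rename source fields to target fields per the mapping table;
    fields with no mapping entry are dropped."""
    return {
        target: source_row[src]
        for src, target in mapping.items()
        if src in source_row
    }
```

Because the mapping is data rather than code, adding or renaming a field means editing a table row, not redeploying the integration.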


PA Data Enrichment

To simplify the integration process, it is recommended that any enrichment requirements be performed after the data has been loaded into temporary staging tables and NOT be performed by customizing the XML code of the route. The use of a Calculated Field Set (CFS) process is recommended.


Use of Notifications vs Alerts

Notifications can be used to inform users and customers about the status or outcome of a data integration event or process and provide basic informative content (e.g., success or failure). Notifications are pre-set to watch only certain events that happen in the integration. If Notifications are not enough, users can use Alerts, which are more powerful (an Alert can be triggered by finding a specific pattern in the integration logfile) but require precise configuration so that Alerts are generated only when the issue actually occurs.
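The log-pattern style of Alert trigger can be illustrated with a small sketch (the pattern and log lines are hypothetical examples, not actual IntegrationManager log formats):

```python
import re

# Hypothetical alert rule: fire when a row-rejection error line appears.
ALERT_PATTERN = re.compile(r"ERROR .*rows rejected: (\d+)")

def scan_log(lines):
    """Return the rejected-row count for each log line matching the
    alert pattern; an empty result means no Alert should fire."""
    hits = []
    for line in lines:
        m = ALERT_PATTERN.search(line)
        if m:
            hits.append(int(m.group(1)))
    return hits
```

This illustrates why precise configuration matters: an overly broad pattern would match routine log noise and flood the team with false Alerts.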

After Integration:

Debugging

To incorporate debugging, we recommend running your integrations locally. A sequence of steps must be completed to prepare your data integration for local-based development, which is the first step in performing customizations and utilizing debugging capabilities.

Trouble Ticket & Support

If you have a problem with your IntegrationManager instance, PlatformManager access or any other issues related to your data integration routes, you can complete a trouble ticket via the Pricefx Help Desk here.

Troubleshooting

For recommendations and additional information on common troubleshooting and FAQ-related content, click here.