Status: This section is for PFX only.

In this section you will find the answers to the most frequently asked questions as well as some helpful troubleshooting tips.

Table of Contents

pfx-rest:post accepts only a String type payload, not InputStream

When you use a marshaler (e.g., JSON), the output is an InputStreamCache. pfx-rest:post does not support streams and requires the payload to be a String or byte[].

You can convert the body to a String using the following line:

Code Block
languagexml
 <convertBodyTo type="String"/>

After an upgrade to IM 2.x you have to specify the date format

In IntegrationManager 2.x the Groovy Joda framework is deprecated. To get the current date and time, use the Simple language and specify the ISO mask.

Code Block
languagexml
${date:now:yyyy-MM-dd'T'HH:mm:ssZ}
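For example, the expression can be used in a route to stamp the current time into a header (the header name exportTimestamp is only an illustration):

Code Block
languagexml
<setHeader headerName="exportTimestamp">
    <simple>${date:now:yyyy-MM-dd'T'HH:mm:ssZ}</simple>
</setHeader>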

How to add row number to an exported file

Sometimes you need to add a row number to an exported file.

First, save the properties PfxRecordIndex (an internal Pricefx Mapper property) and CamelSplitIndex in the first mapper. Usually you are inside a split at this point.

Code Block
languagexml
  <pfx:loadMapper id="IndexPriceCalculationLogicSimulation_TestingMapper">
    <pfx:body in="Currency" out="TI_COPY_RECORDS_WAERS" />
    <pfx:body in="Quantity" out="TI_COPY_RECORDS_KPEIN" />
    <pfx:body in="UOM" out="COPY_RECORDS_KMEIN" />
    <pfx:property in="PfxRecordIndex" out="FifcoPfxRecordIndex" />
    <pfx:property in="CamelSplitIndex" out="FifcoCamelSplitIndex" />
  </pfx:loadMapper>

Then you can calculate the row number in the second mapper. Set includeUnmappedProperties=true to include all the fields mapped in the first mapper.

Code Block
languagexml
  <pfx:loadMapper id="AddRowNumberMapper" includeUnmappedProperties="true">
    <pfx:groovy expression="return (body.FifcoPfxRecordIndex+1)*(body.FifcoCamelSplitIndex+1)" out="TI_COPY_RECORDS_KPOSN" />
  </pfx:loadMapper>

Then you need to call pfx-model:transform twice. That's all.

Code Block
languagexml
      <toD uri="pfx-api:fetch?filter=PGIExportFilter&objectType=PGI&typedId=${exchangeProperty.lpgId}&batchedMode=true&batchSize=10000"/>
      <split>
        <simple>${body}</simple>
        <toD uri="pfx-api:fetch?filter=PGIExportFilter&objectType=PGI&typedId=${exchangeProperty.lpgId}"/>
        <process ref="addPGIMetadataBasedFieldsProcessor"/>
        <to uri="pfx-model:transform?mapper=IndexPriceCalculationLogicSimulation_TestingMapper"/>
        <!-- add row number -->
        <to uri="pfx-model:transform?mapper=AddRowNumberMapper"/>
        <to uri="pfx-csv:marshal?delimiter=,&header=SI_APPLICATION,SI_CONDITION_TABLE,SI_CONDITION_TYPE,SI_DATE_FROM,SI_DATE_TO,SI_ENQUEUE,SI_MAINTAIN_MODE,SI_NO_AUTHORITY_CHECK,SI_SELECTION_DATE,SI_USED_BY_IDOC,SI_OVERLAP_CONFIRMED,SI_USED_BY_RETAIL,SI_I_KOMK_KONDA,SI_I_KOMP_KPOSN,SI_I_KOMP_MATNR,SI_KEY_FIELDS_KONDA,SI_KEY_FIELDS_MATNR,TI_COPY_RECORDS_MANDT,TI_COPY_RECORDS_KPOSN,TI_COPY_RECORDS_KAPPL,TI_COPY_RECORDS_KSCHL,TI_COPY_RECORDS_KDATU,TI_COPY_RECORDS_KRECH,TI_COPY_RECORDS_KBETR,TI_COPY_RECORDS_WAERS,TI_COPY_RECORDS_KPEIN,COPY_RECORDS_KMEIN,COPY_RECORDS_KOUPD,TI_COPY_RECORDS_STFKZ,TI_COPY_RECORDS_UPDKZ,TI_COPY_RECS_IDOC_KZNEP"/>
        <log message="LPG ${exchangeProperty.lpgId} exporting batch # ${exchangeProperty.CamelSplitIndex} to ${header.CamelFileName}"/>
        <!-- file name and folder is in the header -->
        <to uri="file://?fileExist=Append"/>
        <!-- clear the body to overcome an out-of-memory issue: split holds all bodies until the end of the iteration -->
        <setBody>
          <constant></constant>
        </setBody>
      </split>

Refresh Integration Information in PlatformManager

Sometimes the information displayed in PlatformManager does not match the IM that actually exists on the server. To refresh the information:

  1. In PlatformManager, go to Account > Integrations > Monitoring tab.

  2. Turn off the 'Monitored' option and click 'Save'. 

  3. Wait for about 10 seconds. 

  4. Turn on the 'Monitored' option again and click 'Save'.

...

Integration Account on PROD Partition

The integration account is a special account which IM uses to communicate with a PFX partition.

It should follow this convention:

  • login name: integration

  • email: integration@pricefx.eu

...

Removing IntegrationManager

If IM is no longer used, we need to remove it from the servers and release resources that it occupies.

To remove IM:

  • Stop the service.

  • Remove all installed files from the folders.

  • Remove the linked file in /etc/init.d.

  • Remove the Jenkins jobs.

  • Remove the IM configuration from PlatformManager.

Using Special Delimiter Characters

Example usage of Hex 14 for inbound and Hex 0A for outbound (core 1.1.18.3):

Code Block
csv-sap-import-delimiter=%14
csv-sap-export-eol=%0A

Code Block
languagexml
<!-- parse CSV, fields are taken from the header row -->
<to uri="pfx-csv:unmarshal?skipHeaderRecord=true&quoteDisabled=true&delimiter={{csv-sap-import-delimiter}}&header=CLIENT,CUSTOMER_ID,COMPANY_CODE,SOLD_TO_BUYING_GROUP"/>

<to uri="pfx-csv:marshal?header=Condition type,Sales Org&delimiter={{csv-sap-export-delimiter}}&camelSplitIndexAware=true&recordSeparator={{csv-sap-export-eol}}"/>

Sorting by DESC Order in Filter

To sort the result set in descending order, put a minus sign (-) before the name of the field to sort by:

Code Block
<pfx:filter id="mirrorEventFilter" sortBy="-typedId">
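Combined with the filter syntax used elsewhere on this page, a complete filter sorting newest-first could look like this (the field name lastUpdateDate is an assumption for illustration):

Code Block
<pfx:filter id="latestItemsFilter" sortBy="-lastUpdateDate">
<pfx:and/>
</pfx:filter>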

Duplicated Rows in Result Set When Fetching Data from PFX Table in Batch Mode

In some cases, duplicated rows appear in the result set when fetching data from a PFX table in batch mode. If this happens, check whether the sortBy attribute is set in the filter used for fetching.

Code Block
<pfx:filter id="fetchPriceListItemFilter" sortBy="id">
<pfx:and/>
</pfx:filter>

HTTPS Protocol

The latest IM version (1.1.16.5) uses only the HTTPS protocol for monitoring by PlatformManager.

The previous versions use HTTP.

...

Invalid Command Parameters Error

The error "org.hibernate.exception.SQLGrammarException: could not execute statement" is raised when IM uses the old API pfx:dsLoad to load data into a PX table.

Expand
titleStacktrace

---------------------------------------------------------------------------------------------------------------------------------------
net.pricefx.integration.api.NonRecoverableException: Invalid command parameters (org.hibernate.exception.SQLGrammarException: could not execute statement)
at net.pricefx.integration.api.PriceFxExceptionTranslator.doRecoveryActions(PriceFxExceptionTranslator.java:122)
at net.pricefx.integration.api.client.GeneralDatasourceApi.loaddata(GeneralDatasourceApi.java:1)
at net.pricefx.integration.api.client.PriceFxClient.loaddata(PriceFxClient.java:182)
at net.pricefx.integration.command.ds.Loaddata.execute(Loaddata.java:83)
at net.pricefx.integration.command.Command.process(Command.java:36)
at org.apache.camel.impl.ProcessorEndpoint.onExchange(ProcessorEndpoint.java:103)
at org.apache.camel.impl.ProcessorEndpoint$1.process(ProcessorEndpoint.java:71)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:148)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:548)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:109)
at org.apache.camel.processor.MulticastProcessor.doProcessParallel(MulticastProcessor.java:860)
at org.apache.camel.processor.MulticastProcessor.access$200(MulticastProcessor.java:86)
at org.apache.camel.processor.MulticastProcessor$1.call(MulticastProcessor.java:330)
at org.apache.camel.processor.MulticastProcessor$1.call(MulticastProcessor.java:316)
at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
at java.util.concurrent.FutureTask.run(FutureTask.java)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: io.swagger.client.ApiException: {"response":{"node":"node1","data":"Invalid command parameters (org.hibernate.exception.SQLGrammarException: could not execute statement)","errors":{"Message":{"errorMessage":"Invalid command parameters (org.hibernate.exception.SQLGrammarException: could not execute statement)"},"Status":{"errorMessage":"-1"},"HTTPCode":{"errorMessage":"400"},"Action":{"errorMessage":"/vaillantgroup/loaddata/PX"},"Timestamp":{"errorMessage":"Wed Mar 04 06:42:19 UTC 2020"}},"status":-1}}
at io.swagger.client.ApiClient.invokeAPI(ApiClient.java:615)
at net.pricefx.integration.api.ConfigurableApiClient.invokeAPI(ConfigurableApiClient.java:101)
at net.pricefx.integration.api.client.GeneralDatasourceApi.loaddata_aroundBody4(GeneralDatasourceApi.java:805)
at net.pricefx.integration.api.client.GeneralDatasourceApi.loaddata_aroundBody5$advice(GeneralDatasourceApi.java:45)
... 21 common frames omitted
13:42:20.733 | WARN | Camel (camel-1) thread #3 - AggregateTimeoutChecker | | | o.a.c.p.a.AggregateProcessor | Error processing aggregated exchange. Exchange[ID-LAPTOP-ANHDO-1583304000567-0-25]. Caused by: [net.pricefx.integration.api.NonRecoverableException - Invalid command parameters (org.hibernate.exception.SQLGrammarException: could not execute statement)]
net.pricefx.integration.api.NonRecoverableException: Invalid command parameters (org.hibernate.exception.SQLGrammarException: could not execute statement)
at net.pricefx.integration.api.PriceFxExceptionTranslator.doRecoveryActions(PriceFxExceptionTranslator.java:122)

One of the reasons for this error is the missing parameter businessKeysMaxLengths. This parameter was not mandatory before but now it is.

It is recommended to switch to the new API pfx-api:loaddata, where businessKeysMaxLengths is mandatory.

Error 404 displays on Logfile page in PlatformManager

When you register a new IM instance in PlatformManager and get an error as shown below, add this line into the IM properties file (if it is not there yet):

Code Block
logging.file=main.log

...

Missing roles when calling refreshCustomerDS

When you encounter the following error:

Code Block
net.pricefx.integration.api.NonRecoverableException: Not authorized for command: class net.pricefx.server.commands.datamart.RunDataLoad
    at net.pricefx.integration.api.PriceFxExceptionTranslator.doRecoveryActions(PriceFxExceptionTranslator.java:122)

Try adding these roles to the integration account:

...

header.PfxTotalInputRecordsCount is empty when using split loading

Code Block
languagexml
<split streaming="true">
    <tokenize token="\n" group="{{customer-batchSize}}"/>
    <to uri="pfx-csv:unmarshal?header=SALES_OFFICE,SHIPPING_POINT&skipHeaderRecord=true&delimiter=\t"/>
    <to uri="pfx-api:loaddata?mapper=customerMapper&objectType=C&businessKeys=customerId"/>
</split>

To fix it:

Code Block
<split streaming="true" strategyRef="recordsCountAggregation">
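Applied to the route above, the whole split could look like this (the endpoint parameters are copied from the example, not verified):

Code Block
languagexml
<split streaming="true" strategyRef="recordsCountAggregation">
    <tokenize token="\n" group="{{customer-batchSize}}"/>
    <to uri="pfx-csv:unmarshal?header=SALES_OFFICE,SHIPPING_POINT&skipHeaderRecord=true&delimiter=\t"/>
    <to uri="pfx-api:loaddata?mapper=customerMapper&objectType=C&businessKeys=customerId"/>
</split>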

Using delimiter \t in inbound CSV files

The delimiter \t can cause errors if core 1.1.16 is used. To avoid this error, upgrade to 1.1.16.1.

It is caused by:

Code Block
java.lang.IllegalArgumentException: String must have exactly a length of 1: \t
    at org.apache.camel.converter.ObjectConverter.toChar(ObjectConverter.java:118)

How to prevent pfx:csv-to-list from doubling each line when it ends with CR LF (\r\n)

Request the file owner to fix the line endings from "\r\n" to "\n".
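If the file owner cannot change the line endings, a possible workaround (a sketch, not a verified recipe) is to normalize them in the route before parsing:

Code Block
languagexml
<!-- normalize CRLF to LF; assumes the body is already a String -->
<transform>
    <groovy>request.body.replaceAll("\r\n", "\n")</groovy>
</transform>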

How to call (from IM) dataload with Type=Calculation and TargetType=DMDS

Code Block
<to uri="pfx-api:calculate?objectType=DM&targetName=DMDS.HistoryPriceListActive&label=PriceActive"/>

Not authorized for command: class net.pricefx.server.commands.configurationmanager.Set

Got this error:

Code Block
Caused by: io.swagger.client.ApiException: {"response":{"node":"node1","data":"Not authorized for command: class net.pricefx.server.commands.configurationmanager.Set","errors":{"Message":{"errorMessage":"Not authorized for command: class net.pricefx.server.commands.configurationmanager.Set"},"Status":{"errorMessage":"-1"},"HTTPCode":{"errorMessage":"401"},"Action":{"errorMessage":"/deanfoods-dev/configurationmanager.set/IM_DataLoadReport"},"Timestamp":{"errorMessage":"Mon Apr 22 12:15:39 UTC 2019"}},"status":-1}}
at io.swagger.client.ApiClient.invokeAPI(ApiClient.java:615)
at net.pricefx.integration.api.ConfigurableApiClient.invokeAPI(ConfigurableApiClient.java:101)
... 21 common frames omitted
19:15:55.500 | WARN | Camel (camel-1) thread #12 - timer://emailReport-TEST-timer | | | o.a.c.component.timer.TimerConsumer | Error processing exchange. Exchange[ID-LAPTOP-ANHDO-1555935326620-0-1]. Caused by: [net.pricefx.integration.api.NonRecoverableException - Not authorized for command: class net.pricefx.server.commands.configurationmanager.Set]
net.pricefx.integration.api.NonRecoverableException: Not authorized for command: class net.pricefx.server.commands.configurationmanager.Set
at net.pricefx.integration.api.PriceFxExceptionTranslator.doRecoveryActions(PriceFxExceptionTranslator.java:100)

You need to check the role in the PFX UI: the General Admin (without User Management) role is required.

If you use fetch/AP or update/AP, the Data Integration role should be sufficient.

How to skip records which cannot be parsed during import

Q: I have to import some old exports with no way of correcting the data and I need to skip the records which cannot be parsed (the mapper throws an error). Is there a simple way to do that?

A: It is supported since version 1.1.16.

Is there a way to look up a DF name based on the label?

Code Block
private Map<String, Object> fetchDataLoadItem(String label, String type) throws ApiException {

        FetchFilterBuilder calculationDataLoadFilterBuilder = new FetchFilterBuilder()
                .with(equal("type", type));

        calculationDataLoadFilterBuilder.with(equal("label", label));

        Response dataLoadsResponse = priceFxClient.getDatamartApi().getdataloads(calculationDataLoadFilterBuilder.create());
        List<Object> dataLoads = (List<Object>) dataLoadsResponse.getResponse().getData();

        if (dataLoads.size() > 1) {
            throw new RuntimeException(String.format("More than one dataLoad object was found for label='%s'!", label));
        }

        if (dataLoads.isEmpty()) {
            throw new RuntimeException(String.format("dataLoad object was not found for label='%s'!", label));
        }

        return (Map<String, Object>) dataLoads.get(0);
    }

How to set "exclude hidden attributes" in export to XLS files in IM code

We can get the metadata of the table; it may contain the information whether an attribute (column) is hidden.

For a price list, we can get a log like this:

Code Block
{version=0, typedId=7245.PGIM, fieldName=attribute5, label=Brand, fieldType=2, requiredField=false, elementName=Brand, priceGridId=157, hidden=false, manual=false, createDate=2019-03-07T15:12:15, createdBy=82}

How to update an approved PLI

Code Block
def data = ["typedId" : item.typedId, "key2" : item.get(getAttributeName("key2")), "sku" : item.get(getAttributeName("sku")), "priceGridId" : oneLpg.id, (getAttributeName("ExportStatus")): "EXPORTED"]
api.addOrUpdate("XPGI", data)

You don't need key2.
getAttributeName resolves the element name to its attributeXX field.

How to track changes in a PX table

The audit log works only for PP tables. Change tracking in PX and CX tables is not supported.

We could filter changed rows by a LastUpdatedDate > last_check_date condition; last_check_date could be stored in a PP table.

Data Integration role and changing an approved quote via API

In PFX 3.6.2 anyone can change anything, as it used to be. In 3.6.3 this will be restricted to users with the Data Integration role.

Triggering PADATALOAD_COMPLETED

Q: When/how can we catch the event that is fired after a Datamart refresh?

A: IM code and/or a PFX UI button click creates jobs for the Datamart refresh. The PFX scheduler schedules the execution of the jobs according to the available resources. You receive an event when the jobs finish successfully; this is why the event name is XXXX_COMPLETED.

Could not locate PropertySource: I/O error on GET request for http://localhost:8888/application

Set the configuration value:

Code Block
health.config.enabled=false

net.pricefx.integration.api.NonRecoverableException: Not authorized for command: net.pricefx.domain.EventTask

The problem is the access rights of the user – your user cannot read event data from PFX. jsonWebToken should have no impact on this.

applicationContext is null in PfxSalesforceProducer class

Try to add into your properties file:

Code Block
integration.configuration.enabled=true

Data fields in IM UI configuration

...

  • Instance Management URL – Address where your IM listens to requests from the IM UI.
    Example: http://int1.eu.pricef.eu:8080
    This is your IM server address and port. The port can be set by the server.port property.

  • Instance Management Username – Set by the security.user.name property.

  • Password – Set by the security.user.password property in your IM config file.

Can I start calculation of CFS and pass parameters to the underlying calculation?

As of 28/02/2019, this feature is not available.

https://qa.pricefx.eu/pricefx-api/json/develop.html#!/cfs/calculate

How to fetch data from Datamart and iterate over it and update some values

The correct API is:

Code Block
getPriceFxClient(exchange).getDatamartApi().fetch(typedId, fetchRequest);

Java:

Code Block
public void fetchReadyTransactions(Exchange exchange) throws ApiException {

        try {

                FilterCriteria filterDate = new FilterCriteriaBuilder()
                        .with(lessOrEqual("DispatchDate", "12/04/2018"))
                        .and()
                        .with(greaterOrEqual("DispatchDate", "12/01/2018"))
                        .create();

                FetchRequest fetchRequest = new FetchRequest();

                // set filters
                fetchRequest.setData(filterDate);

                FetchResponseData response = priceFxClient.getDatamartApi().fetch("ReadyTransactions", fetchRequest).getResponse();

                exchange.getOut().setBody(response.getData());

        } catch (ApiException apiException) {
            PriceFxExceptionTranslator.aspectOf().doRecoveryActions(apiException);
            throw apiException;
        }
    }

XML:

Code Block
languagexml
<!-- fetch ReadyTransactions -->
<to uri="bean:apiService?method=fetchReadyTransactions(${exchange})" />
<!-- loop over transactions and fetch pricing info -->
<split>
    <simple>${body}</simple>
    <log message="split ${body}"/>
</split>

How to get property within Groovy in Camel

The following does not work:

Code Block
<groovy>new org.joda.time.DateTime().minusDays(${deleteOtherThan}).toDate()</groovy>

...

Code Block
<groovy>new org.joda.time.DateTime().minusDays(exchange.getContext().resolvePropertyPlaceholders("{{deleteOtherThan}}").toInteger()).toDate()</groovy>

Example of route calling Datamarts refresh

Code Block
languagexml
<!-- Datamart refresh -->
<pfx:dmRefresh id="refresh_DM_Fee_Schedule" dataMartName="DM.FeeSchedule"/>
<to uri="refresh_DM_Fee_Schedule"/>

How to track value changes in a PP table

We need to audit the changes in price parameters (PP) tables.

  • Load the whole PP table at the start and keep it in memory (if it is not too big).

  • Data Change Request (DCR) can also be a solution; we use it on Bosch.

    SE can set up DCR.

How to access Advanced Configuration in Pricefx

Code Block
priceFxClient.getConfigurationApi().set("key name", your payload);
priceFxClient.getConfigurationApi().get("key name");

Who is responsible for archiving/cleaning data in the outbound folder?

It should be the customer. The main reason is that we do not know whether the files we sent (outbound) have already been processed on their side. For instance, if we clean up/archive a file that the customer has not picked up yet, it becomes a "missing data" issue. Therefore we give the client the permission to delete files.

However, it's flexible. For example, in SENETIC, IM does the clean-up/archive job. We asked the customer to move the outbound files to another place after they processed them. After that, IM archives the outbound files at 12:00 AM every night.

Examples of projects using REST client/server

  • REST server example: project firthnz
    file outbound/quoteRoutes.xml

  • REST server example: project Haefele
    file net.pricefx.integration.processor.PfxBasicAuthProcessor

How to avoid processing files that have not been fully transferred yet?

Use one of the following:

  • Done file or other file parameter

  • Bean – net.pricefx.integration.filter.FileModifiedFilter

    In the URL, use the filter parameter:

    Code Block
    maxMessagesPerPoll=1000&eagerMaxMessagesPerPoll=false&sortBy=file:name&delay=60000&filter=#fileModifiedFilter

    In camelcontext:

    Code Block
    <bean id="fileModifiedFilter" class="net.pricefx.integration.filter.FileModifiedFilter">
    	<property name="timeoutMinutes" value="${csv-file-extra-parameters-fileReadLockTimeoutMinutes}"/>     
    </bean>
  • The readLock option works only for the SFTP component (on the FILE component on Linux it does not work).

How to change log levels on your project in runtime

Note that it will create more data in log files / send more data to ELK and it could slow down the application.  

...

Splitting a list of messages into 5000-message chunks

Q: How can I split a list of messages (body) into e.g. 5000 messages per chunk using a standard Camel component? I would like to export CSV files containing no more than 5000 rows into two different locations (SFTP, audit).

A: On Dana we can see usage of:

Code Block
<aggregate strategyRef="aggregatorStrategy" completionSize="5000" completionTimeout="10000">
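A more complete sketch of the aggregation (the correlation expression and the output endpoint are illustrative assumptions, not taken from the Dana project):

Code Block
languagexml
<!-- group incoming messages into chunks of 5000, or flush after 10 s -->
<aggregate strategyRef="aggregatorStrategy" completionSize="5000" completionTimeout="10000">
    <!-- correlate all messages into a single group -->
    <correlationExpression>
        <constant>true</constant>
    </correlationExpression>
    <!-- each aggregated chunk is written out here -->
    <to uri="file://outbound?fileExist=Append"/>
</aggregate>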

DsUniqueName in pfx-api:loaddata

Q: For the pfx-api:loaddata component, must DsUniqueName be specified when loading objectType=DM? Is dataSourceName not sufficient?

A: Yes, DsUniqueName must be specified. It works the same way as the dmLoad object.

Events exist in PFX but are not fetched to IM (missing events)

A PFX partition uses the PULL event model, so you need to check whether the Advanced Configuration option disabledEventProcessing is set.

...

Notification when a quote is deleted

Q: Is there a way to get notified when e.g. a quote is deleted? Is it possible to trigger some code to generate a custom event or something like this?

A: You can use the ITEM_UPDATE_Q event type; in case of deletion, its 'operation' is 'DELETE'.

Different handling of a property substitution in the property file

You could run into issues when IM starts correctly but during processing throws an exception like this:

Code Block
Illegal character in path at index 0: {{pfx-coesia-default.url}}/norden-qa/loaddata/P

Spring resolves the substitution using ${ } as the property placeholder. The {{ }} format is used by Camel when it resolves endpoint URIs.

Example
Code Block
// spring resolved property 
pfx-coesia.url=${pfx-coesia-default.url}

// camel resolved property for URI endpoint
rootFolder-inbound={{customer.directory}}/qa/inbound

Magic properties for an instance of the PFX client

You do not need to instantiate the PFX client bean in your camel-context.xml. It is enough to have properly named properties for PFX URL.

For details see PfxClient

Example
Code Block
pfx.url=https://cox.pricefx.eu/pricefx
pfx.partition=readyautodev
pfx.username=integration
pfx.password=************

Incoming REST request implementation

Q: Do you have an implementation of an incoming call of a REST service to IM?

A: Yes, see the example below.

Route
Code Block
languagexml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:pfx="http://www.pricefx.eu/schema/pfx"
       xmlns:camel="http://camel.apache.org/schema/spring"
       xmlns:spring-security="http://www.springframework.org/schema/security"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
       http://camel.apache.org/schema/spring http://camel.apache.org/schema/spring/camel-spring.xsd
       http://www.pricefx.eu/schema/pfx http://www.pricefx.eu/schema/pfx.xsd
       http://camel.apache.org/schema/spring-security http://camel.apache.org/schema/spring-security/camel-spring-security.xsd
       http://www.springframework.org/schema/security http://www.springframework.org/schema/security/spring-security.xsd">


    <bean id="apiResponseUtil" class="net.pricefx.integration.util.ApiResponseUtil"/>
    <bean id="competitionDataValidator" class="net.pricefx.integration.processor.CompetitionDataValidator"/>
    <bean id="pfxBasicAuthProcessor" class="net.pricefx.integration.processor.PfxBasicAuthProcessor"/>

    <routeContext id="competitionRestRoutes" xmlns="http://camel.apache.org/schema/spring">
        <route id="restRoutes-processRequest">
            <from uri="direct:processRequest"/>
            <doTry>
                <!--authenticate and authorize request-->
                <!--get api user credentials and groups-->
                <setHeader headerName="authPartition">
                    <simple>{{pfx.partition}}</simple>
                </setHeader>
                <setHeader headerName="authUri">
                    <simple>{{pfx.url}}</simple>
                </setHeader>
                <process ref="pfxBasicAuthProcessor"/>
                <!--authenticate end-->
                <process ref="competitionDataValidator"/>
                <to uri="direct:createCompetition"/>

                <filter>
                    <simple>${body} == null</simple>
                    <removeHeaders pattern="*"/>
                    <setHeader headerName="Exchange.HTTP_RESPONSE_CODE">
                        <constant>404</constant>
                    </setHeader>
                    <to uri="bean:apiResponseUtil?method=createErrorResponse(4004, 'No records found', 404)"/>
                </filter>

                <doCatch>
                    <exception>java.lang.Exception</exception>
                    <handled>
                        <simple>true</simple>
                    </handled>
                    <removeHeaders pattern="*"/>
                    <to uri="bean:apiResponseUtil?method=createErrorResponseFromEx(*)"/>
                    <marshal ref="gson"/>
                </doCatch>
            </doTry>

        </route>

    </routeContext>
    <restContext xmlns="http://camel.apache.org/schema/spring" id="competitionRest">
        <rest path="/api/competition">
            <description>Service for Competition Data</description>
            <post consumes="application/json" >
                <responseMessage code="200" message="Success"/>
                <responseMessage code="400" responseModel="net.pricefx.integration.model.ApiErrorResponse" message="Invalid parameter entry"/>
                <responseMessage code="401" responseModel="net.pricefx.integration.model.ApiErrorResponse" message="Authorization failure"/>
                <responseMessage code="500" responseModel="net.pricefx.integration.model.ApiErrorResponse" message="Unexpected Error"/>

                <route>
                    <to uri="direct:processRequest"/>
                </route>
            </post>
        </rest>
    </restContext>
</beans>

Calling a Camel processor in a separate transaction

Q: Can I call a Camel processor in its own transaction?

A: See a Camel example here: https://github.com/apache/camel/blob/master/components/camel-spring/src/test/java/org/apache/camel/spring/interceptor/MixedTransactionPropagationTest.java

You have to call the processor in a new route.

Camel Processor Snippet
Code Block
languagejava
from("direct:mixed2")
        // tell Camel that if this route fails then only rollback this last route
        // by using (rollback only *last*)
        .onException(Exception.class).markRollbackOnlyLast().end()
        // using a different propagation: PROPAGATION_REQUIRES_NEW
        .transacted("PROPAGATION_REQUIRES_NEW")
        .your_processor ...        

LoadData with businessKeys with more than 5 columns

Q: There is an error in dsLoad when the businessKeys attribute has more than 5 columns.

Error message
Code Block
{
  "Message":{
    "errorMessage":"Invalid command parameters (org.hibernate.exception.SQLGrammarException: could not execute statement)"
  },
  "Status":{
    "errorMessage":"-1"
  },
  "HTTPCode":{
    "errorMessage":"400"
  },
  "Action":{
    "errorMessage":"/<partition>/loaddata/PX"
  },
  "Timestamp":{
    "errorMessage":"Wed Dec 19 08:20:12 UTC 2018"
  }
}

A: Since Pricefx 3.3 Cosmopolitan, you can use an optional request parameter to adjust the length of the columns defined in the businessKeys attribute. For more details see the Jira issue PFCD-3594.

Code Block
{....
"maxJoinFieldsLengths": [
                                {"joinField": "sku", "maxLength": 3},
                                {"joinField": "name", "maxLength": 4},
                                {"joinField": "attribute1", "maxLength": 1},
                                {"joinField": "attribute2", "maxLength": 1},
                                {"joinField": "attribute3", "maxLength": 1},
                                {"joinField": "attribute4", "maxLength": 1},
                                {"joinField": "attribute5", "maxLength": 3}
]
...}

Removing old data automatically

To remove old data automatically, you can use backup routes. 

Code Block
<pfx:backupRoute id="backupRouteCustomer"
                 cronExpression="${backup.cronExpression}"
                 sourceDirectory="${backup.sourceDirectory}/C"
                 destinationDirectory="${backup.destinationDirectory}/C"/>
<pfx:deleteBackupRoute id="deleteBackupRouteCustomer"
                       sourceDirectory="${backup.destinationDirectory}/C"
                       ageInDays="${deleteBackup.ageInDays}"
                       cronExpression="${deleteBackup.cronExpression}"/>

Connecting to SAP using WSDL

The old way to connect was to use RFC (a direct connection to SAP); this is no longer recommended.

New projects use only web services: we do not call SAP ERP directly, we call SAP PI and it calls SAP ERP.

We also export IDocs to SFTP; the customer's SAP PI processes such data and again calls SAP.

In SAP, they should be able to wrap SAP RFCs with web services as well.

Why does the Q_START_DATE field format change?

...

Q: Any suggestions why the Q_START_DATE field format changes?

A: Check the original value in the CSV file and the converter for the date type.

Converter example
Code Block
<bean id="stringToDate" class="net.pricefx.integration.mapper.converter.StringToDate">
    <property name="format" value="yyyy-MM-dd"/>
</bean>

Multiple PFX connections in one IM

Q: I need to process an event in one IM and save the data into another partition / message another IM.

A: You can connect from one IM to many partitions and call the API. You need to set the partitionPfxApi header. As a reference, check out the Toys "R" Us project.

Code Block
<setHeader headerName="partitionPfxApi">
    <constant>tru-ce</constant>
</setHeader>

This code will call the tru-ce PFX connection.

So you can load a list of partition names, iterate over them with a split, and set this header before the store logic.
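The iteration could be sketched like this (the direct:storeToPartition endpoint is hypothetical; it stands for your store logic):

Code Block
languagexml
<!-- body holds the list of PFX connection names loaded earlier -->
<split>
    <simple>${body}</simple>
    <!-- inside the split, the body is one partition name -->
    <setHeader headerName="partitionPfxApi">
        <simple>${body}</simple>
    </setHeader>
    <to uri="direct:storeToPartition"/>
</split>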

Backing up whole directory structure

Q: I would expect that the whole directory structure is backed up under integration.backup.source but it seems that it only accepts files. Is it possible to back up the whole directory tree?

A: Version 1.1.1 does not support nested directories. The next version will add support for WS and a list of directories. If WS is enabled, backup creates a backup route for all WS automatically.

Getting dashboard data

Q: How can IM fetch calculation result from a dashboard?

A: priceFxClient.getFormulaApi().executeformula("DashboardName", new FormulaExecuteRequest());

Response sample
Code Block
{
  "response":{
    "node":"us-node1",
    "data":[
      {
        "resultName":"PricelistReport",
        "resultLabel":"Pricelist Report",
        "result":{
          "entries":[
            {
              "Operating Unit":"TMA",
              "Pricelist Name":"TAEC-CSP-XXX-T0P10440001-USD-001",
              "Currency":"USD",
              "Pricelist Start Date":null,
              "Pricelist End Date":null,
              "Item Name":"SDFAM03GEB01",
              "UOM":"Ea",
              "List Price":111,
              "Quarter":"2017CQ3",
              "Line Start Date":null,
              "Line End Date":null
            },
            {
              "Operating Unit":"TMA",
              "Pricelist Name":"TAEC-CSP-XXX-T0P10440001-USD-001",
              "Currency":"USD",
              "Pricelist Start Date":null,
              "Pricelist End Date":null,
              "Item Name":"SDFAM03GEB01",
              "UOM":"Ea",
              "List Price":222,
              "Quarter":"2017CQ4",
              "Line Start Date":null,
              "Line End Date":null
            },
            {
              "Operating Unit":"TMA",
              "Pricelist Name":"TAEC-CSP-XXX-T0P10440001-USD-001",
              "Currency":"USD",
              "Pricelist Start Date":null,
              "Pricelist End Date":null,
              "Item Name":"SDFAM03GEB01",
              "UOM":"Ea",
              "List Price":333,
              "Quarter":"2018CQ1",
              "Line Start Date":null,
              "Line End Date":null
            },
            {
              "Operating Unit":"TMA",
              "Pricelist Name":"TAEC-CSP-XXX-T0P10440001-USD-001",
              "Currency":"USD",
              "Pricelist Start Date":null,
              "Pricelist End Date":null,
              "Item Name":"SDFAM03GEB01",
              "UOM":"Ea",
              "List Price":444,
              "Quarter":"2018CQ2",
              "Line Start Date":null,
              "Line End Date":null
            },
            {
              "Operating Unit":"TMA",
              "Pricelist Name":"TAEC-CSP-XXX-T0P10440001-USD-001",
              "Currency":"USD",
              "Pricelist Start Date":null,
              "Pricelist End Date":null,
              "Item Name":"SDFAM00EXB01",
              "UOM":"Ea",
              "List Price":444,
              "Quarter":"2017CQ3",
              "Line Start Date":"2017-07-01",
              "Line End Date":"2017-09-30"
            },
            {
              "Operating Unit":"TMA",
              "Pricelist Name":"TAEC-CSP-XXX-T0P10440001-USD-001",
              "Currency":"USD",
              "Pricelist Start Date":null,
              "Pricelist End Date":null,
              "Item Name":"SDFAM00EXB01",
              "UOM":"Ea",
              "List Price":555,
              "Quarter":"2017CQ4",
              "Line Start Date":"2017-10-01",
              "Line End Date":"2017-12-31"
            },
            {
              "Operating Unit":"TMA",
              "Pricelist Name":"TAEC-CSP-XXX-T0P10440001-USD-001",
              "Currency":"USD",
              "Pricelist Start Date":null,
              "Pricelist End Date":null,
              "Item Name":"SDFAM00EXB01",
              "UOM":"Ea",
              "List Price":666,
              "Quarter":"2018CQ1",
              "Line Start Date":"2018-01-01",
              "Line End Date":"2018-03-31"
            },
            {
              "Operating Unit":"TMA",
              "Pricelist Name":"TAEC-CSP-XXX-T0P10440001-USD-001",
              "Currency":"USD",
              "Pricelist Start Date":null,
              "Pricelist End Date":null,
              "Item Name":"SDFAM00EXB01",
              "UOM":"Ea",
              "List Price":777,
              "Quarter":"2018CQ2",
              "Line Start Date":"2018-04-01",
              "Line End Date":"2018-06-30"
            }
          ],
          "columns":[
            "Operating Unit",
            "Pricelist Name",
            "Currency",
            "Pricelist Start Date",
            "Pricelist End Date",
            "Item Name",
            "UOM",
            "List Price",
            "Line Start Date",
            "Line End Date"
          ],
          "defaultFormat":null,
          "columnFormats":null,
          "columnTooltips":null,
          "enableClientFilter":false,
          "title":null,
          "preferenceName":null,
          "onRowSelectEvents":{

          },
          "resultType":"MATRIX"
        },
        "warnings":null,
        "alertMessage":null,
        "alertType":null,
        "displayOptions":16,
        "formatType":null,
        "suffix":null,
        "resultType":"MATRIX",
        "cssProperties":null,
        "userGroup":null,
        "resultGroup":null,
        "overrideValueOptions":null,
        "overrideAllowEmpty":true,
        "overridable":false,
        "overridden":false,
        "resultDescription":null
      }
    ],
    "status":0
  }
}

Architecture implementation – document sample

For a description of how to implement the architecture on a project, see /wiki/spaces/CLI/pages/83656777.

Solenis uses SAP integration.

IOException occurs when quoteChar (" by default) is inside field value

Q: How do we solve a problem with double quotes ("") inside a cell value which is itself encapsulated by quotes (")?

A: Either escape all the double-quote characters inside the field value, or disable the quote mode to make IM accept the quote " as part of the string.

DisabledQuote mode
Code Block
<bean id="productFormat" class="org.apache.camel.model.dataformat.CsvDataFormat">
    <property name="useMaps" value="true"/>
    <property name="skipHeaderRecord" value="false"/>
    <property name="header" ref="productHeader"/>
    <property name="delimiter" value=";"/>
    <property name="disabledQuote" value="true"/> <!-- this option will ignore quote char -->
</bean>

Support for fetch to core web services

Since version 1.1.3 the following entities can be fetched: P, PX, PPV, C, CX and DM (Datamart, Data Source, Data Feed).

Code Block
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
    <soap:Body>
        <Fetch xmlns="http://webservices.pricefx.eu/ProductService">
            <Options>
                <startRow>0</startRow>
                <endRow>500</endRow>
                <countOnly>false</countOnly>
                <resultFields>sku,attribute1</resultFields>
                <sortBy>attribute1</sortBy>
                <returnNullFields>true</returnNullFields>
                <filter>
                    <and>
                        <criterion fieldName="label" operator="equals" value="name2"/>
                    </and>
                </filter>
            </Options>
        </Fetch>
    </soap:Body>
</soap:Envelope>

Using header values in Groovy expression in loadMapper

Q: Is it possible to use the header values in a Groovy expression in loadMapper? For example:

Code Block
<groovy expression="body.ProductSKU + '_'+ (header.loadDateCompetitors.getMillis() / 1000)" out="attribute9" />

A: Try one of these options:

  • headers.loadDateCompetitors

  • The following code sample:

    Code Block
    <groovy expression="body.Sold_to_party + ':' + body.Condition_type + ':' + body.Condition_Table + ':' + body.Sales_Org
                + ':' + body.Validity_start_date.toString() + ':' + body.Validity_end_date.toString()
                + ':' + body.Material_pricing_group + ':' + body.Price_list_type + ':' + in.headers.dateOfImport"
                                             out="UUID"/>

Using from(file).to(file) to transfer large files

Q: I have an issue with from(file).to(file) used to transfer large files. Apparently, this method tries to load the whole file into memory (or at least parts of it), which results in the "OutOfMemoryError: Requested array size exceeds VM limit" error.

A: Use .split(body().tokenize("\n")).streaming() instead.
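
In the XML DSL, the same streaming split might look like this (the direct:processLine endpoint is an assumption):

Code Block
languagexml
<split streaming="true">
    <tokenize token="\n"/>
    <to uri="direct:processLine"/>
</split>

With streaming="true" Camel processes the input line by line instead of materializing all splits in memory.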

A2: You can use a Linux command directly (an example of scp to SFTP):

Code Block
<recipientList>
	<simple>exec:scp?useStderrOnEmptyStdout=true&amp;args=-P {{fileOutPort}} -i {{app.home}}/externpricefx.key ${header[CamelFilePath]} {{fileOutLocation}}/${header[fileOutLocation]}/{{fileOutSubdirectory}}/{{ftp.environmentDir}}</simple>
</recipientList>      

Adding templateName param to quote > fetchpdf API

Q: How can I add the templateName parameter to quote > fetchpdf API? I need to select a template for PDF export but I cannot find any API supporting this.

A: There are the following options:

  • Use invokeAPI:

    Code Block
    post /quotemanager.fetchpdf/{quote uniqueName}?templateName={templateName}
  • Or as a workaround, take the fetchPDF method from pricefx-api, modify it and use in your processor. 

  • Or ask the IM team to add this method into IM.

Dynamic mapping CSV headers to PX fields – NOT SUPPORTED

Q: Is there a way to dynamically map CSV headers to PX fields? I.e. a header with the name "attribute1" will get mapped to attr1 in PX, regardless of its position in the CSV.

A: No, this is not supported. You can submit a Jira ticket for the IM team to implement it.

A2: Implemented in the Schneider project: DynamicHeaderCvsParserProcessor

Switch in Camel

Code Block
<choice>
    <when>
        <simple>${header[MaterialNo]} != ''</simple>
        <log message="Material master WS message for MaterialNo '${header[MaterialNo]}' received." />
        <multicast>
            <to uri="direct:archiveRequest"/>
            <to uri="direct:processCalculation"/>
            <to uri="direct:processSales"/>
            <to uri="direct:processMaster"/>
        </multicast>
    </when>
    <otherwise>
        <setFaultBody>
            <constant>no MaterialNo found in the request</constant>
        </setFaultBody>
    </otherwise>
</choice>

Error message 'sun.security.validator.ValidatorException: PKIX path building failed'

Q: We are getting the below error when we try to send data to the Pricefx QA server.

...

For your reference: you can download a new certificate (in Chrome) and import it with keytool. If the customer uses our web services, they need to update the certificate of our HTTP proxy.

Invalid command parameters: Lock wait timeout exceeded

Q: How can I fix the following error which I got when trying to load data into a PX table?

Invalid command parameters ((conn:4439655) Lock wait timeout exceeded; try restarting transaction

A: I would avoid parallel loaddata requests to the same target table, although feeding different (physical) tables in parallel is probably completely fine.

The change at the Solenis project was twofold:

  1. A DB level index that matches the business key must exist (someone must create it manually). It probably applies only to PX/CX since all other types have fixed keys (and hence indexes exist).

  2. The fix I committed: when doing loaddata based on join keys, PFX was generating a join condition which in some cases caused the DB planner not to pick the right DB index. We changed it so that a simpler query is generated (with some DB-specific tweaks) and the planner can now always select (and use) the right index. (This fix is in master and will be part of the next hot patch, probably 3.3.6.)

According to my tests, Oracle based instances behaved the same way as MariaDB instances. 

On Solenis, loaddata took seconds to minutes, now it takes 150 ms per request.

Using PIE for JMS consuming

Q: Does anybody use PIE for JMS consuming as in MS?

A: Bosch uses PIE with JMS and ActiveMQ.

Configuring number of consumers on PIE

If there is only one consumer, you will see this:

Code Block
<from uri="seda://setStatusOnPL?concurrentConsumers=1" />

...

Code Block
<from uri="activemq://Consumer.{{regionName}}.VirtualTopic.updateCustomerRefData?asyncConsumer=true"/>

Inserted product does not display

Q: I have an issue with: <pfx:dsLoad id="loadProduct" objectType="P" businessKeys="sku" mapper="productMapper"/>

The CSV contains 2 updates and 1 insert; the updates work fine but the insert does not go through. Any ideas?

A: Rule out browser caching (try a different browser) and:

  • Check the filter on the user perspective.

  • Check the filter on preferences.

businessKeys for PX

Q: If I have businessKeys for a PX defined in PFX, is it enough to commit them in integration for it to work?

A: Yes, if you use detectJoinFields = true.

Code Block
if (StringUtils.isEmpty(businessKeys)) {
    component.addPropertyValue("detectJoinFields", true);
    component.addPropertyValue("businessKeys", null);
} else {
    component.addPropertyValue("detectJoinFields", false);
    component.addPropertyValue("businessKeys", businessKeys);
}

Running DM Calculation after DM Refresh

Q: I need to run DM Calculation after DM Refresh. In IM, how can I make sure DM Calculation is called when DM Refresh is done? Or do I need to call dmCalculation because it includes DM Refresh?

A: Use an event system. 

Code Block
"eventType":"PADATALOAD_COMPLETED"
"type":"DM_REFRESH"
"targetName":"DM.TransactionsDM_Sales"

You can add "status":"READY".
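
Put together, the event filter for this case could look like the following sketch (the direct:startDmCalculation endpoint is an assumption):

Code Block
languagexml
<filter>
    <simple>${body[data][0][type]} == 'DM_REFRESH' &amp;&amp; ${body[data][0][status]} == 'READY'</simple>
    <to uri="direct:startDmCalculation"/>
</filter>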

Workaround for large files and convertBodyTo

Q: What is the workaround for large files and <convertBodyTo type="java.lang.String" charset="UTF-8"/>? This is just CSV processing, but the file comes from FTP and there is a comment in the source: FTP files currently need this in order to do split + tokenize.

A: Just set a temp file on the FTP component and you do not need convertBodyTo. The trick is that the file gets downloaded locally and you can use split immediately. The convertBodyTo line was used to "download the file" to be able to do the split; it looks like there originally was a bug in Camel which was fixed later on. Now you can just use the local temp file, which will do the trick.
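
A minimal sketch using the FTP component's localWorkDirectory option (host, directory and property names are assumptions):

Code Block
languagexml
<from uri="sftp://{{ftp.user}}@{{ftp.host}}/inbound?password={{ftp.password}}&amp;localWorkDirectory=/tmp/im-work"/>
<split streaming="true">
    <tokenize token="\n"/>
    <to uri="direct:processLine"/>
</split>

The consumer downloads the remote file into the local work directory first, so the subsequent split can stream it from disk.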

What is <bean id="PricesDSExtractor" parent="pExtractor" p:constantKey="${Prices_DM_DSname}" />?

It is a PIE component, Parameter Extractor. It sets constantKey to the header for the next PIE component's processing; here it sets a DS name in the exchange and it is used in connection with the PIE component.

In the new PFX components (not released yet) you would just do <pfx:fetch?dsUniqueName=Prices&filter=dsFilter/>

How to trigger CFS after a PP change?

There is a new event for that.

OutOfMemory error

Q: Does anyone have experience with the error "Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded"? It happened with [net.pricefx.integration.command.pl.FetchPriceListItems] when the price list has more than 500K items.

A: You have to batch it. 

Code Block
<beans:bean id="batchedFetchRoute" class="net.pricefx.integration.route.BatchedFetchRoute">
	<beans:property name="routeName" value="direct://batchedFetchRoute"/>
	<!-- endpoint uri that this batch route will listen on -->
	<beans:property name="batchSize" value="5000"/>
	<beans:property name="fetch" ref="fetchAllProducts"/>
	<beans:property name="outputUri" value="direct://logOnlyRoute"/>
	<!-- the actual sub-route that will get called for each fetched batch -->
</beans:bean>

The fetch property references the fetch command used as your filter; outputUri is the output endpoint URI.

Code Block
<pfx:dsFetch id="fetchAllProducts" objectType="P" filter="allProductsFilter" />

Scanner aborted because of IOException

Q: Any idea what to do with this?

Code Block
[direct:loadCompeti] [split94           ] [split[tokenize{body() using token:
}]                                        ] [   1110008]

Stacktrace
---------------------------------------------------------------------------------------------------------------------------------------
org.apache.camel.RuntimeCamelException: Scanner aborted because of an IOException!
        at org.apache.camel.processor.Splitter$SplitterIterable$1.hasNext(Splitter.java:175) ~[camel-core-2.17.7.jar:2.17.7]

A: Check if you have:

  • Correct format of CSV and none of the columns is missing.

  • The same number of columns for each line.

  • Correct encoding.

Camel choice

Q: Why do I get the following?

Code Block
<route>
    <from uri="direct:customerList" />
    <when>
        <simple>${header.token} contains "ahoj"</simple>
        <log message="VALID TOKEN!" />
    </when>
    <otherwise>
        <log message="INVALID TOKEN!" />
    </otherwise>
...
</route>

LOG:

Code Block
12:18:46.756 | INFO  | qtp1409864883-19 | route5 | ID-ni3mm4nd-K73SV-1523009915657-0-1 | route5 | VALID TOKEN!
12:18:46.759 | INFO  | qtp1409864883-19 | route5 | ID-ni3mm4nd-K73SV-1523009915657-0-1 | route5 | INVALID TOKEN!

Why do I receive both? It behaves the same with "contains", "==" or "=~", no matter what I choose. What am I doing wrong?

A: The <choice> element is missing; without it, the <when> and <otherwise> branches are not treated as alternatives.

Code Block
<choice>
	<when>
		<simple>$simple{headers["CamelSqlRowCount"]} > 0</simple>
		<log message="Received matmas material from DB $simple{body}" loggingLevel="INFO" logName="business.global.material" />
		<split parallelProcessing="true">
			<simple>$simple{body}</simple>
			<to uri="bean://productDataMapper" />
			<to uri="disruptor://pxIntegrate?size=8192" />
		</split>
	</when>
	<otherwise>
		<log message="No matmas material from db received" loggingLevel="WARN" logName="business.global.material" />
	</otherwise>
</choice>

Warning when calling calculation logic in PFX without sending any data to it

Q: I need to call a calculation logic in PFX without sending any data to it. Any idea how to send an empty body? I tried <setBody> and an empty ArrayList but I still got a warning.

A: It can be handled by catching the exception and continuing as if nothing happened. Or you can send some dummy data which will hopefully be ignored.

A2: I think you need to set an empty Map as the body instead of an ArrayList.
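
For example, a minimal sketch of setting an empty Map as the body (assuming the camel-groovy language is available):

Code Block
languagexml
<setBody>
    <groovy>[:]</groovy>
</setBody>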

Note about loadMapper 

If you already have values from a CSV header in the "sku;attribute1;..." format, just use:

Code Block
<pfx:loadMapper id="productMapper" includeUnmappedProperties="true"/>

You do not need to define a mapper if your CSV is 1:1.

Access body in criterion

Q: How do I access body[0] in a criterion like this?

Code Block
<pfx:criterion fieldName="sku" operator="equals" value="simple:body[0][sku]"/>

And a property?

Code Block
<pfx:criterion fieldName="attribute3" operator="greaterThan" value="simple:property[JMSUpdated]"/>

...

body[0][sku] means that your body is an array: you take the first element and read its sku field.

If your body is just an object, it should be body.sku.
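
For instance, when the body is a single object:

Code Block
languagexml
<pfx:criterion fieldName="sku" operator="equals" value="simple:body.sku"/>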

To learn about simple expressions, see http://camel.apache.org/simple.html

Since Camel 2.15 it is correct to use:

Code Block
<pfx:criterion fieldName="attribute3" operator="greaterThan" value="simple:exchangeProperty[JMSUpdated]"/>

Concatenation of several input values in mapper to final out field

Q: Is it possible to concatenate several input values in a mapper into a final out field? Something like this:

Code Block
<bean parent="mapperEntry" p:in="DOCNUM + MAT13 + VKORG" p:out="IDOCKey" />

A: You need to write a transformer.

Code Block
<pfx:loadMapper>
	<pfx:groovy expression="body.DOCNUM + ':' + body.MAT13  + ':' + body.VKORG" out="IDOCKey"/>
</pfx:loadMapper>

Fetch QuoteLineItems

Q: How to fetch quote line item data?

A: Fetch the full quote using:

Code Block
priceFxClient.getQuoteApi().fetch(typedId);

Cleaning up data feed

Q: How to clean up a data feed?

A: Call Data Load Truncate after the FLUSH events. IM listens to the PADATALOAD_COMPLETED event and filters for DS_FLUSH | READY; then it calls DL TRUNCATE on the Data Feed.

PADATALOAD_COMPLETED event
Code Block
#listen to PADATALOAD_COMPLETED event

# route to execute logic when DS_FLUSH completed
<route id="eventPADataLoadCompleted">
	<from uri="direct:eventPADataLoadCompleted"/>
	<log message="PADATALOAD_COMPLETED event received with info - type: ${body[data][0][type]} - targetName: ${body[data][0][targetName]} - status: ${body[data][0][status]}"/>
    <filter>
		<simple>${body[data][0][type]} == 'DS_FLUSH' &amp;&amp; ${body[data][0][status]} == 'READY'</simple>
		<to uri="direct:dataLoadCompleted"/>
	</filter>
</route>

Exporting price lists – mapping by label instead of name

Q: How can we export price list data correctly when its field names are changeable?

A: Since IM 1.0.4.2, we can use net.pricefx.integration.processor.AddPLIMetadataBasedFieldsProcessor and the injectHeaderFromKeysToFirstLine attribute of the list-to-csv component to have headers from keys included and data mapped by its metadata/labels. For details see list-to-csv – Export Data to CSV (DEPRECATED).

Price List export
Code Block
languagexml
<bean id="addPLIMetadataBasedFieldsProcessor" class="net.pricefx.integration.processor.AddPLIMetadataBasedFieldsProcessor">
	<property name="priceListId" value="simple:property.priceListId"/>
</bean>
<pfx:list-to-csv id="priceBookCSVTransform" outputUri="direct:priceBookToFile"
                     injectHeaderFromKeysToFirstLine="true" mapper="priceBookMapper" dataFormat="priceBookCsvFormat"/>
<route id="priceList-Fetch">
	<from uri="direct:priceListApproved"/>
	<setProperty propertyName="priceListId">
		<groovy>Double.valueOf(body.data.id[0]).longValue() + ""</groovy>
	</setProperty>
	<setProperty propertyName="priceListLabel">
		<simple>${body[data][0][label]}</simple>
	</setProperty>
	<setProperty propertyName="targetDate">
		<simple>${body[data][0][targetDate]}</simple>
	</setProperty>
    <setProperty propertyName="expiryDate">
		<simple>${body[data][0][expiryDate]}</simple>
	</setProperty>
	<log message="'ITEM_APPROVED_PL' notification received for price list '${property[priceListLabel]}' (id: ${property[priceListId]})" loggingLevel="INFO"/>
            
    <to uri="fetchPriceListItems"/>     
    <log message="Fetched ${body.size()} rows from PL '${property[priceListLabel]}' (id: ${property[priceListId]})" loggingLevel="INFO"/>
    <process ref="addPLIMetadataBasedFieldsProcessor"/>
    <to uri="direct://priceBookCSVTransform" />
</route>

Setting TAB character as CSV delimiter 

Q: How can I set the TAB character as the delimiter property of CsvDataFormat?

A: Use the XML character reference "&#x9;".

Code Block
<property name="delimiter" value="&#x9;"/>
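
For example, a complete data format bean using the TAB delimiter (the bean ID and the useMaps setting are assumptions, following the earlier productFormat example):

Code Block
languagexml
<bean id="tabSeparatedFormat" class="org.apache.camel.model.dataformat.CsvDataFormat">
    <property name="useMaps" value="true"/>
    <property name="delimiter" value="&#x9;"/>
</bean>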

Saving or updating a quote

Q: How can I update a quote from IM?

A: Use massEdit or the quoteApi from pfxClient.

Code Block
<pfx:filter id="fetchQuoteFilter">
    <pfx:and>
        <pfx:criterion fieldName="uniqueName" operator="equals" value="simple:header[sourceId]"/>
    </pfx:and>
</pfx:filter>
<pfx:dsMassEdit id="updateContractId" objectType="Q" filter="fetchQuoteFilter">
	<pfx:field name="additionalInfo3" value="simple:header.contractLN"/>
	<pfx:field name="additionalInfo4" value="simple:header.contractCA"/>
</pfx:dsMassEdit>

or 

Code Block
languagejava
SaveQuoteRequest saveQuoteRequest = new SaveQuoteRequest();
PriceQuoteRequest priceQuoteRequest = new PriceQuoteRequest();
PriceQuoteRequestData priceQuoteRequestData = new PriceQuoteRequestData();
try {
	Response price = pricefxClient.getQuoteApi().price(priceQuoteRequest.data(priceQuoteRequestData.quote(quote)));
	...
} catch (ApiException e) {
	...
}

Retrieving email list from user group 

Q: Is there an "official" way to retrieve an email list from a PFX user group?

A: Use the General API and its fetch method with object type UG.

fetch User Group by Id
Code Block
languagejava
priceFxClient.getGeneralApi().fetchByObjectTypeAndObjectId("UG", objectId, requestBody)

Exporting to CSV file with duplicating header column name

Q: As you can notice, there is a duplicated column name in the header. How can I achieve something like this?

A: Write the header beforehand and then export only the values.

Export header
Code Block
languagexml
<setBody>
    <simple>sku;OutletID;CreateTime;UserName;ChangePriceType;DeletePreMaintainedPriceFlag;SalesPrice;Currency;CreateSystem;ScheduledTime;Status;ExecTime;ExecMessage;Subsidiary;Brand;CalculationUUID;PriceTypeCode;PriceTypeText;FinalPriceReason;PriceProposal;LastPrice;S-Flags;P-Flags\n</simple>
</setBody>
<setHeader headerName="CamelFileName">
    <simple>${property[CamelFileName]}</simple>
</setHeader>
<to uri="file://{{central-sales-price-export-work-directory}}${fileOutParameters}&amp;fileExist=Append" />

Adding CFS calculation while processing

Q: Calling calculate when a CFS is pending is ignored, but how about when it is processing? Can we successfully add a calculation job while it is in the processing state?

A: While processing, the request will be put into the queue.

Creating price list from IM

Q: Is it possible from IM to create a price list and upload its lines via a CSV header and data in a file?

A: Create a JSON with the price list configuration. When you create a price list in PFX, it is the same process: you prepare a configuration, click Create PL and it is done.

Fetching data with batchedMode=true

Q: I've played with the new pfx-api:fetch component with batchedMode=true. What should the result of the fetch be? I am getting a List of BatchingInterval objects but with no data.

A: 

Code Block
<route>
    <from uri="timer://fetchDataFromDatamartByQuery?repeatCount=1"/>
    <to uri="pfx-api:fetch?objectType=DM&amp;dsUniqueName=1669.DMDS&amp;batchedMode=true&amp;batchSize=5000"/>
    <split>
        <simple>${body}</simple>
        <to uri="pfx-api:next"/>
        <log message="${body}"/>
    </split>
</route>

Web crawling - Metoda

Q: Has somebody heard about Methoda (a web crawling partner)? Is it used somewhere?

A: The correct name is Metoda and its support is part of Pricefx.

Loading data to Data Source directly

Q: I'm checking the new PFX APIs in IM 1.1.7 and I found that we can use pfx-api:loaddata to load data directly into a Data Source (skipping the steps of loading data into a Data Feed and calling dmFlush). Do you think it is possible?

A: Yes. Delete the DF; the data is then loaded directly into the DS.

No events generated

Q: I have a listener for ITEM_UPDATE_PR and I wonder why no event is created when I do a mass update on Price Records. What is the right trigger, or what are the rules?

A: Mass edit/delete does not generate events. A regular update does generate an event. Alternatively, you can call integrate.

Starting LPG calculation 

Q: Is there any "official" API to start an LPG calculation from IM?

A: 

Code Block
<bean id="calculateFinalPriceListLPG" class="net.pricefx.integration.command.pg.Calculate">
        <property name="priceGridId" value="simple:property[loadedPriceGridId]"/>
</bean>

Preventing from being overridden by null value

Q: I have a requirement from the customer that null values must not override existing values. Is there any solution for that?

A: No. You need to fetch the data from Pricefx and compare it. Or, if it concerns only one attribute, you can make a conditional integration request.

Working with IDOC format

Q: Do we have any project that works with the SAP iDoc format (inbound data)?

A: iDocs are just XML files. You can search Bitbucket to see how we dealt with them.

Working with JSON – jsonpath component

Q: Camel jsonpath is a good component for working with JSON. For example, I have this JSON: 

Code Block
{
  "data": [
    {
      "version": 1,
      "typedId": "22818674.PGI",
      "sku": "121832",
      "label": "Strybal Scoop Tee SS Women",
      "resultPrice": 100.00000,
      "allowedOverrides": "",
      "calculatedResultPrice": 100.00000,
      "tainted": false,
      "priceGridId": 2595,
      "approvalState": "APPROVED",
      "activePrice": 100.00000,
      "manualEditVersion": 0,
      "manualPriceExpired": false,
      "submittedByName": "admin",
      "approvedByName": "admin",
      "createDate": "2018-08-03T13:47:30",
      "createdBy": 3139,
      "lastUpdateDate": "2018-08-03T16:01:11",
      "approvalDate": "2018-08-03T16:35:30",
      "activePriceDate": "2018-08-03T16:35:30",
      "completeResultsAvailable": true,
      "itemExtensions": {},
      "allCalculationResults": [
        {
          "result": 100,
          "resultName": "price"
        },
        {
          "result": 111.00,
          "resultName": "price_authorized"
        },
        {
          "result": 120.0,
          "resultName": "price_nonauthorized"
        }
      ]
    }
  ],
  "metricName": "PGI_Approved",
  "eventType": "ITEM_APPROVED_PGI"
}

I would like to get the result 111 for allCalculationResults where resultName = price_authorized.

A:

Code Block
<setHeader headerName="authorizedPrice">
    <jsonpath>$.data[0].allCalculationResults[?(@.resultName == 'price_authorized')].result</jsonpath>
</setHeader>

Persistent message store in Camel

Q: Does anyone have experience with a persistent message store in Camel? I tried to use krati but I had some issues with it.

A: We use MySQL and Oracle to store data on Bosch and MediaSaturn.

Example with leveldb: https://svn.apache.org/repos/asf/camel/trunk/components/camel-leveldb/src/test/resources/org/apache/camel/component/leveldb/LevelDBSpringAggregateTest.xml (for aggregation only) 

Deleting an approved contract

Q: Is it possible to delete an approved contract within PFX?

A: It is not possible. But Support can do that.

Using new components in IM 1.1.7

Q: I'm trying to use the new components (in IM 1.1.7 and higher) but I am facing the error below:

Code Block
Failed to resolve endpoint: pfx-csv://unmarshal?delimiter=%7C due to: No component found with scheme: pfx-csv

A: Please add this dependency to your POM:

Code Block
<dependency>
  <groupId>net.pricefx.integration</groupId>
  <artifactId>camel-pfx</artifactId>
</dependency>

Merging CSV columns into one PFX attribute

Q: How to merge two columns of CSV data into one column of a PFX table? (For instance, merging a "date" column and a "time" column into one "datetime" column.)

A: You can do it in the mapper:

Code Block
<pfx:groovy expression="body.Date + ' ' + body.Time" out="attribute1"/>

Calling LPG recalculate in IM

Q: How to call LPG recalculate?

A: Please check the MS project, custom processor TriggerCompetitionRecalcProcessor.

SAP iDOC support in IM

Q: Does IM support SAP's iDOC flat text format?

A: No, IM currently supports iDOC only in the XML format.

Using dsLoad with key predefined in PFX

Q: When using dsLoad, how do I force IM to use the keys defined in PFX instead of sending businessKeys="sku,name"?

A: You can use the detectJoinFields attribute (supported since 1.1.9).
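
For example, a hedged sketch based on the dsLoad usage shown earlier (the ID and mapper name are hypothetical):

Code Block
languagexml
<pfx:dsLoad id="loadProductCost" objectType="PX" detectJoinFields="true" mapper="productCostMapper"/>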

Using multiple PFX connections in single IM project

Q: Is it possible to use one inbound folder but load the data into two partitions?

A: Yes, it is possible to have more than one PFX connection in an IM project. The connections are identified by ID.

Connections are set up in camel-context.xml.

PFX Connections in camel-context.xml
Code Block
languagexml
    <pfx:connection id="coesia" uri="${pfx-coesia.url}" partition="${pfx-coesia.partition}" username="${pfx-coesia.username}" password="${pfx-coesia.password}" debug="${pfx-coesia.debug:false}"/>
    <pfx:connection id="gdm" uri="${pfx-coesia-gdm.url}" partition="${pfx-coesia-gdm.partition}" username="${pfx-coesia-gdm.username}" password="${pfx-coesia-gdm.password}" debug="${pfx-coesia-gdm.debug:false}"/>
    <pfx:connection id="raj" uri="${pfx-coesia-raj.url}" partition="${pfx-coesia-raj.partition}" username="${pfx-coesia-raj.username}" password="${pfx-coesia-raj.password}" debug="${pfx-coesia-raj.debug:false}"/>
    <pfx:connection id="norden" uri="${pfx-coesia-norden.url}" partition="${pfx-coesia-norden.partition}" username="${pfx-coesia-norden.username}" password="${pfx-coesia-norden.password}" debug="${pfx-coesia-norden.debug:false}"/>

To use a particular connection, you have to set the partitionPfxApi header before calling PFX.

Route
Code Block
languagexml
    <route id="productMasterData">
      <from uri="{{coesia-products-fromUri}}"/>
      
      <setHeader headerName="partitionPfxApi">
        <constant>coesia</constant>
      </setHeader>
      
      <to uri="direct:ProductProcess" />  
    </route>

Refer to these projects for more details: https://bitbucket.org/pricefx/toys-r-us-integration/src/master/src/main/resources/camel-context.xml

or https://bitbucket.org/pricefx/coesia-integration/src/master/.

Working with ZIP files larger than 4 GB

Q: I am using Camel's ZipFileDataFormat to extract .ZIP files from a customer. However, it shows the error below when parsing files bigger than 4 GB.

Code Block
org.apache.camel.RuntimeCamelException: java.util.zip.ZipException: invalid entry size (expected 5629585467198288 but got 5587552134 bytes)
  at org.apache.camel.dataformat.zipfile.ZipIterator.getNextElement(ZipIterator.java:116)
  at org.apache.camel.dataformat.zipfile.ZipIterator.next(ZipIterator.java:85)
  at org.apache.camel.dataformat.zipfile.ZipIterator.next(ZipIterator.java:39)
  at org.apache.camel.processor.Splitter$SplitterIterable$1.next(Splitter.java:188)
  at org.apache.camel.processor.Splitter$SplitterIterable$1.next(Splitter.java:164)
  at org.apache.camel.processor.MulticastProcessor.doProcessSequential(MulticastProcessor.java:616)
  at org.apache.camel.processor.MulticastProcessor.process(MulticastProcessor.java:248)
  at org.apache.camel.processor.Splitter.process(Splitter.java:114) ...

A: The ZIP file is most likely corrupted. Possibly the customer used an unsupported ZIP64 application to compress the data file. In that case, the output file can still be extracted by some applications, but it fails in "standard" applications such as WinZip or the Java JDK.

Samples of reading and writing data in Product Extension table

Q: Is there an example of reading from and writing to a Product Extension table?

A: See these pages:
https://pricefx.atlassian.net/wiki/spaces/INTG/pages/537952266/Fetch+Data+from+Price+f+x
https://pricefx.atlassian.net/wiki/spaces/INTG/pages/537854028/Parse+CSV+and+Load+Data+to+General+Data+Source

A library of e-books

Q: Do/will we have a shared library of e-books that we need in our daily work?

A: I'm not aware of any shared e-books. We only have a paper book: Camel in Action. One copy is in Prague and the second one is in Ostrava.

Defining Table Name and Keys for PX or CX Master Data

Defining the name of a table for Product or Customer Extensions (CX, PX) is not all that intuitive.

Set the name of the table in a loadMapper first.

loadMapper
Code Block
languagexml
    <!-- Product Extension Mapper -->
    <pfx:loadMapper convertEmptyStringToNull="true" id="ProductExtensionMasterDataMapper">
        <!-- set Product Extension table name -->
        <pfx:simple expression="ProductExtension1" out="name"/>
        <!-- attribute mapping -->
        <pfx:body in="DIM_ARTICLE_KEY"       out="sku"/>
        <pfx:body in="PALLET_SPEC_ID"        out="attribute1"/>
        <pfx:body in="ROLLS_PER_PACK"    ...

Do not forget to set detectJoinFields in the URI.

Key definition
Code Block
languagexml
<to uri="pfx-api:loaddata?mapper=ProductExtensionMasterDataMapper&objectType=PX&detectJoinFields=true"/>

Saving a flag into PFX Advanced configuration

Q: How to store a flag or a value to PFX Advanced configuration? I would like to store for example last updated time for a data feed.

A: There will be an API in XML. For now you have to create a configurator bean and call its method. See the dana project.

Creating a REST endpoint to call your route

Q: I need to call a route that consumes a web service with special parameters, e.g. for a retry or an initial data load.

A: Create a REST endpoint running on the localhost and implement a proxy route that consumes the endpoint and calls the target system. The implementation is used in the Cox project.

The sample code takes a JSON payload, sets headers and calls the target service.

Define the REST endpoint in the camel-context.xml file:

REST endpoint
Code Block
languagexml
        <dataFormats>
            <json id="gson" useList="true" library="Gson"/>
            <json id="json" useList="true" library="Jackson"/>
        </dataFormats>

        <!-- REST service for initial load of Price2Spy data. Works as a proxy for the Price2Spy. -->
        <!-- example:
            curl -X POST -H "Content-Type:application/json" http://localhost:42080/Price2Spy/price -d '{"dateChangeFrom": "2018-01-01 00:00:00","dateChangeTo":"2018-12-01 00:00:00" }'
        -->
        <restConfiguration bindingMode="json" component="jetty" port="42080" host="localhost"/>
        <rest consumes="application/json" produces="text/plain">
            <post uri="/Price2Spy/price">
                <route id="adhocPrice2SpyREST" >
                    <to uri="direct:adhocPrice2SpyCompetitorData"/>
                </route>
            </post>
        </rest>

Implement the proxy route:

The proxy route
Code Block
languagexml
        <!-- for loading data through REST bound to localhost:42080 -->
        <route id="adhocPrice2SpyCompetitorData">
            <from uri="direct:adhocPrice2SpyCompetitorData"/>
            <!--<log message="Got ${body}"/>-->
            <setHeader headerName="Price2SpyDateChangeTo">
                <jsonpath>$.dateChangeTo</jsonpath>
            </setHeader>
            <setHeader headerName="Price2SpyDateChangeFrom">
                <jsonpath>$.dateChangeFrom</jsonpath>
            </setHeader>
            <to uri="seda:updatePrice2SpyCompetitorData"/>
        </route>

The target route consumes the seda endpoint. Do not forget to override HTTP headers:

The target route
Code Block
languagexml
        <route id="updatePrice2SpyCompetitorData">
            <!--<from uri="quartz2://price2spyDataUpdateTimer?cron={{price2spy-pricing-data-cron}}&trigger.timeZone=America/Chicago&stateful=true" />-->
            <from uri="direct:updatePrice2SpyCompetitorData"/>
            <from uri="seda:updatePrice2SpyCompetitorData"/>
            <!-- setup headers for the REST call -->
            <setHeader headerName="CamelHttpMethod">
                <constant>POST</constant>
            </setHeader>
            <setHeader headerName="Content-Type">
                <constant>application/json</constant>
            </setHeader>
            <setHeader headerName="Accept">
                <constant>application/json</constant>
            </setHeader>
            <setHeader headerName="Authorization">
                <simple>{{price2spy-pricing-data-authToken}}</simple>
            </setHeader>
            <setHeader headerName="CamelHttpUri">
                <simple>https4://your-service-provider.....</simple>
            </setHeader>
            <setBody>
                <simple>{
                    "dateChangeFrom": "${header.Price2SpyDateChangeFrom}",
                    "dateChangeTo": "${header.Price2SpyDateChangeTo}"
                    }</simple>
            </setBody>
            <log message="Fetching data from ${header.Price2SpyDateChangeFrom} to ${header.Price2SpyDateChangeTo}." />
            <!--<log message="Sending body:\n${body}"/>-->
            <!--<process ref="debugProcessor"/>-->
            <to uri="https4://your-service-provider....."/>
            <!-- convert json to map -->
            <unmarshal ref="gson"/>

Changing the heap size for the IntegrationManager in a dedicated environment

You may find that a dedicated environment is very slow, with a very slow response from the PFX UI too. Check the operational memory allocation. What you usually get is 32 GB of RAM, and the PFX server alone has a heap size of 24 GB. Once you deploy more than one IntegrationManager on the server, you are in potential trouble: IM has a default heap allocation of 4 GB, and once both IMs reach their maximum heap size, the operating system starts swapping and everything slows down. The heap size of IM can be adjusted; create a Helpdesk ticket and specify the size. The size is controlled by the -Xmx parameter.

Checking the heap size of the IMs (both have 2 GB here):

Heap Size
Code Block
languagebash
root@node1.irm-qa.pricefx.net ~ # ps ax|grep im-
 2700 ?        Sl    13:12 /usr/bin/java -Dsun.misc.URLClassPath.disableJarChecking=true -Xmx2G -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/pricefx/runtime/im-iron-mountain-qa -Xloggc:gc.log -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=10 -XX:GCLogFileSize=5M -Dfile.encoding=UTF-8 -Dspring.profiles.active=iron-mountain_qa -jar /var/pricefx/runtime/im-iron-mountain-qa/im-iron-mountain-qa.jar
 3282 ?        Sl     0:58 /usr/bin/java -Dsun.misc.URLClassPath.disableJarChecking=true -Xmx2G -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/pricefx/runtime/im-iron-mountain-dev -Xloggc:gc.log -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=10 -XX:GCLogFileSize=5M -Dfile.encoding=UTF-8 -Dspring.profiles.active=iron-mountain_dev -jar /var/pricefx/runtime/im-iron-mountain-dev/im-iron-mountain-dev.jar

Checking the amount of the RAM:

RAM
Code Block
languagebash
root@node1.irm-qa.pricefx.net ~ # free -h
              total        used        free      shared  buff/cache   available
Mem:            31G         30G        186M         10M        201M         93M
Swap:           15G        1.9G         14G

GUnzip processor for bigger files

Camel's GZip unmarshal works in memory only, so for larger files you end up with an out-of-memory exception. The following processor uses streams instead and creates a .done file when unzipping is finished.

GUnzipProcessor.java
Code Block
languagejava
package net.pricefx.integration.processor.your_project;

import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.*;
import java.util.zip.GZIPInputStream;

/**
 * Processor for unzipping gzipped files using streams.
 *
 * @author rbiciste
 */
public final class GUnZipProcessor implements Processor {

    private static final Logger LOGGER = LoggerFactory.getLogger(GUnZipProcessor.class);
    /**
     * Unzips a file using gzip as a streams. Implemented to avoid out of memory exception.
     * Camel gzip unmarshal is memory only.
     * After decompression it creates .done file to signal CSV routes.
     */
    @Override
    public void process(Exchange exchange) throws Exception {

        //Decompress the file to a temp folder
        File file = new File(exchange.getIn().getHeader("CamelFilePath", String.class));

        GZIPInputStream in = null;
        OutputStream out = null;
        File target = null;
        File doneFile = null;

        try {
            //Open the compressed file
            in = new GZIPInputStream(new FileInputStream(file));

            String targetFileName = file.getName().substring(0, file.getName().lastIndexOf('.'));
            LOGGER.debug("Unzipping to " + targetFileName);

            //Open the output file
            target = new File(file.getParent(), targetFileName);
            target.createNewFile();
            out = new FileOutputStream(target);

            //Transfer bytes from the compressed file to the output file
            //Buffer of 1 mb.
            byte[] buf = new byte[1048576];
            int len;
            while ((len = in.read(buf)) > 0) {
                out.write(buf, 0, len);
            }

            // close streams
            in.close();
            out.close();

            LOGGER.debug("Unzipping is done. Streams closed.");

            // create .done file for CSV routes
            doneFile = new File(file.getParent(), targetFileName + ".done");
            doneFile.createNewFile();

            LOGGER.debug(".done file created.");

        } catch (FileNotFoundException e) {
            LOGGER.error("File " + file.getName() + " not found");
            throw e;
        } catch (IOException e) {
            LOGGER.error("Gunzipping the file " + file.getName() + " failed." + e.getMessage());
            throw e;
        } finally {

            // close in
            if(in != null) {
                in.close();
            }

            // close out
            if(out != null) {
                out.close();
            }
        }
    }
}

How to use constants and reusable code snippets

I had a map value that was static but used it in a number of places.

I created a class with a static method returning static members.

Util Class

Code Block
languagejava
package net.pricefx.integration.util.ironmountain;


import java.util.HashMap;
import java.util.Map;

public class DataLoadOrchestrationUtils {

    /** returns Country Code from a file name like irm_bi_ca_rm_contract_details_20190217.csv
     *
      * @param fileName
     * @return
     */
    public static String getCountryCode(String fileName) {
        String[] parts = fileName.split("_");
        // country from 'irm_bi_ca_rm_contract_details_20190217.csv'
        return parts[2].toUpperCase();
    }

    /** returns Data Set from a file name like irm_bi_ca_rm_contract_details_20190217.csv
     *
     * @param fileName
     * @return
     */
    public static String getDataSet(String fileName) {
        String[] parts = fileName.split("_");
        return parts[parts.length-1].replace(".csv","");
    }

    /** returns Map of entities needed to run Calculations
     *
     * @return
     */
    static Map<String, String> entityMap;
    static {
        entityMap = new HashMap<>();
        entityMap.put("InvoiceRevenue", "attribute2");
        entityMap.put("ContractDetails","attribute3");
        entityMap.put("RateTable",      "attribute4");
        entityMap.put("BillCode",       "attribute5");
    }

    public static Map getEntityMap() {
        return entityMap;
    }
}

...

Then I used it in Groovy code wherever I needed it.

Groovy snippets

Code Block
languagexml
...
            <setHeader headerName="country">
                <groovy>
                    return net.pricefx.integration.util.ironmountain.DataLoadOrchestrationUtils.getCountryCode(headers.CamelFileNameOnly)
                </groovy>
            </setHeader>
            <log message="Setting country to ${header[country]}" loggingLevel="INFO"/>
            <!-- Data Set Date -->
            <setHeader headerName="dataSet">
                <groovy>
                    return net.pricefx.integration.util.ironmountain.DataLoadOrchestrationUtils.getDataSet(headers.CamelFileNameOnly)
                </groovy>
            </setHeader>
...
                    <transform>
                        <groovy>
                            // map entities to attributes in PPV
                            def entities = net.pricefx.integration.util.ironmountain.DataLoadOrchestrationUtils.getEntityMap();

                            // if list is empty throw the exception
                            if (body.size()==0) {
                                throw new RuntimeException("Received the flush event without data in Running state.")
                            } else { // update existing row
                                body[0].put(entities.get(headers['entity']),'Done');
                            }

                            return body;
                        </groovy>
                    </transform>

Making a route a singleton

I needed to update the status of a row in a PPV table from multiple routes: first fetch the row, then make changes. Unfortunately, this leads to a concurrency conflict; data get overwritten when more than one instance of the update route runs. You can make the <from> of a "direct:" endpoint blocking.

http://camel.apache.org/direct.html

Route
Code Block
languagexml
        <!-- sets the entity status into the PPV row -->
        <route id="dataSetFileStatus">
            <!-- blocks for only one consumer, multiple calls raise concurrent update issues. -->
            <from uri="direct:dataSetFileStatus?block=true"/>

Formatting date when exporting a price list

When fetching a price list, the data comes back as String. Normally you would use a converter inside a mapper, but that does not work and actually throws an exception. You can resolve it in Groovy:

Date Formatting
Code Block
languagegroovy
<pfx:groovy expression="if (body.ServiceReviewDate) { Date.parse('yyyy-MM-dd', body.ServiceReviewDate).format('MM/dd/yyyy')}" out="ServiceReviewDate" />

Getting output of a formula logic

Sometimes it is easier to implement a service inside Pricefx using a formula. The formula creates a map that is returned in a JSON form when the formula is called.

The logic element that emits the response has to have the Display Mode set to 'Everywhere'.
Request URI: https://irm-qa.pricefx.eu/pricefx/ironmtn-dev/formulamanager.executeformula/RadovanJsonTest
Parameters request payload:

Code Block
{
    "data": {
        "estimatedVolume": 456,
        "industry": "IMIndustry",
        "numberOfMarkets": 8
    }
}
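The formula can then be invoked with a plain HTTP POST, for example with curl. The -u credentials below are a placeholder; authentication depends on your setup:

Code Block
languagebash
curl -X POST -H "Content-Type:application/json" -u user:password \
  https://irm-qa.pricefx.eu/pricefx/ironmtn-dev/formulamanager.executeformula/RadovanJsonTest \
  -d '{"data": {"estimatedVolume": 456, "industry": "IMIndustry", "numberOfMarkets": 8}}'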

Response body:

Code Block
{
    "response": {
        "node": "node1",
        "csrfToken": "1q06b29ijhxutvw7zje1j6i2l",
        "data": [
            {
                "resultName": "response",
                "resultLabel": "response",
                "result": {
                    "data": {
                        "params": [
                            {
                                "estimatedVolume": 456,
                                "industry": "IMIndustry",
                                "numberOfMarkets": 8
                            }
                        ]
                    }
                },
                "warnings": null,
                "alertMessage": null,
                "alertType": null,
                "displayOptions": 16,
                "formatType": null,
                "suffix": null,
                "resultType": "SIMPLE",
                "cssProperties": null,
                "userGroup": null,
                "resultGroup": null,
                "overrideValueOptions": null,
                "overrideAllowEmpty": true,
                "labelTranslations": null,
                "overridable": false,
                "overridden": false,
                "resultDescription": null
            }
        ],
        "status": 0
    }
}

Logic parameter access:

Code Block
languagegroovy
def input = api.local.input

api.local.estimatedVolume = api.stringUserEntry("estimatedVolume")
api.local.industry = api.stringUserEntry("industry")
api.local.numberOfMarkets = api.stringUserEntry("numberOfMarkets")

api.logInfo("---- estimatedVolume = " + api.stringUserEntry("estimatedVolume"))
api.logInfo("**** estimatedVolume = " + api.local.estimatedVolume)

Logic emitting the response:

Code Block
languagegroovy
api.local.response = [:]
def response = [:]
response.params = [] as List

api.local.estimatedVolume = api.stringUserEntry("estimatedVolume")
api.local.industry = api.stringUserEntry("industry")
api.local.numberOfMarkets = api.stringUserEntry("numberOfMarkets")

def params = [:]
params["estimatedVolume"] = api.local.estimatedVolume
params["industry"] = api.local.industry
params["numberOfMarkets"] = api.local.numberOfMarkets

response.params.add(params)
api.local.response["data"] = response

api.logInfo(api.jsonEncode(api.local.response))

return api.local.response

Custom file processing strategy driven by a signal file

A customer uploads files to a cloud-based SFTP. Once the upload is finished, it creates a done_yyyymmdd.ctl file; only after that should we start downloading the files. As simple as it sounds, it is not easily implemented in Camel.

I implemented a custom file processing strategy based on the class GenericFileRenameProcessStrategy<T> from the Camel sources. You define a bean and then reference it in the file URI.

Code Block
languagexml
<bean id="sftpDoneCtlFileProcessingStrategy" class="net.pricefx.integration.component.file.strategy.ironmountain.SftpDoneCtlFileProcessingStrategy"/>

...
<from uri="sftp://{{sftp-inbound-fromUri}}&processStrategy=#sftpDoneCtlFileProcessingStrategy"/>

Source code:

https://bitbucket.org/pricefx/iron-mountain-integration/src/master/src/main/java/net/pricefx/integration/component/file/strategy/ironmountain/SftpDoneCtlFileProcessingStrategy.java

Server health check warning

How to disable this warning?

Code Block
02:14:30.969 | INFO  | http-nio-8089-exec-5 |  |  | o.s.c.c.c.ConfigServicePropertySourceLocator | Fetching config from server at: http://localhost:8888
02:14:30.971 | WARN  | http-nio-8089-exec-5 |  |  | o.s.c.c.c.ConfigServicePropertySourceLocator | Could not locate PropertySource: I/O error on GET request for "http://localhost:8888/im-avery-dennison-anz-prod/averydanz_prod": Connection refused; nested exception is java.net.ConnectException: Connection refused

Set the following in the application.properties file (for core 1.1.11 or earlier): 

Code Block
health.config.enabled=false

This will be disabled by default in IM 1.1.12.

How to fetch mapping of attributes to labels for Customer, Products and Extensions

Got tired of typing attribute numbers and their labels from PriceBuilder tables (Customers, Products, Customer Extensions, Product Extensions)?

You can easily fetch them over the API.

Sample endpoint for the CX mapping:

https://delphi-qa.pricefx.eu/pricefx/delphi-dev/fetch/CXAM

Filter definition for a particular extension:

Filter
Code Block
languagejs
{
    "data": {
        "_constructor": "AdvancedCriteria",
        "criteria": [
            {
                "operator": "equals",
                "fieldName": "name",
                "value": "CustomerAddtionalAttr"
            }
        ]
    }
}

Type Codes can be found here.

Sample response:

Response
Code Block
languagejs
{
    "response": {
        "status": 0,
        "startRow": 0,
        "node": "node1",
        "csrfToken": "1t8kgyrh14ou7l8y9p3kqpa2q",
        "data": [
            {
                "version": 0,
                "typedId": "200.CXAM",
                "fieldName": "attribute1",
                "label": "INCOTERMS",
                "fieldType": 2,
                "requiredField": false,
                "readOnly": false,
                "name": "CustomerAddAttr",
                "createDate": "2019-08-19T22:05:05",
                "createdBy": 10,
                "lastUpdateDate": "2019-08-19T22:05:05",
                "lastUpdateBy": 10
            },

This sample shows how to get just the relevant lines from the response using awk. It can be done with jq https://stedolan.github.io/jq/ too.

Code Block
awk '/fieldName/||/label/ { print $0 }' c.json | sed -n 'h;n;p;g;p'| sed 's/"label":/<pfx:body in =/;s/"fieldName":/ out =/'| sed '$!N;s/\n/ /'| sed 's/,//g'|sed 's/$/\/>/'

There is a set of Groovy scripts to support this functionality.

https://bitbucket.org/pricefx/im-interface-generator/src/master/

How to set old IM <1.1.17 to work with openJDK 11 and newer

It may happen that the new IM 1.1.17 is not released yet but some IMs already have to run on new servers with openJDK 11. In that case you need to make older IM versions start on openJDK 11 or newer.

  1. Add two dependencies into pom.xml and change the version of the spring-boot-maven-plugin.
    Example commit is here: https://bitbucket.org/pricefx/flint-nw-integration/commits/3b51ab77d1ea2878e7fd749bd34e78c55b9b26ee

  2. Create a support ticket specifying that you want to change your IM to work with openjdk11 and newer.

  3. The change to be done is from this:

    Code Block
    if [ ${3} == "flintnw_prod" ]; then
           TARGET=int4.eu.pricefx.net
           deploy_im_standard ${1} ${2} ${3} ${4}
           exit 0
    fi

    to this:

    Code Block
    if [ ${3} == "flintnw_prod" ]; then
           TARGET=int4.eu.pricefx.net
           deploy_im_standard_openjdk11andnewer ${1} ${2} ${3} ${4}
           exit 0
    fi

How to disable IM auto-registration into PlatformManager

Since IntegrationManager 1.1.17, IM instances are auto-registered in PlatformManager: IM sends the IntegrationManagerInstanceStartup event into Kafka every time IM starts up. This is enabled by default and could be disabled with the following configuration property:

Code Block
# disable kafka (it is recommended for DEV environment)
integration.event-driven.enabled=false
# disable autoregistration
integration.event-driven.auto-registration.enabled=false

How to process CUSTOM events in IM

Events in general are processed by eventType as in integration.events.event-to-route-mapping.ITEM_APPROVED_Q=direct:quoteApproved.

However, CUSTOM events have a different "name" (eventType) in EventAdmin than the eventType in the data of the event. IM has two routes to process events: the first one downloads the event by its "name" and the second one processes it (sends it to the route) using the eventType inside the event.

As these two eventTypes differ for CUSTOM events, you have to use this workaround to process them (the first line downloads all CUSTOM events, the second processes your custom event):

Code Block
integration.events.event-to-route-mapping.CUSTOM=hackForCUSTOMevents
integration.events.event-to-route-mapping.CUSTOM_OutBound_PP_ADD=direct:exportPriceList

Using Header/Property in groovy exchange loadMapper

Code Block
<pfx:groovy expression="request.headers.nationalCode + body.sku" out="prod_id"/>
<pfx:groovy expression="exchange.properties.nationalCode + body.sku" out="prod_id"/>

Understanding Imports and Includes in WSDL Files

  • include – Only includes types from an XSD into the current namespace.

  • import – Imports a namespace. Note that there is a difference between xsd:import and wsdl:import.

It is well explained at https://www.ibm.com/developerworks/webservices/library/ws-tip-imports/ws-tip-imports-pdf.pdf.
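A minimal illustration of the difference inside an XSD (the schema names and namespace are made up for this example):

Code Block
languagexml
<!-- include: common-types.xsd must have the same targetNamespace (or none) -->
<xsd:include schemaLocation="common-types.xsd"/>

<!-- import: pulls in types from a different namespace -->
<xsd:import namespace="http://example.com/customer" schemaLocation="customer.xsd"/>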

Truncating Only Flushed Rows from Data Feed

Only flushed rows from a data feed can be truncated by adding dtoFilter to the truncate command. You have to filter the rows where the column formulaResult equals "OK".

Filter and truncate
Code Block
languagexml
<pfx:filter id="truncateFlushedFilter" resultFields="name">
    <pfx:and>
        <pfx:criterion fieldName="formulaResult" operator="equals" value="OK"/>
    </pfx:and>
</pfx:filter>
....
<to uri="pfx-api:truncate?targetName=DMF.SalesTransactions&dtoFilter=truncateFlushedFilter"/>

PX Columns Character Size Limits

For each column, there is a limit of 255 characters. For a product extension with 50 attributes, the limit is 70 characters per column. If you change the size of a product extension from a lower number of attributes to 50 and some of the attribute values have more than 70 characters, a warning is displayed and the strings are truncated.

For details see /wiki/spaces/UDEV/pages/1680834622

Event Types

For details see EventType Class

pfx-api:unmarshal does not propagate an exception outside of a split

When there is an error in the CSV format of the data, such as a quote inside quotes, the unmarshal component throws an exception but does not stop processing the file. It merely skips the current batch and goes on. You have to add stopOnException to the split definition.

Split
Code Block
languagexml
            <!-- batching, add total number of rows in the header  -->
            <split streaming="true"  strategyRef="recordsCountAggregation" stopOnException="true">
                <tokenize token="\n" group="5000"/>

Avoid Concurrent Updates on a Row

When I saved the status of processing in a PP table, I ran into an issue because the row-updating route was called from multiple places. During high load, it randomly caused the following exception:

net.pricefx.integration.api.NonRecoverableException: Error: There is probably a long running task which already updated or deleted data you are trying to manipulate. Please refresh your view. (Concurrent Data Modification)
at net.pricefx.integration.api.PriceFxExceptionTranslator.doRecoveryActions(PriceFxExceptionTranslator.java:119)
at net.pricefx.integration.api.client.LookuptableApi.integrate(LookuptableApi.java:501)

at net.pricefx.integration.command.ppv.Integrate.execute(Integrate.java:71)

I avoided the issue by letting the route process only one exchange at a time using ThrottlingInflightRoutePolicy.

Bean Definition
Code Block
languagexml
    <!-- Process only one exchange at a time. Used for updating statuses -->
    <bean id="oneExchangeThrottlePolicy" class="org.apache.camel.impl.ThrottlingInflightRoutePolicy">
        <property name="maxInflightExchanges" value="1"/>
        <property name="scope" value="Route"/>
    </bean>
Route Definition
Code Block
languagexml
<route id="dataSetFileStatusUpdate" routePolicyRef="oneExchangeThrottlePolicy">

Log Snippet
Code Block
10:24:22.709 | INFO | Camel (camel-1) thread #13 - file:///home/customer/irm-emea-dev/filearea/inbound | dataSetFileStatusUpdate | ID-linux-wbx3-1601540648191-0-1 | o.a.c.i.ThrottlingInflightRoutePolicy | Throttling consumer: 2 > 1 inflight exchange by suspending consumer: Consumer[direct://dataSetFileStatusUpdate]
10:24:22.709 | INFO | Camel (camel-1) thread #13 - file:///home/customer/irm-emea-dev/filearea/inbound | dataSetFileStatusUpdate | ID-linux-wbx3-1601540648191-0-1 | o.a.c.i.ThrottlingInflightRoutePolicy | Throttling consumer: 1 <= 1 inflight exchange by resuming consumer: Consumer[direct://dataSetFileStatusUpdate]

onComplete behaviour when calling one route multiple times

The onCompletion block of a child route is executed at the end of the parent route, using the latest exchange. If the same child route is called multiple times from the parent, the onCompletion block does not run after each call but only after all calls have finished. See the comments in the code sample:

Code Block
languagexml
        <!-- Price Condition 006 -->
        <route id="exportPriceConditionA006Regions">
            <from uri="direct:exportPriceConditionsA006Regions"/>
                <!-- EMEA -->
                    <log message="Exporting EMEA Region"/>
                    <!-- header for a dynamic over Sales Orgs -->
                    <setHeader headerName="Country">
                        <constant>EMEA</constant>
                    </setHeader>
                    <setHeader headerName="sftpOutputFolder">
                        <constant>INF0070</constant>
                    </setHeader>
                    <to uri="direct-vm:fetchSalesOrg"/>
                    <to uri="direct:exportPriceConditionsA006"/>
                <!-- Russia -->
                    <log message="Exporting Russia Region"/>
                    <!-- header for a dynamic over Sales Orgs -->
                    <setHeader headerName="Country">
                        <constant>RU</constant>
                    </setHeader>
                    <setHeader headerName="sftpOutputFolder">
                        <constant>INF0068-1</constant>
                    </setHeader>
                    <to uri="direct-vm:fetchSalesOrg"/>
                    <to uri="direct:exportPriceConditionsA006"/>
                <!-- North America -->
                    <log message="Exporting North America Region"/>
                    <!-- header for a dynamic over Sales Orgs -->
                    <setHeader headerName="Country">
                        <constant>US</constant>
                    </setHeader>
                    <setHeader headerName="sftpOutputFolder">
                        <constant>INF0068</constant>
                    </setHeader>
                    <to uri="direct-vm:fetchSalesOrg"/>
                    <to uri="direct:exportPriceConditionsA006"/>
                <!-- all three onComplete will run now with headers etc. setup by the last call -->
        </route>

        <route id="exportPriceConditionA006">
            <!--      <from uri="timer://exportPriceConditionA004?repeatCount=1"/>-->
            <from uri="direct:exportPriceConditionsA006"/>
            <log loggingLevel="INFO" message="Sales Orgs: ${header.salesOrgsList}"/>
            <!-- set the export file name -->
            <setHeader headerName="CamelFileNameOnly">
                <simple>PC_A006_DT_${date:now:yyyyMMdd_HHmmssSSS}.csv</simple>
            </setHeader>
            <setHeader headerName="CamelFileName">
                <simple>{{sap-price-condition-export.folder}}/${header.CamelFileNameOnly}</simple>
            </setHeader>
            <log loggingLevel="INFO" message="Starting export of the price condition to ${header.CamelFileName}"/>
            <!-- fetches the data set row by its status -->
            <to uri="pfx-api:fetch?objectType=LTV&pricingParameterName=A006_PricingConditions&filter=priceConditionsA006Filter"/>
            <!-- save number of rows for journal -->
            <setHeader headerName="exportedRows">
                <simple>${header.totalRows}</simple>
            </setHeader>
            <setHeader headerName="csvHeader">
                <constant>Condition type,Sales Org.,Distr. Channel,Price List,Document Currency,Material,Amount,Unit,Condition Pricing Unit,UoM,Valid From,Valid To,Scale Quantity,Scale Amount</constant>
            </setHeader>
            <!-- group by column for Scale Quantity Processor -->
            <setProperty propertyName="scaleQuantityKeyName">
                <constant>Material</constant>
            </setProperty>
            <filter>
                <simple>${header.exportedRows} > 0</simple>
                <multicast parallelProcessing="false" stopOnException="true">
                    <to uri="direct:exportPriceConditionA006ToFile"/>
                    <to uri="direct:exportPriceConditionA006ToDS"/>
                    <to uri="direct:exportPriceConditionA006MarkExported"/>
                </multicast>
            </filter>
            <log loggingLevel="INFO" message="Processing completed. Exported ${header.exportedRows} rows."/>
            <onCompletion onCompleteOnly="true">
                <!-- only if we found rows to export -->
                <choice>
                    <when>
                        <simple>${header.exportedRows} > 0</simple>
                        <!-- journal the export -->
                        <wireTap uri="direct-vm:journalExportedFile"/>
                        <!-- transfer the file to the sftp -->
                        <to uri="direct:exportPriceConditionsToSFTP"/>
                    </when>
                    <otherwise>
                    </otherwise>
                </choice>
            </onCompletion>
        </route>

How to Copy File(s) with SCP from One Server to Another via Proxy Jump

...

Code Block
languagebash
root@node1.customerA-qa.pricefx.net # scp -P 666 -o "ProxyJump jan.kadlec@jmp.pricefx.eu -p 666" root@node1.customerA.pricefx.net:/home/customer/customerA_prod/filearea/lizeosopricesfull/LizeoSelloutPrices_AMN_20200927* /tmp/
LizeoSelloutPrices_AMN_20200927000000.csv                                                                                                  100%   59MB 102.7MB/s   00:00    
root@node1.customerA-qa.pricefx.net # 

Now you have the required file from the PROD server in the /tmp folder of the QA server.
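The generic form of the command is sketched below. All host names, users and paths are placeholders, not real values. Note that scp accepts the -J shorthand for ProxyJump since OpenSSH 8.0; on older versions use the -o "ProxyJump ..." form shown in the session above.

Code Block
languagebash
# Copy files from a source server to the local machine through a jump host.
# <ssh-port>, <user>, <jump-host> and <source-host> are placeholders.
scp -P <ssh-port> -J <user>@<jump-host> '<user>@<source-host>:/path/to/files*' /tmp/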

How to Remove IM from Server (with Refresh of Monitoring Tool)

Code Block
root@server-a.pricefx.net ~ # salt-call grains.get im
local:
    ----------
    installed_ims:
        - im-customer1-prod
        - im-customer1-qa
        - im-customer2-qa
        - im-customer1-dev
        - im-customr2-prod
        - im-customer3-prod
    installed_pies:
    running_ims:
        - im-customer1-prod
        - im-customer1-qa
        - im-customer2-qa
        - im-customer1-dev
        - im-customr2-prod
        - im-customer3-prod
    running_pies:
root@server-a.pricefx.net ~ # systemctl stop im-customer3-prod
root@server-a.pricefx.net ~ # systemctl disable im-customer3-prod
im-customer3-prod.service is not a native service, redirecting to systemd-sysv-install.
Executing: /lib/systemd/systemd-sysv-install disable im-customer3-prod
perl: warning: Setting locale failed.
perl: warning: Please check that your locale settings:
LANGUAGE = (unset),
LC_ALL = (unset),
LC_ADDRESS = "cs_CZ.UTF-8",
LC_NAME = "cs_CZ.UTF-8",
LC_MONETARY = "cs_CZ.UTF-8",
LC_PAPER = "cs_CZ.UTF-8",
LC_IDENTIFICATION = "cs_CZ.UTF-8",
LC_TELEPHONE = "cs_CZ.UTF-8",
LC_MEASUREMENT = "cs_CZ.UTF-8",
LC_TIME = "cs_CZ.UTF-8",
LC_NUMERIC = "cs_CZ.UTF-8",
LANG = "en_US.UTF-8"
    are supported and installed on your system.
perl: warning: Falling back to a fallback locale ("en_US.UTF-8").
root@server-a.pricefx.net ~ # unlink /etc/init.d/im-customer3-prod
root@server-a.pricefx.net ~ # systemctl daemon-reload
root@server-a.pricefx.net ~ # systemctl reset-failed
root@server-a.pricefx.net ~ # salt-call saltutil.refresh_grains
local:
    True
root@server-a.pricefx.net ~ # salt-call grains.get im
local:
    ----------
    installed_ims:
        - im-customer1-prod
        - im-customer1-qa
        - im-customer2-qa
        - im-customer1-dev
        - im-customr2-prod
    installed_pies:
    running_ims:
        - im-customer1-prod
        - im-customer1-qa
        - im-customer2-qa
        - im-customer1-dev
        - im-customr2-prod
    running_pies:
root@server-a.pricefx.net ~ #

Finally, remove im-customer3-prod from configuration.py. This manual step will be replaced by the command "salt-call state.apply im.monitoring" in the next release of the monitoring tool.
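In short, the removal steps from the session above are (im-customer3-prod is the example instance being removed):

Code Block
languagebash
systemctl stop im-customer3-prod
systemctl disable im-customer3-prod    # non-native service, redirects to systemd-sysv-install
unlink /etc/init.d/im-customer3-prod   # drop the leftover SysV init script
systemctl daemon-reload
systemctl reset-failed
salt-call saltutil.refresh_grains      # refresh grains so the monitoring tool sees the change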

How to Send a Message to /dev/null

Sometimes you need to throw away a message to keep a flow going. This can be achieved by logging to a disabled log endpoint: Camel marks the message as consumed and moves on to the next one.

Empty log
Code Block
languagexml
                <choice>
                <when>
                    <simple>${header.exportedRows} > 0</simple>
                    <wireTap uri="direct-vm:journalExportedFile"/>
                    <!-- transfer the file to the sftp -->
                    <to uri="direct:exportCreditMemoToSFTP"/>
                </when>
                    <otherwise>
                        <to uri="log:dev.null?level=OFF"/>
                    </otherwise>
                </choice>
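Alternatively, plain Camel provides the stop processor, which ends routing of the current exchange without needing a log endpoint. A sketch of the same otherwise branch:

Code Block
languagexml
<otherwise>
    <!-- stop routing this exchange; it is still marked as consumed -->
    <stop/>
</otherwise>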

Correct setup for LOG and ELK in IM 1.3.x and newer

Many integrations may have an incorrect setup. If you migrated to Camel 3.5 (IM 1.3.x and newer), follow these steps: 

...

Remove the logback-spring.xml file from your IM project.
IM has its own logging configuration file. The main.log file will be created in the new "logs" subfolder in the IM instance folder, where monitoring picks it up. 

Remove these properties from the property files; they are no longer needed. 

Code Block
logging.file.name=main.log
integration.logging.file=main.log

...

Add these new properties to the property files. This is the ELK configuration for setups that do not use a config server. 

Code Block
integration.logstash.enabled=true
integration.logstash.address=elkint.pricefx.eu:4560

How to Set a Constructor Parameter for a Bean (e.g. a Converter)

Some values are not available as properties. For example, the pattern for DecimalToString can only be set as a constructor parameter.

Code Block
languagexml
  <bean id="decimalToString" class="net.pricefx.integration.mapper.converter.DecimalToString">
    <constructor-arg name="pattern" value="#.###############################################"/>
  </bean>
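If the parameter name is not known, standard Spring also allows passing the value by position. This sketch is equivalent to the named form above:

Code Block
languagexml
  <bean id="decimalToString" class="net.pricefx.integration.mapper.converter.DecimalToString">
    <!-- index-based alternative to name="pattern" -->
    <constructor-arg index="0" value="#.###############################################"/>
  </bean>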

How to Access Spring Properties in Simple Blocks

Here is an example of how to log a property set in a property file:

Code Block
languagexml
Property file:
bosch-rexroth.initial-load-master-data-batch-size=50000
Route:
<log message="Running batch number# ${exchangeProperty.CamelSplitIndex}, batch size ${properties:bosch-rexroth.initial-load-master-data-batch-size} for file ${header.CamelFileNameOnly}"/>
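The same simple language function works in any expression block, for example to store the property in a header for later use. A sketch, reusing the property key from the example above:

Code Block
languagexml
<setHeader headerName="batchSize">
    <!-- ${properties:...} is resolved by the simple language per message -->
    <simple>${properties:bosch-rexroth.initial-load-master-data-batch-size}</simple>
</setHeader>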

Manual Integration using event processing is flooded with "perEventTypeEventRouteInputEventRoutePADATALOAD_COMPLETED" messages

Under Loggers, search for net.pricefx.integration.api.PriceFxExceptionTranslator and change the log level to WARN.
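If you prefer to persist this setting across restarts, the standard Spring Boot logging property form can be added to the property files. This is a sketch, assuming the IM instance honors Spring Boot logging.level.* properties the same way it does the other logging.* properties shown elsewhere on this page:

Code Block
logging.level.net.pricefx.integration.api.PriceFxExceptionTranslator=WARN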

Provisioned instance fails with "no spring.config.import property has been defined"

If you had a provisioned AWS instance on version 3.7.0 or below, you might encounter this problem when you switch your project to a Custom Image build. It is caused by a missing dependency in pom.xml. Add this dependency to pom.xml to resolve the issue:

Code Block
<dependency>
  <groupId>org.springframework.cloud</groupId>
  <artifactId>spring-cloud-starter-bootstrap</artifactId>
  <version>4.0.4</version>
</dependency>

...

Child pages (Children Display)
allChildrentrue