Transaction Data Source – Create and Load with PlatformManager
Load Transaction Data Source using PlatformManager
Data Source structures are the building blocks of Pricefx DataMarts, and the core of a DataMart is the customer’s transactional history. In this narrative, we will use the PlatformManager product’s Data Upload function to load CSV file data into our Transaction Data Source.
Requirements
First, we will define a Transaction Data Source. Once its metadata has been defined, we will load the table with data from the CSV file. Here is a sample of a transaction CSV:
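The sample file itself is shown in a screenshot, but a minimal sketch in Python of what such a transaction CSV might look like is below. Apart from UniqueId and LastUpdateDate, which are named later in this walkthrough, the column names are illustrative assumptions, not the actual file layout:

```python
import csv
import io

# Hypothetical excerpt of a transaction CSV; the real file has many more
# columns (47 are defined in the Data Source). Only UniqueId and
# LastUpdateDate are named in this walkthrough; the rest are assumptions.
sample_csv = """\
UniqueId,InvoiceDate,CustomerId,ProductId,Quantity,InvoicePrice,LastUpdateDate
TX-000001,2023-01-15,C-1001,P-2001,10,25.50,2023-02-01
TX-000002,2023-01-16,C-1002,P-2002,4,99.00,2023-02-01
"""

# Parse it the same way any CSV reader would: first row is the header.
reader = csv.DictReader(io.StringIO(sample_csv))
rows = list(reader)
print(len(rows))            # 2
print(rows[0]["UniqueId"])  # TX-000001
```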
Create the Transaction Data Source
Before we can upload our data using PlatformManager, we need to create and define the attributes for our Transaction Data Source.
From the toolbar menu, click PriceAnalyzer | Data Sources
The listing of current Data Sources will be displayed:
NOTE: This picture shows the “out-of-the-box” set of Data Sources; notice that there is no Transaction Data Source. We must create it and define its column attributes ourselves.
To create a new Data Source, click on the Add Source button:
Fill out the following form as shown below:
Click the Add button. The new Transaction Data Source will appear:
Define Transaction Data Source Attributes
After creating our new Data Source, we will need to define the custom attribute columns that mirror the content that is being extracted from the customer’s application and uploaded via PlatformManager.
Click on the new Transaction Data Source and the default definition will appear:
NOTE: Unlike other table structures we may have encountered, Data Sources come with no pre-built attributes. Additionally, we will define only those columns required for our DataMart, so not all input CSV columns will be defined.
Rather than defining each column individually, we can import a JSON definition of our Data Source attributes. Click on the Import & Export button:
This will display the following:
We are going to replace the entire contents of the JSON definition, so delete the existing JSON from this panel:
Next, open the Transaction DS Definition file and copy and paste its entire contents into the JSON definition panel:
Click the Apply button. The Transaction Data Source should now appear as:
NOTE: After the import of the JSON definition, we should see 47 columns defined, with the primary key set to the UniqueId column.
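To give a feel for what the imported file contains, here is a hedged sketch, in Python, of the shape such a JSON definition might take. The field structure and property names (`name`, `label`, `type`, `key`) are assumptions for illustration; the actual Pricefx Data Source JSON schema is defined by the Transaction DS Definition file itself:

```python
import json

# Hypothetical sketch of a Data Source JSON definition; the real schema
# used by Pricefx may differ. The real file defines 47 columns, with
# UniqueId as the primary key.
definition = {
    "label": "Transaction DS",
    "fields": [
        {"name": "UniqueId",    "label": "Unique Id",    "type": "Text", "key": True},
        {"name": "InvoiceDate", "label": "Invoice Date", "type": "Date", "key": False},
        # ... further field entries, one per defined column
    ],
}

# A quick sanity check one might run before importing: which fields
# form the primary key?
key_fields = [f["name"] for f in definition["fields"] if f["key"]]
print(key_fields)                      # ['UniqueId']
print(json.dumps(definition)[:20])     # start of the serialized definition
```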
Next, we need to click on the Deploy button:
This should display the before and after version of our Data Source:
Click on the Deploy button (lower right corner).
Click on the OK button.
Upload Data using PlatformManager
Now that our Transaction Data Source has been defined, we are ready to upload the data in our CSV file using the Data Upload operation within PlatformManager.
Login to Platform Manager
In a browser, go to https://platform.pricefx.com/
If you are a Pricefx employee, click on Login with O365.
Access Target Partition
Navigate to Partitions option
Then, input (or select) your target partition:
Click on Data Upload option. Then click on the Add button:
On the Data Upload panel, add the name as TransactionDS_Load and select the Entity type of Data Source:
Then, we need to select the Data Source entity from the Entity Name list:
Click Continue.
Next, we need to identify the CSV file that will be uploaded:
Drag and drop our Transactions.csv file onto the page. This is a large file, so it may take a few moments.
PlatformManager will then parse the CSV file and, using default parsing options, show a sample layout of the file:
NOTE: Verify that the layout shown here will map to the CSV file uploaded. This view within PlatformManager will be used to map these columns to our Transaction Data Source table.
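This kind of layout check can also be done locally before uploading. Below is a minimal sketch using Python’s standard csv module to detect the delimiter and preview the header; the sample data and semicolon delimiter are hypothetical:

```python
import csv
import io

# Hypothetical one-row sample; the real Transactions.csv is much larger.
sample = "UniqueId;InvoiceDate;Quantity\nTX-1;2023-01-15;10\n"

# Sniff the delimiter the way an uploader's default parser might,
# then read the header row to sanity-check the layout.
dialect = csv.Sniffer().sniff(sample.splitlines()[0], delimiters=",;")
reader = csv.reader(io.StringIO(sample), dialect)
header = next(reader)

print(dialect.delimiter)  # ;
print(header)             # ['UniqueId', 'InvoiceDate', 'Quantity']
```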
Click Continue button.
Data Mapping
Our next phase is the mapping of the columns in the CSV file to the attributes defined in our Transaction Data Source table in our partition.
When performing the data mapping, we should be aware that column mappings will be automatically pre-assigned when the input field name matches the output field name. The mapping page should now appear:
The order of the column mappings is based on the CSV input columns. From this list we can see that all names are matching so all mappings are being done automatically.
The only exception to this is the LastUpdateDate CSV input field. We don’t need this value, so click on the trashcan icon to delete this mapping:
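The auto-mapping behavior described above amounts to simple name matching, which can be sketched as follows. The target attribute names other than UniqueId are assumptions for illustration:

```python
# Sketch of name-based auto-mapping: an input column is pre-assigned to a
# target attribute when the names match exactly; unmatched inputs (like
# LastUpdateDate here, which is not defined in the Data Source) are left
# unmapped and can simply be deleted.
input_columns = ["UniqueId", "InvoiceDate", "Quantity", "LastUpdateDate"]
target_fields = ["UniqueId", "InvoiceDate", "Quantity"]  # no LastUpdateDate

mapping  = {col: col for col in input_columns if col in target_fields}
unmapped = [col for col in input_columns if col not in target_fields]

print(mapping)   # {'UniqueId': 'UniqueId', 'InvoiceDate': 'InvoiceDate', 'Quantity': 'Quantity'}
print(unmapped)  # ['LastUpdateDate']
```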
Click Continue. This will submit the data upload for processing:
Click on our TransactionDS_Load upload process and select Show history from the menu options:
The history of the upload will appear as:
NOTE: Review the number of records processed and the number that failed. This upload shows that all records were processed successfully with 0 failures.
Additionally, if there were any record failures, then we can review the log records. The upload log records can be found via Data Upload Logs link:
Finally, to verify the upload was successful, we need to view the newly loaded Data Source table in our partition. Click on PriceAnalyzer | Data Manager | Data Sources and choose the TX Standard Data – High Tech data source:
When the Data Source panel appears, select the Transactions Data Source and click on the Data tab:
This will reload the transaction data and it will appear as:
We have successfully loaded our Transaction Data Source table!