Plasma Logics

 

Plasma Quote Processor Package

Plasma_Quotes_Extractor

1. The mapping configuration is read from the QuotesDataExtractorMappingConfiguration table. It is saved as a map with 3 keys – columnsMapping, idCol and searchLevelOrder – and assigned to api.global.ExtractorConfig, which is the result of the "MappingConfiguration" element

  1. Dictionary:

  • mappingLine: a row of the QuotesDataExtractorMappingConfiguration table

  • searchLevelOrder: a list of search levels (Q,QLI,QLI_INPUT,QLI_OUTPUT,W)

  • columnsMapping: map of configurations per quote type; each quote type is a key and the value is a map of configs: quoteAndWorkflowLevelConfig, otherLevelConfig, calculationAtQConfig, calculationAtQLIConfig

  • quoteAndWorkflowLevelConfig: map of config rows whose "Extract Level" column value is Quote or Workflow

  • otherLevelConfig: map of config rows whose "Extract Level" column value is other than Quote or Workflow

  • idCol: value of columnsMapping.idColumn; if it is null, the default is "UniqueId"
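
Put together, the structure assigned to api.global.ExtractorConfig might look like this sketch (the quote type and field values are illustrative only, not taken from a real partition):

```groovy
// Illustrative shape of api.global.ExtractorConfig (values are examples only)
api.global.ExtractorConfig = [
    searchLevelOrder : ["Q", "QLI", "QLI_INPUT", "QLI_OUTPUT", "W"],
    idCol            : "UniqueId",              // columnsMapping.idColumn, or "UniqueId" when null
    columnsMapping   : [
        "Standard Quote" : [                    // quote type as key
            quoteAndWorkflowLevelConfig : [quoteId: [expression: "uniqueName", level: "Q"]],
            otherLevelConfig            : [quantity: [expression: "Quantity", level: "QLI_INPUT"]],
            calculationAtQConfig        : [:],
            calculationAtQLIConfig      : [:]
        ]
    ]
]
```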

2. The quote object is fetched from Quote Configurator using api.currentItem()

3. Workflow info for the quote is fetched by quote id and saved to an object. It contains data about the approval steps, the step the quote is currently on, and some general data about the customer etc.

4. The extraction config map (extractionConfigMapping) is set based on the quote type, taken from columnsMapping with the quote type as the key

5. The search level source map is set – it contains quote data + QLI data + workflow data, but after the refactor the search level map is divided into separate quote data extraction and QLI data extraction.

6. General quote data is extracted (about the quote itself, not a particular line item): fields like CreationDateTime, CustomerId, QuoteId, SalesPerson etc. At a high level, data is extracted level by level, but the only real differentiation is quote vs QLI data.

For extraction of general quote data only levels Q and W are used; QLI data is not used here.

6.1. Extraction of this general data is done by calling "extractByConfig" on each column from columnsMapping. Column data (type, expression and level) is passed as a list of maps, where each column's data is stored in one map

6.2. "extractByConfig" takes a map from the list and calls itself recursively, performing extraction on the map by calling "extractConfigFromMap" (unless there is no level for the given config, in which case the first value found is returned)

6.3. It is checked whether the given config row is of "constant" type (can be default/constant/calculation). For the "constant" type the expression String from the config is evaluated using FormulaEvaluator; for default or calculation it is checked (again – possibly redundantly) whether a level is specified, and if not, the first value found is taken

6.4. Depending on the extraction level of the given config, an extraction method is chosen

(all of these operate on levelMap as the data to extract, plus levelName to decide on the extraction method, plus configExpression)

6.4.1. For Quote level (Q) -> if the expression is found as a key in the searchLevel map, the value for that key is returned as the extracted value. Otherwise it is checked whether the expression is one of 4 columns: "ConvertedToDealDateTime" / "MarkedAsLostDateTime" / "RevokedDateTime" / "NeedApproval", and accordingly whether the quote status equals DEAL / LOST / REVOKED – if so, the extracted value is taken from lastUpdateTime. "NeedApproval" is a special case for which no extraction is done.
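
The quote-level special-column handling above can be sketched as follows (field names such as quote.status and quote.lastUpdateTime are assumptions for illustration, not taken from the actual logic):

```groovy
// Sketch of the quote-level (Q) extraction described above.
def extractQuoteLevel(Map searchLevel, Map quote, String expression) {
    if (searchLevel.containsKey(expression)) {
        return searchLevel[expression]
    }
    // date columns are only filled when the quote reached the matching status
    def statusByColumn = [
        ConvertedToDealDateTime : "DEAL",
        MarkedAsLostDateTime    : "LOST",
        RevokedDateTime         : "REVOKED"
    ]
    def expectedStatus = statusByColumn[expression]
    if (expectedStatus != null) {
        return quote.status == expectedStatus ? quote.lastUpdateTime : null
    }
    if (expression == "NeedApproval") {
        return null  // special case – no extraction is done here
    }
    // default: fall through to expression-based extraction on line item data
    return null
}
```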

By default, further extraction of the quote is based on the expression and the line item data.

After parsing, the expression is used as a key to access a value in the QLI data.

The example of expression which needs parsing could be:

FIND_ALL( FIND_ALL(steps, it.executionStatus=='EXECUTED_APPROVED')[-1].stepHistory.history, it.status=='EXECUTED_APPROVED')[0].date

An expression can be used as a fallback only if the user needs to extract something that is not provided OOTB, like mappings such as NumberOfApprovalSteps, ConvertedToDealDateTime etc.

If not initialized earlier, api.global.ExtractorParsedConfigCached is initialized; if it is already initialized, it is checked whether a value for this expression already exists (to avoid unnecessary re-parsing)

The expression is parsed using ExtractorConfigTokenizer, which splits the expression String into a list of expressions, using a dot ('.') as the delimiter between them, and the expression along with its parsed form is added to ExtractorParsedConfigCached. Extraction is then run recursively on each element of the expressions list.
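
A minimal sketch of the parse cache (the real ExtractorConfigTokenizer also understands brackets and embedded formulas, so plain dot tokenization is a simplification here):

```groovy
// Tokenize each expression once per job run, reuse the parsed form afterwards.
if (api.global.ExtractorParsedConfigCached == null) {
    api.global.ExtractorParsedConfigCached = [:]
}
def parsed = api.global.ExtractorParsedConfigCached[expression]
if (parsed == null) {
    parsed = expression.tokenize('.')   // simplified stand-in for ExtractorConfigTokenizer
    api.global.ExtractorParsedConfigCached[expression] = parsed
}
```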

It then extracts by iterating over each level of the quote map, treating each value in that map as a separate search config. The first element of the parsed expressions list is taken and used as a key to get a value from the quote map; then the same method (doExtract) is called with this value passed as the line item data. With each recursive call the current element is removed from the expressions list, and extraction is called until the list is empty
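
The recursive walk can be sketched as follows (unlike the real code, which removes elements from the shared expressions list, this sketch passes shrinking copies):

```groovy
// Consume one parsed token per call, using it as the key into the current map,
// until the token list is empty or the data runs out.
def doExtract(def data, List<String> tokens) {
    if (tokens.isEmpty() || data == null) {
        return data
    }
    def head = tokens.first()
    def rest = tokens.tail()
    return doExtract(data instanceof Map ? data[head] : null, rest)
}
```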

6.4.5. For Workflow (W) -> it is first checked whether the expression is one of the specified columns (ApprovedDateTime, SubmittedDateTime, DenyStep, ApprovedSteps, DeniedDateTime, NumberOfApprovalSteps, ApprovalProcessing) – extraction from the line item data differs slightly for each of them

6.4.6. All the above extractions are saved to the quoteLineItemRow map, which serves as the storage for extracted data (which will later be added as a row to the target DS)

7. For the config type "calculation" – if any values are present in calculationAtQConfig, extraction on these values is additionally performed and added to quoteLineItemRow

8. The list of existing quote line items for the given quote is fetched by quote type id, using api.getCalculableLineItemCollection

9. Iteration over this list of quote line items begins

10. In each iteration, for each line item, the same steps as in points 5, 6 and 7 are performed, but the only levels processed are QLI, QLI_INPUT and QLI_OUTPUT:

  • for Quote Line Item Level (QLI) -> api.global.QLIExtractorLastProcessingQuoteCache is used to cache QLI objects; if the current QLI is not equal to the previous one, the cache is cleared and the current one is put into it. If possible, the value for the given expression is taken from the cached map. If the line item map contains a value for the expression, it is returned as the extracted value, otherwise the same path as for the quote level is used

  • for Quote Line Item Output (QLI_OUTPUT) -> api.global.ExtractorInputLineId is used to cache it. First it is checked whether the value can be found in api.global.ExtractorInputsMapCached. If not, api.global.ExtractorInputsIdxCached is iterated over to find the result.

  • for Quote Line Item Input (QLI_INPUT) -> the same mechanism as for QLI_OUTPUT, only with different api.global names

11. Each row is added to the target DS:

Element Name

Description

RetainGlobal

sets api.retainGlobal to true

Library

Evaluates expressions from the config map using the FormulaEvaluator library

ExtractorConfigParser

Parses extractor expressions into a list of expression items

Configuration

Fetches mapping configs from the PP so they can be used in QuoteDataExtractor

WarningManager

Instance of WM

DataExtractionUtil

Utilities for extracting data from quotes and quote line items

Quote

Current quote object from the Feeder (api.currentItem())

WorkflowInfo

Workflow object fetched for the given quote

QuoteLevelExtractedData

Extraction of data from the Quote object

QuoteItemLevelExtractedData

Extraction of data from the line item objects for the given quote

CombinedExtractedData

Adds combined quote and QLI data to the target

 

Mapping Configurations

Mapping configurations are read from the "Plasma_Quotes_Extractor_Mapping" PP

Some mappings are provided out of the box and fit all PFX partitions; some fields need to be added, depending on the extraction levels.

For extraction levels “Quote” and “Workflow” - we have default mappings that can be deployed from repo.

For “Quote LineItem Output” and “Quote LineItem Input” - the values need to be manually added depending on quote logic and fields in DM.

A map which defines configurations to extract:

def mapping = [

    quoteId:[expression:"uniqueName",level:"Q"],

    ....

]

Then it produces a result as a map for each line item:

[quoteId:"extracted Quote Id",

 ...

]

Extract Configurations

A String or Map which defines what to get from which level. If a level is not specified, the extractor will search all levels in the configured order ["Q","QLI","QLI_INPUT","QLI_OUTPUT","W"] and return the first value found.
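
A minimal sketch of that fallback search (levelSources is an assumed map of level name to the extracted data map for that level):

```groovy
// Try each level in searchLevelOrder and return the first non-null hit.
def searchAllLevels(Map levelSources, List<String> searchLevelOrder, String expression) {
    for (level in searchLevelOrder) {
        def value = levelSources[level]?.get(expression)
        if (value != null) {
            return value
        }
    }
    return null
}
```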

·        Expression – Specifies the key to get data in a map-like structure (Q, QLI, W).

o   Predefined keys for the workflow level:

§  ApprovedDateTime – Date time of the last approved step

§  SubmittedDateTime – Submitted date time

§  DeniedDateTime – Denied Date time

§  DenyStep – Name of the Denied step

§  ApprovedSteps – List of Approved steps

§  DeniedSteps – List of Denied steps

§  ApprovalSteps – List of approval steps

§  NumberOfApprovalSteps – Number of approval steps

§  ApprovalProcessing – Shows if the workflow is processed or not (boolean)

o   Predefined keys for the quote level:

§  ConvertedToDealDateTime – When a quote was converted to a deal

§  MarkedAsLostDateTime – When a quote was marked as lost

§  RevokedDateTime – When a quote was revoked

§  NeedApproval – Shows whether a quote needs approval (boolean)

·        Level – Specifies the search level.

def mapping = [

    quoteId:[expression:"uniqueName",level:"Q"], //get uniqueName in Quote :[uniqueName:"name", version:123,...]

    ....

]

·        Dot expression (.) – Specifies the sub key to find in a multiple level map structure.

def mapping = [

    submitDateTime:[expression:"activeStep.stepHistory.date",level:"W"], /* get date from Workflow :[

                        activeStep:[

                            stepHistory:[date:"01-01-2014"],

                            ...

                        ],

                        ...

                    ] */

    ....

]

·        Multiple keys searching – Specifies multiple keys to try; the first found is returned.

def mapping = [

    quantity:[[expression:"quantity",level:"QLI_INPUT"],[expression:"Quote Quantity",level:"QLI_INPUT"]], //get quantity or "Quote Quantity" from Quote lineitem input

    ....

]

·        Default value – Specifies the default value if the result is not found.

def mapping = [

    quantity:[[expression:"quantity",level:"QLI_INPUT"],[expression:"Quote Quantity",level:"QLI_INPUT",default:"20"]], //get quantity or "Quote Quantity" from Quote lineitem input, if nothing found default is 20

    ....

]

·        Brackets expression – Specifies a search formula (evaluated using the FormulaEvaluator library) or an index to get data in a list structure, or the last/first keyword (to get the last/first item in a list).

def mapping = [

    submitDateTime1:[expression:"activeStep.stepHistory.history.[0].date",level:"W"], //get date of first history from workflow

    submitDateTime:[expression:"activeStep.stepHistory.history.[status=='INITIALIZED'].[0].date",level:"W"], /* get date from Workflow :[

                        activeStep:[

                                stepHistory:[

                                    [status:"INITIALIZED",date:"01-01-2014"],

                                    [status:"SUBMITTED",date:"01-01-2014"],

                                    ...

                                ],

                                ...

                            ],

                            ...

                        ] where status is INITIALIZED */

    ....

]

·        Constant type – Returns the defined value.

def mapping = [

    constantValue: [type:"constant", value:"value"], // return value of the constant type

    ...

]

·        Calculation type – Returns the calculated value from an expression using the FormulaEvaluator library. Calculation-type values are computed after all data has been extracted from the quotes.

def mapping = [

    approvalSteps:[expression:"steps",level: "W"], // get number of approval steps from workflow, return a list of step

    numberOfApprovalSteps:[type:"calculation",expression:"IF(IS_NULL(approvalSteps),0,SIZE(approvalSteps))"], // after all data is extracted, calculate the number of approval steps.

    ...

]

·        String – You can specify a mapping as a String. The extractor will use the string as a key to search all levels; the first value found is returned.

def mapping = [

    quantity:"Quote Quantity",

    submitDateTime:"activeStep.stepHistory.history.[status=='INITIALIZED'].date",

    ....

]

 

 

Plasma_Quotes_Extractor_Feeder

This is a feeder logic to emit Quote items to Data Load.

The feeder fetches quote data and collects each quote's typedId. These ids are then pushed forward by api.emitPersistedObject. The feeder runs in incremental mode and only fetches quotes that have one of the below workflow statuses:

·        APPROVED

·        DENIED

·        NO_APPROVAL_REQUIRED
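
A hedged sketch of the feeder flow – fetchQuotes is a hypothetical helper standing in for the actual incremental quote query, and the emit call shape is simplified for illustration:

```groovy
// Only quotes in a final workflow state are pushed to the Data Load.
def finalStatuses = ["APPROVED", "DENIED", "NO_APPROVAL_REQUIRED"]

fetchQuotes(lastRunDate).each { quote ->                  // hypothetical incremental query
    if (quote.workflowStatus in finalStatuses) {
        api.emitPersistedObject(quote.typedId)            // call shape simplified
    }
}
```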

 

Extraction levels:

  • Q – quote data

  • QLI – quote line item data

  • QLI_INPUT – quote line item input data

  • QLI_OUTPUT – quote line item output data

  • QLI_OUTPUT_SUFFIX - quote line item output suffix

  • W – quote workflow data

  • QH_INPUT - quote header input

  • QH_OUTPUT - quote header output

  • QH_OUTPUT_SUFFIX - quote header output suffix

Element Name

 

Description

QuoteIdFeeder

Fetches quotes from Quote Configurator, depending on lastUpdateDate (last update of quote in QC) and lastRun date (last job run of extractor DL)

 

Plasma_Quote_Sequencer

The Plasma sequencer is a logic which is part of the Plasma solution. It is responsible for managing and starting Plasma steps. The main configuration of this process is contained in the Plasma_Quotes_SequenceControlTable PP, which is described in detail in another part of the configuration; here we present some key information about how it works.

Element Name

Description

Sequence

Kicks off the Sequencer calling its processing method


Plasma_Quote_Sequencer_Reset

 

A CalculationFlow which runs every 2 weeks. It checks whether the Plasma_Quotes_SequenceControlTable PP table has all rows with Status DONE. If yes, it blanks the Status, which is then picked up by the sequencer, which processes all the jobs.
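
A hedged sketch of the reset check – the status column name and the row-update helper are assumptions, since how PP rows are written back varies with the setup:

```groovy
// Read all sequencer control rows and reset them once every step is DONE.
def rows = api.findLookupTableValues("Plasma_Quotes_SequenceControlTable")
boolean allDone = rows.every { it.attribute1 == "DONE" }   // status column assumed

if (allDone) {
    rows.each { row -> updateStatus(row, "") }  // hypothetical helper; blank status
                                                // makes the sequencer reprocess all jobs
}
```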

Element Name

Description

Sequence

Kicks off the Sequencer calling its processing method

 
Plasma_Quotes

This is a PA Calculation that aggregates the “Plasma_Quotes” DM data by each month/sellingFromRegion/sellingToRegion/businessUnit combination. Rollup is applied on the data to calculate sum for money/number fields. Data is sourced from Plasma_Quotes datamart. Result data is loaded into Plasma_Output_Quotes datamart.

Only data which is X months older than the current date is taken into consideration whenever this aggregation happens (X = DataDelayInMonths parameter in the Plasma_Confit PP table). The aggregated data is then written into the "Plasma_Output_Quotes" DM.

Dem_Business_Unit in the target datamart is a key, so it cannot be null. If null, it defaults to 'None'.

Element Name

Description

RetainGlobal

sets api.retainGlobal to true

IsSyntaxCheck

Technical Element

WarningManager

Instance of WM

Rows

Queries source data with rollup and adds rows to the target datamart rowset


Plasma_Quotes_Exchange 

This is a PA Calculation that applies exchange conversion from any currency to USD. It takes the data from the Plasma_Quotes DS, converts money fields to USD and loads them into the target Plasma_Quotes datamart.

It takes the currency exchange rates from the "ccy" DS. If not found there, the exchange rates are fetched from the "Plasma_Currencies" DS.
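
The fallback between the two rate sources can be sketched as follows (findRate is a hypothetical query helper; the actual lookup lives in the Exchange/Library elements):

```groovy
// Try the primary "ccy" DS first, fall back to "Plasma_Currencies".
BigDecimal rateToUsd(String currency) {
    def rate = findRate("ccy", currency)                 // primary source
    if (rate == null) {
        rate = findRate("Plasma_Currencies", currency)   // fallback source
    }
    return rate
}
```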

Element Name

Description

RetainGlobal

sets api.retainGlobal to true

SyntaxCheck

Technical element

WarningManager

Instance of WM

Row

Current row from source DS

Exchange

Contains util methods to get currency exchange rates

Library

Contains util methods to apply currency exchange rates, and calculate standardDeviation

Threshold

Extracts the ApprovalVelocities of all quotes and calculates their standard deviation, then returns the threshold as (2 * stdDev) + meanVelocity

UniqueId

Returns UniqueId from current row

Currency

Returns “USD”

ApprovalVelocity

Returns ApprovalVelocity from current row only if it is less than or equal to Threshold

BoundaryPrice

Applies currency exchange rates on BoundaryPrice from current row.

NeedsApproval

Returns NeedsApproval from current row.
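
The Threshold element's formula can be sketched as pure logic – population standard deviation is assumed here, since the text does not say which variant the library computes:

```groovy
// threshold = (2 * stdDev) + mean, over all quotes' ApprovalVelocity values
def threshold(List velocities) {
    def mean = velocities.sum() / velocities.size()
    def variance = velocities.collect { (it - mean) * (it - mean) }.sum() / velocities.size()
    def stdDev = Math.sqrt(variance as double)
    return (2 * stdDev) + mean
}
```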

 

Plasma Transaction Processor Package

 

Plasma_TX_Sequencer

The Plasma sequencer is a logic which is part of the Plasma solution. It is responsible for managing and starting Plasma steps. The main configuration of this process is contained in Plasma_Transactions_SequenceControlTable.

By default, the Plasma extractor and distiller are built of 5 steps. Each step is represented by one row in the Plasma_Transactions_SequenceControlTable PP – the name of a sequence that should be executed.

Element Name

Description

Sequence

Kicks off the Sequencer calling its processing method


Plasma_TX_Sequencer_Reset

This is a CalculationFlow which runs every 2 weeks. It checks whether the price parameter table Plasma_Transactions_SequenceControlTable has all rows with Status DONE. If yes, it blanks the Status so it is picked up by the sequencer, which then processes all the jobs.

Plasma_Transactions_Enrichment

This is a PA Calculation that calculates complex standardized metrics from the aggregated Plasma data. Various queries are run to arrive at the accounts, products, customers, sales persons, price changes, margin, revenue calculated monthly, annually, by region, etc. The standardized metrics are then loaded (updated) back into Plasma_Output_Transactions Datamart.

Customer related metrics are grouped by SellingTo. Product/SalesRep metrics are grouped by SellingFrom. Transaction metrics are grouped by both SellingTo/SellingFrom.

Element Name

Description

Dem_Year

String user entry from source DS

Dem_Month

String user entry from source DS

Dem_Quarter

String user entry from source DS

Dem_Region_Selling_To

String user entry from source DS

Dem_Region_Selling_From

String user entry from source DS

Dem_Business_Unit

String user entry from source DS. If ‘None’, then treated as null in further calculation elements

WarningManager

Instance of WM

RetainGlobal

sets api.retainGlobal to true

SyntaxCheck

Technical element

YearMonth

Converts the month into 2020-M01 format

YearQuarter

Converts the quarter into 2020-Q1 format

EnrichmentUtils

Contains util methods to do various enrichments

QueryUtils

Contains util methods to execute various queries

Cache

Contains util methods to extract data into cache

Data12Months

Boolean that says if there is trailing twelve months of data or not

AbstractQueryManager

Returns util methods to calculate margin-based metrics.

AccountsManager

Creates instance of AbstractQueryManager by sending AccountsQuery as parameter to access the customer related margin-based metrics.

Total_Accounts_This_Month_Region

Accesses AccountsManager method to get the count of accounts in current month and SellingTo region.

Number_Account_Negative_Margin

Accesses AccountsManager method to get the count of accounts having negative margin in current month and SellingTo region.

Number_Account_Positive_Margin

Accesses AccountsManager method to get the count of accounts having positive margin in current month and SellingTo region.

Sum_Revenue_Account_Negative_Margin

Accesses AccountsManager method to get the revenue of accounts having negative margin in current month and SellingTo region.

Sum_Revenue_Account_Positive_Margin

Accesses AccountsManager method to get the revenue of accounts having positive margin in current month and SellingTo region.

Total_Accounts_Available_Annual_Region

Executes dm query to get the count of accounts in trailing 12 months and SellingTo region. (Even if there are less than 12 trailing months available)

Total_Accounts_Annual_Region

Executes dm query to get the count of accounts in trailing 12 months and SellingTo region. (Only if there are 12 trailing months available)

Total_Accounts_Available_Annual

Executes dm query to get the count of accounts in trailing 12 months. (Even if there are less than 12 trailing months available)

Total_Accounts_Annual

Executes dm query to get the count of accounts in trailing 12 months. (Only if there are 12 trailing months available)

Total_Accounts_This_Month

Executes dm query to get the count of accounts in current month.

ProductsManager

Creates instance of AbstractQueryManager by sending ProductsQuery as parameter to access the product related margin-based metrics.

Total_Products_This_Month_Region

Accesses ProductsManager method to get the count of Products in current month and SellingFrom region.

Number_Product_Negative_Margin

Accesses ProductsManager method to get the count of Products having negative margin in current month and SellingFrom region.

Number_Product_Positive_Margin

Accesses ProductsManager method to get the count of Products having positive margin in current month and SellingFrom region.

Sum_Revenue_Product_Negative_Margin

Accesses ProductsManager method to get the revenue of Products having negative margin in current month and SellingFrom region.

Sum_Revenue_Product_Positive_Margin

Accesses ProductsManager method to get the revenue of Products having positive margin in current month and SellingFrom region.

Total_Products_Available_Annual_Region

Executes dm query to get the count of Products in trailing 12 months and SellingFrom region. (Even if there are less than 12 trailing months available)

Total_Products_Annual_Region

Executes dm query to get the count of Products in trailing 12 months and SellingFrom region. (Only if there are 12 trailing months available)

Total_Products_This_Quarter_Region

Executes dm query to get the count of Products in current quarter and SellingFrom region.

Total_Products_Available_Annual

Executes dm query to get the count of Products in trailing 12 months. (Even if there are less than 12 trailing months available)

Total_Products_Annual

Executes dm query to get the count of Products in trailing 12 months. (Only if there are 12 trailing months available)

Total_Products_This_Month

Executes dm query to get the count of Products in current month.

Total_Products_This_Quarter

Executes dm query to get the count of Products in current quarter.

TransactionsManager

Creates instance of AbstractQueryManager by sending TransactionsQuery as parameter to access the transaction related margin-based metrics.

Total_Number_Transactions

Accesses TransactionsManager method to get the count of Transactions in current month, SellingFrom and SellingTo region.

Number_Transaction_Negative_Margin

Accesses TransactionsManager method to get the count of Transactions having negative margin in current month, SellingFrom and SellingTo region.

Number_Transaction_Positive_Margin

Accesses TransactionsManager method to get the count of Transactions having positive margin in current month, SellingFrom and SellingTo region.

Sum_Revenue_Transaction_Negative_Margin

Accesses TransactionsManager method to get the revenue of Transactions having negative margin in current month, SellingFrom and SellingTo region.

Sum_Revenue_Transaction_Positive_Margin

Accesses TransactionsManager method to get the revenue of Transactions having positive margin in current month, SellingFrom and SellingTo region.

DecilesUtils

Contains util methods to calculate deciles on the basis of revenue

Product_Deciles_Calculation

Calculates deciles of products in current month, businessUnit, SellingFrom on the basis of revenue

Number_Products_10_Revenue_Decile

Returns the count of products in the 10th revenue decile

Number_Products_100_Revenue_Decile

Returns the count of products in the 100th revenue decile

Customer_Deciles_Calculation

Calculates deciles of customers in current month, businessUnit, SellingFrom on the basis of revenue

Number_Customers_10_Revenue_Decile

Returns the count of customers in the 10th revenue decile

Number_Customers_100_Revenue_Decile

Returns the count of customers in the 100th revenue decile

PricesManager

Contains util methods to calculate price change metrics in trailing 12 months

Annual_Size_Change_Per_Unit_List_Price

Calculates and returns sum(unit list price change)/sum(quantity) in the trailing 12 months

Annual_Percent_Change_Per_Unit_List_Price

Calculates and returns sum(unit list price change)/sum(unit list price) in the trailing 12 months

Annual_Size_Change_Per_Unit_Margin

Calculates and returns sum(unit margin change)/sum(quantity) in the trailing 12 months

Annual_Percent_Change_Per_Unit_Margin

Calculates and returns sum(unit margin change)/sum(unit margin) in the trailing 12 months

Total_Sales_HC_Available_Annual

Executes dm query to get the count of salesRep in trailing 12 months. (Even if there are less than 12 trailing months available)

Total_Sales_HC_Annual

Executes dm query to get the count of salesRep in trailing 12 months. (Only if there are 12 trailing months available)

Total_Sales_HC_This_Month

Executes dm query to get the count of salesRep in current month.

Total_Sales_HC_Available_Annual_Region

Executes dm query to get the count of salesReps in trailing 12 months and SellingFrom region. (Even if there are less than 12 trailing months available)

Total_Sales_HC_Annual_Region

Executes dm query to get the count of salesReps in trailing 12 months and SellingFrom region. (Only if there are 12 trailing months available)

Total_Sales_HC_This_Month_Region

Executes dm query to get the count of salesReps in current month and SellingFrom region.


Plasma_Transactions_Creation

This is a PA Calculation that aggregates the transaction data by each month/sellingFromRegion/sellingToRegion/businessUnit combination. Rollup is applied on the data to calculate sum for money/number fields. Data is sourced from Plasma_Transactions datamart. Result data is loaded into Plasma_Output_Transactions datamart.

Only data which is X months older than current date is taken into consideration whenever this aggregation happens (X = DataDelayInMonths parameter in Plasma_Confit PP table). The aggregated data is then written to the Plasma_Output_Transactions Datamart.

Dem_Business_Unit in the target datamart is a key, so it can not be null. If null, then defaults to ‘None’.

Element Name

Description

SyntaxCheck

Technical element

RetainGlobal

sets api.retainGlobal to true

WarningManager

Instance of WM

CalculationUtils

Contains Util methods for doing various calculations

RowsUtils

Contains Util methods to process target rows

QueryUtils

Contains Util methods for querying source data

Rows

Processes and adds rows to target datamartrowset


Plasma_Transactions_Exchange

This is a PA Calculation that applies exchange conversion from any currency to USD. It takes rows from “Plasma_TX” DS, applies currency exchange rates to convert Money fields to USD currency, loads the result into “Plasma_Transactions” DM. It runs with an incremental load. Currency exchange rates are fetched from the ccy DS. If not found there, then it is fetched from "Plasma_Currencies" DS.

All fields from source are copied to the target as is, except the money fields, which will be converted to USD currency. Also the Currency field will be overwritten to USD in target.

Element Name

Description

PreviousCurrency

String user entry from source DS

RetainGlobal

sets api.retainGlobal to true

SyntaxCheck

Technical element

WarningManager

Instance of WM

Row

Current row from source DS

Library

Contains util methods to get currency exchange rates and applying them

UniqueId

Returns UniqueId from current row

Currency

Returns “USD”

InvoicePrice

Applies currency exchange rates on InvoicePrice from current row.

GrossMargin

Applies currency exchange rates on GrossMargin from current row.


Plasma_Transactions_PreviousPrice

A PA Calculation that calculates previous prices for every transaction grouped by product/customer/business unit and sorted by pricing date. The calculation is done in local currencies with a buffer of +/-0.015, and loads the data back into Plasma_TX datasource.

Element Name

Description

ProductId

String user entry from source DS

CustomerId

String user entry from source DS

BusinessUnit

String user entry from source DS

PricingDate

String user entry from source DS

Currency

String user entry from source DS

LocalListPrice

String user entry from source DS

InvoicePrice

String user entry from source DS

NetPrice

String user entry from source DS

RealizedPrice

String user entry from source DS

Margin

String user entry from source DS

RetainGlobal

sets api.retainGlobal to true

SyntaxCheck

Technical element

ConstConfig

Aggregates all constant String fields for reuse across the logic

QueryUtils

Contains Util methods to execute various datamart queries

BufferUtil

Defines and applies buffer to decide a price change.

Example: the buffer is +/-0.015; a price is said to have changed if the difference between the price and its previous price is greater than 0.015 or less than -0.015. Otherwise there is no change in price.

PreviousRowUtils

Contains Util methods to get previous row

PreviousRow

Returns the previous row

PreviousLocalListPricePU

Returns LocalListPricePU from previous row

PreviousInvoicePricePU

Returns InvoicePricePU from previous row

PreviousNetPricePU

Returns NetPricePU from previous row

PreviousRealizedPricePU

Returns RealizedPricePU from previous row

PreviousMarginPU

Returns MarginPU from previous row

PreviousCurrency

Returns Currency from previous row

PreviousPricingDate

Returns PricingDate from previous row

DeltaLocalListPrice

Returns change in LocalListPrice after applying buffer

ListPriceChangeCounter

Returns 1 if DeltaLocalListPrice is not equal to 0, else returns 0

ListPriceUpliftCounter

Returns 1 if DeltaLocalListPrice is greater than 0

DeltaInvoicePrice

Returns change in InvoicePrice after applying buffer

DeltaNetPrice

Returns change in NetPrice after applying buffer

DeltaRealizedPrice

Returns change in RealizedPrice after applying buffer

DeltaMargin

Returns change in Margin after applying buffer
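
The buffer rule used by BufferUtil and the Delta* elements above can be sketched as:

```groovy
// A delta smaller than the buffer is treated as "no change" and reported as 0.
final BigDecimal BUFFER = 0.015

def applyBuffer(BigDecimal price, BigDecimal previousPrice) {
    def delta = price - previousPrice
    return delta.abs() > BUFFER ? delta : BigDecimal.ZERO
}
```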

 

Outlier limitations are applied to DeltaInvoicePrice and DeltaNetPrice.
If DeltaInvoicePrice is greater than DeltaLocalListPrice, DeltaInvoicePrice is limited to DeltaLocalListPrice. If DeltaInvoicePrice is negative, it is set to zero.
If DeltaNetPrice is greater than DeltaLocalListPrice, DeltaNetPrice is limited to DeltaLocalListPrice. If DeltaNetPrice is negative, it is set to zero.
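
The outlier limitation amounts to clamping each delta into the range [0, DeltaLocalListPrice]; a minimal sketch:

```groovy
// Clamp a delta: never above DeltaLocalListPrice, never below zero.
def limitDelta(BigDecimal delta, BigDecimal deltaLocalListPrice) {
    if (delta > deltaLocalListPrice) delta = deltaLocalListPrice
    if (delta < 0) delta = BigDecimal.ZERO
    return delta
}
```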

Plasma Harvester Package

Plasma Harvester sits on the plasma partition, and it collects quote and transaction data from all the customer partitions via api bound calls. This is loaded into Harvester data sources. It also runs validators on the harvested data. Final data will be refreshed into Harvester datamarts which are joined with Plasma_Entities, Plasma_Taxonomy data sources.

Plasma_Entities contains demographic data for each customer, go-live date, taxonomy, etc.

Plasma_Taxonomy contains all available industry taxonomy up to 5 levels.

It also distributes the KPI data to all the customer partitions' Plasma_KPI data feeds via api bound calls, and calls the KPI_Refresh logic on all partitions to refresh the KPI data from the data feed into the datamart.

Plasma_Harvester

A PA calculation that collects the data from all customer partitions and loads it into Plasma_Transactions_Harvester/Plasma_Quotes_Harvester, based on the extractor name.

The mapping is defined in Plasma_Transactions_Harvester_Mapping PP table for Transactions and in Plasma_Quotes_Harvester_Mapping one for Quotes.

Every customer is assigned a unique HarvesterId ABCxxx, which is appended to the UniqueId of the customer data to identify which row belongs to which customer. The final data does not contain any customer name; rows are identified only by the unique HarvesterId ABCxxx. For each active HarvesterId, a bound call is made to the customer partition and the data is extracted.
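A minimal sketch of the HarvesterId tagging; the function name and separator are assumptions — the package defines the actual UniqueId format:

```python
def tag_unique_id(harvester_id, unique_id):
    """Append the customer's HarvesterId to a row's UniqueId so the row can be
    traced back to a customer partition without exposing the customer name."""
    return f"{harvester_id}-{unique_id}"  # hypothetical separator
```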

Entity data is matched using "Unique Customer Id" in the Plasma_Transactions_Harvester_Mapping table and "Dem_Entity" in Plasma_Entities Data Source, and also using "Dem_Valid_From_Date".

Only rows that have matching entity data in Plasma_Entities are saved to the Plasma_Transactions_Harvester Data Source.

The Plasma_Transactions_Harvester_Mapping table is updated with "Last Process Date", which serves as a filter for processing incremental data on the next run.
Last Process (From) Date is blank on the first run and is then populated with the last available month date of the harvested data. This field can be set to null if there is a need to harvest all data again from the customer partition.

To Date can be set if we want to harvest data only up to a specified date.

Batch Size specifies the number of rows each api.boundcall can query; if there are more rows than the batch size, the call is repeated until all rows are processed.
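The batching behaviour can be sketched like this, where `fetch_page` stands in for one api.boundcall query (names are illustrative):

```python
def fetch_in_batches(fetch_page, batch_size):
    """Query page by page until a short page signals that all rows are processed."""
    rows, start = [], 0
    while True:
        page = fetch_page(start, batch_size)   # one bound-call query
        rows.extend(page)
        if len(page) < batch_size:             # fewer rows than batch size: done
            break
        start += batch_size
    return rows
```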

Filters can be applied if there is a need to harvest only specific data or to exclude some data. This field takes filters in JSON format.

Only data that is at least X months older than the current date is taken into consideration whenever this aggregation happens (X = the DataDelayInMonths parameter in the Plasma_Config PP table).

Element Name

Description

ExtractorName

Returns string user entry from DL object, either “Quotes” or “Transactions”, by default it’s “Transactions”

ConstConfig

Aggregates all constant String fields for reuse across the logic

AuditTable

Returns name of audit table, based on ExtractorName

WarningManager

Instance of WM (also creates an instance of Logger, needed for WM to be created)

Configuration

Reads mapping config and validator configs, returns map with configs and table names

BoundCallUtil

Utilities for boundcalls that are being called for data fetch

MetadataUtil

Utilities for processing of metadata

UpdateProcessedDateUtil

Utilities for updating processed date

BuildCriteriasUtil

Utilities for building criteria maps that are later used for fetching the correct data

DemographicData

Reads data from “Plasma_Entities” DS

ProcessDataUtil

Utilities for data processing

Extraction

Main element which uses all utilities to extract data from bound partitions and save to target

 
Plasma_Harvester_Sequencer

The Plasma sequencer is a logic that is part of the Plasma solution. It is responsible for managing and starting the Plasma Harvester steps. The main configuration of this process is contained in Plasma_Harvester_SequenceControlTable. This PP contains steps for both Transaction and Quote harvesting, which run independently.

Element Name

Description

Sequence

Kicks off the Sequencer calling its processing method

Plasma_Harvester_Sequencer_Reset

A CalculationFlow that runs every 2 weeks. It checks whether all rows in the Plasma_Harvester_SequenceControlTable priceparameter table have Status DONE. If yes, it clears the Status, which is then picked up by the sequencer, which processes all the jobs.

Element Name

Description

Sequence

Kicks off the Sequencer calling its processing method

 Plasma_Transactions_Validator

A PA logic that validates the Plasma_Transactions_Harvester DS. It processes the DS and, for each row, validates the data according to the rules. If there are any violations, they are written to the “Plasma_Transactions_Validations” datamart and categorized as Critical or not.

Validation rules: Plasma_Validations [Sharepoint].

Thresholds

For certain validation rules, thresholds are applied to check whether the values of particular buckets are within the correct range (lower than the positive threshold and higher than the negative one) or not.

positive threshold = pricePoint * 0.0001

negative threshold = positive * (-1)

Example:

  • calculation of the bucket value:
    Sum_Revenue_[eg. Accounts]_Negative_Margin + Sum_Revenue_[…..]_Positive_Margin - Total_Revenue

  • calculation of positive and negative thresholds:
    positive threshold = (Sum of all Total_Revenues for the given SellingFrom data) * 0.0001
    negative threshold = positive threshold * (-1)
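The threshold check can be sketched as follows; the strict comparisons and names are assumptions based on the description above:

```python
def within_thresholds(bucket_value, total_revenue_sum):
    """Check a bucket value against the +/- (sum * 0.0001) thresholds."""
    positive_threshold = total_revenue_sum * 0.0001
    negative_threshold = -positive_threshold
    return negative_threshold < bucket_value < positive_threshold
```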

Plasma_Quotes_Validator

A PA logic that validates the Plasma_Quotes_Harvester DS. It processes the DS and, for each row, validates the data according to the rules. If there are any violations, they are written to the “Plasma_Quotes_Validations” datamart and categorized as Critical or not.

Validation rules: Plasma_Validations [Sharepoint].

Plasma_KPI_Distributor

A CFS logic that runs for every row in the corresponding ..Harvester_Mapping PP table and loads the KPI data from the Plasma_KPI datasource to the corresponding customer’s Plasma_KPI dataFeed. It also inserts/updates a row in the Plasma_KPI_Refresher PP table, setting the “KPI Data Loaded?” field.

 

Element Name

 

Description

KPI_Type

Input user entry to either select “Transaction” or “Quote”

SyntaxCheck

Technical element

RetainGlobal

Sets api.retainGlobal to true

Logger

Library element that provides methods to do logging

Cache

This queries the KPI data and saves to global cache

Distributor

This reads the global cache and distributes the KPI data to all the customer partitions’ Plasma_KPI dataFeed.

Plasma_KPI_Refresher

This is a CF calculation that runs every hour and checks the Plasma_KPI_Refresher PP table to see whether “KPI Data Loaded?” is Y and “KPI Refreshed?” is N. If the “Refresh Day Slot” and “Refresh Hour Slot” match the current day/time, it calls the KPI_Plasma_Refresh logic on the corresponding customer partition. This is to avoid updating KPI data in the middle of business hours for any customer.

Available values for “Refresh Day Slot” are

·        Any Day

·        Weekday

·        Weekend

·        Sunday

·        …

·        Saturday

Available values for “Refresh Hour Slot” are (in UTC)

·        Any Hour

·        00

·        …

·        24
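The slot check described above can be sketched as below; matching day slots by full weekday name and hour slots by a zero-padded hour string are assumptions about the table’s format:

```python
from datetime import datetime

def slot_matches(day_slot, hour_slot, now):
    """Return True if the current UTC day/hour falls into the configured slots."""
    weekday = now.weekday()  # Monday == 0
    day_ok = (
        day_slot == "Any Day"
        or (day_slot == "Weekday" and weekday < 5)
        or (day_slot == "Weekend" and weekday >= 5)
        or day_slot == now.strftime("%A")     # e.g. "Saturday"
    )
    hour_ok = hour_slot == "Any Hour" or hour_slot == now.strftime("%H")
    return day_ok and hour_ok
```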

Element Name

 

Description

Refresher

This calls the Refresh formula on all the customer partitions, and that will refresh the KPI data from Plasma_KPI dataFeed to datamart.

Plasma_KPI_Distributor_Sequencer

The Plasma sequencer is a logic that is part of the Plasma solution. It is responsible for managing and starting the Plasma KPI Distributor steps. The main configuration of this process is contained in Plasma_KPI_Distributor_SequenceControlTable.

Element Name

Description

Sequence

Kicks off the Sequencer calling its processing method

Plasma_KPI_Distributor_Sequencer_Reset

This is a non-periodic CalculationFlow. It checks whether all rows in the Plasma_KPI_Distributor_SequenceControlTable priceparameter table have Status DONE. If yes, it clears the Status, which is then picked up by the sequencer, which processes all the jobs. This CF needs to be run manually once we receive new KPI data from Bain.

Element Name

Description

Sequence

Kicks off the Sequencer calling its processing method

Margin_Tracker

A PA logic that tracks margin data based on the Go-Live date of each customer. It calculates the sum of Margin/Revenue and their cumulative sum for each customer for N months before and after the Go-Live date. The Go-Live date of each customer is picked from the “Plasma_Entities” data source. The number of months “N” can be configured in the PP table “Margin_Tracker_Config”. The data is written into the target Datamart “Margin_Tracker” and can be used for display in a dashboard (in the future).

Element Name

Description

Data

Calculates sum and cumulative sum of Margin/Revenue for N number of months for each customer

Outliers

Transactions

  • If one of Sum_Revenue_[eg. Account]_Negative_Margin or Sum_Revenue_[…]_Positive_Margin is negative, add it to the positive value and set it to 0.

  • If Total_List_Price is negative in aggregated data, make it zero. (Apply same for Invoice, Net, Realized Price and GrossMargin)

  • Value (Total_List_Price_Change/Total_List_Price) should be between -1 and 1. (Apply same for Invoice, Net, Realized Price and GrossMargin)

  • If Price Change percentage is more than 100% or if PricePoints are negative, exclude such rows from Price Change calculation.

  • DeltaRealizedPrice should be less than or equal to DeltaLocalListPrice. If DeltaRealizedPrice is negative, then make it 0.

  • List Price Change is considered only when difference is greater than buffer of 0.015

Quotes

  • Exclude rows having ApprovalVelocity greater than (mean+(2*standardDeviation))

  • Value (Sum_Margin_Need_Approval/Total_Quote_Margin) should be between -1 and 1.

  • If one of Revenue_outside_guidelines or Revenue_in_guidelines is negative, add it to the positive value and set it to 0. The same applies to Revenue_need_approval and Revenue_approved.

  • Calculate InGuidelines/InBoundaries based on InvoicePrice vs TargetPrice.

  • If InvoicePrice is very high (greater than or equal 1,000,000,000,000,000), then exclude such rows.
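The ApprovalVelocity exclusion above can be sketched like this; whether the package uses population or sample standard deviation is not stated, so the sketch assumes population standard deviation:

```python
from statistics import mean, pstdev

def exclude_velocity_outliers(velocities):
    """Drop values whose ApprovalVelocity exceeds mean + 2 * standard deviation."""
    limit = mean(velocities) + 2 * pstdev(velocities)
    return [v for v in velocities if v <= limit]
```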

Plasma Library

Plasma_Data_Extractor

Common for both Transactions (general) and Quotes (enrichment only).

A PriceAnalyzer logic used to gather the data using the mapping defined in the configuration tables. It loads the data into the Plasma_TX Data Source. It is designed to handle incremental load.

Transactions

The Transaction Extractor takes data from all data sources specified in the config (P, C, PX, CX, DS, DM, PP), applies the mappings and formulas on the specified columns, joins all the data from these sources into one table, and then loads the data into the target DS.

How it works:

  • It gets the Extractor name from the Data Load > Calculation > Formula Fixed Inputs.

  • It reads the Plasma_Config table to see if the data is partitioned for load (feeder) and gets the primary source information.

  • It fetches the active configuration mapping from the Plasma_TransactionMapping table for the given Extractor. It reads the Plasma_Transaction_Fields table to fetch the required fields. If any required fields are missing in Plasma_TransactionMapping, then the processing will stop and an exception is thrown which is logged in the Plasma_Audit table.

  • For each active row source field, the exact field name will be fetched from the metadata if the label is given.

  • The data will be fetched from the primary source and saved as a temporary table (named by TableAlias), so that all the rows from the primary source are processed. The GroupBy formulas will be applied before saving to the temporary table.

  • If the data is partitioned, then data from the primary source will be filtered based on the items emitted from the feeder. Customer/Product master must be joined with the primary source to know the column containing the customer/product ID in the primary source.

  • If seed for masking is provided for any rows, those values will be masked before saving to the temporary table.

  • It identifies joining keys to join other sources and the data will be fetched from all the child sources based on the filtered data from the parent source.
    For example, if A is primary and is joined with B using customerId, then the data will be fetched from B which has only matching customerIds in A.

  • After all temporary tables are created for every source, the target fields will be queried by joining all the temporary tables. The formulas will be applied during the query.

  • If any required fields are null and no default value is provided, they will be omitted from the results and logged in the Plasma_Audit table. Other rows will be processed and loaded into the target data source.

Element Name

Description

Extractor

Sets up the extractor name, which is used later as a parameter

SyntaxCheckAbort

Technical element

DataPartitonValue

Technical element

FromDate

Input element with default value

ToDate

Input element with default value

PlasmaConfig

Extracts the main config from PP and returns it in a user-friendly form

ShouldProcessDataLoad

Decides whether processing should continue

OperatorMapping

Extracts operator mapping from PP

MappingConfig

Extracts data from the PP table “Plasma_TransactionMapping” and generates a config from it

Data_query

This element is responsible for creating and running the query that downloads data from the temporary tables and combines it into the final dataRowset, which is written to the target.

Logger

Instance of Logger

WarningManager

Instance of WM

StringConstants

Aggregate of all String fields reused across the logic

IncrementalMode

Returns whether incremental mode is used

PlasmaInputs

Element that aggregates fromDate, toDate, currentItem and extractor name

PlasmaTransactionRequiredFields

Element that reads PP table with Transaction_RequiredFields

PlasmaTransactionMappingFields

Element that reads the PP table with Transaction_Mappings

PrimarySourceConfig

Finds which table is a primary table and returns map with its name, type and alias

FileConfig

Element that aggregates all configs and api.locals and returns a map of them

GenerateQueryConfig

Returns map of all table aliases as keys and their configurations as the value

Columns

Extracts columns based on config

LookupData

Gets the data for all specified temporary tables

LoadDataIntoTableContext

The data from temporary tables is being added to table context

ModifyTemporaryTables

Performs changes on generated query configs adding to groupBy, fields and groupFieldMap

CreateFinalSelectSQL

Creates final select SQL

CreateFinalJoinSQL

Creates final join SQL

QueryForJoiningTempTables

This element is responsible for creating combined query based on CreateFinalSelectSQL and CreateFinalJoinSQL

Data_Query

This element is responsible for downloading data from the temporary tables and combining it into the final dataRowset, which is written to the target.

Low-level description of key elements of the Transaction Extractor

GenerateQueryConfig
  • Inputs: fileConfigs, mainConfig, plasmaInputs, primarySourceConfig, joiningRelations, primaryTableAlias, fromDate, toDate, shouldProcessDataLoad, partitionValue

  • Dictionary: partition filter – created for batching the data, built from partitionValue and filterMap (from queryConfig) and added to ‘filter’ (a list in the query config). It defines the relation to the partition value that must be fulfilled by all rows fetched from the primary source

  • partitionValue: the current item from the feeder, used for batching from the primary source; only used for the primary source

  • date filter: it filters everything between fromDate and toDate, exists only for primary source

  • currentItem: current row from feeder

  • feeder: batches the data, 100,000 rows per batch

Behavior of GenerateQueryConfig:

  1. Reads the inputs

  2. Generates map of queryConfigs

  3. Sorts aliases in the order in which tables will be then joined in Data_Query

  4. Sorts queryConfigs according to sorted aliases

  5. Adds date filters and partition filters to queryConfig for primary source

  6. Returns map of queryConfigs

queryConfig structure:

  • filterMap: source column as key, primary source column as value

  • filter: list of all filters : date filters + optimization filters + partition filters + feeder filters

  • query: the query object used to fetch data for the temporary table. The query is filled with projections (the names of the columns) and then executed, so the looked-up data is available before loading it into the table context

  • other keys: dataMap, fields, groupBy, groupFieldMap, columns, sourceType, sourceName

Modification of filterMap (queryConfig.filterMap) in case of feeder usage:

The filterMap is created based on the primary source alias and the feeder alias (the alias of the first table found with the feeder source type). All joins/mappings found between columns of the primary table and the feeder table are added to the filterMap.

Columns

Columns building and their types.

Columns are built based on mainConfig and sourceMap.
There are 2 main types of columns:

  • for data lookup, with 2 additional subtypes:
      a) for the SQL which will fetch the data
      b) for the temporary table created only for the purpose of executing the SQL from point a)

  • for creation of table names (1 to 1 with the output of the element „Columns”)

Dictionary:
  • sourceField: name of column from actual data source

  • targetField: name of column in final table

  • source table: P,C,PX,CX,PP,DM,DS.

Columns from a temporary table are 1 to 1 with the temporary source tables (if and only if we don’t perform any operations on the data from the source table)

Data_Query
  • Inputs: ShouldProcessDataLoads, api.local.tables, PlasmaInputs.currentItem,QueryForJoiningTempTables, Logger, api.local.PrimarySorceRowCount

  • Dictionary: target – the table set in the DataLoad object as target, accessed via api.getDatamartRowSet(„target”)

  • Behavior: the element fetches data from the table context (from the temporary tables) based on the query from QueryForJoiningTempTables, validates the rows by checking the required columns against requiredFields, and finally adds the data rows to the target

CreateFinalSelectSQL
  • Inputs: Columns, targetMap

  • Dictionary: targetMap – has all the mappings between temporary table and target
    TGT. – this prefix refers to target

  • Behavior: the element takes the list of Columns and the targetMap. Then it:

    1. Joins all columns from the list into 1 String, using commas
    2. Replaces each key in the String with its value from the targetMap
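The two steps can be sketched as follows (the function name and mapping direction are assumptions based on the description):

```python
def create_final_select_sql(columns, target_map):
    """Join column expressions with commas, then swap each key for its
    mapped target name (e.g. a TGT.-prefixed column)."""
    sql = ", ".join(columns)               # step 1: join with commas
    for key, value in target_map.items():  # step 2: substitute keys
        sql = sql.replace(key, value)
    return sql
```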

CreateFinalJoinSQL 
  • Inputs: joinMap, tables, PrimaryTableAlias

  • Dictionary: joinMap – an api.global containing information on how the temporary tables should be joined together, based on the config in the Plasma_Config PP table
    tables – an api.global which stores all the temporary tables; tables are added here after being created

  • primaryTableAlias – the alias marking which table is the primary one, read from the Plasma_Config PP table

  • Behavior: the element first validates joinMap and PrimaryTableAlias, then sorts all the mapping relations from joinMap based on the primaryTableAlias value. Next, it builds an SQL String that is used in subsequent elements to join all the temporary tables via SQL left joins

LookupData
  • Inputs: GenerateQueryConfig

  • Dictionary: joiningRelations – a map whose keys are the aliases of the tables to be joined (as a dot-separated String) and whose values are the relations by which the temporary tables will be mapped – mappings from one column to another. A join of 2 tables can have more than 1 relation.

  • Optimization filter: filters whose purpose is to avoid fetching redundant rows and to optimize the data lookup process. Optimization filters are created based on the joining relations. After fetching the data for a source, it is checked whether the source has any relations with other sources. If yes, the queryConfig for the given source is modified with the values for the column named in the relation. It is a list added as queryConfig.filter

  • Behavior (steps 1-3 run in a loop iterating over the queryConfigs):

  1. Looks up data based on the queryConfigs

  2. Sets optimization filters for each table

  3. Performs data masking

  4. Returns map with masked data

  • Data masking process:

For masking, a seed is used (the value in the „Seed for masking” column in the PP table). After each masking operation, the seed is added to api.local.seedMap to avoid duplicating the process (before masking, it is checked whether the given seed already exists in api.local.seedMap). Then a maskedValueKey is created from 3 inputs: the source alias, the value of the given column, and the seed for the given column. With this key, the masked value is taken from api.global.maskMap; if it does not exist in maskMap, PlasmaLib is called with the seed and the value for the given column. The masking itself happens in PlasmaLib through the getMaskedString method.
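The memoized masking flow can be sketched like this; the hash is only a stand-in for PlasmaLib.getMaskedString, whose real algorithm is internal to the library:

```python
import hashlib

def get_masked_string(value, seed):
    # Stand-in for PlasmaLib.getMaskedString (the actual algorithm is not shown here).
    return hashlib.sha256(f"{seed}:{value}".encode()).hexdigest()[:12]

mask_map = {}  # plays the role of api.global.maskMap

def mask_value(source_alias, value, seed):
    """Reuse a previously computed masked value when the same
    (alias, value, seed) combination appears again."""
    masked_value_key = (source_alias, value, seed)
    if masked_value_key not in mask_map:
        mask_map[masked_value_key] = get_masked_string(value, seed)
    return mask_map[masked_value_key]
```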

Quotes

For the quotes area the Plasma_Data_Extractor logic is used for Plasma Quotes Enrichment only.

PlasmaLib

Element Name

Description

Common

Contains util methods for columns and tables across Plasma

BatchUtils

Prepares a set of SKUs of items being calculated in the current batch and internally caches this to api.global.currentBatch, api.local.isNewBatch and api.global.iterationNumber. Also checks whether a new batch of SKUs is being calculated and returns a set of SKUs of a current batch

Logger

Contains methods for logging debug, info, error, audit, audit entry and error entry - to reuse across the Plasma

Lookup

Contains methods for finding PP tables, PP or any lookup table Id and returning data from target object

Mask

Data masking methods

Formula

Methods for String transformations based on given source maps and formula Strings

Sequencer

Methods processing all sequences from the sequence table. This element covers e-mail notifications for failed jobs as well.

WarningManager

WM adds an additional layer on top of Logger, enabling throwing exceptions and warnings based on code usage. See Plasma Warning Handling

DataStructureUtils

Utils for mapping data

DateUtils

Utils for dates operations

ExchangeUtils

Utils for exchange rates

LookupUtils

Utils to get items/properties from different containers

 

Plasma_Data_Extractor_Feeder

A logic that emits items to a Data Load. It can be configured to emit either Customer or Product items to the Data Load. It also accepts a list of Customer/Product IDs as input, so it can emit only the items from the list. The default supported entityId type is customer, with a filter on customerId. Multiple Customer IDs can be passed to the feeder.
The feeder uses sharedLib.stream to download the data and collect the typedIds; those ids are then pushed forward by api.emitPersistedObject.

Element Name

Description

EntityType

Element allows us to choose entity type

EntityId

Element allows us to choose entity id

AbortIfSyntaxCheck

Technical element

FeedEntity

Main element which processes the data

Incremental

Returns incremental data

Plasma_KPI_Sequencer

The Plasma KPI Sequencer is responsible for managing and starting the KPI Refresh steps. The jobs of this process are defined in Plasma_KPI_SequenceControlTable. The purpose of this sequence of jobs is to delete the existing KPI data and refresh it with the latest KPI data from the harvester.

Plasma_KPI_Refresh

Plasma KPI Refresh is a generic logic that works as a reset for the KPI Sequencer. When it is called by the harvester, it checks whether all rows in the Plasma_KPI_SequenceControlTable priceparameter table have Status DONE. If yes, it clears the Status, which is then picked up by the sequencer, which processes all the jobs.

 

Plasma Dashboards

package name in repository: plasma-kpi-dashboards-accelerator

Collects dashboard logics designed to work independently for Transactions and Quotes, or for both together.

·        Configurator_PlasmaInputs

·        DL_PerformanceDataGenerator

·        Dashboard_PlasmaKPIs

·        PlasmaDashboardCommon

 
