...

Review Model Types

Select Model Types from the main menu path: Optimization > Pricing Guidance > Model Types, as follows:

The system displays a list of existing model types; the one you will use in this exercise is Segmentation. You can review the algorithm behind a model type by clicking View Definition under the Definition column, which displays the elements and steps used in the calculation.

Review Model Types: 

 

Select Models from the main menu path: Optimization > Pricing Guidance > Models, as follows:

The system displays the existing models, each with its number of calculated records and its model type.

In the top-right corner of the window, click the Add Model icon.

Provide a name that includes your Student number (no spaces), a label/name to identify your model, and select the Segmentation option from the Model Type drop-down box. Click the Add icon.

The system displays a confirmation message at the top of the window: 

The system displays the main tabs to build the model: 

 

Definition – represents the area where the logic is defined to identify the calculation context, using user inputs (filters, mapping parameters, percentiles, etc.).

Analysis – represents the area where the data is uploaded, selected, and distributed for the Segmentation.  

Calculation – represents the execution to generate the Segmentation, including several steps with calculation results. 

Results – represents the display of the Segmentation results (tree branches), with an Evaluation area to search for a specific segment (tree search) using different attribute values.
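The tree search described above can be pictured as walking a segment tree by attribute values and stopping at the deepest matching node. The sketch below is a generic illustration only; the tree structure, attribute names, and segment labels are assumptions for the example, not the product's internal representation.

```python
# Hypothetical segment tree: each node has a segment label and children
# keyed by an (attribute, value) pair (names are illustrative assumptions).
tree = {
    "segment": "ALL",
    "children": {
        ("region", "EMEA"): {
            "segment": "EMEA",
            "children": {
                ("industry", "Retail"): {"segment": "EMEA/Retail", "children": {}},
            },
        },
    },
}

def find_segment(node, attributes):
    """Descend the tree while a child matches the given attribute values."""
    for (attr, value), child in node["children"].items():
        if attributes.get(attr) == value:
            return find_segment(child, attributes)
    return node["segment"]  # deepest matching segment

match = find_segment(tree, {"region": "EMEA", "industry": "Retail"})
```

When no child matches (e.g. an unseen region), the search falls back to the closest ancestor segment, here the root "ALL".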

 

To populate the Field rows in the right area, you must select the data source (Datamart) for this model. Open the drop-down and select the Standard Sales Data (DM) option:

As soon as you select the Datamart, the filters are displayed. It is very important to keep negative and null values out of the model, so you must create a filter for data-quality purposes. Click the Set Filter option:

Using the drop-down arrows, select the displayed filter options for Gross Margin Percent:

You will need to create more than one filter at a time. Click the Add Rule option and select the AND condition:

Using the drop-downs, complete the following filters for the Gross Margin and Global List Price selections, and click the Apply icon.

  

Note: These filters prevent negative values (from the Datamart) for margin and list prices from entering the model calculations.
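Conceptually, the combined AND filter keeps only rows where all three measures are present and non-negative. The sketch below illustrates that logic on plain Python rows; the field names and the exact comparison (non-null and >= 0) are assumptions for the example, not the exact conditions configured in the UI.

```python
# Hypothetical sample rows standing in for Datamart transactions
# (values and field names are illustrative assumptions).
rows = [
    {"gross_margin_pct": 12.5, "gross_margin": 40.0, "global_list_price": 320.0},
    {"gross_margin_pct": -3.0, "gross_margin": -9.6, "global_list_price": 320.0},  # negative margin
    {"gross_margin_pct": None, "gross_margin": 15.0, "global_list_price": 250.0},  # null value
]

def passes_quality_filter(row):
    """AND of the three data-quality rules: value present and not negative."""
    for field in ("gross_margin_pct", "gross_margin", "global_list_price"):
        value = row.get(field)
        if value is None or value < 0:
            return False
    return True

clean = [r for r in rows if passes_quality_filter(r)]
```

Only the first row survives; the other two would distort the percentile calculations, which is why the filter is applied before any data is loaded.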

 

The additional fields displayed form the Mapping area, which tells the system logic (behind the model) which data to bring into the model from the different Data Sources/Datamarts to perform the calculations. Using the drop-downs, select the following options:

 

Note: These selections are based on the business requirements and Datamart’s content to perform the model’s calculation. 

 

The price optimization/Segmentation is based on percentiles of the target metric distribution. Every segment is given floor and stretch percentiles that define guardrails for the target percentile. The system already displays the Percentile Values, such as the floor and ceiling (stretch) prices; you can update them here based on your business requirements. If Target is left blank, it is calculated automatically by the optimization/Segmentation model. For this exercise, leave this area at its defaults.

Floor Percentile – represents the floor optimization target in the segment. 

Target Percentile – used to propose an optimization target value, positioned between the floor and the stretch. It can either be entered into the model or calculated from each segment's score.

Ceiling (stretch) Percentile – represents the stretch optimization target value in the segment. 

 

Additional Parameters

The Additional Parameters area determines how much of the available transaction data in the Datamart is used in the Segmentation model calculation. These selections help build the dimensions, data profiling, and drivers within the model. Update the following parameters.

 

Max Cardinality – the largest number of distinct values a dimension may have for it to be taken into account as a price driver (across the spectrum).

Min #Transactions in Segment – the smallest size a segment can have. If a segment is too small, it is not created, and the model uses the level above it (the parent) in the segment tree.

Min #Customers/#Products in Segment – the smallest number of customers/products a segment can contain.

Policy Start Date/End Date – provide the time bounds applied when using the policy records.
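The size thresholds above can be read as an admissibility check: a candidate segment is kept only if it is large enough, otherwise the model rolls it up to its parent in the tree. The sketch below illustrates that rule; the threshold values and function names are assumptions for the example, not the product's internal algorithm.

```python
# Hypothetical threshold values (assumptions for illustration).
MIN_TRANSACTIONS = 50
MIN_CUSTOMERS = 5
MIN_PRODUCTS = 5

def segment_is_valid(n_transactions, n_customers, n_products):
    """A candidate segment is kept only if all size thresholds are met."""
    return (n_transactions >= MIN_TRANSACTIONS
            and n_customers >= MIN_CUSTOMERS
            and n_products >= MIN_PRODUCTS)

# A segment that is too small is not created; the parent level is used instead:
use_parent = not segment_is_valid(n_transactions=30, n_customers=8, n_products=4)
```

Raising these minimums yields fewer, more statistically robust segments; lowering them yields a deeper tree with more specific, but noisier, guidance.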

 

After completing all the parameters, filters, and data-mapping sections, click the Save Model icon in the top-right corner of the window.

 

The system displays a confirmation message at the top of the window. 

Now the Field rows are available, based on the data-mapping fields and the Datamart selected as the source in the previous steps.

 

Click the Source Data tab. The model is empty because you still need to complete the Analysis of the model and execute the data-upload process to populate the records from the Datamart.

 

Click on the Analysis tab on the top-left corner of the window. 

 

This section is used to prepare the data for the model as you require. In this Data Preparation section, click the Calculate icon to start uploading the data into the model from the Datamart.

 

Note: The DataPrep step materializes the data, applying all the filters, and helps reduce the processing time of the subsequent steps. The Incremental Calculate icon can be used for partial recalculations after specific changes/updates.

 

Click on the Continue icon in the validation message. 

 

The system displays the status of the model preparation, passing from Pending > Processing > Ready (give the system a couple of minutes to process the request, and click the Refresh icon if necessary). The system then displays the number of rows of data loaded from the Datamart that will be used in this model.

You can validate the actual data loaded into the model by going back to the Definition tab. 

Click on the Source Data tab. 

 

The system displays all the data loaded into the model. Notice the function icons in the top-right area; they are available to maintain the records within the model in case of changes.

 

Purge Cache – clears the cache. The cache applies to queries on the model from the Data Preparation calculation. 

Truncate – deletes all (or filtered) rows from the calculation. Use this option when filters in the Definition area have changed, to avoid duplications.

Upload – loads an XLS/XLSX sheet or a CSV/TXT file.

Mass Edit – allows you to apply mass changes to editable attributes. 

Status History – displays information related to the records status history.  

Export – allows you to export the records to an external XLS or CSV format file. 

...