
Each Data Load type below is described together with a bulleted list of the targets it is available for.

Purge Cache

Clears the cache. The cache applies to queries on Data Sources and Datamarts, both from the UI and from logics. There is no expiration, but the cache is limited in size (by default 1000 entries per partition) and uses an LRU (Least Recently Used) eviction policy. It is invalidated after a data change (load, flush, refresh), so purging the cache is only a recovery mechanism.

  • Data Source

  • Datamart
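
The eviction behavior of such a cache can be illustrated with a minimal, generic sketch (Python; the class name, method names, and capacity are hypothetical and do not represent the actual Pricefx implementation, only the LRU idea):

    from collections import OrderedDict

    class QueryResultCache:
        """Minimal LRU cache sketch: fixed capacity, evicts the least recently used entry."""

        def __init__(self, capacity=1000):        # default mirrors the 1000 entries per partition mentioned above
            self.capacity = capacity
            self.entries = OrderedDict()           # keeps entries in access order

        def get(self, query_key):
            if query_key not in self.entries:
                return None                        # cache miss: the query goes to the database
            self.entries.move_to_end(query_key)    # mark as most recently used
            return self.entries[query_key]

        def put(self, query_key, result):
            self.entries[query_key] = result
            self.entries.move_to_end(query_key)
            if len(self.entries) > self.capacity:
                self.entries.popitem(last=False)   # evict the least recently used entry

        def purge(self):
            self.entries.clear()                   # what a data change or a Purge Cache run effectively does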

Truncate

Deletes (all/filtered) rows in the target.

Note: When a Data Source is deployed, the Truncate Data Load of the linked Data Feed is updated with a filter that includes only rows previously successfully flushed to the Data Source, and it is scheduled to run once a week. This applies only if no other filter or schedule is already defined.

Note

Incremental mode is no longer available for Truncate jobs. For older jobs (created before the upgrade to the Collins 5.0 release) where this option was enabled, it stays enabled. If you disable the Incremental option, the check-box becomes non-editable and you cannot enable the option again. For Data Loads saved with the Incremental option off, the check-box is hidden completely.

  • Data Feed

  • Data Source

  • Datamart

  • Sim Datamart

Upload

Loads an XLS/XLSX sheet or a CSV/TXT file. Supports scientific notation for numbers: 1e02, 1E02, 1e+02, 1E+02, 6e-04, 6E-04.

  • Data Source

  • Datamart
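
As a quick illustration of the accepted scientific notation (plain Python parsing, not the Pricefx upload parser), all of the listed forms resolve to ordinary decimal values:

    # Each supported notation parses to a plain decimal number.
    for text in ["1e02", "1E02", "1e+02", "1E+02", "6e-04", "6E-04"]:
        print(text, "->", float(text))   # e.g. 1e02 -> 100.0, 6e-04 -> 0.0006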

Mass edit

Allows you to apply mass changes to editable attributes. See Common Table Actions for details.

  • Data Source

  • Datamart

Flush

Copies data from the Data Feed into the Data Source. It also converts values from strings to the data types set in the Data Source.

It can copy everything or just new data (i.e., an incremental Data Load).

  • Data Feed

  • Data Source
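
A minimal, generic sketch of the Flush idea described above (string values converted to the target column types, optionally copying only rows added since the last flush); the column names, types, and keying are hypothetical examples, not the actual Flush implementation:

    from datetime import date

    # Hypothetical target column types of a Data Source (not an actual Pricefx schema).
    COLUMN_TYPES = {"sku": str, "quantity": int, "price": float, "invoiceDate": date.fromisoformat}

    def flush(feed_rows, data_source, last_flushed_index=0, incremental=True):
        """Copy feed rows into the Data Source, converting string values to the target types.

        With incremental=True, only rows added to the feed since the last flush are copied;
        otherwise all feed rows are copied."""
        start = last_flushed_index if incremental else 0
        for row in feed_rows[start:]:
            typed = {col: cast(row[col]) for col, cast in COLUMN_TYPES.items()}
            data_source[typed["sku"]] = typed      # upsert keyed by the business key
        return len(feed_rows)                      # position to remember for the next incremental run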

Refresh

Copies data from Data Sources (configured in the Datamart fields) into the target Datamart. It can copy everything or just new/modified data (i.e., incremental Data Load).

If a source filter is set on the Source tab, only the filtered rows are loaded to the Datamart. Rows that do not meet the filter’s condition and are present in the Datamart are automatically deleted from the Datamart when the Refresh job is run.

Tip: If you want to run a non-incremental refresh but avoid the costly merging of nearly identical data, you can truncate the Datamart first by setting the advanced configuration option 'truncateDatamartOnNonIncRefresh' to true.

Note: Since Godfather 8.1, rows updated during Refresh behave differently: their calculated fields are cleared to NULL instead of being persisted. For details see the release notes.

  • Datamart

  • Sim Datamart
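
The effect of a source filter during Refresh can be sketched generically as follows; the function and field names are hypothetical, and the sketch only illustrates that filtered rows are loaded while previously loaded rows that no longer match the filter are removed:

    def refresh(source_rows, datamart, key, source_filter=lambda row: True):
        """Sketch of a (non-incremental) Refresh with a source filter."""
        matching = {row[key]: row for row in source_rows if source_filter(row)}
        for existing_key in list(datamart):
            if existing_key not in matching:
                del datamart[existing_key]     # row no longer meets the filter -> deleted from the Datamart
        datamart.update(matching)              # filtered rows are loaded / updated
        return datamart

    # Example: only EU rows are kept in the Datamart.
    dm = {"B": {"sku": "B", "region": "US"}}
    rows = [{"sku": "A", "region": "EU"}, {"sku": "B", "region": "US"}]
    refresh(rows, dm, key="sku", source_filter=lambda r: r["region"] == "EU")   # dm now contains only "A"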

Calculation

Applies a logic (defined in Configuration) to create new rows or to change/update values in existing rows in the target Data Source or Datamart. The calculation can take data from anywhere, e.g., from Master Data tables.

Example usage:

  • Datamart / Data Source column calculations

  • Rebate allocations

  • Copying data from PX / Company Parameters /... into the Data Source

  • Data Source

  • Datamart
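
In Pricefx the logic itself is defined in Configuration; the generic sketch below (Python, hypothetical column names) only illustrates the idea of a row-wise calculation that fills or updates a column in the target from data looked up elsewhere:

    def calculate_margin(rows, cost_lookup):
        """Illustrative row-wise calculation: fill a 'margin' column from price and a cost table."""
        for row in rows:
            cost = cost_lookup.get(row["sku"], 0.0)   # e.g. data taken from a Company Parameter table
            row["margin"] = row["price"] - cost       # new/updated value written back to the target row
        return rows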

Calendar

Generates rows for the built-in Data Source "cal", producing a Gregorian calendar with the US locale. (If you need a different business calendar, upload the data into the "cal" Data Source from a file or via integration instead, and do not use this Data Load.)
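
For illustration, generated calendar rows typically look like the following sketch (hypothetical column names, not the exact schema of the "cal" Data Source):

    from datetime import date, timedelta

    def calendar_rows(start=date(2024, 1, 1), end=date(2024, 12, 31)):
        """Generate one Gregorian calendar row per day (illustrative columns only)."""
        day = start
        while day <= end:
            yield {
                "date": day.isoformat(),
                "year": day.year,
                "quarter": (day.month - 1) // 3 + 1,
                "month": day.month,
                "week": day.isocalendar()[1],      # ISO week number
                "weekday": day.strftime("%A"),     # e.g. 'Monday'
            }
            day += timedelta(days=1)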


Customer

Special out-of-the-box Data Load which copies data from the Master Data table "Customer" into the Data Source "Customer".


Product

Special out-of-the-box Data Load which copies data from the Master Data table "Product" into the Data Source "Product".


Simulation

Applies a logic to the data as defined by the simulation for which the Data Load was created.

  • Sim Datamart

Internal Copy

Copies data from a source into the Data Source table.

The source here can be:

  • Master Data table (P, PX, C, CX, Company Parameters)

  • PA Rollup query (intended for Rollups materialization)

  • PO Model table

  • Price Records table (Quoting or Agreements & Promotions – select the preferred option on the Source tab)

  • Rebate Records table

The easiest way to create this type of Data Load is to create a new Data Source from Template and deploy it; this automatically creates the Data Load and pre-fills the columns.

Warning: The incremental mode in Internal Copy tasks is not exactly the same as in the Refresh or Calculation types. Here, incremental means that the Data Source is not truncated before the copy, i.e., it keeps old records instead of becoming a true copy of the source.

  • Data Source
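
The difference between incremental and non-incremental Internal Copy described in the warning above can be sketched generically (hypothetical function and key names):

    def internal_copy(source_rows, data_source, key, incremental=False):
        """Sketch of the Internal Copy semantics.

        Non-incremental: the target is truncated first, so it becomes a true copy of the source.
        Incremental: the target is not truncated, so records missing from the source are kept."""
        if not incremental:
            data_source.clear()              # truncate before copy
        for row in source_rows:
            data_source[row[key]] = row      # copy / upsert the source rows
        return data_source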

Index Maintenance

This task repairs the indexes associated with the target Data Source or Datamart, typically after a backend DB migration. Run it only in these special circumstances, not on a regular or scheduled basis, and we strongly recommend consulting Pricefx Support before you run it. Note that you normally do not need to run index maintenance manually, because it runs automatically every time a Data Load completes processing.

  • Data Source

  • Datamart

Distributed Calculation

Allows you to split a large set of data into batches and process them independently. See Distributed Calculation in Analytics for details.

  • Data Source
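
The batching idea can be sketched generically as follows (hypothetical batch size and function name); each batch can then be processed by an independent calculation job:

    def batches(row_ids, batch_size=10000):
        """Split a large set of row ids into independent batches (illustrative batch size)."""
        for start in range(0, len(row_ids), batch_size):
            yield row_ids[start:start + batch_size]

    # e.g. for batch in batches(all_row_ids): run_calculation(batch)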

Publishing

Publishes data after Refresh, which makes the new data accessible to queries. It is system generated and cannot be created manually.

  • Data Source

  • Datamart
