...
Type | Description
---|---
Purge Cache | Clears the cache. The cache applies to queries on Data Sources and Datamarts, run from the UI or from logics. It has no expiration but is limited in size (default 1000 entries per partition) and uses LRU (Least Recently Used) eviction. It is invalidated after a data change (load, flush, refresh). Cache purging is a recovery mechanism.
Truncate | Deletes all or filtered rows in the target. Note: When a Data Source is deployed, the Truncate Data Load of the linked Data Feed is updated with a filter that includes only rows previously successfully flushed to the Data Source, and it is scheduled to run once a week. This applies only if no other filter or schedule is already defined.
Upload | Loads an XLS/XLSX sheet or a CSV/TXT file. Supports scientific notation for numbers: 1e02, 1E02, 1e+02, 1E+02, 6e-04, 6E-04.
Mass edit | Allows you to apply mass changes to editable attributes. See Common Table Actions for details.
Flush | Copies data from the Data Feed into the Data Source. It can also convert values from strings to the proper data types set in the Data Source. It can copy everything or only new data (i.e., an incremental Data Load).
Refresh | Copies data from Data Sources (configured in the Datamart fields) into the target Datamart. It can copy everything or only new/modified data (i.e., an incremental Data Load). If a source filter is set on the Source tab, only the filtered rows are loaded into the Datamart; rows present in the Datamart that do not meet the filter's condition are automatically deleted when the Refresh job runs. To run a non-incremental refresh while avoiding the costly merge of nearly identical data, truncate the Datamart first by setting the advanced configuration option 'truncateDatamartOnNonIncRefresh' to true. Note: Since Godfather 8.1, rows updated during Refresh behave differently: their calculated fields are cleared to NULL instead of being persisted. For details see the release notes.
Calculation | Applies a logic (defined in Configuration) to create new rows or to change/update values in existing rows in the target Data Source or Datamart. The calculation can take data from anywhere, e.g., Master Data tables.
Calendar | Generates rows of the built-in Data Source "cal", producing a Gregorian calendar with the US locale. If you need a different business calendar, upload the data into the "cal" Data Source from a file or via integration and do not use this Data Load.
Customer | Special out-of-the-box Data Load which copies data from the Master Data table "Customer" into the Data Source "Customer".
Product | Special out-of-the-box Data Load which copies data from the Master Data table "Product" into the Data Source "Product".
Simulation | Applies a logic to the data as defined by the simulation for which the Data Load was created.
Internal Copy | Copies data from a source into the Data Source table. The easiest way to create this type of Data Load is to create a new Data Source from Template and deploy it; this automatically creates the Data Load and pre-fills the columns. The incremental mode in Internal Copy tasks differs from the Refresh and Calculation types: here, incremental means the Data Source is not truncated before the copy, i.e., it keeps old records instead of becoming a true copy.
Index Maintenance | Repairs indexes associated with the target Data Source or Datamart, typically after a backend DB migration. Run this task only in these special circumstances, not on a regular or scheduled basis. We also strongly recommend consulting Pricefx support before you run it.
Distributed Calculation | Allows you to split a large set of data into batches and process them independently. See Distributed Calculation in Analytics for details.
Publishing | Publishes data after Refresh, which makes the new data accessible to queries. It is system generated and cannot be created manually.
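
The Purge Cache row describes a capacity-bounded cache with no expiration and Least Recently Used eviction. As a generic illustration of that eviction policy only (not Pricefx internals; the class name, method names, and the `capacity` parameter are assumptions), an LRU cache can be sketched in Python:

```python
from collections import OrderedDict

class QueryCache:
    """Generic LRU cache sketch: bounded in size, no expiration.
    Illustrative only -- this is not Pricefx code."""

    def __init__(self, capacity=1000):            # default 1000 entries per partition
        self.capacity = capacity
        self._entries = OrderedDict()

    def get(self, query):
        if query not in self._entries:
            return None
        self._entries.move_to_end(query)          # mark as most recently used
        return self._entries[query]

    def put(self, query, result):
        self._entries[query] = result
        self._entries.move_to_end(query)
        if len(self._entries) > self.capacity:
            self._entries.popitem(last=False)     # evict the least recently used entry

    def purge(self):
        # "Purge Cache" as a recovery mechanism: drop all cached results,
        # the same net effect as invalidation after a data change.
        self._entries.clear()
```

With `capacity=2`, putting a third entry evicts whichever of the first two was touched least recently, which is the behavior the size limit implies.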
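
The scientific notations listed for Upload follow the standard exponent format, which Python's built-in `float()` also accepts; a quick illustration (not Pricefx code):

```python
# All six notation variants from the Upload description parse to plain numbers.
values = ["1e02", "1E02", "1e+02", "1E+02", "6e-04", "6E-04"]
parsed = [float(v) for v in values]
print(parsed)  # [100.0, 100.0, 100.0, 100.0, 0.0006, 0.0006]
```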
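
The Refresh filter semantics (only filtered rows are loaded, and Datamart rows that fail the filter are deleted) can be sketched with dict-backed tables. The function name, the `sku` key, and the row shapes are all hypothetical; this models the described behavior, not the actual implementation:

```python
def refresh(datamart, source_rows, source_filter, key="sku"):
    """Model of Refresh with a source filter (illustrative only)."""
    result = dict(datamart)
    for row in source_rows:
        if source_filter(row):              # only filtered rows are loaded
            result[row[key]] = row          # insert or update (merge)
    for k in list(result):                  # rows already in the Datamart that
        if not source_filter(result[k]):    # fail the filter are deleted
            del result[k]
    return result
```

For example, with a filter of `price > 0`, a Datamart row with a negative price is removed on Refresh even though it was loaded earlier, and source rows failing the filter never arrive.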
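
The Internal Copy row notes that its incremental mode merely skips the truncate step rather than performing a true incremental merge. A minimal sketch of that distinction (hypothetical helper, not the actual implementation):

```python
def internal_copy(target_rows, source_rows, incremental=False):
    """Internal Copy semantics sketch: a full copy truncates the target
    first; incremental keeps old records and appends. Illustrative only."""
    if not incremental:
        target_rows = []                    # truncate before copy -> true copy
    return list(target_rows) + list(source_rows)
```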
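
The batching idea behind Distributed Calculation, splitting a large row set into chunks that can be processed independently, can be sketched as follows (hypothetical helper, not the Pricefx API):

```python
def batches(rows, batch_size):
    """Yield consecutive, independently processable chunks of rows."""
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]
```

Each yielded batch can then be handed to a separate worker, which is what makes the overall calculation distributable.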
...