The following areas may cause issues when you start using version Rampur 13.x, so please familiarize yourself with this section.
Unable to Refresh/Deploy Datamart with Calculated Foreign Key
Using a calculated field to join tables was never supported (as documented in Datamarts).
However, before version Rampur 13.0, such a setup may have worked in specific scenarios by pure coincidence. As of Rampur 13.0, these use cases no longer work.
Workaround: Create a Data Load to calculate and store this key in the Datamart. (In specific cases, another workaround is available but can only be enabled by Pricefx staff. Details are available in /wiki/spaces/SUP/pages/5197955073.)
Background information: We refresh a Datamart by joining all its Data Source tables and merging the resulting rows into the Datamart table. The value of a calculated field in a Datamart is determined only when a query (SELECT) runs on the Datamart, i.e., after the data has already been loaded by a Datamart Refresh. This implies that you cannot rely on the value of a calculated field to perform the Refresh itself; doing so could compromise the integrity of the data in your Datamart.
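The chicken-and-egg problem above can be illustrated with a minimal sketch. All table names, field names, and functions here are hypothetical, not Pricefx API; the point is only that a refresh-time join can match on stored values, while a calculated field does not exist until after the load:

```python
# Hypothetical data: transactions lack a stored join key; the price table
# is keyed by a combined "sku-region" value.
transactions = [{"sku": "A1", "region": "EU", "qty": 10}]
prices = [{"price_key": "A1-EU", "price": 99.0}]

def data_load(rows):
    """Workaround: compute and STORE the join key before the refresh."""
    for row in rows:
        row["price_key"] = f"{row['sku']}-{row['region']}"  # persisted, not query-time
    return rows

def refresh(fact_rows, dim_rows):
    """A refresh can only join on values already present in the stored rows."""
    dim = {d["price_key"]: d for d in dim_rows}
    return [{**f, **dim[f["price_key"]]} for f in fact_rows if f.get("price_key") in dim]

# Without the data load step, no stored key exists, so the join produces nothing.
empty = refresh([dict(r) for r in transactions], prices)

# After the data load stores the key, the refresh join succeeds.
loaded = data_load([dict(r) for r in transactions])
result = refresh(loaded, prices)
```

This mirrors the recommended workaround: a Data Load persists the key into the Datamart's source data, so the Refresh never depends on a query-time calculation.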
DS Data Push Returns an Error “Maximum number of rows exceeded”
When you have a Data Load that populates a Data Source with data, there is a limit to the number of rows that can be processed (defined in the configuration property datamart.dataLoad.maxRowsPerBatch, which is set to 100 000 by default).
In previous versions, the validation of this limit was skipped for DMDataFeeds. As of this version, the validation is enforced again.
Workaround: Split the input data into smaller jobs that each fit within the 100 000-row limit. Alternatively, in a dedicated customer environment, you may want to consider asking Pricefx Support to double this limit.
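The splitting workaround can be sketched as follows. This is a generic batching example, not Pricefx API; the constant name and function are hypothetical, and only the 100 000 default comes from the property above:

```python
# Default value of datamart.dataLoad.maxRowsPerBatch (per the docs above).
MAX_ROWS_PER_BATCH = 100_000

def split_into_batches(rows, batch_size=MAX_ROWS_PER_BATCH):
    """Yield consecutive slices of at most batch_size rows each."""
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]

# Example: 250 000 input rows become three jobs of 100k, 100k, and 50k rows,
# each of which fits within the per-batch limit.
rows = list(range(250_000))
sizes = [len(batch) for batch in split_into_batches(rows)]
```

Each resulting batch can then be submitted as a separate load job so that no single job exceeds the validated limit.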
Filters on Meta-Attributed Entities
See the topic described in /wiki/spaces/UDEV/pages/5195431938, which may also be relevant when upgrading to 13.0.