Python Engine

The Python engine allows you to execute Python code that interacts with the partition from inside a Model Class logic.

Supported Versions

When using the Python engine in production, you must specify an explicit version (e.g., v1 rather than latest).

Each version of the Python engine targets a specific version of the Pricefx platform, as shown in this table:

Python Engine version | Supported Pricefx Platform version | Note
v1  | 9.4            | No longer maintained
v2  | 9.4            | No longer maintained
v3  | 10.0           |
v4  | 10.1           |
v5  | 10.2           |
v6  | 10.3           |
v7  | 10.3, 10.4     |
v8  | 11.0           |
v9  | 11.1 and later |
v10 | 12.0           |

For more details on the changes provided by each version, see the Changelog page.

Partition Configuration

Job Trigger Image Flavors

The following images are available at https://gitlab.pricefx.eu/engineering/pricefx-python-engine/container_registry.

Image | Description
pyfx        | Minimal image containing only the pyfx Python module and required dependencies.
datascience | Image containing pyfx and common data science Python libraries.
neural      | Image containing pyfx, common data science libraries and neural network related libraries.

If you have no specific needs, use cregistry.pricefx.eu/engineering/pricefx-python-engine/datascience.
For details on the libraries available in each flavor, see Images flavour.

Recommended Resources for Job Trigger Configuration

It is recommended to start with the following values (see the sketch after this list):

  • cpuLimit: 2

  • memoryLimit: 4Gi
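Purely for orientation, the sketch below expresses these limits as a Python dictionary. Only cpuLimit and memoryLimit come from this page; the image key and the overall shape of the configuration are assumptions for illustration, not the documented job trigger schema.

    # Hypothetical sketch of job trigger resource settings.
    # Only "cpuLimit" and "memoryLimit" are taken from this page; the "image" key
    # and the surrounding structure are assumptions made for illustration.
    job_trigger_config = {
        "image": "cregistry.pricefx.eu/engineering/pricefx-python-engine/datascience",  # assumed key name
        "cpuLimit": 2,
        "memoryLimit": "4Gi",
    }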

Cluster Settings

If your Python script involves downloading/uploading large amounts of data or uploading large attachments to your model, you may need to update the corresponding cluster settings to avoid issues (such as timeouts).

Here are the relevant cluster settings to adjust:

  • Maximum size of model attachments
    This can be configured under the binaryDataService section. You should probably increase at least the maxBinarySizeValue.

  • Timeout of requests to data sources
    Adjust the datamart.query.externalDefaultTimeout and datamart.query.externalMaxTimeout.

You should create a support ticket to request these cluster setting modifications.
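As a quick reference only, the settings named above can be summarized as a simple mapping. The dotted path for the attachment size setting and the placeholder values are assumptions; the actual changes are applied by Pricefx support based on your ticket.

    # Cluster settings this page suggests adjusting, as a name -> example-value map.
    # The values are placeholders and the binaryDataService.* path is an assumption;
    # the real changes are requested through a support ticket.
    requested_cluster_settings = {
        "binaryDataService.maxBinarySizeValue": "<larger maximum attachment size>",
        "datamart.query.externalDefaultTimeout": "<longer timeout>",
        "datamart.query.externalMaxTimeout": "<longer timeout>",
    }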

Python Engine Parameters

The Python engine takes a JSON-encoded dictionary containing the following fields:

Attribute | Type | Description | Mandatory
script     | String     | The script you want to run in the Python engine. | yes
parameters | Dictionary | A dictionary of values that will be made accessible from inside the script. | no

See Run Python Scripts | Script sample for examples on how to declare and use these parameters.

See Job Trigger Calculations | Parameters size limitation for the limitations on the script/parameters size.
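As a minimal sketch, such a payload could be built in Python as shown below; script and parameters are the documented fields, while the script body and the parameter names/values are made up for illustration.

    import json

    # JSON-encoded dictionary passed to the Python engine.
    # "script" and "parameters" are the documented fields; the script body and
    # the parameter names/values below are illustrative only.
    payload = json.dumps({
        "script": "print('hello from the Python engine')",
        "parameters": {
            "threshold": 0.5,
            "region": "EMEA",
        },
    })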

Monitoring

Monitoring of a triggered job is possible in Grafana after you filter by pricefx_heartbeat_task_id. For details, see Job trigger calculation at https://pricefx.atlassian.net/wiki/spaces/EN/pages/4120346722.

To enable additional logging on a model-by-model basis, attach a file with the following content to the model:

{ "rest_calls": true }

REST requests and responses will then be logged to the console, so they can be found in Grafana.
To add the file, go to the model > click ... in the top right corner and select Attachments > Upload.


pyfx Python API

From inside the Python script, the pyfx module is available to interact with the partition.
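Because the available functions depend on the engine version and image, a version-agnostic way to check what is actually exposed is plain Python introspection; the sketch below does not assume any particular pyfx function.

    import pyfx

    # Print the public names exposed by the pyfx module in the running image,
    # so they can be cross-checked against the API documentation for your version.
    print(sorted(name for name in dir(pyfx) if not name.startswith("_")))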

Starting with version 9, the documentation for the latest Pricefx Python API version is available at https://developer.pricefx.eu/pricefx-api/pyfx/. Documentation for v8 is available here.

Here is the documentation for previous versions:

  • v1

  • v2

  • v3

  • v4

  • v7

  • v8

Additional Resources

Found an issue in the documentation? Write to us.