
Dataiku Orchestrator

Enterprise

Image: $DATAOPS_DATAIKU_RUNNER_IMAGE
Feature status: Public Preview (PubPrev)

The Dataiku orchestrator integrates and runs Dataiku Scenarios as part of a DataOps pipeline.

Usage

This YAML script demonstrates how to structure a typical Dataiku job within a DataOps pipeline.

pipelines/includes/local_includes/dataiku-jobs/my_dataiku_job.yml
"Dataiku job":
extends:
- .agent_tag
stage: Datascience Processing
image: $DATAOPS_DATAIKU_RUNNER_IMAGE
variables:
DATAIKU_API_KEY: DATAOPS_VAULT(DATAIKU.API_KEY)
DATAIKU_URL: DATAOPS_VAULT(DATAIKU.URL)
DATAIKU_PROJECT_KEY: DATAOPS_VAULT(DATAIKU.PROJECT_KEY)
DATAIKU_SCENARIO_ID: DATAOPS_VAULT(DATAIKU.SCENARIO_ID)
script:
- /dataops
icon: ${DATAIKU_ICON}
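
To make this job part of a pipeline, reference its file from a pipeline definition. A minimal sketch, assuming a pipeline file named full-ci.yml and the include path used above (both names are illustrative; adjust them to your project layout):

full-ci.yml

include:
  - /pipelines/includes/local_includes/dataiku-jobs/my_dataiku_job.yml

Note that the stage named in the job ("Datascience Processing" here) must also be defined in the project's stage configuration.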

Supported parameters

| Parameter | Required/Default | Description |
|---|---|---|
| DATAIKU_URL | REQUIRED | URL for the Dataiku instance |
| DATAIKU_API_KEY | REQUIRED | Dataiku API access key (see the Dataiku Authentication docs) |
| DATAIKU_PROJECT_KEY | REQUIRED | Dataiku project key for the Scenario you want to trigger using DataOps.live |
| DATAIKU_SCENARIO_ID | REQUIRED | Dataiku Scenario ID that you want to trigger using DataOps.live |
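
The DATAOPS_VAULT() lookups in the examples resolve these values from the project's vault at runtime, so no credentials are committed to the repository. A minimal sketch of the matching vault structure, assuming the key paths used above (all values are illustrative placeholders):

DATAIKU:
  URL: https://dataiku.example.com # illustrative instance URL
  API_KEY: <your-dataiku-api-key>  # illustrative placeholder
  PROJECT_KEY: MY_PROJECT          # illustrative project key
  SCENARIO_ID: MY_SCENARIO         # illustrative Scenario ID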

Example jobs

This YAML script demonstrates a typical Dataiku use case: starting a pre-existing Dataiku Scenario.

pipelines/includes/local_includes/dataiku-jobs/run_dataiku_scenario.yml
"Dataiku job":
extends:
- .agent_tag
stage: Datascience Processing
image: $DATAOPS_DATAIKU_RUNNER_IMAGE
variables:
DATAIKU_API_KEY: DATAOPS_VAULT(DATAIKU.API_KEY)
DATAIKU_URL: DATAOPS_VAULT(DATAIKU.URL)
DATAIKU_PROJECT_KEY: DATAOPS_VAULT(DATAIKU.PROJECT_KEY)
DATAIKU_SCENARIO_ID: DATAOPS_VAULT(DATAIKU.SCENARIO_ID)
script:
- /dataops
icon: ${DATAIKU_ICON}
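
Multiple Scenarios can be orchestrated by defining one job per Scenario. A minimal sketch, assuming two hypothetical Scenario IDs (PREPARE_DATA and TRAIN_MODEL) and a second, later stage named "Model Training" that is defined in the project's stage configuration:

"Dataiku prepare data":
  extends:
    - .agent_tag
  stage: "Datascience Processing"
  image: $DATAOPS_DATAIKU_RUNNER_IMAGE
  variables:
    DATAIKU_API_KEY: DATAOPS_VAULT(DATAIKU.API_KEY)
    DATAIKU_URL: DATAOPS_VAULT(DATAIKU.URL)
    DATAIKU_PROJECT_KEY: DATAOPS_VAULT(DATAIKU.PROJECT_KEY)
    DATAIKU_SCENARIO_ID: PREPARE_DATA # hypothetical Scenario ID
  script:
    - /dataops
  icon: ${DATAIKU_ICON}

"Dataiku train model":
  extends:
    - .agent_tag
  stage: "Model Training" # hypothetical stage; runs after Datascience Processing
  image: $DATAOPS_DATAIKU_RUNNER_IMAGE
  variables:
    DATAIKU_API_KEY: DATAOPS_VAULT(DATAIKU.API_KEY)
    DATAIKU_URL: DATAOPS_VAULT(DATAIKU.URL)
    DATAIKU_PROJECT_KEY: DATAOPS_VAULT(DATAIKU.PROJECT_KEY)
    DATAIKU_SCENARIO_ID: TRAIN_MODEL # hypothetical Scenario ID
  script:
    - /dataops
  icon: ${DATAIKU_ICON}

Because the two jobs sit in different stages, the second Scenario starts only after the first completes.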