Informatica Cloud Orchestrator
Type | Pre-Set |
---|---|
Image | $DATAOPS_INFORMATICACLOUD_RUNNER_IMAGE |
The Informatica Cloud orchestrator is a pre-set orchestrator that triggers the start of an Informatica Cloud taskflow as part of a DataOps pipeline. This lets you integrate your existing Informatica Cloud taskflows into a DataOps pipeline.
Usage
The Informatica Cloud orchestrator orchestrates Informatica Cloud taskflows via the Informatica Cloud REST API.
The orchestrator's workflow is as follows:
- Triggers the specified taskflow at the given Informatica Cloud URL.
- Polls Informatica Cloud for the taskflow's status and progress.
- Propagates the Informatica Cloud taskflow execution status to the DataOps pipeline.
"pipelines/includes/local_includes/informatica_jobs/my_informatica_job.yml
"My Informatica Cloud Job":
extends:
- .agent_tag
stage: "My Stage"
image: dataopslive/dataops-informaticacloud-orchestrator:pripre-infromatica
variables:
INFORMATICA_TASKFLOW_URL: https://id.informaticacloud.com/active-bpel/rt/TaskflowID
INFORMATICA_TASK_TYPE: TASKFLOW
INFORMATICA_USERNAME: XXXX
INFORMATICA_PASSWORD: XXXX
script:
- /dataops
icon: ${INFORMATICA_ICON}
We recommend that you configure the DataOps pipeline to continue running only if the Informatica Cloud job is successful, ensuring that the pipeline run does not transform any out-of-date data.
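The simplest way to achieve this is to rely on stage ordering: a job in a later stage only starts once every job in the earlier stages has succeeded. The sketch below assumes a downstream MATE (transform) job in a later stage; the job name, stage names, and the `$DATAOPS_TRANSFORM_RUNNER_IMAGE`, `TRANSFORM_ACTION`, and `${TRANSFORM_ICON}` settings are illustrative of a typical transformation job, not requirements of this orchestrator.

```yaml
# Sketch only: job and stage names are illustrative. Because a job in a later
# stage starts only after all jobs in earlier stages succeed, this
# transformation job never runs if the Informatica Cloud job fails.
"Transform Ingested Data":
  extends:
    - .agent_tag
  stage: "Data Transformation"            # a stage ordered after the Informatica job's "Data Ingestion" stage
  image: $DATAOPS_TRANSFORM_RUNNER_IMAGE  # assumed MATE (transform) orchestrator image variable
  variables:
    TRANSFORM_ACTION: RUN                 # assumed MATE variable for running the models
  script:
    - /dataops
  icon: ${TRANSFORM_ICON}
```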
Supported parameters
Parameter | Required/Default | Description |
---|---|---|
INFORMATICA_USERNAME | REQUIRED but can be pulled from the DataOps Vault | Username to access the Informatica Cloud API |
INFORMATICA_PASSWORD | REQUIRED but can be pulled from the DataOps Vault | Password to access the Informatica Cloud API |
INFORMATICA_TASK_TYPE | REQUIRED | Currently only supports TASKFLOW |
INFORMATICA_TASKFLOW_URL | REQUIRED | The REST API URL for the taskflow |
INFORMATICA_TASKFLOW_ARGS | Optional | Arguments to pass to the Informatica Cloud taskflow, as a JSON object |
INFORMATICA_TIMEOUT | Optional. Defaults to 3600 | Informatica Cloud taskflow timeout in seconds. If you increase the DataOps job timeout, set this to an equivalent value. |
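For example, if a taskflow regularly runs longer than the one-hour default, raise the job-level timeout and INFORMATICA_TIMEOUT together. The sketch below is illustrative only: `timeout` is the standard job-level keyword, and the two-hour values are examples, not recommendations.

```yaml
"Start Long-Running Informatica Cloud Job":
  extends:
    - .agent_tag
  stage: "Data Ingestion"
  image: $DATAOPS_INFORMATICACLOUD_RUNNER_IMAGE
  timeout: 2h                      # raise the DataOps job timeout...
  variables:
    INFORMATICA_TASKFLOW_URL: https://id.informaticacloud.com/active-bpel/rt/TaskflowID
    INFORMATICA_TASK_TYPE: TASKFLOW
    INFORMATICA_USERNAME: DATAOPS_VAULT(INFORMATICA.USERNAME)
    INFORMATICA_PASSWORD: DATAOPS_VAULT(INFORMATICA.PASSWORD)
    INFORMATICA_TIMEOUT: 7200      # ...and keep the taskflow timeout equivalent (2 hours in seconds)
  script:
    - /dataops
  icon: ${INFORMATICA_ICON}
```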
Example jobs
These examples demonstrate what typical pipeline jobs look like:
- Informatica Cloud Job
- Informatica Cloud Job with Arguments
pipelines/includes/local_includes/informatica_jobs/my_informatica_job.yml

```yaml
"Start Informatica Cloud Job":
  extends:
    - .agent_tag
  stage: Data Ingestion
  image: $DATAOPS_INFORMATICACLOUD_RUNNER_IMAGE
  variables:
    INFORMATICA_TASKFLOW_URL: https://id.informaticacloud.com/active-bpel/rt/TaskflowID
    INFORMATICA_TASK_TYPE: TASKFLOW
    INFORMATICA_USERNAME: DATAOPS_VAULT(INFORMATICA.USERNAME)
    INFORMATICA_PASSWORD: DATAOPS_VAULT(INFORMATICA.PASSWORD)
  script:
    - /dataops
  icon: ${INFORMATICA_ICON}
```
pipelines/includes/local_includes/informatica_jobs/my_informatica_job.yml

```yaml
"Start Informatica Cloud Job":
  extends:
    - .agent_tag
  stage: Data Ingestion
  image: $DATAOPS_INFORMATICACLOUD_RUNNER_IMAGE
  variables:
    INFORMATICA_TASKFLOW_URL: https://id.informaticacloud.com/active-bpel/rt/TaskflowID
    INFORMATICA_TASK_TYPE: TASKFLOW
    INFORMATICA_USERNAME: DATAOPS_VAULT(INFORMATICA.USERNAME)
    INFORMATICA_PASSWORD: DATAOPS_VAULT(INFORMATICA.PASSWORD)
    INFORMATICA_TASKFLOW_ARGS: |
      {
        "arg_1": 1
      }
  script:
    - /dataops
  icon: ${INFORMATICA_ICON}
```