Azure Orchestrator
Type | Flexible |
---|---|
Image | $DATAOPS_AZURE_RUNNER_IMAGE |
The Azure orchestrator is a flexible orchestrator that includes the following functionality:
- Responsive (flexible) interaction with Microsoft Azure Services using the built-in Azure CLI tools
- DataOps Vault functionality that allows scripts to retrieve variables from the vault
- DataOps native tools that allow the development of custom scripts that interact with Azure
- The following added tools (a usage sketch follows this list):
  - git
  - curl
  - ssh-client
  - perl
  - sshpass
  - ansible
  - unzip
  - terraform
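Because these tools ship in the orchestrator image, a job can call them directly from its script block. Below is a minimal sketch, with an illustrative job name and stage, that simply confirms a few of the bundled tools are available:

```yaml
"Check Orchestrator Tooling": # hypothetical job name
  extends:
    - .agent_tag
  image: $DATAOPS_AZURE_RUNNER_IMAGE
  stage: "Additional Configuration"
  script:
    # the bundled tools are on the image's PATH
    - git --version
    - curl --version
    - terraform version
    - ansible --version
  icon: ${AZURE_ICON}
```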
Usage
The first use case described here is the typical one for this orchestrator, namely starting an Azure instance to perform a task in the pipeline:
"My Azure Job":
extends:
- .agent_tag
image: $DATAOPS_AZURE_RUNNER_IMAGE
stage: "Batch Ingestion"
variables:
# use one of the following connection methods
# identity inheritance
DATAOPS_USE_IDENTITY: 1
# or default vault expansion
SET_AZ_KEYS_TO_ENV: 1
# or custom vault expansion
AZURE_USER: DATAOPS_VAULT(PATH.TO.USERNAME.IN.VAULT)
AZURE_PASSWORD: DATAOPS_VAULT(PATH.TO.PASSWORD.IN.VAULT)
script:
- /dataops
- az .. # your azure cli command
icon: ${AZURE_ICON}
Additionally, the following use cases demonstrate how to connect to Azure from a DataOps pipeline:
Connecting to Azure
To connect to Azure, the Azure orchestrator supports two methods. Use one or the other but not both.
1. Username and password
To use the Azure orchestrator, you must provide your Azure username and password to the DataOps pipeline to connect to the Azure services. Setting the environment variables `AZURE_USER` and `AZURE_PASSWORD` achieves this.
We recommend that you keep your third-party credentials in the DataOps Vault. Storing them at the default paths `AZURE.DEFAULT.USER` and `AZURE.DEFAULT.PASSWORD` allows you to retrieve them by setting the environment variable `SET_AZ_KEYS_TO_ENV`.
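With credentials at those default paths, the job only needs to set this flag. A minimal sketch:

```yaml
variables:
  # expands AZURE.DEFAULT.USER and AZURE.DEFAULT.PASSWORD from the DataOps Vault
  # into the job environment
  SET_AZ_KEYS_TO_ENV: 1
```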
If you have stored your credentials at different vault paths, use the `DATAOPS_VAULT()` functionality to retrieve them:
variables:
  AZURE_USER: DATAOPS_VAULT(PATH.TO.USERNAME.IN.VAULT)
  AZURE_PASSWORD: DATAOPS_VAULT(PATH.TO.PASSWORD.IN.VAULT)
2. Inheriting permissions from the virtual machine
The Azure orchestrator also supports using the virtual machine's identity to connect to Azure. To use this feature, set the variable `DATAOPS_USE_IDENTITY`.
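For example (a minimal sketch):

```yaml
variables:
  # authenticate with the Azure identity inherited from the runner's virtual machine
  DATAOPS_USE_IDENTITY: 1
```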
Supported parameters
Parameter | Required/Default | Description |
---|---|---|
DATAOPS_USE_IDENTITY | Optional | If set, uses the Azure permissions inherited from the virtual machine |
SET_AZ_KEYS_TO_ENV | Optional | If set, exports the Azure username (AZURE.DEFAULT.USER) and password (AZURE.DEFAULT.PASSWORD) from the DataOps Vault to the environment |
Example jobs
Here are a few example jobs that describe the most common use cases:
1. Run shell script in an Azure context
You can create shell scripts in your project repository, such as `/scripts/myscript.sh`, and run them from inside a job. For example:
"My Azure Job":
extends:
- .should_run_ingestion
- .agent_tag
stage: "Batch Ingestion"
image: $DATAOPS_AZURE_RUNNER_IMAGE
variables:
DATAOPS_USE_IDENTITY: 1
script:
- /scripts/myscript.sh
icon: ${AZURE_ICON}
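The script itself runs inside the orchestrator image, so the Azure CLI and the bundled tools listed above are available to it. As a hedged sketch, a hypothetical `/scripts/myscript.sh` relying on the identity configured by the job above might look like this:

```bash
#!/usr/bin/env bash
set -euo pipefail

# list the resource groups and virtual machines visible to the job's Azure identity
az group list --output table
az vm list --output table
```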
2. Run Azure CLI
It is more common to call an Azure CLI command directly than to run a shell script in an Azure context:
"My Azure CLI job":
extends:
- .should_run_ingestion
- .agent_tag
stage: "Batch Ingestion"
image: $DATAOPS_AZURE_RUNNER_IMAGE
variables:
DATAOPS_USE_IDENTITY: 1
script:
- az vm list
icon: ${AZURE_ICON}
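Any standard Azure CLI arguments can be used with such commands to shape their output; for example (a small sketch using the CLI's global `--query` and `--output` options):

```yaml
script:
  - az vm list --output table # human-readable table
  - az vm list --query "[].name" --output tsv # only the VM names
```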
3. Run Azure CLI with DataOps vault enabled
In most cases, you will need to leverage context from other jobs within a DataOps pipeline, which means you need access to the DataOps Vault. Therefore, the best practice is to include `/dataops` in your script tag before executing the Azure CLI. For example:
"List Azure Virtual Machines":
extends:
- .should_run_ingestion
- .agent_tag
stage: "Additional Configuration"
image: $DATAOPS_AZURE_RUNNER_IMAGE
variables:
DATAOPS_USE_IDENTITY: 1
script:
- /dataops
- az vm list
icon: ${AZURE_ICON}
Including `/dataops` allows you to read from, and write to, the vault.
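For example, any other value held in the vault can be expanded into a job variable with `DATAOPS_VAULT()` and then passed to the CLI. A sketch using a hypothetical vault path and variable name:

```yaml
variables:
  DATAOPS_USE_IDENTITY: 1
  # hypothetical vault path holding an Azure resource group name
  MY_RESOURCE_GROUP: DATAOPS_VAULT(PATH.TO.RESOURCE_GROUP.IN.VAULT)
script:
  - /dataops
  - az vm list --resource-group $MY_RESOURCE_GROUP
```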
4. List available storage containers
"List Storage Containers":
extends:
- .agent_tag
stage: "Additional Configuration"
image: $DATAOPS_AZURE_RUNNER_IMAGE
variables:
AZURE_USER: DATAOPS_VAULT(PATH.TO.USERNAME.IN.VAULT)
AZURE_PASSWORD: DATAOPS_VAULT(PATH.TO.PASSWORD.IN.VAULT)
script:
- /dataops
- az storage container list
icon: ${AZURE_ICON}
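The same pattern extends to other storage operations. For instance, a hedged sketch that lists the blobs in a specific container (the job name, account name, and container name are placeholders):

```yaml
"List Container Blobs": # hypothetical job name
  extends:
    - .agent_tag
  stage: "Additional Configuration"
  image: $DATAOPS_AZURE_RUNNER_IMAGE
  variables:
    AZURE_USER: DATAOPS_VAULT(PATH.TO.USERNAME.IN.VAULT)
    AZURE_PASSWORD: DATAOPS_VAULT(PATH.TO.PASSWORD.IN.VAULT)
  script:
    - /dataops
    - az storage blob list --account-name mystorageaccount --container-name mycontainer --output table
  icon: ${AZURE_ICON}
```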