# Snowflake Orchestrator
| Type  | Flexible                         |
| ----- | -------------------------------- |
| Image | $DATAOPS_SNOWFLAKE_RUNNER_IMAGE  |
The Snowflake orchestrator is a flexible orchestrator that executes ad-hoc SQL statements against a Snowflake database. It handles the database connection and the SQL statement execution using credentials stored in the DataOps Vault.
## Usage
A typical use case for this orchestrator is to connect to Snowflake and retrieve contextual information from the database, or to trigger additional actions during pipeline execution.

For instance, the following example illustrates how this orchestrator uses the `dataops-snowsql` script to emit information about the current account, database, schema, connected user, current role, and the roles available to that user.
```yaml
Direct SQL:
  image: $DATAOPS_SNOWFLAKE_RUNNER_IMAGE
  extends:
    - .agent_tag
  stage: "Snowflake Actions"
  variables:
    SNOWSQL_USERNAME: DATAOPS_VAULT(SNOWFLAKE.SOLE.USERNAME)
    SNOWSQL_PWD: DATAOPS_VAULT(SNOWFLAKE.SOLE.PASSWORD)
    SNOWSQL_ACCOUNT: DATAOPS_VAULT(SNOWFLAKE.SOLE.ACCOUNT)
    SNOWSQL_ROLE: DATAOPS_VAULT(SNOWFLAKE.SOLE.ROLE)
    SNOWSQL_WAREHOUSE: DATAOPS_VAULT(SNOWFLAKE.TRANSFORM.WAREHOUSE)
    DATAOPS_SOURCE_FILE: $CI_PROJECT_DIR/env.sh
  script:
    - /dataops
    - source $DATAOPS_SOURCE_FILE
    # emit information about the connected user, role, etc.
    - dataops-snowsql -u $SNOWSQL_USERNAME -p "$SNOWSQL_PWD" -a $SNOWSQL_ACCOUNT -r $SNOWSQL_ROLE -w $SNOWSQL_WAREHOUSE -f table "select current_region(), current_account(), current_database(), current_schema(), current_user(), current_role(), current_available_roles();"
  icon: ${SNOWFLAKE_ICON}
```
### dataops-snowsql command-line arguments

The `dataops-snowsql` script accepts the following command-line arguments:
```text
$ dataops-snowsql -h
usage: dataops-snowsql [-h] [-a ACCOUNT] [-u USER] [-p PASSWORD] [-r ROLE]
                       [-w WAREHOUSE] [-f {table,csv,json,yaml}] [-o OUTFILE]
                       [sql]

Run a SQL query against Snowflake and return the results.

positional arguments:
  sql                   SQL statement to execute (or set DATAOPS_SQL or pipe
                        to stdin)

optional arguments:
  -h, --help            show this help message and exit
  -a ACCOUNT, --account ACCOUNT
                        Snowflake account, e.g. acme123.eu-west-2.aws (or set
                        SNOWFLAKE_ACCOUNT)
  -u USER, --user USER  Snowflake username (or set SNOWFLAKE_USER)
  -p PASSWORD, --password PASSWORD
                        Snowflake password (or set SNOWFLAKE_PASSWORD)
  -r ROLE, --role ROLE  Snowflake role (or set SNOWFLAKE_ROLE)
  -w WAREHOUSE, --warehouse WAREHOUSE
                        Snowflake warehouse (or set SNOWFLAKE_WAREHOUSE)
  -f {table,csv,json,yaml}, --format {table,csv,json,yaml}
                        Display/output format for the results (or set
                        DATAOPS_SQL_FORMAT)
  -o OUTFILE, --outfile OUTFILE
                        Save the results to this file (or set
                        DATAOPS_SQL_OUTFILE, otherwise stdout)
```
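As the help text indicates, every connection flag has an environment-variable equivalent, and the SQL statement itself can be supplied via `DATAOPS_SQL` or piped to stdin instead of being passed as an argument. A minimal sketch of these alternatives, assuming the same vault-resolved `SNOWSQL_*` variables as in the job above have already been sourced:

```bash
# A sketch, assuming the SNOWSQL_* variables from the job above are set.
# Each connection flag has an environment-variable equivalent:
export SNOWFLAKE_ACCOUNT=$SNOWSQL_ACCOUNT
export SNOWFLAKE_USER=$SNOWSQL_USERNAME
export SNOWFLAKE_PASSWORD=$SNOWSQL_PWD
export SNOWFLAKE_ROLE=$SNOWSQL_ROLE
export SNOWFLAKE_WAREHOUSE=$SNOWSQL_WAREHOUSE

# The SQL statement can be piped to stdin, and the results written
# to a file in one of the supported formats (here, CSV):
echo "select current_user(), current_role();" | dataops-snowsql -f csv -o whoami.csv
```

The filename `whoami.csv` is purely illustrative.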
## Example jobs
### Retrieving the pipeline database's DDL
This example illustrates how to retrieve the DDL script for the executing DataOps pipeline's Snowflake database and store it as an artifact attached to the job. Typically, the DDL script's name is similar to `DATAOPS_PROD-123456.sql` if the executing pipeline is a production-environment pipeline.
```yaml
Direct SQL:
  image: $DATAOPS_SNOWFLAKE_RUNNER_IMAGE
  extends:
    - .agent_tag
  stage: "Snowflake Actions"
  variables:
    SNOWSQL_USERNAME: DATAOPS_VAULT(SNOWFLAKE.SOLE.USERNAME)
    SNOWSQL_PWD: DATAOPS_VAULT(SNOWFLAKE.SOLE.PASSWORD)
    SNOWSQL_ACCOUNT: DATAOPS_VAULT(SNOWFLAKE.SOLE.ACCOUNT)
    SNOWSQL_ROLE: DATAOPS_VAULT(SNOWFLAKE.SOLE.ROLE)
    SNOWSQL_WAREHOUSE: DATAOPS_VAULT(SNOWFLAKE.TRANSFORM.WAREHOUSE)
    DATAOPS_SOURCE_FILE: $CI_PROJECT_DIR/env.sh
  script:
    - /dataops
    - source $DATAOPS_SOURCE_FILE
    - dataops-snowsql -u $SNOWSQL_USERNAME -p "$SNOWSQL_PWD" -a $SNOWSQL_ACCOUNT -r $SNOWSQL_ROLE -w $SNOWSQL_WAREHOUSE -f table -o ${DATAOPS_DATABASE}-${CI_PIPELINE_ID}.sql "select get_ddl('database', '${DATAOPS_DATABASE}');"
  icon: ${SNOWFLAKE_ICON}
  artifacts:
    name: ${DATAOPS_DATABASE} DDL
    paths:
      - ${DATAOPS_DATABASE}-${CI_PIPELINE_ID}.sql
```
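The same pattern extends to narrower scopes, since Snowflake's `GET_DDL` function also accepts object types such as `'schema'` and `'table'`. A hedged sketch of the corresponding script line for schema-level DDL, where `STAGING` is a hypothetical schema name used only for illustration:

```bash
# Sketch only: STAGING is a hypothetical schema in the pipeline database.
# GET_DDL also accepts 'schema' and 'table' object types.
dataops-snowsql -u $SNOWSQL_USERNAME -p "$SNOWSQL_PWD" \
    -a $SNOWSQL_ACCOUNT -r $SNOWSQL_ROLE -w $SNOWSQL_WAREHOUSE \
    -o ${DATAOPS_DATABASE}-STAGING-${CI_PIPELINE_ID}.sql \
    "select get_ddl('schema', '${DATAOPS_DATABASE}.STAGING');"
```

To attach the resulting file to the job, list its name under the job's `artifacts:paths`, as in the example above.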
## Project Resources

None

## Host Dependencies (and Resources)

None