
Snowflake Orchestrator

Available on: Professional, Enterprise

Image: $DATAOPS_SNOWFLAKE_RUNNER_IMAGE

The Snowflake orchestrator runs ad-hoc SQL statements against a Snowflake database. It handles the database connection and the SQL statement execution using credentials stored in the DataOps Vault.

Usage

A typical use case for this orchestrator is to connect to Snowflake and retrieve contextual information from the database or trigger additional actions during pipeline execution.

For instance, the following example uses the dataops-snowsql script to emit information about the current region, account, database, schema, connected user, role, and the other roles available to that user.

pipelines/includes/local_includes/snowflake/snowsql.yml
Direct SQL:
  image: $DATAOPS_SNOWFLAKE_RUNNER_IMAGE
  extends:
    - .agent_tag
  stage: "Snowflake Actions"
  variables:
    SNOWSQL_USERNAME: DATAOPS_VAULT(SNOWFLAKE.SOLE.USERNAME)
    SNOWSQL_PWD: DATAOPS_VAULT(SNOWFLAKE.SOLE.PASSWORD)
    SNOWSQL_ACCOUNT: DATAOPS_VAULT(SNOWFLAKE.SOLE.ACCOUNT)
    SNOWSQL_ROLE: DATAOPS_VAULT(SNOWFLAKE.SOLE.ROLE)
    SNOWSQL_WAREHOUSE: DATAOPS_VAULT(SNOWFLAKE.TRANSFORM.WAREHOUSE)
    DATAOPS_SOURCE_FILE: $CI_PROJECT_DIR/env.sh
  script:
    - /dataops
    - source $DATAOPS_SOURCE_FILE
    # emit information about the connected user, role, etc.
    - dataops-snowsql -u $SNOWSQL_USERNAME -p "$SNOWSQL_PWD" -a $SNOWSQL_ACCOUNT -r $SNOWSQL_ROLE -w $SNOWSQL_WAREHOUSE -f psql "select current_region(), current_account(), current_database(), current_schema(), current_user(), current_role(), current_available_roles();"
  icon: ${SNOWFLAKE_ICON}

Another use case is to run the contents of a .sql file, for example, scripts/example_script.sql. The job reads the file and passes every command it contains to dataops-snowsql. This is particularly useful when you want to execute multiple consecutive SQL commands in a single run, or when you maintain a multiline SQL command in a separate .sql file.

pipelines/includes/local_includes/snowflake/snowsql.yml
Direct SQL:
  extends:
    - .agent_tag
  stage: "Additional Configuration"
  image: $DATAOPS_SNOWFLAKE_RUNNER_IMAGE
  variables:
    SNOWSQL_USERNAME: DATAOPS_VAULT(SNOWFLAKE.SOLE.USERNAME)
    SNOWSQL_PWD: DATAOPS_VAULT(SNOWFLAKE.SOLE.PASSWORD)
    SNOWSQL_ACCOUNT: DATAOPS_VAULT(SNOWFLAKE.SOLE.ACCOUNT)
    SNOWSQL_ROLE: DATAOPS_VAULT(SNOWFLAKE.SOLE.ROLE)
    SNOWSQL_WAREHOUSE: DATAOPS_VAULT(SNOWFLAKE.TRANSFORM.WAREHOUSE)
    DATAOPS_SOURCE_FILE: $CI_PROJECT_DIR/env.sh
  script:
    # read the SQL file into a variable so it can be passed as a single argument
    - export DATAOPS_RUN_SQL_SCRIPT=$(cat $CI_PROJECT_DIR/scripts/example_script.sql)
    - /dataops
    - source $DATAOPS_SOURCE_FILE
    - echo "$DATAOPS_RUN_SQL_SCRIPT"
    - dataops-snowsql -u $SNOWSQL_USERNAME -p "$SNOWSQL_PWD" -a $SNOWSQL_ACCOUNT -r $SNOWSQL_ROLE -w $SNOWSQL_WAREHOUSE "$DATAOPS_RUN_SQL_SCRIPT"
  icon: ${SNOWFLAKE_ICON}
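
For reference, a hypothetical scripts/example_script.sql could hold several consecutive statements like the following. The schema and table names below are placeholders for this sketch, not part of the orchestrator:

scripts/example_script.sql
-- Illustrative contents only; DEMO_SCHEMA and EVENTS are placeholder names.
create schema if not exists DEMO_SCHEMA;

create table if not exists DEMO_SCHEMA.EVENTS (
    id number,
    payload variant,
    loaded_at timestamp_ltz default current_timestamp()
);

comment on table DEMO_SCHEMA.EVENTS is 'Created by the Snowflake orchestrator';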

dataops-snowsql command-line arguments

The dataops-snowsql script accepts the following command-line arguments:

$ dataops-snowsql --help
usage: dataops-snowsql [-h] [-a ACCOUNT] [-u USER] [-p PASSWORD] [-k KEY_PAIR]
                       [--passphrase PASSPHRASE] [--auth {PASSWORD,KEY_PAIR}]
                       [-r ROLE] [-w WAREHOUSE] [-d DATABASE] [-s SCHEMA]
                       [-f {psql,csv,json,expanded,fancy_grid,grid,html,latex,latex_booktabs,mediawiki,orgtbl,pipe,plain,rst,simple,tsv}]
                       [-o OUTFILE] [--transaction]
                       [sql]

Run a SQL query against Snowflake and return the results.

positional arguments:
  sql                   SQL statement to execute (or set DATAOPS_SQL or pipe
                        to stdin)

optional arguments:
  -h, --help            show this help message and exit
  -a ACCOUNT, --account ACCOUNT
                        Snowflake account, e.g. acme123.eu-west-2.aws (or set
                        SNOWFLAKE_ACCOUNT)
  -u USER, --user USER  Snowflake username (or set SNOWFLAKE_USER)
  -p PASSWORD, --password PASSWORD
                        Snowflake password (or set SNOWFLAKE_PASSWORD)
  -k KEY_PAIR, --key_pair KEY_PAIR
                        Snowflake Key-Pair file path (or set
                        SNOWFLAKE_KEY_PAIR)
  --passphrase PASSPHRASE
                        Passphrase for Authentication using Key-Pair (or set
                        SNOWFLAKE_PASSPHRASE)
  --auth {PASSWORD,KEY_PAIR}
                        Authentication method for Snowflake (or set
                        DATAOPS_SNOWFLAKE_AUTH)
  -r ROLE, --role ROLE  Snowflake role (or set SNOWFLAKE_ROLE)
  -w WAREHOUSE, --warehouse WAREHOUSE
                        Snowflake warehouse (or set SNOWFLAKE_WAREHOUSE)
  -d DATABASE, --database DATABASE
                        Snowflake database
  -s SCHEMA, --schema SCHEMA
                        Snowflake schema
  -f {psql,csv,json,expanded,fancy_grid,grid,html,latex,latex_booktabs,mediawiki,orgtbl,pipe,plain,rst,simple,tsv}, --format {psql,csv,json,expanded,fancy_grid,grid,html,latex,latex_booktabs,mediawiki,orgtbl,pipe,plain,rst,simple,tsv}
                        Display/output format for the results (or set
                        DATAOPS_SQL_FORMAT)
  -o OUTFILE, --outfile OUTFILE
                        Save the results to this file (or set
                        DATAOPS_SQL_OUTFILE, otherwise stdout)
  --transaction         Execute SQL queries in a transaction (default is
                        autocommit)
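
As a quick sketch of how these arguments combine, both of the following invocations run the same statement from a job's script section. The variable names follow the jobs above; the statement and output file name are illustrative:

# Pass the statement as a positional argument and render a psql-style table:
dataops-snowsql -a $SNOWSQL_ACCOUNT -u $SNOWSQL_USERNAME -p "$SNOWSQL_PWD" \
  -r $SNOWSQL_ROLE -w $SNOWSQL_WAREHOUSE -f psql \
  "select current_user(), current_role();"

# Alternatively, pipe the statement via stdin and save the results as CSV:
echo "select current_user(), current_role();" | dataops-snowsql \
  -a $SNOWSQL_ACCOUNT -u $SNOWSQL_USERNAME -p "$SNOWSQL_PWD" \
  -r $SNOWSQL_ROLE -w $SNOWSQL_WAREHOUSE -f csv -o results.csv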

Example jobs

Retrieving the pipeline database's DDL

This example illustrates how to retrieve the DDL script for the executing DataOps pipeline's Snowflake database and store it as an artifact attached to the job.

info

Typically, this DDL script's name is similar to DATAOPS_PROD-123456.sql if the executing pipeline is a production-environment pipeline.

pipelines/includes/local_includes/snowflake/get_ddl.yml
Direct SQL:
  image: $DATAOPS_SNOWFLAKE_RUNNER_IMAGE
  extends:
    - .agent_tag
  stage: "Snowflake Actions"
  variables:
    SNOWSQL_USERNAME: DATAOPS_VAULT(SNOWFLAKE.SOLE.USERNAME)
    SNOWSQL_PWD: DATAOPS_VAULT(SNOWFLAKE.SOLE.PASSWORD)
    SNOWSQL_ACCOUNT: DATAOPS_VAULT(SNOWFLAKE.SOLE.ACCOUNT)
    SNOWSQL_ROLE: DATAOPS_VAULT(SNOWFLAKE.SOLE.ROLE)
    SNOWSQL_WAREHOUSE: DATAOPS_VAULT(SNOWFLAKE.TRANSFORM.WAREHOUSE)
    DATAOPS_SOURCE_FILE: $CI_PROJECT_DIR/env.sh
  script:
    - /dataops
    - source $DATAOPS_SOURCE_FILE
    # save the database DDL to a file that the job publishes as an artifact
    - dataops-snowsql -u $SNOWSQL_USERNAME -p "$SNOWSQL_PWD" -a $SNOWSQL_ACCOUNT -r $SNOWSQL_ROLE -w $SNOWSQL_WAREHOUSE -f plain -o ${DATAOPS_DATABASE}-${CI_PIPELINE_ID}.sql "select get_ddl('database', '${DATAOPS_DATABASE}');"
  icon: ${SNOWFLAKE_ICON}
  artifacts:
    name: ${DATAOPS_DATABASE} DDL
    paths:
      - ${DATAOPS_DATABASE}-${CI_PIPELINE_ID}.sql
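
If you only need the DDL for a single schema rather than the whole database, Snowflake's get_ddl also accepts 'schema' as the object type, so the same job works with a scoped call. In this hypothetical variant, MY_SCHEMA is a placeholder for a real schema name:

# Hypothetical variant of the script line above, scoped to one schema:
dataops-snowsql -u $SNOWSQL_USERNAME -p "$SNOWSQL_PWD" -a $SNOWSQL_ACCOUNT \
  -r $SNOWSQL_ROLE -w $SNOWSQL_WAREHOUSE -f plain \
  -o ${DATAOPS_DATABASE}-MY_SCHEMA-${CI_PIPELINE_ID}.sql \
  "select get_ddl('schema', '${DATAOPS_DATABASE}.MY_SCHEMA');"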