
Snowflake Orchestrator

Feature availability: Professional, Enterprise

Image: $DATAOPS_SNOWFLAKE_RUNNER_IMAGE

The Snowflake orchestrator runs ad-hoc SQL statements against a Snowflake database. It handles the database connection and the SQL statement execution using credentials stored in the DataOps Vault.

Using this orchestrator, you can execute SQL statements either directly through environment variables or with the dataops-snowsql command-line tool.

Usage

A typical use case for this orchestrator is to connect to Snowflake and retrieve contextual information from the database or trigger additional actions during pipeline execution.

For instance, the following example uses this orchestrator to emit information about the current region, account, database, schema, user, role, and the roles available to that user.

pipelines/includes/local_includes/snowflake/snowsql.yml

```yaml
Direct SQL:
  image: $DATAOPS_SNOWFLAKE_RUNNER_IMAGE
  extends:
    - .agent_tag
  stage: "Snowflake Actions"
  variables:
    DATAOPS_SNOWSQL_USER: DATAOPS_VAULT(SNOWFLAKE.SOLE.USERNAME)
    DATAOPS_SNOWSQL_PASSWORD: DATAOPS_VAULT(SNOWFLAKE.SOLE.PASSWORD)
    DATAOPS_SNOWSQL_ACCOUNT: DATAOPS_VAULT(SNOWFLAKE.SOLE.ACCOUNT)
    DATAOPS_SNOWSQL_ROLE: DATAOPS_VAULT(SNOWFLAKE.SOLE.ROLE)
    DATAOPS_SNOWSQL_WAREHOUSE: DATAOPS_VAULT(SNOWFLAKE.TRANSFORM.WAREHOUSE)
    DATAOPS_SNOWSQL_FORMAT: psql
    DATAOPS_RUN_SQL: select current_region(), current_account(), current_database(), current_schema(), current_user(), current_role(), current_available_roles();
  script:
    - /dataops
  icon: ${SNOWFLAKE_ICON}
```

Another use case is to provide a .sql file path as a parameter, for example, scripts/example_script.sql. The orchestrator reads the file and executes all the commands it contains. This is particularly useful when you want to execute multiple consecutive SQL commands in a single run, or when you have a multiline SQL command that you want to maintain in a separate .sql file.
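For illustration, such a script file might contain several consecutive statements. The warehouse, schema, and table names below are hypothetical, not defaults provided by the orchestrator:

```sql
-- Hypothetical contents of scripts/example_script.sql.
-- All object names here are illustrative placeholders.
use warehouse TRANSFORM_WH;

-- Create an audit table if this is the first run
create table if not exists DEMO_SCHEMA.PIPELINE_AUDIT (
    run_id      number,
    executed_at timestamp_ltz default current_timestamp()
);

-- Record this execution
insert into DEMO_SCHEMA.PIPELINE_AUDIT (run_id) values (1);
```

Because the statements run consecutively, later statements can rely on objects created by earlier ones; combining them with the --transaction option (described below) lets them commit or roll back as a unit.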

pipelines/includes/local_includes/snowflake/snowsql.yml

```yaml
Run SQL File:
  extends:
    - .agent_tag
  stage: "Additional Configuration"
  image: $DATAOPS_SNOWFLAKE_RUNNER_IMAGE
  variables:
    DATAOPS_SNOWSQL_USER: DATAOPS_VAULT(SNOWFLAKE.SOLE.USERNAME)
    DATAOPS_SNOWSQL_PASSWORD: DATAOPS_VAULT(SNOWFLAKE.SOLE.PASSWORD)
    DATAOPS_SNOWSQL_ACCOUNT: DATAOPS_VAULT(SNOWFLAKE.SOLE.ACCOUNT)
    DATAOPS_SNOWSQL_ROLE: DATAOPS_VAULT(SNOWFLAKE.SOLE.ROLE)
    DATAOPS_SNOWSQL_WAREHOUSE: DATAOPS_VAULT(SNOWFLAKE.TRANSFORM.WAREHOUSE)
    DATAOPS_RUN_SQL: $CI_PROJECT_DIR/scripts/example_script.sql
  script:
    - /dataops
  icon: ${SNOWFLAKE_ICON}
```

dataops-snowsql command-line arguments

The dataops-snowsql script supports the following command-line arguments:

```
$ dataops-snowsql --help
usage: dataops-snowsql [-h] [-a ACCOUNT] [-u USER] [-p PASSWORD] [-k KEY_PAIR]
                       [--passphrase PASSPHRASE] [--auth {KEY_PAIR,PASSWORD}]
                       [-r ROLE] [-w WAREHOUSE] [-d DATABASE] [-s SCHEMA]
                       [-f {psql,csv,json,expanded,fancy_grid,grid,html,latex,latex_booktabs,mediawiki,orgtbl,pipe,plain,rst,simple,tsv}]
                       [-o OUTFILE] [--transaction]
                       [sql]

Run a SQL query against Snowflake and return the results.

positional arguments:
  sql                   SQL statement to execute (or set DATAOPS_RUN_SQL or
                        pipe to stdin)

optional arguments:
  -h, --help            show this help message and exit
  -a ACCOUNT, --account ACCOUNT
                        Snowflake account, e.g. acme123.eu-west-2.aws (or set
                        DATAOPS_SNOWSQL_ACCOUNT)
  -u USER, --user USER  Snowflake username (or set DATAOPS_SNOWSQL_USER)
  -p PASSWORD, --password PASSWORD
                        Snowflake password (or set DATAOPS_SNOWSQL_PASSWORD)
  -k KEY_PAIR, --key_pair KEY_PAIR
                        Snowflake key-pair file path (or set
                        DATAOPS_SNOWSQL_KEY_PAIR)
  --passphrase PASSPHRASE
                        Passphrase for key-pair authentication (or set
                        DATAOPS_SNOWSQL_PASSPHRASE)
  --auth {KEY_PAIR,PASSWORD}
                        Authentication method for Snowflake (or set
                        DATAOPS_SNOWSQL_AUTH)
  -r ROLE, --role ROLE  Snowflake role (or set DATAOPS_SNOWSQL_ROLE)
  -w WAREHOUSE, --warehouse WAREHOUSE
                        Snowflake warehouse (or set DATAOPS_SNOWSQL_WAREHOUSE)
  -d DATABASE, --database DATABASE
                        Snowflake database (or set DATAOPS_SNOWSQL_DATABASE)
  -s SCHEMA, --schema SCHEMA
                        Snowflake schema (or set DATAOPS_SNOWSQL_SCHEMA)
  -f {psql,csv,json,expanded,fancy_grid,grid,html,latex,latex_booktabs,mediawiki,orgtbl,pipe,plain,rst,simple,tsv}, --format {psql,csv,json,expanded,fancy_grid,grid,html,latex,latex_booktabs,mediawiki,orgtbl,pipe,plain,rst,simple,tsv}
                        Display/output format for the results (or set
                        DATAOPS_SNOWSQL_FORMAT)
  -o OUTFILE, --outfile OUTFILE
                        Save the results to this file (or set
                        DATAOPS_SNOWSQL_OUTFILE, otherwise stdout)
  --transaction         Execute SQL queries in a transaction (or set
                        DATAOPS_SNOWSQL_TRANSACTION)
```
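As the help text indicates, SQL can be supplied as a positional argument, via DATAOPS_RUN_SQL, or piped to stdin, and every flag has an environment-variable equivalent. A sketch of calling the tool directly from a job's script section (the account, user, role, and file paths below are placeholders, not values the orchestrator provides):

```shell
# Sketch only: connection values are placeholders.
export DATAOPS_SNOWSQL_PASSWORD="$(cat /secrets/snowflake_password)"  # hypothetical secret path

# SQL as the positional argument, results saved as CSV
dataops-snowsql -a acme123.eu-west-2.aws -u DATAOPS_USER -r LOADER \
                -w TRANSFORM_WH -f csv -o results.csv \
                "select current_database(), current_schema();"

# The same tool reading SQL from stdin, printing to stdout by default
echo "select current_user();" | dataops-snowsql -a acme123.eu-west-2.aws -u DATAOPS_USER
```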

Example jobs

Retrieving the pipeline database's DDL

This example illustrates how to retrieve the DDL script for the executing DataOps pipeline's Snowflake database and store it as an artifact attached to the job.

info

For a production-environment pipeline, the DDL script is typically named something like DATAOPS_PROD-123456.sql.

pipelines/includes/local_includes/snowflake/get_ddl.yml

```yaml
Get DDL:
  image: $DATAOPS_SNOWFLAKE_RUNNER_IMAGE
  extends:
    - .agent_tag
  stage: "Snowflake Actions"
  variables:
    DATAOPS_SNOWSQL_USER: DATAOPS_VAULT(SNOWFLAKE.SOLE.USERNAME)
    DATAOPS_SNOWSQL_PASSWORD: DATAOPS_VAULT(SNOWFLAKE.SOLE.PASSWORD)
    DATAOPS_SNOWSQL_ACCOUNT: DATAOPS_VAULT(SNOWFLAKE.SOLE.ACCOUNT)
    DATAOPS_SNOWSQL_ROLE: DATAOPS_VAULT(SNOWFLAKE.SOLE.ROLE)
    DATAOPS_SNOWSQL_WAREHOUSE: DATAOPS_VAULT(SNOWFLAKE.TRANSFORM.WAREHOUSE)
    DATAOPS_SNOWSQL_FORMAT: psql
    DATAOPS_SNOWSQL_OUTFILE: ${DATAOPS_DATABASE}-${CI_PIPELINE_ID}.sql
    DATAOPS_RUN_SQL: select get_ddl('database', '${DATAOPS_DATABASE}');
  script:
    - /dataops
  icon: ${SNOWFLAKE_ICON}
  artifacts:
    name: ${DATAOPS_DATABASE} DDL
    paths:
      - ${DATAOPS_DATABASE}-${CI_PIPELINE_ID}.sql
```