
SOLE Use Cases

The DataOps development environment is optimized for many use cases, including testing Snowflake Object Lifecycle Engine (SOLE) configurations before running any pipeline.

info

The examples on this page use the Devpod, where you only need your computer to develop.

SOLE compilation and validation

The DataOps development environment comes with a SOLE compile script that speeds up SOLE development by catching syntax errors before you ever need to run a pipeline.

We still recommend running SOLE configurations as part of a pipeline as the definitive test. However, a modified SOLE configuration often contains a simple formatting error that isn't discovered until the SOLE job runs in the pipeline several minutes later. The goal here is to let you validate a SOLE configuration locally.
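To make this concrete, such a formatting error is often nothing more than broken YAML. Here is a hypothetical fragment of a SOLE database configuration (the file name and keys are illustrative, not the full SOLE schema) that local validation would reject immediately:

databases.yml
databases:
  ANALYTICS:
    comment: Analytics database
     grants:          # extra leading space makes this invalid YAML
      USAGE:
        - ANALYST_ROLE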

There are some environment variables needed before you can use SOLE actions:

  • Add these to your devcontainer.env file.

    devcontainer.env
    DATAOPS_SOLE_ACCOUNT=<your Snowflake account>
    DATAOPS_SOLE_USERNAME=<your Snowflake username>
    DATAOPS_SOLE_PASSWORD=<your Snowflake password>
    DATAOPS_SOLE_ROLE=<your Snowflake role>
    DATAOPS_SOLE_WAREHOUSE=<your Snowflake warehouse>
  • If you are running on the DevReady, export the same variables in the DataOps development environment terminal as shown below, then restart the workspace.

    terminal
    gp env DATAOPS_SOLE_ACCOUNT=<your Snowflake account>
    gp env DATAOPS_SOLE_USERNAME=<your Snowflake username>
    gp env DATAOPS_SOLE_PASSWORD=<your Snowflake password>
    gp env DATAOPS_SOLE_ROLE=<your Snowflake role>
    gp env DATAOPS_SOLE_WAREHOUSE=<your Snowflake warehouse>
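
Before running any SOLE actions, you can confirm the variables are visible in your environment. This is a plain shell check, not a DataOps-specific command, and it prints only the variable names to avoid echoing your password:

terminal
env | grep '^DATAOPS_SOLE_' | cut -d= -f1
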
note

The old variable names (those starting with DBT_) are still compatible and functional. However, we recommend transitioning to the new names above (those starting with DATAOPS_SOLE_) to stay current with the latest updates.

Once you have these in place, you can run:

terminal
sole-validate

The output will look very familiar since it's the same output the SOLE orchestrator produces when SOLE runs in a pipeline:


As you can see, any SOLE compilation errors show up very quickly.

For example:


You can also use the watchfiles tool included in the DataOps development environment to monitor the SOLE configuration for changes and automatically compile on every change, like this:


Here is the command used in this video. Note that it is specific to the project in the video, which follows the recommended layout for a DataOps project, so you may need to update the paths to match your project.

terminal
watchfiles --ignore-paths $REPO_ROOT/dataops/snowflake/databases.yml,$REPO_ROOT/dataops/snowflake/roles.yml \
  "scripts/sole_validate.sh" \
  $REPO_ROOT/dataops/snowflake/

Ensure you add any rendered files to the --ignore-paths argument. This avoids triggering a run when the rendered files are created. Instead, change the templated file to trigger a new render automatically.
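
The scripts/sole_validate.sh wrapper passed to watchfiles above is project-specific and not shown here. A minimal sketch, assuming it only needs to run the sole-validate command described earlier from the repository root, could look like this:

sole_validate.sh
#!/usr/bin/env bash
# Minimal wrapper for watchfiles: run SOLE validation from the repository root.
# Assumes REPO_ROOT is set (as in the watchfiles command above) and that
# sole-validate is available on the PATH in the DataOps development environment.
set -euo pipefail

cd "${REPO_ROOT:?REPO_ROOT must be set}"
sole-validate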