
Experimental Developer Use Cases

Feature release status badge: PriPrev

danger

This page contains ideas about how to use the DataOps Development Environment (DDE) to achieve more advanced use cases. Code and examples here are subject to change at any time and are likely not the best long-term way to achieve each use case. However, in line with our philosophy of transparency, we are publishing them to collect feedback and other ideas from the community. Partner with us to make these better!

SOLE compilation and validation

info

Examples on this page are using the DataOps Development Environment with remote deployment, DDE Remote.

The DataOps Development Environment (DDE) comes with a SOLE compile script to speed up SOLE development by catching syntax errors before ever needing to run a pipeline.

We still recommend running SOLE configurations as part of a pipeline as the best way to test them. However, a modification to a SOLE configuration often introduces a simple formatting error that isn't discovered until the SOLE job runs in the pipeline, several minutes later. The goal here is to let you validate a SOLE configuration locally instead.

There are some environment variables needed before you can use SOLE actions. Add these to your devcontainer.env file:


TF_VAR_DATAOPS_SOLE_ACCOUNT=SNOWFLAKE_ACCOUNT
TF_VAR_DATAOPS_SOLE_USERNAME=SNOWFLAKE_USERNAME
TF_VAR_DATAOPS_SOLE_PASSWORD=SNOWFLAKE_PASSWORD
TF_VAR_DATAOPS_SOLE_ROLE=SNOWFLAKE_ROLE
TF_VAR_DATAOPS_SOLE_WAREHOUSE=SNOWFLAKE_WAREHOUSE
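Before running any SOLE actions, it can be useful to confirm the variables actually made it into the container's environment. The following is a minimal, hypothetical pre-flight check (not part of the DDE) that reports any of the five variables above that are unset or empty:

```python
import os

# The five TF_VAR_DATAOPS_SOLE_* variables expected from devcontainer.env.
REQUIRED = [
    "TF_VAR_DATAOPS_SOLE_ACCOUNT",
    "TF_VAR_DATAOPS_SOLE_USERNAME",
    "TF_VAR_DATAOPS_SOLE_PASSWORD",
    "TF_VAR_DATAOPS_SOLE_ROLE",
    "TF_VAR_DATAOPS_SOLE_WAREHOUSE",
]

# Collect any variables that are missing or empty in the current environment.
missing = [name for name in REQUIRED if not os.environ.get(name)]

if missing:
    print("Missing SOLE variables:", ", ".join(missing))
else:
    print("All SOLE variables are set.")
```

Running this inside the DDE terminal gives a quick answer before you invest time in a validation run.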

Once these are in place, you can run:

sole-validate

The output will look very familiar, since it's the same output the SOLE orchestrator produces when SOLE runs in a pipeline:


You can see that any SOLE compilation errors would show up very quickly.

For example:


You can also use the watch tool included in the DDE to monitor SOLE configuration for changes and automatically compile on every change:

Here is the command used in this video. Note that it is specific to the project shown, which follows the recommended layout for a DataOps project, so you may need to update the paths to match your project.
watchfiles --ignore-paths $REPO_ROOT/dataops/snowflake/databases.yml,$REPO_ROOT/dataops/snowflake/roles.yml "scripts/sole_validate.sh" $REPO_ROOT/dataops/snowflake/

Ensure you add any rendered files to the --ignore-paths argument. This avoids retriggering validation when the rendered files are regenerated; editing the templated file still triggers a new render automatically.
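The ignore behavior above can be sketched in plain Python. This is a hypothetical illustration (not part of the DDE or watchfiles), where `should_trigger` stands in for the decision --ignore-paths makes: changes to rendered files are skipped, while changes to templated sources re-run validation:

```python
from pathlib import PurePosixPath

def should_trigger(changed: str, ignore_paths: list[str]) -> bool:
    """Return True if a change to `changed` should re-run validation.

    Mirrors the --ignore-paths idea: rendered files are skipped so that
    only edits to templated sources trigger a new validation run.
    """
    return PurePosixPath(changed) not in {PurePosixPath(p) for p in ignore_paths}

# Rendered outputs that should never retrigger validation.
ignored = [
    "dataops/snowflake/databases.yml",
    "dataops/snowflake/roles.yml",
]

print(should_trigger("dataops/snowflake/databases.template.yml", ignored))  # True
print(should_trigger("dataops/snowflake/databases.yml", ignored))           # False
```

The exact matching rules of the real --ignore-paths flag may differ (e.g. prefix or glob matching); the point is only that rendered files are excluded from the watch set.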