Creating DataOps Environments

Environments provide a way to monitor and manage the deployment of your application or code. They are separate instances where your code runs, and they may include stages like development, testing, and production.

By using environments, you can effectively manage the deployment lifecycle of your data product. This involves deploying and testing code changes in a controlled manner, moving through different environments, and keeping a record of these deployments.

Prerequisites

You must have at least the Developer role.

Creating DataOps environments

You can create DataOps environments to deploy your code by using jobs from the DataOps reference project.

note

SOLE jobs defined in the snowflake_lifecycle_env.yml file in the reference project create these environments automatically, while SOLE jobs defined in the snowflake_lifecycle.yml file don't.

If you want to introduce basic environment support in your DataOps project, you must uncomment the snowflake_lifecycle_envs.yml line in sole-ci.yml:

sole-ci.yml

```yaml
include:
  - /pipelines/includes/bootstrap.yml

  ## Snowflake Object Lifecycle jobs
  - project: reference-template-projects/dataops-template/dataops-reference
    ref: 5-stable
    file: /pipelines/includes/default/snowflake_lifecycle.yml

  ## Snowflake Object Lifecycle jobs with environments
  ## Uncomment this to add support for DataOps Environments (EXPERIMENTAL)
  # - /pipelines/includes/local_overrides/snowflake_lifecycle_envs.yml
```

You create the environment through a SOLE job that uses one of the configurations from the reference project. Once you run the job, a new environment is created in the data product platform.

For example, to set up a Snowflake environment for a production (PROD) branch:

  1. Define snowflake_lifecycle_env.yml in your full-ci.yml file as shown below.

     pipelines/includes/default/snowflake_lifecycle_env.yml

     ```yaml
     Set Up Snowflake (PROD/QA):
       extends: .agent_tag
       image: $DATAOPS_SNOWFLAKEOBJECTLIFECYCLE_RUNNER_IMAGE
       variables:
         LIFECYCLE_ACTION: AGGREGATE
         ARTIFACT_DIRECTORY: snowflake-artifacts
         CONFIGURATION_DIR: dataops/snowflake
       resource_group: $CI_JOB_NAME
       stage: Snowflake Setup
       script: /dataops
       environment:
         name: $CI_COMMIT_REF_NAME
         url: $SNOWSIGHT_URL/#/data/databases/$DATAOPS_DATABASE
       artifacts:
         when: always
         paths:
           - $ARTIFACT_DIRECTORY
       icon: ${SNOWFLAKEOBJECTLIFECYCLE_ICON}
       rules:
         - if: '$CI_COMMIT_REF_NAME == "master" || $CI_COMMIT_REF_NAME == "main" || $CI_COMMIT_REF_NAME == "production" || $CI_COMMIT_REF_NAME == "prod" || $CI_COMMIT_REF_NAME == "qa"'
           when: on_success
         - when: never
     ```
  2. Run the SOLE job on the main branch to automatically create the main environment in the platform.
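The rules clause above only creates environments for the listed branch names. If your production branch uses a different name, the list can be extended. A minimal sketch, assuming you override the job in a local include and that release is your hypothetical branch name:

```yaml
## Local override redefining the reference job's rules
## so that the "release" branch also creates an environment
Set Up Snowflake (PROD/QA):
  rules:
    - if: '$CI_COMMIT_REF_NAME == "main" || $CI_COMMIT_REF_NAME == "release"'
      when: on_success
    - when: never
```

Redefining a job with the same name in a later include overrides only the keys you specify, so the rest of the reference job's configuration is retained.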

This applies to jobs originating from the reference project. However, you can also create environments through your custom jobs. To do this, add an environment section to the job. For example:

```yaml
"My Job":
  extends: .agent_tag
  stage: "My Stage"
  image: $DATAOPS_PYTHON3_11_RUNNER_IMAGE
  variables:
    SPECIFIED_PATH: ${CI_PROJECT_DIR}/python_prj
    DATAOPS_RUN_PYTHON_SCRIPT: ${CI_PROJECT_DIR}/python_prj/main.py
  script: /dataops
  icon: ${PYTHON_ICON}
  environment:
    name: staging
    url: https://staging.example.com
```
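Environment names don't have to be static; CI variables can be interpolated to create one environment per branch. A minimal sketch, assuming a hypothetical per-branch staging URL scheme:

```yaml
"My Job":
  environment:
    # One environment per branch, e.g. review/my-feature
    name: review/$CI_COMMIT_REF_NAME
    # $CI_COMMIT_REF_SLUG is the branch name sanitized for use in URLs
    url: https://$CI_COMMIT_REF_SLUG.staging.example.com
```

Each branch then gets its own entry on the Environments page, making it easy to track which branches have been deployed.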

Viewing DataOps environments in the platform

You can view the environments for a given project from the project overview or from the environments page.

From the project overview:

  • Open your project on the data product platform. The project Overview page shows the available environments for the project.

    Project environment count in the project overview page

From the Environments page:

  1. Open your project on the data product platform.

  2. On the left sidebar, navigate to Deployments → Environments.

    Project environments listed in the project environments page