The DDE is in private preview. Therefore, the service may be unavailable for periods of time. There may also be occasions where service elements are rebuilt and items such as your stored credentials are removed, requiring you to re-add them. This poses no risk to the rest of DataOps, as the DDE is completely isolated.
As good development practice dictates, whether developing on local machines or in the cloud, never rely on a development environment for safe storage of configuration or code. Always create a new branch before making any changes, and regularly commit your changes to this branch.
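For example, the branch-first workflow described above looks like this in practice. The sketch below runs in a throwaway repository so the commands are self-contained; the branch name, author identity, and file are illustrative:

```shell
set -e
# Create a throwaway repo so the example runs end to end;
# in practice you would run the branch/commit steps in your project clone.
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo && cd demo
git config user.email "dev@example.com"   # placeholder identity for the demo
git config user.name "Dev"
git commit -q --allow-empty -m "initial"

# 1. Always create a new branch before any changes
git checkout -q -b feature/my-change

# 2. Commit your changes to this branch regularly
echo "change" > notes.txt
git add notes.txt
git commit -q -m "Add notes"
git log --oneline
```

In a real project you would also push the branch regularly (`git push -u origin feature/my-change`) so your work survives any rebuild of the environment.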
What is DDE?
The DDE is a ready-to-code environment that follows the basic principles of continuous development. It gives you a highly optimized data development experience for key DataOps use cases with no or minimal setup, depending on how you deploy it: locally, remotely, or in the cloud. See the DDE deployment models for more information.
As a developer, you may have experienced how annoying it can be to:
- Figure out what tools to install and how to configure them correctly during onboarding on a new project
- Manually set up the dev environment every time you need to start fresh and get a clean slate
- Adjust parts of the dev environment when you need to switch between different versions of a project
- Manually clean up afterward in order not to pollute the local system with any checkouts, dependencies, builds, databases, and the like
- Constantly maintain the correct versions of tens or hundreds of libraries
The DDE changes the above development workflow to remove friction and improve your experience with automation and collaboration benefits. It solves three significant challenges:
- Eliminating all the work and hassle of creating and updating local development environments
- Allowing developers to make changes and test them in seconds, not minutes or hours
- Allowing developers to develop in an environment virtually identical to the real pipelines where the code will finally run - eliminating the 'well, it worked on my laptop' issue!
The DDE may not work for customers with Snowflake Private Link, but support for it is planned for the future. Don't hesitate to get in touch with our Support team for more details.
DDE deployment models
You can use the DDE with two deployment topologies: browser-based or on your desktop. With either deployment model, the DDE helps you speed up the development process and allows you to automatically assemble all the resources you need to create a more robust and flexible ecosystem.
Depending on your needs, you can choose between the deployment models with different architectures:
- DDE DevReady: You only need your internet browser to develop. The rest is managed inside DDE DevReady itself.
- DDE DevPod: You only need your computer to develop. DDE DevPod uses Dev Containers to provide the same suite of tools available in the DDE DevReady. You can deploy the DDE Dev Containers on your local machine or a remote server.
Here are diagrams to represent each scenario:
The DDE DevReady meets the needs of a wide range of use cases. It streamlines the development process and provides you with the flexibility and scalability of cloud computing resources. You can also avoid keeping sensitive information in the DDE DevReady by using different authentication methods.
However, you must add our static IP addresses to your allowlist. For more information, see the DataOps.live static IP addresses.
Sometimes, organizations prefer to work with local development environments, for example, because of data sensitivity and security concerns. Depending on your Snowflake structure, and if you want to avoid keeping sensitive information in the cloud or allowing the DDE DevReady to connect to Snowflake, you can use the DDE locally. For more information on installation and setup, see DDE DevPod.
The DDE DevPod leverages Dev Containers and provides two deployment choices. First, a fully local deployment:
Second, a deployment with VS Code on your local machine while having the Docker-based Dev Container on a remote computer.
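As a rough illustration of how a Dev Container-based setup like the DDE DevPod is typically defined, the sketch below shows a minimal `.devcontainer/devcontainer.json`. The image name, extension ID, and post-create command are hypothetical placeholders, not the actual DDE configuration:

```json
{
  // Illustrative only -- not the actual DDE DevPod definition.
  "name": "data-dev-environment",
  // Hypothetical base image; the real DDE ships its own tool suite.
  "image": "mcr.microsoft.com/devcontainers/python:3.11",
  "customizations": {
    "vscode": {
      // Example extension ID; the DDE bundles its own extensions.
      "extensions": ["ms-python.python"]
    }
  },
  // Runs after the container is created, e.g. to install dependencies.
  "postCreateCommand": "pip install -r requirements.txt"
}
```

With a definition like this in place, you can open the folder in VS Code and choose "Reopen in Container", or start it from the command line with the Dev Containers CLI (`devcontainer up --workspace-folder .`); the same definition works whether the container runs locally or on a remote server.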
To see all aspects of the DDE and what problems you can solve using any of its deployment models, check out the topics below that describe how to set it up, give some key DataOps use cases, and detail the tools included in it: