With the data product platform, you can use pipelines in every stage of the DataOps process to build, test, deploy, and update data products in your DataOps project. The platform simplifies each pipeline by breaking it into mini-pipelines, creating opportunities to automate and orchestrate individual tasks.
Data products undergo multiple stages before reaching production. First, you write the code for the data product or the new feature. Next, it's crucial to prevent any critical errors in your code that might cause the product to crash; this requires running various tests to identify bugs, typos, or mistakes. Finally, once everything works as intended, you can release the data product or new feature to users.
In this simplified view, build pipelines comprise the build and test stages, providing continuous data integration and testing, while operate pipelines cover the deployment and monitoring stages of the DataOps process.
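The staged flow described above can be sketched in a few lines of code. This is a minimal illustration, not any platform's actual API: each stage is a plain function that raises on failure, so a broken build or a failing test halts the pipeline before anything reaches users. All names here are hypothetical.

```python
# Hypothetical build pipeline sketch: stage names and functions are
# illustrative, not part of any specific data product platform.

def build():
    # e.g., compile transformations and resolve dependencies
    return "built"

def test():
    # e.g., run unit tests and data-quality checks to catch bugs early
    assert build() == "built"
    return "tested"

def run_pipeline(stages):
    """Run stages in order; the first failure stops the whole run."""
    results = []
    for stage in stages:
        results.append(stage())  # an exception here halts later stages
    return results

print(run_pipeline([build, test]))  # → ['built', 'tested']
```

Because each stage is an independent callable, the same runner could orchestrate a deploy or monitoring stage in an operate pipeline, which mirrors how breaking work into mini-pipelines makes individual tasks easy to automate.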
In this section, you can read about how to build and test DataOps pipelines successfully: