5. Set up a pipeline
After you've finished creating and polishing your models, you can wrap the workflow in a pipeline and create a job to schedule its execution. This ensures your transformations run reproducibly and reliably every time.
Before creating a pipeline from the models, you need to publish them. Publishing models (or any assets) before using them in pipelines ensures reproducibility, makes dependencies explicit, and enables teams to develop in parallel.
Open each model in the editor and follow these steps:
In the top-right corner, click Publish.
Provide a version number and description.
For the first publish, you can keep the default values.
Click Publish.
The models are now ready for use in a pipeline.
Follow these steps:
Switch to the Pipelines tab.
Click the + icon and select Create modeling pipeline.
Provide a name for your pipeline.
Select the models that the pipeline should contain.
Here we add the three models set up in the previous step.
Click Confirm.
The new modeling pipeline is then added under the Models folder and presented in the DAG view.
Click Publish.
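The DAG view encodes the dependency order in which the pipeline runs the models: upstream models execute before the models that reference them. The sketch below is a minimal, conceptual illustration of that ordering; the model names are hypothetical placeholders, not the actual models from this guide.

```python
# A hypothetical sketch of the dependency ordering a modeling pipeline's
# DAG view represents. Model names are placeholders for illustration only.
from graphlib import TopologicalSorter

# Each model lists the upstream models it depends on.
dependencies = {
    "stg_orders": set(),        # staging model, reads from a source table
    "stg_customers": set(),     # staging model, reads from a source table
    "orders_summary": {"stg_orders", "stg_customers"},  # joins the staging models
}

# The pipeline runs models in an order that respects these dependencies.
execution_order = list(TopologicalSorter(dependencies).static_order())
print(execution_order)
# Possible output: ['stg_orders', 'stg_customers', 'orders_summary']
```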
Now that the pipeline is created and published, you can create a job to schedule its execution. Follow these steps:
Open the pipeline that you've created.
Switch to the Jobs tab and click + Create a job.
Provide the configuration for the job.
Provide the name, select the target environment, and customize the associated model variables.
Since we've already executed and verified the models using the console, we can select the Production environment to run the transformations against the production database.
Click Create.
The new job will be displayed in the Jobs section. Recurve will execute the job at the scheduled time.
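To give a sense of what a recurring schedule resolves to, here is a small, hypothetical sketch using standard cron syntax and the third-party croniter package. Recurve's job form may express schedules differently, so treat this purely as an illustration of how a "daily at 02:00" schedule maps to concrete run times.

```python
# Hypothetical illustration of a recurring schedule in standard cron syntax.
from datetime import datetime
from croniter import croniter  # pip install croniter

schedule = "0 2 * * *"  # minute hour day-of-month month day-of-week
runs = croniter(schedule, datetime(2024, 1, 1))

# Print the next three scheduled run times after the start date.
for _ in range(3):
    print(runs.get_next(datetime))
# 2024-01-01 02:00:00, 2024-01-02 02:00:00, 2024-01-03 02:00:00
```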
You can view the execution status and progress of the job in the Pipeline health dashboard.
Congratulations! You have completed building a data workflow with Recurve.
For the next step, you can explore each feature and module in detail:
Connections
Learn about the available connections
Data modeling
Learn about the core assets in data modeling
Health tracking
Explore the health tracking dashboard