How to implement conditional branches in Azure Data Factory pipelines

I am implementing a pipeline to insert data updates from CSV files into a SQL database. The plan is to first insert the data into a temporary SQL table for validation and transformation, and then move the processed data into the actual SQL table. I would like to branch the pipeline execution depending on the result of that check: if the data is OK, it is inserted into the target SQL table; if a fatal validation failure occurs, the insert activity should be skipped.

I have tried looking for instructions or guidance, but without success so far. Does pipeline execution support conditional activities, e.g. based on some property of the input dataset?

2 answers


This is now possible with Azure Data Factory version 2.

Downstream activities can now depend on four possible outcomes of the preceding activity:

- On success
- On failure
- On completion
- On skip

In addition, custom "if" conditions are available for expression-based branching.

See the links below for details:
https://www.purplefrogsystems.com/paul/2017/09/whats-new-in-azure-data-factory-version-2-adfv2/

https://docs.microsoft.com/en-us/azure/data-factory/tutorial-control-flow
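
For illustration, here is a rough sketch of what the branching could look like in an ADF v2 pipeline definition. Everything named here (the staging dataset, the validation query and its IsValid flag, the merge stored procedure) is a placeholder I've assumed for the example, not something from the question or the linked docs:

{
  "name": "LoadCsvPipeline",
  "properties": {
    "activities": [
      {
        "name": "CheckStagingData",
        "type": "Lookup",
        "typeProperties": {
          "source": {
            "type": "SqlSource",
            "sqlReaderQuery": "SELECT CASE WHEN COUNT(*) = 0 THEN 1 ELSE 0 END AS IsValid FROM dbo.StagingErrors"
          },
          "dataset": {
            "referenceName": "StagingTableDataset",
            "type": "DatasetReference"
          },
          "firstRowOnly": true
        }
      },
      {
        "name": "IfDataIsValid",
        "type": "IfCondition",
        "dependsOn": [
          {
            "activity": "CheckStagingData",
            "dependencyConditions": [ "Succeeded" ]
          }
        ],
        "typeProperties": {
          "expression": {
            "value": "@equals(activity('CheckStagingData').output.firstRow.IsValid, 1)",
            "type": "Expression"
          },
          "ifTrueActivities": [
            {
              "name": "MergeIntoTarget",
              "type": "SqlServerStoredProcedure",
              "linkedServiceName": {
                "referenceName": "AzureSqlDatabase",
                "type": "LinkedServiceReference"
              },
              "typeProperties": {
                "storedProcedureName": "dbo.usp_MergeStagingIntoTarget"
              }
            }
          ]
        }
      }
    ]
  }
}

The Lookup activity reads a single validation row from the staging table, and the If Condition activity only runs the merge stored procedure when the expression evaluates to true, so a failed validation simply skips the insert.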


The short answer is no.

It's worth noting that ADF is just an orchestration tool for invoking other services. The current version can't do what you want because it has no compute of its own; it is not an SSIS-style data flow engine.

If you want this behavior, you will need to code it in SQL DB stored procedures, with flags etc. on the processed datasets.



You could then have boilerplate stored procedure code, with parameters passed in from ADF, to perform the insert, update, or redirect operation.

Here is a handy link for calling a stored procedure with parameters from ADF: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-stored-proc-activity
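
As a rough sketch of what that looks like in a (version 1) pipeline definition; the activity, output dataset, and stored procedure/parameter names below are placeholders I've made up, and the stored procedure itself is assumed to contain the insert/update/redirect logic:

{
  "type": "SqlServerStoredProcedure",
  "name": "ProcessStagedData",
  "typeProperties": {
    "storedProcedureName": "usp_ProcessStagedData",
    "storedProcedureParameters": {
      "SliceStart": "$$Text.Format('{0:yyyy-MM-dd HH:mm:ss}', SliceStart)"
    }
  },
  "outputs": [
    {
      "name": "ProcessedDataOutput"
    }
  ],
  "policy": {
    "timeout": "01:00:00",
    "retry": 1
  },
  "scheduler": {
    "frequency": "Day",
    "interval": 1
  }
}

The stored procedure can then inspect validation flags on the staged rows and decide for itself whether to load the target table, which is how you approximate branching in version 1.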

Hope it helps.
