Build Data Pipelines for Batch Processing Solutions
In this lab, you’ll build a pipeline for a batch processing solution. When you’re finished with this lab, you’ll have experience orchestrating data movement from Azure Data Lake Storage to Azure SQL Database using Azure Data Factory.
Set up Batch Files in Azure Data Lake
You will upload batch files to a directory in an Azure Data Lake Storage container.
Set up an Azure SQL Database Environment
You will prepare an Azure SQL database to receive and process the batch files for analytics and reporting.
Create Azure Data Factory Linked Services, Datasets, and a Trigger
You will create a data factory linked service to connect to your data stores, datasets that act as named views of the data accessed through those linked services, and a trigger to schedule pipeline runs.
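As a rough illustration of what the lab produces, an Azure Data Factory linked service and a schedule trigger are defined as JSON documents like the sketch below. All names (`AzureSqlDatabaseLinkedService`, `BatchPipeline`) and the connection-string placeholders are hypothetical examples, not values from the lab:

```json
{
  "name": "AzureSqlDatabaseLinkedService",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "Server=tcp:<server>.database.windows.net,1433;Database=<database>;"
    }
  }
}
```

```json
{
  "name": "DailyBatchTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": { "frequency": "Day", "interval": 1, "startTime": "2024-01-01T00:00:00Z" }
    },
    "pipelines": [
      { "pipelineReference": { "referenceName": "BatchPipeline", "type": "PipelineReference" } }
    ]
  }
}
```

Datasets follow the same pattern: a JSON definition that references a linked service and describes the shape and location of the data (for example, a delimited-text file in a data lake folder).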
Set up Lookup and Set Variable Activities
You will build data factory pipelines to orchestrate the batch processing solution, beginning with Lookup and Set Variable activities.
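A typical pattern is to use a Lookup activity to read a control value (such as the last batch load date) from the database and a Set Variable activity to store it for later activities. The sketch below shows the general JSON shape of that pair inside a pipeline's `activities` array; the table, dataset, and variable names are hypothetical:

```json
[
  {
    "name": "LookupBatchLoadDate",
    "type": "Lookup",
    "typeProperties": {
      "source": {
        "type": "AzureSqlSource",
        "sqlReaderQuery": "SELECT MAX(BatchLoadDate) AS LoadDate FROM dbo.BatchControl"
      },
      "dataset": { "referenceName": "ControlTableDataset", "type": "DatasetReference" },
      "firstRowOnly": true
    }
  },
  {
    "name": "SetLoadDate",
    "type": "SetVariable",
    "dependsOn": [
      { "activity": "LookupBatchLoadDate", "dependencyConditions": [ "Succeeded" ] }
    ],
    "typeProperties": {
      "variableName": "loadDate",
      "value": {
        "value": "@activity('LookupBatchLoadDate').output.firstRow.LoadDate",
        "type": "Expression"
      }
    }
  }
]
```

The `dependsOn` entry makes the Set Variable activity run only after the Lookup succeeds, and the expression pulls the looked-up value out of the Lookup's output.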
Set up the Copy Data Activity
You will continue building the data factory pipeline by creating a Copy data activity to move data from its source to the destination.
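A Copy activity wires a source dataset to a sink dataset. The sketch below shows the general shape for copying delimited text from a data lake into an Azure SQL staging table; the dataset and table names are hypothetical placeholders:

```json
{
  "name": "CopyBatchFiles",
  "type": "Copy",
  "inputs": [ { "referenceName": "BatchFilesDataset", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "StagingTableDataset", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "sink": {
      "type": "AzureSqlSink",
      "preCopyScript": "TRUNCATE TABLE dbo.Staging"
    }
  }
}
```

The optional `preCopyScript` on the sink clears the staging table before each load, a common pattern in truncate-and-reload batch designs.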
Set up Stored Procedure Activities
You will now finalize the data factory pipeline by creating two Stored procedure activities, one to upsert data and one to increment the batch load date, and attach a trigger to automate the pipeline run.
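Stored procedure activities simply invoke a procedure in the target database through a linked service. The sketch below shows the shape of two chained activities; the linked service and procedure names are hypothetical, and the procedures themselves would be authored in the database (for example, a T-SQL `MERGE` for the upsert):

```json
[
  {
    "name": "UpsertData",
    "type": "SqlServerStoredProcedure",
    "linkedServiceName": {
      "referenceName": "AzureSqlDatabaseLinkedService",
      "type": "LinkedServiceReference"
    },
    "typeProperties": { "storedProcedureName": "dbo.usp_UpsertFromStaging" }
  },
  {
    "name": "IncrementBatchLoadDate",
    "type": "SqlServerStoredProcedure",
    "dependsOn": [
      { "activity": "UpsertData", "dependencyConditions": [ "Succeeded" ] }
    ],
    "linkedServiceName": {
      "referenceName": "AzureSqlDatabaseLinkedService",
      "type": "LinkedServiceReference"
    },
    "typeProperties": { "storedProcedureName": "dbo.usp_IncrementBatchLoadDate" }
  }
]
```

Chaining the second activity on the first means the batch load date only advances when the upsert succeeds, so a failed run can be retried safely.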
Provided environment for hands-on practice
We will provide the credentials and environment necessary for you to practice right within your browser.
Follow along with the author’s guided walkthrough and build something new in your provided environment!