Monitoring Azure Data Factory Pipeline Performance
In this hands-on lab scenario, you are a data engineer for Awesome Company. The company has recently implemented a number of Azure Data Factory pipelines to move and transform data for a variety of purposes. As usage grows, it is your duty to monitor for failed jobs and unexpected resource consumption. Performing the actions of this hands-on lab will help you become familiar with monitoring an Azure Data Factory.
Table of Contents
Challenge: Create a Pipeline Trigger
- Set a trigger on the ProductArchivePipeline, using your time zone. Have it run on a short interval, such as every three minutes.
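The lab intends for you to do this in the Data Factory Studio UI, but the same trigger can be sketched with the Azure CLI (`datafactory` extension). The resource group, factory, and trigger names below are assumptions for illustration; adjust the `timeZone` value to your own:

```shell
# Sketch (assumed names): create a ScheduleTrigger that runs the
# ProductArchivePipeline every three minutes, then start it.
az datafactory trigger create \
  --resource-group "awesome-rg" \
  --factory-name "awesome-adf" \
  --name "ProductArchiveTrigger" \
  --properties '{
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Minute",
        "interval": 3,
        "startTime": "2024-01-01T00:00:00",
        "timeZone": "Central Standard Time"
      }
    },
    "pipelines": [
      { "pipelineReference": { "referenceName": "ProductArchivePipeline",
                               "type": "PipelineReference" } }
    ]
  }'

# Triggers are created in a stopped state and must be started explicitly.
az datafactory trigger start \
  --resource-group "awesome-rg" \
  --factory-name "awesome-adf" \
  --name "ProductArchiveTrigger"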
Challenge: Stream Logs to a Storage Account
- Create a storage account, then configure the Azure Data Factory to store its platform logs and metrics there.
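In the portal this is done from the data factory's Diagnostic settings blade. A CLI sketch of the same steps follows; the storage account, resource group, and factory names are assumptions, and the log categories shown are the standard Data Factory categories:

```shell
# Sketch (assumed names): create a storage account, then route the
# factory's platform logs and metrics to it via a diagnostic setting.
az storage account create \
  --name "awesomeadflogs" \
  --resource-group "awesome-rg" \
  --location "eastus" \
  --sku Standard_LRS

# Capture the factory's resource ID for the diagnostic setting.
ADF_ID=$(az datafactory show \
  --resource-group "awesome-rg" \
  --name "awesome-adf" \
  --query id --output tsv)

az monitor diagnostic-settings create \
  --name "adf-to-storage" \
  --resource "$ADF_ID" \
  --storage-account "awesomeadflogs" \
  --logs '[{"category": "PipelineRuns", "enabled": true},
           {"category": "TriggerRuns", "enabled": true},
           {"category": "ActivityRuns", "enabled": true}]' \
  --metrics '[{"category": "AllMetrics", "enabled": true}]'
```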
Challenge: Create Pipeline Alerts
- Create two alerts: one for successful pipeline runs and one for failed runs.
- After confirming successful runs, create an interruption in the pipeline that will cause it to fail. Then verify the alerts are working properly.
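Data Factory exposes built-in `PipelineSucceededRuns` and `PipelineFailedRuns` metrics that these alerts can target. A CLI sketch, assuming the same resource names as above (in the lab you would also attach an action group to receive notifications):

```shell
# Sketch (assumed names): metric alerts on the factory's built-in
# pipeline-run metrics, evaluated every minute over a 5-minute window.
ADF_ID=$(az datafactory show --resource-group "awesome-rg" \
  --name "awesome-adf" --query id --output tsv)

az monitor metrics alert create \
  --name "pipeline-succeeded" \
  --resource-group "awesome-rg" \
  --scopes "$ADF_ID" \
  --condition "total PipelineSucceededRuns >= 1" \
  --window-size 5m \
  --evaluation-frequency 1m

az monitor metrics alert create \
  --name "pipeline-failed" \
  --resource-group "awesome-rg" \
  --scopes "$ADF_ID" \
  --condition "total PipelineFailedRuns >= 1" \
  --window-size 5m \
  --evaluation-frequency 1m
```

A simple way to create the required interruption is to rename or delete an object the pipeline depends on (for example, its source dataset's underlying file), then wait for the next triggered run to fail.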
Challenge: View Real-Time Performance Data
- Use the Azure portal to view real-time CPU and memory performance.
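The portal's Metrics blade is the intended route, but the same data can be queried from the CLI. The metric names below are the factory's integration runtime CPU and available-memory metrics; the resource names are assumptions:

```shell
# Sketch (assumed names): pull recent integration runtime CPU and memory
# metrics, mirroring what the portal's Metrics blade charts.
ADF_ID=$(az datafactory show --resource-group "awesome-rg" \
  --name "awesome-adf" --query id --output tsv)

az monitor metrics list \
  --resource "$ADF_ID" \
  --metric "IntegrationRuntimeCpuPercentage" "IntegrationRuntimeAvailableMemory" \
  --interval PT1M \
  --output table
```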