Microsoft Ignite 2019: Delivering the Modern Data Warehouse

Author: Microsoft Ignite 2019

See how Azure Data Factory (ADF), Azure Databricks, and Azure SQL Data Warehouse (SQL DW) can be used together to build a modern data warehouse.

What You Will Learn

  • Azure Data Factory
  • Azure Databricks
  • Azure SQL Data Warehouse
  • Azure Data Lake Storage
  • Power BI

Prerequisites

None.

Delivering the Modern Data Warehouse

See how Azure Data Factory (ADF), Azure Databricks, and Azure SQL Data Warehouse (SQL DW) can be used together to build a modern data warehouse. You will start by using ADF to automate the movement of data in various formats from multiple sources, including Azure Cosmos DB, into a centralized repository, in this case Azure Data Lake Storage Gen2 (ADLS Gen2). You will then use Azure Databricks and ADF data flows to prepare and analyze the data, and finally write the aggregations to SQL DW. This path covers advanced (level 300) content and includes the following technologies: Azure Data Factory, Azure Databricks, Azure SQL Data Warehouse, Azure Data Lake Storage, and Power BI.

These courses should be watched sequentially.

Delivering the MDW with Azure SQL Data Warehouse, Azure Databricks, Azure Data Factory and PBI

by Microsoft Ignite 2019

Feb 13, 2020 / 46m

Description

In this experience, see how Azure Data Factory (ADF), Azure Databricks, and Azure SQL Data Warehouse (SQL DW) can be used together to build a modern data warehouse. Start by using ADF to automate the movement of data in various formats from multiple sources, including Azure Cosmos DB, into a centralized repository, in this case Azure Data Lake Storage Gen2 (ADLS Gen2). Then use Azure Databricks to prepare and analyze that data, and finally write the aggregations to SQL DW.
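
As a rough illustration of that Databricks step (not the course's notebook), the PySpark sketch below reads raw files from ADLS Gen2, aggregates them, and writes the result to SQL DW through the Azure Synapse (SQL DW) connector; the storage accounts, paths, table, and column names are placeholders.

```python
# Minimal PySpark sketch: read raw data from ADLS Gen2, aggregate it, and
# write the result to SQL DW. Assumes the notebook-provided `spark` session;
# all account, path, table, and column names below are placeholders.

# Read raw JSON files that ADF landed in the data lake.
raw = (spark.read
            .format("json")
            .load("abfss://raw@mydatalakegen2.dfs.core.windows.net/sales/"))

# Example aggregation: total sales amount per product category.
aggregated = (raw.groupBy("ProductCategory")
                 .agg({"SalesAmount": "sum"})
                 .withColumnRenamed("sum(SalesAmount)", "TotalSales"))

# Write the aggregate to SQL DW; the connector stages data through `tempDir`.
(aggregated.write
    .format("com.databricks.spark.sqldw")
    .option("url", "jdbc:sqlserver://mysqldw.database.windows.net:1433;"
                   "database=DW;user=loader;password=<password>")
    .option("forwardSparkAzureStorageCredentials", "true")
    .option("dbTable", "dbo.SalesByCategory")
    .option("tempDir", "wasbs://staging@mystorageaccount.blob.core.windows.net/tmp/")
    .mode("overwrite")
    .save())
```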

Table of contents
  1. Delivering the MDW with Azure SQL Data Warehouse, Azure Databricks, Azure Data Factory and PBI

Ingesting Data with Azure Data Factory

by Microsoft Ignite 2019

Feb 12, 2020 / 46m

Description

In this experience, walk through creating a pipeline with a copy activity that copies a file into an Azure Blob Storage container, preparing it for later transformation.
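
For context, a copy activity like the one built in this walkthrough can also be defined in code. The sketch below is a hypothetical example using the azure-mgmt-datafactory Python SDK, with placeholder subscription, resource, dataset, and pipeline names; it assumes the referenced datasets and their linked services already exist in the factory.

```python
# Hypothetical sketch: define a pipeline with a single blob-to-blob copy
# activity via the Azure Data Factory Python SDK. All names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSink, BlobSource, CopyActivity, DatasetReference, PipelineResource,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Copy the source file into the target Azure Blob Storage container.
copy_activity = CopyActivity(
    name="CopyFileToBlobContainer",
    inputs=[DatasetReference(reference_name="SourceFileDataset")],
    outputs=[DatasetReference(reference_name="StagingBlobDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)

# Publish the pipeline to the factory.
pipeline = PipelineResource(activities=[copy_activity])
adf_client.pipelines.create_or_update(
    "my-resource-group", "my-data-factory", "IngestFilePipeline", pipeline
)
```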

Table of contents
  1. Ingesting Data with Azure Data Factory

Transform Your Data with Azure Data Factory

by Microsoft Ignite 2019

Feb 12, 2020 / 48m

Description

Start by creating data flows in debug mode so that you can validate your transformation logic interactively. Next, add a data flow activity to your pipeline and execute it in pipeline debug mode, or use "Trigger Now" on the pipeline to test the data flow from a pipeline activity.
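
"Trigger Now" also has a programmatic counterpart: creating a pipeline run through the ADF SDK or REST API. The sketch below is a hypothetical Python example with placeholder names that triggers the pipeline containing the data flow activity and polls the run until it completes.

```python
# Hypothetical sketch: trigger the pipeline that wraps the data flow activity
# and poll its status. All names are placeholders.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Start a run of the pipeline (equivalent to "Trigger Now" in the UI).
run = adf_client.pipelines.create_run(
    "my-resource-group", "my-data-factory", "TransformWithDataFlowPipeline"
)

# Poll until the data flow activity (and the rest of the pipeline) finishes.
while True:
    status = adf_client.pipeline_runs.get(
        "my-resource-group", "my-data-factory", run.run_id
    ).status
    print("Pipeline run status:", status)
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)
```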

Table of contents
  1. Transform Your Data with Azure Data Factory

Data Loading Best Practices

by Microsoft Ignite 2019

Feb 12, 2020 / 42m

Description

Connect to an Azure Blob Storage (WASB) container and load dimension tables, applying best practices for table distribution and indexing.
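
As a hedged sketch of how those practices can look in code (not the course's script), the Python example below uses pyodbc to create a replicated dimension table and load it with a COPY statement; the server, credentials, table definition, and storage URL are all placeholders.

```python
# Minimal sketch: create and load a dimension table in SQL DW with pyodbc.
# Connection details, table columns, and the storage URL are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=mysqldw.database.windows.net;DATABASE=DW;UID=loader;PWD=<password>"
)
conn.autocommit = True  # run DDL and the load outside an explicit transaction
cursor = conn.cursor()

# Small dimension table: REPLICATE avoids data movement when it is joined to
# large fact tables; a clustered index on the key suits small row counts.
cursor.execute("""
CREATE TABLE dbo.DimProduct
(
    ProductKey  INT NOT NULL,
    ProductName NVARCHAR(100),
    Category    NVARCHAR(50)
)
WITH (DISTRIBUTION = REPLICATE, CLUSTERED INDEX (ProductKey));
""")

# Load the dimension rows straight from the storage container.
cursor.execute("""
COPY INTO dbo.DimProduct
FROM 'https://mystorageaccount.blob.core.windows.net/staging/DimProduct.csv'
WITH (FILE_TYPE = 'CSV', FIRSTROW = 2);
""")
```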

Table of contents
  1. Data Loading Best Practices

Azure SQL Data Warehouse: Query Performance Tuning

by Microsoft Ignite 2019

Feb 12, 2020 / 37m

Description

In this experience, look for the inefficiencies that are causing sub-optimal query performance in Azure SQL Data Warehouse.
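
A typical starting point for that hunt is the SQL DW dynamic management views. The snippet below is a small, hypothetical Python example that lists the slowest recent requests from sys.dm_pdw_exec_requests; the connection details are placeholders.

```python
# Hypothetical sketch: find the slowest recent requests in SQL DW.
# Server and credentials below are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=mysqldw.database.windows.net;DATABASE=DW;UID=tuner;PWD=<password>"
)
cursor = conn.cursor()

# Requests slower than 60 seconds, worst first.
cursor.execute("""
SELECT TOP 10 request_id, status, total_elapsed_time, command
FROM sys.dm_pdw_exec_requests
WHERE total_elapsed_time > 60000
ORDER BY total_elapsed_time DESC;
""")

for request_id, status, elapsed_ms, command in cursor.fetchall():
    print(request_id, status, f"{elapsed_ms} ms", (command or "")[:80])
```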

Table of contents
  1. Azure SQL Data Warehouse: Query Performance Tuning