Developing with .NET on Microsoft Azure - Getting Started

This course gets you started with everything you need to develop and deploy .NET web applications and services in Microsoft Azure.
Course info
March 9, 2017
4h 12m

If you are a .NET developer and have ever considered whether or not to use Microsoft Azure, this course will help you understand the breadth of Azure resource offerings and supported technologies. This course, Developing with .NET on Microsoft Azure - Getting Started, will help you get your ASP.NET application set up quickly and deployed to the cloud. You will then learn how to scale, monitor, and troubleshoot that application. You will also learn how to work with databases using the Azure SQL Database and DocumentDB platforms, and much more. By the end of the course, you should feel confident in your ability to be productive with Microsoft Azure.

About the author

Scott has over 15 years of experience in commercial software development and is a frequent speaker at national conferences and local user groups. Scott is a Microsoft MVP and has authored books on several Microsoft technologies, including ASP.NET, C#, and Windows Workflow.

More from the author
ASP.NET Core Fundamentals
5h 57m
30 Sep 2016
More courses by Scott Allen
Exercise files

These exercise files are intended to provide you with the assets you need to create a video-based hands-on experience. With the exercise files, you can follow along with the author and re-create the same solution on your computer. We find this to be even more effective than written lab exercises.

Download exercise files

Course Overview


Hi, this is Scott Allen, and welcome to my course on Azure for .NET Developers. In this course I'll show you how to work with Azure to deploy ASP.NET applications into the cloud, and once we have an application in Azure, we'll also see how to scale, monitor, and troubleshoot the application. I'll also show you how to work with databases in Azure, including the Azure SQL Database platform and DocumentDB. We will also work with Azure storage and take advantage of serverless computing using Azure Functions. Finally, I'll show you how to set up a continuous delivery pipeline in the cloud. By the end of this course you'll have the knowledge you need to get started with Azure and build your own applications. I do assume that you already know how to work with C# and ASP.NET, but I hope you'll enjoy the course and use it to gain the basic knowledge to be productive with Azure.

Foundational Concepts


Hi, this is Scott Allen, and welcome to Developing with .NET on Microsoft Azure. In this first module, we're going to talk about the Azure landscape, including what you can do with Azure as a software developer, and we'll also see the types of technologies that you can use with Azure. I also want to give you an introduction to the Azure web portal, which is one place where we can go to manage the applications and other resources that we own inside of Azure. By the end of this module, we'll have the knowledge we need to go on and start deploying the applications, web services, and databases that we will be using in the rest of this course.

Cloud Computing Services

Cloud computing can be a confusing term because there are so many services available in the Cloud. With Azure, for example, you can run machine-learning algorithms over big data, set up virtual machines, and host APIs for your mobile applications that run on iOS, Android, and Windows. And then there are the search services, media services, and storage services. With Azure storage you can move terabytes of data into an Azure data center, and then allow Azure to replicate the data geographically to avoid a natural disaster. The possibilities sometimes feel overwhelming. In this course we are not going to learn about everything in Azure. We're going to focus on some specific common tasks for developers, and in learning about these specific tasks, you'll have the general knowledge you need to continue and explore other possibilities with Azure in the future. As a developer watching a course for developers, you are probably wondering first: what sort of programming languages and frameworks can I use to write the applications that are hosted in Azure? Let's talk about that topic next.

Supported Tooling for Azure Deployment

Azure is an open and very flexible Cloud platform, meaning you can choose from a variety of operating systems, programming languages, tools, databases, and frameworks when you create services to run in Azure. You can of course create an ASP.NET application and host the application on Windows with Internet Information Services as the web server. But you can also create an application in Node.js, or Ruby, or PHP, or Django. If you want to run a Java application in a container on Linux, you can. And for large enterprises that use an assortment of technologies, you can write different pieces of a large application using different technologies and communicate between the pieces with HTTP or message queues. In this course we're going to be using Microsoft technologies like C#, ASP.NET, Visual Studio, and PowerShell, but the focus is really on Azure. I'm not going to show you how to write ASP.NET applications, but I will show you how to deploy an ASP.NET application to Azure, and then show you how to monitor, manage, troubleshoot, and scale that application. The good news is that once you've learned how to deploy one type of application, you can deploy anything, because Azure provides a consistent management interface for all the various technologies. And speaking of deployment, instead of talking about the possibilities that Azure offers in an abstract way, let's make Azure concrete and tangible. Let's talk about where Azure exists in a physical sense.

Azure Data Centers

Physically speaking, Azure is an expanding network of data centers around the world. These data centers form what Azure calls regions, and it is the various Azure regions which we see a map of here. There are 30 regions around the world, including multiple regions in the United States, the UK, Europe, India, and Japan, and there are eight more regions coming online soon. They might already be available by the time you watch this video. When we deploy an application, we need to select a region for the deployment. I might select a specific region to keep the application geographically close to me or my customers. I live on the East Coast of the United States, so I might use the East US region for my development resources. I could also deploy to multiple regions if I wanted some redundancy. And here is the important part to understand: Microsoft Azure runs millions of computer servers, physical machines that are spread around these regions, these data centers around the world. And Microsoft makes sure that these machines receive power, that they're up and running, that they're healthy, and that they're physically secure. All we want to do is use just a small fraction of this computing power to run our own applications. How can we do that? Let's start talking about the types of Azure resources that we can use in the next clip.

Microsoft Azure Platform Resources

When you build an application in Azure, you'll be using some specific types of resources that run inside one or more of the millions of servers in the Cloud. In this course we're going to look at just a handful of the platforms and services offered by Azure. This includes Cloud databases and storage. I'll show you how to set up an Azure SQL database and how working with a SQL database in the Cloud is not much different from working with a SQL database locally, although a database in the Cloud can scale up with the click of a button. We'll also be working with a NoSQL database, a document-oriented database known as DocumentDB. We are also going to be looking at some of the tools that we can use to automate Azure. Instead of configuring everything manually, which is useful for learning, we are going to learn how to write scripts to create and manage resources for us, and that's useful in software development to make sure that you have consistent environments for testing and production. We will also look at how to create app services and Azure Functions. Azure app services cover a wide variety of scenarios, but in the end, if you want to host a web application or a web service in Azure, you'll want to use an app service resource to host your applications and services. And Azure Functions, as we'll see, are another option for putting essentially executable code into the Cloud. So if I need a background operation to process files that I upload into Azure, we will see how Azure Functions, which you really can think of as functions in the C# sense, can be triggered to take an action and process a file when the file arrives in storage. And finally, what we're going to look at first is how to create a virtual machine in Azure. Virtual machines are useful for a number of scenarios, and we'll be using virtual machines primarily to learn our way around the Azure portal and learn some of the Azure vocabulary.
With a virtual machine we truly will provision a server in the Cloud. It is a machine that we can access remotely. And it's a machine where we can install whatever software we need to run in the Cloud. To create a virtual machine all we need is an Azure subscription. So let's look at where to go for that first.

Setting up an Azure Subscription

If you are not currently working with Azure in any way, what you'll need to do is set up what is known as an Azure subscription. To do that, you can launch any web browser and go to azure.microsoft.com. This will eventually take you to the front page of Azure, and it is here where you can currently see a couple of links with the word free in them. Of course, websites are updating all the time, so your experience might look a little bit different, but once you find the free link you'll be on your way. Now, just a few words about this. What you'll get at the end of the signup is an Azure subscription where you can create any type of resource, a virtual machine, an app service, a database, anything at all, and you can use these resources with free credit, which is more than enough to get through this course. You do have to sign up with a credit card, but your card will never be charged. Once you've spent your credit, you'll need to convert your free subscription to what is known as a pay-as-you-go subscription, and that subscription will only bill your credit card for the resources that you use. And you can apply spending limits to your subscription, so you'll always know how many credits you have left, and whether and what you'll be charged when the credits are exhausted. Also, just as a note: if you're an MSDN subscriber, or you work for a large enterprise, a Microsoft partner, or a company in the BizSpark program, you might already have subscriptions and credits that you can use on Azure. So check with your employer first.

Getting Started with the Azure Portal

Once you have a subscription, go to portal.azure.com. This is the Azure management portal. The first time you're here you'll need to log in with your Microsoft account. I've already logged in previously and the browser has sent my authentication cookie along. There are new portal features all the time in Azure, but what you should see is something like the following. Along the top here is a gear icon. This is something that you can click to customize the portal settings, including changing the color scheme. Also along the top is a place where I can view my profile, change my password, and view my bill. I can also sign out and go to another Azure account that I may be associated with. And right here in the center is the dashboard. What I see here are tiles that represent various resources and services that I might want to access. The service health tile here in the center has many green circles with check marks telling me that all the regions and data centers are up and running smoothly as normal. And the dashboard is a useful place to customize, to keep track of what is important to you. I can edit this dashboard, and when I'm editing, I can click to remove a tile from the dashboard. I can also make the dashboard tiles larger or smaller, and I can rearrange tiles. I can add additional tiles from this tile gallery on the left; just drag them over. As we go through the course and create new resources, like a new web application, we'll be able to add tiles for those resources so we can view and manage them quickly from the dashboard. And finally, over here on the left is what we call the hub. This is the place to go when you want to create or manage the resources that you own in Azure. Everything that we can create, from virtual machines to app services, is generically known as a resource, and the entries in here will take me to a particular resource type to manage. For example, I can click on app services, which we will be using in the next module.
And I can see I currently have no app services. This large panel that has opened up here in the browser is what we call a blade. Now, Azure has so many services available that not everything can be listed here in the hub directly, but down at the bottom there is a link I can click for more services, which will list everything available. And if there is a specific resource type that I want to make available for easy access from the hub, I can click to add a star here. One more tip I'll give you: as software developers we like keyboard shortcuts. If I press the question mark, the portal will display a help page with some keyboard shortcuts that I can use to navigate around the portal. Now, in this portal, when I want to create a new resource, I can either click on the new link up here at the top, or I can go to the hub. I know I want to create a virtual machine, so I click on that icon. The blade opens up for virtual machines. I can see I have no virtual machines here, but in the next section we will create our first machine.

Creating Your First Virtual Machine

Here on the VM blade, let's click add. Now, whenever you create a new Azure service or resource, there's some amount of configuration that you need to provide for Azure to create your resource. With virtual machines this configuration includes the type of operating system that you want to install. What I'm seeing now are all the various starting machine images that I can select from. There's a Windows Server image, there's a Red Hat Enterprise Linux image, and in this long list there are going to be various flavors of both Windows and Linux. I can even use this search box here to search for various operating systems or configurations that I might want to start with for a virtual machine. And there are many images that include not just the operating system, but also some additional software. Here this featured item is MongoDB with Replication. I can also create a machine that already has something like SharePoint installed, or Java installed. For C# developers, I can create a machine that has Visual Studio pre-installed. So the type of machine that you will create depends on what you want to achieve. If you want to move one of your business applications running on the internet into Azure, you might choose a machine with just Windows Server installed. Let's say as developers we want to try a new version of Visual Studio without installing Visual Studio on our own laptop. Instead, we can pay a couple of dollars to try Visual Studio in Azure. It's very quick and easy. I can see, once I've selected Visual Studio from this list, I now have the option of installing different versions of Visual Studio: Visual Studio 2013, as well as Visual Studio Enterprise 2017 release candidate. Let's look for the version of Visual Studio 2017 that is the community edition, the free edition. We can create a VM that will run this on either Windows Server or Windows 10. Let's pick the Windows 10 version. On this page there's a description and some legal terms at the top.
But I want to call your attention to the deployment model down here at the bottom of the page. I can choose between classic and resource manager. This is a choice that you have for many Azure resources that you'll create, and you always, always, always want to use resource manager unless you have a good reason not to. We'll see some of the great automation options that we have with the resource manager model later in the course. For now I'm going to leave that selected and click create. This will open two additional panes here on the right, and this is a recurring theme in the portal: when you have a blade that opens new blades, the blades will open to the right. And if I scroll to the left, I can see all the steps that I took to arrive at this current screen. But now that we have the type of machine selected that we want to create, we need to go in and enter some specific configuration details. We'll look at these settings next.
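As an aside, the same image catalog we just browsed in the portal can also be queried from PowerShell with the AzureRM module. This is a hypothetical sketch: the region name and the Visual Studio publisher and offer values are assumptions you would replace with what the cmdlets actually return in your subscription.

```powershell
# Hedged sketch: browsing the VM image catalog with the AzureRM module.
# "eastus" and the publisher/offer names below are assumed values.
Get-AzureRmVMImagePublisher -Location "eastus" |
    Where-Object { $_.PublisherName -like "*VisualStudio*" }

# Drill down from a publisher to its offers, and from an offer to its SKUs.
Get-AzureRmVMImageOffer -Location "eastus" -PublisherName "MicrosoftVisualStudio"
Get-AzureRmVMImageSku -Location "eastus" -PublisherName "MicrosoftVisualStudio" -Offer "VisualStudio"
```

The SKU names returned here are what you would feed into a scripted VM deployment later, instead of picking an image by hand in the portal.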

Specific Configuration Details

In Azure, just about every resource that you create will need a name. The name in this case is something that I will use to identify the virtual machine when looking at a list of resources. Let me give this the name VS 2017, so I know what it will be for. For a VM, we can also select the type of hard drive the machine will use for storage, and this is where you'll need to start thinking about the type of workload that you'll run on the machine. To run an IDE like Visual Studio 2017, we probably want to use solid state drives, which are a bit faster, but they do cost a bit more. For other workloads like batch processing that might be more network and CPU intensive, you might go for regular spinning disk drives. But I'm going to stick with SSD. I'm going to enter a username that I can use to log in as administrator to this machine, and I'll need to select a password that is sufficiently complex. Now, with every resource that you create in Azure, you can assign the resource to a specific subscription, because you can set up multiple subscriptions on the same Azure account. Here you can see I have two. Setting up multiple subscriptions is useful if you want to separate the resources, and the billing for those resources, into different subscriptions that map to different customers, or different clients, or different projects. I have a dedicated subscription for this course, so I can keep the resources and the spending for this course separate from what I use for my own company, even though both of these subscriptions are on the same account and they bill to the same credit card. Another important property that you will set for every resource that you create is the resource group assignment. For now, think of a resource group as a logical container for your resources.
So if you have multiple projects running in Azure, you might create a resource group for each project, and a resource group allows you to find and identify all the services and resources that are needed for each project. And when we are done with a project, for instance when I'm finished evaluating Visual Studio 2017, I can delete the resource group that I'm about to create, and that will take care of cleaning up all the resources inside. I'm going to give my resource group the name VS 2017 as well. And then, finally, nearly every resource in Azure will require you to select a location. This is where I select the data center, or the region, where my resource will live. Typically you want to create all the related resources in the same region, or the same location. And since I'm on the East Coast of the United States, it makes sense for me to select the East US data center. That will keep my virtual machine close to me geographically, so the network communications are just a little bit shorter. Once I select okay for these basic settings, I will be asked to make another very important decision about my virtual machine: how powerful do I want to make this machine? And this is not a choice that I'm stuck with. One of the central themes in Azure is that I can always scale up if I need more power, and scale down if I need less power. So the selection I'm about to make is just the initial size of my virtual machine; I can always change it later. Here I can see some recommended sizes for me. In Azure, virtual machines are classified into series. There is the A series, the D series, the F series, and others. Each series is generally designed for a specific type of workload. The F series generally has faster processors, but less RAM, while the G series is the series that you want to look for if you want the most memory. What we're being offered are two different sizes from the DS series.
The DS series is one of the series that I need to use if I want fast, premium storage on SSD drives. And of these two, the DS2 v2 is actually the faster of the two machines. It comes with Xeon processors, two cores, and seven gigabytes of RAM. I can attach up to four data disks, and I can get up to 6,400 input/output operations per second. If I wanted to look for something more, I can click the view all link here. This will show me all of the machines in the DS series. I could go up to a machine that has something like eight cores and 28 gigabytes of RAM. But obviously the other important statistic to look at here is the estimated monthly cost of this particular resource. And obviously, a machine that has been carved out for you with eight cores and 28 gigabytes is going to be a little bit more expensive than the two-core, seven-gigabyte machine. But right now I'm just going to click to select the DS2 v2 Standard, and move on to the next set of settings by choosing the select button. Now, there are some optional settings that are already pre-configured for me. I'm going to leave all the defaults here and just press okay. We'll have one more step where the portal will validate and review all of our settings. It looks like everything is good, so I'm going to press okay. This is going to take me eventually back to my dashboard, and I have a new tile on my dashboard that is showing the status of my machine deployment. All I need to do now is wait a few minutes, and I will have access to a machine in the Cloud.
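If you would rather compare sizes outside the portal, the AzureRM module can list them as well. A minimal sketch, assuming the East US region:

```powershell
# List the VM sizes available in a region and filter for the DS series discussed above.
Get-AzureRmVMSize -Location "eastus" |
    Where-Object { $_.Name -like "Standard_DS*" } |
    Select-Object Name, NumberOfCores, MemoryInMB, MaxDataDiskCount
```

Pricing is not part of this output; for the per-month estimates you would still consult the portal or the Azure pricing page.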

Managing Your New Azure Resource

Now my virtual machine is up and running in Azure. Any time that you create a resource in Azure, you'll want to come back to the portal eventually to manage that resource. You might just want to check on the health of the resource, or you might want to delete the resource because you're no longer using it. But the first job will be finding that resource. Now, when I created a virtual machine, Azure was kind enough to drop a new tile on my dashboard, and all I need to do in this particular case is click that tile, and I'll be taken to the overview of my new virtual machine. But let's pretend that's not there. How else can I find the resources that I have? If I'm looking for a virtual machine, I can always come to virtual machines here in the hub, which will provide me a list of all my virtual machines. But I also want to show you that you can come to all resources, and what we're about to see is that I now have more resources than just the virtual machine. We'll talk about that in just a second. I can search for a resource by name, so I can search for VS 2017. I still see a list of seven items, and I think you can see that in a busy subscription, where there are lots of things going on, the list of all resources can be quite long. Quite often when I come into the portal, I go to resource groups instead. Remember, resource groups are logical containers for resources, and all the resources inside of a group are typically related. So I have one group, VS 2017; that's the group where I want to store everything that I need to support a Visual Studio 2017 virtual machine. If I come in here, I'll see that same list of resources as before. Where did all these resources come from? When I create a virtual machine in Azure, the virtual machine is dependent on a number of other resources: a network interface, a storage account, a public IP address, a network security group. I'm not going to go into specific details on any of these.
I will show you that a network security group is essentially a firewall. I can have inbound and outbound security rules. The inbound security rules are allowing remote desktop connections to this virtual machine. So all of the dependent resources here are related to that virtual machine somehow; they provide some sort of feature, like storage or network access. Let's go and look at the virtual machine itself. Every resource in Azure will have an overview blade that will give you some basic statistics about the health of that resource or how it is performing. This will allow you to determine if you've sized the resource appropriately. So for a virtual machine I have the ability to see the CPU percentage on the machine, some disk statistics, and some network statistics. All of these look good, so I'm not prepared to make any changes to this virtual machine right now. But I might want to connect. On this bar at the top, and again, for just about every resource in Azure, when you open the blade and look at the overview, there will be some buttons up at the top for common tasks that you might want to perform with that resource. Let's try connect. When I click connect, that's going to download an RDP file, something that will open up with the remote desktop connection application. Yes, I want to connect to this virtual machine. I'm going to have to enter the username and password that I provided in the Azure portal. That was sallen, with a password that I hope I typed in correctly. Let's try it. It looks like I did. I need to allow connections to this computer, and now what I should have in just a minute is the desktop of my new virtual machine. You can see here's Visual Studio; I can double-click to launch it. It's just like I'm using a desktop at my house. Here's Windows Explorer, where I can see the disks that are attached to this virtual machine. Now, the D drive is a temporary drive.
If there's some sort of hardware failure in Azure, Microsoft has to move my virtual machine to a different server, and I will lose anything that's in temporary storage. But the C drive is stored in Azure storage. There can be some catastrophic failure in Azure, but that C drive in Azure storage will be backed up and available to me. That's one of the nice things about hosting in the Cloud. What if I wanted to add more storage? What if I wanted to add more disks? What if I needed a more powerful virtual machine? Well, that's where I could come into the Azure portal and look at settings like the disks settings. It's inside of here where I can attach a new disk, and that disk will just be a VHD file in Azure storage. It's something I could download if I wanted to. It's something I could disconnect and connect to another virtual machine. There are also the size settings. So if I've determined that this virtual machine is not performing as well as I'd like, all I need to do to take this environment and put it on a more powerful server, a more powerful virtual machine, is to select another setting here. So I could move up to four cores with 14 gigabytes of RAM. Or perhaps the machine is performing very well and I want to try to save some money. If that's the case, I could scale down and try one core with four gigabytes of RAM. But another way to save money with virtual machines is to stop the virtual machine when I'm not using it. And by the way, if I selected one of these boxes and then clicked the select button, my existing virtual machine would be shut down, and the disk drive, which is really a VHD file in Azure storage, would be migrated to a new virtual machine somewhere out there in the same data center. I could then come back to the overview, click connect, and get an RDP file for the new machine. Everything would be the same, just running on new hardware at that point. I'm going to leave the size alone.
We were talking about saving money with virtual machines. Something like a machine with Visual Studio 2017 is not a machine that's going to be in use 24 hours a day, typically, and if I wanted to save some money I could stop this virtual machine. When I stop a virtual machine, the configuration, the data, everything stays in Azure, but I don't have to pay the compute costs for that virtual machine. When I'm ready to start using it next week, or tomorrow morning, I can come back in after I've stopped it and click start. Now, you may be wondering: do you have to log in to the Azure portal to do everything? And the answer is no. Every single thing that you can set and do here in the Azure portal, you can also automate by writing a program or by writing a script. There are SDKs available for many languages, from C#, to JavaScript, to Ruby, and Python. I can also use PowerShell to manage my Azure environment. So let's use PowerShell in the next clip to shut down this virtual machine.

Automating Azure Using PowerShell

In this course we're going to have several opportunities to see how to automate Azure using PowerShell, so I want to give you the basic flow of things. Now, as a .NET developer, you probably have Visual Studio installed, and with Visual Studio installed you probably have some Azure SDKs and PowerShell modules already installed on your machine that you can use. But if not, or if you just want to make sure that you have the latest version of the Azure PowerShell commands, then go back to the Azure website. Here on this website, at the top, there is a resources link. What I'm ultimately looking for is the downloads page. This is where I can get SDKs and command line tools. On this page you can see there are SDKs for .NET developers, Java developers, Node.js, PHP, Python, and Ruby. We can also download SDKs for iOS and Android. And a little bit further down the page currently are the command line tools. The tools we're going to look at in this course are the PowerShell tools. Click the Windows install link, run your download, follow the installation instructions, and you should then be able to open a PowerShell and manage absolutely everything in Azure. I already have a PowerShell shortcut here on my taskbar. Let's see how this works. The first thing you always need to do is associate your PowerShell session with your Azure account. You can do that with a command: Login-AzureRmAccount. There is tab completion here, so I'll use tab to complete that command. When I press enter, I'll be given an interface where I need to log in with my Microsoft account, the same one that I used to log in to the Azure portal. This is going to take me to a different location to provide my password, and I have two-factor authentication turned on, so I might need to enter a code. But it looks like I passed that test. Now, if you have a single subscription, you're good to go. I have multiple subscriptions in Azure.
So I have a little more work to do to make sure my context is set up to work easily with the virtual machine that I want to get to. Since I have multiple subscriptions, when I type the command Get-AzureRmSubscription, this will show me a list of all the subscriptions I'm associated with. Two of them are enabled; I have another two that are inactive. I want to make sure that I'm using the subscription Pluralsight OTC. Currently, Ode To Code Azure is the one that is selected, and I want to change that. So this time I'm going to run Get-AzureRmSubscription again, but I'm going to pass a parameter. I could pass the subscription ID, the GUID that is associated with Pluralsight OTC, but it is easier in this case to go with the subscription name, which is Pluralsight OTC. Now, this isn't a PowerShell course, but Get-AzureRmSubscription is going to select a single subscription object, and now I'm going to pipe that object into another command, which is Select-AzureRmSubscription. What this will do is ensure that my current context is set up to work with that Pluralsight OTC subscription. If at any time you want to see specifically what your context is, you can always type the command Get-AzureRmContext, and that will show you what is the account and what is the subscription that you're currently acting against. Now, you'll notice a recurring theme here. All these PowerShell cmdlets have AzureRm in the name, Rm being resource manager. If you're using PowerShell cmdlets that do not have those two letters, Rm, in the command, then you're using an older version of the Azure PowerShell module. You'll want to make sure that you update, so that you're using the commands that have Rm: AzureRmSubscription, AzureRmContext. Now that we have the correct context set up, let me get a list of all the virtual machines that are in the subscription. I can get that with Get-AzureRmVM, and I can see I have one machine; its name is VS 2017, and it's in the resource group VS 2017.
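The login and context flow described above can be sketched as a few lines of PowerShell. The subscription name here is the one from the demo; substitute your own:

```powershell
# Associate this PowerShell session with an Azure account (opens a login prompt).
Login-AzureRmAccount

# List every subscription on the account, then pipe the one we want into the context.
Get-AzureRmSubscription
Get-AzureRmSubscription -SubscriptionName "Pluralsight OTC" |
    Select-AzureRmSubscription

# Verify which account and subscription the session is now acting against.
Get-AzureRmContext
```

From here on, every AzureRm cmdlet in the session operates against the selected subscription.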
And I have some summary information here, like the VM size and the OS type. I could drill into very specific detail and see everything about this virtual machine using PowerShell if I wanted to, but all I want to do is stop the machine. So I will use the command Stop-AzureRmVM, and I will need to pass along a couple of parameters here: first the resource group name, which is VS2017, and then the name of the resource that is in that group, which is also VS2017. If I were writing a script, I would also include the Force parameter, but I won't in this interactive demo. Without the Force parameter, PowerShell is going to prompt me to make sure that I really want to stop this virtual machine. I'm going to say yes. This is going to take a couple of seconds, but we will come back as soon as this command completes. Now we're back, and in this case it took a couple of minutes for the command to succeed; sometimes that happens much quicker. But if I come back to my portal now, I should be able to refresh this virtual machine. I'm just going to refresh the entire portal, and I should be able to see that this virtual machine has stopped. As the dashboard comes up, I can see under the status indication here that this is stopped and de-allocated. De-allocated means I will not be paying for this virtual machine. I will still be paying for the storage of all the information that is on the C drive, but I won't be paying for the two cores and seven gigabytes of memory that were allocated by this virtual machine. If I wanted to use it again, there is a Start-AzureRmVM command that I can run, or I can click the start button here in the portal. If I were completely done with this resource, this virtual machine, what I would do is come into the resource groups, go to my VS2017 resource group, and one of the action buttons up here is to delete this resource group.
If I delete the resource group, I will delete all of the resources inside of it, so everything is cleaned up and I won't be paying for anything, neither the virtual machine nor the storage. And of course I could also delete this resource group using PowerShell. That's an important aspect of Azure: everything that you do can be automated, and that means your software development and your application releases can all be repeatable and reliable.
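The stop, start, and clean-up steps from this clip map to three AzureRM commands; a sketch, assuming the VS2017 names created earlier in the module:

```powershell
# Stop and de-allocate the VM so compute charges end (-Force skips the prompt)
Stop-AzureRmVM -ResourceGroupName "VS2017" -Name "VS2017" -Force

# Bring the same VM back when it's needed again
Start-AzureRmVM -ResourceGroupName "VS2017" -Name "VS2017"

# When completely done, delete the resource group and everything inside it,
# including the VM's disks, so nothing is left to pay for
Remove-AzureRmResourceGroup -Name "VS2017" -Force
```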


In this first module of the course I gave you an overview of what Azure is and what Azure can do. We created a new virtual machine in a data center on the East Coast of the United States, and in doing so we saw how to work with the Azure web portal. We also used PowerShell from the command line to work with Azure. We're going to build on all this knowledge moving forward. In the next module we're going to turn our attention specifically to deploying a web application into Azure.

Building Web Applications and APIs


Hi, this is Scott Allen, and in this second module we will look at building web applications and APIs with Microsoft Azure. I want to show you how to use Azure as a platform for your applications, and look at different options for creating and deploying an application. Since this course is focused on .NET, we'll be looking mostly at using Visual Studio as a tool for developing and managing our application, although we will see that Visual Studio is just one of many options for this scenario. For our first topic, I want to make sure I explain what I mean when I say that we will use Azure as a platform for our application.

Using Platform as a Service Features

Let's talk about three different scenarios for building and then hosting your application. What we're going to focus on is what you are responsible for versus what Azure is responsible for. The first scenario we'll talk about is what the industry calls the on-premises solution; you'll hear this referred to as on-prem for short. An on-prem solution is one where the physical servers that you use for your application are purchased by you or your company. They're also hosted by your company, in a server closet perhaps, for a small company, or in a data center where your company places the hardware. In this scenario, your business is not only responsible for building your application, but also for acquiring, installing, and configuring the operating system and the physical hardware: the server machines, the network cables, the redundant power supplies, and all the disk drives that you need. Let's contrast the on-prem solution with the Infrastructure as a Service solution, or IaaS for short. This is one of the scenarios you can achieve using Azure, and in fact what we did in the first module of this course was a first step into an IaaS setup, because we created a virtual machine in Azure. With IaaS, we are still responsible for building our application, of course, and we still need to configure the operating system. So if we created a Windows Server or Linux server machine in Azure, we'd still have to go in and make sure the right web server components are installed. But the hardware is taken care of by Azure. And it's not just hardware in the sense that an Azure data center has hundreds of thousands of servers, solid state disk drives, network cables, routers, and redundant power supplies. Azure data centers also provide real infrastructure like physical security, backup generators, cooling systems, and redundant fiber network connections.
So virtual machines in Azure have a lot to offer with Infrastructure as a Service, but when building applications we can push even more responsibility to Azure using Platform as a Service features, which we call PaaS for short. This is what we're going to focus on in this module. As developers, we know all about using abstractions to encapsulate complex operations in software, and with PaaS we are raising the level of abstraction so that we don't have to think about hardware or the operating system. We don't need to configure the operating system, we don't need to apply patches or clean up temporary files or set up software backups. All of this is taken care of for us by Azure; we just focus on building the application and adding new features. If we need more power because we have a growing user base, we don't need to go in and create more virtual machines and configure each machine. With a PaaS solution, we can simply ask Azure to scale up or scale out for us. Now let's talk about the specific platform that we're going to use as a service: the App Services platform.

Introducing Azure App Services

Azure App Services is where you want to go if you want to build an application or an API that runs on a platform in the cloud. App Services supports a wide variety of technologies and application types. So what if you want to write a traditional web application and serve HTML to a browser, like a blog or an eCommerce site or a content management system? App Services will support this. You can write the application in ASP.NET or Node.js or PHP, and many other frameworks and languages. Or maybe you want to build an API application and serve data over the internet. Perhaps you are supporting a single page application with lots of JavaScript, and you want support on the server for cross-origin resource sharing. App Services works here too. Or perhaps you have a logic app, something that integrates with other backend systems and implements a business workflow, or needs to collect data from other sites like Twitter to aggregate and analyze. App Services supports this scenario too. Or perhaps you need to write the backend service for a mobile application, and not only serve data over a REST API to the application, but also send push notifications out to mobile devices. Again, App Services has this covered. In this module I'm not going to be able to show you all the possible scenarios you can implement and all the possible features you can use with the App Services platform, but I will show you how to build and get a web application up and running in App Services and take advantage of some of the core features of this platform. You'll then have enough knowledge to explore the rest of App Services on your own. Let's take a look at what creating an app service looks like in Azure.

Creating an App Service

Back in the Azure portal I can still see my VS2017 virtual machine is here, but in this module we're going to be looking at App Services, not virtual machines. If I click on that link in the hub, I currently have no app services; we're going to create a couple in this module. The first app service I create is just going to be a website from a template that Azure provides. Later in this module we will use Visual Studio to build a brand new web application and then put it into an app service, but the process we're going through here is very similar to the virtual machine process from the last module: I want to create a resource, and Azure is going to walk me through a series of steps to set up and configure that resource. With virtual machines I got to select what type of operating system I wanted. With App Services, if I want to, I can go with a pre-configured app service. For example, if I wanted to set up a WordPress blog, I could select the WordPress entry here and click Create, and with a little more configuration I'd have a WordPress site up and running. I can also select from other templates that include different types of media services, other blogs, and content management systems. Further down there are websites based on different web application frameworks, and eCommerce sites. Since we're going to be building an ASP.NET website later, I'm just going to start with the ASP.NET starter web app; it's very easy to configure. Once I click Create, we'll come to that basic configuration section, and there will be some familiar things here. Just like with virtual machines, I have to select what subscription I will create this app service on. I'm going to use Pluralsight OTC again, and I have to specify an app name. Now the name of an app service is a little bit different than the name for a virtual machine.
When I'm specifying the name for an app service, I need a name that is unique on the domain, so many of the common names are already taken. If I try foo, or if I try pluralsight, well, someone else already owns those names, so I cannot use them. Actually, a good example of this is a name I own in a different subscription: if I try my blog's name, Azure tells me that name is not available, and that's because I own it in that other subscription. In fact, if I browse to that app service's address, we end up back on my site. That's because my blog runs as an app service in Azure; it's been hosted there for at least four years now. I didn't want to expose my blog under the domain, so I went out and purchased my own domain and pointed that domain at my app service. Now when a request arrives in Azure at my app service's address, I have an IIS rewrite rule that will redirect that request to my custom domain, which is the same app service; I just want to use that domain name. So let's come up with a unique name here. What I usually do is prefix the name with some sort of company or project initials, so maybe PS for Pluralsight. ps-starter, that name seems to be available, and once again I have to select a resource group, either an existing resource group or a new one. I want to create a new resource group; I'll call it ps-starter. This is the same resource group concept that we talked about in the last module when we created the virtual machine. I want this to be a new resource group because it's not related to that development machine, and if I add other dependent resources to support this app service, resources like maybe a SQL Server database, I want them all in the same resource group. It's a logical container for resources. Once I've filled out the name, the subscription, and the resource group, it's time for me to choose an app service plan.
This is very similar to how I had to select a size when I created a virtual machine, but we're not concerned with virtual machines anymore, and the concept of the app service plan is very important, so let's take a step back and talk about it for a moment.

App Service Plans

We are about to create our first web application in Azure. Every web application I create will be placed into a single app service dedicated to that application, and every app service maps to a single app service plan. A plan describes the performance characteristics of the machine that will host my app service, which means my application. If I select the P2 plan, for example, currently that plan is a two-core machine with 3.5 gigabytes of memory, and yes, behind the scenes this is a virtual machine, but the virtual machine is abstracted away by Azure. I don't need to log in and manage software updates or security patches. I simply use my app service plan to describe how many CPUs I want and how much memory I need, and Azure takes care of the rest. Now the important part to understand about the app service plan is that I can deploy multiple app services into Azure and have them all map to the same app service plan. In this case you see I have three applications, each in its own app service, but all the app services are pointing to the same app service plan, which means these applications will live together on the same machine. I could also create an app service plan for each application, so all the applications are separate, and it really depends on the type of application whether or not I want all these applications on the same app service plan. One reason to keep applications together is so that I can scale them together. So what happens when I get more and more users, which means I'm seeing more and more requests coming into my applications? With an app service plan I really have two options: I can scale up or I can scale out. When I scale up, I change my app service plan to a plan with more compute resources. So currently in Azure, if I go from a P2 plan to a P3 plan, I would double the number of cores from two to four and double the amount of memory from three and a half to seven gigabytes.
Azure takes care of placing my applications on a new virtual machine behind the scenes. Then another option is to scale out. By default I run a single instance of my app service plan, which means I'm using a single machine, but I could double the number of instances to two, or maybe even five. With the P2 plan, notice that little line there that says I can have up to 20 instances. Also, with Azure I can configure the app service plan to scale up and down dynamically in response to load, or even by time of day. So let's say these three applications work together and they're primarily used during business hours; I could configure Azure to run five instances during business hours and a single instance overnight. We'll look at some of those details in the next module. Right now we just want to get up and running with a new web application. Just remember that you can have multiple app services mapped to a single app service plan, and a single plan can run my applications on multiple instances if I need to scale out to satisfy demand.
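Scaling up and scaling out both come down to changing the app service plan, which can also be scripted. A sketch using the legacy AzureRM cmdlets, with hypothetical resource group and plan names:

```powershell
# Scale up: move the plan to a larger machine size within the Premium tier
Set-AzureRmAppServicePlan -ResourceGroupName "my-group" -Name "my-plan" `
    -Tier "Premium" -WorkerSize "Large"

# Scale out: run five instances of the plan instead of one
Set-AzureRmAppServicePlan -ResourceGroupName "my-group" -Name "my-plan" `
    -NumberofWorkers 5
```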

Creating an App Service Plan

Back in the portal we've reached the point where we were going to create a simple website from an Azure template, and we've reached the app service plan configuration. Azure has already added an app service plan to our app service with some default settings to make configuration easy, but I want to take complete control over the app service plan. So instead of using the auto-generated plan name, I'm going to click Create New. This will open a new blade where I can explicitly set the configuration for my app service plan. I'm going to name the app service plan ps-starter, because to me this app service plan will only exist for the ps-starter app service. Now, where will the machines live that host my app service? I'm going to select East US, and here in the pricing tier settings is where I can specify the size of the machine that I want. The default is the S1 app service plan, which is a one-core machine with 1.75 gigabytes of RAM. Depending on the application, this is actually enough power to host a website that can process tens of thousands of requests per day. It's also a plan where I can support a custom domain, buy an SSL certificate and add that to my app service so that my traffic is encrypted, and scale out to 10 instances. But let's take a look at some of the other pricing plans here. If I scroll down you'll notice there is a free plan. With the free plan I cannot have a custom domain and I cannot use SSL, but then again, it is free. Since I just want to experiment with this template, I'm going to select the free plan, click Select, and say OK, this is the app service plan that I want to use. Now I'm going to create my new app service with its new app service plan, and in a few moments I should be able to check the website. We'll be right back. It took just a few seconds, but now my alerts say that this deployment was successful. So if I come back to the list of app services that I have, I can now see ps-starter.
Let me click this and it will bring us to the overview. Just like with virtual machines, there is a consistent theme throughout the UI of the Azure portal. The overview screen tells me that this app service is running and gives me the URL. I should be able to click on this URL and see the ASP.NET application that I created, and here it is: a website running ASP.NET web pages in an Azure app service. I could download these web pages, open the project in Visual Studio, and start to edit this website, but instead let's take a different approach. Let's use Visual Studio to create a web application and deploy it into a new app service.
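Everything we just clicked through in the portal can also be scripted. A sketch using the AzureRM module and the names created in this demo:

```powershell
# A resource group to hold the app service and its plan
New-AzureRmResourceGroup -Name "ps-starter" -Location "East US"

# A Free-tier app service plan in East US
New-AzureRmAppServicePlan -ResourceGroupName "ps-starter" -Name "ps-starter" `
    -Location "East US" -Tier "Free"

# The app service itself, placed into that plan
New-AzureRmWebApp -ResourceGroupName "ps-starter" -Name "ps-starter" `
    -Location "East US" -AppServicePlan "ps-starter"
```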

Publishing to an App Service

For this demonstration I'm going to use Visual Studio 2015, the free Community edition, and what I want to do is create a new ASP.NET MVC project. So under the web templates I want to create an ASP.NET web application. Let's call this ps-aspnetmvc, and when I select OK, I want to make sure that I have the MVC template selected. I want the authentication type to be individual user accounts, because this will require a SQL Server database to store information about users. We're not going to set up SQL Server in this module, but I will show you how to do this later. Notice there's a checkbox here for hosting in the cloud, if I wanted to configure an app service to run this application in Azure right from the start. I'm going to leave this unchecked for now, and I'll show you how to come back and add the Azure support later; the steps will be the same. When I select OK, Visual Studio is going to take a few minutes to go out and create my new project, and once all of this is complete, which it looks like it has, let me press Ctrl+F5. This will build the application and run it locally. I just want to make sure everything is working before we try a deployment, and it does look like we have a working ASP.NET MVC application. Now, back in Visual Studio, what I want to do is deploy this application into Azure, into a new app service. So if I right-click the project and say that I want to publish, I can walk through a series of dialogs to configure all the information. What I'm going to show you in this module is how to do an easy deployment directly from Visual Studio. This is a really nice feature to have if you are an individual developer. Obviously if you work on a large team, you don't want individual developers deploying into Azure, but in the last module of this course I will show you one approach to setting up a build server and having an automated deployment into Azure, so we will look at that scenario.
For right now I just want to publish directly from Visual Studio, and I want to publish into an Azure app service. In this first dialog that comes up, once I select my account and my subscription, I'll be able to view all the existing app services I have out there, in case I wanted to publish this application into an existing app service. But that's not what I want to do; I want to create a new app service. What we're going to see is that the configuration information I had to fill out in the Azure portal for the previous app service, I have to provide that same information here inside of Visual Studio, so everything we learned previously still applies. I have to give my app service a name. I'm going to try to get away with just ps-aspnetmvc. That does have to be unique on, and if it were not unique I'd have a validation error; it looks like that worked. I do want it on this Pluralsight subscription. I do not want to use the same resource group that the ps-starter app service uses, so let me create a new resource group, and I'll call this one ps-aspnetmvc. I also don't want to host this application on the same app service plan, on the same machines, as ps-starter, so let's create a new app service plan; I'll call this just ps-aspnetmvc. The previous hosting location was East US, but let's go ahead and use South Central US, because I can certainly deploy different applications into different regions. And for the size, this time let's go with the small premium size: P1, where P is premium. It's a one-core machine with 1.75 gigabytes of RAM. Let me select OK, and when I click Create, what Visual Studio will do is go out and deploy this environment. So I'm not publishing my app into Azure just yet; what I'm doing is setting up the environment.
Visual Studio is creating the app service, it's creating the app service plan, and it's creating the resource group to host both of those things. Now that that step is complete, I have everything I need to publish this application into Azure. This is going to be the server and the endpoint that is used to publish. We're going to be using the Web Deploy publish method; notice there are a number of other publish methods available. I could use FTP if I wanted to deploy into Azure that way, but I'm going to stick with Web Deploy. There are specific deployment credentials that Visual Studio sets up in Azure for me. This user name and this password, I could use these inside of a build service to give that service the credentials it needs to publish my web application, without giving the service or someone else credentials to get into my account or into the subscription just to be able to publish an app service. Visual Studio is going to save a publish settings file into the project, and you want to be very careful about checking that into source control if it contains the password, but I can just validate this connection to make sure everything is going to work. That validation succeeds. I could go ahead and click the Publish button right now, but I'm going to click Next for the moment, just to show you that you can change the build configuration here. I do want a release build. Visual Studio realizes that this project uses an Entity Framework DbContext and that there is a connection string in web.config. We will come back in a later module of this course, set up a SQL Server in Azure, and use this dialog to change the connection string that's going to be used when the application is deployed. But for right now I just want to go to Next. This is where I could see a preview of what's going to happen when I publish, but I'm just going to go ahead and click the publish operation.
This can take a few moments the first time you publish, because Visual Studio has to push up all of the assemblies and all of the artifacts that are needed for this app service. But once I've published into Azure, if I just make a simple change to something like a Razor view, the next time I publish all Visual Studio needs to do is upload that one changed view. It looks like the publish operation has succeeded. Visual Studio has launched a browser pointed at the site, and now I can see the application is running in Azure. Just to test that, let's go to the About page and see if we can make a change to it. This will be in a Razor view for the home controller, specifically the About view. Let's just change the text in here to say this is an update to deploy into the app service. Let me save that Razor view and immediately come back and click publish again. Visual Studio remembers the profile that I have set up; it is now the ps-aspnetmvc profile. All I need to do is click Publish. It's a very quick operation this time, and now that I'm viewing the application inside of Azure again, let's come to the About page, and I can see that my changes are now in Azure. So, a very simple publish operation.
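Visual Studio's publish button drives MSBuild and Web Deploy under the covers, so the same deployment can run from a PowerShell prompt. A sketch, assuming the ps-aspnetmvc profile saved by Visual Studio, msbuild on the path, and the deployment password stored in an environment variable:

```powershell
# Build the release configuration and deploy using the saved .pubxml profile;
# the password comes from the app service's deployment credentials
msbuild .\ps-aspnetmvc.csproj /p:DeployOnBuild=true `
    /p:PublishProfile=ps-aspnetmvc `
    /p:Configuration=Release `
    /p:Password=$env:DEPLOY_PASSWORD
```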

Working with Publishing Profiles

Now that we've built and deployed an application into Azure, let me answer some common questions that I get regarding deployment in Azure and Visual Studio. First of all, inside of Visual Studio, when I right-click a web project and say that I want to publish this project, once I fill out all the information about the connection and the app service and so forth, Visual Studio will create what is known as a publishing profile. You can have multiple profiles, and these profiles will be stored in the publish profiles folder. The extension is .pubxml because these are XML files, and they contain information like the type of build that I want to deploy; in this case, the release build. The profile contains the URLs, and it contains the user name that is part of the deployment credentials. What this file does not contain is a password, so this file is safe for sharing and perhaps checking into source control. When I tell Visual Studio to save my deployment password, by default that information is placed into a different file that is not part of the project but is in the same folder. I can see it if I go to Solution Explorer and say show all files: it is the .pubxml.user file. Now this file does not contain the password in plain text; it contains an encrypted password. But this file is impossible to share, so it doesn't make any sense to check it into source control. The password is encrypted on a per-user and per-machine basis, so if another user logs into this machine, or if I copy this file to a different machine, in both of those scenarios Visual Studio will not be able to decrypt the proper password for deployment, and what I would have to do then is, when I right-click and say publish, enter the deployment password. So that brings up the obvious question of where can I get the password, and how do I manage these credentials? There are a number of ways to do this, including using PowerShell, but let me show you two easy approaches.
One approach is to use a tool inside of Visual Studio: notice the Cloud Explorer. This is an extremely useful tool if you're doing Azure development with Visual Studio. If you do not have Cloud Explorer in your View menu, then I suggest you go to the Tools menu, go to Extensions and Updates, and inside of there you'll be able to search for Cloud Explorer and tell Visual Studio to install that extension. I already have the tool installed, so I'm going to open up the Cloud Explorer. Inside of here I can add as many Azure accounts as I have access to. For this we're just going to be looking at the subscription that we're using for this course, and now I can see all the resources and resource groups that are associated with that subscription. Here's the resource group that we just created, ps-aspnetmvc. Inside of here is my app service plan; I can see some information about it in the properties down here. And here is my app service itself. I can further expand this if I want to take a look at things like log files associated with the website; we'll be looking at that in more detail in the next module. I can even see files that are deployed to the server. What I want to do is right-click this and go to Download Publish Profile. This will allow me to download a file that contains all the Azure settings that Visual Studio would need. So if I wanted to create a new publishing profile from these publish settings, I can go into Visual Studio, right-click, say that I want to publish, and then come back to the profile tab here and say that I want to import those publish settings that I just downloaded. Now, what I want to point out about these publish settings is that this file, when I open it up in Visual Studio, contains both the user name and the password, unencrypted. So this is a file that you wouldn't want to check into source control.
It will also set things up so I can deploy using either Web Deploy or FTP, and if I scroll over a little bit, I just want to show you there's the deployment user name, and here's the deployment password over here, unencrypted. Now if I wanted to change those credentials, or I wanted to use the portal to get to those credentials, what I could do is open this app service in the portal. An easy way to do that from Cloud Explorer is just to right-click the app service and say Open in Portal; that's like a shortcut link that will take me directly to the blade for that specific app service. So here we are on the overview blade for that app service, and in the action buttons across the top of the blade, if I go to the More link, I can select Get Publish Profile to download the same file that we just downloaded with the Cloud Explorer. And if I wanted to invalidate any existing profiles and change the password, I could click Reset Publish Profile. This will force Azure to generate a new password for deployment. I'm not going to do that right now, but I do want to show you one more thing. Like I say, later in this course, in the last module, we will look at creating a build and an automatic deployment for our application, and we'll do that using Visual Studio Team Services. If I come here to Deployment Options for my app service, there are other sources available to deploy from. We'll look at VSTS later, but you can also have Azure synchronize the contents of your app service with a OneDrive folder, a local Git repository, a repository on GitHub or Bitbucket, a Dropbox folder, or an external repository. With an external repository, if you have an open source project, or any project with a public Git or Mercurial repository, you can tell Azure to monitor a specific branch of that repository, and every time there is a check-in, Azure can synchronize with that repository. You can deploy just by pushing a commit up to that repository.
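Both the download and the reset we just performed in the portal are also available as cmdlets. A sketch with the AzureRM module, using this demo's resource names:

```powershell
# Save the publish settings file (Web Deploy and FTP endpoints,
# deployment user name, and password) to disk
Get-AzureRmWebAppPublishingProfile -ResourceGroupName "ps-aspnetmvc" `
    -Name "ps-aspnetmvc" -OutputFile ".\ps-aspnetmvc.PublishSettings"

# Invalidate existing profiles by generating a new deployment password
Reset-AzureRmWebAppPublishingProfile -ResourceGroupName "ps-aspnetmvc" `
    -Name "ps-aspnetmvc"
```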
Now there are many other options available for my app service, many other settings. Here's how I can configure a custom domain, and here's where I can buy and install my own SSL certificate. We won't have time to cover all of these settings, but in the next module we are going to look at things like monitoring, troubleshooting, scaling out, and scaling up.


In this module we learned about App Services and how they can host different types of applications and APIs in Azure using an app service and an app service plan. Then we took this knowledge and used Visual Studio to create a simple web application, and used the publish tools in Visual Studio to push our application into the cloud, where it's now available to everyone. In the next module we're going to build on what we've learned here to explore App Services in more detail. Specifically, we're going to look at monitoring and troubleshooting our app service for those rare cases where we might create a bug in our software.

Monitoring & Scaling Web Applications and APIs


Hi, this is Scott, and in this module we will continue looking at app services in Azure, and concentrate on monitoring, configuring, and scaling app services. We will look at how to use deployment slots, set up performance alerts, view the diagnostic logs, and for those extreme cases when we can only reproduce a bug when running in Azure, we'll also look at remote debugging using Visual Studio. We're going to start by looking at the deployment slots feature of App Services.

Deployment Slots

This module is all about making sure that you have a successful, healthy web application in production. And as part of that discussion, I want to show you how to use deployment slots. A deployment slot will allow me to validate that my application is working properly in Azure before I put that application into production for my customers. I can also eliminate downtime, and give my new deployment a chance to warm up before I allow customers onto the new version. What I'm going to show you is how to add a deployment slot to an App Service. You can have more than one slot with the right App Service plan, so for example, you might want one slot for testing, one for staging, and one for production. And once I've validated that the deployment into staging is working, Azure makes it easy to swap slots. That is, I can take what is in my staging slot, and with a click of a button or a script, make my staging deployment the new production. If something goes wrong, it's very easy to swap these back and take my last good production deployment and put it into production again. Let's take a look at how this works.

Setting up and Using a Deployment Slot

What I want to do when demonstrating deployment slots is also show you how to manage application configuration, and how application configuration and deployment slots can work together. In order to do that, we're going to return to the web application that we built and deployed in the last module. And what I want to do is come into my Web.config file and create a special app setting. The app setting will just be a greeting that I display in a Razor view, so let's make the key for this setting Greeting, and the value will just be Hello, from development. Let me save that, and now let's come into the homepage for the application, the Index.cshtml view, and up here in the jumbotron, I'm going to cut out some of this text, and just replace it with the value that we read out of the Web.config file, so let me go to ConfigurationManager, and that will require me to bring in a namespace, System.Configuration, and we'll ask the configuration manager for the app setting with the key of Greeting, and just try to display the text here on the homepage. Let me run the application locally, and what we should see once the browser starts is that message, Hello, from development. That's very good, so that might represent a connection string, or mail server, or some other configuration item. And what I want to do, as I move this application from here to a staging slot, and then into production, is modify the value of that application setting at each stage, for each environment. So let's come back to the Azure portal for a second. I'm going to go to this App Service where we deploy this application. And down here on the App Service blade, I will be able to get to Application settings. And under Application settings, I can override settings that are in the Web.config file. So in addition to setting properties like what version of the .NET framework do I want to run on, I will find an App settings section. 
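The setting and the Razor change described above look roughly like this. This is a sketch; the surrounding markup is abbreviated, and the Greeting key matches the one used in the demo:

```xml
<!-- Web.config (abbreviated): the app setting read by the Razor view -->
<configuration>
  <appSettings>
    <add key="Greeting" value="Hello, from development" />
  </appSettings>
</configuration>
```

```cshtml
@* Index.cshtml (sketch): display the Greeting setting in the jumbotron.
   ConfigurationManager lives in the System.Configuration namespace. *@
@using System.Configuration
<div class="jumbotron">
    <h1>@ConfigurationManager.AppSettings["Greeting"]</h1>
</div>
```

Because Azure App Service injects its Application settings over the values in Web.config, this same line of Razor will show a different greeting in each environment without any code changes.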
Also, a Connection strings setting, so if I want to override the database connection string that's being used. But here under App settings, I'm going to add a new setting with the key of Greeting, and the value, This is production. Now I'm just going to save those settings, and let's come back to Visual Studio. What I want to do is re-deploy the application, so I will go to Publish. We're going to use the publishing profile that we set up in the last module. And this will just take a second to push the files up to Azure, and launch a browser where I should be able to see the greeting that I configured in Azure for that App Service. And now the browser has launched, and I can see, when I'm running in Azure, the message, This is production, but locally, I see Hello, from development. So now let's say I want to do some more work on my application and I want to set up a staging slot so that when I do a deployment, I have a chance to look at the application and make sure it's up and running in Azure properly. And once I feel confident that everything is working correctly, I will swap that staging slot with the production slot, which will push the application live to all my customers. First we'll need to set up the deployment slot, so back in my app service blade, up here there is an option for deployment slots. What I want to do is add a slot. I will call this slot staging and the configuration source will be the ps-aspnetmvc app service. So one way to think about a deployment slot is it's almost like having a brand new app service. And all the configuration, like the application setting that I just entered that is in my parent app service, I want to take those settings and copy them into this deployment slot. And when I say okay, Azure will create that deployment slot, which will take a little bit of time, but it looks like we have completed it already. So let me click to go to that staging slot. 
And as I said, this is just like having yet another App Service, but it's tied to the original App Service that I had, ps-aspnetmvc. This App Service has its own URL. So what Azure did was just take the slot name, which was staging, and append that into the URL so now the URL to this application is ps-aspnetmvc-staging. What I want to do is come down into the Application settings and change the value for the greeting. So I don't want the greeting to say This is production. When an application is deployed in this slot, I want it to say, This is staging. And I'm going to select the checkbox here that this is a slot setting, so it is not a setting that should get swapped when I swap staging and production. This value is sticky and it stays here with the staging deployment. Now let me click save, and now the question is how do I deploy to this staging slot? Right now, Visual Studio is configured to deploy into the production app service. Well, at the end of the last module, we talked a little bit about deployment to an app service. And I showed you that you can always come into an app service and get a publishing profile for that app service, and that's one of the easiest steps to do here. I want to get a publishing profile that I can import into Visual Studio and publish into the staging environment. So let me click that link to download the file. We will save it and now, back in Visual Studio, what I want to do is deploy into staging, so let me right-click the project and say publish. But this time, instead of going right to publish, let me back up and go to the different profiles that I have. I want to import a profile. To create the new profile, I will browse to the file that I just downloaded, which is PS-ASPnetMVC. We're going to import that. This will bring in all the settings that I need to get to the staging slot. And back in the profile settings, I can clean this up to make it very clear if I'm going to staging or production. Let's go to Manage Profiles. 
I'm going to delete the FTP deployment. I'm never going to use that. Let's rename this to production and rename this to staging. And now it's very clear in the dialog box if I'm pushing to staging or production. I want to go to staging. And I'm just going to say publish to make sure this works. Once again, it will take a second for Visual Studio to push all the files into Azure. This is a brand new deployment, so all of the content will have to be pushed in, all of the assemblies. Now the deployment is finished and the browser is launched, and I can see the application up and running in the staging slot, with the app setting saying, This is staging. Now, let's say I have an update that I want to push into production. So far, I haven't made any code changes. I just put this into the staging slot. Let's just add some text. This is the new version. I'm going to save that. We're going to publish one more time to push that into Azure and I'm going to publish into staging. And I just want to show you that in the staging environment, we have a new version of the software. And currently, over here, in the production environment, we still have the old version of software. And now that I've looked at the staging software, I feel comfortable that this is ready for production, so let me come back to the portal. What I'm just going to do is press the pipe key to close all of the blades that are open right now and come back to the dashboard. That's one of those keyboard shortcuts I told you about in the first module of this course. And now let me go to my App Services. I'm going to go to the web application that we're working with now. And at the top of the action bar here, you will see the swap button. I'm going to click swap because I'm now ready to swap my staging deployment slot with the production deployment slot. 
And this truly is a swap, because once I press okay, what is currently in production will be replaced with what is in the staging slot, and the application that is currently in production will be placed in staging. Once the swap is finished, which I should be able to watch from the notification here, we'll go back to the browser and make sure this works. Azure tells me that the operation has completed successfully, so let's go back to the browser and look at the staging environment. The staging environment still says it is staging, which is correct because that's the app setting that I want to stay in that staging slot. But I have the old version of software that used to be in production. And if I come over to production and refresh, the app setting says, yes, this is the production version. This is also the new version of my software, running on the production URL. Now, one of the great things that is happening behind the scenes here with Azure is it never has to drop a connection. It can do the swap and users will never experience any downtime, because Azure doesn't have to shut down an app service to do this swap. It just continues to let connections run as normal. But when it does the swap, it will start directing traffic to the correct app service location. So now that we've seen how to use deployment slots to make sure our application is good at deployment time, let's talk about monitoring to make sure the application continues to work in the future.
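As an aside, if your code ever needs to know which slot it is currently running in, App Service exposes the slot name through a WEBSITE_SLOT_NAME environment variable. A minimal sketch, assuming that variable is present (it is set by App Service, not available when running locally):

```csharp
using System;

public static class SlotInfo
{
    // App Service sets WEBSITE_SLOT_NAME per deployment slot
    // (e.g. "staging"); in the production slot it is "Production".
    // Locally the variable is absent, so we fall back to "local".
    public static string CurrentSlot =>
        Environment.GetEnvironmentVariable("WEBSITE_SLOT_NAME") ?? "local";
}
```

This can be handy for logging which slot served a request, though for configuration values themselves the slot-sticky Application settings shown above are the better mechanism.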


Once you have your app service out there and running, you'll want to know if it is healthy, how it is performing. Are your customers seeing any errors? We're going to focus on performance first. There are monitoring tools available in Azure to show you how many requests your app service is seeing and what the average response time is. You can even drop down to see low-level machine details like the amount of memory and CPU being used. Let me show you how to look at these statistics so you'll know what is happening inside of Azure with your application. Let me show you a few places in the Azure portal where you can get an understanding of what is happening inside of your app service. Now because the app service that we deployed doesn't see much traffic, let me bring in a different app service from a different subscription. The overview that first appears for an app service is actually one good source of information. This chart down here will show me server errors and requests over the last hour. But I can click this chart to customize it. If I go to the Edit chart action button up here, I can set the chart to show statistics for the past week. I can also get a graph of some additional HTTP status codes. But let's see what the chart looks like for a week. And this will give me a pretty good idea of the type of traffic that the application is seeing. I can see the number of server errors over the last week and that the application has seen almost half a million requests. If I want to, I can see what is happening right now on this application, which might not be too interesting because it is a holiday weekend, and here I am recording a course. But that's okay. I enjoy recording. Let me come down to the monitoring section of the App Service blade, and we'll take a quick look at live HTTP traffic. 
So between the overview chart and the live HTTP traffic chart, this will give you a good idea of the kind of load your application has seen in the past and what it's seeing right at this moment. And knowing what's happened in the past is good because you'll have a baseline of the number of requests your application typically sees. And that's important because if there is a day when your application is misbehaving, you'll want to know if this is because of a spike in requests or if there's some other reason. Now, I might need to zoom out temporarily in order to be able to close this blade, but I could've also used the hotkey to do that. And let's come back in and look at some metrics, because understanding the number of requests that are arriving, that's one number to look at. But I also need to understand how the hardware is performing under this load. So again, coming back down into the monitoring section, I can look at metrics for my application, as well as metrics for my App Service plan. That is the hardware. So first let's look at the app service here. I can actually get some performance monitoring counters. Things like processing time and I/O operations. If I go to the site metrics tab though, this can be a little more interesting, primarily because I can watch the average response time here. It's a green line that is very small. And typically, the response time for this particular application is under one second. If I scroll down a bit, I can also see the amount of CPU that is being consumed and the amount of memory that is being used. I can see for example that the average memory working set is around 160 MB, and seems to have peaked at almost 190. But all of that seems more than reasonable. Scrolling down a little bit further, I can also take a look at how much data is coming in and going out from this application. So data out peaked around 70 MB in an hour at one point today. Over here are the metrics per instance for the app service plan. 
This will contain some information about all the app services that are on this machine, and if you have multiple instances of your app service plan, you'll see different machine names across the top here. But again, this will give you an idea of how hard this machine is working, how much memory is being consumed, how much CPU is being used, and whether my application is spending time waiting for disk operations to complete. And these types of metrics can be important when you're trying to understand if the app service plan is providing adequate resources for your application. Is the CPU usage too high? Are you running out of memory? Another great source of information will be Application Insights. We'll take a look at that a little bit later in this module. But what I want to look at next is how I can proactively monitor my application and make changes to the hardware if I see a sudden traffic spike. So we will talk about alerts, and then scaling up and scaling out, in the next few clips.


One way to know if something unusual is happening with your application is to create alerts in the Azure portal. Alerts will watch a metric and can send an email if something is out of the normal range, like the average response time or CPU usage. Like I said before, you will need to know what the normal range for your application looks like, but by monitoring the numbers using the tools I showed you earlier, you can have a pretty good idea of what normal looks like for your application. Let's see how to create an alert. Inside the portal, let's set up an alert for the application that we deployed in the last module. So I'm going to go back to ps-aspnetmvc, and when this app service blade loads, I can go ahead and search for alert. It's going to be down under the monitoring section. And I'll click this to create an alert. If I had any alerts already created, I would see the list here. I would see when each alert was last active. That is, when the threshold was last exceeded. But there are no alerts yet, so let me add an alert. I'm going to have a choice here at the top. Do I want to monitor something on this specific app service, ps-aspnetmvc, with "sites" in parentheses, or do I want to monitor something about the app service plan itself, which has "serverfarms" in parentheses? And here's the difference. Suppose I wanted to set up an alert called high CPU. And I want to know when the CPU usage exceeds a certain threshold. That would be a metric. And under metrics, I could select, for the app service, the amount of CPU time consumed by that app service. Now, I'll have a little graph that will give me an indication of what the CPU time was in the past. And if I know my application well, I might be able to come up with a useful metric there. However, what I really might be interested in is how much headroom I have left on my app service plan. 
So if I go to the app service plan instead, now the metrics are slightly different because this is about the machine itself, and I can look at the total CPU percentage being used on that machine. I can see the CPU percentage runs very low, and for many web applications, that's true. There's not a lot of CPU intensive work going on. A lot of the work is just waiting for network I/O, whether it's SQL Server or HTTP. So let's go ahead and say that if the CPU percentage... And I can select greater than, or greater than or equal to, or less than, or less than or equal to. Let's say if it's over 50 percent on this machine, then something abnormal must be happening, and this threshold has to be exceeded for a particular duration. So let's say if this has been going on for over 10 minutes, then it's not just high because I'm doing a deployment. It has to be high because something odd is going on, and I want to receive an email when this happens. I could also add additional emails here. Or if I have some sort of email-to-SMS gateway, I can enter that address here and have an alert delivered to my phone. Let's select okay to create that alert, and then I want to come back and create one more that might be useful later, which will be an alert about HTTP errors. And that would apply to the app service itself, not the machine, obviously. So this time, for the resource, I'm going to pick the app service. For a name, I will say that this is monitoring server errors. I also just want to show you the event alerts. What I could do is receive an alert if the app service fails to start or fails to stop, or if someone tried to delete the app service. But what we want to monitor is any server errors. Average memory working set is also something good to monitor. But let's pick HTTP server errors. And let's say that, for this application, which has no users, if I receive more than five HTTP errors over a five minute period, then I want to be alerted about this scenario too. 
Select OK, and what we'll be able to do later in this course is generate some errors from our application, and I can show you the email I receive as well as what the alert panel will look like. But for right now, let's stay focused on performance troubleshooting. What happens when I receive this alert about high CPU, and what happens when I'm consistently getting that alert? Maybe the number of applications I have and the number of users I have, maybe they're just overwhelming the app service plan that I have selected for my application. So let's talk about changing the app service plan and also scaling up and scaling out.


When our application comes under heavy load, we don't want to turn customers away. So we can use Azure to scale our application. Azure offers what we call horizontal and vertical scaling. Vertical scaling is when I move my application up to a bigger server to handle more load, or down to a smaller server to save money. Horizontal scaling is using the same type of server, but I can scale out to add more servers and handle more load, or scale in to reduce the number of servers and save money. Let me show you how to scale using the portal. Here inside the portal, let's come back to our app service, and what I want to search for is plan, because all of the scale operations revolve around the app service plan. So one thing you might want to do, if you have multiple app services on a single app service plan, sometimes it makes sense to go to one of the app services and change its app service plan, and put it on a different app service plan. That's what I can do with this entry, change app service plan. I can take this app service, move it off the server where other app services are living, and put it on a different server. And then there's the vertical scaling and the horizontal scaling. Vertical scaling is changing the type of app service plan, so if I'm currently using the S1 standard plan, with one core and 1.7 GB of RAM, I could move to the S2 plan, double the number of cores, double the amount of RAM. Once I confirm and select the new app service plan, Azure will migrate my application to the new virtual hardware. But generally, what's more interesting is going to be horizontal scale. And should you do vertical or horizontal scaling? Some of that depends on the application that you're running and the type of workload that it has. In general, I would say favor scaling out, but try both with your application to see how it behaves. There are load testing tools that you can use with Microsoft Azure. 
We won't be able to cover them in this course, but those load testing tools will give you good benchmarks and allow you to evaluate the different strategies that you can use. So here, under scaling out, by default, when you create an app service, you get a single instance of that app service, which means you have a single virtual machine, a single server that is hosting your application. And depending on the plan that you pick, you'll be able to add additional instances. Here on the standard plan, I could run my app services on up to 10 instances. And if I were to enter that setting and click save, I would be running 10 instances all the time and paying 10 times the amount of money for that app service plan. Now if your application always has a consistent load, just setting a specific number of instances makes sense, but quite commonly, what you want to be able to do is scale out and scale in dynamically. That will ensure your application is always giving your customers the best performance, but you're not spending too much money keeping too many instances running. So in this scale by drop down, instead of saying I always want 10 instances, what I can do is say I want to scale by CPU percentage. And I've determined, based on my analysis of cost and load, that I want to run between, let's say, two and five instances. I always want two to have some redundancy out there. If one particular server fails, because of a hardware problem let's say, which is rare, but it does happen, then there at least will already be another server up and running that can handle the requests. And now I can set a target range for the CPU. So at the high end, I can say if the CPU usage exceeds, let's say, 70 percent, that's when Azure will automatically scale out for me. So if I'm sitting at two instances, and the CPU percentage is running at 80 percent, Azure will add another instance to give me three, and that will continue up until I get to five instances. 
And in the lower bar here, I can say if the CPU usage percentage is below, let's say, 40%, what Azure can do for me now is scale back in. And now, if I were to go in and click save, those autoscale settings will be in place. Azure will go ahead and spin up another instance because I've said always run between two and five instances, and then, because this application has no users, it should always stay at two. Notice, I can also send out an email when these scale actions happen; whenever there is a scale out or a scale in, I can be notified by email. Now the third setting in this dropdown list is the most advanced setting. Here, I want to schedule my scale using some performance rules. With the performance rules, I can have multiple profiles. A profile basically determines what the target range or the number of instances is that I want to run. And I can set this up with schedules, so I can say something like on... Let's start with weekends. On weekends, I always want the target range between one and three. And to set this just for weekends, I would come to the Recurrence tab here and go in and basically unselect the weekdays. So I'm telling Azure on Saturdays and Sundays, my profile is to have one to three instances. And I could also come back and provide the setting for weekdays, and maybe the instance range is higher there because more people are using my application. I could say between five and 10, for example. I can also set a fixed date, so if I know a lot of people visit the site on Christmas, or on a popular shopping day, I can go ahead and say, on that day, just keep 10 instances running all the time. Let's do something simple. I'm just going to call this scale profile. And what we're going to say is I always want between one and three instances. And click okay. Once I have a profile... And again, I can have multiple profiles that operate, one on weekends, one on weekdays, one on a special day. 
But once I have a profile, I can start adding rules for Azure to evaluate to determine the correct number of instances. So I could say over here, for this rule, for example, that when the CPU percentage is greater than, let's say, 70, over a period of 10 minutes, so the average CPU percentage over the last 10 minutes has been over 70 percent, then what I want to do is increase the instance count by one, and then wait at least five minutes to determine what happens next. Let me add that rule. Now once I have a scale out rule, I'm also going to need a rule that allows Azure to scale back in. Otherwise, I'd always be running at three instances. So you'll typically have a pair of rules for every scale action. This one is CPU percentage over 70 percent. Now I want to pick CPU percentage less than, let's say, 50 percent, and then what I would want to do is decrease the count by one. So the alerts that we looked at are great; they can tell me if a server has come under sudden load. But these scale settings are even nicer, because when my server does come under load, I don't have to log in to the Azure portal and figure out how to add more instances. Azure can just take care of that for me. And I can do that based on CPU percentage on the app service plan. I can also do that on memory percentage, data in, data out. Those are the metrics that are available. You can also look at other resources. For example, a service bus queue. If you have an application running in App Services that does a lot of background processing and it pulls all of its work off of a service bus queue, you can set up a scaling rule to look at the number of messages being delivered, and scale out or scale in the number of instances as appropriate. Let me go ahead and save this scale out plan. And now, let's start turning our attention from performance to diagnosing and debugging problems in the web application.

Debugging and Application Insights

Azure has a number of tools available that make debugging problems in your application just as easy to do in the cloud as it is on a local server or on your development machine. In this section, I want to show you how to grab diagnostic logs from Azure, as well as see telemetry and exception information from Application Insights. We're also going to see how to use some of the advanced tools in Azure that put us right there on the server, as well as how to launch a remote debugging session. When I create a new app service in Azure, the diagnostic logs are not enabled by default. But I can turn them on pretty easily. I'm going to come into the app service blade. I'm going to search for diagnostic logs, and it's inside of here where I can turn on application logging. I can have the logs placed on the file system or into blob storage. The caveat with the file system is that this setting is turned off after 12 hours to avoid filling up the file system, whereas blob storage can continue indefinitely. I can also turn on web server logging with a Windows App Service. This is your traditional IIS. I can turn on detailed error messages, which will override the setting in Web.config that shows friendly error pages. And I can also turn on failed request tracing. If you've ever used that feature in IIS, then you'll know it dumps out a tremendous amount of information about a failed request, and it can be very helpful if you're trying to track down a problem with modules and handlers in the IIS configuration. Now once these logs are enabled, I can download these logs using FTP, but I'll show you a couple of other tricks that you can use to view these logs. Now while I'm in here, turning on diagnostic information, I'm also going to come over and turn on Application Insights. This is of tremendous value when you're trying to understand the performance characteristics of your application and see where errors are occurring. 
Application Insights can work not only with .NET, but also with Node.js and JavaScript, so you can know what's happening in your customers' browsers. Are they experiencing JavaScript errors? And it also works with mobile applications and desktop applications. There's an SDK that you can use to send telemetry data to your Application Insights resource. I do need to create this as a new resource. It is not free. I'll show you the pricing information in just a minute. But I'm going to call this ps-aspnetmvc-AI. It's my Application Insights resource. And this Application Insights resource works best when you instrument your application and add the SDK for Application Insights into your application. I can show you how to do that. It's very simple. When I have an ASP.NET application in Visual Studio, I can just right-click the project and say that I want to add Application Insights telemetry. When I click that button, I'll be given some information about what Application Insights can do. I want to start using this for free. You can see that the base monthly price is free. That would give you one gigabyte of telemetry data per month. After that, you'll be paying a little more than a couple dollars per gigabyte, and the data retention is 90 days. The service is well worth it when you're trying to understand how your application is behaving. For the resource, I want to select the resource that I've just created in the portal. So ps-aspnetmvc-ai. And now I will click register. What this is going to do is go out and add a few NuGet package references to my application, so you can see the assembly references being added there. It will also add an Application Insights configuration file to my project. And if I scroll down in this configuration file, the most important part here is the instrumentation key. 
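For reference, the generated ApplicationInsights.config carries the key in an element like the one below. This is an abbreviated sketch, and the GUID is a placeholder, not a real key:

```xml
<!-- ApplicationInsights.config (abbreviated); the GUID is a placeholder -->
<ApplicationInsights xmlns="http://schemas.microsoft.com/ApplicationInsights/2013/Settings">
  <InstrumentationKey>00000000-0000-0000-0000-000000000000</InstrumentationKey>
</ApplicationInsights>
```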
With Application Insights, I could have multiple applications delivering telemetry data to my single resource, and it is this key that helps Application Insights identify this particular application and distinguish it from others. So the SDK has been added. My application has been registered with the Application Insights resource. It's going to collect performance data and exception data. And I can also enable trace collection and listen to everything that is being sent through System.Diagnostics, but that can be a tremendous amount of information. I don't need that unless I'm really trying to find something that's wrong. I'm going to leave that off for now. And now, what we can do is give our application something to diagnose. So let me create another controller action. I will just call it test. And inside of test, we will throw a new... Let's throw an InvalidOperationException: This feature is not implemented. I'm going to start by trying to run this application locally, just to make sure it is behaving as expected. And if it is, we can send it into Azure. There's the homepage that seems to work well. And we also go to slash-home-slash-test. That generates an exception as expected. And on a development machine, this shows a stack trace. But that shouldn't happen in Azure, because we have friendly error pages turned on for our release build. Let's try it. Back in the project, let me go to publishing this application. And I'm just going to send it straight into production. That will save me time in this demo, because I won't have to swap the staging and production slots on this app service. The publishing is finished. Let's see if we can look at the homepage in Azure, and the homepage seems to load just fine. Let's also go to slash-home-slash-test. And there I can see the friendly error page. No stack trace available. How do I know what's going wrong? Well, that's where I can start using Application Insights and some of the diagnostic logs. 
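The failing action added in the demo is just a few lines. A sketch of the addition to the HomeController (the rest of the controller's actions are omitted here):

```csharp
using System;
using System.Web.Mvc;

public class HomeController : Controller
{
    // Demo action from the transcript: it always fails, so we have
    // an exception to find in the logs and in Application Insights.
    public ActionResult Test()
    {
        throw new InvalidOperationException("This feature is not implemented");
    }
}
```

Locally this surfaces as the yellow ASP.NET error page with a stack trace; in Azure, with friendly errors configured for the release build, the customer only sees a generic error page, which is exactly why the diagnostic tools that follow matter.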
So, inside of the portal, I can already see here in the live stream that Application Insights has seen those two requests, and it's seen the one failure. But I just want to show you a couple of different ways to get to the diagnostic logs that Azure itself produces. So under Monitoring, I can connect to a log stream. This will allow me to connect to the application logs or the web server logs, and as requests are coming in to my web server, I can actually see the web server logs scroll by. It can take a few moments to connect, but here we can see that it's picked up the last two requests that actually happened a little bit ago. And there is some delay. It's not completely real-time. And this is one way just to get a feel for what is happening on the web server right at this moment. But now, what if I want to download some of these log files that I'm producing? One way to do that would be with a tool that I'll show you in the next clip. And another way to do this is to use that Cloud Explorer tool that we talked about earlier. With Cloud Explorer, I can come to my app service, and one of the options here will be to view the different log files that are associated with my application. So the log files in this particular folder, this will be the failed request tracing that IIS provides. I can double-click one of these files to download it. Inside of here, I will see all the information that IIS provides for a failed request. It's quite a good amount. Under log files, I can find the event log for my application. Under HTTP, I can find the raw IIS logs if I want to look at those. So this is IIS 8. And this is your usual log file recording. The date, time, site name, the method that's being used. Get, put, post, delete. All of that information will be in here for every single request. But if I really want to understand what is happening at an application level, and not just at the web server level, that's where Application Insights can be useful. 
So let's come into Application Insights, and let me actually come back to the browser for a minute on the error page here, and send off a couple errors. I'm even going to open the developer tools here in Chrome, and press shift-refresh to make sure there's no caching going on. And back here in Application Insights, I can see that there are failures occurring. So let me scroll down to the bottom of this blade and say, I want to view more in Application Insights. And it will be inside of this blade where I can give you just a small overview of everything that's available inside of Application Insights. I will be able to drill into things like the server response time for different requests that have arrived. And let's just go straight into the heart of the matter here. I'm trying to diagnose some failures. So I want to click on the failures link here. I will see some graphical depictions of how many failures I've had. And scrolling down a bit, I will be able to see a total of failed requests by operation name. So here, home-slash-test, I can see that it's generated a couple of failed requests. In fact, it has generated a System.InvalidOperationException, exactly as expected. If I click on one of these entries, I can start to drill down even a little more and see not only when these exceptions happened, but I can click on the one that happened at 7:52 PM, and see exactly what device this happened on if I have an app service plan with multiple instances. I can show telemetry data five minutes before and after this event if I think this is correlated with something else that happened in the application. And most importantly, I can see a call stack here. I know this happened inside of the HomeController Test method. So now I have something to go on and to go back to my project with, and try to find why I'm getting an exception inside of that method.

Kudu and Remote Debugging

If you have a problem with your app service, the diagnostic logs don't provide enough information, and Application Insights, even though it provides a great amount of information, isn't helping, then there are still some tools you can use to try to figure out what the problem is. What I want to do is come into this app service, and I'll show you one of the entries that is under Development Tools, and that is the advanced tools. What I'm going to do is select Advanced Tools and then follow the Go link that appears over here. This is going to take me to the Project Kudu website for my app service. Notice the URL here is the same as my app service, just with a .scm in the middle. So your customers won't be able to come to this website. You have to have permissions in the portal to be able to view this page. But on this page I can see various links to information about the environment that my app service is operating in. You also have the ability here, under the Tools menu, to ask for a diagnostic dump. This website will collect all of the log files together for me and offer them to me as a single zip file, so I could download that and analyze the log files later. There's also a Process Explorer link, if I want to see what is running on the web server. It's here where I will see W3WP.exe, the IIS worker process. The first one is for the Project Kudu website, while the second one is the one hosting my application. I can see the amount of CPU time it is consuming and the amount of memory it is using. I can go to Properties to see even more information about this particular process, including information like what the environment variables are for this process. And finally, on this website, I want to show you, under Debug Console, you can open up a simple command prompt or PowerShell prompt on the server. So there's two pieces to this UI. At the top, there's an explorer-like view of the file system. I can go into log files. 
There's all the log files that we saw earlier. The IIS log files. The failed request log files. I can click the download button. That will download the entire directory of files for me, as a zip file. I can also edit files. So if I wanted to, for some reason, edit the web.config that is deployed into Azure, then I would need to go to the site folder. Inside of the site folder, my app service is deployed into the wwwroot folder. If I go in there and scroll down, there's the web.config file that I deployed into Azure. I can click the pencil, make changes here, then click save. Even if I'm running multiple instances of my app service across multiple virtual machines, that web.config file would be replicated everywhere. And down here at the bottom, if I just want to poke around the file system, I have a command prompt where I can get a directory listing. I can change directories. I can make new directories. And this can be a very useful tool when you just need to get your hands on the server and figure out what is happening there. Now the ultimate thing to do, when nothing else will work, is remote debugging. This is something I would only do if I had a problem that I just could not reproduce locally. Under the application settings for my app service, there is a button, which is the remote debugging flag here. I could set that to on and say that I want to remote debug with Visual Studio 2015. I can also have this feature automatically turned on for me using the Cloud Explorer window that we've looked at a few times. If I had the proper permissions, I could right-click on my app service, and in addition to opening the portal or opening in Kudu, there's also the option to attach a debugger. So let's go ahead and do that. The first thing the Cloud Explorer will do is go in and make API calls to make sure that that button we just looked at is set to on, to enable remote debugging. 
The setup process here can sometimes take a couple seconds, but it looks like we're attached. It's going to launch a browser to make sure there is a request coming into the website. And now I'm going to get a warning that my application was essentially deployed in release mode. If I wanted to make debugging easier, I would deploy a debug version of my application. Now it is possible to debug with the release version, but what I have to do is disable the Just My Code feature in Visual Studio. So let me select that option. What I want to do is come into the home controller and set a breakpoint there and see if we can hit the breakpoint. So in Solution Explorer, let me open up the home controller. Let's set a breakpoint here on the Test action. And now let's come out to the application that is running on Azure and try to go to slash-home-slash-test. Flip back to Visual Studio, and I can see I am on that breakpoint. So again, I would only do this in extreme cases where I cannot reproduce a bug locally. But in those cases, it is nice to have this feature where I can just attach a debugger to an app service that is running in Azure. And again, to make that process a little bit easier, what I could do is publish my web application again, but this time, go into settings and set the configuration to debug. That can make your debugging experience easier many times.


In this module, we looked at deployment slots, scaling vertically, scaling horizontally, and how to set up auto-scaling. We also set up alerts in Azure and took a look at how to enable diagnostic logs and even attach a Visual Studio debugger to a web application running in Azure. This will give you a good start in understanding the diagnostic and monitoring capabilities in Azure, and I hope it will help you solve any problems you might encounter when you deploy an application. In the next module, we'll continue creating new resources in Azure as we give our application the ability to store data in a cloud database or two.

Using Cloud Databases


Hi, this is Scott, and this module of the course is about databases, databases in the cloud. If our application needs to store data in a reliable fashion, and if our application needs to query and aggregate data to show to our customers, then we'll need to set up a database. You can use nearly any type of database in Azure. We've already talked about infrastructure as a service with Azure, IaaS. And if we set up our own infrastructure, our own virtual machines, we can run any database that works on Windows or Linux. In this module, though, we're going to stick to using the platform features of Azure, or PaaS solutions. And I'll show you how to create and configure two different types of cloud databases for your applications. Let's get started by talking about a couple of the database platform offerings in Azure.

Databases in the Cloud

We are going to look at two different and complementary database offerings. The first database I'll show you is the Azure SQL Database offering. As a .NET developer, you might already be familiar with Microsoft SQL Server, and Azure SQL is just SQL Server in the cloud. From a developer's perspective, Azure SQL is not much different from the Microsoft SQL Server that you can install and run on your own machine. We use the same tools and frameworks to query and insert data. There are still tables and columns and indexes and stored procedures. Azure SQL can feel a little bit different to DBAs, because Azure SQL is a platform. It takes care of all the infrastructure management that SQL DBAs usually have to worry about. Infrastructure like what disks will hold the data files and log files. Because Azure SQL is just like SQL Server, you'll see that this is easy to use, especially if you already know SQL, and because the infrastructure is managed for us, we'll see that Azure SQL Databases are easy to scale. The other type of database that we will look at is a NoSQL database, known as DocumentDB. As a NoSQL database, DocumentDB doesn't have tables with columns inside. Instead, DocumentDB allows you to store documents, documents in JSON format. There is no database schema in DocumentDB, and this makes the databases incredibly flexible. There are, however, still indexes for fast queries and transactions for consistency. So DocumentDB can be fast and reliable and scale up to serve customers all around the planet. Let's start by digging into Azure SQL.
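To make the "documents in JSON format" idea concrete, here is what a hypothetical course document might look like in DocumentDB. Every field except `id` (which DocumentDB requires) is made up for illustration; another document in the same collection could have a completely different shape:

```json
{
  "id": "developing-dotnet-azure-getting-started",
  "title": "Developing with .NET on Microsoft Azure - Getting Started",
  "author": "Scott Allen",
  "durationMinutes": 252,
  "tags": [ "azure", ".net", "asp.net" ]
}
```

DocumentDB indexes the properties of documents like this automatically, so they can be queried without defining a schema first.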

Setting up an Azure SQL Database

The ASP.NET MVC application that we deployed earlier in this course needs a SQL Server database. I know this because if I come to the Register page and try to register myself as a user, this application is programmed to save that information into a SQL Server database. So if I put in a sufficiently complex password and click Register, it's going to take a little bit of time, but eventually the application is going to time out trying to reach a SQL Server and throw an error. And there is the error. Now if I were to go into Application Insights and look at the exception that was recorded for this request, I would see that the application failed because we could not reach a SQL Server. So let's go into the Azure Portal and create a SQL Server, and then configure our application to use this server and make that registration page work. So what I'll need to do is come over into the hub, and I want to create a new Azure SQL database, so I will select that option. Currently I have no databases, so let's go to Add a new SQL database, and as usual in the Azure Portal, I'll need to fill out some configuration items. Some of these are common across all Azure resources. I need to give my database a name. Let me call it ps-asp.netmvcdb. I do want it in my Pluralsight subscription. Again we have the concept of a resource group. Since this is a SQL database specific to this ps-asp.netmvc application, I want to place this database into the same resource group as my application. I can also select a source for this database. I'm going to start with a blank database, but you could also create a sample database based on Microsoft AdventureWorks, or create the database from a backup. The next thing I need to do is configure the server that will host this database. But when I say host, I don't mean host in the traditional sense that you might be accustomed to thinking of databases and how they relate to a specific SQL Server. 
Let's pause here and talk about SQL database servers in Azure.

Setting up an Azure SQL Server

Every Azure SQL Database that we create will need to be associated with an Azure SQL server. But don't think of the server as a physical server or even a virtual server. In Azure, a SQL server is really just a name that you give to a logical container for one or more databases. This is not like installing SQL Server 2014 locally and having an instance of SQL Server running to host multiple databases on a single machine. An Azure SQL server really is just a logical container, and the databases that we associate with that SQL server may or may not live on the same physical or virtual server. We don't know, that's all infrastructure that Microsoft worries about. But we do need this logical server, because all the tools and frameworks and applications expect to connect to a SQL Server and then to a database inside of that server. So this logical server gives us a name that we can use to group our databases, and it allows us to create an admin login that we can use to manage the databases collected inside that server. And now that we understand that Azure SQL servers are just a little bit different than our local SQL Servers, let's go back to the portal and continue our setup. Back in the Azure Portal, let's give our server a name. Now notice this name is going to be checked to make sure that it is unique in the database.windows.net domain, because this server will be addressable over the internet, but only if you allow it. By default, no connections will reach the SQL server unless you explicitly allow those connections to reach it. So let's see if I can get away with ps-server-1. That seems like a good name. Now I need to enter my server admin login. I'll make that sallen, and I will enter a sufficiently complex password that will pass the validation rules. And almost like a virtual server, I get to pick a region or a location. But again, a server in Azure SQL is just a logical container, and this just means that all my databases will be in South Central US. 
That's the same location as my web app, which is good. So I will say yes, select this. And now the next question is, do I want to use a SQL elastic pool? If I'm only creating a single database, I don't need an elastic pool. But if you need to create and manage multiple databases, you might want to research and consider using an elastic pool. Typically, using an elastic pool will allow me to save money in purchasing the resources that I need to make my databases perform well. And speaking of purchases, it's now time to select the pricing tier. The pricing tier options are a little bit different than what we've seen in the past. So let's take a step back and talk about this.

Selecting DTUs and Azure SQL Plans

Previously in the course, when we selected the size of our virtual machine, or the size of our App Service plan, we were given different options based on the number of CPUs and the amount of memory. Azure SQL works a bit differently. Yes, there are still different plans with different pricing options, but now the important numbers are not CPUs and memory, but database size and the number of DTUs. A DTU is a Database Transaction Unit. That's a relative measure of just how much power is available behind the database. The basic plan we are looking at here offers five DTUs, which is generally more than enough for a testing or development machine. In other words, not many concurrent users. Notice the maximum database size for a database with this basic plan is 2 gigabytes. If you want to run a production website with users, you probably want to start with a pricing plan from the standard tier. Here the DTUs go from 10 to 100. So the S3 plan on the right, that's roughly 10 times more powerful than the S0 plan on the left. So the obvious question here is, how many DTUs do I need? Well, it depends on the application and how heavily the application uses the database. In my experience, for websites with 25,000 page views a day, where each page view generates two or three queries, the S0 plan is actually more than adequate for that type of site. And Azure provides you all the tools that you need to monitor your DTU usage, so you will know if the plan you selected is out of headroom. And just like we learned with App Services, we can set up alerts to let us know if we're in trouble. Now notice the database size in this standard plan goes up to 250 gigabytes. If you need more power and larger databases, there is also a set of premium pricing plans. The premium plans that are offered through the portal currently go all the way to 4,000 DTUs and 1 terabyte databases. 4,000 DTUs is roughly 400 times the number of DTUs available with the lowest standard plan. 
Now knowing that the application I created is still unknown in the world, I'm going to be able to select the basic plan and save money, but I'll also be able to watch the portal and set up alerts and know when to scale my database up. Now I do want to point out that these pricing plans are not per server, they are per database. So for each database that I attach to my server, and each database that I create in Azure, I select one of these pricing plans. And again, if you do have multiple databases, you might want to look at the elastic pool, which allows you to purchase one of these plans and share it across multiple databases. For right now, let's go back and finish the setup for our single database. Back in the portal, let's go ahead and choose our pricing strategy, which for this database I'm just going to use a basic plan. And once I click Select and then Create, Azure will go out and provision my database. While that is happening, let me go ahead and select an existing database and just give you a brief tour of the features that are available in this blade. The SQL database blade is very similar to the App Service blade and the Virtual Machine blade that we've seen earlier in the course. It contains an Overview section. In the overview blade there will be some action buttons along the top for common tasks. So if I want to restore this database from a backup, or export the database to take a snapshot and save it at some point in time, these are easy buttons that allow that access. There's also a firewall in front of my database. By default, when you create a SQL database, Azure will allow other Azure services to access this database, but no connections are allowed from outside the datacenter. You can change that by adding client IP addresses that you want to allow access from. And you can do that by using this firewall settings blade. I will also show you another technique that you can use a little bit later in this module. 
Here in the overview, just like with App Services, I can see a graph that shows my resource utilization. In this case, it's expressed as DTU percentage. I can see by this graph I have a lot of headroom, so if anything, I might be paying more for the service plan for this particular SQL database than I need to. If I wanted to change my plan, I could click on the pricing tier, select a new pricing tier, click OK, and I'll instantly have more power, or less power and be saving money. I can also replicate my SQL database into other Azure datacenters. All I need to do is select those datacenters here. There's some built-in threat protection. There's also transparent data encryption. So if your application needs to be in compliance with some regulations that require your data to be encrypted when the data is at rest or in storage, then you just need to come in here and turn on the data encryption. It is transparent. There's nothing your application needs to do to encrypt or decrypt the data. Now for figuring out your pricing tier, or to know when your database is being overloaded, just like with App Services, there are alerts that you can set up. And I can go in and add an alert on several different metrics. Those metrics include failed connections, DTU percentage, and total database size. Just like with App Services, I can select a metric, provide a condition and a threshold and a period, and then allow Azure to email me if something goes wrong. I'm going to cancel that particular alert rule. I can also see the database size here, and then finally a performance overview. This performance overview will give me a good idea about how effectively I'm using the resources that I have assigned to this database, and will provide me a chart. And I just want to show you quickly, if I click on this chart, what I'll do is go to the Query Performance Insight. I'm just going directly to that blade. 
On this blade I can see things like the most expensive queries that I'm running, or the longest running queries that are happening inside of my database. If I scroll down in this blade, I will see there is a query here that has consumed nearly 50 seconds of time over 7,500 executions. I can actually click that line, and Azure can show me the exact SQL query that is being issued against my database. And a little further down I can see roughly how long that query took to execute over the last few times it has executed. So these are averages, but I can see that query is sometimes averaging almost three seconds. Perhaps I need to put an index in place. I can also, in the portal here, turn on automatic tuning, which will allow Azure to monitor my database and automatically create or drop indexes for me. And with that brief tour, our new database should have been created. Let me click Refresh, and I can see it is here. So let me turn off the other subscription, and let's find out how to wire up our application so it can use this new database.

Connecting and Querying Azure SQL

We want our application to talk to the new database that we created, and I'll show you in a bit how you can click on a link in the portal to show the connection strings for your database with all of the settings, including encryption, except the connection strings will not show a username or password for the database. We need to fill those in. And that leads to the question, what username and password do we use? Now the username and password we entered when creating the database server, that is an administrative login. And the admin login has control over the entire server and all the databases inside. So that's probably not the login we want to use from our application. We want a more restricted, custom login for the application. The Azure Portal doesn't provide any tools to manage our database at this level, so we will have to turn to other tools, and any tool that connects to SQL Server will be a tool that we can use, so tools like Visual Studio, as well as SQL Server Management Studio, which is a free download. Let's see how we can log in to the database and create a new user for the application. Inside the Azure Portal I'm going to open up the SQL database that we created, and in this blade I can see things like the server name, and also those database connection strings. I can take this connection string and copy it out of the browser. Then I just need to fill in the username and password. We'll come back and do that later, once we have a username and password for the application. The question is, how can I connect to the server to create a login and a user for the application? I'm going to hover over this server name, and there's a little icon that will appear to the right. I can click that icon to copy the server name, and now I'm going to go into Visual Studio. Inside of Visual Studio I can connect to that database from the SQL Server Object Explorer. All I need to do is right-click and say that I want to add a SQL Server. 
In Visual Studio, this dialog is nice because if I have signed into Azure with my credentials, then I will see a list of the databases that I can connect to given this account and this subscription. All I would need to do is select that database, fill in the password that we entered in the portal when we created the server, and now I'd be able to browse the schema and execute queries against that database. I'm going to use a different tool just for this demonstration, which is SQL Server Management Studio. This tool still has some advantages over the SQL Server Object Explorer in Visual Studio and provides a few more nice tools. What I want to do is connect to a Database Engine. I'm going to paste in the name of the server that I copied from the portal. Authentication is going to be SQL Server Authentication. There is no Windows Authentication in Azure SQL, by the way. And now I need to fill out the username and the password that I created for the admin login in the portal. I'll tell SQL Server to remember this password, and now I want to connect. Now as I told you before, by default your SQL database will not be reachable by anything outside of Azure. And this is SQL Server Management Studio telling me that the firewall is blocking this connection. But seeing how this is my account and my subscription and my database, I can sign into Azure and have my IP address automatically whitelisted in the firewall. All I need to do is enter my Azure credentials properly, and that will include my password and my two-factor authentication code, 820512. I sign in frequently. Click Submit. I will go ahead and allow my current IP address in the firewall. I would be able to see that setting in the portal and delete it later or change it later if I wanted to. Inside the portal you can add a range of IP addresses, but I am now connected to my Azure SQL server. I should be able to see the application database in the list of databases, and I can. Now what I want to do is create a new login. 
And the template here will allow me to fill in some parameters to create the login. Let's create a login called webappuser, with a password that I have generated and will copy and paste in here. Let's execute that query, and now I need to add a user to my database for that login. So a New User, where I just need to fill in some parameters. So create a user, webappuser. I want that to be for the login webappuser, and we will leave off the default schema. I also need to place my user into some roles. So we could select db_owner, or I could select db_datareader for my user, webappuser, and I could also select a couple other roles. So let's do db_datareader, let's do db_datawriter, and let's do db_ddladmin, which will allow my application to make schema changes in the database, which is something that will probably need to happen in the future as I add new features to the application. And let me make sure that I use ddladmin here. Let's execute this command against the database, and that command completed successfully. So now I have a user and a password that I can use for my application to log in to the database. Now there's many ways to get the connection string for this Azure SQL Database into the App Service that is running in production. Since this course is targeted at .NET developers who have already been using ASP.NET, you probably know you have the ability to create a release version of your web.config, and this release version that would be used in a release build, this is where I could add a database connection string that could point to my Azure SQL Database. There's another way I could do this. I'm going to come back to the portal, I'm going to show the database connection strings, and I'm going to copy this database connection string out of the portal and now come over to my App Service and add the connection string here in the Azure Portal. 
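Putting those template queries together, the statements executed against the server look roughly like this. The password is a placeholder for the generated one; in Azure SQL the CREATE LOGIN statement runs against the master database, while the CREATE USER and role statements run against the application database:

```sql
-- Against master: create a server-level login.
CREATE LOGIN webappuser WITH PASSWORD = '<strong-generated-password>';

-- Against the application database: map a database user to that login.
CREATE USER webappuser FOR LOGIN webappuser;

-- Grant read, write, and schema-change permissions.
ALTER ROLE db_datareader ADD MEMBER webappuser;
ALTER ROLE db_datawriter ADD MEMBER webappuser;
ALTER ROLE db_ddladmin ADD MEMBER webappuser;
```

The db_ddladmin membership is what lets Entity Framework migrations alter the schema later without using the admin login.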
That's so that I don't have to add the connection string to a configuration file that goes into source control. And that makes the connection string just a little bit safer. Let's go into the App Service for the application that's trying to connect. Under the settings I'm going to look for connection strings. They will appear under Application settings. And here, when I scroll down, I will see where I can override connection string settings that might be in the web.config file of the deployed application. First I need to know the name of the connection string. So coming back into my web.config file, I can see my application is programmed to use a connection string name of DefaultConnection. So let me come back to the portal. I'm going to enter a name of DefaultConnection, and I'm going to paste in the value of the connection string that I copied out of the portal. Now I just need to come through and find the placeholder for username and password. So we will find that here. Your username, this is going to be webappuser, and the password will be that strong password that I autogenerated. Let me copy that off the screen, come in here, and paste it into your password. I probably want to set this as a Slot setting, so this connection string is only in effect for the production slot. If I want my staging slot to use the same database as production, I would uncheck this box. As it stands now, with that selected, I could set up a different database or a different login for the staging slot. Well, let's go ahead and save our application settings and come back to the application, which was currently erroring because there was no database to save the user information into. And now let's try to register again. I will enter an email address and a complex password, and I will click Register. And it looks like the user was successfully registered, and I've been logged into the application. How can I verify that that data is actually in the database? 
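For reference, this is the shape of the web.config entry that the portal's DefaultConnection setting overrides at runtime. The server and database names follow the ones used in this demo, and the password is deliberately a placeholder, since the real value lives only in the Azure Portal:

```xml
<connectionStrings>
  <add name="DefaultConnection"
       connectionString="Server=tcp:ps-server-1.database.windows.net,1433;Initial Catalog=ps-asp.netmvcdb;User ID=webappuser;Password={your-password};Encrypt=True;Connection Timeout=30;"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

Because App Service connection strings win over web.config values with the same name, the file checked into source control never needs the production credentials.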
Well, let's come back to SQL Server Management Studio. If I go into the Tables for this database, I can see the application has automatically created a schema using the Entity Framework for the ASP.NET Identity framework. And that schema includes an AspNetUsers table. Let's select the first 1,000 rows. We should only see one. And indeed, there's a single user in here for the user I registered, and my ASP.NET MVC application, which is using the Entity Framework to talk to SQL Server, is now successfully using an Azure SQL Database. And everything from this point with Azure SQL is pretty easy. If I'm using the Entity Framework and doing code-first development, I can have the Entity Framework manage my migrations and make changes to this database to add additional tables and columns to tables. If I'm working with a DBA, they can use a Visual Studio database project or somehow manage the schema and deploy changes to this production database using the tools that they're familiar with. And inside of my ASP.NET application, I can use any data access technology that I like. I could use low-level ADO.NET. I could use the Entity Framework. I could also use a lighter weight data access technology like Dapper. So now that we have SQL Server working and configured, and we've seen how easy it is from this point forward, let's turn our attention to that other database, the NoSQL database, DocumentDB.
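As a sketch of the "any data access technology" point, here is the same verification query done from low-level ADO.NET, assuming the DefaultConnection string is in config and the default ASP.NET Identity table name; the `UserQueries` class is hypothetical:

```csharp
using System.Configuration;
using System.Data.SqlClient;

public static class UserQueries
{
    public static int CountUsers()
    {
        // Read the same DefaultConnection string the rest of the app uses,
        // which in Azure is overridden by the App Service setting.
        var connectionString = ConfigurationManager
            .ConnectionStrings["DefaultConnection"].ConnectionString;

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT COUNT(*) FROM dbo.AspNetUsers", connection))
        {
            connection.Open();
            return (int)command.ExecuteScalar();
        }
    }
}
```

After registering one user, this would return 1, matching what the SELECT in Management Studio showed.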

Creating a DocumentDB

The first step in using DocumentDB is to create a new DocumentDB resource. We'll go through many of the same steps as we have before, filling out a name, a location, a resource group, and of course, anything that you do in the portal, anything at all, can also be done in an automated fashion using command line tools like PowerShell. Even adding the firewall rule that we added in the last module to connect to SQL Server from Management Studio, that step can be automated through PowerShell or C# code or Ruby. I also want to point out that Microsoft offers a free DocumentDB emulator that you can use locally for testing without incurring any Azure DocumentDB prices. But let's go ahead and create a DocumentDB in Azure for our existing application to use. In the portal, let's go into the hub and select DocumentDB. Remember, if this doesn't appear for you here, you can always go to the new button and search for DocumentDB. But once we have this blade open, we'll see we have no document databases, so let's go and add one. At this point we'll fill out many of the same items that we fill out for other Azure resources. First we have to come up with a name. This has to be a unique name in the domain. Let's see if we can get away with ps-docdb. That looks good. Notice that if we're porting an application from MongoDB, which is a popular NoSQL database in some circles, there is the option to give our DocumentDB Mongo compatibility. So we could move a Mongo-based app to Azure and easily use DocumentDB. For this example I'll stick with the DocumentDB selection. As always I have to select a subscription. I have to place this resource into a resource group. I'll select the existing resource group for the application, and I have to select a location. I'll leave this as South Central US so it is close to my application, but with DocumentDB, much like SQL databases, you can replicate the data all around the world.
Now one thing you'll notice is that unlike virtual machines and App Services and SQL databases, I didn't have to select a pricing plan at the start. That's because all we've created so far is what we call a DocumentDB account. And I don't pay anything for an account, but once I have an account I can create one or more databases on my DocumentDB account. And in each database I can have one or more collections. It is the collection where I will select a pricing plan. So again, the hierarchy is that a DocumentDB account contains databases and a database contains collections. It is the collections that will hold my documents. And because collections don't use a schema, I can store any type of document in a collection. Don't fall into the trap that many relational database users fall into, which is to think of a collection as a table. A collection can hold different types of documents with different shapes and different sizes. DocumentDB will take care of storing and indexing them all. I can have documents with accounting information alongside documents with payroll information, all in the same collection, and that's fine. When and where you create a new collection really depends on the application that you're writing, how you're going to access the collection, how much data you're going to store, and how much performance you need. Once the DocumentDB account is ready, I can come into the DocumentDB blade and see the typical Azure overview blade. This shows me where my account is active and also some monitoring statistics, which are, not surprisingly, currently empty. Now over here in the settings, let's go down to the collections and browse the collections I have. Currently no collections, no databases. I want to add a collection so I have a place to store documents. First I need to enter a Collection Id. Let's call this courses. What I plan to do is show you a demo where I store Pluralsight course related information in a DocumentDB.
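The same account, database, and collection hierarchy can also be created from code. Here is a minimal sketch using the Microsoft.Azure.DocumentDB SDK; the endpoint URI and key are placeholders for the values in your own portal, and the "...IfNotExists" methods shown are from later 1.x versions of the SDK.

```csharp
using System;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

// The account already exists; databases and collections hang off of it.
var client = new DocumentClient(
    new Uri("https://ps-docdb.documents.azure.com:443/"),   // placeholder endpoint
    "<auth key from the portal>");                          // placeholder key

// Databases are free; it's the collection that carries the pricing plan.
await client.CreateDatabaseIfNotExistsAsync(new Database { Id = "coursedb" });

await client.CreateDocumentCollectionIfNotExistsAsync(
    UriFactory.CreateDatabaseUri("coursedb"),
    new DocumentCollection { Id = "courses" },
    new RequestOptions { OfferThroughput = 400 });   // RUs reserved per collection
```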
And it is here where we get to the pricing information. Now DocumentDB is a high-level platform. It offers platform as a service capabilities, PaaS. So once again we don't think about how many cores or how much memory we are using. The default standard pricing tier here is estimated to cost 24 U.S. dollars a month. In DocumentDB you pay for how much data you store and how much throughput you reserve. Let's talk about storage first. I can see the price on the standard tier is a quarter of a U.S. dollar per gigabyte, which is quite cheap. With these standard tiers I can store up to 10 gigabytes of data in a single collection. If I need more storage capacity, I need to switch to a partitioned collection, and that will allow me to store up to 250 gigabytes of data. If I need more, I can always contact Microsoft, as this note says. But remember, this is just per collection, and I can have multiple collections. I'm going to use just a Single Partition for this demo. Partitioning will influence the design of your collection and the queries that you write. So again, I pay for the storage that I use and I also pay for the throughput that I reserve. Throughput is measured in RUs, where RU stands for request unit. The cost to fetch a one kilobyte document over HTTP is one RU. The standard plan comes with a minimum of 400 RUs. Since I pay $6 a month for every 100 RUs that I reserve, that's how we come out to a starting cost of 24 U.S. dollars: 400 divided by 100, times 6, is 24. Obviously you need to figure out how many RUs your application needs. But like we've seen in the past, there are plenty of monitoring tools and alerts that can help you set the right level, and you can always come into Azure and scale the number of RUs that you've reserved up or down. I also want to point out that Azure provides a pricing calculator. On this webpage you can upload a sample JSON document of the data you're going to store.
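The throughput arithmetic above is simple enough to sketch in a few lines of C#:

```csharp
// Reserved throughput pricing as described: $6 per month per 100 reserved RUs.
int reservedRUs = 400;                        // the standard tier minimum
decimal monthlyCost = reservedRUs / 100m * 6m;
Console.WriteLine(monthlyCost);               // 24
```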
In fact, you can upload multiple documents. And yes, DocumentDB uses JSON natively. I can then tell the calculator how many documents I plan on storing, how many create, read, update, and delete operations I expect per second, and this calculator will give me an estimated total of how many request units I'll be using per second and how much total data storage I'll be using. That will give me a good idea of how many request units to reserve and how much I'll be paying a month. For this example I'm going to stick with the standard plan, and I'll show you how to scale this up and down in just a bit. Now as I said, every collection belongs in a database. So I'm going to create a courses collection. But I don't have a database yet, so I will create a new database and let's call this coursedb. And once I select OK, in just a few moments, I will have a place to store my documents. And that happened very quickly. So now let's come under browse and go to the scale options. And for my courses collection, here's where I can specify how many RUs I want, and I'm going to scale down. I'm going to go down to 400. For this sample application with no users, this will be more than enough. Now be aware that I pay for what I reserve, not what I use. So if I reserve 1,000 RUs, I pay $60 a month for this collection even if I only use 500 RUs. So I always want to monitor and make sure I'm not overallocating resources. I also don't want to underallocate, because if I do exceed my RU setting, my DocumentDB will send back HTTP status code 429 errors, which is the code for too many requests. And yes, behind the scenes, everything in DocumentDB is done over HTTP. Now I have a database with a collection, and I wish I had the time to show you all the wonderful features of DocumentDB. It really is fantastic and fast and incredibly scalable. What I do have time to do is a quick demo that will show you how to get some data into this new collection and then retrieve data from the collection.
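As an aside, here is a hedged sketch of what reacting to those 429 responses can look like in application code with the DocumentDB SDK. The client, coursesLink, and course variables stand in for objects created later in this module; this is not code shown in the course.

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.Documents;          // DocumentClientException
using Microsoft.Azure.Documents.Client;

// When a request exceeds the reserved RUs, the SDK surfaces HTTP 429
// as a DocumentClientException; RetryAfter suggests how long to back off.
try
{
    await client.CreateDocumentAsync(coursesLink, course);
}
catch (DocumentClientException ex) when ((int?)ex.StatusCode == 429)
{
    await Task.Delay(ex.RetryAfter);
    // ...then retry the operation (or configure the SDK's built-in
    // retry options on the ConnectionPolicy instead).
}
```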

Using DocumentDB

In the application that we've been deploying to Azure, I've gone ahead and added a controller and a view to accelerate this demo. What I want the controller and view to do is show a list of courses that are in DocumentDB. Currently there are no courses, so I have a button here that will allow me to insert some sample courses. Now over in Visual Studio, we can see the exact type of document that we will be storing. We will be storing course documents. Every course contains an id, a Title, and a collection of Modules. Notice the id property is lowercase. I made this property lowercase so that I can easily retrieve the id that DocumentDB assigns to a course when I insert the document. Having a lowercase property goes against some of the conventions of C#. If I wanted an uppercase I on this property, I could easily do that, but if I wanted to retrieve the id from DocumentDB, I would have to add an attribute that says, for this property, use this name when you're serializing and deserializing a document. I'm just going to leave the lowercase property for now, but just know that you can change that. So every course has a collection of Modules, and every Module has a collection of Clips, and every Clip has some properties that are interesting and that we want to look at. And this is roughly how we would model courses for Pluralsight. You're currently watching a clip which is part of a cloud databases module, which is part of an Azure course. If you are a SQL developer, you're already thinking about the tables that you need to create and the foreign key relationships and the joins that you need to add in a query to retrieve all of the information about a course. With DocumentDB I can insert a course and all of its related data with a single insert.
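A sketch of that model might look like the code below. The transcript doesn't show the exact Clip properties, so those are placeholders; the lowercase id and the attribute-based alternative are as described above (the DocumentDB SDK serializes with Json.NET).

```csharp
using System.Collections.Generic;
using Newtonsoft.Json;

public class Course
{
    public string id { get; set; }     // lowercase so DocumentDB's assigned id round-trips

    // The conventional alternative: keep an uppercase property and map
    // the JSON name during serialization instead.
    // [JsonProperty(PropertyName = "id")]
    // public string Id { get; set; }

    public string Title { get; set; }
    public List<Module> Modules { get; set; }
}

public class Module
{
    public string Title { get; set; }
    public List<Clip> Clips { get; set; }
}

public class Clip
{
    // Placeholder properties; the demo's exact clip shape isn't shown.
    public string Title { get; set; }
    public string Duration { get; set; }
}
```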
It's all one document, and I have a really simple query that I can use to retrieve a course and all of its related information. The fetching and inserting of data will ultimately be controlled by this DocumentsController. Currently there are two actions here. The Index action will try to get a collection of all the courses in DocumentDB, which is fine because we won't have many. And yes, you can do paging with DocumentDB. We'll try to get a list of all the courses and display them in a view, the Index view. That's the model that we'll pass along. So we expect in this view to receive a sequence of courses, and if there are any courses in the model, I'm just going to loop through every course and display information about it. So for every course we'll also go into every module in that course, and into each clip that is in each module, and basically dump out information into some of the ugliest UI tables you've ever seen. But that's okay, I think; this course really isn't about design. Also in this view, at the bottom, if there are no courses, I'll display this text, which is what we're currently seeing, and a form with a button that I can click to insert some sample data in an HTTP POST operation. And back in the DocumentsController, clicking that button will come to the Insert action, where I will get some sample data for my courses that was back here in this Course file. If I scroll down to the bottom, I just have all the typing already done to insert some sample data. So a new course, with a couple of modules, and each module has three clips. And then back in the DocumentsController, once that insert is complete, I'll just redirect back to the Index action, which will hopefully show the list of sample data. Now the primary abstraction here, and where all the real work is being done, is in this CourseStore abstraction. This is my data store, or my repository. This is the class that I want to instantiate to interact with DocumentDB, and it's an abstraction that I created.
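Reconstructed from that description, the controller might look like the following sketch. The CourseStore method names and the sample-data helper are my assumptions, not code shown verbatim in the course.

```csharp
using System.Threading.Tasks;
using System.Web.Mvc;

public class DocumentsController : Controller
{
    private readonly CourseStore courseStore = new CourseStore();

    // GET: fetch every course in the collection and hand them to the view.
    public ActionResult Index()
    {
        var model = courseStore.GetAllCourses();
        return View(model);
    }

    // POST: insert the hard-coded sample courses, then show the list again.
    [HttpPost]
    public async Task<ActionResult> Insert()
    {
        var samples = SampleData.CreateCourses();   // hypothetical helper holding the typed-in data
        await courseStore.InsertCourses(samples);
        return RedirectToAction("Index");
    }
}
```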
Currently this class is pretty much empty. I just added the minimal amount of code so that I don't have any compiler errors and no runtime errors. What we need to do now is get this working with DocumentDB. The first step in a .NET project is to add a reference to the DocumentDB SDK for .NET. I can do that by installing a NuGet package. So let's browse for DocumentDB. The first hit here, Microsoft.Azure.DocumentDB, that is the package that I want, so I will install this package, accepting the license terms. That will add the assembly references that I need. And now that that is complete, let's come into the CourseStore class and do some work. So every operation that you perform with DocumentDB is going to happen through a class called the DocumentClient. I need to bring in the namespace for this, which I can do using Ctrl+period. The namespace is Microsoft.Azure.Documents.Client. That will add the using statement at the top of my file. And I'm going to declare this as a private instance field, so that I have access to a client throughout the rest of the code in this class. And in the constructor of the class I will need to create a new instance of the document client. So new DocumentClient. And now I have several constructors to pick from. The constructor that is showing here is the one I want. At a minimum I have to pass in a service endpoint and an auth key or resource token. Where do I get this information? Well, I can get this information directly from the Azure Portal. The service endpoint will be the URI that is here in the Overview tab. I'm just going to hover over that and click the copy button that appears. Then I can bring that URI back into my code. So I want to construct a new URI, and I'm going to paste in what I copied from the Azure Portal. Now I need some sort of token or access key. So back in the Azure Portal, let me look through the settings until I find Keys. That would be interesting.
And here I can see that I have Read-write Keys and Read-only Keys. I'm going to use a Read-write Key because my application needs to write into DocumentDB. There are two keys to select from. The idea is that you can rotate these through your applications. Periodically you'll want to regenerate a key and reset that key. I can do that by clicking the little refresh button here. That will invalidate the current key, so I'd have to take the new key that gets generated and plug it into all my applications. But let's just use this button to copy this key. And just be aware that this is a master admin key. If my application connects with this key, it can effectively do anything in DocumentDB. It can create new databases, new collections; it can also delete collections. But I do need some sort of authorization token that I can pass along, and for this demo I'm just going to use the primary Read-write Key. I can pass that in as a string to the DocumentClient. So let me surround that with double quotes, and we should now have a client that can connect to my DocumentDB. Let me get rid of the System bit here, because I don't really need to fully qualify that namespace; we will just be sure to bring in the System namespace. Now, DocumentDB is entirely based on HTTP. Some would call it a RESTful system. Every database, every collection, every document has a URI associated with it. So if I pass the proper type of HTTP message to a URI that is inside of this domain, I will be able to query documents and insert documents and create new collections and new databases. But in order to do that I need to formulate the correct URI. So what I want to do next is save off a URI that will give me access to the courses collection in this DocumentDB. And I will call this the coursesLink. And in the constructor I will initialize, or build, the link that I need by using a class provided by the DocumentDB SDK called UriFactory. What I want to do is create a document collection URI.
And I do this by passing in the name of my db, which, you might remember from the Azure Portal, the db that we created was called coursedb, and the collection inside of that was courses. So my database is coursedb. The second parameter that I pass is the collection id, and that will be courses. And again, I'm storing a single type in this collection, but there is no schema in DocumentDB, so I could also create a student class and store students in this same collection. So now that we have a client instantiated and now that we have a link, let's write a query that will return all courses. Every operation that happens with DocumentDB, again, goes through this client. So you will notice that there are plenty of methods on this client that will allow me to do things like create a database, or create a database if it doesn't exist. What I want to do is create a document query. So, create something that will send an HTTP message to bring back documents. And the generic type parameter that I can pass to this method will describe how I want these documents deserialized. So I want to write a query that will pull back documents that will instantiate course objects. When I create a document query, I have to provide the link of the collection we're going to. So I will provide the coursesLink. And at this point I have an IQueryable. So if you've used the Entity Framework or some other technology that uses IQueryable, you'll know at this point that I can use LINQ operators and say that I want to order these documents. So given a document d, I want to order by the title of that document. Or I could make this a c: given a course, I want to order by the title of the course. And I can apply filtering operators, and all of this is strongly typed and uses LINQ to build a query that will be issued to my DocumentDB. What I will do is just return the result of creating a document query and ordering the courses by title. Now I need to insert courses, also a very simple operation.
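Pulling the last few steps together, a minimal sketch of the CourseStore so far might look like this; the endpoint URI and key are placeholders for the values copied from the portal.

```csharp
using System;
using System.Linq;
using Microsoft.Azure.Documents.Client;

public class CourseStore
{
    private readonly DocumentClient client;
    private readonly Uri coursesLink;

    public CourseStore()
    {
        // Service endpoint and auth key come from the Overview and Keys blades.
        client = new DocumentClient(
            new Uri("https://ps-docdb.documents.azure.com:443/"),   // placeholder endpoint
            "<primary read-write key>");                            // placeholder key

        // A link to the courses collection inside the coursedb database.
        coursesLink = UriFactory.CreateDocumentCollectionUri("coursedb", "courses");
    }

    public IQueryable<Course> GetAllCourses()
    {
        // A strongly typed LINQ query against the collection, ordered by title.
        return client.CreateDocumentQuery<Course>(coursesLink)
                     .OrderBy(c => c.Title);
    }
}
```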
What I'll do is write a foreach statement that will go through each course in the courses that are passed in. And now what I want to do is, again, go up to the client, and I want to create a document. That is the first method in the IntelliSense here. This is an async method, so I will await the result of it and change my method here to be an async method. So, await the result of client.CreateDocumentAsync. Once again, I need to pass in the link. That is, what collection do I want to use to create this document? And then I can pass in any type of object that will be my document. So I want to pass in a course. The SDK will take care of serializing that course. JSON is the native language of DocumentDB, so that data will be sent out to the server as JSON, and DocumentDB should store that document. So, a very simple API. Let's do a build on this project. And yes, typically you would store things like this URI and this authorization key in a configuration file to make them easy to change, but again, I'm trying to get through the demo in a short amount of time. I'm just going to run this application locally, and we will come to /document/index and see if we can get this to work. So here's the list of courses. Currently there are no courses in DocumentDB. Let's try to insert the sample courses. And that was a quick operation; that redirected back to show the list of courses again. And now I can see that I have all that information in DocumentDB. Here are the ids that DocumentDB generated. And I can tell you that DocumentDB, it's amazingly fast, amazingly scalable, and hopefully now you'll see it's also easy to use.
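The insert just described could be sketched as the following method on the CourseStore; the method name is my reading of the earlier controller description, and client and coursesLink are the fields set up in the constructor.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// One CreateDocumentAsync call per course; the SDK serializes each
// course object to JSON before sending it to the collection.
public async Task InsertCourses(IEnumerable<Course> courses)
{
    foreach (var course in courses)
    {
        await client.CreateDocumentAsync(coursesLink, course);
    }
}
```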

DocumentDB Metrics

Now that we have a working DocumentDB sample, there are just a couple of things I want to show you in the Azure Portal. If I scroll down to the collections area, where we originally created a collection and a database, I can now go to the scale settings again and adjust my RUs. We looked at that before, but again, the question is, how do you know how many RUs you are using? Well, that's when you can come into, again, Metrics and Alert rules. So just like with App Services and SQL Server, I can set up alert rules to tell me when I might be in trouble and running out of headroom on the number of RUs that I have reserved. And if I scroll over in the browser here, I can see lots of charts showing me how my db is performing and how many successful calls there have been. Notice the HTTP 429 code here. That is the code DocumentDB will generate when you are exceeding the RUs that you have reserved. I can study the latency, the average consumed versus provisioned RUs. These are all here. And just like with App Services, I can create alert rules around the metrics that I care about. I also just want to show you that in the portal, there is a document explorer. There are also tools, by the way, that you can download to migrate data into DocumentDB and look at the collections. But if you want something quick and easy, this document explorer will allow you to come into a collection that you have defined, and I can create and upload documents. Here I can even click on documents that already exist in DocumentDB, and I can see here some of the data that I just inserted from my C# application. I can also come to a query explorer, and DocumentDB does support a variant of structured query language that will allow you to query the data that is in DocumentDB. So I could say something like select star from courses, aliased as c, and run that query. That will also show me information that I've put into DocumentDB. So the SQL language is very expressive, and you can look online to see more examples.
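For instance, the query just described, plus a slightly richer variation, might look like the following; the Title property name follows the Course model used in this module.

```sql
-- Every document in the collection, aliased as c
SELECT * FROM courses c

-- A projection over course documents, ordered by title
SELECT c.id, c.Title
FROM courses c
ORDER BY c.Title
```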
There's also a script explorer, because DocumentDB supports stored procedures, triggers, and user-defined functions, all of which are implemented in JavaScript. So these are all terms that you would be familiar with if you've used relational databases. This would be an example of a stored procedure that's doing a select star from some collection. So these are all pieces to explore if you're interested in DocumentDB. Unfortunately, we don't have the time, and we're going to have to move on to the next module, where we'll talk about storage in Azure.

Summary

In this module we looked at two cloud database platforms, Azure SQL and Azure DocumentDB. One is relational, one is a NoSQL database. Both these technologies allow me to persist and query data in Azure without worrying about infrastructure. I simply pay for the amount of power that I need, and Azure takes care of all the hardware behind the scenes. Both these platforms can replicate data around the world and scale up with the click of a button. And as we've just seen in the application that we are deploying, I can even use both platforms in my application if it makes sense, and have the best of both worlds.

Cloud Storage


Hi, this is Scott, and in this module we'll be talking about cloud storage. Now, we're already storing data in Azure using database platforms, but now we're going to talk about storage accounts. Storage accounts can hold hundreds of terabytes of data. In this module I'll show you how to use Azure storage as a place to keep uploaded files for our web application, and how we can allow users to download files from storage by creating and handing out what's known as a shared access signature.

Types of Storage

In Azure, creating a storage account allows us to work with four different storage services. First there is the Blob storage service. Blob is short for Binary Large Object. You can store any type of file in Blob storage, large files and small files. You can store binary files like movies and images, or text files like a C# code file or a PowerShell script. There is also Table storage. Table storage is another NoSQL database service, similar to DocumentDB, but instead of storing documents, Table storage is optimized for storing key-value pairs. Table storage is really cheap, and it's a good place to store large volumes of data. But since it stores key-value pairs, Table storage is nowhere near as flexible as a SQL database or a DocumentDB collection. There is also Queue storage. Queues allow you to store and retrieve messages. And they are an important technology to have if you want to build reliable, high-scale websites, because Queues allow you to put a buffer between your application and the back end services that it uses. And this buffer not only allows asynchronous communication, which can be more efficient, it can also keep the amount of pressure on your services constant, so that parts of the system are not overwhelmed. There is also File storage, which allows you to set up SMB file shares in the cloud. This service is typically the most helpful if you have an existing application that you'd like to port to Azure and the application already uses file shares. Otherwise, there are generally better solutions. And one more type of service that is coming soon, and probably available by the time you watch this video, will be Azure Disk storage, which is storage optimized for virtual machine disks. In this module we're going to focus on Blob storage. Blob storage and Queue storage are probably the two storage services of biggest interest to a web developer. We'll focus on Blob storage here and circle back to Queue storage in a second course later this year.

Accounts, Containers, and Blobs

To use Blob storage we will first create a new storage account in Azure. As part of that setup we will provide a name, and because Blob storage is addressable over HTTP, the name will become the first part of the domain that we use to get to the storage. So, for example, an account named mystore would be reachable at mystore.blob.core.windows.net. Each account that we create gives us access to 500 TB of storage currently, and we only pay for the storage that we use. But before we can create our first Blob, we will need to create a container. A container is like a folder for our Blobs. Each container can have its own name and settings and permissions, and then the Blobs we create will live in one of the containers that we've created. In Azure there's only one level of containers, so I cannot have nested containers to simulate nested file folders, although I can use a slash in a Blob name to make it appear as if the Blob is in a subfolder. Let's try to get things set up in the portal.

Creating a Storage Account

Inside the portal, let's go to the storage accounts entry in the hub, and what might be surprising is that I already have two storage accounts. These accounts were created back in the first module of this course when we set up our virtual machine. A virtual machine requires storage, some place to put the files that represent the disks that are attached to the virtual machine. And if I open up one of these storage accounts, keeping in mind it's a special kind of storage account that is really just going to work with Blob storage, I can see there is a container in my storage account. It's called vhd, and if I click to open that container, I can see the Blobs that are inside, in this case just one Blob. It's a VHD file. It is the file that represents the C drive on that virtual machine. There are a few different Blob types that you can create in Azure. This particular Blob is what is called a page Blob. A page Blob is optimized for random access, so because this file is basically a C drive, a page Blob is optimized for reads and writes into the middle of the file. We'll use a different Blob type later, one that is more optimized for streaming, that is, you start at the beginning and you read until the end. If I click on this particular Blob, I can see properties like the URL that is used to access the Blob. But just because this Blob has a URL doesn't mean just anyone can access it. In fact, if I go into my browser and try to get to that Blob, I will get an authentication failed message. I will need a specific access token in order to get to that Blob. Well, back in the portal, let's close everything out and come back into storage accounts and create a brand new one for this demonstration. I want to create a storage account, and I will first need to give that storage account a name. Let's see if we can use pshorsestore, which seems good. You always want to use the Resource Manager deployment model unless you have a really good reason not to. The account kind here is interesting.
Do I want this to be a general purpose storage account, which gives me access to Table storage and Queue storage and all those other types of storage we talked about? Or do I want this to be dedicated to Blob storage? The one advantage to the Blob storage account kind is that I can pick an access tier of cool or hot. If I have Blobs that I want to store but they're not accessed very frequently, I can set this up with the cool access tier. I pay a little bit less to store the data, but I'll pay a bit more when the data is accessed. If I have data that is frequently accessed, I might set this up with the hot access tier. And if you search for the Azure pricing calculator, you can plug in these different settings and the amount of data that you're going to store, and the calculator will give you an estimate of what those choices will cost you on a monthly basis. I'm going to go with an account kind of general purpose, just so we can see the other options floating around. My performance setting can be either standard or premium. That's really the difference between using SSDs (premium storage) or spinning hard drives (standard storage). I'll stick with standard. It's really applications like virtual machines that require premium storage. Replication I'm going to leave as locally redundant storage. Azure will store three copies of my data inside the same data center in case there's a hardware failure. If I were worried about natural catastrophes, I could make this geo-redundant. Here again is a place where I can automatically turn on encryption, so that my data is encrypted at rest when it's in storage, and my application doesn't have to do any additional work to encrypt the data when creating a Blob or to decrypt the data when reading a Blob. As usual, I have to select a subscription, I have to select a resource group (let's go with es_espnetnbc), and I have to select a location.
With everything filled out, I will click create and in a few moments I will have a new storage account available.

Creating a Container

Now that my storage account is set up, I'm going to click to open that storage account. As usual, I will come to an overview blade. This will give me some of the basic settings and allow me to access Blob storage, File storage, Table storage, and Queue storage. There's also, of course, a chart that will show me how this resource is being utilized. Of course we have no data, because we have no items in storage yet. So let me click on Blob storage, and it is here where I can see a list of my containers. There are no containers yet, so let's create a container. I want to store some images in Blob storage, so I'll call my container images. I'm going to leave the access type as private. If I wanted to allow anonymous access to individual Blobs, I could set the access type to Blob, or if I wanted to give anonymous users the ability to see what is in the container, I could set the access type to Container, but I'm going to set it as private. Now only myself, the account owner, and those applications that have the access keys will be able to view what is in this container. Let me create that container. That will bring me back to the Blob service blade, where I can click on the container. I can see the URL for the container already, and it's here that I can upload Blobs if I wanted to. When I upload a Blob, which I could do just by selecting a file from my file system, there are two options here: the block size and the Blob type. There are three Blob types. You have to select one of these when you create a Blob. A block Blob is just like the files on your file system, so image files, XML files, movie files, you typically want to put them into Blob storage as block Blobs. They're optimized for uploading, they're optimized for streaming, they're optimized for starting to read at the beginning and going all the way to the end. Page Blobs, that's the type of Blob that was used to store the VHD file for the virtual machine.
Those are optimized for reading and writing right in the middle of the Blob. And then there's the append Blob, a special type of block Blob which is optimized for updating the Blob by always writing to the end. So that makes it good for things like log files, where my changes are always just appending new information. For most Blobs, you typically just want to select the block Blob type. And at that point I could go in and select a file from my file system to upload, but that would be a little bit too easy. So instead, let's go over to our application and create a form where a user can upload a file into our application, and our application will save that file into Blob storage.

Uploading a Blob from C#

Inside the application I've added a controller and a couple of views, and my goal is to allow the user to come in, select an image from the local file system, and upload that image into my Blob storage. Once I have the image in Blob storage I want to show that image back to the user, and I want to serve that image directly from Blob storage. Not from my website, but from Blob storage. Let's take a look at the code. The first bit will be the images controller, and since we are allowing users to upload arbitrary files, we might consider adding the Authorize attribute so that only registered users can do this, but there are a number of things that I'm going to do to simplify this demo. This Index action simply returns an Index view, which is the view we were just looking at. It contains a form, and that form has an input of type file that will allow the user to select a file. The name of this input is image, and when the user clicks on the submit button, this form will post to the Upload action. So back in the controller, the Upload action receives a parameter of type HttpPostedFileBase. This is the class that I want to use when I have file uploads to my MVC application; model binding will automatically populate it from the posted file. All I need to do is name this parameter the same as my input. So the input name is image, my parameter name is image, therefore this parameter should receive the file. And HttpPostedFileBase includes a property InputStream, which is really nice because I'll be able to stream that file directly into Blob storage. That's what I hope to do. I don't have to save it on the local file system in this application; I just want to stream it directly into Blob storage, and I'm going to do that, by the way, with a class called ImageStore. Just like in the DocumentDB demo, this ImageStore class is not implemented yet. I will need to obtain an SDK to make calls into Azure to make sure the file is saved and then later retrieved.
So if I do save the image successfully, we will redirect to the Show action and pass along an ID that was generated for the image, and inside of Show I just want to build a full URI for that image and serve it directly from cloud storage. And so the Show view that uses this Show model, which just contains the URI, is very simple: just an image tag whose source is equal to Model.Uri. So let's see if we can get this to work. We will need to implement these methods, SaveImage and UriFor, in the ImageStore class. Just like with DocumentDB, and in fact many programmable bits of Azure, there will be NuGet packages that I can install to obtain an SDK, so this time I want to search for Azure storage. The top hit in the search results is the NuGet package that I want, so let me install the latest stable version. I'll need to accept a license agreement any moment here. There it is. Accept this, and we should now have that NuGet package available. Now what I can do is come into my ImageStore and start working with the storage SDK. That SDK contains an API for all the storage services: Blob storage, File storage, Queue storage, Table storage. But I want to work with Blobs, so what I want to do is instantiate a CloudBlobClient, and I'll need that client for several operations. So I'll just create a private instance field here of type CloudBlobClient. I do need to bring in the namespace Microsoft.WindowsAzure.Storage.Blob, and I will instantiate that client in the constructor. So client equals a new instance of CloudBlobClient, and there are a few different constructors here. I'm going to call the constructor that takes a URI and some storage credentials. Credentials, because I'm about to write into Blob storage. I can set up Blob storage to allow anonymous reads, but if I'm going to write into storage or do anything else, I'm going to need these credentials. Both of these pieces of information are available from the Azure portal.
So flipping back into the portal, if I am looking at my storage account here, pscorestore, let's go into the Blobs section again, and on this blade that lists all of my containers there's an essentials drop down. I can click to open that up, and right here is my Blob service endpoint. This is the root URL that I would use to get to everything that is in my Blob storage. I will click to copy that. You'll notice that the URL for this container is simply that service endpoint with /images at the end, so the container name, and then a Blob that is in the images container would simply be /images/ and then the Blob name. So keep that in mind for later. Back in my code, this is typically the type of thing that I would want to include in the configuration file, but I'm just going to hard code it into the application to simplify things. And in fact, I'm going to save that base URL here as a private field also, just because I will need it for a couple of different operations. So a new instance of the Uri class, paste in my service endpoint, and that becomes the first parameter to the CloudBlobClient. Now I need to create new storage credentials, and StorageCredentials is a class that is in the namespace Microsoft.WindowsAzure.Storage.Auth, so let me add a using statement for that using Ctrl+period. Now there will be two parameters required here. The overload I want is this one: I pass in an account name and a key value. Again, both of these pieces of information are available in the Azure portal. For the account name, I literally want the name of my storage account, and that is pscorestore, so I'm going to copy that value, and that becomes the first parameter to this constructor. And now I need a key value. So back in the Azure portal, let me close the Blob service blade, come back to my storage account blade, and under settings there will be my access keys. So this is very similar to DocumentDB.
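To make that URL structure concrete, here's a small sketch. The pscorestore account name is the one from the demo, and family-photo.jpg is a hypothetical Blob name; any storage account follows the same pattern of service endpoint, then container, then Blob name:

```csharp
using System;

// Root Blob service endpoint for the demo's storage account.
var serviceEndpoint = new Uri("https://pscorestore.blob.core.windows.net/");

// Container URL: the endpoint plus the container name.
var containerUri = new Uri(serviceEndpoint, "images");

// Blob URL: the endpoint, the container name, a slash, and the Blob name.
var blobUri = new Uri(serviceEndpoint, "images/family-photo.jpg");

Console.WriteLine(containerUri); // https://pscorestore.blob.core.windows.net/images
Console.WriteLine(blobUri);      // https://pscorestore.blob.core.windows.net/images/family-photo.jpg
```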
I have two master keys available. You need to be careful with the keys and not hand them out to just anybody, because whoever has one of these active keys will be able to do whatever they want inside of my storage account. Even the help text here recommends you store these in Azure Key Vault, which we won't have time to look at in this course, but that is something to investigate for a real application. I'm simply going to copy one of these keys, come back to the application, and paste that in as the second parameter to StorageCredentials, and I should now have a CloudBlobClient that I can use. The first thing we'll try to do is save an image. So instead of returning basically an empty result here, what I need to do is take the stream and save the content into Blob storage. In order to be able to do that, I need to get a reference to the container that will contain the Blob that I create from the stream. So first of all, I will need a container reference. I can do that just by walking up to the client and asking to get a container reference for the container in this Blob storage account named images. And once I have a container reference, I can use that container reference to do all sorts of things, including getting a Blob reference. You can see there are a number of methods and properties available here, but what I want to do is get a block Blob reference. So we want to store this as a block Blob, and I need to pass in a Blob name. Now if that Blob already exists, this API will allow me to overwrite or delete that Blob. And if this Blob doesn't exist, I can create the Blob, but I do need a Blob name. Now I could look at the local file name on the HttpPostedFileBase, but what I want to do is actually just generate a globally unique identifier for every Blob that I store. So the ID for my Blob will be generated by using Guid.NewGuid, and I'll just go ahead and convert that into a string so that I can say here's the ID for my Blob.
And now once I have that Blob reference, it's a very easy operation to say dear Blob, please UploadFromStreamAsync, and you'll notice all sorts of methods again available here. But what I want to do is upload, and I want to upload from the posted file's stream. That's an async operation, so let's await the result, and I will make this an async method. Once that completes without an exception, I can be reasonably sure that I actually saved something in Blob storage, so I will return the ID to the images controller, which will then pass that along to the Show action. The Show action wants me to construct a full URL to that Blob in Blob storage so I can display it using a simple image tag. So over in the ImageStore, let's do the simplest possible thing that might work, even though it won't, and I'll explain why. Let's just go ahead and return a new instance of the Uri class. The Uri class allows me to pass in several parameters, and it can concatenate things together to build a URI. I'll pass in the base URI for the service and then use some C# string interpolation to say then go to the container name, which is images, and then go to the image ID. Let me use Shift+Ctrl+B to build this application, and we'll come out and try this locally. I can still access Blob storage by running this application locally, so I'll skip the deployment for right now. Let's refresh this page and try it out. I'm going to browse for an image, I'm going to select go, and we come over to the Show action pretty quickly. I can see an ID is in the expected place in the URL, so I'm pretty confident we actually saved something in Blob storage, but the image isn't showing. Let's inspect that element, and we can see that it seems to have generated a reasonable URL, but I am getting a broken image. Let's go into our Blob storage and just take a quick look at it. If I come into my storage account and then come into my Blob storage, I should see a list of my containers.
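Pulled together, here's a hedged sketch of the ImageStore as built so far. It assumes the WindowsAzure.Storage NuGet package, and the endpoint, account name, and key shown are placeholders you'd replace with values from your own portal (a real app would read them from configuration, not hard code them):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

public class ImageStore
{
    private readonly CloudBlobClient client;
    private readonly Uri baseUri = new Uri("https://pscorestore.blob.core.windows.net/");

    public ImageStore()
    {
        // Account name and access key come from the portal; hard-coded
        // here only to keep the demo simple.
        var credentials = new StorageCredentials("pscorestore", "<access key from the portal>");
        client = new CloudBlobClient(baseUri, credentials);
    }

    public async Task<string> SaveImage(Stream stream)
    {
        var container = client.GetContainerReference("images");

        // Generate a globally unique name for the new block Blob.
        var imageId = Guid.NewGuid().ToString();
        var blob = container.GetBlockBlobReference(imageId);

        // Stream the upload straight into Blob storage; no local file needed.
        await blob.UploadFromStreamAsync(stream);
        return imageId;
    }

    public Uri UriFor(string imageId)
    {
        // The "simplest possible thing": works only if the container allows
        // anonymous reads. A private container needs a shared access signature.
        return new Uri(baseUri, $"images/{imageId}");
    }
}
```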
I'm going to click on the images container when it appears, and from here I should get a listing of Blobs. There's my block Blob. I could look at the properties for it, and I should be able to compare this URL to the URL that I'm seeing in the browser, and they would be equivalent. So I know that image is out there, but remember, just having the URL to a Blob in Blob storage doesn't give you access to that Blob. What I'm going to need to do is pass along some additional information so that this request for the image will actually succeed. Let's look at how to do that in the next clip.

Generating a Shared Access Signature

When you want to grant access to what's in your storage to another person or application, you can create what is known as a shared access signature, a SAS. A SAS is like a token or access key. It gives the other entity limited permissions to a Blob or an entire container. One way to create a SAS is to use the Azure portal. The UI here shows that you can grant access to your entire Blob service, or just a container, or a specific object or Blob inside the container. Then you can select permissions like read, write, delete, and so on. In the past I've built applications where I found it useful to create a SAS with write-only permissions, because then I can give some customer the ability to upload into my storage, but they won't be able to read or see anything. You can specify start and stop dates for the token, you can specify allowed IP addresses and allowed protocols, and the good news is the SAS contains a signature that is created using one of your master keys. So if anyone tampers with the token and tries to do something like extend the expiration, which is visible, Azure will reject that request, because the values in the token don't match the signature. We can also create a SAS in C# code right in our application, and let's do this so that we can have a unique SAS each time a user uploads a file. In the application, in order for this user to be able to see this image, we're going to have to create a shared access signature and include that in the image request for Azure to serve up this image. It turns out this is relatively easy, so let's come back into Visual Studio and go to the ImageStore, and here where we compute the URI we will include a shared access signature on that URI. First I want to generate this signature for the Blob that this user has uploaded.
Which means I will need to get a reference to that Blob, which means I first need a reference to the container, so client.GetContainerReference for images. I can already see that I have some duplicated code here and here; I'll leave that as an exercise for the viewer to refactor. This line of code also will be duplicated: I want to get a block Blob reference to the image specified by the image ID. And once I have the Blob, I can apply the signature. So let's build a shared access signature policy with a new instance of the class called SharedAccessBlobPolicy, and I'm going to set some properties on this. For the permissions that I want to allow, I want SharedAccessBlobPermissions, and this is a flags enum, so I could OR this with write permissions, list permissions, and so forth, but I only want to provide read permissions. I'll also specify a shared access start time, and I will do that using the current time, but you have to be careful with this because of clock skew, so I will back this up by 15 minutes so that it is good a little bit into the past. Then I can also set an expiration time; let's give the user 30 minutes into the future, so AddMinutes 30, and that's really all I care about. I only want to grant read access for about a 30 minute window. What I do is use this policy and combine it with a specific Blob to generate a token, so my shared access signature token is what happens when I say dear Blob, please get a shared access signature using this policy that I defined earlier. And it turns out that this signature is really nothing more than a query string, so all I need to do is take that token and poke it into the string here. At this point I will build the application, and we will try this upload one more time.
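Here's a hedged sketch of the revised UriFor method with the SAS applied. It's a method on the ImageStore class, so it assumes the client and baseUri fields set up earlier in the clip, and the types come from the WindowsAzure.Storage package:

```csharp
public Uri UriFor(string imageId)
{
    var container = client.GetContainerReference("images");
    var blob = container.GetBlockBlobReference(imageId);

    var policy = new SharedAccessBlobPolicy
    {
        // Read only; the flags enum would let us OR in Write, List, and so on.
        Permissions = SharedAccessBlobPermissions.Read,
        // Back-dated 15 minutes to allow for clock skew between machines.
        SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-15),
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(30)
    };

    // The SAS is just a signed query string, so append it to the Blob URL.
    var sasToken = blob.GetSharedAccessSignature(policy);
    return new Uri(baseUri, $"images/{imageId}{sasToken}");
}
```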
So back in the browser, let's go back to images/index, or just /images. From here I'll try to upload that same image. I look very stunned in this image, I'm not sure why, but I definitely look younger than I really am, even though I took the picture a few months ago. And there I can see the image. It's quite a large image, but let's inspect this element and see, as I hide myself, that this time this worked, because appended to the end of the URL is a query string that includes a signature as well as a start time and a stop time. And again, if I tried to modify one of those values, Azure would reject the request, because it would verify the values that are here against the signature that's provided. So we've only scratched the surface of the different things that you can achieve using Blob storage, but hopefully this will give you an idea of some of the types of applications and features that you can build using Azure storage.


In this module we looked at how to create a storage account and then work with Blob storage. We created a container and added a feature in a web application that would upload images into that container. We also generated shared access signatures to give the user some limited access to see the Blobs that we created. In the next module we will again do some work with Blob storage as we look at another approach to hosting code in Azure with Azure functions.



Hi, this is Scott, and in this module, we're going to look at Azure Functions. This platform as a service offering allows us to upload small bits of code to define a service, a service that can react to HTTP calls like a traditional web API. But we can also define triggers for a function so the code will run when a new blob appears in storage or a timer expires. Functions are an inexpensive way to host scalable logic in the cloud, and by the end of this module, you'll be able to implement and upload your own functions.

Serverless Computing

Microsoft designed Azure Functions to give us a serverless computing option. Perhaps you've heard of that term; it's a trendy term in software circles at the time of this recording. Serverless computing is about using platform as a service technologies that provide a high level of abstraction over the underlying hardware and infrastructure. In an app service, I still have to worry about the number of cores and the amount of memory I have on a VM, as well as scaling the number of instances of that VM. With Azure Functions, I don't have any of those worries. Azure Functions also allow us to implement nanoservices, which are even smaller than microservices, also a trendy topic. Nanoservices are small, autonomous web services that can implement pieces of our application logic. And by keeping those pieces autonomous, we can update just the piece of application logic that needs to change. Serverless computing and nanoservices, this is what Azure Functions is all about. They provide this ability by allowing us to upload a single function with some configuration so Azure knows when to call that function. The function could be implemented in C#, in F#, Node, Ruby, PHP. Even PowerShell and Bash scripts are now supported. The neat thing about Azure Functions is that we don't have to manage the scale or pay just for having these functions in Azure. If no one is using our functions, we have zero cost. If the next day we have a million requests, Azure will make sure the hardware is ready behind the scenes to support those requests. Azure Functions can respond to HTTP requests. They can also execute at intervals when a timer expires, or when a new item appears in blob or queue storage. They can respond to webhooks, which are HTTP calls that some services like GitHub offer. GitHub can fire off an HTTP call when there is a new check-in or a new pull request, for example. Let's go into the portal and start to explore Azure Functions.

Creating a Function App

Inside the Azure portal, I want to create what Azure calls a function app. A function app currently does not appear as a top level item here in my hub, so I'm going to go to the new button, and then I will search for function or function app. The first result that should appear will be function app from Microsoft. That is what I want, so I'll click that selection. This will bring up a quick overview of function app, but I know this is what I want, so I will click create. And that will bring up the blade that we have seen many times in this course, the blade where I enter the configuration of my new Azure resource. So first, an application name. This has to be a globally unique name, so let's try ps-functions, and that appears to be okay. This name will form part of the address I use if I want to invoke one of my functions over the internet. I do want this to be in my current subscription. I'm going to create a new resource group for this function app, and I will also call that resource group ps-functions. Then I come to the hosting plan. The hosting plan is interesting because I can choose from a consumption plan or an app service plan. Let me talk about the app service plan first. One way to think about a function app is to think of it as a special type of app service. We've already created an app service. We've deployed a web application into that app service, and we know that every app service requires an app service plan. The plan describes the type of hardware that we're going to operate on. So because a function app is a special type of app service, I can pick my own app service plan. I can tell Azure about the number of CPUs and the amount of memory that I want on the virtual machine for my function app. In fact, I can pick an existing app service plan, and then when I deploy my function app, it will be right there alongside my other web applications and app services.
If I go to the file system using a tool like Kudu, which we looked at earlier in the course, I would be able to look at the file system and see my web application and also my function app. And inside of the function app, there will be directories, one for each function. And each function contains a special configuration file, which I will show you later, and that configuration file is what tells Azure, "Hey, this is a function app, and here's how this app should behave. It should respond to an HTTP request, or it should trigger when a timer expires." Now the downside to selecting an app service plan is that I will always be paying for that app service plan, even if my functions are not being used. And if my functions are being used, it's my responsibility to monitor that plan and either scale the plan up to move to a bigger virtual machine, or scale it down, or scale it out by adding more instances, or scale it in. We saw how to do all that earlier in the course. But one of the appealing aspects of a function app is that I can choose a consumption plan. A consumption plan is still really like an app service plan in the background, but now I don't worry about the type of virtual machine. I don't worry about scale up and scale down, or scale out and scale in. I simply want Azure to manage my functions. And if they're under load, add more hardware, add more resources. If they're not being used, scale everything back. And with a consumption plan, I only pay for when my functions are actively executing. And the obvious question there is how much do I pay. Well, I've already done a search for Azure Function app pricing that brings me to this page on Functions pricing. And if I scroll down a little bit, I can see the two metrics that Azure uses to compute my bill at the end of the month for a function app. One metric is the total number of executions.
I can see that I pay one fifth of a US dollar per million executions, and I have one million free executions every month. The other metric that is used is the execution time of my functions. This is measured in gigabyte seconds. So if I have a function that executes for five seconds, and the average memory consumption over those five seconds is half a gigabyte of memory, that would be five times one half, or 2.5 gigabyte seconds. So I'm basically paying for how long my function executes and how much memory it uses. Currently, that price is 1.6 hundred-thousandths of a US dollar per gigabyte second, and I have 400,000 gigabyte seconds free per month, so function apps on a consumption plan can be relatively inexpensive. I pick a location where this virtual hardware will live, South Central US. That's where my application will be hosted. And every function app needs an associated storage account. If I don't explicitly create the storage account, one will be created for me with a name like function followed by some random digits. I'd like to have a more reasonable name, so I'm going to click on that entry and say that I want to create a new one. And this is a special type of storage account, so I don't have to fill out a lot of the configuration that I filled out for the storage account we created previously in the course. I just need to give the storage account a name, so let me call it psfunctionsstorage. And when I select okay, that configuration will be added here. Now I'll try to create the function app. We'll go through some validation. Looks like the validation was successful. So in a few moments, I will have a new function app, and we'll start to write some code.
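As a sanity check on that math, here's a quick sketch of the two consumption-plan metrics using the rates quoted above. The rates and free grants are the figures shown at recording time and will change, and the example month's usage numbers are made up:

```csharp
using System;

// $0.20 per million executions, after the first million free each month.
double ExecutionCost(long executions) =>
    Math.Max(0, executions - 1_000_000) / 1_000_000.0 * 0.20;

// $0.000016 per gigabyte-second, after the first 400,000 free each month.
double GbSecondCost(double gbSeconds) =>
    Math.Max(0, gbSeconds - 400_000) * 0.000016;

// A function that runs for 5 seconds averaging 0.5 GB of memory
// consumes 5 * 0.5 = 2.5 gigabyte-seconds per execution.
double perRun = 5 * 0.5;

// Hypothetical month: 3 million executions totaling 500,000 GB-seconds.
double bill = ExecutionCost(3_000_000) + GbSecondCost(500_000);
Console.WriteLine($"{bill:F2}"); // 2.00
```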

Creating a Function

Inside the portal, my function app should now be provisioned. I want to create a function and start executing something, so let me go to my list of app services, because a function app really is just a special type of app service. And here is the ps-functions app that we created. Let me click on that. Now this experience is going to be much different from the experience when you click on other app services that host traditional web applications. And that's because the Azure portal makes it easy for me to create and edit functions right here in the portal. Even if my functions are implemented in C# code, I don't have to do a build and then deploy those functions. I can just edit them right here in the portal, and they will be compiled dynamically. Now if I want to, I can download some tooling for Visual Studio. Right now, with Visual Studio 2015, I can download a preview of the function app templates that will allow me to create function apps and add functions to an application in Visual Studio, run that function locally and debug it, and then publish into Azure. But for this module, I'm going to stick with the portal. But just so you know, you can use Visual Studio Code. You can use Visual Studio. You can use other editors. You can deploy using any publishing tool. You can deploy using source control. Or, as we saw with app services previously, I could deploy by synchronizing with a file sharing service like Dropbox or OneDrive. All those options are still out there. What I really want to focus on is just the function app environment, and to do that, I'll just be using the portal. The first function that we create will be one of these pre-made functions that are available from the portal. I could go to this new function button, but we'll do that later in the module when we create a more non-trivial function. For right now, I want either a timer, or data processing, or a webhook plus API function.
A webhook plus API function would be addressable over HTTP; a data processing function would take its input from something like blob storage. We're going to go with timer, so a function that executes at some interval. Maybe that interval is once a day. Maybe you just need a little bit of code that executes once a day to clean old records from your database. Or maybe it needs to go to blob storage and remove all the blobs that are older than 30 days or one day. Those are the types of tasks that you can perform with Functions. So I want a timer function, I want it based on the C# language, and let me go ahead and create this function. What will open up here in the list of my functions will be a new entry. The generated name was TimerTriggerCSharp1, not the best name in the world, but we'll live with that for now. I'm going to dismiss this tour dialog, and we can see that a C# function inside of a function app literally just is a function, public static void, and the function name is Run. When I'm on this develop tab, I have the editor open and I can just write C# code inside of this function. And if my function gets too big, I can add some additional files, or I can break that one function into multiple functions, but Run is my entry point. That's what's invoked by Azure. And because this is a timer-based function, the first parameter to this function will be the timer that just expired. And any function that you write in C# can take a TraceWriter. Anything that I log using that TraceWriter will appear in the log files for my function. In fact, I should be able to open up those log files right now and see that the C# timer trigger function executed at such and such a time. So that was my function actually running just a little bit ago.
We'll see just how that timer is scheduled in just a bit, but I want to show you that if I, let's say, add an additional log statement, maybe we just print out the text hello or something simple like that, there is no need for me to do a build. I can just click the save button, this function will be dynamically recompiled, and I'll see the compilation results in my log file too. So let me put in a syntax error, like just put in some extra z's, and click save. I can see there's a message that the script changed and is being reloaded, and now I have your typical C# compiler errors: the type or namespace name 'zzzz' could not be found. So let me fix that error and click save, and I should have a compilation succeeded message almost instantly. So those are the logs that are showing me my function is working and actually has executed just once. Let's go to view files for a moment. I want to show you that there are two important files for this one function, TimerTriggerCSharp1. One file is run.csx. That's my run file that contains the code that I want to execute. Note it's not a .cs extension, it's a .csx extension. It's like C# scripting, and there's some special syntax available to me inside of a .csx file, which I'll show you later in this module, that I would not be able to use in a typical .cs file, special syntax to do things like reference other assemblies. So the .csx file contains my source code, and then there is function.json. This is the metadata that describes to Azure information about my function and how it should behave, what are its inputs, what are its outputs. Currently, this function is not disabled, so it will be able to execute. And every function has some bindings. The bindings for this function will be a timer trigger, and the name here is saying that when this trigger fires, the timer will be passed as a parameter to your Run function called myTimer.
And if I go back to run.csx, this parameter is called myTimer, and the binding in function.json is called myTimer. So that's how these things match up. For other types of functions, functions that rely on blob storage as input, or a message in a queue as input, those will have different types, but you can always name these things so that the blob or the queue message is actually passed into your function. And the whole idea here is to make the functions easier to write, to remove as much configuration and as much work from that function as possible. So if I need to read something from blob storage, I want to describe to Azure how to go out to blob storage, fetch an item, and pass it into my function, so I don't have to write that code in my function. And we'll see an example of that later in this module when we work with blob storage. But these are the two important files, run.csx and function.json. Now we want to come back to run.csx and explore some of the options that are available over here on the left-hand side.
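For reference, the function.json for a timer function like this one looks roughly as follows. This is a sketch of the general shape, not the exact generated file, which may differ slightly by runtime version; note how the binding's name matches the myTimer parameter of Run:

```json
{
  "disabled": false,
  "bindings": [
    {
      "type": "timerTrigger",
      "direction": "in",
      "name": "myTimer",
      "schedule": "0 */5 * * * *"
    }
  ]
}
```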

Function Triggers and Configuration

Inside the portal, when I'm on the develop tab, I see the source code to my function. When I go to the integrate tab, here's where I can define the triggers, the inputs, and the outputs, essentially, the bindings. Everything that I do here will be reflected in that function.json file that we just looked at. Remember that function.json file defined a binding for a timer called myTimer. Let's pretend we didn't have that. I would now either edit function.json by hand or do this graphically. Graphically, if I want this function to execute, let's say, once a day, then I would want a trigger that is based on a timer. So I would click on new trigger, and you can see I can have triggers based on HTTP calls, so anytime a call arrives, or based on Azure Service Bus, so anytime there's a message in a particular queue, or blob storage, which would be anytime a new blob appears in the container that I specify. But I want a timer-based function, so I will select timer. And now in this editor, I can specify the parameter name that will be passed to my Run function. It is called myTimer in the function, and I can define a schedule. The schedule syntax looks a little bit odd perhaps at first, but if you've ever done any work with the cron utility on a Unix system, then you are probably familiar with the syntax. For everyone else, I'm going to come to the Wikipedia entry for cron, which again is just a job scheduler/task scheduler on Unix and Linux systems. And if I scroll down a bit, you will see the description of the syntax, which lets me control when my executable or my script executes. I want it to run every night, or every five seconds. I do that by specifying the minutes, the hours, the day of month, the month, and the day of week in the syntax.
Now the easiest way I've found to work with this syntax is actually to come to a website that has things set up so that I can plug in values, and it will describe to me in English what my schedule is saying. Now the one thing to keep in mind here is that on this site, and everything that you find out about cron, there are five entries in the typical cron line. But in Azure, there are going to be six entries. So you'll notice there's one, two, three, four, five, six asterisks here, and that's because in Azure there is also an entry for seconds, which comes right before the minutes entry. Other than that, everything is exactly the same. Let's just do some experimentation here. Let's say that I want something to run at 3 a.m. every morning. I would put in zero for minute and three for hour, and that will basically say I want to execute at 3 a.m. every morning, on every day of every month. What if I just wanted that to happen, let's say, on Sundays? Well, then I can come in here for the day of the week and enter zero; the first day of the week, Sunday, is zero. I can also use an abbreviation, SUN, to make that more apparent. So if I were to enter this type of schedule for my Azure Function, it would execute once a week, at 3 a.m. every Sunday. I can also describe ranges. So what if I only wanted this to happen from June through September? I can enter six to nine. And let's say I really don't care at what hour this executes, but I do want it to execute every five minutes. Well, then the slash allows you to specify a step value. So this is saying every five minutes, but that's only going to happen on Sundays in June through September. So let's get rid of Sunday. Now it will happen every day from June through September. Let's turn this back to star and take all values. So now I would have something that runs every five minutes. So back in Azure, let's say I want this function to execute, just temporarily, every one second.
Remember, the first entry here is for seconds. I'll put in */1 and click save. Let's come back to develop and look at our log file. What I should see happening in the streaming logs is that the function will just keep executing. So function completed, function completed. I can see it's executing now every second. That's a little too frequent, so let me come back to integrate and adjust the seconds, minutes, and hours entries. Let me just say every 12 hours. And now, I would have a function that executes every 12 hours. I could use logic and code inside of here to go out and clean up databases, clean up blob storage, send an email, whatever I wanted to do. Well, let's also talk about the manage option here. The biggest thing about managing is that I can disable my function if I want to. I can delete this function. I can also see the function keys that someone would need to invoke my function from the outside. So if I'm building an HTTP API using a function app, this would be like my API key. And finally, under monitor, I'll be able to see things like the recent success count and the recent error count. I'd be able to view my logs, look at every invocation of this function, click on an invocation, and see what happened. So there are still plenty of monitoring and diagnostic tools for a function app. Now let's talk about, finally, function app settings. We'll do that next.
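Putting those experiments together, Azure's six-field timer schedules might look like this (the first field is seconds):

```
# {second} {minute} {hour} {day-of-month} {month} {day-of-week}
*/1 *   *    *  *   *       every second
0   0   3    *  *   *       at 3 a.m. every day
0   0   3    *  *   SUN     at 3 a.m. every Sunday
0   */5 *    *  6-9 *       every five minutes, June through September
0   0   */12 *  *   *       every 12 hours
```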

Inside Functions

After working with app services earlier in the course, we're going to notice a few familiar configuration options under function app settings. One of the first buttons over here on the right is an app service editor. We'll use that in just a little bit when we write another function for our function app, but first I want to show you configure app settings. This will open the blade that we would see to configure the application settings for any app service. That includes things like the .NET framework version. And the most important part would be down here where we specify app settings. There have been a few that were already set up for us. But when our function needs something like a connection string, we can put that connection string here in app settings to make a database, or a DocumentDB, or a blob storage account available to our function. There is also, over here, a dev console that would allow me to look through the file system where my function app is deployed. There's the option to configure continuous integration. This is where I could hook up my function app with a file sharing service like OneDrive, or set up continuous integration from Visual Studio Team Services, from a local git repository, or from GitHub. There is also a Kudu site, just like the Kudu site available for our regular app service. And just like with our regular app service, the address for this will have .scm in the middle. So ps/ And inside of here is where I can see that this really does work like an app service. You might remember browsing the file system for the application that we deployed earlier in the course. I can still come into site. I can still come into wwwroot. And here is where I can see that each function I create as part of this function app will have its own folder. And inside of that folder are the files associated with that particular function. So there's the run file, and there's the function.json file.
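As a sketch, here is how a function might pick up one of those app settings at runtime. The setting name MyDbConnection is hypothetical, and this assumes the classic C# script (csx) environment shown in this module:

```csharp
using System;
using System.Configuration;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    // App settings for a function app surface as environment variables...
    var fromEnvironment = Environment.GetEnvironmentVariable("MyDbConnection");

    // ...and are also visible through the classic ConfigurationManager API.
    var fromConfig = ConfigurationManager.AppSettings["MyDbConnection"];

    log.Info($"Connection string: {fromEnvironment ?? fromConfig}");
}
```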
I can also, from function app settings, go to the blade that I would typically see for my app service. So I can manage that app service, but there are going to be fewer options here because I don't need the ability to scale, and other infrastructure-related concerns just aren't a concern when I build a function app. But I can still set my deployment credentials and my deployment options. I can go under settings, where there's another place to get to application settings. I can install my own SSL certificate and have my own custom domain. Also, from here, I can get to the app service plan, but I can already see that it's consumption based, so there's not much for me to do there. And finally, over here, there are some easy settings that will allow me to configure cross-origin resource sharing (CORS) headers for my functions, configure authentication, and configure API metadata if I want people to more easily consume my APIs. There's also a place where I can set a daily usage quota, measured in gigabyte-seconds. I'm going to leave that undefined so I have no quota. But now what I'd like to do is build another function that can do some more interesting work for me. My web application is still set up and still allows users to upload images that I save into blob storage. What I'd like to do is write a function that will execute every time there is an upload, take that image, and analyze it to see if the person in the image is happy or fearful or sad. You might be wondering how I can do that. Well, I'll show you how in the following clips.

Adding Cognitive Services

What I want to do is a demo where I show you how I can write a function in a function app that interacts with Microsoft Cognitive Services. I want to do this for a few reasons. First of all, this will be a non-trivial function, meaning it does some genuinely useful and interesting work, although it's going to be easy to implement. But the real reason it's interesting is because Cognitive Services is interesting. Cognitive Services provides many different APIs to do things like recognize emotions, recognize faces, do speech recognition, perform text analytics, and translate text. There's even a recommendations engine, so all sorts of interesting APIs. And the Emotion API is one that is featured here on the home page for Cognitive Services. I can tell Cognitive Services about an image that is available over the internet, and Cognitive Services can come back and tell me what faces it finds in that image. And not just the faces; it can analyze those faces to determine the emotions that are present. Is this person angry or disgusted or happy or sad or surprised? And what I'm thinking is that as a user uploads images into my web application and I store them in blob storage, the creation of a blob can kick off, or trigger, a function in my function app that will make a call to the Cognitive Services API to analyze that new blob. And then I can take this information about how many faces there are and whether people are happy or fearful and store it in a database. In order to do this, I'm going to need to set up a Cognitive Services resource in Azure. So let me come in and do a search for Cognitive Services. And the first result that should come back is Cognitive Services APIs, currently in preview, but I know this is what I want to create. And just like everywhere else in the Azure portal, this will bring up a blade where I can fill out things like the account name. I want to call this ps-cognitiveservices. I do want it associated with the Pluralsight subscription.
Which API do I want to invoke? I want to invoke the Emotion API. The only location available is West US, and that's okay; I can call this API from anywhere. And I'm going to select the free tier, which allows me up to 20 calls per minute. I'm going to place this into my existing resource group, the one I created for my function app. And once I agree to the legal terms and click create, it'll just be a few moments until I have Cognitive Services at my disposal. While that is being set up, let me come over to app services. What we want to do is create a new function here in my function app. This function will be triggered whenever a new blob appears in blob storage. And to do that, once the menu appears here, I'll be able to select new function and choose from a template. There are a variety of templates available, and I can browse through different scenarios to find a template that is a good starting point. But the template I want is actually the first one that appears here, the template for a C# function that will be invoked whenever a new blob appears in a specified container. So let me select that template. And if I scroll down, I just need to provide some configuration information, like the name of this function. Let's call it image processor. Then I need to configure a path and a storage account. This is the type of information that will appear in that function.json file, which keeps all this infrastructure out of my function itself. And if I already had a storage account connected, it would appear in this list. But none of these storage accounts are the ones I want to use, so I'm going to click new. And all I need to do to connect a storage account to this function is to select one. As long as the storage account is in my subscription, Azure will take the connection string for that particular storage account and store it in the app settings for my function app.
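The resulting blob trigger, recorded in function.json, might look roughly like this; the connection entry names the app setting that holds the storage connection string:

```json
{
  "bindings": [
    {
      "type": "blobTrigger",
      "direction": "in",
      "name": "myBlob",
      "path": "images/{name}",
      "connection": "pscoursestore_STORAGE"
    }
  ],
  "disabled": false
}
```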
I could go in and see that if I went back to the function app settings. The key for that particular value will be pscoursestore_STORAGE. And now that I am connected to that account, I just need to provide a path. You'll remember we defined a container named images, and anytime a new blob appears, I want Azure to pick up the name and pass that name into my function as well. So I'm going to click create, and in just a few moments, we will have a very simple C# function. Currently, all it will do is log information: yes, we received a new blob, here is its name, and here is its length. Notice this function currently takes a stream. That would be very useful if I wanted to read the blob into memory here inside of my function. But if I look at the Cognitive Services API, all I need to do is invoke the API and pass along a URL for Cognitive Services to go out and read my image directly. And that's not something easy to do with just a stream, so let's see what else we can do here on the integrate tab. I can see I have a trigger set up for Azure blob storage. Here's the configuration information on the storage account and the container to use. I'm going to open up the documentation, and what I'll be able to find in here, in addition to various techniques for finding the right type of blob in your container, if I scroll down a little further, is that in C# functions, the blob that Azure finds can be passed in as a parameter using various different types. So yes, it can be passed in as a stream if I really want to read that blob, but it can also be passed in as a CloudBlockBlob reference. And you might remember that is the class we used in the previous module. Along with a shared access signature policy, we used the two in combination to generate a shared access signature that we could hand out to allow someone to view the image.
And that's exactly what I want to do with Cognitive Services: generate a URL with a shared access signature so Cognitive Services can read the image. So I'm going to copy this CloudBlockBlob and come back to my function. And instead of using a stream, I want something passed in as a CloudBlockBlob. Now a stream has a length property, and a CloudBlockBlob does not. So for right now, I'm just going to take size off of here. I'm not really concerned about the size, but it might be nice to have some logged information about the name of the blob that was received. Now I will have some problems, though, when I click save. Let me open up the logs. What I'll see is that I saved my function, it changed, Azure tried to recompile the function, and the error says the type CloudBlockBlob could not be found. The reason is that there are some assembly references and implicit using statements available for your C# functions, but the Azure Storage APIs are not one of those defaults or implicits. What I need to do is add an explicit reference. In a csx file, this is that special syntax I was talking about, I can use a #r directive to say I need to reference types that are available through Microsoft.WindowsAzure.Storage. You might remember that was the exact name of the NuGet package we had to install in the last module to have access to CloudBlockBlob. So once I have that reference, I can also add a using statement that says I want to use types in the namespace Microsoft.WindowsAzure.Storage.Blob. So once I have my reference and my using statement, let me try to compile the function again. And I can see compilation succeeded. One observation you might have had there, when I was typing that code in, is that there was no IntelliSense available, and that's true. Typing out this demo is going to be a little bit of a challenge because I rely on IntelliSense to help me along, but I can imagine in the future, all these tools will be improved.
For right now, I have something that is compiling and working, and I just want to run a quick test to see what happens when a new blob does appear.
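Putting those pieces together, the function at this stage might look roughly like the sketch below; the parameter names follow the binding configured earlier:

```csharp
#r "Microsoft.WindowsAzure.Storage"

using Microsoft.WindowsAzure.Storage.Blob;

// Triggered whenever a new blob appears in the images container.
// The blob is bound as a CloudBlockBlob instead of a Stream, so that
// later we can build a URI with a shared access signature for it.
public static void Run(CloudBlockBlob myBlob, string name, TraceWriter log)
{
    log.Info($"C# blob trigger function processed blob. Name: {name}");
}
```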

Testing a Function

When I want to test a function that I'm actively developing here in the portal, one option I have is to come in and click on this test link that appears in the upper right. This will allow me to enter a request body which will be used as input for my function. Now, this is not the literal input for my function. In other words, Azure doesn't take that request body, the text samples-workitems/something, and try to jam it into my blob or name parameter. Instead, Azure is a little bit smarter. Azure knows this particular function is triggered when a new blob appears in my storage account. It knows that because that is what is in function.json; that file is the metadata provided to Azure. So Azure will be smart enough to know: oh, if you're providing a request body, that must be the name of a new blob that appeared. So it will take that and do all the work to establish a connection to the storage account, find the container, find the blob in the container, and then pass in a CloudBlockBlob that is that particular blob entry. So all I need to do here to test this function is come out to my storage account in the images container and pick one of my existing images. I will select any of these and just copy out the URL, come back to my function app, and paste it in as the request body. And now all I need to do is remove the beginning of the URL. All I need to pass in is images/ and the blob name, which is a GUID. With that, I will run my function, and I will see that the function completed successfully. I can see the output of my log statement, C# blob trigger function processed blob. Here is the name; there's that GUID. So I know, even though I have very little code at this point, that my function can receive a CloudBlockBlob. Now all I need to do is add the code to generate a URL that Cognitive Services can use to get to that blob.

Analyzing an Image

Now that I have some test data to exercise my function, I'm ready to finish the implementation of this function. I'm going to do that by slowly pasting in some snippets of code and explaining what I'm doing as we go along. First of all, for the code I want to use, I do need some additional references. I'm going to include a reference to Newtonsoft.Json, and that reference will allow me to serialize a C# object into JSON data. I've also added using statements for Newtonsoft.Json and System.Text. At this point, I'm ready to use a new run method that will decompose the operations I need to perform into a couple of different steps. First, I'll still log some information when the function starts to execute about the blob I have received. That could be very useful for debugging. Then I need to take that blob and generate a URI that includes a shared access signature, so I can give Cognitive Services access to that blob. And once I have that URI, I can send it off to Cognitive Services for analysis. I'm going to wait for the analysis to come back and then simply log the results. I could also take that analysis, which will be JSON data, and easily store it in DocumentDB. Or I could put it into a queue to deliver somewhere else. And that's how function apps really can work well, because you can decompose a large operation into smaller pieces and compose these functions together so that the output of one function goes to the input of another. But they are all decoupled and can be deployed independently. Now the next thing we need to do is implement the GetUri method. So let me paste in some code that should look pretty familiar, because what we're going to use is almost a duplicate of code from the web application. Given a CloudBlockBlob, I need to generate a shared access signature.
So I will use a policy that says you can read from 30 minutes in the past to 30 minutes into the future, use that policy to get a shared access signature for this specific blob, and then build a URI to that blob by appending the shared access signature to the blob URI. Every CloudBlockBlob has a URI property which will point to that blob. And now that we have a URI, it is time to call Cognitive Services. So first, let me go to the documentation for Microsoft Cognitive Services. What we're going to be using is the Emotion API. And if I scroll down on this page, which is the documentation for that particular API, I can see the request URL. I'm going to be using this in my code. Ideally, it would be an application setting that is configurable, but I'm just going to plow along in this demo and hard code it. I'm also going to hard code the subscription key. This is a required request header that must be sent along in order for the call to succeed, and I'll show you where to find that subscription key in just a bit. When I send off this request, which is a POST operation, the body of the request has to include some JSON data, which is very simple: it just gives the URL of the image I want analyzed. Knowing that about Cognitive Services, let's come back to our function app and let me paste in an implementation of GetAnalysis. Given the URI to my blob, I'm going to use the .NET HttpClient to send off an HTTP request. The data that will go into the body of that request will be what I get when I serialize a new anonymous object that just says url equals this incoming URI as a string. So I'm building the body of the request that I have to send to Cognitive Services. Then, using the HttpClient API, I can build an HttpRequestMessage. This is the service endpoint that I want to go to, so it ends with /recognize. And again, ideally, that would be a configuration key.
This is going to be a POST operation, and I do have to include the subscription key. So where do I find the subscription key? Remember, earlier we created this ps-cognitiveservices resource. And if I go to this resource, it will look like most of the other ones. It will give me some basic monitoring information about how much I've called the API, and it will include my access keys. So under resource management, here's key one and key two. I'm just going to copy key one and, instead of putting it into an application setting, just paste it into the source code here so we can move along. I now need to set the content of my request, which is just going to be the JSON data that I created, as a string, and then send off the request, wait for a response, and read the content as a string. The content will be, again, a JSON string. I could deserialize that string into a C# object, but all I really want to do here is log the analysis so we can see it here on the page. Let's go ahead and save and run this code. I'm going to open up the logs, and we will see image processor is running. And I got some results back for that image that I sent up. I can see where the face rectangle is. And over here, I can see the scores. So there's very little anger. Notice this is scientific notation, so that's a very small number. Not much contempt, not much disgust, not much fear. There does seem to be quite a bit of happiness in this image, and it's not very neutral, not very sad, not very surprised. Let's also see what happens when a new blob appears. That was my testing blob. Let me run the web application that we have. I want to upload a new image and see what the analysis is for it, just by peeking back into the log file. So in the web application, which I'm just running locally, although I could also publish this version to Azure, I want to go to /images, browse for another image of myself, and click go. In this image, I look a little bit different than in the other one.
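The analysis logged here comes back as a JSON array with one entry per detected face; its shape looks roughly like this (the numbers below are illustrative, not the actual values from the demo):

```json
[
  {
    "faceRectangle": { "height": 97, "left": 68, "top": 97, "width": 64 },
    "scores": {
      "anger": 9.07e-7,
      "contempt": 0.000018,
      "disgust": 0.000011,
      "fear": 0.000022,
      "happiness": 0.998,
      "neutral": 0.0012,
      "sadness": 0.00017,
      "surprise": 0.000032
    }
  }
]
```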
Let's come back to our Azure Function which is over in this tab. And I can see already over here that the surprise score here was nearly perfect, and there's very little sadness or happiness or fear in that particular photo. So now I'm pretty confident that my function is working well and hopefully, you have a better idea of what you can do with Azure Function apps.
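Pieced together, the finished function from this demo might look roughly like the sketch below. The endpoint URL reflects the Emotion API documentation as described above, and the subscription key is a placeholder you would replace with your own key (or, better, an app setting):

```csharp
#r "Microsoft.WindowsAzure.Storage"
#r "Newtonsoft.Json"

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;
using Newtonsoft.Json;

public static async Task Run(CloudBlockBlob myBlob, string name, TraceWriter log)
{
    log.Info($"Analyzing blob: {name}");
    var uri = GetUri(myBlob);
    var analysis = await GetAnalysis(uri);
    log.Info(analysis);
}

private static Uri GetUri(CloudBlockBlob blob)
{
    // Readable from 30 minutes in the past to 30 minutes in the future,
    // the same SAS technique used in the web application.
    var policy = new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-30),
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(30)
    };
    var token = blob.GetSharedAccessSignature(policy);
    return new Uri(blob.Uri + token);
}

private static async Task<string> GetAnalysis(Uri uri)
{
    // The request body is just { "url": "..." }.
    var body = JsonConvert.SerializeObject(new { url = uri.ToString() });

    var client = new HttpClient();
    var request = new HttpRequestMessage(HttpMethod.Post,
        "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize");
    request.Headers.Add("Ocp-Apim-Subscription-Key", "<subscription-key>");
    request.Content = new StringContent(body, Encoding.UTF8, "application/json");

    var response = await client.SendAsync(request);
    return await response.Content.ReadAsStringAsync();
}
```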


In this module, we used serverless computing and an Azure Function app to perform image analysis. This is just one way to use Azure Function apps. They really are a great way to build small pieces of functionality that you can compose together into larger applications and distributed systems in general. For anything from batch processing to cleanup tasks and message handling, Azure Functions is a lightweight approach to building those small pieces, and as a bonus, you only pay for execution time. In the next module, we'll tie app services and deployments together and look at how to build a continuous integration pipeline.

Continuous Integration


Hi, this is Scott, and in this module we set up Continuous Integration for our web application. Continuous Integration, or CI for short, means automating everything that happens after I check some code into my source code repository. After I check in, I want my CI process to automatically build the application and ensure my changes haven't conflicted with other changes in a way that would break the build. Then I want to run the unit tests available for my application, and then, if everything passes, automatically deploy my application to Azure, perhaps into a staging slot or a testing slot, so testers can exercise, and integration tests can run against, a live application. Let's get started with an overview of the platform that we will use for CI.

Visual Studio Team Services

In this module we will be using Visual Studio Team Services, or VSTS, to set up source control and a full CI pipeline. There are enough free credits with VSTS that if you set up a new account, you'll be able to follow along. VSTS is a cloud platform, and although the name starts with Visual Studio, just like Azure, you can use a wide range of technologies on VSTS. You can build native applications for iOS and Android and even deploy from VSTS directly into app stores like Apple's App Store. Of course, you can build for Windows, you can build web applications, and you can build using different languages like Java. But providing a build server is just a small part of what VSTS provides. There are Agile tools that cover both Scrum and Agile approaches to building software. So these tools give you the ability to set up kanban boards, manage work items, epics, features, and stories, assign bugs to developers, and see all the work through from development to production. There's source control, of course. In this module we will set up a Git repository in VSTS. Once we have a Git repository, we can set up build definitions and release definitions to automatically take our code, build it, test it, and deploy it to our web server. There are testing tools included with VSTS, including a load testing tool that I've used in the past to determine how to allocate my resources in Azure. With a load testing tool, I can specify a number of virtual users and give those users a script to follow, then measure and monitor my web application to see how it performs with that many users. And there are plenty of reporting tools in VSTS. If you want to see how much work you have left, or how much work you are completing in each sprint, there's a wide variety of customizable reports to give you metrics on how your applications are progressing. There are so many features in VSTS that we won't have time to cover them all.
I'm going to concentrate on using Git and VSTS as a place to share the code for a project and setting up continuous integration with a build and release.

Creating a Team Project

To get started with VSTS, if you don't already have an account, do a search for Visual Studio Team Services. This used to be known as Visual Studio Online, but that search should bring you to the home page for VSTS, and it's on this page where you will see various links that you can click to get started for free. So, click on one of those, log in with the same credentials that you used to log in to the Azure portal, and then just go through the registration process to finish setting up your VSTS account. I created an account many years ago, so if I flip over to a browser that knows who I am, because I've already logged in with my Microsoft credentials, when I click Get Started For Free, this will take me directly into Visual Studio Team Services. And since I currently don't have any projects set up inside of VSTS, just like you would not have any if you'd just created an account, the first thing VSTS will prompt me to do is create my first team project. If you already have an account and you want to create a new project to experiment with, go to the navigation bar up here in the upper left and click on New Project. This will bring up a dialog with much of the same information that we have here. First, I need a project name. Let's call this ps-aspnetmvc; that's the application we've been working on and deploying. Then I can select a process template. I can pick from Agile, CMMI, or Scrum. For this course I'm just going to use the Agile template, but you can experiment and read some of the documentation, or go to some Pluralsight courses, to find out more about how the other process templates work. For version control I do want to use Git, and now I will just click Create Project. While the project is being created, let me come over and talk for just a moment about pricing. You might be concerned about pricing if you just signed up for an account, but I can tell you that for small teams, up to five users, VSTS is free.
So, I can invite four of my friends or colleagues to work on this project with me, and they'll have access to source control and to the work items if I grant that access, and I don't need to enter a credit card number. I also have access to up to four hours per month on a hosted pipeline; that is, a server in the cloud that will run my builds for me. So, if my build took one minute to complete, that would give me 240 free builds in the cloud for continuous integration each month. Once you go over five users, or once you start using more than four hours per month to run your builds, then you can look at associating a credit card with this account and paying some extra money each month to add more users or more build time. Back in the overview, my project has been created. Just like the Azure portal, I have an opening dashboard for my project. Since nothing is going on yet, there are not very many interesting metrics here, but eventually I can customize this dashboard to show the metrics I'm interested in and the work items that I have open. But before we can really do any of that, I need to get my code into this project.

Setting up a Source Code Repository

The first order of business is to take the source code that I currently have for my application on this local machine and get it into the cloud, into the new source code repository in my VSTS project. If I go to the code menu item here along the top, I can see various entries for managing this new source code repository. I'm just going to click on the code menu item itself. Now, because VSTS sees that my source code repository is completely empty, it's going to present me with a help page describing how I can interact with this repository. Since I already have code for my application available, it's just on my local drive, what I do not want to do is clone this empty repository to my computer. But I do want to spend some time in this section to highlight some links that might be useful. For example, this button, Generate Git Credentials. If you are working with a Git tool, perhaps one that your company mandates, and that tool requires you to enter a Git username and password, you can click on this button to create the username and password that you want to use to log in and use this repository. You can also create a personal access token. Now, if you're using a tool like Visual Studio, there's really good integration between Visual Studio and VSTS, so you don't need to explicitly generate Git credentials; you can just log in with your Azure and VSTS account. But this tool is here if you need to use it. There are also links here to download Git for Windows, as well as plugins for other popular IDEs, like IntelliJ or Eclipse. So, if you're working on other platforms with other languages, you can still use VSTS and use a popular IDE like Eclipse. I do suggest that you have Git for Windows installed, and there's a good possibility that if you have Visual Studio installed, you already have Git for Windows; it's just a question of whether or not that tool is on your path.
The easiest way to verify is just to open up a developer command prompt and type the command Git, G I T, and if this help screen appears, then you have the command line tools installed. And that's good. You should learn how to use them. I'm not going to cover them in this course but obviously Pluralsight has many Git courses available. If you don't have that tool available, you can download Git for Windows and run through the installation process. So, I don't want to clone this repository. What I want to do is create a new Git repository on my machine, a local Git repository and then push that existing repository into this VSTS repository and that will make it available to other users to work with me on this project. One way to do this operation completely from the command line is to go to the folder that is the root of my application, typically where the solution file exists, and use the command git init to initialize a new Git repository. After I've initialized the repository, I could follow that up with a Git remote add origin and a Git push and voila, all of my source code will be in VSTS. I'm going to show you how to do those exact three steps but through Visual Studio instead. Visual Studio adds a nice couple of additional features when I go through this process. So, inside of Visual Studio, if I want to create a local Git repository and get this pushed into VSTS, there's a couple different ways to do it. I'm just going to check one setting inside of here first. If I go to Tools, Options, I want to show you that under the Source Control category, you can select your current source control plugin that you want to use. I have mine set to Git and just making sure that is there might avoid some potential issues when I go to create a local repository. There's a couple ways to do this. So, I could come into the Solution Explorer and I can right click on my solution and say that I want to add this solution to source control. 
What that will do is create a local Git repository for me, as well as take a few additional nice steps, which I'll show you in just a second. I can also go to this publish operation. Visual Studio knows that this is not under source control, and perhaps my end goal is just to get this pushed onto a remote server. That's what I want to do, so I'm going to say yes, I want to publish this into a Git repository somewhere. The first thing Visual Studio will do as soon as I click that button is create a local repository for me. I know that for a couple of reasons. First of all, the status bar changed. I can now see that I am on the master branch of a repository called ps-aspnetmvc. There are no pending changes, but I do have two commits that need to be published, because no one else can see them if they're only on my local file system. And if I come back to the file system and do a listing for hidden directories, there I can see the .git directory. That is where Git keeps all of its bookkeeping information; that's where all the history of my repository exists. And I can also see, if I do a directory listing, that there are two new files here. These are the two things that I get when I create a repository through Visual Studio that I do not get for free if I just do a git init. So, these two files are .gitattributes and .gitignore. They set up the environment for my source control and determine how Git is going to behave. Let me just give you an example of what is inside of .gitignore. When I have a Git repository, not all of the files in the file system are files that I want to be checked in and pushed to a remote server. For example, I don't want to put my build results on the server. When I do a local build, I want to exclude my bin\Debug, bin\Release, and obj folders from anything that involves source control. Those files are just generated from my C# code files, so it's the C# code files that are important.
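The generated .gitignore is long, but the entries relevant to this discussion look something like:

```
# Build results -- generated from source, never committed
[Bb]in/
[Oo]bj/

# Per-user settings files, including encrypted publish credentials
*.user
```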
Likewise, I don't want to check in or commit or push any *.user files. You might remember earlier in the course when we did a publish operation, that operation created a publish settings .user file that contained an encrypted password, and that password is only good on this machine for myself. So, it doesn't make any sense to put that in source control. Visual Studio created this file for me. If I was doing things by hand with git init, I would have to find and download and place a .gitignore file into this folder to make sure the proper files and folders were ignored. So, back in Visual Studio, I am now in the Team Explorer window, which if you ever need to get to Team Explorer, you can always open up that window by going to the View menu and selecting Team Explorer. This window can make it really easy to interact with VSTS. You can manage your work items inside of here, you can commit changes and synchronize your code with the server. You can do quite a bit of work without ever actually going to the VSTS website. But the first thing we need to do is publish our local Git repository into that new team project that we created. And I can publish to any Git repository. I could publish to GitHub or Bitbucket but I want to go to Team Services. So, Publish Git Repo. I need to select the account that I'm going to use and then I just want to show you, if you have an existing project where you want the code to go, then you need to open up this advanced menu in VS 2015 because I need to select the exact team project that I want to go to, ps-aspnetmvc, and then I have a repository name which is fine. So, I'm going to publish this repository. I can see it was published. And if I come back to the website, let me refresh this code view and I can now see that my code is inside of a Git repository that lives in the cloud in Visual Studio Team Services. And now that we have our code in the cloud, what we want to do is execute a build in the cloud.
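A minimal excerpt-style sketch of entries along those lines (the .gitignore file Visual Studio generates for .NET projects is far more comprehensive):

```
# Build results -- generated from source, so never committed
[Bb]in/
[Oo]bj/

# Per-user settings, e.g. encrypted publish passwords
*.user
```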

Creating a Build Definition

Now that we have the source code in our repository, it's time to create a build, so that every time I check in, a build will be kicked off. So, if I go to the Build and Release menu, I will select Builds and because I currently don't have any build definitions, I'll be prompted to create my first new build definition. Now, I should say that VSTS builds can be complicated and varied. I can build everything from Xcode projects to Xamarin.iOS, I can be using Gradle, Jenkins, Maven, all of these build options. I can even start with an empty build definition if I really want to build things from scratch. The template I'm going to pick, because this is an ASP.NET application, is the Visual Studio build template. Notice there's a separate build template for ASP.NET Core projects. When I click next, I can configure the settings for this build. So, for example, if I wanted this particular build to pull source code from GitHub, I would click on the GitHub icon, but I want it to come from this project that we're in, ps-aspnetmvc, that is the source code repository that I want to use. This is the branch that I want to use. Of course, there are many different branching strategies in Git. Your developers might not be working on the master branch, they might be working on a development branch or a feature branch and of course, you can have QA branches and staging branches and release branches. We're going to keep things simple. I'm only going to check code into the master branch and I am going to select this box, continuous integration. That will be the magic so that whenever I check in, this build will automatically execute and compile all of my code. This can also work for pull requests, by the way, if you set up that sort of development workflow. The agent queue that I want to use will be the hosted agent queue, so I want a Windows machine in the cloud somewhere to build my project and the folder that I want to start at is the root folder of my repository, that's where the solution file exists.
So, with that in place, let me click Create and VSTS will show me my new build definition. Now, it's important to note that this is not saved yet. In order for this to function, I will need to save it. But let me give you an overview of what we're seeing here on the screen. On the left-hand side, these are the build steps that define my build process. What do I need to do to build my solution? Well, first I need to execute NuGet restore against my solution. So, this build step, which I could add manually by clicking on Add Build Step, requires some parameters and you can always see the parameters for each of these steps over here on the right-hand side. This says go through all directories and all sub-directories looking for files that have a .sln extension and when you find one, execute NuGet restore against that particular solution. That will make sure all the packages are available. There's usually an advanced section inside of the parameters that would allow you to specify more advanced settings, things like the exact NuGet version that you want to use. You could even check in a custom version of NuGet to source control and use that, in which case you'd have to specify the path to NuGet. And then once the NuGet packages are restored, we will go through and build all of the solutions that we find. This will be done on the build server using MSBuild, so you can just walk up to any solution file and say MSBuild and the name of the solution file and MSBuild will use the instructions inside of your solution file to perform the build. We are going to pass some additional parameters and that's because typically, when you build a solution, all that really produces is your bin release folder or your bin debug folder; it puts all of the assemblies and all of the packages that you need and all the code that was compiled from your application into that bin folder.
With web applications, we need a little bit more because we also have content files, things like Razor views and JavaScript files and CSS files. We need to make sure that those are properly placed and available for the build, so that we can get those files into our deployment package. And if you're familiar with Visual Studio, you might know that these command line arguments, like DeployOnBuild and WebPublishMethod, are a lot like right clicking on a web project inside of Visual Studio and saying I want to publish. But this is not going to publish into Azure. This is just going to publish to the local file system, but the publishing operation is smart enough to collect all of our Razor views and JavaScript files and CSS files and everything that I have marked as content, as well as all of my assemblies, and package that up into a single zip file that I can then use with MSDeploy to create a new web application, and that works inside of Azure; that also works locally inside of IIS. Now, the variables here are build platform and build configuration. Anytime you want to see the value of those, you can come over to the Variables tab here. This will tell me we will be doing a release build and the build platform will be Any CPU, and MSBuild will understand those parameters. I could also select the Visual Studio version that I want to build with. Test assemblies: currently, unfortunately, our application doesn't have any test assemblies, but I'm going to leave this build step here because in the future, if I add a unit testing project or two, and it doesn't matter whether it's using VSTest or xUnit or NUnit, I'll be able to execute those tests during my build and if a test fails, my build will fail, which is what I want. So, this step just goes and looks for any DLL file that has the word test in the name and executes a test runner against it. We're also going to publish PDBs and then the important part here is Publish Artifact.
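As a rough sketch, the full build command behind that publish-style step looks like the following. The property names are the standard MSBuild web-publish flags; the solution name is illustrative, and this runs on the Windows build agent rather than being something you would execute locally:

```
msbuild MyApp.sln
    /p:Configuration=Release /p:Platform="Any CPU"
    /p:DeployOnBuild=true
    /p:WebPublishMethod=Package
    /p:PackageAsSingleFile=true
    /p:PackageLocation="$(build.artifactstagingdirectory)"
```

The PackageLocation property is what lands the web deploy zip file in the artifacts staging directory, which the Publish Artifact step later picks up.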
What this will do is go to your build artifacts staging directory. Anything that is in that artifacts staging directory will be placed into what we call a build artifact and then that build artifact is available for me to download, if I want to install this build locally, or to hand off to a release definition, which we'll do later, that can automatically deploy this build somewhere, including Azure. The artifact type is going to be server, it's going to be stored on the build server but it's still going to be available to me. And artifact staging directory, by the way, that is where the web application was published to. If we look at the last parameter here, the package location, where will it place the zip file with everything that is needed for the web application? It will place it in the artifacts staging directory. Now, real quickly, I just want to point out that you can swap around these tasks just by dragging and dropping them. You can also add new tasks and there's a lot of tasks inside of this task catalog. So, if I want to build something with CMake or Gradle, if I have Grunt tasks defined in my application and I need to run Grunt or Gulp to process or transpile my JavaScript, all of those tasks are available. If there is something you need to do that is not here, there's also a marketplace that contains hundreds and hundreds of new tasks, many of them written by Microsoft but most of them by third parties and most of them are free but there are a few that require you to pay something. And one of the most important tasks to point out might be that you can run a script from the command line or you can run a PowerShell script as part of your build. And for very complicated builds, I would actually recommend that you take advantage of these utility tasks to run scripts to make your build because then you can check your build script into source control and developers can use that build script locally to do a build. 
The build server can use that script here in the cloud to do the build and everyone will be following a consistent build plan; that just makes things easier for more complicated projects and also makes problems easier to diagnose. But we're going to use this simple arrangement that was created for us. I do need to save this build definition. Let me call this master-ci. This is my CI build for the master branch of source control. I will select OK and now this page refreshes to show the build definition for master-ci, and I can still edit things. But let's come out to build definitions proper. I can see now I do have a single build definition in place. I'll see some statistics about this build once a few builds have happened and since we set this up for continuous integration, I should be able to make a change in source control, push my change to this repository and the build will be kicked off. But what I want to do is just kick one off manually to see if this works. So, if I click on the ellipsis here that appears, I can click Queue New Build and once I click on this entry, I just need to fill out a few parameters. So, I want this to be in the hosted queue, so a cloud server. I want this to be based on the master branch, not on a specific commit; just take whatever is there and build it. The variables that I have here are fine; I'm doing a release build. Click OK and I'll come over to a page that will show me the current progress of that build. Right now we are waiting for an available agent but once we have an agent and we have a build server set up, I will be able to see exactly what is happening on the build server; that will stream to me here inside of a window. So, we'll be right back as soon as this build is hopefully successfully completed. Here we are roughly 71 seconds later and the build has completed successfully.
If I did run into any problems, I'd be able to see any error output here in the console, and usually an easy thing to do if there is a problem is to click Download All Logs As Zip; then you can inspect those logs and figure out if there was a compiler error or maybe you had some path configuration set up wrong inside of the build, but all of the information is available. And now I'm going to come back out to my build area just to show you that master-ci did complete. My seven-day pass rate is 100% which is pretty good and if I click on this for more details for that specific build, there's plenty of information about where this occurred, the date and the time; yes, there was an issue, the issue was that there were no test assemblies found, and that's okay for right now. And I just want to call your attention to this little link here, Artifacts. If I follow this link, remember that one of our build steps wants to take everything in our artifacts staging directory and publish all those things into an artifact called drop. So, here indeed, I do see a folder called drop. And of course, you can publish other artifacts too. Maybe you have a Windows service or Azure functions that also have to be deployed. They could go into a separate artifact folder. And I could download everything that was produced by my build or just explore that build here on the server. It's available for as long as I want; you can specify retention times inside of VSTS. And then in the Artifacts Explorer, the most important file here would be the zip package. That was the file produced when I told Visual Studio that I wanted to build and also publish my web application. It took all of my assemblies and Razor views and JavaScript and CSS; it's all available inside of that zip file. So, I could download this build and install it locally on IIS but of course, we want to push this into Azure. So, let's look at that next.

Creating a Release Definition

Inside of VSTS, let's swing over to Releases. Once again, I'll be prompted to create a new release because I have no releases currently, so let's create a new release definition. I just want to point out that a build definition and a release definition are like C# class definitions. What I do is instantiate objects from those definitions. And that's the same concept with a build definition and a release definition: I instantiate a build and instantiate a release using these definitions. Now, just like with builds, release definitions can cover a wide variety of scenarios. For example, if I want to deploy to an Azure app service, there's a real easy way to do that here by selecting this first template. In fact, that's the one that we're going to use. But once we get into the meat of things, you'll see there are also tasks that would allow you to deploy to IIS on a virtual machine, deploy databases, run PowerShell scripts; everything's available once again. What I want to do is create a release definition that is based on a build, specifically in this project I want to use that master-ci build that I just created, and I want to check this box to go one step further than continuous integration. I want continuous deployment. So, I want the ability to check in a piece of code, that code will trigger a build and when the build completes, that build will trigger a release. Once again, I'll just use an agent queue of hosted and let's click Create. So, the release definitions are very similar to the build definitions. In this center column here, these are the instructions that the release manager will follow to deploy my application and anything else that is needed. I can drag and drop tasks, I can add additional tasks and this is where you can do many things with Azure, including things like Azure file copy, but you can also do things like a SQL database deployment.
So, if you manage your database schema using a Visual Studio database project, those projects produce DACPACs when they build. I can include the DACPAC as a build artifact and then add this task to apply the DACPAC to a database that is in Azure. And again, if there's something you need that you don't see in this list, you can always go to the marketplace and there are many other tasks available. We're going to stick with this one simple task, which is I want to deploy an Azure app service. Over here on the left, this can control multiple environments where my app service could be deployed. I'm only going to have one environment and I'm going to call that staging. I want to be able to deploy to the staging slot of my Azure app service. Now, this deployment hasn't been named yet. I'm going to go ahead and give it a name by clicking the little pencil icon and I will name this master to staging; this is my continuous integration release, right from the master branch into the staging slot for my app service. Now, currently, this is in red because there are some required parameters here. First, I need to pick the Azure subscription that I will use for the deployment and I will pick the Pluralsight OTC subscription. Now, this Authorize button appears. I want to talk about this for just a second. If you don't own the subscription, you might need some help here. I'm the one that created this VSTS project, I'm also the one that owns and created the subscription, therefore when I click Authorize, I should not have any problems here. If this authorization step fails for you, or if that subscription is not available where you want to deploy your app service, you might have to take some additional steps. And I'll show you basically what has to happen. Here on the gear icon, I'm going to go to the Services entry and right click this to open it in a new tab. If I come over here, this is what just happened when I clicked that Authorize button.
There was a new service endpoint that was added. This service endpoint allows VSTS and the release manager to communicate with Azure and use resources that are in the Azure subscription. If you don't have those permissions, you'll need to get someone who has admin privileges on the subscription to come in here and set up a new service endpoint for you, and for Azure you probably want the Azure Resource Manager endpoint; that is what we are using, but notice you can also hook up other services. So, if you need to retrieve source code from a private GitHub repository that someone else can give you permissions to, then you would come in here and click Authorize and fill out your GitHub credentials or enter a personal access token, click OK and now VSTS will be able to make a connection to that GitHub repository during your build and release cycles. But now that this is set up, I have an endpoint defined for Pluralsight OTC. So, coming back to the release definition, that was the subscription that I wanted. I want to go to ps-aspnetmvc, that is the app service name that I want to deploy to. I want to deploy to a specific slot in that app service. I need to select the resource group, that's ps-aspnetmvc, and I want to go to the staging slot. I don't need to specify a virtual application. And now, what package do I want to deploy there? Well, I can click this to browse and say in my artifacts for master-ci, I want you to take this zip file. That is my web deployment package and that is how I would like my application deployed. Now, I could also add a task here that would swap staging for production and that would give me a zero downtime deployment. But for right now, I'm just going to leave this in staging. Under Additional Deployment Options, you might want to take the app offline while you're doing the deployment. That will add the app_offline.htm file. And I think we have everything that we need for a release definition.
So, let me save this release definition and now come back to a list of all the releases I have. So, I have one master-staging-ci. Currently there are no releases. Let me right click this and say I want to instantiate a release from this release definition. In the future this will happen automatically when I check in code but I can always do this manually too. So, first I need to select the build. So, every release is attached to a specific build, that is the build that it will deploy. And now for my various environments, I can say, yes, I want an automated deploy to my staging environment and click Create. This will take a few moments. I can click to highlight it. Once again, I'll see a streaming log of what is going on but we'll be back in a few minutes when hopefully, the release has completed. Now, just a minute later and that release has been successful. And what I want to do is just verify that I have something running in staging. Of course, this won't look much different from what we had before because I haven't made any real changes to the website. But now we can do the true test of continuous integration and continuous deployment. Let's make a change to the web application, commit that change into the repository and see what happens.

Continuous Deployment in Action

Let's go back into Visual Studio and I will go to the Solution Explorer and open up the view that is displaying that page. That was the About page and let's make a simple change here. This is an update that I will check into the Git repo. Let me save this file. Now, there's a couple ways I could get this into VSTS. One is from the command line. My usual flow is to use git status, just to first survey what changes have occurred. And I can see here what I expected to see, which is that about.cshtml was edited. At that point I could use git add to stage that file for commit and then do a git commit. I would commit everything and add in a message, and once the commit is finished, I could do a git push to push my changes from this local repository into VSTS. I can also do all of those operations here in Visual Studio. So once again, inside of Team Explorer, I can come to the Home page in the Team Explorer window, which, by the way, now shows me things like my build definitions. But I want to go to Changes. This window will also tell me that about.cshtml changed. If I had a work item to associate with that change, I could add that here. But let me just add a commit message, something simple like testing continuous delivery, and say that I want to commit all. Once I've committed my changes, I can click on this hyperlink to sync or again, back from the Home page, there's a sync icon here, and what I'm going to do is push my commit out to the server. I see that was successfully pushed to origin master. Now, let's come back into VSTS and what I want to look at under builds is to see if there is a build kicked off. And indeed, I can see there is a build in progress. I can click to watch what is happening with that build but I'm going to assume that this build completes. Let's go over to Releases and wait just a bit for our next release to appear. Just a couple minutes later and I can see that I now have a new release under way.
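That command-line flow can be sketched as follows. A throwaway repository with a single file stands in for the real project, and the push is commented out because it needs a configured remote:

```shell
# Edit-status-add-commit-push flow against a demo repository.
set -e
cd /tmp && rm -rf ps-flow && mkdir ps-flow && cd ps-flow
git init
git config user.email "dev@example.com"   # local identity just for this demo
git config user.name "Demo Dev"
echo "original" > About.cshtml
git add -A && git commit -m "Initial commit"

# Make the edit, then survey what changed:
echo "updated" > About.cshtml
git status --short                 # reports About.cshtml as modified
git add About.cshtml               # stage the edited view
git commit -m "testing continuous delivery"
# git push origin master           # send the commit to VSTS
```

With continuous integration enabled on the build definition, that final push is what triggers the build, which in turn triggers the release.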
That release was triggered by the build that just completed and I can see that in the details, this release was triggered by master-ci build 20170112.2. By the way, you can adjust that name if you want to in the settings for your build definition and on my behalf, VSTS is now kicking off the continuous integration release. So, in a few moments we should have the new bits here in the staging environment. Another minute elapsed and now my release, release number two, has succeeded based on that new build and if I come in and look at all the releases for master staging, I can see there's been two releases now, one based on the build.1, this one based on the build.2. So, if I come out to my application that is in the staging slot, I do a refresh and I can see the new text has appeared automatically. And this is the promise of continuous delivery. Every time I check in changes to a Git repository, my changes can be integrated with changes from other developers. There will be a build. When the build is successful, there will be an automatic deployment and that deployment doesn't have to be into production. I could deploy to a staging server or a QA server. Right now I'm deploying to a staging slot that would give people the opportunity to come in, just do a quick look at the application, make sure everything is correct and then, by using the Azure portal, or even a PowerShell script, or even another release definition inside of VSTS, I could issue the command to swap staging and production and the new features in my application would be live.
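As a sketch, that swap could be issued with the cross-platform Azure CLI (an alternative to the portal or a PowerShell script; the resource names match the demo app and assume you are already logged in to the subscription):

```
# Swap the staging slot into production for the demo app service.
az webapp deployment slot swap \
    --resource-group ps-aspnetmvc \
    --name ps-aspnetmvc \
    --slot staging \
    --target-slot production
```

Because the swap exchanges already-warmed-up slots, this is what makes a zero downtime deployment possible.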


In this module we saw how to create a project in Visual Studio Team Services, how to push our code into a Git repository hosted by VSTS and then set up a continuous integration and continuous deployment pipeline to take our new code changes and deploy them directly into a staging slot of an Azure app service. If you combine this knowledge with what we learned earlier for app service monitoring, you can create a powerful DevOps environment where feedback on your application comes quickly. And that concludes my course. I hope this course gives you the knowledge that you need to get started with Azure and the confidence and the mental model to explore new features. And if you enjoyed this course, look for the advanced course coming later in 2017.