Learn how to fast-track applications from development to production by automating various application tasks with Docker and Ansible. You'll learn how to create a continuous delivery workflow that delivers a sample Python Web application to AWS.
Continuous delivery is fast becoming an indispensable practice for organizations that want to develop and deploy applications to production at speed and with improved reliability. This course, Continuous Delivery Using Docker and Ansible, will teach you how to create a robust, production-class continuous delivery workflow that will test, build, release, and continuously deploy your applications in Docker containers. You'll learn how to create a portable workflow locally on your machine that you can invoke with a handful of simple commands, and then how to run that workflow in the popular Jenkins continuous delivery system using the new Jenkins Pipeline plugin. Along the way, you will learn how to compose multi-container environments using Docker Compose, publish test reports, and set up integration with GitHub and Docker Hub. Finally, you will deploy your application to Amazon Web Services (AWS), using the AWS CloudFormation service to define all of the infrastructure requirements for your application, and the AWS EC2 Container Service to run your Docker applications in production. By the end of this course, you'll have a better understanding of continuous delivery and how you can use Docker and Ansible to deliver better applications more efficiently than ever.
Justin is a full stack technologist working with organizations to build large scale applications and platforms, with a focus on end-to-end application architecture, cloud, continuous delivery, and infrastructure automation.
Course Overview Hi everyone, my name is Justin Menga, and welcome to my course, Continuous Delivery Using Docker and Ansible. I am an independent full stack technologist working with organizations to accelerate application delivery and build scalable application architectures and platforms. Continuous delivery is the gold standard in application delivery, allowing development and operations teams to successfully deliver applications faster and more reliably. Docker and Ansible are some of the hottest technologies around, and they can really supercharge and empower continuous delivery. In this course, we are going to learn how to build a continuous delivery workflow from scratch, using Docker and Ansible as core enabling technologies, allowing us to test, build, release, and continuously deploy a sample application to Amazon Web Services. Some of the major topics that we will cover include running unit and integration tests using Docker, building and testing Docker release images, enabling push-button style automation for continuous delivery and deployment, establishing a continuous delivery pipeline using Jenkins, and finally, setting up a deployment pipeline that deploys your application to Amazon Web Services. By the end of this course, you'll know how to build a production-class continuous delivery workflow that you can run both locally on your computer and on any continuous delivery system that supports Docker. Before beginning this course, you should have a basic working knowledge of Docker and have worked with Ansible or other configuration management tools previously, although I have designed this course so that you can follow along even if you haven't worked with any of these technologies before. I hope you'll join me on this journey to learn how to create a production-class continuous delivery workflow with the Continuous Delivery Using Docker and Ansible course at Pluralsight.
Unit/Integration Testing using Docker Hi, my name is Justin Menga, and welcome to Unit and Integration Testing Using Docker. In this module, we are going to start creating the continuous delivery workflow for the sample application we created in the previous module. We will first introduce the continuous delivery workflow so you have a clear picture of the end-to-end workflow and which parts of it we will cover in this module. We will then create a base Docker image for the application, which will establish the runtime environment for the sample application. This image will form the basis of our development image, which we will create next. We will add the necessary dependencies for testing and building the sample application to the development image, after which we will be able to run our first tests. To facilitate integration testing against the MySQL backend we configured in the previous module, we will then create a more complex test environment using Docker Compose. This will include multiple containers and allow us to run integration tests from an application container against a MySQL database running in a separate container. We'll also see how we can use Docker Compose to create other useful services, such as volume container services, which provide a consistent cache for the package manager and reduce installation times between test runs, and agent services, which help orchestrate the workflow required to get our tests running reliably.
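A Docker Compose file for such a test environment might look like the following sketch. Everything here is an illustrative assumption rather than the course's actual files (service names, images, and credentials included): the idea is a test service linked to a MySQL service, a volume container that caches packages between runs, and an agent that blocks until the database is ready.

```yaml
# docker-compose.yml -- illustrative sketch only; service names,
# images, Dockerfile names, and credentials are assumptions
test:
  build: .
  dockerfile: Dockerfile.test
  links:
    - db
  environment:
    MYSQL_HOST: db
    MYSQL_USER: todo
    MYSQL_PASSWORD: password

db:
  image: mysql:5.7
  environment:
    MYSQL_ROOT_PASSWORD: password

# Volume container providing a persistent package cache between test runs
cache:
  build: .
  dockerfile: Dockerfile.test
  volumes:
    - /tmp/cache
  entrypoint: "true"

# Agent that waits until MySQL is accepting connections before tests run
agent:
  image: busybox
  links:
    - db
  command: sh -c "until nc -z db 3306; do sleep 1; done"
```

Running `docker-compose run --rm agent` before `docker-compose run --rm test` would then ensure the integration tests only start once MySQL is reachable.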
Building Artifacts using Docker Hi, my name is Justin Menga, and welcome to Building Artifacts Using Docker. In this module, we are going to continue from where we left off in the previous module, where we completed the test stage of our continuous delivery workflow. We will learn about the build stage, the goal of which is to take our tested application and create deployable, versioned application artifacts. After a brief overview of the build stage workflow, we will discuss the types of application artifacts, and then add metadata to the sample application to allow us to build a Python application artifact known as a wheel. We will then add a builder service to our Docker Compose test environment, which will leverage the development image we created in the previous module to build application artifacts. We will then publish the built artifacts so they can be used for subsequent stages of our continuous delivery workflow.
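The packaging metadata that enables building a wheel is typically captured in a setup.py file. The following is a minimal sketch; the package name, version, and description are assumptions for illustration, not the course's actual metadata:

```python
# setup.py -- minimal illustrative sketch; name, version, and
# description are assumptions
from setuptools import setup, find_packages

setup(
    name="todobackend",
    version="0.1.0",
    description="Todobackend sample Python web application",
    packages=find_packages(exclude=["tests*"]),
    include_package_data=True,
)
```

With metadata like this in place, `python setup.py bdist_wheel` produces a versioned `.whl` artifact under `dist/`, which a builder service can emit to a mounted volume for publishing.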
Continuous Delivery Automation Hi, my name is Justin Menga, and welcome to Continuous Delivery Workflow Automation. In the previous modules, we created the building blocks of the test, build, and release stages of our continuous delivery workflow. In this module, we are going to bring all of these building blocks together and create an automated workflow that can be invoked with a handful of commands using the GNU Make build system. By the end of this module, you will be able to run everything we have created in the last three modules, and more, with the commands make test, make build, and make release. Before we can continue on to more advanced topics in later modules, we need to set up supporting infrastructure for our workflow. This includes setting up GitHub repositories for the sample application and the other supporting repositories we have created so far, as well as setting up Docker Hub repositories to host our main application release image along with the various supporting images required for our workflow.
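The push-button automation described above might be sketched as a Makefile along these lines; the Compose file locations and service names are assumptions, not the course's actual layout:

```makefile
# Makefile -- illustrative sketch; compose file paths and
# service names are assumptions
.PHONY: test build release clean

test:
	docker-compose -f docker/dev/docker-compose.yml build
	docker-compose -f docker/dev/docker-compose.yml run --rm agent
	docker-compose -f docker/dev/docker-compose.yml run --rm test

build:
	docker-compose -f docker/dev/docker-compose.yml run --rm builder

release:
	docker-compose -f docker/release/docker-compose.yml build app
	docker-compose -f docker/release/docker-compose.yml up -d app

clean:
	docker-compose -f docker/dev/docker-compose.yml down -v
```

Each target simply chains the Docker Compose commands for that stage, so `make test`, `make build`, and `make release` become the only interface a person (or a CI system) needs to drive the workflow.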
Continuous Delivery Using Jenkins Hi, my name is Justin Menga, and welcome to Continuous Delivery Using Jenkins. The test, build, and release stages of our continuous delivery workflow are complete, and we are now able to run our workflow in a robust and consistent manner, after which we can tag and publish release images to Docker Hub. In this module, we will configure the popular continuous delivery system Jenkins to run our workflow using the Pipeline plugin, formerly known as the Workflow plugin. In the spirit of this course, we will set up and test Jenkins locally inside a Docker container, demonstrating the portability and flexibility of our workflow. We will encode the Jenkins workflow inside our todobackend repository, meaning any Jenkins system can execute our workflow by simply checking out the todobackend repository. With our workflow set up and tested, we will deploy Jenkins to Amazon Web Services using the EC2 Container Service, adopting an infrastructure-as-code approach and encoding all of the resources and services required to run Jenkins on Amazon Web Services in a CloudFormation template. After deploying Jenkins using CloudFormation, we will verify Jenkins is operational and then configure integration with both GitHub and Docker Hub to trigger our workflow automatically.
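A pipeline encoded in the repository might look like the following scripted-pipeline sketch; the stage names and make targets are assumptions for illustration:

```groovy
// Jenkinsfile -- illustrative scripted pipeline sketch;
// stage names and make targets are assumptions
node {
    checkout scm

    try {
        stage('Run tests') {
            sh 'make test'
        }
        stage('Build artifacts') {
            sh 'make build'
        }
        stage('Publish release image') {
            sh 'make release'
        }
    } finally {
        // Always tear down containers, even when a stage fails
        stage('Clean up') {
            sh 'make clean'
        }
    }
}
```

Because the pipeline only shells out to the same make targets we run locally, any Jenkins agent with Docker and Docker Compose installed can execute the full workflow.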
Continuous Deployment using Ansible Hi, my name is Justin Menga, and welcome to Continuous Deployment Using Ansible. To date, we have focused extensively on the test, build, and release stages of our continuous delivery workflow. The final piece of the puzzle is the deploy stage. In the last module, we got a taste of the target AWS platform and the CloudFormation building blocks that allow us to create a complete application environment. In this module, we will bring all of this together using Ansible and continuously deploy our todobackend application to AWS. We will first create a CloudFormation template that defines all of the necessary AWS resources and services required to support our todobackend application, and then create an Ansible playbook that will create our stack and deploy application releases. Finally, we will add a new pipeline job to Jenkins that will trigger our playbook after the release stage has completed successfully, continuously deploying our application on each new release. And with that, our continuous delivery pipeline will be complete.
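A playbook for the deploy stage might look like the following sketch, using Ansible's cloudformation module; the stack name, region, template path, and parameter names are assumptions for illustration:

```yaml
# deploy.yml -- illustrative sketch; stack name, region, template
# path, and parameters are assumptions
- name: Deploy todobackend application stack
  hosts: localhost
  connection: local
  gather_facts: no
  tasks:
    - name: Create or update the CloudFormation stack
      cloudformation:
        stack_name: todobackend
        state: present
        region: us-west-2
        template: templates/stack.yml
        template_parameters:
          ApplicationImageTag: "{{ image_tag }}"
      register: stack_result
```

Passing the release image tag in as a stack parameter (here via the hypothetical `image_tag` variable) is what lets the same playbook deploy every new release: Jenkins invokes it with the tag produced by the release stage, and CloudFormation updates only the resources that changed.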