A Cloud Guru Hands-on Lab

Scaling a MEAN App in Lightsail Using App Tiers

In this learning activity, we will implement the MEAN stack using a multi-instance architecture. We will then scale the architecture, first by separating the app and database tiers, and then by scaling the app tier using a load balancer. The goal of this learning activity is to gain experience with:

  • Creating a Lightsail instance complete with a pre-installed stack
  • Using a launch script to perform advanced configuration
  • Interacting with an on-instance MongoDB
  • Connecting to an instance to configure a MEAN-based application
  • Application testing and verification
  • Using multi-instance applications
  • Using snapshots to clone instances
  • Using load balancers to enable mass scaling


Path Info

Level: Intermediate
Duration: 1h 30m
Published: Jan 28, 2019


Table of Contents

  1. Challenge

    Deploy an instance-based MongoDB

    1. Deploy an instance in the Virginia us-east-1 region with 2 GB of memory using the Ubuntu 22.04 LTS Operating System (OS) only blueprint.

    2. Click + Add launch script and use the following code to configure the database:

    3. Name the instance Mongo.

    4. When the instance is built, log in using SSH and check that the database is running.

      mongosh --host $(hostname -i)
      show dbs
    5. Note the private IP of the instance; we will need it later.

    Note: hostname -I shows the private IP of the Mongo instance. The public IP is shown on each instance's card in the Lightsail console.
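Step 5 asks us to note the private IP. As a minimal sketch of how to grab it from `hostname -I` output (the sample address below is a placeholder, not a real lab value; we assume the private IPv4 address is listed first, which is typical on a single-NIC Lightsail instance):

```shell
# Placeholder standing in for real `hostname -I` output on the
# Mongo instance (assumption: the private IPv4 address comes first).
addrs="172.26.4.121 fe80::1"
# Keep only the first space-separated field.
private_ip=$(echo "$addrs" | awk '{print $1}')
echo "$private_ip"
```

On the real instance, replace the placeholder with `addrs="$(hostname -I)"`.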

  2. Challenge

    Deploy the application front end on an instance

    1. Deploy a new App + OS Node.js instance in the Virginia us-east-1 region using the 2 GB memory size.

    2. Use the Node.js blueprint (we don't need an on-instance database, and the app will install everything it needs).

    3. Use the following launch script:

    4. Name the instance node-fe-1.

    5. When the instance finishes building, we need to point it at the database server. Log in to the node-fe-1 instance using SSH and run:

      cd ~/todo-mean/
      sudo sh -c "cat > ./.env" << EOF
    6. Next, we need to make the application auto-start and do a final configuration.

      sudo pm2 startup ubuntu
      sudo pm2 start /home/bitnami/todo-mean/bin/www 
      sudo pm2 save
    7. Then we need to show the logs to ensure everything is working correctly.

      sudo pm2 logs www
    8. Once the logs are streaming, navigate to http://<frontendinstanceip> to test the application.
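The heredoc body for the `.env` file in step 5 is not shown above. Purely as an illustration of the shape such a file takes (every variable name and value here is a hypothetical placeholder, not the lab's actual configuration), a MEAN app's `.env` typically points the app tier at the database tier's private IP:

```
# Hypothetical .env contents -- names and values are placeholders;
# use the keys the lab's application actually expects, and the
# private IP you noted from the Mongo instance.
DB_HOST=172.26.4.121
DB_PORT=27017
```

Keeping the database address in `.env` rather than in code is what lets the cloned front-end instances in the next challenge reuse the same configuration unchanged.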

  3. Challenge

    Clone and scale the front end

    Snapshot the Working Front-End Instance

    1. Click the context (...) menu for the node-fe-1 instance.
    2. Choose Manage, then Snapshots.
    3. Create a snapshot and wait for it to complete.

    Create Additional Front-End Instances

    1. Locate the snapshot. Click the context (...) menu and select Create New Instance.
    2. Name the instance node-fe-2.
    3. Repeat this process to create a new front-end instance called node-fe-3.
    4. Navigate to each new instance's public IP and test that the application works.
  4. Challenge

    Distribute traffic across the front end with a load balancer

    1. Under Networking, create a load balancer and name it todo-lb.
      • Once available, attach all front-end instances to the load balancer (node-fe-1, node-fe-2, and node-fe-3).
      • Ensure that health checks are enabled.
      • Make sure each instance passes health checks.
      • When all instances are healthy, locate the load balancer DNS name and open it in a browser.
      • If no to-do task exists in the app, create one.
      • Refresh the page and confirm the task persists. Note that the hostname shown at the bottom changes, indicating which front-end instance served the request.
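A quick way to reason about the final check: across several refreshes you should eventually see every front-end's hostname. A minimal offline sketch (the hostnames below are placeholders standing in for what the page footer reports; we assume the app prints the serving instance's hostname):

```shell
# Placeholder responses standing in for the hostname shown after
# each refresh of the load balancer's DNS name.
responses="node-fe-1
node-fe-2
node-fe-1
node-fe-3"
# Count how many distinct front-ends have served a request.
distinct=$(echo "$responses" | sort -u | wc -l | tr -d ' ')
echo "distinct front-ends seen: $distinct"
```

If the count stays at 1 after many refreshes, check that all three instances are attached to the load balancer and passing health checks.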

The Cloud Content team comprises subject matter experts hyper-focused on services offered by the leading cloud vendors (AWS, GCP, and Azure), as well as cloud-related technologies such as Linux and DevOps. The team is thrilled to share their knowledge to help you build modern tech solutions from the ground up, secure and optimize your environments, and so much more!

What's a lab?

Hands-on Labs are real environments created by industry experts to help you learn. These environments help you gain knowledge and experience, practice without compromising your system, test without risk, destroy without fear, and let you learn from your mistakes. Hands-on Labs: practice your skills before delivering in the real world.

Provided environment for hands-on practice

We will provide the credentials and environment necessary for you to practice right within your browser.

Guided walkthrough

Follow along with the author’s guided walkthrough and build something new in your provided environment!

Did you know?

On average, you retain 75% more of your learning if you get time for practice.

Start learning by doing today

View Plans