Leveraging Fully Managed Redis Datastores Using Google Cloud Memorystore

by Vitthal Srinivasan

Cloud Memorystore is a new addition to the Google Cloud Platform. It provides a fully managed, cloud-hosted Redis service that lets GCP-hosted web applications cache frequently requested responses, giving your users low latency and high performance.

What you'll learn

Due to its in-memory nature, Memorystore delivers some of the lowest latencies on the platform, down to sub-millisecond levels. This managed Redis service runs on Google’s highly scalable infrastructure, supporting instances of up to 300 GB and network throughput of up to 12 Gbps. Memorystore also offers an easy migration path for existing users of Redis, a technology that is fast gaining popularity, especially within Docker containers running on Kubernetes. In this course, Leveraging Fully Managed Redis Datastores Using Google Cloud Memorystore, you'll examine all of these aspects of working with Memorystore and learn how to get the best out of this powerful managed database service.

First, you will explore the suite of storage products available on the GCP and where exactly Memorystore fits in. You will be introduced to using Redis to cache data for transactions and as a publisher-subscriber message delivery system, and you will learn about the LRU eviction policies that Memorystore follows.

Next, you will implement Memorystore integrations with applications hosted on Compute Engine VMs, App Engine, and Google Kubernetes Engine clusters. These are the compute options that the GCP currently supports for working with managed Redis.

Finally, you will dive into configuring Memorystore for high availability. Memorystore offers two Redis tiers: basic tier and standard tier instances. Basic tier instances do not support cross-zone replication and failover, while standard tier instances are equipped with both features; the standard tier also incurs far less downtime during scaling. You’ll also see how you can monitor Redis instances using Stackdriver.

When you’re done with this course, you will have a good understanding of how to use Memorystore to cache your data on the cloud and how to integrate managed Redis with applications running on the various compute options on the GCP.
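
To make the caching and eviction ideas concrete, here is a minimal sketch using the open-source redis-py client. The REDISHOST/REDISPORT environment variable names and the compute_report helper are illustrative assumptions, not something prescribed by the course; Memorystore itself simply exposes a host and port that your GCP-hosted application connects to.

```python
import os
import redis  # open-source redis-py client

# When the app runs on Compute Engine, App Engine flexible, or GKE, the
# Memorystore instance is reached over a private host and port. The
# variable names below are illustrative assumptions.
REDIS_HOST = os.environ.get("REDISHOST", "localhost")
REDIS_PORT = int(os.environ.get("REDISPORT", "6379"))

client = redis.Redis(host=REDIS_HOST, port=REDIS_PORT, decode_responses=True)


def get_report(report_id: str) -> str:
    """Return a report, recomputing it only on a cache miss."""
    cache_key = f"report:{report_id}"
    cached = client.get(cache_key)
    if cached is not None:
        return cached  # served straight from memory

    result = compute_report(report_id)  # hypothetical expensive operation
    # Store with a TTL. Under Memorystore's default volatile-lru policy,
    # keys carrying a TTL are the ones eligible for LRU eviction when the
    # instance runs low on memory.
    client.set(cache_key, result, ex=300)
    return result


def compute_report(report_id: str) -> str:
    # Placeholder for a slow database query or rendering step.
    return f"report body for {report_id}"
```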
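
The publisher-subscriber capability mentioned above can be exercised with the same client. This is a rough sketch continuing from the connection created in the previous example; the channel name and payload are arbitrary, not taken from the course.

```python
# Subscriber side: listen on a channel for cache-invalidation messages.
pubsub = client.pubsub()
pubsub.subscribe("invalidations")

# Publisher side (typically a different process): announce a stale key.
client.publish("invalidations", "report:42")

# The first message delivered after subscribing is the subscribe
# confirmation; subsequent ones carry the published payloads.
while True:
    message = pubsub.get_message(timeout=1.0)
    if message is None:
        break
    if message["type"] == "message":
        print("invalidate cached key:", message["data"])
```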

About the author

Vitthal has spent a lot of his life studying: he holds Master's degrees in Math and Electrical Engineering from Stanford, an MBA from INSEAD, and a Bachelor's degree in Computer Engineering from Mumbai. He has also spent a lot of his life working: as a derivatives quant at Credit Suisse in New York, then as a quant trader, first with a hedge fund in Greenwich and then on his own, and finally at Google in Singapore and Flipkart in Bangalore. In all these roles, he has written a lot of code.
