Apache Pulsar is a highly scalable, high-throughput system
that handles both queuing and streaming data with
incredible ease. This course will teach you the concepts
and tools you need to adopt Apache Pulsar in your projects.
Real-time applications are hard to scale! They can get high
volumes of data in an instant and need to route messages
correctly. Apache Pulsar is a highly scalable, low-latency,
high-throughput pub/sub system built to attack this problem.
In this course, Handling Streaming Data with Apache Pulsar,
you’ll learn how to tame these challenges by adopting Apache Pulsar. First,
you’ll explore Pulsar Functions for serverless ETL. Next,
you’ll discover how to connect your Pulsar deployment to
Kafka and databases with Pulsar IO. Finally, you’ll learn
how to migrate from Kafka to Pulsar with the client
wrapper. When you’re finished with this course, you’ll
have the skills and knowledge of Apache Pulsar needed
to handle high-volume streaming data in your applications.
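To give a taste of the serverless ETL the course covers: a Pulsar Function in Python is simply a class exposing a `process(input, context)` method, which the runtime calls once per message, publishing the return value to the function's output topic. The sketch below is illustrative (the class name and transformation are not from the course):

```python
# Minimal sketch of a Pulsar Function in Python (hypothetical example).
# Pulsar's Python function runtime calls process() once per message and
# publishes whatever it returns to the function's output topic.

class UppercaseFunction:
    """Serverless ETL step: uppercase every incoming message."""

    def process(self, input, context):
        # input: the message payload delivered by the runtime
        # context: runtime metadata (topic name, message id, logger, ...)
        return input.upper()
```

A function like this would typically be deployed with `pulsar-admin functions create`, pointing at the Python file, the class name, and the input/output topics.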
Axel Sirota has a Master's degree in Mathematics with a deep interest in Deep Learning and Machine Learning Operations. After researching probability, statistics, and machine learning optimization, he is currently working at JAMPP as a Machine Learning Research Engineer, leveraging customer data to make accurate predictions in real-time bidding.
Course Overview
Hi, everyone. My name is Axel Sirota. Welcome to my course, Handling Streaming Data with Apache Pulsar. I am a machine learning research engineer at JAMPP, an ML fanatic, a distributed systems enthusiast, and I am very excited to present this course to you. Real-time applications are hard to scale. They can get high volumes of data in an instant and need to route messages correctly. Furthermore, the problem gets even more complicated when you need to consider multi-tenancy, cold data offloading, and geo-replication. Apache Pulsar is a highly scalable, low-latency, high-throughput pub/sub system built to attack this problem. Our journey begins by discovering the architecture of Apache Pulsar and Pulsar Functions for serverless ETL. Next, we will learn about Pulsar IO to connect our deployment to the outside world, and then dive deep into debugging messages in real time with Pulsar SQL. Finally, we will migrate a working application from Apache Kafka to Apache Pulsar with a client wrapper and zero code changes. When you have finished this course, you will have the skills and knowledge of Apache Pulsar needed to handle high-volume streaming data in your applications with ease. I hope you will join me on this journey to learn Apache Pulsar with the Handling Streaming Data with Apache Pulsar course, at Pluralsight.
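As a preview of the Pulsar IO material: connectors are configured rather than coded, typically by attaching a built-in source or sink to a topic from the CLI. The command below is a sketch only; the topic, sink name, and config file are hypothetical placeholders:

```shell
# Sketch: attach Pulsar's built-in JDBC PostgreSQL sink to a topic.
# Topic, sink name, and config file below are hypothetical.
pulsar-admin sinks create \
  --sink-type jdbc-postgres \
  --inputs persistent://public/default/orders \
  --name orders-to-postgres \
  --sink-config-file postgres-sink.yaml
```

The connection details (JDBC URL, credentials, target table) would live in the referenced YAML config file rather than in code, which is what makes Pulsar IO a configuration-driven integration layer.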