- Learning Path
Build and Monitor Data Pipelines with Apache Kafka
This learning path is actively in production. More content will be added to this page as it gets published and becomes available in the library. Planned content includes:
1. Up and Running with Apache Kafka (video course)
2. Produce and Consume Data in Kafka (video course)
3. Design Topics, Schemas, and Retention Policies in Kafka (video course)
4. Integrate Systems with Kafka Connect (video course)
5. Stream Processing with Kafka Streams and ksqlDB (video course)
6. Operate and Monitor Kafka Clusters (video course)
7. Secure and Scale Kafka Deployments (video course)
8. Patterns, Anti-patterns, and Best Practices in Kafka Pipelines (video course)
Apache Kafka is a distributed event-streaming platform used to build real-time data pipelines and streaming applications. This path teaches data engineers how to design, operate, and monitor scalable Kafka environments that move data reliably across systems. Learners will gain the skills to produce, consume, transform, and manage streaming data, preparing them to support modern, event-driven architectures in production environments.
Content in this path
Build and Monitor Data Pipelines with Apache Kafka
Eager to begin your Apache Kafka learning journey? Start watching the courses below to get started building and monitoring your own data pipelines with this powerful open-source tool.
Try this learning path for free
What You'll Learn
- How to get up and running with Apache Kafka
- How to produce and consume data in Apache Kafka
- How to design topics, schemas, and retention policies in Apache Kafka
- How to integrate systems with Kafka Connect
- How to perform stream processing operations with Kafka Streams and ksqlDB
- How to operate and monitor Kafka clusters
- How to secure and scale Kafka deployments
- How to apply best practices in Kafka pipelines
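Several of the skills above, such as topic design, retention policies, and reliable delivery, ultimately come down to a small set of Kafka configuration properties. As a minimal sketch (the property names are standard Kafka configuration keys; the values shown are illustrative examples, not recommendations):

```properties
# Producer settings: favor durability over raw throughput
bootstrap.servers=localhost:9092
acks=all                      # wait for all in-sync replicas to acknowledge each write
enable.idempotence=true      # prevent duplicate records when the producer retries

# Consumer settings: explicit offset management
group.id=pipeline-monitoring
auto.offset.reset=earliest   # where to start reading when no committed offset exists
enable.auto.commit=false     # commit offsets manually, after records are processed

# Topic-level retention (set per topic, e.g. with the kafka-configs tool):
# retention.ms=604800000     # keep records for 7 days
# cleanup.policy=delete      # or "compact" for changelog-style topics
```

The courses in this path cover when and why to choose settings like these for your own pipelines.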
Learners interested in this learning path should have a foundational knowledge of data engineering concepts such as batch vs. streaming data and ETL workflows. They should also be familiar with basic stream processing terminology and understand how data flows through modern data pipelines. Prior experience writing SQL queries or working with data integration tools is also helpful.
- Stream Processing
- Data Engineering
- Data Pipeline Design
- Real-time Analytics
- Event-driven Architecture
