Course info
Rating
(16)
Level
Advanced
Updated
Aug 24, 2020
Duration
2h 29m
Description

In a world of data, governance can become chaotic very quickly. In this course, Enforcing Data Contracts with Kafka Schema Registry, you’ll learn to enforce and manage data contracts in your Apache Kafka-powered system. First, you’ll explore how the serialization process takes place and why Avro is such a great fit. Next, you’ll discover how to manage data contracts using Schema Registry. Finally, you’ll learn how to use other serialization formats with Apache Kafka. When you’re finished with this course, you’ll have the skills and knowledge of data governance with Schema Registry needed to enforce and manage data contracts in your Apache Kafka setup.
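To give a flavor of the data contracts discussed in the course, here is a minimal Avro schema sketch. The record and field names are illustrative assumptions, not taken from the course material:

```json
{
  "type": "record",
  "name": "PaymentReceived",
  "namespace": "com.example.events",
  "fields": [
    {"name": "paymentId", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "currency", "type": "string", "default": "EUR"}
  ]
}
```

A producer would register this schema with Schema Registry and serialize records with Confluent’s KafkaAvroSerializer, pointing it at the registry via the `schema.registry.url` configuration property. Note the `default` on `currency`: defaults are what make later schema evolution safe for existing consumers.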

About the author

Bogdan Sucaciu is a Software Engineer at Axual in the Netherlands, where he helps build a streaming platform designed to share information in real time, enabling instant processing of incoming events. He has several years of experience "cooking" software with JVM-based languages, adding some flavors of web technologies, and garnishing with automated testing.

More from the author
Micronaut Fundamentals
Beginner
3h 10m
Jul 28, 2021
Getting Started with Knative
Intermediate
1h 24m
Apr 8, 2021
Securing a Kafka Cluster
Advanced
2h 20m
Dec 14, 2020
Section Introduction Transcripts

Course Overview
[Autogenerated] Hi, my name is Bogdan Sucaciu, and welcome to my course, Enforcing Data Contracts with Kafka Schema Registry. When working with only a few Kafka topics, producers, and consumers, things might be easy. But when you have hundreds or even thousands of them, each using its own predefined data structure, things start to become rather complex. In this course, we're going to overcome that complexity by using a piece of technology called Schema Registry. Some of the major topics we'll cover include enforcing data contracts for Kafka topics, handling schema evolution, choosing compatibility levels between different versions of the data, and finally, diving into the most commonly used serialization formats with Apache Kafka. By the end of this course, you'll know how to properly choose a serialization format that matches your use case and, most importantly, how to make sure that everything will go smoothly while using it at scale. Before beginning this course, you should be familiar with the most common Apache Kafka concepts, like topics, producers, and consumers. Also, I will be using the Java programming language to work through some examples, so some previous Java experience will definitely help you stay focused on what's important. I hope you'll join me on this journey to learn about Schema Registry in the Enforcing Data Contracts with Kafka Schema Registry course, at Pluralsight.
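The compatibility levels mentioned above are configured per subject in Schema Registry. As a sketch, its REST API accepts a payload like the following on a `PUT /config/<subject>` request (the subject name and the registry address are up to your deployment; `http://localhost:8081` is the conventional default):

```json
{"compatibility": "BACKWARD"}
```

With BACKWARD compatibility, consumers using the new schema version can still read data produced with the previous one, which is why changes such as adding a field with a default value are considered safe.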