
Conceptualizing the Processing Model for the AWS Kinesis Data Analytics Service

In this course, you will learn how to use the Amazon Kinesis Data Analytics service to process streaming data using both the Apache Flink runtime and the SQL runtime. You will integrate your streaming applications with Kinesis Data Streams, Kinesis Data Firehose delivery streams, and Amazon S3.
Course info
Level
Advanced
Updated
Apr 28, 2021
Duration
2h 33m
Table of contents
Course Overview
Getting Started with the Kinesis Data Analytics Service
Processing Data Using the Apache Flink Runtime
Processing Data Using the SQL Runtime
Description

Kinesis Data Analytics is a service to transform and analyze streaming data in real-time with Apache Flink and SQL using serverless technologies. In this course, Conceptualizing the Processing Model for the AWS Kinesis Data Analytics Service, you will learn that Kinesis Data Analytics is part of the Kinesis streaming platform along with Kinesis Data Streams, Kinesis Data Firehose, and Kinesis Video streams.

First, you will be introduced to the Kinesis Data Analytics service for processing and analyzing streams. You will explore the runtimes available to process your data, which include the Apache Flink runtime, the SQL runtime, and the Apache Beam runtime. You will then deploy a streaming application using the AWS command-line interface. This will involve setting up the correct roles and policies for your application to access the resources that it needs.
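
The CLI deployment described here boils down to a CreateApplication call whose payload names the runtime, the service execution role, and the S3 location of the application code. A rough sketch of that payload, assembled in Python (the application name, role ARN, bucket, and JAR key below are all hypothetical):

```python
# Sketch: the request body for `aws kinesisanalyticsv2 create-application`,
# assembled as a plain dict. All ARNs, bucket names, and keys are hypothetical.

def build_create_application_request(app_name, role_arn, code_bucket_arn, code_key):
    """Build the JSON payload expected by the CreateApplication API."""
    return {
        "ApplicationName": app_name,
        "RuntimeEnvironment": "FLINK-1_11",       # Apache Flink runtime
        "ServiceExecutionRole": role_arn,         # IAM role granting access to sources/sinks
        "ApplicationConfiguration": {
            "ApplicationCodeConfiguration": {
                "CodeContent": {
                    "S3ContentLocation": {        # application JAR uploaded to S3
                        "BucketARN": code_bucket_arn,
                        "FileKey": code_key,
                    }
                },
                "CodeContentType": "ZIPFILE",
            }
        },
    }

request = build_create_application_request(
    "stream-processor",                                   # hypothetical names
    "arn:aws:iam::123456789012:role/kda-execution-role",
    "arn:aws:s3:::my-kda-code-bucket",
    "flink-app-1.0.jar",
)
```

The service execution role referenced here is the one whose policies must grant access to the input streams, output streams, and the code bucket.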

Next, you will learn how you can deploy a Kinesis Analytics application using the web console. You will configure your streaming application to read from an enhanced fan-out consumer and write to Kinesis Firehose delivery streams. You will also explore using the Table API in Apache Flink to process streaming data.
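
In the Flink Kinesis connector, reading through an enhanced fan-out consumer is largely a matter of consumer configuration: you switch the record publisher from the default polling mode to EFO and register a consumer name. A minimal sketch of the relevant properties (the keys follow the flink-connector-kinesis consumer configuration; the region and consumer name are hypothetical):

```python
# Consumer properties for the Flink Kinesis connector, switching from the
# default polling reader to an enhanced fan-out (EFO) subscription.
# The consumer name and region below are hypothetical.
consumer_config = {
    "aws.region": "us-east-1",
    # Use a dedicated-throughput EFO subscription instead of polling GetRecords:
    "flink.stream.recordpublisher": "EFO",
    # Name under which the EFO consumer is registered on the stream:
    "flink.stream.efo.consumername": "kda-efo-consumer",
}
```

These properties would be passed to the connector's consumer when the source is created, giving the application its own read throughput on the stream rather than sharing the polling quota with other consumers.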

Finally, you will deploy and run Kinesis Data Analytics applications using the SQL runtime. The SQL runtime allows you to run interactive SQL queries to process input streams. You will learn how to create and use in-application streams and understand the purpose of the stream pump.
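
The relationship between an in-application stream and a pump can be illustrated with a short piece of Kinesis Data Analytics SQL (held here in a Python string; the destination stream and column names are hypothetical, while SOURCE_SQL_STREAM_001 is the service's default name for the first input stream): the in-application stream is a table-like, continuously updated staging area, and the pump is the continuous query that inserts rows into it.

```python
# Kinesis Data Analytics SQL (SQL runtime), shown as a Python string.
# SOURCE_SQL_STREAM_001 is the default name of the first in-application
# input stream; the destination stream and columns are hypothetical.
PUMP_SQL = """
-- An in-application stream: a table-like, continuously updated staging area.
CREATE OR REPLACE STREAM "DESTINATION_SQL_STREAM" (
    ticker_symbol VARCHAR(4),
    avg_price     DOUBLE
);

-- A pump: a continuous query that inserts rows into the stream above.
CREATE OR REPLACE PUMP "STREAM_PUMP" AS
    INSERT INTO "DESTINATION_SQL_STREAM"
    SELECT STREAM ticker_symbol, AVG(price)
    FROM "SOURCE_SQL_STREAM_001"
    GROUP BY ticker_symbol,
             STEP("SOURCE_SQL_STREAM_001".ROWTIME BY INTERVAL '10' SECOND);
"""
```

Without a pump, rows never flow between in-application streams; the pump is what keeps the downstream stream continuously populated as new records arrive.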

When you are finished with this course, you will have the skills and knowledge to create and deploy streaming applications on Kinesis Data Analytics and use connectors to work with other AWS services as data sources and data sinks.

About the author

A problem solver at heart, Janani has a Master's degree from Stanford and worked for 7+ years at Google. She was one of the original engineers on Google Docs and holds 4 patents for its real-time collaborative editing framework.

Section Introduction Transcripts

Course Overview
Hi. My name is Janani Ravi, and welcome to this course on Conceptualizing the Processing Model for the AWS Kinesis Data Analytics Service. A little about myself. I have a master's degree in electrical engineering from Stanford and have worked at companies such as Microsoft, Google, and Flipkart. I currently work on my own startup, Loonycorn, a studio for high‑quality video content. Kinesis Data Analytics is a service to transform and analyze streaming data in real time with Apache Flink and SQL using serverless technologies. In this course, you will get introduced to the Kinesis Data Analytics service for processing and analyzing streams. You will explore the runtimes available that you can use to process your data, which includes the Apache Flink runtime, the SQL runtime, and the Apache Beam runtime. You will then deploy a streaming application using the AWS command line interface. Next, you will learn how you can deploy a Kinesis Analytics application using the web console. You will configure your streaming application to read from an enhanced fan‑out consumer for Kinesis Data Streams and write to Kinesis Firehose delivery streams. You will also explore using the Table API in Apache Flink to process streaming data. Finally, you will deploy and run Kinesis Data Analytics applications using the SQL runtime. The SQL runtime allows you to run interactive SQL queries to process input streams. You will learn how to create and use in‑application streams and understand the purpose of the stream pump. When you're finished with this course, you will have the skills and knowledge to create and deploy streaming applications on Kinesis Data Analytics on Amazon Web Services, and you will know how to use connectors in your streaming code to work with other AWS services as data sources and data sinks.