
Conceptualizing the Processing Model for the AWS Kinesis Data Analytics Service

In this course, you will learn how you can use the Amazon Kinesis Data Analytics service to process streaming data using both the Apache Flink runtime and the SQL runtime. You will integrate your streaming applications with Kinesis Data Streams, Kinesis Data Firehose delivery streams, and Amazon S3.
Course info
Level
Advanced
Updated
Apr 28, 2021
Duration
2h 33m
Table of contents
Course Overview
Getting Started with the Kinesis Data Analytics Service
Processing Data Using the Apache Flink Runtime
Processing Data Using the SQL Runtime
Description

Kinesis Data Analytics is a service to transform and analyze streaming data in real time with Apache Flink and SQL using serverless technologies. In this course, Conceptualizing the Processing Model for the AWS Kinesis Data Analytics Service, you will learn that Kinesis Data Analytics is part of the Kinesis streaming platform along with Kinesis Data Streams, Kinesis Data Firehose, and Kinesis Video Streams.

First, you will get introduced to the Kinesis Data Analytics service for processing and analyzing streams. You will explore the runtimes available to process your data, which include the Apache Flink runtime, the SQL runtime, and the Apache Beam runtime. You will then deploy a streaming application using the AWS command-line interface. This will involve setting up the correct roles and policies for your application to access the resources that it needs.
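A CLI deployment like the one described above maps onto the `kinesisanalyticsv2` CreateApplication API, which can also be sketched in Python with boto3. This is a minimal sketch, not the course's exact commands: the application name, role ARN, bucket ARN, and JAR key below are hypothetical placeholders, and the service execution role is assumed to already grant access to the code bucket and any streams the application uses.

```python
"""Sketch: deploying a Flink application to Kinesis Data Analytics via boto3.

All names, ARNs, and bucket/key values are hypothetical placeholders.
"""


def build_create_application_request(app_name, role_arn, code_bucket_arn, code_key):
    # Assemble the request body for the kinesisanalyticsv2 CreateApplication call.
    return {
        "ApplicationName": app_name,
        "RuntimeEnvironment": "FLINK-1_11",  # a 2021-era Flink runtime version
        "ServiceExecutionRole": role_arn,
        "ApplicationConfiguration": {
            "ApplicationCodeConfiguration": {
                "CodeContent": {
                    "S3ContentLocation": {
                        "BucketARN": code_bucket_arn,
                        "FileKey": code_key,  # the packaged Flink job JAR
                    }
                },
                "CodeContentType": "ZIPFILE",
            }
        },
    }


def deploy(request):
    # Requires AWS credentials; boto3 maps this dict onto CreateApplication.
    import boto3

    client = boto3.client("kinesisanalyticsv2")
    return client.create_application(**request)


if __name__ == "__main__":
    req = build_create_application_request(
        "demo-flink-app",                                # hypothetical
        "arn:aws:iam::123456789012:role/kda-demo-role",  # hypothetical
        "arn:aws:s3:::demo-code-bucket",                 # hypothetical
        "flink-app-1.0.jar",
    )
    print(deploy(req)["ApplicationDetail"]["ApplicationARN"])
```

The request-building step is separated from the AWS call so the shape of the configuration can be inspected without credentials; the same structure is what the AWS CLI accepts as JSON input to `aws kinesisanalyticsv2 create-application`.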

Next, you will learn how you can deploy a Kinesis Analytics application using the web console. You will configure your streaming application to read from Kinesis Data Streams using an enhanced fan-out consumer and write to Kinesis Data Firehose delivery streams. You will also explore using the Table API in Apache Flink to process streaming data.
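A Table API job of the kind described above can be sketched in Python with PyFlink. This is a sketch of the pattern, not the course's code: the stream names, region, schema, and filter are hypothetical, and running `main()` assumes the `apache-flink` package plus the Flink Kinesis SQL connector jar are available, while `build_statements()` is pure Python and just returns the DDL/DML the job would register.

```python
"""Sketch: a Flink Table API job reading from and writing to Kinesis streams.

Stream names, region, and schema are hypothetical placeholders.
"""


def build_statements():
    # Source table backed by a Kinesis data stream (hypothetical name).
    source_ddl = """
        CREATE TABLE pricing_events (
            symbol STRING,
            price  DOUBLE
        ) WITH (
            'connector' = 'kinesis',
            'stream' = 'input-stream',
            'aws.region' = 'us-east-1',
            'scan.stream.initpos' = 'LATEST',
            'format' = 'json'
        )
    """
    # Sink table backed by another Kinesis data stream (hypothetical name).
    sink_ddl = """
        CREATE TABLE filtered_events (
            symbol STRING,
            price  DOUBLE
        ) WITH (
            'connector' = 'kinesis',
            'stream' = 'output-stream',
            'aws.region' = 'us-east-1',
            'format' = 'json'
        )
    """
    # Continuous query: keep only high-priced events.
    insert_dml = (
        "INSERT INTO filtered_events "
        "SELECT symbol, price FROM pricing_events WHERE price > 100"
    )
    return source_ddl, sink_ddl, insert_dml


def main():
    # Import inside the function so the module loads without PyFlink installed.
    from pyflink.table import EnvironmentSettings, TableEnvironment

    t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())
    source_ddl, sink_ddl, insert_dml = build_statements()
    t_env.execute_sql(source_ddl)
    t_env.execute_sql(sink_ddl)
    t_env.execute_sql(insert_dml).wait()  # submits the streaming job


if __name__ == "__main__":
    main()
```

On Kinesis Data Analytics itself the same table definitions and query run inside the managed Flink runtime; locally, the job needs the Kinesis connector on the classpath.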

Finally, you will deploy and run Kinesis Data Analytics applications using the SQL runtime, which allows you to run interactive SQL queries to process input streams. You will learn how to create and use in-application streams and understand the purpose of the stream pump.
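The in-application stream and pump pattern looks like the following sketch, shown here as SQL-runtime application code held in a Python string. The column names and the price filter are hypothetical; `SOURCE_SQL_STREAM_001` is the default name the service gives the in-application stream mapped onto the input source.

```python
# Sketch: Kinesis Data Analytics SQL-runtime application code.
# An in-application stream is a table-like structure that SQL statements can
# query; a pump is the continuously running INSERT that moves rows from one
# in-application stream into another.

APPLICATION_SQL = """
-- Destination in-application stream the application writes results to.
CREATE OR REPLACE STREAM "DESTINATION_SQL_STREAM" (
    symbol VARCHAR(16),
    price  DOUBLE
);

-- The pump continuously selects from the source stream mapped onto the
-- application's input and inserts matching rows into the destination.
CREATE OR REPLACE PUMP "STREAM_PUMP" AS
    INSERT INTO "DESTINATION_SQL_STREAM"
    SELECT STREAM "symbol", "price"
    FROM "SOURCE_SQL_STREAM_001"
    WHERE "price" > 100;
"""

if __name__ == "__main__":
    print(APPLICATION_SQL)
```

The destination in-application stream can then be wired to an application output, such as a Kinesis data stream or a Firehose delivery stream.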

When you are finished with this course, you will have the skills and knowledge to create and deploy streaming applications on Kinesis Data Analytics and use connectors to work with other AWS services as data sources and data sinks.

About the author

A problem solver at heart, Janani has a master's degree from Stanford and worked for 7+ years at Google. She was one of the original engineers on Google Docs and holds 4 patents for its real-time collaborative editing framework.

More from the author
More courses by Janani Ravi
Section Introduction Transcripts

Course Overview
[Autogenerated] Hi, my name is Janani Ravi, and welcome to this course on Conceptualizing the Processing Model for the AWS Kinesis Data Analytics Service. A little about myself: I have a master's degree in electrical engineering from Stanford and have worked at companies such as Microsoft, Google, and Flipkart. I currently work on my own startup, Loonycorn, a studio for high-quality video content. Kinesis Data Analytics is a service to transform and analyze streaming data in real time with Apache Flink and SQL using serverless technologies. In this course, you will get introduced to the Kinesis Data Analytics service for processing and analyzing streams. You will explore the runtimes available to process your data, which include the Apache Flink runtime, the SQL runtime, and the Apache Beam runtime. You will then deploy a streaming application using the AWS command-line interface. Next, you will learn how you can deploy a Kinesis Analytics application using the web console. You will configure your streaming application to read from an enhanced fan-out consumer for Kinesis Data Streams and write to Kinesis Firehose delivery streams. You will also explore using the Table API in Apache Flink to process streaming data. Finally, you will deploy and run Kinesis Data Analytics applications using the SQL runtime. The SQL runtime allows you to run interactive SQL queries to process input streams. You will learn how to create and use in-application streams and understand the purpose of the stream pump. When you're finished with this course, you will have the skills and knowledge to create and deploy streaming applications on Kinesis Data Analytics on Amazon Web Services, and you will know how to use connectors in your streaming code to work with other AWS services as data sources and data sinks.