Course info
Rating: (11)
Level: Beginner
Updated: May 22, 2018
Duration: 2h 1m
Description

By 2022, the global market for Hadoop is predicted to be worth over $87 billion. That's a huge market and one of the reasons data engineers are in such high demand. In this course, Getting Started with Hortonworks Data Platform, you will learn how to build a big data cluster using the Hortonworks Data Platform. First, you will explore how to navigate your HDP cluster from the command line. Next, you will discover how to use Ambari to automate your Hadoop cluster. Finally, you will learn how to set up rack awareness for multi-server clusters. By the end of this course, you will be ready to build your own data cluster using the Hortonworks Data Platform.
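To give a flavor of the rack awareness topic mentioned above, here is a minimal sketch (not taken from the course) of a Hadoop rack-topology script in Python. Hadoop calls the script named in the net.topology.script.file.name property of core-site.xml with one or more datanode IPs or hostnames and expects a rack path printed for each; the IP-to-rack mapping below is purely illustrative.

```python
#!/usr/bin/env python3
"""Minimal sketch of a Hadoop rack-topology script.

Hadoop invokes the script configured in net.topology.script.file.name
with one or more datanode IPs/hostnames as arguments and expects one
rack path printed per argument. The mapping here is hypothetical.
"""
import sys

# Hypothetical mapping of datanode IPs to rack IDs.
RACKS = {
    "192.168.1.11": "/rack1",
    "192.168.1.12": "/rack1",
    "192.168.2.11": "/rack2",
}

DEFAULT_RACK = "/default-rack"  # Hadoop's conventional fallback rack


def main():
    # Print one rack path per host argument, in the order the hosts were given.
    print(" ".join(RACKS.get(host, DEFAULT_RACK) for host in sys.argv[1:]))


if __name__ == "__main__":
    main()
```

With a script like this referenced from core-site.xml, the NameNode can spread block replicas across racks instead of treating every node as if it sat in the same rack.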

About the author

Thomas is a Senior Software Engineer and Certified ScrumMaster. He spends most of his time working with the Hortonworks Data Platform and on Agile coaching.

More from the author
Splunk: The Big Picture
Beginner
1h 40m
Dec 23, 2019
Performing Basic Splunk Searches
Intermediate
2h 17m
Aug 9, 2019
Section Introduction Transcripts

Course Overview
Hi everyone. My name is Thomas Henson, and welcome to my course, Hortonworks Getting Started. I'm a course author at Pluralsight and a data engineering advocate in the big data community. By 2020, the global market for Hadoop is predicted to be over 87 billion. Think about that. $87 billion. That's a huge market and one of the reasons data engineers are in such high demand. This course is devoted to training the future data engineers on how to build out their first big data cluster using the Hortonworks Data Platform. Hortonworks is one of the largest contributors to the big data analytics open source community and one of the core contributors to the start of Hadoop. Some of the topics that we will cover include learning how to use Ambari to automate our Hadoop cluster, from adding nodes to installing other data analytic components like Spark, Pig, and Hive; navigating our HDP cluster from the command line; setting up rack awareness for multi-server clusters; and understanding the changes that are coming in Hadoop 3.0. Those are going to be some huge changes. By the end of this course you'll know how to build and manage a Hortonworks Data Platform cluster. But before beginning this course, you should be familiar with installing Linux in a physical or virtual environment, and also some basic Linux command line skills. I hope you'll join me on this journey to learn about building Hadoop clusters, with the Hortonworks Getting Started course, at Pluralsight.
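Since the overview mentions automating a Hadoop cluster with Ambari, here is a minimal sketch (not from the course) of reading service state through Ambari's REST API with Python. The host name, cluster name, and credentials are placeholder assumptions, and the third-party requests package is assumed to be installed.

```python
"""Sketch: query an Ambari server's REST API for the state of each service.

Assumes an Ambari server reachable at AMBARI_URL with default admin
credentials and a cluster named "Sandbox" -- all illustrative values.
"""
import requests

AMBARI_URL = "http://ambari-host.example.com:8080"  # hypothetical Ambari host
CLUSTER = "Sandbox"                                  # hypothetical cluster name
AUTH = ("admin", "admin")                            # Ambari's default credentials


def list_services():
    # Ask Ambari for each service in the cluster and its current state
    # (e.g. STARTED, INSTALLED).
    url = f"{AMBARI_URL}/api/v1/clusters/{CLUSTER}/services"
    resp = requests.get(url, auth=AUTH, params={"fields": "ServiceInfo/state"})
    resp.raise_for_status()
    for item in resp.json().get("items", []):
        info = item["ServiceInfo"]
        print(f'{info["service_name"]}: {info.get("state", "UNKNOWN")}')


if __name__ == "__main__":
    list_services()
```

The same REST API is what Ambari's web UI uses under the hood, which is why it lends itself to the kind of cluster automation the course describes.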