Hortonworks - HDP Operations: Install and Manage with Apache Ambari (1HW-AHOMHDP)
This four-day Apache Hadoop 2.0 training course is designed for administrators who deploy and manage Apache Hadoop 2.0 clusters. Through a combination of lecture and hands-on exercises, you will learn how to install, configure, maintain, and scale your Hadoop 2.0 environment. By the end of this course you will have a solid understanding of how Hadoop works with Big Data, and through the hands-on exercises you will have completed the full deployment lifecycle for a multi-node cluster. At course completion, all students receive complimentary access to the Hortonworks Apache Hadoop Administrator Certification program.
In this course you will learn best practices for Apache Hadoop 2.0 administration as experienced by the developers and architects of core Apache Hadoop, including:
- How to size and deploy a cluster
- How to deploy a cluster for the first time
- How to configure Hadoop and the supporting frameworks
- How to perform ongoing maintenance on nodes in the cluster
- How to balance and performance-tune a cluster
- How to move and manage data within a cluster
- How to integrate status and health checks into your existing monitoring tools (a "single pane of glass")
- How to add and remove DataNodes
- How to implement a high-availability solution
- Best practices for deploying a Hadoop cluster
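The "single pane of glass" objective typically means polling Ambari's REST API from tools you already run. Below is a minimal sketch of extracting a service's state from an Ambari `/services/<name>` JSON response; the host, port, cluster name, and response body are illustrative placeholders, not output from a real cluster:

```python
import json

# Hypothetical Ambari endpoint -- substitute your own host, port (8080 is
# the Ambari default), and cluster name.
AMBARI_URL = "http://ambari-host:8080/api/v1/clusters/mycluster/services/HDFS"

def service_state(response_text):
    """Extract the service state (e.g. STARTED) from an Ambari
    services JSON response body."""
    return json.loads(response_text)["ServiceInfo"]["state"]

# Trimmed-down example of the JSON Ambari returns for a healthy service:
sample = ('{"ServiceInfo": {"cluster_name": "mycluster", '
          '"service_name": "HDFS", "state": "STARTED"}}')
print(service_state(sample))  # STARTED
```

A monitoring agent would fetch the URL on a schedule (authenticated with Ambari credentials) and alert whenever the state is not STARTED.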
Who Can Benefit
This course is designed for IT administrators and operators responsible for installing, configuring and supporting an Apache Hadoop 2.0 deployment in a Linux environment.
This course utilizes a Linux environment. Attendees should know how to navigate and modify files in Linux; prior knowledge of Hadoop is not required.
Day 1: Foundation, Planning and Installation
- Introduction to Hortonworks Data Platform & Hadoop 2.0
- Hadoop Storage: HDFS Architecture
- Installation Prerequisites
- HDP Management: Ambari
- Ambari and the Command Line
- Hadoop Operating System (YARN) & MapReduce

Day 2: Configuration / Data Management
- Configuring Services
- Configuring HDFS
- Configuring Hadoop Operating System (YARN) & MapReduce
- Configuring HBase
- Configuring ZooKeeper
- Configuring Schedulers
- Data Integrity
- Extract-Load-Transform (ELT) Data Movement
- Copying Data Between Clusters

Day 3: Data Management / Hortonworks Data Platform (HDP) 2.0 Operations
- HDFS Web Services
- Apache Hive Data Warehouse
- Transferring Data with Sqoop
- Moving Log Data with Flume
- Setting up the HDFS NFS Gateway
- Workflow Management: Oozie
- Data Lifecycle Management with Falcon
- Monitoring HDP 2.0 Services
- Commissioning and Decommissioning Nodes and Services

Day 4: Hortonworks Data Platform (HDP) 2.0 Operations
- Rack Awareness and Topology
- NameNode Federation Architecture
- NameNode High-Availability (HA) Architecture
- Backup & Recovery
- Security
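The Day 3 topic "HDFS Web Services" centers on WebHDFS, the REST interface to HDFS. As a small sketch, the snippet below parses a LISTSTATUS response of the kind the NameNode returns on its HTTP port (50070 by default in HDP 2.0); the host, path, and sample response body are hypothetical:

```python
import json

# Illustrative WebHDFS URL -- replace host and path with your own.
WEBHDFS_URL = "http://namenode-host:50070/webhdfs/v1/user/hdfs?op=LISTSTATUS"

def list_paths(response_text):
    """Return the file/directory names from a WebHDFS LISTSTATUS response."""
    statuses = json.loads(response_text)["FileStatuses"]["FileStatus"]
    return [s["pathSuffix"] for s in statuses]

# Abbreviated example of a LISTSTATUS response body:
sample = '''{"FileStatuses": {"FileStatus": [
    {"pathSuffix": "data", "type": "DIRECTORY"},
    {"pathSuffix": "logs.txt", "type": "FILE"}]}}'''
print(list_paths(sample))  # ['data', 'logs.txt']
```

Because WebHDFS is plain HTTP and JSON, scripts like this can browse or move data without any Hadoop client libraries installed, which is what makes it useful for integration work covered in the course.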