Hadoop Training in Pune

Hadoop Training Syllabus

Hadoop Introduction
We will start with the evolution of Hadoop and discuss its advantages over RDBMS and other traditional databases. We will then move through the Hadoop ecosystem and finish with how HDFS and MapReduce work.
Topics:
1. Where is Big Data?
2. Netflix challenge
3. RDBMS and its challenges
4. Ecosystem
5. HDFS (Hadoop Distributed File System)
6. MapReduce

Hadoop Architecture and HDFS
First you will learn how to install Hadoop on a single node. Then we will work with the HDFS file system, and after that we will add more nodes to build a rack/cluster scenario. During this module we will walk through the various configuration files; a minimal HDFS code sketch follows the topic list below.
Topics:
1. Single Node setup
2. Hadoop Shell Commands
3. Cluster Architecture
4. Multi node setup
5. Hadoop Administration
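
As a small taste of the HDFS portion, here is a minimal sketch that talks to HDFS from Java using the FileSystem API. The NameNode address (hdfs://localhost:9000) and the paths are placeholders for a typical single-node setup; adjust them to your own configuration.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder NameNode address; normally picked up from core-site.xml (fs.defaultFS)
        conf.set("fs.defaultFS", "hdfs://localhost:9000");
        FileSystem fs = FileSystem.get(conf);

        Path dir = new Path("/user/demo");
        fs.mkdirs(dir);                               // roughly: hdfs dfs -mkdir -p /user/demo
        fs.copyFromLocalFile(new Path("input.txt"),   // roughly: hdfs dfs -put input.txt /user/demo
                             new Path(dir, "input.txt"));
        System.out.println("Uploaded: " + fs.exists(new Path(dir, "input.txt")));
        fs.close();
    }
}

The same operations are available from the command line as Hadoop shell commands, which this module also covers.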

Hadoop MapReduce Framework
We will learn the concepts of the MapReduce framework and then run MapReduce on Hadoop with a worked example. The data will be stored in HDFS and YARN will manage the resources; see the word-count sketch after the topic list.
Topics:
1. MapReduce Approach
2. YARN components
3. YARN architecture
4. Practical MapReduce
5. Execution flow
6. Distributed Approach
7. XML Parser
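
To make the MapReduce approach concrete, below is a sketch of the classic word-count job in Java, close to the example shipped with Hadoop. It assumes the input and output directories are passed as command-line arguments and already live on HDFS.

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private final Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);        // emit (word, 1) for every token
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);          // emit (word, total count)
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));    // HDFS input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // HDFS output directory
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

When submitted with the hadoop jar command, the job is scheduled by YARN, which is exactly the execution flow this module walks through.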

Apache Pig
We will understand the importance of Pig over raw MapReduce and how the two combine to best effect. Then we will use Pig in several use cases. Pig gives you various execution modes, UDFs and streaming; a short Java-driven Pig sketch follows the topic list.
Topics:
1. Introduction
2. Importance of Pig over MapReduce
3. Pig Architecture
4. Commands
5. Pig Demo
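
One way to drive Pig from a Java program is the PigServer API, sketched below in local mode. The file students.txt and its (name, score) layout are made-up assumptions for illustration only; in class we also run the same Pig Latin interactively in the Grunt shell.

import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

public class PigExample {
    public static void main(String[] args) throws Exception {
        // Local mode keeps the demo self-contained; MapReduce mode would run on the cluster
        PigServer pig = new PigServer(ExecType.LOCAL);

        // Hypothetical input file with tab-separated (name, score) records
        pig.registerQuery("students = LOAD 'students.txt' AS (name:chararray, score:int);");
        pig.registerQuery("passed = FILTER students BY score >= 60;");

        pig.store("passed", "passed_students");   // writes the filtered relation to an output directory
        pig.shutdown();
    }
}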

Apache Hive
We will look at Hive alongside RDBMS and NoSQL systems and discuss why Hive keeps gaining popularity, helped by its openness. We will query data and write scripts on Hadoop using HiveQL; a small JDBC example follows the topic list.
Topics:
1. Hive vs traditional database
2. Limitations of Hive
3. Tables
4. Data import and export
5. Querying data
6. Partitioning (hashing)
7. Use case study
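
For querying Hive from a program, HiveServer2 exposes a JDBC interface. The sketch below assumes HiveServer2 is running on localhost:10000 and that a table named employees already exists; both are assumptions made only for this illustration.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveExample {
    public static void main(String[] args) throws Exception {
        // Load the HiveServer2 JDBC driver (hive-jdbc must be on the classpath)
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Assumed HiveServer2 endpoint; change host, port and database for your cluster
        Connection con = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "", "");
        Statement stmt = con.createStatement();

        // 'employees' is a hypothetical table used only for this example
        ResultSet rs = stmt.executeQuery(
                "SELECT department, COUNT(*) FROM employees GROUP BY department");
        while (rs.next()) {
            System.out.println(rs.getString(1) + " : " + rs.getLong(2));
        }

        rs.close();
        stmt.close();
        con.close();
    }
}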

Apache Spark
Spark plays an important role in real-time processing, streaming and graph workloads. We will look at the context and ecosystem of Spark and learn how to use it in real-time applications. Much of its advantage comes from the way it structures processing, so we will go into that in depth; a small word-count sketch follows the topic list.
Topics:
1. Introduction to Spark
2. Architecture
3. Intro to Scala
4. Various components of Spark and their uses
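
As a preview of the Spark component, here is a minimal word-count sketch using the Java RDD API (the course itself introduces Scala, but the idea is the same). The local[*] master and the HDFS input path are placeholders for a first experiment; on a real cluster the job would run on YARN.

import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

public class SparkWordCount {
    public static void main(String[] args) {
        // local[*] runs Spark inside this JVM; on a cluster you would submit to YARN instead
        SparkConf conf = new SparkConf().setAppName("SparkWordCount").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Placeholder input path; a plain local file also works for a first try
        JavaRDD<String> lines = sc.textFile("hdfs://localhost:9000/user/demo/input.txt");

        JavaPairRDD<String, Integer> counts = lines
                .flatMap(line -> Arrays.asList(line.split("\\s+")).iterator())  // split lines into words
                .mapToPair(word -> new Tuple2<>(word, 1))                        // emit (word, 1)
                .reduceByKey(Integer::sum);                                      // sum counts per word

        counts.collect().forEach(t -> System.out.println(t._1() + " : " + t._2()));
        sc.close();
    }
}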

Atashee Group Hadoop & Big Data

The Atashee Group Big Data course trains you in advanced database technologies like R programming, MongoDB & Hadoop, which are used by more than half of the Fortune 50 companies. The curriculum takes you from the basic to the advanced level, right from installing & configuring the technology and maintaining the system, to using additional products that integrate with other systems.

Course details:

The Smart Pro Big Data course trains you in:
  • Introduction to cloud computing
  • Fundamentals of Linux operating system
  • Fundamentals of Java
  • Introduction to big data
  • Working with a NoSQL database (MongoDB)
  • Hadoop programming
  • Reporting & analytics with big data
  • Project

Job opportunities:

On successfully completing the Smart Pro Big Data course, you will be ready to join top national & international IT companies as:
  • Big Data Professional
  • Hadoop Data Engineer
  • Big Data Software Engineer