Apache Hadoop, the open-source data management software that helps organizations analyze massive volumes of structured and unstructured data, is a hot topic across the tech industry. Employed by big-name websites such as eBay, Facebook, and Yahoo, Hadoop is widely tagged, along with cloud computing, as one of the most sought-after tech skills for 2012 and the years ahead.
Attendees will learn the following topics through lectures and hands-on exercises:
– Understanding Big Data and the Hadoop ecosystem
– Hadoop Distributed File System (HDFS)
– Planning, designing, and deploying a fully distributed cluster
– Managing and monitoring HDFS and MapReduce components
– Periodic and routine maintenance activities for the cluster
– Best practices for diagnosing, tuning, and solving Apache Hadoop issues
– Populating HDFS
Target Audience: Architects, developers, and administrators who wish to design, deploy, and manage Hadoop clusters.
Course Prerequisites: Participants should have a basic understanding of Linux.
The trainer has over 15 years of industry experience working on enterprise Java, SOA, and cloud computing platforms. He has worked with TCS, HP, and Patni, serving as solution and technical architect on large-scale projects for customers such as Motorola, Home Depot, CKWB Bank, and P&G. He provides consulting and training on Cloud Computing, Big Data & Hadoop, Google App Engine, and Amazon Web Services.