VIDEOS TO LEARN ABOUT OUR UNIQUE TRAINING PROCESS:
- Live Training Program Details
- Video Training Program Details
- “After Training” Process
- 6 steps to placement
- Online Training Benefits
- What Happens After Signup?
Course Duration: 30–35 hours of training, plus assignments and actual project-based case studies
Training Materials: All attendees will receive:
- An assignment after each module
- A video recording of every session
- Notes and study material for the examples covered
- Access to the Training Blog & Repository of Materials
This course is designed for anyone who is:
- Wanting to architect a project using Hadoop and its ecosystem components
- Wanting to develop MapReduce programs
- A Business Analyst or Data Warehousing professional looking for an alternative approach to data analysis and storage
Prerequisites:
- The participants should have at least basic knowledge of Java.
- Any experience with a Linux environment will be very helpful.
- We will provide you with pre-recorded Core Java videos if you need them.
Advantages of Hadoop online:
An online Hadoop tutorial designed by Hadoop experts provides you with knowledge and skills in the field of Big Data and Hadoop and trains you to become a successful Hadoop Developer. ZaranTech’s Big Data Hadoop online training is the first of its kind, providing a best-in-class online Hadoop tutorial, and is ideal for all aspiring professionals who want to build their career in Big Data Analytics using the Hadoop framework and master all the concepts. Towards the end of the Hadoop course, you will work on assignments, live case studies, Hadoop interview questions, and a Big Data Hadoop Certification sample test.
This course is delivered as a highly interactive session with extensive live examples. It is live, instructor-led online training delivered using the Cisco WebEx Meeting Center web and audio conferencing tool.
Timing: Weekdays and Weekends after work hours.
- Focus on hands-on training
- 30 hours of assignments and live case studies
- Video recordings of sessions provided
- Demonstration of concepts using different tools
- One problem statement discussed across the whole training program
- Hadoop certification guidance
- Resume preparation and interview questions provided
- Introduction to Hadoop and Big Data
- Covers all important Hadoop ecosystem products
How are we Different from other Training Institutes?
Role-specific training instead of Product-based training – We are the leaders in providing **Role-Specific training and e-learning solutions for individuals and corporations. Our curriculum is based on real-time job functions as opposed to being product-based. Real-time scenarios and troubleshooting techniques are shown in class.
(**Role-based training – Here our trainers share their real-time implementation experience in class. The trainer works with participants on several case studies based on actual projects. This gives participants an understanding of how things are accomplished in a real-time environment. The idea is to get the participant familiar with the process, in real time.)
Longer Course Durations – We provide students with more detailed training, with assignments based on real-time scenarios as well as case studies, so that students take away relevant experience on their respective platforms.
We offer Training Blogs using Google Sites – The Training Blogs are a common platform for both the trainer and the trainees to interact, discuss queries with the trainers, and upload and refer to assignments. The Training Blogs let students attend sessions anywhere, anytime, using a laptop, desktop, or tablet.
We provide study materials using Google Drive – We provide access to a repository of training materials using Google Drive Cloud. Students are given lifetime access to their respective modules on Google Drive, which can be accessed anywhere, anytime.
For our SAP Trainings – We offer the longest course durations in SAP compared to any other training institute. Our SAP training programs are very detailed, and integration with other SAP modules is covered as part of our training programs.
Never miss a session – We video record every online training session and post the recording on the training blog after the session. So if a student misses a live online session, the video is always available on the blog. Students can also go back to these recordings for review.
Highly Qualified and Well Experienced Trainers – Our Trainers are highly qualified and are well experienced in their respective domains. We have trainers from USA, Canada, Australia, Singapore and many other countries.
Case Studies and Assignments Based on Real Scenarios – The case studies and assignments given to students are based on real-time scenarios drawn from the trainers' past projects.
Certification Assistance – During and at the end of training, the senior trainer will provide certification questions and answers to help you clear the certification (if required). The trainers, who are certified themselves, will guide each student through the required certification program. Every student also receives a ZaranTech Training Completion Certificate.
Career Counseling – If you are new to IT and want career counseling to help you decide which stream to go into, please click the link http://www.zarantech.com/free-career-counseling/ and fill out the Career Counseling form, and one of our counselors will get in touch.
Placement Assistance – Our “After the Training” team can also help you with resume preparation guidance, interview questions, and mock interviews after your training is complete.
Modules Covered in this Training
- Introduction and Overview of Hadoop
- Hadoop Distributed FileSystem (HDFS)
- HBase – The Hadoop Database
- Map/Reduce 2.0/YARN
- MapReduce Workflows
- Putting it all together
- Integrating Hadoop Into The Workflow
- Delving Deeper Into The Hadoop API
- Common Map Reduce Algorithms
- Using Hive and Pig
- Practical Development Tips and Techniques
- More Advanced Map Reduce Programming
- Joining Data Sets in Map Reduce
- Graph Manipulation in Hadoop
- Creating Workflows With Oozie
- HANDS ON EXERCISE
Attendees also learn:
- Resume Preparation Guidelines and Tips
- Mock Interviews and Interview Preparation Tips
Introduction and Overview of Hadoop
- What is Hadoop?
- History of Hadoop
- Building Blocks – Hadoop Ecosystem
- Who is behind Hadoop?
- What Hadoop is good for and what it is not
Hadoop Distributed FileSystem (HDFS)
- HDFS Overview and Architecture
- HDFS Installation
- HDFS Use Cases
- Hadoop FileSystem Shell
- FileSystem Java API
- Hadoop Configuration
HBase – The Hadoop Database
- HBase Overview and Architecture
- HBase Installation
- HBase Shell
- Java Client API
- Java Administrative API
- Scan Caching and Batching
- Key Design
- Table Design
- MapReduce 2.0 and YARN Overview
- MapReduce 2.0 and YARN Architecture
- YARN and MapReduce Command Line Tools
- Developing MapReduce Jobs
- Input and Output Formats
- HDFS and HBase as Source and Sink
- Job Configuration
- Job Submission and Monitoring
- Anatomy of Mappers, Reducers, Combiners and Partitioners
- Anatomy of Job Execution on YARN
- Distributed Cache
- Hadoop Streaming
- Decomposing Problems into MapReduce Workflow
- Using Job Control
- Oozie Introduction and Architecture
- Oozie Installation
- Developing, Deploying, and Executing Oozie Workflows
- Pig Overview
- Pig Latin
- Developing Pig Scripts
- Processing Big Data with Pig
- Joining data-sets with Pig
- Hive Overview
- Hive QL
- Sqoop Tools
- Sqoop Import
- Sqoop Import all tables
- Sqoop Export
- Sqoop Job
- Sqoop metastore
- Sqoop Eval
- Sqoop Codegen
- Sqoop List Databases and List Tables
- Sqoop Create Hive Table
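Hadoop Streaming, listed above, lets you write the map and reduce steps in any language that reads stdin and writes stdout. The sketch below is a minimal, Hadoop-free simulation of the classic word-count job in Python: the `mapper` and `reducer` functions and the sample documents are illustrative, and `sorted()` stands in for Hadoop's shuffle-and-sort phase.

```python
import itertools

def mapper(lines):
    """Map phase: emit a Streaming-style 'word\t1' pair for every word."""
    for line in lines:
        for word in line.strip().split():
            yield f"{word.lower()}\t1"

def reducer(pairs):
    """Reduce phase: sum counts per word. Input must be sorted by key,
    which is exactly what Hadoop's shuffle guarantees."""
    for word, group in itertools.groupby(pairs, key=lambda kv: kv.split("\t")[0]):
        total = sum(int(kv.split("\t")[1]) for kv in group)
        yield f"{word}\t{total}"

if __name__ == "__main__":
    docs = ["the quick brown fox", "the lazy dog", "the fox"]
    shuffled = sorted(mapper(docs))   # stand-in for Hadoop's sort/shuffle
    for out in reducer(shuffled):
        print(out)
```

In a real Streaming job the same two functions would read from `sys.stdin` and be wired together with `hadoop jar hadoop-streaming.jar -mapper ... -reducer ...`; the local `sorted()` call is the only piece Hadoop replaces.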
Putting it all together
- Distributed installations
- Best Practices
Our Advanced Hadoop module is an extension of the Essential Hadoop module, designed with the objective of in-depth coverage with case-study illustrations.
Integrating Hadoop into the Workflow
- Relational Database Management Systems
- Storage Systems
- Importing Data from RDBMSs With Sqoop
- Hands-on exercise
- Importing Real-Time Data with Flume
- Accessing HDFS Using FuseDFS and Hoop
Delving Deeper Into the Hadoop API
- More about ToolRunner
- Testing with MRUnit
- Reducing Intermediate Data With Combiners
- The configure and close methods for Map/Reduce Setup and Teardown
- Writing Partitioners for Better Load Balancing
- Hands-On Exercise
- Directly Accessing HDFS
- Using the Distributed Cache
- Hands-On Exercise
Common MapReduce Algorithms
- Sorting and Searching
- Machine Learning With Mahout
- Term Frequency – Inverse Document Frequency
- Word Co-Occurrence
- Hands-On Exercise
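Term Frequency – Inverse Document Frequency, covered in this module, weights a term by how often it appears in a document discounted by how many documents contain it. As a rough sketch of the math (the function name and document representation are our own, not from the course materials):

```python
import math
from collections import Counter

def tf_idf(docs):
    """Compute tf-idf per (document, term).

    docs: a list of documents, each a list of tokens.
    Returns one {term: score} dict per document, where
    score = (term count / doc length) * log(N / document frequency).
    """
    n = len(docs)
    # document frequency: in how many documents each term appears
    df = Counter(term for doc in docs for term in set(doc))
    scores = []
    for doc in docs:
        tf = Counter(doc)
        scores.append({t: (tf[t] / len(doc)) * math.log(n / df[t]) for t in tf})
    return scores
```

A term appearing in every document gets a score of zero (log of 1), which is why stop words like "the" fall out naturally. In a MapReduce setting this is typically done in two or three chained jobs: one counting term frequencies, one counting document frequencies, and one joining the two.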
Using Hive and Pig
- Hive Basics
- Pig Basics
- Hands-on exercise
Practical Development Tips and Techniques
- Debugging MapReduce Code
- Using LocalJobRunner Mode For Easier Debugging
- Retrieving Job Information with Counters
- Splittable File Formats
- Determining the Optimal Number of Reducers
- Map-Only MapReduce Jobs
More Advanced MapReduce Programming
- Custom Writables and WritableComparables
- Saving Binary Data using SequenceFiles and Avro Files
- Creating InputFormats and OutputFormats
- Hands-On Exercise
Joining Data Sets in MapReduce
- Map-Side Joins
- The Secondary Sort
- Reduce-Side Joins
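A reduce-side join, the last item above, works by tagging each record with the dataset it came from, shuffling on the join key, and combining the records that meet in one reducer call. The following is a plain-Python simulation of that pattern (the function, tags, and sample schema are illustrative, not from the course):

```python
from itertools import groupby

def reduce_side_join(customers, orders):
    """Join (id, name) customer records with (id, amount) order records.

    Map phase: tag each record with its source ('C' or 'O').
    Shuffle: sort by key, then tag -- 'C' sorts before 'O', so the
    customer record reaches the reducer first (in real MapReduce this
    ordering is what a secondary sort guarantees).
    Reduce phase: combine each key's records into joined rows.
    """
    tagged = [(cid, ("C", name)) for cid, name in customers]
    tagged += [(cid, ("O", amount)) for cid, amount in orders]
    tagged.sort()
    joined = []
    for key, group in groupby(tagged, key=lambda kv: kv[0]):
        name, amounts = None, []
        for _, (tag, value) in group:
            if tag == "C":
                name = value
            else:
                amounts.append(value)
        joined.extend((key, name, a) for a in amounts)
    return joined
```

The map-side join covered in the same module avoids the shuffle entirely by loading the smaller dataset into memory (e.g. via the Distributed Cache) and joining inside the mapper.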
Graph Manipulation in Hadoop
- Introduction to graph techniques
- Representing graphs in Hadoop
- Implementing a sample algorithm: Single Source Shortest Path
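The Single Source Shortest Path algorithm above is usually implemented in Hadoop as an iterated breadth-first search: each MapReduce pass, every node with a known distance proposes `distance + 1` to its neighbours, and the reducer keeps the minimum per node. The sketch below simulates that loop in memory, assuming unit edge weights as in the classic Hadoop BFS example (the function name and graph format are our own):

```python
def shortest_paths(adj, source):
    """Single Source Shortest Path, MapReduce-style.

    adj: adjacency list, {node: [neighbour, ...]}.
    Each while-iteration corresponds to one MapReduce job; the loop
    stops when a pass produces no improvement, just as a chained
    Hadoop job would check a 'distance updated' counter.
    """
    INF = float("inf")
    dist = {node: INF for node in adj}
    dist[source] = 0
    changed = True
    while changed:
        changed = False
        # "map": every reached node proposes dist+1 to its neighbours
        proposals = [(m, dist[n] + 1)
                     for n in adj if dist[n] < INF
                     for m in adj[n]]
        # "reduce": keep the minimum proposed distance per node
        for node, d in proposals:
            if d < dist[node]:
                dist[node] = d
                changed = True
    return dist
```

Representing the graph as adjacency lists keyed by node (one record per node) is what makes this shuffle-friendly: each map output is keyed by the neighbour receiving the proposal.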
Creating Workflows with Oozie
- The Motivation for Oozie
- Oozie’s Workflow Definition Format
- HANDS ON EXERCISE
About Trainer Venkat:
- 18 years in IT Training and Consulting in the areas of Java and Enterprise J2EE, with the last 6 years in Hadoop Consulting and Training, assisting organizations like JPMC and the Microsoft India Development Center in the inception, incubation, and growth of Big Data innovations.
- Has architected more than 10 Big Data life-cycle implementations for Customer Churn Analysis, 360-degree view of the customer, and other Big Data solutions integrating existing enterprise assets with the Hadoop cluster.
- Has mentored participants at Amdocs, Fidelity, Wells Fargo, TCS, HCL, Ericsson, Verizon, CA, Accenture, and other organizations in Hadoop and its ecosystem components such as Hive, HBase, Pig, and Sqoop.
- Has delivered more than 200 Big Data trainings, with a rating of 4.5+ in more than 90% of them.
About Trainer Ratikant:
- 20 Years in IT with focus on BI and Big Data.
- Principal Big Data Architect for a Global Service provider.
- 4 Years Big Data project Implementation experience with major US clients.
- Skilled in Hadoop Technology Stack.
- Has trained students and professionals in the Big Data tech stack for the last year.
About Trainer Navin:
- Experienced Software Professional with nearly 18 years of strong technology experience.
- Core competencies include Software Project Management, Development, Support, and Maintenance using Apache Hadoop, Hive, Data Science, R, Python, Pig, Sqoop, Flume, Storm, Spark, Tableau, Talend, MS Project/Server, ASP.NET/C#, VB.NET, Smalltalk, Java, Oracle, MS SQL Server, WPF, and Silverlight.
- Senior technologist with strong business acumen and technical experience in IT and Big Data space.
- Expertise in Design and Development of Apache Hadoop, Hive, Data Science, R, Python, Pig, Sqoop, Flume, Storm, Spark.
- Leader in IT & Big Data space that combines an entrepreneurial spirit with corporate-refined execution in Big Data strategy, Big Data consulting, implementations, CoE setup, architecture, pre-sales and revenue optimization.
- Expertise in Design and Development of User Interfaces using Winforms, WPF, Silverlight and Asp.Net, Business object, Web Services and Data Access layers in C#.
- Expertise in Smalltalk, and Java2/1.3, JSP, Servlets.
About Trainer Srinivas:
- Experienced Software Professional with nearly 17 years of strong technology experience.
- Core competencies include Development, Support, and Maintenance using Apache Hadoop, Hive, Pig, Sqoop, Flume, Storm, Spark, Tableau, Talend, Java, and J2EE technologies including JSE, JDBC, Servlets, JavaBeans, Java Server Pages, EJB, XML, Web Services, JSF, Struts, Spring, Hibernate, JPA, and Oracle.
- Senior technologist with technical experience in IT and Big Data space.
- Expertise in Design and Development of Apache Hadoop, Hive, Pig, Sqoop, Flume, Kafka, Oozie, Storm, Spark.
- Expertise in Cloudera (CDH), Hortonworks (HDP), and IBM BigInsights distributions.
- Expertise in Oracle Fusion Middleware, Oracle Webservice Development, Oracle ADF and Oracle Weblogic Administration.
- Has trained students and professionals in the Big Data stack for the last 3 years.
CASE STUDY # 1 – “Healthcare System”
Healthcare System Application:
As the Product Manager for Inner Expressions, you are asked to provide one of your largest clients with additional features in the EMR (Electronic Medical Records Management) System. The client has requested an integrated Referral Management System that tracks patients from primary care into the specialist departments. Appointments are created either by the Primary Care Physicians themselves or by other clinical staff such as Nurse Practitioners or Clinical Assistants. Each appointment must go through the appropriate checks, including whether the patient has active insurance with the client, whether the insurance program covers the patient’s condition, the patient’s preference for location and timing, and the availability of the specialist doctor.
Some appointments may have to be reviewed by the specialists themselves before they can be approved. The administrator of the facility (hospital) must be able to choose, by appointment type, whether it is directly bookable by the primary care staff or requires review by the specialist. The system should also allow the primary care staff and specialist departments to exchange notes and comments about a particular appointment. If the specialist department marks tests or reports as mandatory for the appointment, the system must ensure that the patient has these available on the date of the appointment.
The system shall also allow users to track the status of patients’ appointments and must store the entire clinical history of each patient. This will be used by the hospital for two main purposes. First, the specialists and primary care providers will have access to the patient’s complete medical history before the patient walks in for the appointment, allowing for better patient care. Second, the hospital stores this data in a general data warehouse (without Protected Health Information) to run analytics on it and develop local disease management programs for the area. This is aligned with the hospital’s mission of providing top-quality preventive medical care.
The hospital sets about 300 appointments per day and must support about 50 concurrent users. The existing EMR system is based on Java and an Oracle database.
- Identify Actors, Use Cases, Relationships,
- Draw Use Case Diagrams
- Identify Ideal, Alternate and Exception Flows
- Write a Business Requirements Document
CASE STUDY # 2 – “Asset Management System”
Asset Management Application:
The asset management system keeps track of a number of assets that can be borrowed, their ownership, their availability, their current location, the current borrower, and the asset history. Assets include books, software, computers, and peripherals. Assets are entered in the database when acquired and deleted from the database when disposed of. Availability is updated whenever an asset is borrowed or returned.
When a borrower fails to return an asset on time, the asset management system sends a reminder to the borrower and informs the asset owner.
The administrator enters new assets in the database, deletes obsolete ones, and updates any information related to assets. Borrowers search for assets in the database to determine their availability, and borrow and return assets. The asset owner loans assets to borrowers. Each system has exactly one administrator, one or more asset owners, and one or more borrowers. When referring to any of the above actors, we use the term “user”. All users are known to the system by their name and their email address. The system may keep track of other attributes such as the owner’s telephone number, title, address, and position in the organization.
The persistent storage is realized using an SQL database. The business logic is realized using the WebObjects runtime system.
- Identify Actors, Use Cases, Relationships,
- Draw Use Case Diagrams
- Identify Ideal, Alternate and Exception Flows
- Write a Business Requirements Document
OTHER CASE STUDIES:
Social Networking, Cruise Management System, Collegiate Sporting system
How to be a certified Hadoop Developer?
Certification for Hadoop Developer can be attained in the following steps:
Step 1: Once training is over, register for the Hadoop CCDH-410 exam.
Step 2: Pass the exam and you will be certified.
What are the requirements for the certification?
Basic knowledge of Java, Unix and SQL is required to pass this examination.
What is the cost of the examination for Hadoop Developer certification?
Once candidates register on the website, they must pay for the certification exam. The cost is $295.
What is the pattern of the exam and the duration of the test?
The examination consists of 50–55 live questions, the duration is 90 minutes, and the passing score is 70%.
Technical Requirements to take an Online training with ZaranTech
Technical Requirements for ZaranTech Online Classes:
- Operating System: Windows XP or newer
- Browser: Internet Explorer 6.x or newer
- CPU: P350 MHz, recommended P500+ MHz
- Memory: 128 MB, recommended 256+ MB RAM
- Free Disk Space: 40 MB, recommended 200+ MB for content and recordings
- Internet Connection: 28.8 Kbps, recommended 128+ Kbps
- Monitor: 16 bit colors (high color)
- Other: Sound card, microphone, and speakers OR headset with microphone
What is the Difference between Live training and Video training?
These Videos here will help you understand the difference,
VIDEO – What is Instructor-led LIVE Training – http://www.youtube.com/watch?v=G908QvF-gVA
VIDEO – What is Instructor-led VIDEO Training – http://www.youtube.com/watch?v=naPdAyKvAI0
How soon after I Enroll would I get access to the Training Program and Content?
Right after you enroll, we will send an email to your registered email address with a video on how to log in to the training blog and access the training program and content.
What are the pre-requisites of taking this training?
- The participants should have at least basic knowledge of Java.
- Any experience with a Linux environment will be very helpful.
Who are the instructors and what are their qualifications?
All our instructors are Senior Consultants themselves with a minimum of 10 years of real-time experience in their respective fields. Each trainer has also trained more than 100 students in the individual and/or corporate training programs.
How will the practicals/assignments be done?
Practicals/assignments will be done using the training blog. Instructions will be sent after you enroll.
When are the classes held and How many hours effort would I need to put in every day/week?
Online live sessions are held on weekday evenings CST (Central Standard Time, GMT-6) or on weekends. The schedule is posted for each batch on the website. You will need to put in 8–10 hours of effort per week going through the videos again and completing your assignments.
What if I miss a class?
We video record every live session, and after the session is complete, we post the recording on the blog. You will have access to these video recordings for 6 months from the date you start your training. Material access is provided for lifetime using Google Drive Cloud.
How can I request for a support session?
You can do that by posting a question on the training blog.
What If I have queries after I complete this course?
You can post those questions on the training blog.
Will I get 24*7 Support ?
You will get 24*7 access to the blog to post your questions. Trainers will answer your questions within 24 hours, and normally much faster, within about 1–2 hours. You can also approach your training coordinator for the same.
Can I get the recorded sessions of a class from some other batches before attending a live class?
Yes, you can. You can also see our YouTube page for previous batch session recordings.
How will I get the recorded sessions?
It will be provided to you through the training blog.
How can I download the class videos?
You won't be able to download any videos. They are available for you to view on the training blog 24*7.
Is the course material accessible to the students even after the course training finishes?
Yes. Material access is provided for lifetime using Google Drive Cloud.
Do you provide Placements as well?
We are, in fact, a consulting company that provides training, so we are mainly looking for trainees who are seeking placement after training.
After the Training Process explained (Video): http://www.youtube.com/watch?v=BrBJjoH46VI
Our 6-step training to placement process (Video): http://www.youtube.com/watch?v=BrBJjoH46VI
How can I complete the course in a shorter Duration?
Enroll in our self-paced video training.
Video Explanation – What is Instructor led VIDEO Training – https://www.youtube.com/watch?v=v1P9_fkg9mE
Do you provide any Certification? If yes, what is the Certification process?
We provide Certification guidance at the end of each course. You will also receive a “Certificate of Completion” from ZaranTech at the end of the course.
Are these classes conducted via LIVE video streaming?
We have both options available.
What internet speed is required to attend the LIVE classes?
1 Mbps of internet speed is recommended to attend the LIVE classes. However, we have seen people attending classes on much slower connections.
What are the payment options?
We accept credit cards, PayPal, bank payments from anywhere in the USA, money orders, international wire transfers, ACH transfers, Chase QuickPay, Bank of America transfers, and Wells Fargo SurePay. All payment details are mentioned on the Enrollment page.
What if I have more queries?
Call the number listed on the Course Details page of our website.