Hadoop Tutorial for Beginners: YouTube Videos
Blog: Big Data Science Training
Hadoop is an Apache open-source framework, written in Java, that addresses the challenges posed by Big Data. Data in a distributed environment is divided across clusters of machines using simple programming models. Hadoop is designed to scale from a single server to thousands of machines, with each machine offering local storage and computation. In short, it is an open-source framework for the distributed storage and processing of large data sets on commodity hardware.
Hadoop is a framework of tools whose objective is to support running applications on Big Data. It helps businesses gain quick insight into their structured and unstructured data. Before beginning with Hadoop, you should have a clear understanding of what Big Data is.
Learning how to write and develop programs for the Hadoop platform can lead to a lucrative career. The brief video tutorials below provide a quick introduction to Hadoop.
What is Hadoop?
This video briefly explains what Apache Hadoop is: Big Data and its challenges, traditional architecture versus Hadoop architecture, the components of Hadoop, the two pillars of Hadoop (MapReduce and HDFS), related projects, areas where Hadoop is used, example applications, and Hadoop's future outlook.
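To make the MapReduce pillar concrete, here is a minimal, single-machine Python sketch of the programming model (not real Hadoop code, which would typically use the Java MapReduce API): a mapper emits (word, 1) pairs, a shuffle step groups the pairs by key, and a reducer sums the counts for each word. The function names are illustrative, not part of any Hadoop API.

```python
from collections import defaultdict

def map_phase(line):
    """Mapper: emit a (word, 1) pair for every word in one input line."""
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    """Shuffle: group all emitted values by key, as Hadoop does
    between the map and reduce phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    """Reducer: combine all counts for one word into a total."""
    return key, sum(values)

def word_count(lines):
    # In real Hadoop, map tasks run in parallel across the cluster
    # and HDFS supplies the input splits; here we simply run the
    # phases in sequence on an in-memory list of lines.
    pairs = [pair for line in lines for pair in map_phase(line)]
    grouped = shuffle(pairs)
    return dict(reduce_phase(k, v) for k, v in grouped.items())

if __name__ == "__main__":
    sample = ["big data needs big tools", "hadoop handles big data"]
    print(word_count(sample))
```

The same three-phase shape (map, shuffle, reduce) is what a real Hadoop job expresses, only distributed across many machines with the framework handling the shuffle.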
Hadoop for Beginners: YouTube Video Tutorials
Tutorial – Challenges Posed by Big Data
In this video, we discuss the challenges posed by Big Data and how Hadoop addresses them. You will learn why traditional enterprise data architectures fail to meet the challenges created by Big Data, and why the creation of Hadoop became a necessity.
Tutorial – History behind creation of Hadoop
In this video, we briefly cover the history of Hadoop: how Google developed the underlying technology, why Hadoop became necessary, how Doug Cutting and Michael Cafarella created it, and how it moved to Apache.
Tutorial – Overview of Hadoop Projects
Previously we discussed the two main components of Hadoop: MapReduce and HDFS, also called the pillars of Hadoop. However, there are several other projects managed by Apache that fall under the Hadoop umbrella. These projects add value to the core functionality at multiple levels.
Here is a brief introduction to the Hadoop projects:
- Apache Hive: A data warehouse infrastructure built on top of Hadoop that provides data summarization, querying, and analysis.
- Apache Sqoop: A tool for transferring bulk data between Apache Hadoop and relational databases.
- Apache Pig: To make programming easier, a higher-level language called Pig Latin was created; it falls under the umbrella of Apache Pig and does for Hadoop a job similar to what SQL does for relational databases.
- Apache Flume: A distributed service for collecting, aggregating, and moving large amounts of log data. It is robust and fault-tolerant.
- Apache Mahout: A library of distributed, scalable machine-learning algorithms that run on the Hadoop platform.
- Apache HBase: An open-source, non-relational, distributed database. It is written in Java and runs on top of HDFS.
- Apache Oozie: A Java-based application responsible for scheduling and managing jobs on Hadoop systems.
How to Install Hadoop on a Computer/Laptop (Windows/OS X)
This video explains how to install Hadoop on a personal computer running Windows or OS X.
Your education can take you down different routes. Hadoop is a very powerful technology; just make sure you learn the essentials and basics from the videos above. The best way to learn Hadoop is by taking on a real-world task. As you grow with Hadoop, you will come to appreciate its power. Whatever tasks you decide to tackle in Hadoop, you will find walkthrough code online. I hope these videos help you clear up the basics of Hadoop, its origin, and its relation to Big Data.