Demo on Hadoop Flume | Edureka

Apache Flume is a distributed and reliable service for efficiently collecting, aggregating, and moving large amounts of streaming data into the Hadoop Distributed File System (HDFS). This video clearly shows how to load data into HDFS using Flume.
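For reference, a minimal Flume agent configuration for streaming tweets into HDFS might look like the sketch below. This is an illustrative example only: the agent name (TwitterAgent), the Twitter source class, the keywords, and the HDFS path are assumptions, not taken from the video, so substitute your own credentials and cluster paths.

# Name the components of the agent (all names here are hypothetical)
TwitterAgent.sources = Twitter
TwitterAgent.channels = MemChannel
TwitterAgent.sinks = HDFS

# Source: streams tweets matching the given keywords
# (this class and the keywords property come from the Cloudera Twitter
# example source; they may differ depending on the jar bundled with your build)
TwitterAgent.sources.Twitter.type = com.cloudera.flume.source.TwitterSource
TwitterAgent.sources.Twitter.channels = MemChannel
TwitterAgent.sources.Twitter.consumerKey = <consumer-key>
TwitterAgent.sources.Twitter.consumerSecret = <consumer-secret>
TwitterAgent.sources.Twitter.accessToken = <access-token>
TwitterAgent.sources.Twitter.accessTokenSecret = <access-token-secret>
TwitterAgent.sources.Twitter.keywords = hadoop, bigdata

# Channel: buffers events in memory between the source and the sink
TwitterAgent.channels.MemChannel.type = memory
TwitterAgent.channels.MemChannel.capacity = 10000
TwitterAgent.channels.MemChannel.transactionCapacity = 100

# Sink: writes events into HDFS; the roll settings control when the sink
# closes the current file and starts a new one
TwitterAgent.sinks.HDFS.type = hdfs
TwitterAgent.sinks.HDFS.channel = MemChannel
TwitterAgent.sinks.HDFS.hdfs.path = hdfs://localhost:9000/user/flume/tweets/
TwitterAgent.sinks.HDFS.hdfs.fileType = DataStream
TwitterAgent.sinks.HDFS.hdfs.writeFormat = Text
TwitterAgent.sinks.HDFS.hdfs.batchSize = 1000
TwitterAgent.sinks.HDFS.hdfs.rollSize = 0
TwitterAgent.sinks.HDFS.hdfs.rollCount = 10000

The agent can then be started with the standard flume-ng command, for example: flume-ng agent --conf ./conf --conf-file twitter.conf --name TwitterAgent -Dflume.root.logger=INFO,console. Note that the hdfs.roll* settings determine how often the sink rolls over to a new file, which is why a single run can produce several files in the target directory.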

Edureka is a New Age e-learning platform that provides instructor-led live online classes for learners who prefer a hassle-free and self-paced learning environment, accessible from any part of the world.
The topics related to Flume are extensively covered in our 'Big Data & Hadoop' course.
Call us at US: 1800 275 9730 (toll-free) or India: +91-8880862004
Comments

I was pretty much pulling my hair out last night, trying to implement this project.
This video helped a lot. Great explanation...

kamzcool

Thank you for such a nice explanation! Hats off to the instructor!

blessonbiji

Thanks for the explanation. Can I import Twitter data into HDFS in any format other than JSON?

srinathkalakonda

Sometimes it downloads more than one file to this HDFS location for a single keyword. On what basis is the downloaded data split across multiple files? Please help me; I'd like an answer to this.

eddyedu

But we could have done this through Kafka as well. Why do I need Flume then?

FirstNameLastName-fveu