Configuring MySQL and Importing the Required Data Files | MySQL with Hadoop | Lecture 20

This lecture is all about configuring MySQL on the Hadoop cluster and importing the required data files for our ratings table. We run the commands below to set up MySQL on our HDP Sandbox with the required privileges and permissions, so that we can proceed with importing/exporting data between relational databases (MySQL, MS SQL Server, PostgreSQL, MariaDB, etc.) and the Hadoop cluster (HDFS, Hive, HBase, etc.).
Required Commands for this lecture:
su root
systemctl stop mysqld
systemctl set-environment MYSQLD_OPTS="--skip-grant-tables --skip-networking"
systemctl start mysqld
mysql -uroot
------- In the MySQL shell:
FLUSH PRIVILEGES;
alter user 'root'@'localhost' IDENTIFIED BY 'hadoop';
FLUSH PRIVILEGES;
QUIT;
------ Back in the Linux shell:
systemctl unset-environment MYSQLD_OPTS
systemctl restart mysqld
mysql -uroot -phadoop
------- In the MySQL shell:
CREATE DATABASE movieratings;
USE movieratings;
DROP TABLE IF EXISTS ratings;
CREATE TABLE ratings(
user_id INT,
movie_id INT,
rating INT,
ts INT
);
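With the table in place, the ratings data file can be loaded into it. A minimal sketch, assuming the MovieLens 100k data file sits at /home/maria_dev/ml-100k/u.data with tab-separated fields (the path, delimiter, and column order are assumptions, not shown in this lecture):

```shell
# Hypothetical: load the MovieLens u.data file into the ratings table.
# File path, delimiter, and column order are assumptions based on the
# MovieLens 100k format (user_id, movie_id, rating, timestamp, tab-separated).
mysql -uroot -phadoop movieratings --local-infile=1 -e \
  "LOAD DATA LOCAL INFILE '/home/maria_dev/ml-100k/u.data' \
   INTO TABLE ratings \
   FIELDS TERMINATED BY '\t' \
   (user_id, movie_id, rating, ts);"
```

LOAD DATA LOCAL requires local_infile to be enabled on both the client (the --local-infile=1 flag above) and the server.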
-----------------------------------
In the previous lecture we covered integrating MySQL with Hadoop using the Sqoop tool and working with relational databases on the Hadoop HDP Sandbox cluster. We discussed what MySQL is, why we use MySQL with Hadoop, the Sqoop command-line utility for importing and exporting data between relational databases and the Hadoop ecosystem (HDFS, Hive, Pig, HBase, etc.), and basic Sqoop import/export commands.
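The Sqoop import described above can be sketched as a single command. This is a minimal sketch, assuming MySQL runs on localhost with the root/hadoop credentials set up earlier; the target HDFS directory and single-mapper setting are assumptions:

```shell
# Hypothetical Sqoop import of the ratings table from MySQL into HDFS.
# Host, credentials, mapper count, and target directory are assumptions.
sqoop import \
  --connect jdbc:mysql://localhost/movieratings \
  --driver com.mysql.jdbc.Driver \
  --table ratings \
  --username root --password hadoop \
  -m 1 \
  --target-dir /user/maria_dev/ratings
```

The -m 1 flag runs a single map task, which avoids the need for a --split-by column on a table with no primary key; for larger tables, more mappers plus a split column would parallelize the import.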
----------------------------------------------------------------------------------------------------------------
---------------------------------------------------------------------------------------------------------
HDP Sandbox Installation links:
-------------------------------------------------------------------------------------------------------------
Also check out similar informative videos in the field of cloud computing:
Audience
This tutorial is intended for professionals who want to learn the basics of Big Data Analytics using the Hadoop ecosystem and become Hadoop developers. Software professionals, analytics professionals, and ETL developers are the key beneficiaries of this course.
Prerequisites
Before proceeding with this course, you should have some basic knowledge of Core Java, database concepts, and any flavor of the Linux operating system.
---------------------------------------------------------------------------------------------------------------------------
Check out our full-course, topic-wise playlists on some of the most popular technologies:
SQL Full Course Playlist-
PYTHON Full Course Playlist-
Data Warehouse Playlist-
Unix Shell Scripting Full Course Playlist-
--------------------------------------------------------------------------------------------------------------------------
Don't forget to like and follow us on our social media accounts, which are linked below.
Facebook-
Instagram-
Twitter-
Tumblr-
-------------------------------------------------------------------------------------------------------------------------
Channel Description-
AmpCode provides an e-learning platform with a mission of making education accessible to every student. AmpCode offers tutorials and full courses on some of the best technologies in the world today. By subscribing to this channel, you will never miss out on high-quality videos on trending topics in the areas of Big Data & Hadoop, DevOps, Machine Learning, Artificial Intelligence, Angular, Data Science, Apache Spark, Python, Selenium, Tableau, AWS, Digital Marketing, and many more.
#bigdata #datascience #technology #dataanalytics #datascientist #hadoop #hdfs #mrjob #hdp #hive #mysql #sqoop #apachepig