Import Huge File (5M Records) into MySQL in 2 Minutes

This video shows how you can easily import a huge CSV file into a MySQL table using these four simple steps
1 Split the huge CSV file into smaller files
2 Get the file names
3 Use Google Sheets to generate small import scripts for speed
4 Run the import scripts
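The four steps above can be sketched as a small shell session. The file name, table name, and `/tmp/` paths below are placeholders, and the loop in step 3 stands in for the Google Sheets formula the video uses; a tiny 10-line sample file stands in for the real 5M-record CSV:

```shell
# stand-in for the huge CSV: 10 lines of "N,demo"
seq 1 10 | sed 's/$/,demo/' > huge_file.csv

# Step 1: split into chunks of 4 lines each -> part_aa, part_ab, part_ac
split -l 4 huge_file.csv part_

# Step 2: collect the generated file names
ls part_* > filenames.txt

# Step 3: generate one LOAD DATA statement per chunk
# (the video builds these in Google Sheets; a shell loop works too)
while read -r f; do
  printf "%s\n" "LOAD DATA INFILE '/tmp/$f' INTO TABLE my_table FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n';"
done < filenames.txt > import.sql

# Step 4 would then be:
# mysql -u root -p my_database < import.sql
```

Running many small `LOAD DATA` statements instead of one giant import is what keeps each transaction short and the overall load fast.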
Comments

Thanks for the feedback. I will consider it in the future.

delamberty

Wow, this helped me a ton!
This is awesome.
The problem I encountered is that you have to add the word LOCAL after DATA and add this flag to your advanced configuration of the connection:
OPT_LOCAL_INFILE=1
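As this comment notes, when the server rejects plain `LOAD DATA INFILE`, the `LOCAL` variant (which reads the file from the client side) often works once local-infile is enabled on both client and server. A sketch of the statement, with a placeholder path and table name:

```sql
LOAD DATA LOCAL INFILE '/tmp/part_aa'
INTO TABLE my_table
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
```

In MySQL Workbench, the `OPT_LOCAL_INFILE=1` flag mentioned above goes in the Advanced tab of the connection settings (the "Others" box).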

NaturalistChannel

For security reasons LOAD DATA INFILE doesn't work by default; to make it work we should change the configuration settings.
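The configuration change this comment refers to is, on recent MySQL versions, the `local_infile` server variable (and, for server-side loads, the `secure_file_priv` path restriction). A sketch of checking and enabling it, assuming you have privileges to set global variables:

```sql
-- check whether the server currently allows LOAD DATA [LOCAL] INFILE
SHOW VARIABLES LIKE 'local_infile';
SHOW VARIABLES LIKE 'secure_file_priv';

-- enable client-side loading for this server
SET GLOBAL local_infile = 1;
```

`secure_file_priv` limits which directories a non-LOCAL `LOAD DATA INFILE` may read from; if it is set, the split files must live under that directory.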

jaideepnalluri

Thanks for your video, really helpful.

ThiagoTAV

I have 50 lakh plus (5M+) records and I need to export them to Excel. I have implemented streaming but it is not working. How can I solve this problem?

jaylimbasiya

Thanks very much. That was really helpful😃

matz.sjodin

You jumped ahead too fast. How do we get to the beginning in Cmder?

samuelnyarkotey

Great video, but I got a problem at the split step, as an error occurred.
It shows: 'split' is not recognized as an internal or external command, operable program or batch file.
Please help!

OddlySatisfactory

I can't split. Do you know how to fix this?

'split' is not recognized as an internal or external command,
operable program or batch file.
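The error the two comments above report comes up because `split` is a GNU coreutils tool that plain Windows `cmd.exe` does not provide; running the same command from Git Bash (or a Cmder setup that includes the Unix tools) is one common fix. A minimal sketch of checking for and using `split`, with a small sample file:

```shell
# prints the path to split if it is available in this shell
command -v split

# make a 9-line sample file and split it into 3-line chunks
seq 1 9 > sample.csv
split -l 3 sample.csv chunk_

# split names the pieces chunk_aa, chunk_ab, chunk_ac by default
ls chunk_*
```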

naufallaksana

=CONCATENATE wasn't working for me; what worked instead was "=("LOAD DATA INFILE 'D:/VM-SharedFiles/'"&A1&" INTO TABLE artigos FIELDS TERMINATED BY ', ' OPTIONALLY ENCLOSED BY '""' LINES TERMINATED BY '\r\n';")"

pedrobarros

Says two minutes, then makes the video 13 minutes long.

ramoscoder