How to handle 'Memory Error' while loading a huge file in Python-Pandas

Learn how to solve the memory error while working with a huge file in Pandas-Python.
#MemoryError #Python #Pandas
# How to read a sample of the data from a CSV file without opening it in Excel or Python
import random
import pandas as pd

# The data to load ("data.csv" is a placeholder file name)
f = "data.csv"
# Count the lines
num_lines = sum(1 for line in open(f))
# Sample size - in this case ~50%
size = num_lines // 2
# The row indices to skip - make sure 0 is not included to keep the header!
skip_idx = random.sample(range(1, num_lines), num_lines - size)
# Read the data, skipping the sampled row indices
df = pd.read_csv(f, skiprows=skip_idx)
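Random sampling loads only a subset of the rows. If you need every row but still hit a MemoryError, pandas can also read a CSV in chunks so only a slice of the file is in memory at a time. Below is a minimal, runnable sketch of that approach; the file name and column names are stand-ins, not from the video:

```python
import csv
import pandas as pd

# Build a small CSV to stand in for the "huge" file.
with open("data.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["id", "value"])
    for i in range(1000):
        writer.writerow([i, i * 2])

# Process the file 100 rows at a time; only one chunk
# (a small DataFrame) is held in memory at once.
total = 0
for chunk in pd.read_csv("data.csv", chunksize=100):
    total += chunk["value"].sum()

print(total)  # 999000
```

Unlike sampling, this keeps all the data; the trade-off is that you must express your computation as an aggregation over chunks.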

Connect with me here:
Comments

If a 900 KB file counts as huge, then what about a 2.9 GB file?

umarfarooq

Hey Sumit,
I'm trying to understand: if I follow this procedure, part of the data is skipped while reading.
But what if we need a particular element from the rows that were skipped? The result will only contain the sampled subset.

munnashaik

Thanks for your help. I wonder how to deal with this memory issue: I get the message "Unable to allocate...for an array with shape". Do you know how to fix it?

brunozaratecasallo

The "open" step shows an error.

Musafir.

How can we separate positive, negative, and neutral tweets from a random CSV file of tweets?

miankhalid

Loading 600 MB is already a pain in the a...

abaricias

Cheers Sumit! Like, comment and subscribe guys✌️

prateeknarayan

Waste of time.
There is a huge difference between random data sampling and reducing data, bro.
How can you reduce data? That's stupidity.

brownmunde