Working with larger-than-memory datasets with Polars

Polars makes it easy to work with very large datasets, even on a regular laptop. In this video I show you just how easy it is with a typical data science query.
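
As a rough sketch of the pattern the video demonstrates (the file name and column names here are placeholders, not the ones used in the video), the idea is to build the query lazily with scan_csv and then collect it with streaming enabled, so Polars processes the file in batches instead of loading it all at once:

    import polars as pl

    # Lazy scan: nothing is read from disk yet, Polars only records the query plan.
    lazy_df = pl.scan_csv("transactions.csv")

    result = (
        lazy_df
        .group_by("customer_id")
        .agg(pl.col("amount").sum().alias("total_spent"))
        .collect(streaming=True)  # run the plan in batches rather than one big in-memory pass
    )
    print(result)

The exact spelling of the streaming option has changed across Polars releases, so check the documentation for the version you have installed.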

This video comes from my Data Analysis with Polars course on Udemy - check out the course here with a half-price discount:

Want to know more about Polars for high-performance data science and ML? Then you can:
- subscribe to my channel
Comments

Great video. I still want that workstation tho...

efilson

Question for you all: what kind of problems could this kind of streaming solve for you?

rho-signaldataanalytics

Hi @Rho-Signal Data Analytics, what is the difference between polars.read_csv, polars.read_csv_batched and polars.scan_csv?

vg_
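
A rough sketch of the difference (the file name, batch size and column below are placeholders): read_csv parses the whole file eagerly into a DataFrame, read_csv_batched also reads eagerly but hands the file back in chunks that you iterate over yourself, and scan_csv returns a LazyFrame so the query can be optimised and streamed.

    import polars as pl

    # 1) Eager: the entire CSV is parsed into memory at once.
    df = pl.read_csv("data.csv")

    # 2) Batched: pull chunks manually and process them one at a time.
    reader = pl.read_csv_batched("data.csv", batch_size=100_000)
    while (batches := reader.next_batches(5)) is not None:
        for batch in batches:
            ...  # each batch is a regular DataFrame

    # 3) Lazy: build a query plan; Polars optimises it and can stream the execution.
    lf = pl.scan_csv("data.csv")
    out = lf.filter(pl.col("amount") > 0).collect(streaming=True)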

Hi, how do I use SQLContext in Polars? I'd like to know how to use SQL syntax on a LazyFrame.

SugengWahyudi
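
One way to do this (a rough sketch; the file, table name and query below are made up for illustration) is to register the LazyFrame with pl.SQLContext and run SQL against it:

    import polars as pl

    lf = pl.scan_csv("sales.csv")  # placeholder file

    # Register the LazyFrame under a table name, then query it with SQL.
    ctx = pl.SQLContext(sales=lf)
    result = ctx.execute(
        "SELECT region, SUM(amount) AS total FROM sales GROUP BY region"
    ).collect()  # execute() returns a LazyFrame by default, so collect it
    print(result)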

Hi, cool project, does this have support for GeoDataFrames?


When streaming, what is the size of each batch?
Does streaming mean it can handle any data size, with the only cost being time?
---
I was playing with an 18 million row dataset, trying to inner join it with another dataset in an AWS Lambda at full memory... the Lambda crashed. I will try again with this streaming.

davido
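
For a join like the one described above, a rough sketch (the file names, format and join key are placeholders) is to scan both sides lazily and collect with streaming enabled, so Polars can process the data in batches where the plan supports it:

    import polars as pl

    # Both sides are scanned lazily; nothing is loaded until collect().
    left = pl.scan_csv("events.csv")      # the large (~18M row) dataset, placeholder name
    right = pl.scan_csv("customers.csv")  # the other dataset, placeholder name

    result = (
        left.join(right, on="customer_id", how="inner")
            .collect(streaming=True)  # request streaming execution to stay within the memory limit
    )

As far as I know, the streaming engine picks its own batch size based on the machine and the query rather than exposing it per call, so the practical cost of handling larger-than-memory data is indeed mostly time.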

I've never - such SHT'(intel)

paulhetherington