Processing 1 Billion Rows Per Second

Everybody is talking about Big Data and about processing large amounts of data in real time or close to real time. However, processing a lot of data does not require commercial software or NoSQL tools. PostgreSQL can do exactly what you need and process A LOT of data in real time. During our tests we saw that crunching 1 billion rows of data in real time is perfectly feasible, practical, and definitely useful. This talk shows what has to be changed inside PostgreSQL and what we learned when processing this much data for analytical purposes.
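As a rough illustration of the kind of setup the talk refers to (the table name, columns, and setting values below are assumptions for the sketch, not taken from the talk), a parallel analytical aggregate in PostgreSQL might look like this:

-- Let PostgreSQL fan one scan out across several workers
-- (parallel query is available since PostgreSQL 9.6; values are illustrative).
SET max_parallel_workers_per_gather = 8;
SET work_mem = '256MB';

-- A typical analytical aggregate over a large fact table;
-- "measurements" and its columns are hypothetical.
SELECT sensor_id,
       count(*)   AS readings,
       avg(value) AS avg_value
FROM   measurements
GROUP  BY sensor_id;

Whether a query like this actually reaches a billion rows per second depends entirely on hardware, row width, and how much of the table is cached.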
Comments
IMfLEX:
Need help processing billions of rows, row by row, using PL/PGSQL. Is there any alternative?
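For context, a row-by-row PL/pgSQL loop is usually the slowest way to touch billions of rows; the common alternative is a single set-based statement. A minimal sketch, assuming a hypothetical "accounts" table:

-- Row-by-row (slow): one UPDATE executed per row inside a loop.
DO $$
DECLARE
    r RECORD;
BEGIN
    FOR r IN SELECT id FROM accounts LOOP
        UPDATE accounts SET balance = balance * 1.01 WHERE id = r.id;
    END LOOP;
END $$;

-- Set-based (fast): one statement that updates every row at once.
UPDATE accounts SET balance = balance * 1.01;

The set-based form lets the planner do one scan instead of issuing billions of individual statements.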
hkpeaks:
Do you mean the system can extract a CSV file at 1 billion rows per second?
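For what it's worth, bulk CSV extraction in PostgreSQL is normally done with COPY rather than per-row fetches; a minimal sketch (the table name and file path are assumptions), with actual throughput depending entirely on disk and CPU:

-- Stream a query result straight to a server-side CSV file.
COPY (SELECT * FROM measurements)
TO '/tmp/measurements.csv'
WITH (FORMAT csv, HEADER true);

From psql, the \copy meta-command does the same thing but writes to a file on the client instead of the server.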