Inserting 1 BILLION rows to a Timescale Database 💀 #ad #programming #software #technology #code

Comments

Thanks to TimescaleDB for letting me be reckless and sponsoring :)

CodingWithLewis

Great! Let's see how fast PostgreSQL can read 1 billion rows.

TFA

Now let's see Paul Allen's DB.

alexgirbescu

A 1 billion row use case? 10 years of time clock punches for ~100 employees/day across 1,500 locations. Basically any large chain QSR.

anthonydagosta

This is not an apples-to-apples comparison. TimescaleDB is tailored for IoT-style time-series data, while Postgres is used for all purposes, like transactional data. TimescaleDB is not an alternative to Postgres.

manojBadam

To optimize this you need to read continuously from disk into a buffer, then copy the buffer back to disk. To optimize RAM usage, the data needs to be split into batches. With a well-optimized C program or assembly this should take about 3 minutes on a fast SSD, or about 35 minutes on a fast HDD.
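
A minimal sketch of that batching idea in Python with psycopg2, assuming a hypothetical readings(ts, device_id, value) table and a CSV source (none of these names are from the video): stream the file from disk, buffer a fixed number of rows, and COPY each batch so memory stays bounded.

```python
# Sketch: batched bulk load into Postgres/TimescaleDB with bounded RAM.
# Table name, columns, DSN, and batch size are illustrative assumptions.
import io
import psycopg2

BATCH_ROWS = 100_000  # rows per COPY batch; tune to your hardware

def load_in_batches(csv_path: str, dsn: str = "dbname=metrics") -> None:
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur, open(csv_path) as f:
        batch = []
        for line in f:                      # stream from disk, never hold the whole file
            batch.append(line)
            if len(batch) >= BATCH_ROWS:
                cur.copy_expert(
                    "COPY readings (ts, device_id, value) FROM STDIN WITH (FORMAT csv)",
                    io.StringIO("".join(batch)),
                )
                conn.commit()               # flush each batch so memory stays flat
                batch.clear()
        if batch:                           # final partial batch
            cur.copy_expert(
                "COPY readings (ts, device_id, value) FROM STDIN WITH (FORMAT csv)",
                io.StringIO("".join(batch)),
            )
            conn.commit()

if __name__ == "__main__":
    load_in_batches("readings.csv")
```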

amazing-

I’m 100% sure Postgres can do way more than 100 rows per second on individual inserts.

NotPineda

Depends on the width of a table, indices, keys, and other factors.

TheMisterSpok

Thanks to TimescaleDB. It lets us query and scan through billions of rows in milliseconds. Wish we had discovered it earlier in our development.
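
For context, queries stay fast at that scale largely because a hypertable is chunked by time, so a bounded time range only touches a few chunks. A hypothetical example using TimescaleDB's time_bucket() aggregate (the readings table and its columns are made up, not this commenter's schema):

```python
# Hypothetical time-bucketed aggregate that only scans recent chunks.
# Table, columns, and DSN are illustrative assumptions.
import psycopg2

QUERY = """
SELECT time_bucket('1 minute', ts) AS minute,
       device_id,
       avg(value) AS avg_value
FROM readings
WHERE ts > now() - INTERVAL '1 hour'
GROUP BY minute, device_id
ORDER BY minute;
"""

with psycopg2.connect("dbname=metrics") as conn, conn.cursor() as cur:
    cur.execute(QUERY)
    for minute, device_id, avg_value in cur.fetchall():
        print(minute, device_id, avg_value)
```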

Shurq

@CodingWithLewis Use case for 1 billion rows? Easy: subsecond sampling of manufacturing process variables (pressure, temperature, voltage, etc.). 3 years of sampling 10x per second is almost a billion rows. This happens all the time in factory/infrastructure data loggers.
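
That estimate checks out:

```python
# Quick check of the estimate above: one signal sampled at 10 Hz for 3 years.
samples_per_second = 10
seconds_per_year = 365 * 24 * 60 * 60            # 31,536,000
rows = samples_per_second * seconds_per_year * 3
print(f"{rows:,}")                               # 946,080,000 -- just under a billion
```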

MichaelSmith-fgxh

The single-insert method would take ~116 days to insert 1 billion records for you 💀 [not just a whole day 🙃]
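
That figure is consistent with the ~100 rows/second single-insert rate mentioned a few comments up:

```python
# Where ~116 days comes from, assuming roughly 100 single-row inserts per second.
rows = 1_000_000_000
inserts_per_second = 100
days = rows / inserts_per_second / 86_400        # 86,400 seconds per day
print(round(days, 1))                            # 115.7
```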

nagrajnaik

Hey CodingWithLewis, one question for you: how do you make your videos? Can you show some behind the scenes, like what kind of apps or software you use?

programmershourya

Can you please share the specs of the machine Postgres was running on, along with the schema of the table, the size of each row, and where the 1 billion rows came from?

srivathsaharishvenk

Funny, I have a use case, just no idea how to implement it. Awesome video, and I’m glad I found this.

killerdiek

India or China populations - vaccination data or social security numbers - good use cases!

dhawalmoghe

The use case is all those shows and movies that you say "I'll watch them some day" but never do.

Avery_Silva

Are you serious? I migrated 7 billion rows from MySQL to PostgreSQL on AWS RDS in 1.5 hours. Did it like ten times for a single project.

vyacheslavhruschev

Do you plan any further videos/tutorials for TimescaleDB? I have used it in IoT projects but always struggled to configure it the right way.
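
For anyone in the same spot, a minimal TimescaleDB setup sketch (table name, chunk interval, and compression policy are assumptions to tune, not the video's configuration):

```python
# Hypothetical starting point for a TimescaleDB IoT table; adjust chunk size
# and compression policy to your ingest rate and retention needs.
import psycopg2

SETUP = [
    "CREATE EXTENSION IF NOT EXISTS timescaledb;",
    """
    CREATE TABLE IF NOT EXISTS readings (
        ts        TIMESTAMPTZ NOT NULL,
        device_id INTEGER     NOT NULL,
        value     DOUBLE PRECISION
    );
    """,
    # Turn the plain table into a hypertable partitioned by time.
    "SELECT create_hypertable('readings', 'ts', chunk_time_interval => INTERVAL '1 day', if_not_exists => TRUE);",
    # Enable native compression and compress chunks older than a week.
    "ALTER TABLE readings SET (timescaledb.compress, timescaledb.compress_segmentby = 'device_id');",
    "SELECT add_compression_policy('readings', INTERVAL '7 days');",
]

with psycopg2.connect("dbname=metrics") as conn, conn.cursor() as cur:
    for stmt in SETUP:
        cur.execute(stmt)
```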

JanniTune

I never learned Postgres in university 😢

primary

Which DB is best for reading as many rows as possible in a single request?

Redragon