Update a Table with Millions of Rows in SQL (Fast)


Running an SQL UPDATE statement on a table with millions of rows can often take a long time.

There are several ways to structure the update so that it runs faster.

In this video, I explain and demonstrate five different approaches to updating data in a table with millions of records.

The demonstration is done in Postgres, but the concepts are valid for Oracle, SQL Server, MySQL, and other vendors.
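As a rough illustration of the kind of statement the video is about (the table and column names here are invented, and SQLite stands in for the Postgres demo), the naive single-statement approach looks like this:

```python
import sqlite3

# In-memory SQLite stand-in; "customers"/"address" are made-up names.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, address TEXT)")
conn.executemany(
    "INSERT INTO customers (address) VALUES (?)",
    [("12 Old Street",), ("34 New Road",), ("56 Old Lane",)],
)

# The naive approach: one UPDATE that rewrites every row, even rows
# with no match -- this is what becomes slow at millions of rows.
conn.execute("UPDATE customers SET address = replace(address, 'Old', 'New')")
conn.commit()

rows = [r[0] for r in conn.execute("SELECT address FROM customers ORDER BY id")]
print(rows)  # ['12 New Street', '34 New Road', '56 New Lane']
```

The faster methods in the video all vary how much of the table this statement has to touch or rewrite.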

⏱ TIMESTAMPS:
00:00 - Our sample data
00:47 - Method 1
01:22 - Method 2
02:09 - Method 3
04:38 - Method 4
05:25 - Method 5

Comments

Your channel is very helpful, sir. Thank you for providing knowledge for free.

prashlovessamosa

It would seem that the first method would always be the fastest. The REPLACE function basically runs an INSTR on every row, changing the data when it finds a match. Every other example will do more work. For example, in Method 2, the INSTR runs on every row, then REPLACE has to search the subset of rows again to locate the text and replace it. The other methods all add additional overhead while still searching some fields twice. The fastest way would be to run it in parallel if your DB supports it.

RAMII
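The two statements this commenter contrasts can be sketched as follows (SQLite stand-in with invented names; `instr()` plays the role of the INSTR the comment describes, and the second statement simply reverses the first so both can be shown on the same data):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO t (val) VALUES (?)",
                 [("foo-1",), ("bar-2",), ("foo-3",), ("baz-4",)])

# Method 1 as described: replace() runs on every row, rewriting all of them.
conn.execute("UPDATE t SET val = replace(val, 'foo', 'qux')")

# Method 2 as described: instr() scans every row to build the filter,
# then replace() scans the matching subset again -- two passes over matches,
# but non-matching rows are never rewritten.
conn.execute(
    "UPDATE t SET val = replace(val, 'qux', 'foo') WHERE instr(val, 'qux') > 0"
)
conn.commit()

rows = [r[0] for r in conn.execute("SELECT val FROM t ORDER BY id")]
print(rows)  # ['foo-1', 'bar-2', 'foo-3', 'baz-4']
```

Which one wins in practice depends on the match ratio: when few rows match, skipping the rewrite of non-matching rows usually outweighs the cost of the extra scan.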

Can you show how to update a MySQL database so that the Address column is masked in a custom way, e.g. replacing characters with * and & signs so that 'address' becomes '*d&ras&', across all 30 million rows? How can I solve this? Please help.

slaybryn
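The example mapping in this question doesn't pin down the exact rule, so the `mask` function below is only a guess at an alternating */& scheme; the real point of the sketch is the pattern of registering a custom masking function and applying it in one UPDATE (SQLite's `create_function` here; MySQL would use a stored function or expression instead):

```python
import sqlite3

def mask(addr: str) -> str:
    # Hypothetical rule: keep every odd-indexed character, replace the
    # rest alternately with '*' and '&'. Swap in the real masking spec.
    out = []
    for i, ch in enumerate(addr):
        out.append(ch if i % 2 == 1 else ("*" if i % 4 == 0 else "&"))
    return "".join(out)

conn = sqlite3.connect(":memory:")
conn.create_function("mask", 1, mask)  # expose the Python fn to SQL
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, address TEXT)")
conn.execute("INSERT INTO customers (address) VALUES ('address')")
conn.execute("UPDATE customers SET address = mask(address)")
conn.commit()

masked = conn.execute("SELECT address FROM customers").fetchone()[0]
print(masked)
```

At 30 million rows this kind of UPDATE would normally be combined with the batching discussed below in the comments, so it doesn't hold one long transaction open.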

Would updating in batches be beneficial for a live app, thereby keeping the app available for use in between the batches?

coolcha
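The batching idea in this question can be sketched as a loop that updates a limited slice of matching rows and commits after each slice, so locks are released between batches (SQLite stand-in with invented names; real batch sizes are typically in the thousands):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO t (val) VALUES (?)", [("old",)] * 10)
conn.commit()

BATCH = 3  # tiny for illustration
total = 0
while True:
    # Update at most BATCH still-unprocessed rows, then commit so the
    # live app can read and write in between batches.
    cur = conn.execute(
        "UPDATE t SET val = 'new' WHERE id IN "
        "(SELECT id FROM t WHERE val = 'old' LIMIT ?)",
        (BATCH,),
    )
    conn.commit()
    total += cur.rowcount
    if cur.rowcount == 0:
        break

print(total)  # 10
```

The trade-off is total runtime: many small transactions finish later than one big one, but each individual transaction blocks the table only briefly.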

What is the point of this video?

The most obvious and naive method is the fastest, while the rest are convoluted and do extra work for no good reason.

99% of the videos on this channel are good to great, but I would just delete this one if I were in your shoes.

kane_lives