Laravel: Mass Update Batch Rows in One Query? 3 Ways

Answering a question from a YouTube comment, showing 3 different approaches to bulk-update a set of data.

Comments

I would be happy to see some detailed comparisons, with a large amount of data, on videos like this :)

wolvgvng

With the upsert solution and a lot of data, you will get a "too many parameters" exception if you don't chunk the data. When using foreach, I would use proper transaction sizes to speed it up.
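A minimal sketch of that chunking, assuming a hypothetical products table with a unique sku column and the rows already built in $rows:

```php
use Illuminate\Support\Facades\DB;

// $rows = [['sku' => 'A-1', 'price' => 100], ...];  // hypothetical data
collect($rows)->chunk(500)->each(function ($chunk) {
    DB::table('products')->upsert(
        $chunk->values()->all(), // reset keys so each chunk is a plain list
        ['sku'],                 // column(s) to match on (needs a unique index)
        ['price']                // column(s) to update when the row already exists
    );
});
```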

globetrotterxxl

Thank you for sharing your knowledge with us. Your videos are short and straight to the point, and your teaching skills are remarkable!

gssj-op

I always use the 1st approach for mass updates... I recently tried it on 14k records and it was pretty fast. I often add a unique index at the database level.
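A minimal sketch of adding such an index, assuming the same hypothetical products table matched by sku (anonymous migration classes require Laravel 8.37+):

```php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration {
    public function up(): void
    {
        Schema::table('products', function (Blueprint $table) {
            // upsert() decides "insert or update" by this column,
            // so the database needs a unique index on it
            $table->unique('sku');
        });
    }

    public function down(): void
    {
        Schema::table('products', function (Blueprint $table) {
            $table->dropUnique(['sku']);
        });
    }
};
```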

samsonojugo

The main reason to use the "foreach method" is to use Observers; only then will they work as they should. The "upsert method" is only useful for sending some statistical data that doesn't need to be prepared for insertion into the database.
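A minimal sketch of that difference, with a hypothetical ProductObserver; per-model update() calls fire it, upsert() never does:

```php
use App\Models\Product;
use Illuminate\Support\Facades\Log;

class ProductObserver
{
    // fires on every Eloquent save()/update(), but never on upsert()
    public function updated(Product $product): void
    {
        Log::info("Product {$product->id} was updated");
    }
}

// registered once, e.g. in AppServiceProvider::boot():
// Product::observe(ProductObserver::class);

// foreach: N queries, observer fires for each row
foreach ($changes as $id => $price) {
    Product::find($id)?->update(['price' => $price]);
}

// upsert: one query, observer stays silent
Product::upsert($rows, ['sku'], ['price']);
```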

artem

Today at work I had a case of inserting a large amount of data into two DB tables.

The tricky part was that the second table had a foreign key to table one.

The easiest solution would be a for loop in which I would create a record in table one and then a record in table two. But that would mean hundreds of queries.

I solved it by inserting the data into table one with a single insert statement, then grabbing the id of the last inserted record, deriving the range of ids that had to go into table two, and finally inserting the data into table two with another insert statement. So in the end I performed only 3 queries to insert hundreds of records. All of it had to be done inside a transaction in case someone else inserted records at the same time.
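A minimal sketch of that pattern, with hypothetical orders/order_items tables and an $itemsByOrderIndex mapping; it assumes MySQL assigns the multi-row insert one consecutive id block (guaranteed for simple inserts with innodb_autoinc_lock_mode = 0 or 1):

```php
use Illuminate\Support\Facades\DB;

DB::transaction(function () use ($orders, $itemsByOrderIndex) {
    // step 1: bulk-insert all parent rows in a single statement
    DB::table('orders')->insert($orders);

    // step 2: with a multi-row MySQL insert, lastInsertId() returns the id
    // of the FIRST row of the batch; the rest follow consecutively
    $firstId = (int) DB::getPdo()->lastInsertId();

    // step 3: map each child to its parent by position in the batch,
    // then bulk-insert the children in one more statement
    $items = [];
    foreach ($orders as $index => $order) {
        foreach ($itemsByOrderIndex[$index] as $item) {
            $item['order_id'] = $firstId + $index;
            $items[] = $item;
        }
    }
    DB::table('order_items')->insert($items);
});
```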

If anyone has faced a similar issue, I would love to hear your opinion.

dApoTB

I usually use the MySQL 'CASE WHEN ... THEN' statement with chunks for handling large numbers of rows, but this time I prefer the upsert method with chunks.

The chunking technique is helpful in avoiding the query size issue.
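A minimal sketch of that CASE WHEN technique, with hypothetical products/price names; for large batches, run it per chunk of $changes:

```php
use Illuminate\Support\Facades\DB;

$changes = [1 => 100, 2 => 250, 3 => 80]; // id => new price (sample data)

$cases = [];
$bindings = [];
foreach ($changes as $id => $price) {
    $cases[] = 'WHEN ? THEN ?';
    array_push($bindings, $id, $price);
}

$ids = array_keys($changes);
$bindings = array_merge($bindings, $ids);

// a single UPDATE covers the whole batch
$sql = sprintf(
    'UPDATE products SET price = CASE id %s END WHERE id IN (%s)',
    implode(' ', $cases),
    implode(', ', array_fill(0, count($ids), '?'))
);

DB::update($sql, $bindings);
```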

alamin_firdows

It's definitely worth mentioning the difference between foreach and upsert (and the batch package if I had to guess).

The foreach implementation will respect soft deletes and will trigger model events while upsert WILL NOT. If either of those are important, you SHOULD NOT use upsert.

In most cases this is probably OK, and even the desired effect for your batch update.

DanRichardsRI

Databases are built to handle large data efficiently on their own.

I'd always defer more work to the DB whenever possible, since that's what they're good at.

If I don't need Eloquent's events and other related goodies for the update, then upsert is just fine and also performant.

newtonjob

Thank you, sir, for making a video on my comment. I was using the 3rd solution, but now I've got two more for that purpose. Thank you very much 😁

sr.rachit

For big data sets, the second way is the best. Some time ago I made a DB seeder run about 10x faster (from about 10-15 min to 1+ min) by using that approach (without the mentioned package).
This is flexible, as there is no "unique" requirement, and you can always chunk the data to insert in batches and avoid any parameter count limits.
For smaller sets it's smarter to use the 3rd way and let Laravel handle events etc.
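A minimal sketch of that seeder approach, with hypothetical table and row data:

```php
use Illuminate\Database\Seeder;
use Illuminate\Support\Facades\DB;

class ProductSeeder extends Seeder
{
    public function run(): void
    {
        $rows = [];
        for ($i = 1; $i <= 50_000; $i++) {
            $rows[] = ['sku' => "SKU-{$i}", 'price' => 100];
        }

        // plain insert() has no unique-index requirement, and chunking
        // keeps every statement under the parameter/packet limits
        foreach (array_chunk($rows, 1_000) as $chunk) {
            DB::table('products')->insert($chunk);
        }
    }
}
```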

NoOorZ

For accuracy: upsert() was added in Laravel 8.

jderdmann

Agree with looping over the data, using Laravel's model cursor() or chunkById().
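A minimal sketch of both, with a hypothetical Product model:

```php
use App\Models\Product;

// chunkById() pages through rows by primary key, keeping memory flat
Product::where('active', true)->chunkById(500, function ($products) {
    foreach ($products as $product) {
        $product->update(['price' => $product->price * 1.1]);
    }
});

// cursor() returns a LazyCollection that hydrates one model at a time
foreach (Product::where('active', true)->cursor() as $product) {
    $product->update(['price' => $product->price * 1.1]);
}
```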

ronssijei

I think the first and second approaches are good; we can apply chunks for bigger data. But with the third approach, bigger data means too many queries and a slow response.

rizwan

I bumped into this situation and came here to see your opinion 😁 I'm going with the foreach as always... seems like there are no real savings with the upsert method.

ameename

I solved it in one of my projects with a foreach inside a DB transaction. It is a bit faster than a foreach without a transaction, and of course a transaction comes with other benefits.
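A minimal sketch of that foreach-inside-a-transaction variant, with hypothetical names:

```php
use App\Models\Product;
use Illuminate\Support\Facades\DB;

// still one UPDATE per row, but the commit overhead is paid only once,
// and the whole batch either succeeds or rolls back together
DB::transaction(function () use ($changes) {
    foreach ($changes as $id => $price) {
        Product::whereKey($id)->update(['price' => $price]);
    }
});
```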

svenst

I use SQLite with millions of records and have found that using transactions can make batch updates up to 10 times faster. One reason, I suspect, is that the indexes have to be updated only once.

paulfontaine

I think I would go with foreach, but put the whole thing into a transaction.

Denakino

I have tried the upsert method on an Oracle DB. I must say that DB was heavy AF, but upserting ~11K records did not take that long.

cardboarddignity

Upsert + chunking gives you the fewest queries and the ability to fine-tune it.

SauliusKazokas