Backing Up Your Docker Configurations and Data

=== Links ===
Show Notes

Get the AwesomeOpenSource Merchandise

Support my Channel and ongoing efforts through Patreon:

Buy Me a Coffee or Beer

=== Timestamps ===

=== Contact ===
Twitter: @mickintx
Telegram: @MickInTx

Try out SSDNodes VPS services! Amazing specs for incredibly low cost. I'm running a 32 GB RAM / $ CPU server for only $9 a month! Seriously. For long-term server usage, this is the way to go!

Get a $50.00 credit for Digital Ocean by signing up with this link:

Use Hover as your domain name registrar to get great control over your domains/subdomains:

What does the money go to?
To pay for Digital Ocean droplets, donations to open source projects I feature, and any hardware I may need to purchase for future episodes (which I will then give to a subscriber in a drawing or contest).

=== Attributions ===
=== Comments ===

Been wanting to do something like this for a while. Thanks for laying the groundwork. I can integrate this into my rsync/restic backups that I already do.

tack-support

I have Syncthing pulling my Docker folders from my server to a Raspberry Pi with an external HDD at my folks' house as an off-site backup, and since Syncthing is doing a one-way sync, it keeps the backup current in real time.

Lightning struck a couple of months ago and fried a whole bunch of equipment: my ISP's ONT, my router, and my server's mobo and CPU. The backup served me well, because I was able to get everything back up and running quickly.
I had images of my HDD backed up; I restored the image, restored the Docker files to the latest version, and boom, as if nothing had happened :)

uilh

Small adjustment: you can keep the containers in a list and loop over it in bash, so you can easily add or remove containers from the backup, since the same list is used to stop and then start the compose files.
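
For illustration, a minimal sketch of that list-driven approach (the project names and the /opt/docker paths are hypothetical):

COMPOSE_PROJECTS="nextcloud gitea wordpress"  # one list; edit it to add/remove containers

for p in $COMPOSE_PROJECTS; do  # stop everything on the list
  docker compose -f /opt/docker/$p/docker-compose.yml stop
done

tar -czvf /backup/docker-$(date +%F).tar.gz /opt/docker  # back up while everything is stopped

for p in $COMPOSE_PROJECTS; do  # start the same list again
  docker compose -f /opt/docker/$p/docker-compose.yml start
done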

mehdiyahiacherif

Thanks a lot, Brian, for making this video and handout on Docker backup. I always like your clear explanations of how to do things and, more importantly, why you do them. Greetings and keep up the good work👍

i-am-you-tube

My backup strategy looks almost exactly like the one you showed us here. The only big difference is that I use restic instead of tar, and I can recommend that everyone here have a look at restic!

DocBrown

Thank you for that tutorial! It's always a pleasure to learn things with you! I've been following your channel for quite some time now!

Robertjaymercer

Appreciate the straightforward reviews and advocacy for open source. Thank you! I offer a slight correction on the tar command options you gave. The -czvf options are actually c for create (not compress), z for compress (not zip), v for verbose, and f for target file. You are creating a compressed tar file: <filename>.tar.gz. If you omitted the z option you'd leave off .gz. You don't technically have to add or omit the .gz on the filename; either way, it has no bearing on whether the file is compressed. That's all in the use (or not) of the z option. And tar -xzvf means x for extract, z for a compressed file (uncompress a compressed file), v for verbose, and f for source file. Keep up the great work!
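
In other words (the archive name and source path are just placeholders):

tar -czvf backup.tar.gz /opt/docker  # c=create, z=gzip compression, v=verbose, f=target file
tar -xzvf backup.tar.gz              # x=extract, z=uncompress, v=verbose, f=source file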

tonydematteis

I will give it a try. I had backup problems earlier: I used aaPanel, stopping the web server and other shitty things, and used a backup cron job to upload to my GDrive account. This method is also cool. Thank you.

PapaSharmaJi

My backup script is quite similar to yours, except that I use 'restic' instead of 'tar'. Since restic backups are incremental, the backup is a lot faster than using tar, and restic also does a lot better in terms of version control and storage usage.
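
For reference, a minimal restic workflow looks roughly like this (the repository and source paths are placeholders; restic prompts for a repository password):

restic -r /backup/restic-repo init                           # one-time repository setup
restic -r /backup/restic-repo backup /opt/docker             # incremental backup of the docker folder
restic -r /backup/restic-repo snapshots                      # list stored versions
restic -r /backup/restic-repo forget --keep-daily 7 --prune  # keep a week of dailies, reclaim space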

gotothemoon

Add this line at the end of the script to delete archives older than 3 days:

find /backup/folder -type f -mtime +3 -delete

florealucianm

Thanks, I've been wanting to do something like this myself. I would love to know about deleting certain old backups as part of the process.

nathanmcfarlane

Hey, that is an awesome video! One thing though: I have a backup script very similar to yours, and I found it helped with time and space to exclude temp and cache folders in some containers. Plex/Jellyfin/Emby have a lot of them, and they tend to be the most time-consuming. Same with the art and some metadata: I have the bandwidth and CPU power to rebuild pretty fast, so I also exclude that from backups. But that is just me being greedy with my storage space.
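
A rough sketch of that kind of exclusion (the patterns and paths are made-up examples; check your own containers' folder layouts):

tar -czvf /backup/docker-$(date +%F).tar.gz \
  --exclude='*/Cache/*' \
  --exclude='*/Transcode/*' \
  --exclude='*/Metadata/*' \
  /opt/docker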

samuelchampigny

To stop and restart all containers quickly, use:
docker stop $(docker ps -a -q)
docker restart $(docker ps -a -q)
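
One caveat: docker ps -a -q also matches containers that were already stopped on purpose, so the restart would bring those up too. A variant that only touches what was actually running:

containers=$(docker ps -q)  # remember only the currently running containers
docker stop $containers
# ... run the backup here ...
docker start $containers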

HelloHelloXD

Instead of stopping and starting all the containers, you could also do a sync and then pause the containers. It's not perfect, but it's faster and easier. If you want to be cool, you could even use a fancy filesystem like BTRFS or ZFS, or thin LVM, to snapshot the whole directory/filesystem. Then you can do a backup of that snapshot. Since snapshots are atomic, you have all the time you need to do your complementary backup off of it.
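
A rough sketch of the pause variant (the backup and source paths are placeholders):

docker pause $(docker ps -q)    # freeze running containers in place
sync                            # flush pending writes to disk
tar -czvf /backup/docker-$(date +%F).tar.gz /opt/docker
docker unpause $(docker ps -q)  # paused containers still show up in docker ps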

LampJustin

Very good video - Thanks for the information - Regards


This was very helpful. Thank you. I will modify this script and try it. Your explanation is helping me understand why my Duplicati backups of my Docker volume haven't been working. This is a big problem, though, and we haven't solved it. What if I had people truly relying on these services? I can't tell them we're shutting down for a break every 24 hours so we can back up.

FrontLineNerd

That is a perfectly valid backup plan. You can go fancy with incremental backups via Borg or Bacula, but never use backup tools that are smarter than you. (-: The easier, the better.

lpseem

I did a _very very_ dirty version of this :)
docker stop $(docker ps -a -q)
tar -cvzf backup.tar.gz -X exclude.txt /volume1/docker  # -f needs the archive name as its argument
docker start $(docker ps -a -q)

At least it works :)

kharmastreams

Thanks for this great video. Would it be possible to go through a restore in a future video?

pctechdr

I’m developing a python script that verifies the backup by extracting the gzip file to a temp directory, then compares the source files to the ones in the temp directory. (Point being: _having_ a backup doesn’t really do you any good if the backup is no good.) Core functionality works; I’m just working on the fancy stuff like parsing the log files and sending myself a push notification to let me know that the backup completed and verified successfully. Also need to run some tests to see if there’s a significant (time) difference between using diff, cmp, and sha256 checksums. (I may do it the fastest way on a daily basis and the most thorough/effective way weekly.)
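
The same verification idea as a shell sketch (the archive and source paths are placeholders):

tmp=$(mktemp -d)
tar -xzf /backup/docker-backup.tar.gz -C "$tmp"  # unpack into a temp dir
if diff -r /opt/docker "$tmp/opt/docker"; then   # GNU tar strips the leading / on create, so the tree reappears under $tmp
  echo "backup verified OK"
else
  echo "backup verification FAILED"
fi
rm -rf "$tmp"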

donny_bahama