Linux Crash Course - The wget Command

The Linux Crash Course is a tutorial series that goes over all of the core concepts regarding Linux that you'll need to know, one video at a time. In this episode, the wget command is covered.
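
For reference, the wget options covered in the episode can be sketched roughly as below. This is a hedged example run against a throwaway local Python web server so no external network is needed; the port, paths, and file names are illustrative assumptions, not commands taken from the video.

```shell
# Sketch of the wget options covered in the episode, demonstrated
# against a local test server (port and paths are arbitrary choices).
set -e
mkdir -p /tmp/wget-demo/srv /tmp/wget-demo/out
echo "hello from wget" > /tmp/wget-demo/srv/file.txt

# Serve the directory in the background (Python 3 assumed available)
python3 -m http.server 8031 --directory /tmp/wget-demo/srv >/dev/null 2>&1 &
SRV=$!
trap 'kill $SRV 2>/dev/null || true' EXIT
sleep 1

cd /tmp/wget-demo/out
wget -q http://localhost:8031/file.txt                  # download a single file
wget -q -O renamed.txt http://localhost:8031/file.txt   # -O: choose the file name
wget -q -P subdir http://localhost:8031/file.txt        # -P: choose the directory
echo "http://localhost:8031/file.txt" > urls.txt
wget -q -P listdir -i urls.txt                          # -i: read URLs from a file
# -c resumes an interrupted download; on a fresh file it behaves
# like a normal download:
wget -q -c -P resumed http://localhost:8031/file.txt
ls
```

Each invocation here maps to one segment of the episode's time codes; in real use you would of course point wget at a remote URL instead of localhost.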

## Thank you to Linode for sponsoring this episode of the Linux Crash Course.

*Support Learn Linux TV*
Note: Commission may be earned for any and all links presented here.

*⏰ TIME CODES*
00:00 - Intro
01:25 - Get your very own cloud Linux server with Linode
02:38 - Checking if wget is installed
03:22 - Downloading a single file with wget
06:02 - Choosing the name of the file to be downloaded
07:21 - Choosing a location for the file to be downloaded
08:38 - Resuming an interrupted download with wget
11:18 - Using an input file with wget to download a list of files

*🎓 FULL LINUX COURSES FROM LEARN LINUX TV*

*🌐 LEARN LINUX TV ON THE WEB*

*⚠️ DISCLAIMER*
Learn Linux TV provides technical content that will hopefully be helpful to you and teach you something new. However, this content is provided without any warranty (expressed or implied). Learn Linux TV is not responsible for any damages that may arise from any use of this content. Always make sure you have written permission before working with any infrastructure and that you are compliant with all company rules, change control procedures, and local laws.

#Linux #LearnLinux #DevOps
Comments

Thank you; it would have been useful to also explain the difference between wget and curl.

Gabriel-of-YouTube

wget is good for scraping sites, but I use cURL. It's preinstalled on almost every distro, works with more protocols, and is just as easy to use.

berkpwn

Thanks, I didn't know all those wget options. Perhaps this is also a nice string of commands to explain some time: "time timeout 60 watch -n 10 ls -litra" :-)

dirkpostma

You are my online encyclopedia for Linux stuff. Perfectly understandably explained, as always 🙏

ArniesTech

Thanks for the "-i" option; that's a life saver :) Another very useful option is "-b" (background download).

zauliuz

The process you used to get the URL of the WordPress download page defeated the purpose of wget as you described it at the beginning: using wget when you don't have a GUI installed on your server, yet you had to open your browser to get the WordPress URL.

medes

I know you can't cover every available option, but I just wanted to mention that I first discovered wget when I needed to download the Slackware distribution, before the days that ISOs (and fast enough internet to download ISOs) were commonplace. The recursive option to wget was great for mirroring the directory structure in preparation to create an install CD.

aaronperl

I always use wget to download Linux distros and other operating systems, so the potential for interrupted downloads is always there, and I always use the -c switch. It doesn't seem to hurt leaving it on through something like an alias.

I also like to retain the original timestamps of the files whenever possible, which lets me sort my ISOs roughly by release date in the file manager. The switch for this is capital N, so the full wget command I use is simply: wget -cN
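
As a hedged sketch of that alias idea (this uses a shell function rather than an alias so it also expands in non-interactive shells; the choice is an assumption, not something from the video or the comment):

```shell
# Make -c (resume) and -N (keep server timestamps) the default for
# every wget call. This could go in ~/.bashrc; "command wget" calls
# the real binary so the function doesn't recurse into itself.
wget() { command wget -cN "$@"; }
```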

UltraZelda

Hey, I've been watching your videos lately. I saw a video you posted a year ago about migrating to AlmaLinux. I also saw your opinion on CentOS and RH and not being tied to a certain distro. I was wondering if you had a video about migrating a server from one distro to another, like Ubuntu/Debian to Rocky or vice versa.

rudysal

Great video as always, can you cover tar next?

logyross

I have been enjoying these beginner-focused videos. One suggestion would be to explain what each argument means. A perfect example: you explained that -i was for an input file, but for the other arguments you only explained what they did, not what they mean. Explaining the meaning, such as -i standing for input, helps people associate the option with its function, making it easier to remember.

bw_merlin

How are the wget and curl commands similar in certain scenarios?

nirabhmani

How about using wget to fetch HTML values from webpages?
For example, if you wanted to fetch a sports score or a stock price from a website?

__

I only download programs from the package manager and so should you.

drumpfall

Dang! I just bought the 3rd edition of your book on Kindle. Is it worth buying the whole book again to get the updated info? I'm studying to become a systems administrator and I don't want to miss vital info, but funds are also tight as I'm doing school work to make this career transition.

maheadros

The way I download a single archive from the internet shows me the download progress, and when the download completes, without those progress bars or dots on my terminal. For this example I'll use "archive" as a placeholder for what to download:
wget -bc archive && wc -l wget-log && grep -i saved wget-log
Once entered, every so often hit the up arrow or ! then Enter to check on the archive's growth on the system. Some lines will only have numbers, but a longer output line will tell you when the file has finished downloading. Once finished, just rm wget-log so the command can be used again. This is probably a good function candidate too.
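
Following that "function candidate" remark, one possible wrapper looks like this. The name bgget is made up for illustration, and the function simply packages the comment's one-liner:

```shell
# Start a background download (-b writes progress to ./wget-log in the
# current directory), then report the log length and whether "saved"
# has appeared yet. "saved" in the log means the download finished;
# until then the trailing grep finds nothing, so re-run
# "wc -l wget-log; grep -i saved wget-log" to poll.
bgget() {
    wget -bc "$1" &&
    wc -l wget-log &&
    grep -i saved wget-log
}
```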

judedashiell

Thanks very much, your tutorials are so simple and I could understand every piece of information in your videos.

moamer

Top! Thanks for your videos. Can you cover the topic of nmap?

jwspock

So funny, I was googling how to use wget on a poor connection without restarting from the beginning, and you made a video 😀

Broly_

Have you done a Gentoo install guide similar to your Arch one?

ciaopete