Web Scraping in Google Sheets! (IMPORTXML FUNCTION)


In this video, I show you step by step how to use the =IMPORTXML function to web scrape in Google Sheets. Using this function has allowed me to build extremely powerful and useful spreadsheets, which you can see in many of the videos on my channel.
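For reference, IMPORTXML takes a URL and an XPath query and returns the matching nodes. A minimal sketch (the URL and query below are illustrative placeholders, not the ones used in the video):

```
=IMPORTXML("https://en.wikipedia.org/wiki/Web_scraping", "//h1")
```

If the XPath matches more than one node, the results spill into adjacent cells.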

Get 50% off Seeking Alpha Premium!

I am not a financial advisor or licensed professional. Nothing I say or produce on YouTube, or anywhere else, should be considered advice. All content is for educational purposes only. I am not responsible for any financial losses or gains. Invest and trade at your own risk.

#WebScraping #IMPORTXML #GoogleSheets
Comments

This doesn't seem to work on view counts for YouTube videos, Instagram, etc., but it does work on the data you show in the video. How come?

KnowArt

Great video!
I'm blown away by this feature. Didn't realize it was that easy to scrape without using code

ianbraganza

I have been looking for YEARS for a formula that would help me pull NBA win totals automatically -- this finally did it. Thank you so much!

msvec

Nice video, thanks for posting! I've got two questions. 1) I get an #N/A cell and the error is "Could not fetch the URL". What causes this, and is there any workaround? 2) Is there any way with this method to see the values changing in real time? Thank you!

thswan
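On the questions above: a common pattern for fetch errors is wrapping the call in IFERROR so the sheet degrades gracefully instead of showing #N/A; the cell reference and fallback text below are illustrative, not from the video. Note also that Google Sheets import functions recalculate periodically (up to roughly hourly), not in real time.

```
=IFERROR(IMPORTXML(A1, "//h1"), "fetch failed")
```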

Hi! I’ve tried the formula to import the full XPath section, but now the cell returns #N/A, so it can’t find the object at the URL. Any suggestions?

trader_gian

How can you pull variable data from a webpage? Example: only the business addresses from a listing page that shows multiple addresses with a full description between each one. How do you scrape specific variable data like that into a spreadsheet while keeping it organized and separate from the other data?

dxrpemn
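On pulling only specific fields from a mixed listing page: XPath predicates can filter by attribute, so only the matching elements come through. The URL and class name below are hypothetical examples; inspect the actual page to find the right selector.

```
=IMPORTXML("https://example.com/listings", "//span[@class='address']")
```

Each match fills its own cell in the result range, which keeps the addresses separate from the surrounding descriptions.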

Mindblowing stuff! I just managed to web scrape another website based on the principles here! Thanks for teaching me something new!

antonlindberg

Thanks for the video, but this doesn't work for me, and I've seen other people get the same #N/A error ("Resource at URL not found"). Please help.

JustinWaite

Thanks, your video is very good. But when I use this formula, the error I get is "Imported content is empty"... Please help.

sandeepkumar-jsgi

Has anyone tried scraping from YH subsection pages like the "Statistics" or "Financials" sections? I have tried scraping from there using IMPORTHTML and IMPORTXML, and both seem to fail (they only work on the summary page).

dbarrantescr

This is great, but I tried this and for some reason it says "Imported content is empty". Then I tried the same thing you did for Kobe and got the same error. Is yours still working?

tp

Smooth as it can be. Excellent. Liked, subscribed. 🫡

Sanctimonious

yasss this is what I was looking for (I think... let's see if it actually works)

KnowArt

This is a great video and I appreciate you putting it together! Thank You 🙂🙂

jivepatrol

What would you do if a website you're pulling prices from requires a login? I have the login for the specific site I'm referencing, but what if you don't? I'm more curious how to do it when you do have the login. For example, if I get specialty pricing from a hardware store that's only visible via my login, how would I automate that extraction?

pcjhrjq

Thank you. But I cannot scrape some ETFs, like GLD (gold). Can you help?

yxuwbtj

Nice! What about the "Statistics" tab? It doesn't seem to let me access the XPath on, say, EV/EBITDA.

skenderaxe

When I use the same function, the cell shows "#N/A". Do you have any idea what the problem is? Thank you!

jennifertao

Thank you, nice. On the Yahoo home screen it works, but not for the split statistic. Can you make this for maybe dividend yield...😊

bonchanz

Hello, I don't know if you'll ever see my message, but it seems Google has changed something in their settings. The Inspect option no longer shows elements as separate units; it shows all the text, so I cannot scrape anything lately... am I wrong??

erincvarol