Scrape data from multiple pages on a website with Power BI Desktop - Power BI Tips & Tricks

Have you come across a website that has the data you need, but spread across multiple pages? Downloading page by page can be frustrating and time-consuming.
In this short video, I will show you how to iterate over multiple pages on a website to import the data using Power BI.

Chapters:
00:00 Intro
00:30 Explain the use case
01:30 Break down the URL
02:30 Get web data in Power BI
03:00 Add parameter to the web url
04:30 Test our function
05:30 Create a list of years in power query
06:00 Call the function
07:00 Load all the data in Power BI
Done!
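The steps above can be sketched in Power Query (M). This is a minimal, hypothetical sketch of the pattern, not the exact code from the video: the URL, the table index, and the function name `GetYearData` are all placeholder assumptions.

```
// GetYearData: a parameterized query that fetches one page for a given year.
// The URL below is a placeholder - substitute the real site, and pick the
// right table from the list that Web.Page returns.
(year as number) as table =>
let
    Url    = "https://example.com/results?year=" & Number.ToText(year),
    Source = Web.Page(Web.Contents(Url)),
    Data   = Source{0}[Data]   // first HTML table found on the page
in
    Data
```

Then build a list of years, invoke the function once per year, and combine the results into a single table:

```
let
    Years    = {2015..2018},                               // list of years to iterate
    Pages    = List.Transform(Years, each GetYearData(_)), // one table per year
    Combined = Table.Combine(Pages)                        // append them all
in
    Combined
```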

SUBSCRIBE to learn more about Power BI and Excel!

Our PLAYLISTS:

ABOUT CURBAL:

QUESTIONS? COMMENTS? SUGGESTIONS? You’ll find me here:
► Twitter: @curbalen, @ruthpozuelo

#SUBSCRIBE #CURBAL

Comments

The connector has just got better, check this video also:

/Ruth

CurbalEN

1:24 - URL decoding (BULK download website data)
2:10 - Import data in Power BI
3:04 - Substitute static year with dynamic function
5:10 - Create query for ALL data

Thanks for the video, Ruth! Very useful tricks. Please take a look at "threelly", the Chrome extension!

amantin

That worked really well. I ended up using it to replace a data field in a URL instead of a year, and it worked flawlessly.

kurtisbrischke

Mind-blowing... just when you thought downloading data from one website was the coolest thing ever... now I'm downloading it from 50 different pages at once, lol... simply... magnificent! Thank you Ruth!!! You da best!

chescov

That was very useful. To think that my problem was sorted already 5 years ago is very humbling.

ngrna

Thank you very much! After a few hours of searching and searching on Google, I reached this video, with a simple solution to get and merge web data on the same page.

omeufii

Great video and very helpful. I've been struggling with this for some time but I've just managed to scrape the data that I need from the website. Thank you.

powerb_i

Thank you very much for putting together a clear and straightforward tutorial... wish everybody was as good at this as you!

luiszuniga

Haha, I can't believe this. I was trying to scrape similar data for learning from the Formula 1 site, and I used somewhat the same logic. Then I came here to check if there's a better way to do it, as you're my go-to learning place. To my surprise, I had implemented it perfectly. Thanks for the good ideas, as always. :)) Grateful!

aditpandey

Very useful and broadly applicable information. This works great for querying every account I needed email metadata from using the Exchange connector.

davidmcewen

Top video - nicely explained with no unnecessary waffling.

CyberPin

This was very useful, thank you very much. I was having problems extracting multiple pages, but you just brought me all the clarity I need. Kudos, and keep it up!

johannanderson

I can't believe that this is possible. Why didn't I come here in the first place? This is really great, Ruth! Thanks a ton for this...

abhishekstatus_

Thank you so much for such an informative video.

synix

Hi, I must say your video has been very helpful and I appreciate it. However, creating a table for a range of dates/years has been very difficult to handle, e.g. 01/01/2015..31/12/2018. I have been following your video, but I got errors when I tried to create the table with the dates I want to iterate. Can you help?

royrolls
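The range-of-dates question above can be sketched in Power Query M. This is a hedged sketch, not code from the video: `List.Dates` generates a contiguous list of dates, and the endpoints below are simply the ones from the comment.

```
let
    StartDate = #date(2015, 1, 1),
    EndDate   = #date(2018, 12, 31),
    DayCount  = Duration.Days(EndDate - StartDate) + 1,    // inclusive of both ends
    Dates     = List.Dates(StartDate, DayCount, #duration(1, 0, 0, 0)),
    AsTable   = Table.FromList(Dates, Splitter.SplitByNothing(), {"Date"})
in
    AsTable
```

The resulting single-column table can then be used to invoke a parameterized query once per date, the same way the video invokes its function once per year.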

Thank you so much, this was explained very simply.

teraross

That's very clever, Ruth, well done, and thank you!

merkstamde

I just used a combo of this video and one other of yours in a solution! Even your videos from quite a while ago are super helpful!

christopherhastings

Excellent video. Very clear. One of the best YouTube videos I've watched. Many thanks.

bok

Lovely. I was looking for something to scrape multiple pages. I was using an R script, but it wasn't the most reliable, so I was looking for an easier alternative. The only difference is that I have a whole list of URLs that I want to scrape, so I need to put that list in the query and have the function use those.

atifmir