Automate Python script execution on GCP

This tutorial shows how to automate Python script execution on GCP with Cloud Functions, Pub/Sub and Cloud Scheduler.

00:00 Introduction
00:21 Architecture overview
01:09 GUI - Pub/Sub
01:28 GUI - Cloud Functions
02:39 Python code walkthrough
05:18 GUI - Cloud Scheduler
08:27 gcloud CLI - Pub/Sub
09:33 gcloud CLI - Cloud Functions
11:31 gcloud CLI - Cloud Scheduler
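The flow in the chapters above (Cloud Scheduler publishes to a Pub/Sub topic, which triggers a Cloud Function that runs the Python script) can be sketched as a minimal 1st-gen background function. This is an illustrative assumption, not the video's exact code; `hello_pubsub` is the conventional sample entry-point name:

```python
import base64

def hello_pubsub(event, context=None):
    """Entry point for a Pub/Sub-triggered Cloud Function (1st gen).

    Cloud Scheduler publishes a message to the topic; Pub/Sub invokes
    this function with the message payload in event['data'],
    base64-encoded.
    """
    if "data" in event:
        message = base64.b64decode(event["data"]).decode("utf-8")
    else:
        message = "no payload"
    # Replace this with the actual script logic (extract, transform, load).
    print(f"Triggered with message: {message}")
    return message
```

Deploying with a Pub/Sub trigger and pointing a Scheduler job at the same topic (as the gcloud CLI chapters do) is what turns this into a scheduled script.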
Comments

Great job! I followed your instructions, and everything started working smoothly for me. Your tutorial is fantastic; keep up the excellent work! You have the potential to reach 1 million subscribers. Keep pushing forward!!!

umamaheshmeka

Thanks a lot for sharing your knowledge!! Greetings from Mexico

patriciodiaz

Please answer my question: I need to do the same thing you did in this video. My Python script works just fine in Google Cloud Shell, but I am still having trouble making it work as a Cloud Function. The goal is to schedule the execution of the function, which extracts data from a web site and saves it in a Google Sheet. I was able to run it in Google Cloud Shell. Any clue?

MohamedFerrouga

Hi, I love your videos so far! Can you make a video on Vertex AI Pipelines with Cloud Scheduler for model training and deployment to an application? Looking forward to your reply!

algorithmo

Hi! First of all, great video! Really simple and intuitive. It worked for me when I called only one function inside hello_pubsub, but when I tried to call several others from other .py files, the function seemed to run perfectly but produced no results. Is there a way to make the Cloud Function wait until each function finishes before moving on to the next one? Thanks

victorricardo
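For what it's worth, plain synchronous calls in Python already run one after another; results usually go missing when helpers are launched in background threads or the entry point returns before they finish. A minimal sketch of the sequential pattern the question above is after (`step_one` and `step_two` are hypothetical helpers, not from the video):

```python
def step_one():
    # Hypothetical first stage, e.g. extracting data.
    return "extracted"

def step_two(data):
    # Hypothetical second stage, e.g. loading the extracted data.
    return f"loaded:{data}"

def hello_pubsub(event, context=None):
    # Each call blocks until the previous one returns, so step_two
    # only runs after step_one has fully finished.
    data = step_one()
    result = step_two(data)
    return result
```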

I have a question: what if I had a script that uses the Selenium (web scraping) library? There would be a problem, right? Because as far as I understand, it needs a driver installed on your machine to work 😪

patriciodiaz

Great tutorial. Wondering: why save the data for versioning as CSV and not Parquet? Wouldn't that allow you to create easy partitions, save on storage, and expedite processing/querying?

fafenley

Where does the script get its internet connection from to load URLs/APIs when it's running in the cloud? Is it generated or from a mast on the provider's grounds?

GTFS-lgtw

Thanks for the detailed explanation. I tried to follow the first method but am facing some issues. Basically, I could load the data into Cloud Storage, but when I try to load it into BigQuery, I receive a 404 error saying the request could not be served. Any idea why that happens? FYI, in case it helps: I modified the Python script a bit for my local environment, and with that I could load data into Cloud Storage and then upload it into BigQuery successfully from my laptop. However, when I try to automate it with Google Cloud Functions, I face the 404 issue. Any comment would be appreciated.

LamBecky

Hello, are you still active and available for questions? I would like to ask whether it's possible to have a function that starts and stops a VM within Google Cloud itself?
Thanks

EternalAI-vb

Very helpful tutorial. Can you show the GUI setup for environment version 2 of the function?

freddiemarrero

Lol, it's not working; I can't deploy. What could be the problem? I did exactly what you did, except I couldn't create the same bucket, so I named my bucket c4ds1 and changed the code accordingly.

Rajdeep
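Cloud Storage bucket names are globally unique, which is why the comment above had to pick a new name (c4ds1) and edit the code. One way to avoid hard-coding the name is to read it from an environment variable; a minimal sketch, where the `BUCKET_NAME` variable and helper names are illustrative assumptions:

```python
import os

def bucket_name(default="c4ds1"):
    # Bucket names are globally unique, so allow an override via the
    # BUCKET_NAME environment variable instead of hard-coding one.
    return os.environ.get("BUCKET_NAME", default)

def gcs_uri(blob_name, bucket=None):
    """Build a gs:// URI for a blob in the chosen bucket."""
    return f"gs://{bucket or bucket_name()}/{blob_name}"
```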

Can we do this using Google Compute Engine with a GPU? How can we do that?

JishnuMittapalli

What is the zip file? I didn't understand what it's about.

flosrv

Hi! This tutorial is great, but I encountered one problem while running the function: "404 Not found: Table was not found in location southamerica-east1". Any clue how to solve this?

rogerrendon