How to use local resources in Google Colab for AI/ML model training


Setup instructions
To allow Colaboratory to connect to your locally running Jupyter server, you'll need to perform the following steps.

Step 1: Install Jupyter
Install Jupyter on your local machine.

Step 2: Install and enable the jupyter_http_over_ws Jupyter extension (one-time)
The jupyter_http_over_ws extension is authored by the Colaboratory team and available on GitHub.

pip install jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws
Step 3: Start server and authenticate
New notebook servers are started normally, though you will need to set a flag to explicitly trust WebSocket connections from the Colaboratory frontend.

jupyter notebook \
  --NotebookApp.allow_origin='https://colab.research.google.com' \
  --port=8888 \
  --NotebookApp.port_retries=0

Once the server has started, it will print a message with the initial backend URL used for authentication. Copy this URL, as you'll need to provide it in the next step.

Step 4: Connect to the local runtime
In Colaboratory, click the "Connect" button and select "Connect to local runtime...". Enter the URL from the previous step in the dialog that appears and click the "Connect" button. You should now be connected to your local runtime.
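Once connected, a quick optional sanity check (not part of the official setup) is to print the hostname and working directory from a notebook cell; on a local runtime these should point at your own machine rather than a hosted Colab VM, whose working directory is /content.

# Optional sanity check: run this in a Colab cell after connecting.
# On a local runtime it should print your machine's hostname and a
# directory on your local filesystem (not /content, which is what
# the hosted Colab VMs use).
import os
import socket

print("Hostname:", socket.gethostname())
print("Working directory:", os.getcwd())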

use local branch

#ai #aiml #learnai #learning #monai #google #googlecolab
Comments

I did this, but when you call

import tensorflow as tf

print("yes" if tf.config.list_physical_devices('GPU') else "no")

it just prints "no" because no GPU was found. Is there a way to fix this?

santiago.valencia
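For readers hitting the same issue, a minimal diagnostic sketch is shown below. It only reports what TensorFlow can see on the local runtime; an empty GPU list usually means the CPU-only TensorFlow build is installed or the NVIDIA driver/CUDA libraries are not set up on the local machine, and fixing that is outside the scope of this snippet.

# Diagnostic sketch: list the GPUs TensorFlow can see on the local runtime.
import tensorflow as tf

gpus = tf.config.list_physical_devices('GPU')
print("GPUs visible to TensorFlow:", gpus)
print("Built with CUDA support:", tf.test.is_built_with_cuda())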

It just gives me '"jupyter notebook" is not a valid syntax', despite me having installed both JupyterLab and Notebook.

iEatCarKeys

When I run the command in the terminal, it shows that jupyter is not found.

_anjali

It seems notebooks that connect to Google Drive have issues using this.

Gamess