Training model in Python and Loading into TensorFlow.js - TensorFlow.js p.4



Comments

I feel intimidated by the lack of other comments.


Thank you sentdex for the kind review of the workspace. I am very happy to continue the original task. The instructions were excellent.
McGary

mcgama

As for the problem with hosting the model separately: you are serving the script over the *'file://'* protocol (clicking the index.html file and 'running it with a browser') instead of *'http://'*.

It won't work because *'file://'* is not HTTP, and the browser blocks the request for the model file :D. A handy trick to quickly serve static content with *python* is to open your terminal, go to the directory containing *index.html*, and run:

# run this in your Linux shell or Windows CMD (Python 3; on Python 2 use 'python -m SimpleHTTPServer')
$ python -m http.server 8080

where *8080* is the port.
Then you can specify a relative path to your *.json* file, no need to use any external hosting.
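
If you need the model files to be fetchable from a page served on a different origin, here is a minimal sketch of the same server written out in Python with permissive CORS headers added; the handler class name and the port are placeholders, not anything from the video:

# minimal static file server with CORS headers (Python 3)
from http.server import HTTPServer, SimpleHTTPRequestHandler

class CORSRequestHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        # allow model.json and the weight shards to be fetched from another origin
        self.send_header('Access-Control-Allow-Origin', '*')
        super().end_headers()

if __name__ == '__main__':
    # serve the current directory on port 8080
    HTTPServer(('0.0.0.0', 8080), CORSRequestHandler).serve_forever()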


Hi there.
Thanks for sharing your insights on TF and TFJS.
Couple of questions I have below. Hope you can help to clarify:

1. When adding layers to a model, what determines the number of layers used, as well as the input shape passed between each layer? For instance, in tutorial #4, a total of 6 inputs are fed to the input layer (see the sketch below).

model.add(Dense(64, activation='relu', input_dim=6))  # for subsequent layers, the input shape is inferred automatically

2. The 4 tutorials switch between letting TF decide on the type of neural network used and the explicit use of 'relu' / 'sigmoid'. How does one decide when it is best to let TF make the call on the appropriate type of NN to use?
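
A minimal Keras sketch of the input_dim point above (not from the video; the layer sizes other than the 6 inputs are arbitrary): only the first layer needs an explicit input shape, and every later layer infers it from the previous layer's output.

# minimal sketch: input_dim is only needed on the first layer
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
model.add(Dense(64, activation='relu', input_dim=6))  # 6 input features
model.add(Dense(32, activation='relu'))               # input size 64 inferred from the layer above
model.add(Dense(1, activation='sigmoid'))             # input size 32 inferred
model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()                                       # prints the inferred shapes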

Look forward to your feedback.

martinleong

I have some trouble: when I run 'pip install tensorflowjs' I get the error "Could not find a version that satisfies the requirement tensorflow==1.9.0 (from tensorflowjs) (from versions: )".

Can you help me please?

linho

your mugs got more and more crazy through this tutorial

NEDMInsane

I just want to load a model and use it later in multiple model.predict calls. How can I wait until the HTTP request is done so I can use the model? And why is the model load function async at all?

wyosfr

Hi, I am training the model with Keras in Python and loading the JSON file in my JS code. Prediction works correctly from the Python code, but the prediction result is not correct from JS.
Can you tell me the possible reasons for this?
Thank you

visheshmahale

Can you make a tutorial about converting a TensorFlow model to the TensorFlow.js web format? I'm really confused about that.
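
For reference, a rough sketch of the Python side of that conversion using the tensorflowjs package's Keras converter; 'model.h5' and 'tfjs_model' are assumed file and directory names, not anything specified in this thread:

# convert a saved Keras model to the TensorFlow.js web format
from tensorflow.keras.models import load_model
import tensorflowjs as tfjs

model = load_model('model.h5')                         # assumed path to the trained Keras model
tfjs.converters.save_keras_model(model, 'tfjs_model')  # writes model.json plus weight shard files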

longvan

Hey man! great tutorial. Is there a way to get in contact with you directly?

ghaithalmasri

Hey sentdex, I was wondering: how about using Flask for deploying models locally?

anandpawara

Actually I need some explanation about what dropout, epochs, and Dns are. Where could I get help?
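
A minimal sketch (not from the video, with arbitrary layer sizes and placeholder data) just to show where dropout and epochs appear in a Keras script:

# Dropout layer and the epochs argument in Keras
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

X = np.random.rand(100, 6)                   # placeholder data: 100 samples, 6 features
y = np.random.randint(0, 2, size=(100,))     # placeholder binary labels

model = Sequential([
    Dense(64, activation='relu', input_dim=6),
    Dropout(0.2),                            # randomly drops 20% of units during training only
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')
model.fit(X, y, epochs=10)                   # one epoch = one full pass over the training data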

farnazfarhand

Can you please tell me how to load the model from a local folder... it's been driving me crazy throwing errors continuously

srikanthguptha

Firefox and Edge both allow you to make cross-origin requests for local files; Chrome doesn't.

robertaradi

I created a small Flask app in 10 lines of code to host the model; I think that is the best way to go.
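
For anyone curious, a rough sketch of what such a Flask app could look like; the routes and file names here are assumptions, not the commenter's actual code:

# serve index.html and the exported tfjs model files with Flask
from flask import Flask, send_from_directory

app = Flask(__name__)

@app.route('/')
def index():
    return send_from_directory('.', 'index.html')

@app.route('/model/<path:filename>')
def model_files(filename):
    # serves model.json and the binary weight shards from ./model
    return send_from_directory('model', filename)

if __name__ == '__main__':
    app.run(port=8080)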

tomhense

Can't you just use XAMPP for hosting the model, or is that not what you are going for?

nykachuu

I think Keras disables dropout for testing automatically. But does tfjs do the same? Better set K.set_learning_phase(0) before exporting, right? I think you can also add a hash check for external resources: buzzword "Subresource Integrity".
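
A hedged sketch of that export step, assuming a trained model saved as 'model.h5' and the older Keras backend API the comment refers to (K.set_learning_phase is deprecated in newer TF 2.x releases):

# force inference mode before loading/exporting so Dropout is inactive
from tensorflow.keras import backend as K
from tensorflow.keras.models import load_model
import tensorflowjs as tfjs

K.set_learning_phase(0)                                # 0 = test/inference phase
model = load_model('model.h5')                         # assumed path to the trained model
tfjs.converters.save_keras_model(model, 'tfjs_model')  # assumed output directory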

Alexnr

Are you going to continue this series?

chases

When I try to make predictions with my model, I get the error Uncaught TypeError: Cannot read property 'length' of undefined. Does anyone have a solution to this?

bbjjooddhh

You can host a simple HTTP server using Python, and that will allow you to make those kinds of requests.

scootscootk