Training deep quantum neural networks


Here a natural quantum neural network architecture for fully quantum machine learning is proposed and an efficient training algorithm described.
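To give a flavor of what "training" means here, below is a toy sketch (my own illustration, not the paper's actual algorithm or architecture): a single "quantum neuron" modeled as a one-qubit RY(θ) rotation, trained by gradient descent on an infidelity cost against a target state. All names (`ry`, `fidelity`, `cost`) are made up for this example, and it uses plain Python rather than any quantum SDK.

```python
# Toy sketch: train a single-qubit RY(theta) rotation to match a
# target state by gradient descent on an infidelity cost.
# This is an illustration only, not the paper's training algorithm.
import math

def ry(theta, state):
    """Apply RY(theta) to a single-qubit state with real amplitudes (a, b)."""
    a, b = state
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return (c * a - s * b, s * a + c * b)

def fidelity(psi, phi):
    """|<psi|phi>|^2 for real amplitudes."""
    return (psi[0] * phi[0] + psi[1] * phi[1]) ** 2

# Hypothetical target: the state prepared by RY(1.2) acting on |0>.
target = ry(1.2, (1.0, 0.0))

def cost(theta):
    return 1.0 - fidelity(ry(theta, (1.0, 0.0)), target)

theta, lr, eps = 0.0, 0.5, 1e-5
for _ in range(200):
    # Finite-difference estimate of the cost gradient.
    grad = (cost(theta + eps) - cost(theta - eps)) / (2 * eps)
    theta -= lr * grad

print(round(theta, 3))  # converges toward the target angle 1.2
```

The fidelity-based cost mirrors the general idea of comparing the network's output state to a desired state; the real scheme in the paper operates on layers of multi-qubit unitaries rather than a single rotation.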
Comments

just the fact that you edited the video to have high quality slides is good, thumb up

ONDANOTA

About the gradient descent algorithm: you have to realize that as you modify the weights at the synaptic junctions, you are in fact modifying the surface as well. It's awesome to me how the algorithm, so often, still reaches the solution correctly.

hrivera

Such a good explanation I love it. Thanks for all your efforts. 😊

viddeshk

Very interesting talk! How are activation functions modelled in the case of a QNN? Without them the network will not learn any non-linear decision boundaries. Unless there is some other quantum non-linearity I'm not considering.

DevashishGuptaOfficial

An explicit comparison of QNN advantages over standard NN (if any) would have been VERY welcome.
Still Kudos for the presentation! 👍

robertovoce

Could you enable the automatic closed captions please?

FilippoTramonto

First of all, thank you very much; this is very interesting and I will definitely read the paper.
I have a question about non-linearity. The main strength of neural networks is the non-linear term (the activations), which gives them the ability to model non-linear functions. How is it introduced in a QNN? And if it isn't, doesn't that have a big impact? Is there an effort to implement it soon?
Thanks

בןויזל-כה

Really awesome presentation! Can you please share the slides?

ashharr

You could have enabled the subtitles section.

asifsaad

Firstly, thank you for this explanation. I see this as an extremely powerful algorithm. However, I'm stuck on a question. Could you explain how the "K" matrix was derived? You did mention it, but I wasn't able to comprehend it.

vvnkk

Thank you for the video. I read the paper and the supplementary material, and I have a question about future research. In the last part of the supplementary material the authors suggest a way to calculate the K matrix quickly using the method of QPCA. Is there a paper about it? Are the authors still working on this idea, or has it sadly failed? I would really appreciate a reply. Thank you.

godthinkun

How are the slides this cleanly displayed? Are they overlaid atop the recorded video? Wonderfully clear, as I've come to expect from the videos on this channel. Thank you Tobias and I look forward to doing our podcast together. - Curt EDIT: Just noticed it is indeed placed on the video since she disappears behind it when drinking water unless that's some quantum magic.

TheoriesofEverything

What is the dimension of the unitary U11 in this example? Could you please tell us?

vvnkk

First, thanks so much for this nice presentation and paper.
I have a question. In the classical training procedure, the weights are updated.
In the quantum version, I do not see any weights or couplings between the qubits of different layers. Are the couplings hidden in the unitaries?
Because, as far as I understand, the object being updated is the unitary operator. Then should the unitaries always be two-qubit operators?
Am I right?

yankiodamdasohbetler

❤️❤️❤️It's done. Awareness model based on pennylane and QNN strawberry pennylane. I did it/yariv hellden Sincirly wearables after Apple lens to lens data replace smartphones. It's all in the files

arasuper

Jesus, guys are already training quantum neural networks. We are so screwed.

dejiamoo