Few-Shot Learning (2/3): Siamese Networks


This lecture introduces the Siamese network. It learns a similarity (or distance) function in the feature space and can thereby solve few-shot learning.
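
To make the idea concrete, here is a minimal numpy sketch of how a Siamese embedding is used at few-shot test time. A real system would use a CNN embedding trained with a pairwise or triplet loss; the fixed random linear map, the `embed`/`classify` helpers, and the "cat"/"dog" support set below are all hypothetical stand-ins for illustration, not the lecture's actual implementation.

```python
import numpy as np

# Sketch only: a fixed random linear map stands in for the trained CNN
# embedding f(x) that a real Siamese network would learn.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 16))  # maps 16-dim inputs to 4-dim features

def embed(x):
    """Map an input vector into the feature space."""
    return W @ x

def classify(query, support_set):
    """Predict the label of the support example closest in feature space."""
    q = embed(query)
    dists = {label: np.linalg.norm(q - embed(x))
             for label, x in support_set.items()}
    return min(dists, key=dists.get)

# One-shot support set: one (hypothetical) example per class.
support = {"cat": rng.standard_normal(16), "dog": rng.standard_normal(16)}
print(classify(support["cat"], support))  # prints "cat" (zero distance to its own support example)
```

The key point the lecture makes is visible here: the classes in the support set never appear during embedding training, so new classes can be recognized at test time just by adding one labeled example per class to the support set.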

Lectures on few-shot learning:
Comments

This is hands down the best explanation of Siamese networks on YouTube

haroon

Please upload more of these English lectures sir! Best content ever! I'm not bored listening to your careful explanations!

hp

Mind-blowing and very-well explained. This video succeeds in giving us the intuitive aha moment when you finally understand what few-shot is and how Siamese networks are used for that! Thank you.

karanacharya

After reading dozens of papers (including the original ones) this is the place where I got my understanding of Siamese clear. Thanks.

prasadjayanti

Holy shit, I don't know why other articles are a bit harder to understand, but this explained it very well. Thanks a lot!

sanketgadge

Best description of Siamese networks. Can you also make a video on MAML?

antulii

Best tutorial I have ever seen, much better than those technical articles or academic theses that are full of mathematical symbols and formulas.

loveplay

Hands down the best tutorial on Siamese Networks!

subhrajitbhowmik

Presentation is very well prepared graphically. Simple and with pauses. It looks easy, but it's not. Thank you, Shusen Wang, 🙏

Tiago_R_Ribeiro

Thank you, sir, for all the effort you put into this clear explanation. It helped me a lot in understanding Siamese networks.

jerbijmaziz

I'm a non-English speaker, but I understand everything.

wooheonhong

Thank you, a very clear explanation. You explained it so well, teacher! Thank you!

jaylenzhang

Your explanations are very easy to understand. Thank you!

gingerbrown

Clear and good explanation, good lecture, thanks

Fers-gy

Best lecture. Please keep posting. Best video ever.

larissabasso

Great explanation, thank you. I'm confused about the last example of classification and the support set. I was thinking that after training, the model should have a distance metric and make predictions for all classes provided during training.

eck

In practice, what mechanism would you use to generate the support set? I ask because let's say your support set contained a bunch of rodents so it might be hard to distinguish a squirrel, whereas you have another support set with a variety of objects including your support squirrel. Obviously, you now have a choice of two support sets where using one will be harder to correctly classify your squirrel. Do we include a metric in the loss that accounts for the distances between the support images? For example, we want to help out when our support images are more similar to one another, but we don't care when our support images are already pretty dissimilar.

davidlanday

I feel like an autoencoder could be used for the classification task and might work better, because an autoencoder maps the input into a latent space that captures the patterns.

gemini_

So the training set is much bigger than the support set, and I only use the support set to help classify the query images?

geoskyr

If you can provide the code for the implementation, that would be great.

bk