Differential Privacy + Federated Learning Explained (+ Tutorial) | #AI101



Thank you to Kasia, Jeff, Gerald, Milan, Ian, Becky, Jino, Daniel, Narskogr, Jason, and Mariano for being $5+/month Patrons!

Sources:


Comments

A note on the Federated Learning example in the Colab tutorial (also pointed out in the OpenMined tutorial) - this is meant to show how the learning process works when you're pulling from separate datasets, but this process actually doesn't ensure privacy! You can call model.get() to learn how to predict well on Alice's data without having seen it, which can reveal information about the dataset itself, even potentially letting you replicate it perfectly. One way to avoid this is to average Bob's and Alice's model updates before sending them to the global model, as we talked about in the video! Thanks to u/raj111sam for pointing this out on the r/artificial subreddit.
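The averaging fix described above can be sketched in plain Python. This is a toy illustration, not the tutorial's PySyft code: the flat list-of-floats "weights" and the names average_updates, apply_update, alice_update, and bob_update are all made up for the example.

```python
# Minimal sketch of the averaging fix, assuming each client has already
# computed a gradient-style update on its own private data.
# (All names here are illustrative, not the OpenMined tutorial's API.)

def average_updates(updates):
    """Average per-parameter updates across clients, so the server
    only ever sees the aggregate, never one client's raw update."""
    n = len(updates)
    return [sum(vals) / n for vals in zip(*updates)]

def apply_update(weights, update, lr=0.5):
    """One gradient-descent step on the global model."""
    return [w - lr * g for w, g in zip(weights, update)]

global_weights = [0.0, 0.0]
alice_update = [1.0, 2.0]  # computed on Alice's private data
bob_update = [3.0, 4.0]    # computed on Bob's private data

avg = average_updates([alice_update, bob_update])   # -> [2.0, 3.0]
global_weights = apply_update(global_weights, avg)  # -> [-1.0, -1.5]
```

Because only the average ever reaches the server, calling model.get() on the global model no longer singles out what was learned from Alice's data alone.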

JordanHarrod

Hey Jordan :) I am a postdoc researcher based in Japan, but originally from Brazil. I started working with federated learning this month, and your video really helped me understand this topic better. I'm going to check out your tutorial now :) Thanks a lot!


I was just wondering about the difference between differential privacy and federated learning and this video showed up just in time to explain exactly that. What a gem! Thank you so much.

pinklemonpurplerose

Thanks for taking the time to put together this well explained video. It’s great that you added a notebook for exploring more deeply what you explained in the video. Nice work!

SamuelGuebo

Hey Jordan, I'm new to federated learning, and your video was really nice!

zhipingliu

@JordanHarrod Thank you for this lovely video. Is there a video that explains the code you shared in Colab?

taofikolajobi

Seeing 0:50, I know this video is definitely worth a thumbs up.

thmingus

Great job. Please do a video on Continual Learning + Differential Privacy.

anand

Great video. So happy I subscribed. You always make something interesting and informative 🤘🏾🤘🏾

markeliupson

Would definitely like to see more coding examples.

rajan_

This was very informative, Jordan. Thanks for the great explanation!

_psyguy

Nice explanation. Just wanted to point out that differential privacy can also be achieved by adding noise to a model or its gradients, rather than to the data directly.
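The gradient-level approach this comment mentions (the idea behind DP-SGD) can be sketched in plain Python: clip each per-example gradient to a norm bound, average, then add Gaussian noise. The function names are hypothetical, and clip_norm and sigma are illustrative values, not calibrated to any formal (epsilon, delta) guarantee.

```python
import random

# Toy sketch of gradient-level differential privacy: clipping bounds
# any single example's influence on the update, and Gaussian noise
# masks what remains. Values here are illustrative only.

def l2_norm(g):
    return sum(x * x for x in g) ** 0.5

def clip(g, clip_norm):
    """Scale g down so its L2 norm is at most clip_norm."""
    norm = l2_norm(g)
    if norm > clip_norm:
        return [x * clip_norm / norm for x in g]
    return list(g)

def noisy_gradient(per_example_grads, clip_norm=1.0, sigma=0.5):
    """Clip each example's gradient, average them, add Gaussian noise."""
    clipped = [clip(g, clip_norm) for g in per_example_grads]
    n = len(clipped)
    avg = [sum(vals) / n for vals in zip(*clipped)]
    return [a + random.gauss(0.0, sigma * clip_norm / n) for a in avg]

grads = [[0.3, 0.4], [3.0, 4.0]]      # the second gradient has norm 5
private_grad = noisy_gradient(grads)  # each example's influence is bounded
```

The model then trains on private_grad instead of the raw average, so the data itself never needs to be perturbed.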

jg

Dear Jordan, Thanks so much for this video and link. I was looking for this kind of information. Thanks a lot again.

patricioa.galdamess.

Thanks! I’m doing a short presentation about these topics for my data science boot camp

clairenoms

3:07 - It's not just medical data either; with enough data points, you can de-anonymize anything. That's how Panopticlick works.

vnceigz

awesome, here for the coding walkthrough

israel_abebe

Short video but informative. Thanks for sharing.

xenyamike

Great video
Please do more or an update on privacy protection

N-HTTi

4:21 - Would be very interested in an analysis of Whoop and your thoughts on it!

ciferkey

Good one to start with. Could you add a tutorial or reference for private-network setup or remote-worker setup using torch and pysyft?

aadarshchoudhary