Lecture 14B: Modern Private ML - Differentially Private Stochastic Gradient Descent

Comments

Amazing! I have a simple question:
Why can we use the composition lemma to bound the total privacy loss? The iterations are not independent of each other.

ufsbdubf
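Independence is not actually required here: the DP composition theorems cover *adaptive* composition, where step t may depend on the outputs of steps 1..t-1, which is exactly the situation in SGD. A minimal accounting sketch, with hypothetical per-step parameters `eps_step`/`delta_step`:

```python
import math

def basic_composition(eps_step: float, delta_step: float, num_steps: int):
    """(eps, delta)-DP of num_steps adaptively composed
    (eps_step, delta_step)-DP mechanisms under basic composition."""
    return num_steps * eps_step, num_steps * delta_step

def advanced_composition(eps_step: float, delta_step: float,
                         num_steps: int, delta_prime: float):
    """Tighter advanced-composition bound (Dwork-Rothblum-Vadhan),
    at the cost of an extra additive delta_prime."""
    eps = (math.sqrt(2 * num_steps * math.log(1 / delta_prime)) * eps_step
           + num_steps * eps_step * (math.exp(eps_step) - 1))
    return eps, num_steps * delta_step + delta_prime
```

For many small steps the advanced bound grows like sqrt(T) rather than T, which is why it (and the moments accountant in the DP-SGD paper) matters for training.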

Hi, thank you for publishing these lectures (non-privately :) ), super helpful! I had a query regarding gradient perturbation techniques. I see that adding noise to the gradient at every iteration privatizes the model, but why would I be required to do so unless I want to publish each of my gradients? Suppose I just want to publish my model, and say I'm using cross-entropy loss with neural networks, so convexity doesn't hold and output perturbation is non-trivial. Still, can't I just add noise to the final (or final few) gradients in the training procedure and go on to publish the model? I'm guessing I'm overlooking something here: intuitively, my final iteration is likely a function of only a small fraction of the data, and while DP is invariant to post-processing, post-processing surely can't undo the capture of sensitive information earlier in training. But I can't quite formalise why noising only the last steps wouldn't be sufficient. Apologies in advance if this is really silly.

dronakhurana
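For context, the DP-SGD analysis gets its guarantee by making *every* data access private and composing over iterations; an unnoised early step has no per-step guarantee to compose, and the final weights depend on those steps. A minimal NumPy sketch of the per-iteration update from Abadi et al., with placeholder clip norm `C`, noise multiplier `sigma`, and learning rate `lr`:

```python
import numpy as np

def dpsgd_step(w, per_example_grads, C, sigma, lr, rng):
    """One DP-SGD update: clip each example's gradient to L2 norm C,
    sum, add Gaussian noise calibrated to sensitivity C, then average."""
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    clipped = per_example_grads / np.maximum(1.0, norms / C)
    L = per_example_grads.shape[0]  # lot (batch) size
    noisy_sum = clipped.sum(axis=0) + rng.normal(0.0, sigma * C, size=w.shape)
    return w - lr * noisy_sum / L
```

Because this step is applied at every iteration, each access to the data is itself DP, and the accountant composes those guarantees into one for the released model.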

This is amazing. I have a simple question on the sensitivity of the DP-SGD method.
Why is the L2 sensitivity for adding noise C? Shouldn't it be sqrt(2C^2) for neighboring datasets? Why was the factor of 2 removed? This is with reference to the Deep Learning with Differential Privacy paper referenced in your CS860 course.
Thanks

iyiolaolatunji
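One way to see where the factor goes: under the add/remove-one neighboring relation used in the DP-SGD paper, the two datasets differ by the presence of a single example, so the clipped gradient sums differ by exactly that example's clipped gradient, whose norm is at most C. (A replace-one relation would instead give up to 2C.) A small numerical sketch of this bound, with arbitrary placeholder gradients:

```python
import numpy as np

def clip(g, C):
    """Scale g down so its L2 norm is at most C."""
    return g / max(1.0, np.linalg.norm(g) / C)

rng = np.random.default_rng(0)
C = 1.0
grads = rng.normal(size=(8, 5)) * 3.0            # 8 per-example gradients
clipped = np.array([clip(g, C) for g in grads])
full_sum = clipped.sum(axis=0)
neighbor_sum = clipped[1:].sum(axis=0)           # neighbor: example 0 removed
diff = np.linalg.norm(full_sum - neighbor_sum)   # L2 distance between sums
```

Here `diff` equals the norm of the removed example's clipped gradient, so it never exceeds C.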

I was wondering whether gradient perturbation carries an underlying assumption that having the gradient is equivalent to seeing the data. I don't see anyone talk about this. My guess is that if the gradient cannot fully represent the data, there is already some inherent privacy, i.e. releasing it is not blatantly non-private.

junyizhu

Is it equivalent to adding N(0, sigma/L) to each individual gradient i in the lot?

GundamCipher
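Not quite with that scale, if I read the bookkeeping right: for the sum of L independent per-example noise draws to match a single N(0, (sigma*C)^2) draw added to the summed gradient, the *variance* per example must be (sigma*C)^2 / L, i.e. standard deviation sigma*C/sqrt(L), not sigma/L. A quick Monte Carlo sketch of that equivalence, with placeholder values for sigma, C, and the lot size L:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, C, L, n = 2.0, 1.0, 16, 100_000

# Option A: one Gaussian draw added to the whole summed gradient.
sum_noise = rng.normal(0.0, sigma * C, size=n)

# Option B: independent per-example noise with std sigma*C/sqrt(L),
# summed over the L examples in the lot.
per_example = rng.normal(0.0, sigma * C / np.sqrt(L), size=(n, L)).sum(axis=1)
```

Empirically the two empirical standard deviations agree (both are about sigma*C), confirming the variances, not the standard deviations, divide by L.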