Ryan Marten - Vertical Federated Learning

Vector interns are curious students and researchers affiliated with the Vector Institute who collaborate with the best and brightest AI developers to scale research breakthroughs in Machine Learning and Deep Learning.
Below is one of many Internship Talks, in which interns presented their projects after completing their work term. Projects span sectors including Project Management, Automation, Applied Machine Learning, and Data Visualization.
PETs (Privacy Enhancing Techniques) include federated learning, differential privacy, and homomorphic encryption. Ryan’s talk focuses on Vertical Federated Learning as a Privacy Enhancing Technique.
There is a large amount of data stored by individual organizations which, if used collectively to train machine learning models, could benefit consumers or the general public. For example, if a model could be developed using patient data from all hospitals across Canada, we could understand the health situation of Canadians and allocate resources and funding, both in research and in hospitals, accordingly. Such models could help hospitals across Canada operate more efficiently and advance research where it is most needed. However, organizations such as hospitals are often unable to share data due to privacy constraints. The question, then, is: how can we train a machine learning model that leverages data stored across organizations without creating privacy risks? In this work, we showcase how privacy-preserving machine learning technologies (e.g., differential privacy, federated learning) enable the training of such models.
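To make the setting concrete, here is a minimal sketch (not taken from the talk) of the idea behind vertical federated learning: two parties hold different feature columns for the same patients, keep their raw data and model weights local, and exchange only an intermediate score and its gradient. The toy data, the two-party split, and the plain logistic-regression model are illustrative assumptions, not the approach presented in the video.

# Minimal vertical federated learning sketch (illustrative assumptions only):
# two parties hold different feature columns for the same samples; raw
# features never leave a party, only partial scores and gradients do.
import numpy as np

rng = np.random.default_rng(0)
n = 200
X_a = rng.normal(size=(n, 3))   # party A's feature columns
X_b = rng.normal(size=(n, 2))   # party B's feature columns
y = (X_a[:, 0] + X_b[:, 1] > 0).astype(float)  # labels held by the coordinator

w_a = np.zeros(3)               # party A's local weights
w_b = np.zeros(2)               # party B's local weights
lr = 0.1

for step in range(500):
    # Each party computes a partial score on its own features and shares only that.
    z = X_a @ w_a + X_b @ w_b
    p = 1.0 / (1.0 + np.exp(-z))     # coordinator's logistic prediction
    grad_z = (p - y) / n             # gradient w.r.t. the shared score
    # The coordinator sends grad_z back; each party updates its own weights locally.
    w_a -= lr * X_a.T @ grad_z
    w_b -= lr * X_b.T @ grad_z

pred = (1.0 / (1.0 + np.exp(-(X_a @ w_a + X_b @ w_b)))) > 0.5
print("training accuracy:", (pred.astype(float) == y).mean())

In this sketch only the score z and the gradient grad_z cross party boundaries; in practice these exchanges would be further protected with techniques such as those named above (e.g., differential privacy or homomorphic encryption).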