Federated Learning Demo Using the Flower Framework | Python, TensorFlow 2.0

In this video, I take you through a brief explanation of how Federated Learning works and introduce you to one of the Python frameworks used to implement it. I use TensorFlow 2.0 to build the models and MNIST as the dataset.
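
As a rough illustration of the aggregation step at the heart of Federated Learning (this is a sketch, not the video's actual code), the server typically combines client weight updates with federated averaging, weighting each client by how many training examples it used:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Weighted average of client model weights (FedAvg sketch).

    client_weights: list of per-client weight lists (one np.ndarray per layer)
    client_sizes:   number of training examples each client used
    """
    total = sum(client_sizes)
    num_layers = len(client_weights[0])
    averaged = []
    for layer in range(num_layers):
        # Weight each client's layer by its share of the total data.
        layer_sum = sum(
            w[layer] * (n / total)
            for w, n in zip(client_weights, client_sizes)
        )
        averaged.append(layer_sum)
    return averaged

# Two toy "clients" with single-layer models.
client_a = [np.array([1.0, 1.0])]
client_b = [np.array([3.0, 3.0])]
avg = fedavg([client_a, client_b], client_sizes=[1, 3])
print(avg[0])  # [2.5 2.5] -- client_b counts 3x as much
```

In Flower, a built-in strategy performs this aggregation on the server, so you normally do not write it yourself.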

Flower is a Python framework for implementing Federated Learning. With just a few lines of code, you can convert your ordinary ML/DL code to a federated architecture. Best part: there is no need to learn a new deep learning framework (if you already know one, that is), since Flower works with all the major deep learning frameworks (as long as the model weights can be extracted as NumPy arrays).
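
To show the "few lines of code" shape without pulling in TensorFlow, here is a toy stand-in for a Flower client. The method names (`get_parameters`, `fit`, `evaluate`) follow Flower's `NumPyClient` interface, but the class, the fake training step, and the data are hypothetical placeholders; a real client would subclass `flwr.client.NumPyClient` and wrap an actual Keras model:

```python
import numpy as np

class ToyClient:
    """Stand-in mirroring Flower's NumPyClient interface (sketch only)."""

    def __init__(self, data):
        self.weights = [np.zeros(2)]  # toy "model": one weight array
        self.data = data

    def get_parameters(self, config=None):
        # The server calls this to fetch the current local weights.
        return self.weights

    def fit(self, parameters, config=None):
        # Receive global weights, "train" locally, return the update
        # plus the number of local examples (used for weighted averaging).
        self.weights = [w + self.data.mean() for w in parameters]  # fake step
        return self.weights, len(self.data), {}

    def evaluate(self, parameters, config=None):
        # Return (loss, num_examples, metrics) measured on local data.
        loss = float(np.abs(parameters[0] - self.data.mean()).sum())
        return loss, len(self.data), {}

client = ToyClient(data=np.array([1.0, 2.0, 3.0]))
new_weights, n, _ = client.fit([np.zeros(2)])
print(new_weights[0], n)  # [2. 2.] 3

# With the real library, you would connect to a running Flower server
# with something like:
# fl.client.start_numpy_client(server_address="127.0.0.1:8080",
#                              client=MnistClient())
```

The point of the interface is exactly what the description claims: your existing training and evaluation code slots into `fit` and `evaluate` unchanged, and Flower handles the networking and aggregation.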

Comments

You explained it so well and clearly. Please make more videos with practical implementations on edge devices.

AnamKhan-frif

Excellent video on federated learning, more accurate than all the videos I've watched so far.

amirtemimi

Thanks, finally a clear explanation with flwr code.

muntasir_fahim

Dear Garai,
This is exactly what I was looking for.
Thanks for your effort.
Best regards...

omeremhan

Hey Prateik, just want to thank you for this video. I have to submit a report on federated learning and its practical implementation, and this video is going to help me a lot.

MeraAashiyan

Thank you for the great description. It was helpful and clear; please keep recording videos about federated learning.

mozhganrahmatinia

I just came across your video with its clear and vivid explanation. I would like to contact you. Thanks

loveahakonye

Well articulated. A custom strategy implementation would be helpful; maybe you can cover it in the next video.

KiranKawalli

Subscribed! Great video, keep making content like this, man.

sailfromsurigao

Sir, could you please tell me how to use the aggregated weights? I want to use them so that my model can predict on new inputs. Please help me.

vivianpaul

Thanks a lot. Can you tell me how to load the saved weights from the .npz file to use with my model?

shivamsrivastava

Hello, we are currently working on federated learning, but we want to classify mobile packets, so we are using the CSE-CIC-IDS dataset. How do we load that dataset into this code? Could you please guide us?

fypgrp

While running both clients, I am getting a graph for only the 0 class.

gayathrikamath

I want to use Federated Learning with one of my existing classification machine learning models, in which I used a Linear Regression model for classification. I have simply added that model to your code and run it, and I'm getting an error that LinearRegression has no .setWeight() method.

Since Flower uses .setWeight() and .getWeight() on each model, how can I use Linear Regression or other machine learning models with Federated Learning using Flower? Thanks a million in advance

sageraza

Hello, I am trying to load the dataset, but the count of classes is zero. Why is that? Can you help me?

rahmahermes

Hi, I am working on implementing Flower on Raspberry Pis as clients for object detection with the CIFAR-10 dataset, and I am facing some issues. Have you tried this approach?

poojithist

Hi, my FYP is mobile packet classification using federated learning... can I get your contact for help?

adithya

How do I save the model weights once training has finished?

LearnAiWithVikas

I have a doubt regarding Federated Learning. Suppose we have two remote workers, x and y. Does the server send the whole model to both x and y at the same time, so that both train locally in parallel and send their updates back for the server to aggregate? Or does the server first send the model to x, x trains and sends its updates to the server, then the server sends those updates to y, y trains and sends its updates back, and so on? Please clear this doubt for me.

abinayamani

Instead of loading the MNIST dataset, is it possible to load other datasets, like a pneumonia dataset? If so, what should we change here?

abinayamani