Neural Networks on FPGA: Part 3: Activation Functions

#NeuralNetwork #FPGA #Zynq #Verilog #ActivationFunction #Sigmoid #ReLU
Source Code
In this tutorial we discuss the hardware implementation of the Sigmoid and ReLU activation functions, the factors affecting their accuracy, and methods to minimize resource utilization.
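In hardware, the sigmoid is commonly implemented as a look-up table (LUT): the input range is quantized into 2**inWidth addresses and each entry stores the pre-computed, quantized sigmoid value. Here is a minimal software sketch of how such a table can be generated; the parameter names and the [-8, 8) input range are illustrative assumptions, not taken from the tutorial's source code:

```python
import math

def sigmoid_lut(in_width=5, out_width=8):
    """Generate LUT entries for a hardware sigmoid.

    The 2**in_width addresses are mapped linearly onto the real input
    range [-8, 8) (an illustrative choice; the range is a design
    parameter), and each sigmoid value is quantized to out_width bits.
    """
    depth = 2 ** in_width
    lut = []
    for addr in range(depth):
        x = -8.0 + 16.0 * addr / depth          # real-valued input for this address
        y = 1.0 / (1.0 + math.exp(-x))          # true sigmoid
        lut.append(round(y * (2 ** out_width - 1)))  # quantize to out_width bits
    return lut
```

The accuracy trade-off the video refers to is visible directly in the two parameters: a larger `in_width` reduces the quantization step of the input (more LUT entries, more memory), and a larger `out_width` reduces the rounding error of each stored value (wider memory words).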
Previous tutorials
Comments

Thank you. I want to implement the ASIC design for the whole code using OpenLane, but it has only 32 pins in total, along with 64 logic analyzer input pins. How do you suggest I approach this?

mdomarfaruque

Great series. Your teaching style is very good. Thanks for sharing.

Abdulbuzdar

In the Verilog code, where is inputIntWidth specified?
I wish to have inputIntWidth = 4, the same as weightIntWidth.
I also generated the inputs accordingly, but something is not working right.

matta

Sorry, I cannot work out what the input to the ReLU function is. For sigmoid we take the input from the sig file, but for ReLU we won't be using the same sig file as input to the MAC unit. I am still confused about the input to the MAC unit when we implement ReLU.

wajahatabbasi

In the previous lecture, when there was an overflow you assigned a 1 with the remaining bits all zeros. But here, in the ReLU implementation, you assigned a 0 with the remaining bits all ones. What's the difference?

veeraganesh
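The two overflow patterns in the question above are the two ends of signed two's-complement saturation: a 0 followed by all ones is the most positive representable value (e.g. 0111_1111 = +127 for 8 bits), while a 1 followed by all zeros is the most negative (1000_0000 = -128). A ReLU output can only overflow in the positive direction (negatives are clamped to zero), so only the positive saturation pattern appears there. A small software model of this behavior (function names are illustrative, not from the tutorial's source):

```python
def saturate_signed(value, width):
    """Clamp an integer to the range of a `width`-bit two's-complement word."""
    max_pos = 2 ** (width - 1) - 1   # 0 then all ones, e.g. 0111_1111 = +127 for width=8
    min_neg = -(2 ** (width - 1))    # 1 then all zeros, e.g. 1000_0000 = -128 for width=8
    return max(min_neg, min(max_pos, value))

def relu_saturated(value, width):
    """ReLU with saturation: negatives become 0, positive overflow clips to max_pos."""
    return max(0, saturate_signed(value, width))
```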

Sir, thanks for your video. I have a doubt about the way you converted the signed output of the MAC into an address for the LUT @19:27. How does adding a large positive number (2**(inWidth-1)) to a positive value (x), and vice versa with its negative counterpart, fit the address into the address range? Also, sir, do you have any video or reference where the control logic is explained? I have a hard time following how the data path and control logic fit together. Lastly, I have a board; how can I do a resource estimate to figure out if this design is likely to fit on it, in other words, what is the minimum resource requirement to accurately identify the handwritten digits with this design?
Thanks in advance

vanessaingrid
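On the addressing question above: adding 2**(inWidth-1) converts a signed two's-complement value into an unsigned "offset binary" address. The signed range [-2**(inWidth-1), 2**(inWidth-1)) shifts exactly onto [0, 2**inWidth), so every signed input lands on a valid LUT address; in hardware this amounts to simply inverting the sign bit (MSB). A sketch under that interpretation (the function name is illustrative):

```python
def signed_to_lut_address(x, in_width):
    """Map a signed value in [-2**(in_width-1), 2**(in_width-1)) to an
    unsigned LUT address in [0, 2**in_width) by adding the offset
    2**(in_width-1). Equivalent to flipping the MSB of the two's-complement word."""
    offset = 2 ** (in_width - 1)
    addr = x + offset
    assert 0 <= addr < 2 ** in_width, "input out of representable range"
    return addr
```

For example, with inWidth = 5 the most negative input -16 maps to address 0, input 0 maps to the middle address 16, and the most positive input +15 maps to the last address 31, which matches a LUT whose entries are ordered from the most negative to the most positive input.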

Hello, thanks for your videos. Please, I need some help with GPU applications running on the Zynq.

medhm

How many neurons have you designed here, sir?

varunl

Does weightIntSize include the sign bit? (I think not.)

jianingli

How many more videos are left in this series?

sudharsankannan