Deep Learning on a Xilinx FPGA with MATLAB Code

FPGA-based hardware is a good fit for deep learning inferencing on embedded devices because it delivers low latency and low power consumption. Early prototyping is essential to developing a deep learning network that can be efficiently deployed to an FPGA.

See how Deep Learning HDL Toolbox™ automates FPGA prototyping of deep learning networks directly from MATLAB®. With a few lines of MATLAB code, you can deploy to and run inferencing on a Xilinx® ZCU102 FPGA board. This direct connection allows you to run deep learning inferencing on the FPGA as part of your application in MATLAB, so you can converge more quickly on a network that meets your system requirements.
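For reference, a minimal sketch of that workflow in MATLAB, assuming a pretrained network already loaded in a variable net, an input image img, and a ZCU102 reachable over Ethernet (the bitstream and interface names shown are examples and depend on your board and setup):

% Connect to the Xilinx board over Ethernet.
hTarget = dlhdl.Target('Xilinx', 'Interface', 'Ethernet');

% Pair the network with a prebuilt single-precision ZCU102 bitstream
% and the target connection.
hW = dlhdl.Workflow('Network', net, ...
                    'Bitstream', 'zcu102_single', ...
                    'Target', hTarget);

% Compile the network into instructions and weights for the FPGA,
% program the board, then run inference from MATLAB.
hW.compile;
hW.deploy;
[prediction, speed] = hW.predict(img, 'Profile', 'on');

With profiling turned on, predict also reports per-layer latency, which helps when iterating toward a network that meets system requirements.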
--------------------------------------------------------------------------------------------------------

© 2020 The MathWorks, Inc. MATLAB and Simulink are registered trademarks of The MathWorks, Inc.
Comments

Can we also deploy to the ZCU104, other Zynq-based boards like the PYNQ Z2, or Artix-based boards like the Nexys Video/A7?

thanatosor

Does this toolbox support other FPGAs as well, such as the Virtex-6 SX475? Thank you!

George-dojc

Isn't 8 fps too slow for a Xilinx FPGA? I'm unsure whether to use a TI DSP or an FPGA for my deep learning project.

samaTV