F23 Lecture 17: Recurrent Networks, Modeling Language, Sequence-to-Sequence Models


Comments
oChimychanga:
This is a great explanation of seq-to-seq and attention models -- thank you for sharing!
widipersadha:
Thank you for putting this lecture online! This lecture should be number 18 😊
ColtonLapp:
This lecture should come after lecture 23, i.e., the videos labeled "18" should come before "17".