Building models with tf.text (TF World '19)


Presented by: Robby Neale

Comments

Throughout the BERT GitHub repo, a lot of people asked to have this within TF as part of the API. It's finally here. I was one of the first people to write a preprocessor for fine-tuning BERT, raised this question early on, and posted a PR on the GitHub issue. Glad to see this happening. It's been a year since Google Research released BERT.

jugsma

This is something we've waited a really long time for.

aiwithr

Listen around the 30 minute 45 second mark (near the end of the main presentation). Hold up... wait a minute. So for non-text sequence data such as numeric time series, tensorflow_text is the right thing to use because it contains the ragged tensor ops? Then why name it tensorflow_text? Did you think this through? This is a lot like the Keras attention layer, which is useful for sequences in general but which Google apparently wrote and documented as if it were only for text sequences and encoder-decoder text models. Google's product managers should generalize their thinking -- and their software tools -- a little more.

geoffreyanderson
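
To make the ragged-tensor point above concrete, here is a minimal sketch, assuming TF 2.x with the tensorflow_text package installed. One nuance: tf.RaggedTensor itself lives in core TensorFlow (tf.ragged), so non-text variable-length sequences don't strictly require tensorflow_text; what the library adds are text ops, such as tokenizers, that emit ragged output.

```python
import tensorflow as tf
import tensorflow_text as tf_text

# Variable-length numeric sequences work with core TF ragged tensors
# directly -- no tensorflow_text needed.
series = tf.ragged.constant([[1.0, 2.0, 3.0], [4.0], [5.0, 6.0]])
print(series.row_lengths())  # [3, 1, 2]

# tensorflow_text's tokenizers return ragged tensors for text input,
# one row of tokens per input string.
tokenizer = tf_text.WhitespaceTokenizer()
tokens = tokenizer.tokenize(["hello world", "tf text"])
print(tokens)  # <tf.RaggedTensor [[b'hello', b'world'], [b'tf', b'text']]>
```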

Are preprocessing functions such as bert_preprocess() built in and run automatically, or do we still write them ourselves? I wasn't sure whether you were showing the code for what happens behind the scenes or examples of what an engineer might write.

jeffreydeason
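
For context on the question above, here is a hypothetical sketch of what an engineer-written preprocessing function could look like using tensorflow_text's BertTokenizer. The name bert_preprocess comes from the talk; the body and the vocab_file parameter are assumptions, not the talk's actual code.

```python
import tensorflow_text as tf_text

def bert_preprocess(texts, vocab_file, max_seq_len=128):
    """Tokenize raw strings into truncated wordpiece ids for BERT.

    vocab_file: path to a BERT wordpiece vocabulary (assumed input).
    """
    tokenizer = tf_text.BertTokenizer(vocab_file, lower_case=True)
    # tokenize() returns a RaggedTensor of shape [batch, words, wordpieces];
    # merge the word and wordpiece axes into a single token axis per example.
    token_ids = tokenizer.tokenize(texts).merge_dims(-2, -1)
    # Truncate each example to the model's maximum sequence length.
    return token_ids[:, :max_seq_len]
```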