What is Neural Search? Nils Reimers - Sentence Transformers and Embedding Evaluation

Sentence Transformers and Embedding Evaluation - Talking Language AI Ep#3 Full episode:

About The Speaker:
Nils is the creator of Sentence-BERT and of the popular Sentence Transformers library, and has authored several well-known research papers, including the Sentence-BERT paper. He's also worked as a Research Scientist at HuggingFace, (co-)founded several web companies, and worked as an AI consultant in the areas of investment banking, media, and IoT.

===

In our conversation, Nils gives us an introduction to the Sentence-BERT (Sentence Transformers) package and the pre-trained language models it provides. He also shares some lessons from his experience developing such a popular package in the open source. Finally, Nils touches on his research collaborations on evaluating embeddings through works like MTEB (Massive Text Embedding Benchmark) and BEIR.
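
To make the package concrete, here is a minimal sketch of encoding sentences and comparing them with cosine similarity. The model name "all-MiniLM-L6-v2" and the example sentences are illustrative choices, not ones discussed in the episode.

```python
# Minimal Sentence Transformers sketch (illustrative, not from the episode).
from sentence_transformers import SentenceTransformer, util

# Load a pre-trained sentence embedding model (assumed example checkpoint).
model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "What is neural search?",
    "Neural search ranks documents with learned embeddings.",
    "The weather is nice today.",
]

# Encode each sentence into a dense vector (embedding).
embeddings = model.encode(sentences, convert_to_tensor=True)

# Cosine similarity between the first sentence (query) and the rest.
scores = util.cos_sim(embeddings[0], embeddings[1:])
print(scores)  # higher score = semantically closer to the query
```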

To go deeper into these tools and other concepts around embeddings, watch the video and join the conversation on Discord. Stay tuned for more episodes in our Talking Language AI series!

===

Discussion thread for this episode (feel free to ask questions):

===

Resources:

Comments

Thanks for a great overview of how everybody can train a search model for their own data! One question: is there any difference between taking the mean of the token embeddings, the CLS token of BERT, or the mean of the hidden-state representations?

natalyfoucault
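
A minimal sketch contrasting the two pooling strategies the question above asks about, using plain Hugging Face Transformers with "bert-base-uncased" as an assumed, illustrative model. It shows CLS pooling versus mean pooling over the last hidden state; it is not an answer from the episode.

```python
# Illustrative comparison of CLS pooling vs. mean pooling (assumed model choice).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer(["Neural search with sentence embeddings"], return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # shape: (batch, seq_len, hidden_dim)

# CLS pooling: take the representation of the first ([CLS]) token.
cls_embedding = hidden[:, 0]

# Mean pooling: average all token representations, masking out padding tokens.
mask = inputs["attention_mask"].unsqueeze(-1).float()
mean_embedding = (hidden * mask).sum(dim=1) / mask.sum(dim=1)

print(cls_embedding.shape, mean_embedding.shape)  # both (1, 768) for BERT-base
```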