Understanding the Top_p and Temperature Parameters of LLMs

In this video, we explore how the top_p and temperature parameters of large language models work. By adjusting these parameters, we can tune a model's output to better suit specific use cases.
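To make the temperature parameter concrete, here is a minimal sketch (toy vocabulary and made-up logit values, not from the video): temperature divides the logits before the softmax, so low values sharpen the distribution and high values flatten it.

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert raw logits to probabilities, dividing by temperature first.

    Lower temperature -> sharper (more deterministic) distribution;
    higher temperature -> flatter (more random) distribution.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Toy logits for a 4-token vocabulary (made-up numbers).
logits = [2.0, 1.0, 0.5, 0.1]

sharp = softmax_with_temperature(logits, temperature=0.5)
flat = softmax_with_temperature(logits, temperature=2.0)

# The top token's probability grows as temperature drops,
# so sampling becomes more deterministic.
print(sharp[0], flat[0])
```

This is only an illustration of the scaling step; real decoders apply it to the full vocabulary at every generated token.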

#chatgpt #chatgptapi #parameter #languagemodels #llm

Timestamps
0:00 Introduction to Parameters
1:20 Effect of Top_p and Temperature for LLM
3:08 Concept behind temp and top_p
6:07 Example use cases

Comments

Difficult topic to understand overall, but your vid has brought me within a higher Probability of one day understanding fully ;) Thanks 🙏

BabylonBaller

Does a high top_p value increase the response time?

mpaforoufakis

Thanks a lot.
What would the result of temp=1 and top_p=0 be?

amortalbeing

This is an incorrect way of explaining temperature and top-p: temperature is applied in the softmax computation itself, and top-p and top-k are then applied on top of that!!

OmkarRahane-shvu
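The ordering described in the last comment can be sketched as follows (toy vocabulary and made-up logits, plain Python): temperature is applied inside the softmax first, and nucleus (top-p) filtering then keeps the smallest set of tokens whose cumulative probability reaches top_p.

```python
import math

def nucleus_distribution(logits, temperature, top_p):
    """Temperature-scaled softmax followed by nucleus (top-p) filtering."""
    # 1. Temperature is applied to the logits inside the softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # 2. Top-p then keeps the smallest set of highest-probability tokens
    #    whose cumulative probability reaches top_p, and renormalizes.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cumulative = [], 0.0
    for i in order:
        kept.append(i)
        cumulative += probs[i]
        if cumulative >= top_p:
            break
    kept_total = sum(probs[i] for i in kept)
    return {i: probs[i] / kept_total for i in kept}

# Toy 4-token vocabulary (made-up logits): the lowest-probability
# token falls outside the nucleus and is never sampled.
dist = nucleus_distribution([2.0, 1.0, 0.5, 0.1], temperature=1.0, top_p=0.9)
print(dist)
```

Sampling then draws from `dist` instead of the full softmax output, which is why top-p trims the long tail while temperature reshapes the whole distribution.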