Run Llama 3.1 in Python by Using Ollama in Linux Ubuntu on Local Computer

#llama31 #ollama #llama #ubuntu #linux #python #llm #machinelearning #ai #aleksandarhaber #meta #intel
It takes a significant amount of time and energy to create these free video tutorials. You can support my efforts in this way:
- You can also press the Thanks YouTube Dollar button

In this tutorial, we explain how to run Ollama and the Llama 3.1 Large Language Model (LLM) on a local computer from Python on Ubuntu Linux. Ollama is an interface and platform for running different LLMs on local computers. Llama 3.1 is Meta's (formerly Facebook) most powerful LLM to date. We will call Llama 3.1 by using Ollama's Python library. After the response is generated in Python, we will save it in a text file so that you can use the generated text for other purposes.

The procedure is:

1.) Install Ollama and download the Llama 3.1 model from the Ollama website
2.) Create a workspace folder, create a Python virtual environment, and install the Ollama Python library
3.) Write Python code that calls Llama 3.1 through the Ollama library and saves the response in a text file (a minimal sketch is given below).
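
For reference, here is a minimal sketch of step 3. It assumes that Ollama is already installed and running, that the model has been pulled (for example with "ollama pull llama3.1"), and that the ollama Python package has been installed in the virtual environment ("pip install ollama"). The prompt text and the output file name response.txt are illustrative choices, not fixed by the tutorial:

# Minimal sketch: query the llama3.1 model through the Ollama Python library
# and save the generated answer to a text file.
import ollama

# Illustrative prompt; replace it with your own question.
prompt = "Explain, in a few sentences, what a large language model is."

# Send a single-turn chat request to the locally running Ollama server.
response = ollama.chat(
    model="llama3.1",
    messages=[{"role": "user", "content": prompt}],
)

# Extract the generated text from the response.
generated_text = response["message"]["content"]

# Save the response to a text file so it can be reused later.
with open("response.txt", "w", encoding="utf-8") as f:
    f.write(generated_text)

print(generated_text)
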
Comments

I'm already using Ollama and I'm very happy with the 70B model.

sandeepnaik

Thank you for sharing this information!
I have a question regarding another video in which you used pytorch, transformers, and accelerate. Is there any reason why they were not used in this example? Is there any difference besides the prompt used?
Thank you once again!

fabiopguerreiro