Running vLLM on Akash

In this video, I discuss how vLLM can be used on the Akash platform to serve open-source LLMs for multi-user applications with concurrent requests. I highlight the benefits of vLLM for those looking for a high-throughput server implementation. I also show how to use vLLM with OpenWebUI, an open-source LLM user interface, and provide a step-by-step guide to deploying vLLM on Akash. Finally, I demonstrate how you can use vLLM with agent systems such as LangChain and CrewAI; our CrewAI agent example shows how you can generate an Akash deployment YAML in a Jupyter notebook.
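
The pattern the video demonstrates boils down to pointing an OpenAI-compatible client at the vLLM endpoint exposed by the Akash deployment. Below is a minimal sketch of that idea using LangChain's ChatOpenAI; the endpoint URL and model name are placeholders rather than the values used in the video, and the single call stands in for the fuller CrewAI agent setup shown in the notebook.

from langchain_openai import ChatOpenAI

# Minimal sketch: talk to a vLLM server through its OpenAI-compatible API.
# The base_url and model name are placeholders; use the URI of your own
# Akash lease and whatever model your vLLM deployment is serving.
llm = ChatOpenAI(
    base_url="http://YOUR-AKASH-DEPLOYMENT-URI/v1",  # placeholder endpoint
    api_key="EMPTY",  # vLLM ignores the key unless the server sets --api-key
    model="mistralai/Mistral-7B-Instruct-v0.2",  # placeholder model id
)

# Same idea as the CrewAI notebook demo, reduced to one call:
# ask the model to draft an Akash deployment YAML.
reply = llm.invoke("Draft a minimal Akash SDL deployment.yaml for a web service.")
print(reply.content)

OpenWebUI can be wired to the same /v1 endpoint by adding it as an OpenAI API connection in its settings, which is how the chat UI in the video talks to the deployed server.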
Comments

Thanks, great tutorial.
Could you help me with a similar scenario? I want to set up a test environment on my personal computer: Windows 11 as the host, Ubuntu on WSL, and the vLLM and OpenWebUI containers running in Docker inside WSL. Can you help me?

RaulDiasFotografo