Running vLLM on Akash
In this video, I discuss how vLLM can be used on the Akash platform to serve open-source LLMs for multi-user applications with concurrent requests. I highlight the benefits of vLLM for those looking for a high-throughput server implementation. I show how to use vLLM with OpenWebUI, an open-source LLM user interface, and provide a step-by-step guide to deploying vLLM on Akash (see the deployment sketch below). I also demonstrate how vLLM can be used with agent frameworks such as LangChain and CrewAI; our CrewAI agent example shows how to generate an Akash deployment YAML in a Jupyter notebook.
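As a rough sketch of what such an Akash deployment YAML (SDL) might look like: the container image is vLLM's official OpenAI-compatible server image, while the model name, resource sizes, GPU model, and bid price are placeholders, not the exact values used in the video.

```yaml
---
version: "2.0"

services:
  vllm:
    image: vllm/vllm-openai:latest   # official vLLM OpenAI-compatible server image
    args:
      - --model
      - mistralai/Mistral-7B-Instruct-v0.2   # hypothetical model; substitute your own
    expose:
      - port: 8000        # vLLM's default API port
        as: 80
        to:
          - global: true

profiles:
  compute:
    vllm:
      resources:
        cpu:
          units: 4        # placeholder sizing
        memory:
          size: 16Gi
        storage:
          size: 50Gi      # needs to be large enough to hold the model weights
        gpu:
          units: 1
          attributes:
            vendor:
              nvidia:
                - model: a100   # placeholder GPU model
  placement:
    akash:
      pricing:
        vllm:
          denom: uakt
          amount: 10000   # placeholder bid price

deployment:
  vllm:
    akash:
      profile: vllm
      count: 1
```

Once a provider accepts the bid, Akash assigns a public URI to the lease, and that URI becomes the base URL for any OpenAI-compatible client.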
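And a minimal sketch of pointing an agent framework at the deployed server, in the spirit of the CrewAI example from the video: because vLLM exposes an OpenAI-compatible API, CrewAI (via its `LLM` wrapper) can talk to it directly. The endpoint URL and model name here are assumptions; replace them with your own lease URI and served model.

```python
from crewai import Agent, Task, Crew, LLM

# Hypothetical endpoint: replace with the URI Akash assigns to your lease.
llm = LLM(
    model="openai/mistralai/Mistral-7B-Instruct-v0.2",  # "openai/" prefix routes to an OpenAI-compatible API
    base_url="http://your-akash-endpoint.example.com/v1",
    api_key="not-needed",  # vLLM requires no key unless you configure one
)

# A single agent tasked with writing an Akash deployment YAML.
writer = Agent(
    role="Akash deployment engineer",
    goal="Write a valid Akash SDL (deployment YAML) for a given container image",
    backstory="You are experienced with Akash Network deployments.",
    llm=llm,
)

task = Task(
    description=(
        "Generate an Akash deployment YAML that runs the nginx image "
        "and exposes port 80 globally."
    ),
    expected_output="A complete SDL file in YAML.",
    agent=writer,
)

# Run the crew; works the same from a script or a Jupyter notebook cell.
result = Crew(agents=[writer], tasks=[task]).kickoff()
print(result)
```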