AnythingLLM: Fully Private AI - the Easy Way 🤖🔒

Data-privacy concerns make open-source LLMs that run locally, on one's own computer, an attractive option.

Ollama makes this quite simple, and LM Studio gives access to a vast library of quantized favorites.

However, improving the user experience involves jumping through a number of hoops.

Fortunately, AnythingLLM solves this issue, and more:

💻 Downloadable like any other software
🔘 One-click hook to open source models*
💬 Familiar chat interface
🛠️ Built-in assistant capabilities (system prompt, RAG, web fetch)
🚀 Additional agent capabilities: web search, scraping, graph, teachability, file save**

*In the demo video I use Orca-mini, a fine-tuned Llama 2 model; a more powerful computer could handle Mistral-7B. Alternatively, select the Ollama or LM Studio option and pull Phi-3 or other models from there, as sketched below.
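
For reference, here is a minimal sketch of what pulling and chatting with a local model looks like through Ollama's Python client. It assumes a running Ollama server and `pip install ollama`; the `phi3` tag is just the model mentioned above.

```python
# Minimal sketch: pull a quantized model and chat with it entirely locally via Ollama.
# Assumes the Ollama server is running on its default port and the `ollama` package is installed.
import ollama

ollama.pull("phi3")  # one-time download of the model weights to the local machine

response = ollama.chat(
    model="phi3",
    messages=[{"role": "user", "content": "Summarize why running an LLM locally helps with privacy."}],
)
print(response["message"]["content"])
```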

**Requires a function-calling model.
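
To make that footnote concrete: a function-calling model can emit structured tool invocations instead of plain text, which is what lets an agent actually trigger actions like web search or saving a file. Below is a hedged sketch of what such a request looks like against a local OpenAI-compatible endpoint (Ollama's default port is assumed; the `web_search` tool name and schema are made up for illustration, and the model you point at must itself support tool calling).

```python
# Sketch only: how an agent-style tool call surfaces from a function-calling model
# served over a local OpenAI-compatible endpoint (Ollama shown; LM Studio works similarly).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")  # local server, no real key needed

tools = [{
    "type": "function",
    "function": {
        "name": "web_search",  # hypothetical tool name, for illustration only
        "description": "Search the web and return the top results.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}]

resp = client.chat.completions.create(
    model="phi3",  # placeholder: swap in a locally served model that supports function calling
    messages=[{"role": "user", "content": "Find recent news about AnythingLLM."}],
    tools=tools,
)

# A function-calling model answers with a structured tool call rather than prose.
print(resp.choices[0].message.tool_calls)
```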
