Run a GOOD ChatGPT Alternative Locally! - LM Studio Overview

LM Studio is a desktop application that allows users to run large language models (LLMs) locally on their computers without any technical expertise or coding required. It provides a user-friendly interface to discover, download, and interact with various pre-trained LLMs from open-source repositories like Hugging Face. With LM Studio, users can leverage the power of LLMs for tasks such as text generation, language translation, and question answering, all while keeping their data private and offline.
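Beyond the chat UI, LM Studio can also expose a local server that mimics the OpenAI API, so existing client code can talk to a local model instead of the cloud. Below is a minimal sketch using the openai Python package, assuming the server is enabled in LM Studio on its default port (1234) and a model is already loaded; the model name shown is a placeholder.

```python
# Minimal sketch: chat with a model served by LM Studio's local server.
# Assumes the server is running at the default http://localhost:1234/v1
# and that a model is already loaded in the LM Studio UI.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # point the client at LM Studio instead of OpenAI
    api_key="lm-studio",                  # any non-empty string; the local server ignores it
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio answers with whichever model is loaded
    messages=[
        {"role": "user", "content": "Summarize what a large language model is in two sentences."},
    ],
)

print(response.choices[0].message.content)
```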

▼ Link(s) From Today’s Video:

-------------------------------------------------

▼ Extra Links of Interest:

Let's work together!

Thanks for watching Matt Video Productions! I make all sorts of videos here on YouTube: Technology, Tutorials, and Reviews! Enjoy your stay here, and subscribe!

All Suggestions, Thoughts And Comments Are Greatly Appreciated… Because I Actually Read Them.
Comments

God, I love how unfiltered this local LLM is. It's not the smartest, but it's the most honest discussion I've ever been able to have with any LLM...or really any human for that matter!

spadaacca

Totally agree with recommending Llama 3 8B Lexi Uncensored. I've used the System Prompt to give mine its own personality, age, sense of humour, mood, etc. A bit of fun, but who wouldn't want an assistant that's tailor-made to suit them? Now I just need to figure out how to give LM Studio a voice; someone has done it, but I get errors when I try to follow along.

amkire

Finally someone covering LM Studio! It's the very best out there.

LilBigHuge

I tried making a bogus ad for a bogus head shop to use as a radio spot, and none of the GPTs would do it. They all refused me. I just installed LM Studio and am running that Llama 3 LLM, and it has already spit out five different styles of that ad for me. This is great. Thanks.

dalecorne

Awesome tool! I had no idea this existed. Thank you so much, Matt.

Pepius_Julius_Magnus_Maximu...

Note that you can change the system prompt using the OpenAI Playground or the API (9:25). In that case, you'll have to pay per token, but $5 goes a long way with either GPT-4o or GPT-3.5.

nathanbanks
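A minimal sketch of what the comment above describes, using the official openai Python package: the system message carries the system prompt, and usage is billed per token. The model name and prompt text are just illustrative, and the same messages structure works against LM Studio's local server if the client's base_url is pointed at it (as in the example under the description).

```python
# Sketch: setting a system prompt via the OpenAI chat completions API.
# Requires OPENAI_API_KEY in the environment; tokens are billed per use.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",  # or "gpt-3.5-turbo" for a cheaper option
    messages=[
        # The system message plays the same role as LM Studio's System Prompt field.
        {"role": "system", "content": "You are a dry-witted assistant who answers briefly."},
        {"role": "user", "content": "What's the weather usually like in Scotland?"},
    ],
)

print(response.choices[0].message.content)
```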

This is so great. I rarely ever wanted to goof around with local LLMs because the oobabooga UI was honestly pretty horrible to understand and do anything with. This one is simple and clean.

MrPablosek

Thank you, Matt. Running AI locally is super important.

Deljron

This is a great new setup. I had an old uncensored Llama local setup, but it was very small and not very useful... this one has multiple chats and works well. Thanks for the video and information.

Streeknine

Been using LM Studio for a while now. Great piece of kit, especially since they added the GPU offload option, which now makes the LLMs whizz along.

TPCDAZ

Your videos bring fresh insights and kindle a flame of curiosity within me.🌟🎥🤔

bobbykingAiworld

Thanks so much for introducing me to this amazing AI assistant! I'm really excited to explore the possibilities. Your content is always inspiring and informative, and I appreciate how you share your knowledge with the community.

sydroyce

2:59 bro is flexing his gigabit internet, the only flex I can approve of 💀

renofumi

You are the freaking bomb man!! This is insane!

michaelandremovies

What I love about LM Studio is that it really is a hassle-free install. No need to download half a dozen developer toolkits, pull random stuff from GitHub, and wonder why it still does not compile. Just download and install.

What I do not love is the performance. Or rather, I do not understand how it scales. I have tried three different GPUs with LM Studio (GTX 1060 6 GB, Tesla M40 24 GB, and P100 16 GB) and they all perform about the same (same hardware and software otherwise, with 100% GPU offloading). On some models the 1060 is actually faster (tokens/sec) than the P100, which just does not make any sense.

Bottom line: it is an extremely easy way to run your own language models that costs nothing. Highly recommended :)

timomustamaki

GPT-4o voice, but uncensored and local... HOLY F****, imagine the possibilities... you could also give it its own voice or accent.

brockly

Wouldn't it be funny if we had just watched Matt doom the entire human race to an AI apocalypse at 8:53?

juancarlosgonzalez

I recommend watching a video titled: Supercharge Your Local LLM: Internet-Enabled AI Assistant with LM Studio.
It covers a Windows app that lets the local LLMs in LM Studio use the internet.

DannyC

I've been using LM Studio for a while. It's pretty great for accessing different models, as long as your system can handle them.

davidoswald

If you had set the GPU offload setting to max layers (the one you left at 10), it would reply about 10 times faster, provided your GPU can fit the model in its VRAM.

gabrielsandstedt
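On the GPU offload tip above: LM Studio runs GGUF models through llama.cpp, and the offload slider corresponds to llama.cpp's n_gpu_layers setting. A rough sketch of the same idea with the llama-cpp-python package follows; the model path is a placeholder, and the package needs to be built with GPU (CUDA/Metal) support for offloading to do anything.

```python
# Sketch: the llama.cpp setting behind LM Studio's "GPU offload" slider.
# Assumes llama-cpp-python is installed with GPU support and the GGUF path
# below is replaced with a real model file on disk.
from llama_cpp import Llama

llm = Llama(
    model_path="models/Meta-Llama-3-8B-Instruct.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=-1,  # -1 offloads every layer to the GPU (the "max" setting)
    n_ctx=4096,       # context window size
)

output = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Say hello in one short sentence."}]
)
print(output["choices"][0]["message"]["content"])
```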