NEW ORCA-Mini 🐳 Open-Sourced LLM that You can RUN Locally

In this video we are going to look at the all-new orca-mini, an open-source LLM trained on a dataset created by following the dataset-construction instructions from the original Orca paper.

I will show you how to run this locally, both in a Jupyter Notebook and with the Oobabooga text-generation web UI.
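For anyone following along in a notebook, here is a minimal sketch of the kind of loading code shown in the video, assuming the Hugging Face model id `psmathur/orca_mini_3b` and the "### System / ### User / ### Response" prompt layout from that model card — verify both against the card for the variant you actually download:

```python
MODEL_ID = "psmathur/orca_mini_3b"  # assumption: swap in the orca-mini variant you want


def build_prompt(system: str, user: str) -> str:
    # Instruction layout used by the orca-mini model cards
    # (check the card of your variant for the exact template).
    return f"### System:\n{system}\n\n### User:\n{user}\n\n### Response:\n"


def generate(user_prompt: str, device: str = "cpu") -> str:
    # Heavy imports kept local so the prompt helper works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID).to(device)
    inputs = tokenizer(
        build_prompt("You are a helpful assistant.", user_prompt),
        return_tensors="pt",
    ).to(device)
    out = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

On Apple Silicon you would pass `device="mps"`; on an NVIDIA GPU, `device="cuda"`.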


Comments

Your videos are simply the best AI content on YT. Greatly appreciated.

WildMidwest

How many days is it from Orca to Orca mini? 7? The pace of change is amazing, and underscores the desire for immediate access among devs - and, sadly, a few snake oil salesmen. Keep up the GREAT work of keeping us informed!

malikrumi

Thank you for showing the amazing capabilities of orca-mini, this is awesome.

pankymathur

Thanks. I changed 'mps' to 'cuda'.

ranjit-egcs

Very nice, thanks for walking through this

everyhandletaken

Excellent, I was waiting for MS to release Orca. They apparently are too slow 😋

jmirodg

Thank you, is the process the same for Orca 2?

hossromani

Orca is also available in GPT4All now.
P.S.: We'll see if this comment gets auto-deleted (YouTube usually does that when open-source GPT alternatives get mentioned).

michaelmueller

Good video, but I don't think you mentioned how much VRAM is necessary to run them. Probably because most experienced users can estimate it from the model size?

NoidoDev

0:42 I knew it, lol, that someone would just try to do the same thing based only on their paper, haha!

gileneusz

Working hard every day for us! 🙌 Very much appreciated! #iseeyou

MarkDemarest

Thanks for sharing this knowledge with the world! BTW, where can I download the Jupyter source code for this video?

ai

Nice video again, thx! Do you know if the model supports any other languages apart from English?

juanhartwig

We're dropping Vicuña 13B to go with this one, hehe. Great video!!

ozzymr

Nice....

Do you think it will be able to solve complex maths like GPT-4?

oladejiolaoluwa

Lord, I live in Los Angeles and have been researching this topic for what feels like an eternity. I wish I could just buy a tablet from somebody with an offline LLM pre-installed, and all I would have to do is transfer over whatever documents I want it to pull from as a dataset, and that's the end of the story. Does anyone know how something like this could be possible? Can a tablet suffice, or would it have to be at minimum a laptop?

RealTalker

First of all, thanks for sharing the video. Can we use "cpu" instead of "mps"? I'm hoping I'll get a reply from you.

Regards,
Guna Sekhar
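On the device question above: the device string is just a knob, and "cpu" works everywhere (only slower). A minimal sketch of a fallback helper — the boolean arguments stand in for the real `torch.cuda.is_available()` and `torch.backends.mps.is_available()` checks, so the logic runs without torch installed:

```python
def pick_device(cuda_ok: bool, mps_ok: bool) -> str:
    """Map hardware availability to a torch device string.

    In a notebook, call it as:
        pick_device(torch.cuda.is_available(), torch.backends.mps.is_available())
    """
    if cuda_ok:
        return "cuda"  # NVIDIA GPU
    if mps_ok:
        return "mps"   # Apple Silicon GPU
    return "cpu"       # universal fallback, just slower
```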

gunasekhar

Tested it in SillyTavern, but the response is repeated; it's not very good (13B, q8). I asked it a programming question: why can't JavaScript save document.cookie? It told me that document.cookie is a global value, so it should be loaded as global.document.cookie, which is the wrong answer.

fenix

Nice video! How can I run this model in LocalGPT with CPU only? Can I use TheBloke/orca_mini_3B-GGML in LocalGPT? Is there actually functionality to run GGML fully on CPU?
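On the CPU question: running fully on CPU is exactly what llama.cpp-family runtimes are built for. A hedged sketch using the llama-cpp-python bindings — the model filename is hypothetical, and note that newer binding releases expect GGUF rather than GGML quantizations, so match the binding version to your file:

```python
from pathlib import Path

MODEL = Path("orca-mini-3b.ggmlv3.q4_0.bin")  # hypothetical: any local quantized file


def respond(prompt: str) -> str:
    if not MODEL.exists():
        return "(model file not found; download a quantized checkpoint first)"
    # Import is local so the guard above works without llama-cpp-python installed.
    from llama_cpp import Llama

    llm = Llama(model_path=str(MODEL), n_threads=4)  # CPU-only inference
    out = llm(prompt, max_tokens=128)
    return out["choices"][0]["text"]


print(respond("### User:\nWhat is orca-mini?\n\n### Response:\n"))
```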

DenisDenBBougrimov

Is it possible to load this model into a streamlit app?
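Yes, nothing stops you from wrapping the model in Streamlit — you expose one generate function and wire it to a text box. A minimal sketch (the `fake_generate` placeholder is hypothetical; swap in a real transformers or llama-cpp call, ideally behind `st.cache_resource` so the model loads once):

```python
# app.py — run with: streamlit run app.py


def fake_generate(prompt: str) -> str:
    # Placeholder standing in for a real model call.
    return f"echo: {prompt}"


def main():
    # Imported lazily so the helper above is testable without Streamlit installed.
    import streamlit as st

    st.title("orca-mini demo")
    prompt = st.text_area("Prompt")
    if st.button("Generate"):
        st.write(fake_generate(prompt))


if __name__ == "__main__":
    main()
```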

buggingbee