NEW MPT-7B-StoryWriter CRUSHES GPT-4! INSANE 65K+ Token Limit!

MPT-7B-StoryWriter is a revolutionary new LLM that BEATS GPT-4 with an INSANE 65K+ token limit! These new MPT-7B models were trained from scratch by MosaicML in only 9.5 days for around $200k, and they can rival the LLaMA 7B model! Impressive! So in this video, let's have a look at these new game-changing LLMs, how to install them on your computer, and a demonstration of their super impressive capabilities.

What do you think of the MPT-7B models? Let me know in the comments!
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
SOCIAL MEDIA LINKS!
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬

These flags are for oobabooga's text-generation-webui (assuming the usual server.py entry point):
python server.py --model-menu --notebook --model mosaicml_mpt-7b-storywriter --trust-remote-code
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
►► My PC & Favorite Gear:
Recording Gear:
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Special thanks to Royal Emperor:
- Totoro

Thank you so much for your support on Patreon! You are truly a glory to behold! Your generosity is immense, and it means the world to me. Thank you for helping me keep the lights on and the content flowing. Thank you very much!

#GPT4 #GPT3 #ChatGPT #mpt7b
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
WATCH MY MOST POPULAR VIDEOS:
RECOMMENDED WATCHING - All LLM & ChatGPT Videos:

RECOMMENDED WATCHING - My "Tutorial" Playlist:

Disclosure: Bear in mind that some of the links in this post are affiliate links and if you go through them to make a purchase I will earn a commission. Keep in mind that I link these companies and their products because of their quality and not because of the commission I receive from your purchases. The decision is yours, and whether or not you decide to buy something is completely up to you.
Comments

HELLO HUMANS! Thank you for watching & do NOT forget to LIKE and SUBSCRIBE For More AI Updates. Thx <3
"K" - Your Ai Overlord

Aitrepreneur

I'm glad there's an ACTUAL Open Source AI alternative that can challenge M$

TheDailyMemesShow

The pace of developments in this field is incredibly fast, and I appreciate your efforts in keeping us up-to-date. Thank you very much.

Silberschweifer

12:05 - The first code was correct! If you opened it in a browser, it would work! - All of the HTML boilerplate is optional.
- It was actually a BEAUTIFUL answer: extremely concise and well done.
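To illustrate the point above: in HTML5 the html/head/body boilerplate tags really are optional, and a color-changing-button answer can be this short. This is a hypothetical reconstruction of what such a concise answer could look like, not the model's actual output from the video:

```html
<!-- No <html>, <head>, or <body> needed; the browser infers them. -->
<button onclick="document.body.style.background =
    '#' + Math.floor(Math.random() * 0xffffff).toString(16).padStart(6, '0')">
  Change color
</button>
```

Opened directly in a browser, this single element is a complete, working page.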

Verrisin

This is the first 7B-parameter model that can solve basic algebraic equations and create a dynamic HTML page. A few weeks ago, more than 100B parameters were required to perform equation solving. The speed of the increase in abilities is astonishing.
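For context, "basic algebraic equations" here means something on the order of solving a linear equation for x. An illustrative example of the kind of one-step algebra involved (not a specific equation from the video):

```python
# Solve a*x + b = c for x, the kind of one-step algebra the comment refers to.
def solve_linear(a: float, b: float, c: float) -> float:
    if a == 0:
        raise ValueError("a must be non-zero for a unique solution")
    return (c - b) / a

# Example: 3x + 5 = 20  ->  x = (20 - 5) / 3 = 5
x = solve_linear(3, 5, 20)
```

Trivial for code, but until recently only very large models answered this kind of question reliably in natural language.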

Viewable

The old king is dead, long live the king

FuZZbaLLbee

It's crazy how there's a new breakthrough with each passing day

timo

Having this many tokens would allow for a proper long-term memory setup with a chained-in small non-vector database that the LLM writes to and reads from with every prompt to store & retrieve all relevant information. The chatbot variant was definitely pretty impressive already. This is gonna be good~ Can't wait for further developments on this.
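The setup described above can be sketched in a few lines: a small SQLite key-value store that is read before every prompt and written after every turn. This is a minimal sketch under stated assumptions; `call_llm` is a hypothetical stand-in for whatever model backend you use:

```python
import sqlite3

# Tiny non-vector memory store: plain key/value rows in SQLite.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE memory (key TEXT PRIMARY KEY, value TEXT)")

def remember(key: str, value: str) -> None:
    db.execute("INSERT OR REPLACE INTO memory VALUES (?, ?)", (key, value))

def recall_all() -> str:
    rows = db.execute("SELECT key, value FROM memory").fetchall()
    return "\n".join(f"{k}: {v}" for k, v in rows)

def chat_turn(user_msg: str, call_llm) -> str:
    # Read: prepend everything we know to the prompt.
    prompt = f"Known facts:\n{recall_all()}\n\nUser: {user_msg}"
    reply = call_llm(prompt)
    # Write: store the exchange so later turns can retrieve it.
    remember("last_user_msg", user_msg)
    return reply
```

With a 65K-token window, the entire recalled store can simply be pasted into the prompt instead of being squeezed through retrieval heuristics.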

lacKhawKtheRIPPER

11:32 I thought I was the only crazy person who starts a chat with a bot by being respectful and cordial haha. When AI gives me good suggestions for my writing, I always thank it.

Amelia_PC

I’m still a novice at all of this, but even I was impressed. This is all happening so fast.

TheRemarkableN

I can't wait for the storywriter model to get more mainstream support. More specifically, I *hope* ooga-booga and TavernAI fully support it eventually. I'd love to be able to use that one with my characters on there!

Hell, I do a bit of story writing myself. It'd be awesome to feed it entire books I've written and not only get it to output stuff I need, like maybe character dialogs I'm having trouble coming up with, but even have it write a chapter that bridges two sections together seamlessly. Both are actual issues I've had recently that are hard for current models to help with due to their limited memory.

santosic

People are already using LLMs and ElevenLabs TTS on Skyrim NPCs, and I have to say, the future is already here.

yakuza_suske

Open source for the win. It's incredible how fast AI is evolving this past year; I think it's exponential now. I remember when I was testing Midjourney and DALL-E last year, and now to see GPT-4 with plugins... and now this free new model... unbelievable. Thank you for your videos; I watch with amazement every time you upload.

sweetnightmere

64k tokens for NPC memory. This is one step closer to those recent little characters someone made (not InWorld, but sprites) that have town memories. Like, you could easily feed a day's worth of events as a string into the model, have it summarize, then turn around and combine that summary into the "long-term memory" of the NPC. This is starting to make more sense to me now. I can't wait till potato PCs can run this locally. Then I'll put it in my Unity game by default.
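The summarize-then-merge loop described above is simple to sketch. In this illustrative example, the hypothetical `summarize` function stands in for an LLM summarization call and just truncates:

```python
# Rolling long-term memory for an NPC: each day's events are summarized
# and folded into one persistent string that stays within a size budget.

def summarize(text: str, max_chars: int = 200) -> str:
    # Stand-in for an LLM summarization call; here we simply truncate.
    return text[:max_chars]

class NPCMemory:
    def __init__(self) -> None:
        self.long_term = ""
        self.today: list[str] = []

    def observe(self, event: str) -> None:
        self.today.append(event)

    def end_of_day(self) -> None:
        day_log = " ".join(self.today)
        day_summary = summarize(day_log)
        # Re-summarize old memory + the new summary so memory stays bounded.
        self.long_term = summarize(self.long_term + " " + day_summary, 400)
        self.today = []
```

The key design point is that memory is re-compressed every day, so it never outgrows the context window no matter how long the NPC "lives".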

westingtyler

I hope you will make a new video on this model as soon as it runs really well on normal GPUs. Thank you so much for the great videos!

IchEsseHaende

For now the large-context version might only be trained on books, but what happens once this technology is trained on whole codebases? 😍😍
- Imagine combining that with something like Copilot: now THAT will be a game changer. Not just random code completions, but completions based on your actual codebase, FULL specification, etc.
- I can't wait to use it for a month before my job becomes obsolete! 😅

Verrisin

A 4-bit version is indeed already available, so now I'm super excited. Great video as always!

zahreel

Devs are probably starting to specifically train each model to make HTML with a color-changing button, just to pass this test xD

kellanaldous

Thanks for the awesome informative videos as always!

Now I can see the end of all of my favorite stories that authors left behind

FastYoutube

Hey man, thanks for all the good work. I love your channel, and lately you've been publishing non-stop. Keep it up!

I read about this 65k yesterday and was wondering when you would be covering it.

nartrab