How To Develop 2 AI Apps in 10 Minutes!

You don't have to pay to try out building apps that use AI. With Ollama you can run AI models locally, for free. Vercel's AI library makes it easy to manage chats and stream messages back to the client. And you can use any server you want, like Next.js, Astro, and more.
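The local setup described above can be sketched against Ollama's HTTP API, which listens on localhost:11434 by default. A rough sketch, not code from the video — the model name ("llama3") is a placeholder and assumes you've already pulled it with `ollama pull`:

```typescript
// Minimal sketch: call a locally running Ollama model over its HTTP API.
// Assumes `ollama serve` is running and a model (e.g. "llama3") is pulled.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Pure helper: build the JSON body for Ollama's /api/chat endpoint.
export function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

export async function chat(model: string, messages: ChatMessage[]): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(model, messages)),
  });
  // Non-streaming responses look like { message: { role, content }, done, ... }
  const data = await res.json();
  return data.message?.content ?? "";
}
```

With `stream: true` instead, Ollama returns newline-delimited JSON chunks, which is what the streaming chat UI in the video relies on.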

👉 VS Code theme and font? Night Wolf [black] and Operator Mono
👉 Terminal Theme and font? oh-my-posh with powerlevel10k_rainbow and SpaceMono NF

00:00 Introduction
00:57 Building An AI Chatbot
03:25 Building Out The Chat UI
05:10 Creating The AI Endpoint
07:48 Building An AI Photo Reviewer
11:22 Outroduction
Comments

Finally! Was looking forward to your take on doing "local AI" development.

vineetb

Are you kidding me, this man is just incredible and keeps uploading banger content.

ParasBansal

Just as usual, straightforward content, thanks Jack!

arthurbulat

So I'm just integrating an API into an application and building an idea around it. Seems straightforward enough.

UniqueGameFacts

As always, Jack is shipping top-notch content ⭐️

stefangarofalo

Would love to see how this applies in a production setting though. Can your models run on a $5 VPS?

cusxio

Thank you Jack for the video.

I'm unable to receive a response using Vercel AI SDK v3.3.5 for the Photo Reviewer project, and it looks like StreamingTextResponse is deprecated as well.

josephnaru
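The deprecation the commenter mentions is real in newer AI SDK releases. As a dependency-free alternative, a Next.js-style route handler can re-stream Ollama's newline-delimited JSON output using only web-standard APIs. A sketch under those assumptions — the model name and endpoint are Ollama defaults, not details from the video:

```typescript
// Sketch: stream an Ollama chat reply to the client without the deprecated
// StreamingTextResponse helper, using only web-standard fetch/stream APIs.
// Ollama streams NDJSON; each line looks like { message: { content }, done }.

// Pure helper: pull the text chunk out of one NDJSON line ("" if none).
export function extractChunk(line: string): string {
  if (!line.trim()) return "";
  try {
    const obj = JSON.parse(line);
    return obj.message?.content ?? "";
  } catch {
    return "";
  }
}

export async function POST(req: Request): Promise<Response> {
  const { messages } = await req.json();
  const upstream = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", messages, stream: true }),
  });

  // Re-emit just the text chunks as a plain-text stream to the browser.
  const stream = new ReadableStream({
    async start(controller) {
      const reader = upstream.body!.getReader();
      const decoder = new TextDecoder();
      const encoder = new TextEncoder();
      let buffered = "";
      for (;;) {
        const { done, value } = await reader.read();
        if (done) break;
        buffered += decoder.decode(value, { stream: true });
        const lines = buffered.split("\n");
        buffered = lines.pop() ?? ""; // keep any partial line for next read
        for (const line of lines) controller.enqueue(encoder.encode(extractChunk(line)));
      }
      controller.close();
    },
  });
  return new Response(stream, { headers: { "Content-Type": "text/plain" } });
}
```

If you stay on the Vercel AI SDK, check its current docs for the replacement streaming helpers rather than the removed `StreamingTextResponse` class.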

Great tutorial. For some reason I had to manually add "zod" to get this to work properly; also, "use client" is not present by default in the page. I'm a Vite user, so these may be novice issues to Next.js users, but I'm posting in case others run into them.

imgnsn

I guess deploying to Vercel is not that easy; it's probably possible on a VPS.

MennoB-sktv

What machine specs do you need to run the models? Can they be hosted on Express running in a Docker container with 1 core and 2 GB of RAM? Any ideas will be much appreciated...

mohammedasifuzzaman

Awesome content, but is there any way to check if Ollama is installed on the system?

hawarhekmat

Hi. What are the system requirements for this? 🤔

slandrei

Nice video as always.

Would like to know if there are any good ways to deploy this, and where to store the model.

For instance, would it be possible to store the model in a Google Drive and then access that from the app?

Fralleee

From 1:17, does anyone know which zsh plugin those autosuggestions are coming from?

Thank you and great video!

scottwager

Maybe you can install it on a VPS and use it in production?

that_alwayls_funny

say “ollama run llava-llama3” three times fast

gregbarbosa

Excellent video, and I couldn't agree more with the advice to get into AI to increase one's chances on the job market. One point of criticism: the "just run it locally for free" angle from creators with modern $2000+ Apple hardware is getting a bit tiring (read: making me jealous).

not_cool_dev

But if I hit the subscribe button, I will be unsubscribed silly billy!

jonathanvandenberg