I Analyzed My Finance With Local LLMs

🔑 TIMESTAMPS
================================
0:00 - Project intro
1:35 - Sponsor (Coursera)
2:04 - Why use local LLMs?
3:34 - Install Ollama
4:14 - Run local Mistral model
6:17 - Run local Llama2 model
7:27 - Customize LLMs with Ollama
9:53 - Access Llama2 with Langchain (Python)
10:45 - Categorise bank transactions
14:46 - Create personal finance dashboard
17:24 - Conclusions

👩🏻‍💻 COURSES & RESOURCES
================================

🙋🏻‍♀️ LET'S CONNECT!
================================

As a member of the Amazon and Coursera Affiliate Programs, I earn a commission from qualifying purchases made through the links above. By using the links, you help support this channel at no cost to you.

#ai #datascience #ThuVu #dataanalytics

COMMENTS
================================

Update: Ollama now works on Windows normally

YoutubeCom_

As a data scientist, I am blown away by your video's theme. You successfully kept it simple enough to attract a broad audience while mentioning technical details that benefit the more technical people watching. Best wishes!

Arsenik

Incredible intro video for the semi-technical on how ChatGPT and similar models will be used in daily life to improve mundane tasks, with a side of cautions about incorrect answers and computational limitations! Great balance, I’m already sharing it around our team 😊

noahchristie

Are you a real human? I have NEVER seen an author on YouTube cover so much incredible knowledge in such a short video. This is absolutely AMAZING!!! Thank you

TailorJohnson-ly

Great video... My 2 cents: you can force LLMs to respond only in JSON format by stating it in the system prompt, so you always get a consistent, parsable response (I've tried this with GPT-4). You can also provide a list of possible expense categories up front to avoid having to merge near-duplicates later (like 'Food & Beverage' and 'Food/Beverage').

AshishRanjan-jnre
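
The tip above is easy to try with the local setup from the video. Below is a minimal sketch, assuming Ollama is running locally with a Llama 2 model pulled and the langchain_community package installed; the category list, model name, and sample transaction are illustrative placeholders, not the exact prompt used in the video.

```python
import json

from langchain_community.llms import Ollama

# Illustrative category list; pinning the allowed values in the prompt keeps the
# model from inventing near-duplicates like "Food/Beverage".
CATEGORIES = ["Food & Beverage", "Housing", "Transport", "Shopping", "Other"]

llm = Ollama(model="llama2", temperature=0)

prompt = (
    "You are a strict JSON generator. Classify the bank transaction below into "
    f"exactly one of these categories: {', '.join(CATEGORIES)}. "
    'Respond with JSON only, in the form {"category": "<category>"}.\n\n'
    "Transaction: SUPERMARKET XYZ AMSTERDAM"
)

raw = llm.invoke(prompt)
try:
    category = json.loads(raw)["category"]
except (json.JSONDecodeError, KeyError):
    category = "Other"  # fall back when the model ignores the requested format

print(category)
```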

wow, so concise, to the point, no nonsense, clean, information packed presentation, thank you.

murali-alive

Awesome structure to convey a "simple" idea, without getting down into the weeds with how truly complicated it is. Thanks!

johndoughto

This is great. We're in the process of integrating LLMs into our "what if" scenario modelling platform and this gave me a few ideas on next steps. Sharing this video with my dev team!

whatifi-scenarios

🎯 Key Takeaways for quick navigation:

00:00 💲 *Reviewing Income and Expense Breakdown*
- Explained the process of analyzing financial transactions.
- Talked about classification of expenses into categories.
- Spoke about using low-tech ways and an AI assistant for classification.
02:16 💻 *Running a Large Language Model Locally*
- Discussed different ways to run an open-source language model locally.
- Listed various popular frameworks to run models on personal devices.
- Explained why these frameworks are needed, emphasizing the size of the model and memory efficiency.
04:18 📚 *Installing and Understanding Language Models*
- Demonstrated how to install a language model through the terminal.
- Showed the interaction with the language model through queries in the terminal.
- Assessed the model's math capabilities, showing a failed example.
06:48 🎯 *Evaluating Expense Classification of Language Models*
- Checked if the language models can categorize expenses properly through the terminal.
- Demonstrated how to switch models, correctly installing another model.
- Showed the differences between the models and preferred one due to answer formatting.
08:24 🛠️ *Creating Custom Language Models*
- Explained how to specify base models and set parameters for language models.
- Demonstrated how to create a custom model through the terminal.
- Discussed viewing the list of models available and building a custom blueprint to meet specific requirements.
11:46 🔄 *Creating a For Loop to Classify Expenses*
- Discussed forming a for loop to classify multiple expenses.
- Detailed how to chunk long lists of transactions to stay under the language model's token limit (see the sketch after this summary).
- Mentioned the unpredictability of language models and potential need for multiple queries.
14:32 🔍 *Analyzing and Categorizing Expenses*
- Demonstrated how to analyze and categorize transactions.
- Showed how to group transactions together, clean up the dataframe, and merge it with the main transaction dataframe.
15:14 📊 *Creating a Personal Finance Dashboard*
- Detailed the creation of a personal finance dashboard that includes an income and expense breakdown for two years.
- Introduced useful visualization tools such as Plotly Express and Panel, giving a short tutorial on how to use them.
- Demonstrated the assembling of a data dashboard from charts and supplementing it with custom text.
17:02 📈 *Visualizing Financial Behavior Over Time*
- Demonstrated the use of the finance dashboard, drawing observations.
- Concluded with a note on the importance of incorporating assets into financial management.
- Highlighted the value of running large language models on personal devices for tasks like these.

Made with HARPA AI

roberthuff
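
Picking up the 11:46 and 14:32 points above: here is a minimal sketch of the chunk-then-classify loop and the merge back into the main dataframe, assuming a local Llama 2 model through LangChain. The column names, chunk size, prompt wording, and line-parsing logic are assumptions for illustration, not the code from the video.

```python
import pandas as pd
from langchain_community.llms import Ollama

llm = Ollama(model="llama2", temperature=0)

# Placeholder dataframe standing in for the unique transaction descriptions.
transactions = pd.DataFrame({"name": ["SUPERMARKET XYZ", "CITY TRANSPORT", "COFFEE BAR 12"]})


def chunks(items, size):
    """Yield fixed-size chunks so each prompt stays under the model's token limit."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


rows = []
for batch in chunks(list(transactions["name"]), size=30):
    prompt = (
        "Assign a personal finance category to each bank transaction below. "
        "Answer with one line per transaction in the form "
        "'<transaction> - <category>' and nothing else.\n\n" + "\n".join(batch)
    )
    reply = llm.invoke(prompt)
    for line in reply.splitlines():
        if " - " in line:
            name, category = line.rsplit(" - ", 1)
            rows.append({"name": name.strip("-* ").strip(), "category": category.strip()})

categories_df = pd.DataFrame(rows).drop_duplicates(subset="name")

# Merge the model's labels back onto the main transaction dataframe.
labelled = transactions.merge(categories_df, on="name", how="left")
print(labelled)
```

Because a local model will not always echo the transaction names verbatim, a real run usually needs some cleanup or retries before the merge lines up, which matches the unpredictability noted in the summary.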

OMG, this is inspiring. I always wanted a third-party view of my expenses without losing control of my data, and this video hits the nail on the head.

etutorshop

You just earned a new subscriber, Thu. I mean, wow. Very inspirational to see what you built on a friggin laptop, no less. Goes to show you don't need thousands of compute cores, either. Very, very cool. 🎉

tolandmike

This is great! I was recently experimenting with a personal finance tracker dashboard and connecting it to a chat app, so the user could input their financial activity just by typing it. In the process, I tried using ChatGPT to simplify and generalize the format so we can input the data faster; never had I thought that it could be done with a local LLM. Looking forward to your next video.

bimoariosuryandaru

I've noticed that most LLMs understand that you would like CSV-formatted output, and you can use that to get more consistent responses.

SamFigueroa
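
Following the comment above, a minimal sketch of asking for CSV and parsing the reply straight into pandas, again assuming a local Ollama model through LangChain; the prompt wording, model name, and sample transactions are illustrative.

```python
from io import StringIO

import pandas as pd
from langchain_community.llms import Ollama

llm = Ollama(model="mistral", temperature=0)

prompt = (
    "Categorize these bank transactions. Reply with CSV only, using the header "
    "'name,category' and one row per transaction:\n"
    "SUPERMARKET XYZ\nCITY TRANSPORT\nCOFFEE BAR 12"
)

reply = llm.invoke(prompt)

# Keep only the lines that look like CSV rows before handing them to pandas,
# since chatty models sometimes wrap the table in extra prose.
csv_lines = [line for line in reply.splitlines() if "," in line]
df = pd.read_csv(StringIO("\n".join(csv_lines)))
print(df)
```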

Thanks so much! I've been investigating AI for just one month and have so much to learn again (and that's cool); your videos really help.
Not being a native English speaker, I found it a bit fast to follow, but no problem: it was clear and precise, and I will find time to listen to it again to be sure I've taken every lesson from it.
The same applies to your other videos, but don't change anything:
(it could even help me improve my English level ;-)...)

brunogillet

Thank you for sharing this, dear! You covered the basics and showed the path to a great first goal with your own custom, on-premise, and well-licensed LLM. Huge!

SebastianSastre

I love the content. Also, I have never seen anyone program so fast!!!

EverythingMy

Incredible video, I love how you simplified the whole process. Your content inspired me; I will try it on my personal projects as well.

soky

Outstanding video, especially for this beginner. Didn’t know you could run the models locally. Those Ollama layers look like Docker; fascinating how the context is set up. Time for me to spend some cycles on all your vids, not just the couple I’ve casually looked at. Thanks!

korntron

Hi Thu! Last year I referenced your Panel dashboard video to build my personal finance dashboard. I like seeing how you built yours. Your content is very useful. Thank you!

kevinmanalang
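
For anyone who, like the commenter above, wants to rebuild the dashboard from the end of the video: a minimal sketch with Plotly Express and Panel, the two libraries named in the summary. The dataframe, column names, and layout here are placeholders, not the video's actual dashboard.

```python
import pandas as pd
import panel as pn
import plotly.express as px

pn.extension("plotly")  # enable Plotly rendering inside Panel

# Placeholder data standing in for the categorized transactions.
df = pd.DataFrame({
    "category": ["Food & Beverage", "Housing", "Transport", "Food & Beverage"],
    "amount": [120.50, 900.00, 60.00, 45.30],
    "year": [2023, 2023, 2024, 2024],
})

expense_pie = px.pie(df, values="amount", names="category", title="Expenses by category")
yearly_bar = px.bar(
    df.groupby("year", as_index=False)["amount"].sum(),
    x="year", y="amount", title="Total spend per year",
)

dashboard = pn.Column(
    "# Personal Finance Dashboard",   # markdown heading pane
    pn.Row(expense_pie, yearly_bar),  # the two charts side by side
)

dashboard.servable()  # run with: panel serve this_script.py
```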

This is such a great video. Thank you for making it. I had no idea this sort of thing was possible and I'm finding all sorts of ways to take advantage of it now.

Codad