Fine-Tune LLaMA 2 in Ten Minutes! - With Google Colab and $0

In this video, we fine-tune LLaMA 2 on Google Colab for zero dollars, using a free dataset and libraries from Hugging Face. We'll make the fine-tuning process easy and explain what LoRA and QLoRA are, as well as how fine-tuning works.

00:00 - Intro
00:50 - When to Fine-Tune a Model vs. Use RAG
01:40 - How Fine-Tuning Works
02:23 - What Is LoRA
03:13 - What Is QLoRA
03:45 - Installation
05:01 - Model Setup
05:20 - Dataset
07:20 - QLoRA Parameters
07:55 - Training Arguments
09:01 - SFT and RLHF
09:55 - Loading Data
11:22 - Training the Model
12:10 - Results
13:30 - Uploading to Hugging Face
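
The LoRA idea covered at 02:23 can be sketched in a few lines of NumPy (an illustration of the technique, not the notebook's code): the pretrained weight matrix W stays frozen, and only a low-rank update B·A is trained, which shrinks the number of trainable parameters dramatically.

```python
import numpy as np

rng = np.random.default_rng(0)

d, k, r = 64, 64, 8          # W is d x k; r is the LoRA rank (r << d, k)
W = rng.normal(size=(d, k))  # frozen pretrained weight

# LoRA trains only these two small matrices; their product is the update.
B = np.zeros((d, r))                  # B starts at zero, so training begins at W
A = rng.normal(size=(r, k)) * 0.01

W_adapted = W + B @ A  # effective weight during/after fine-tuning

# Trainable-parameter count: full matrix vs low-rank factors
full_params = d * k        # 4096
lora_params = r * (d + k)  # 1024 at rank 8
print(full_params, lora_params)
```

At rank 8 the adapter here has a quarter of the parameters of the full matrix; at LLaMA-2 scale the ratio is far smaller, which is what makes fine-tuning on a free Colab GPU feasible.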

_______________________________________________________________

💷 50% Discount Code: A2LH6LZ

_______________________________________________________________

#llama2 #finetune #gpt #falcon #ai #llms #huggingface #autogpt
Comments

Would like to see a video on how to create a custom dataset for fine-tuning a model. Thank you!

henrijohnson

Thanks for the video and the work you put in.

drp

Thank you for this nice video; you explain the subject well. I wish you could make a tutorial on how to craft a training dataset from custom data (like PDF documents).

hajhouj

This doesn't even work with an A100 High-RAM GPU, since it runs out of memory while training the model. I don't know how you could possibly do it on the free tier.

RuChopra-oz
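
On the memory point: a back-of-envelope estimate (mine, not from the video) of why QLoRA's 4-bit loading matters for a 7B-parameter model. This counts weights only; activations, optimizer state, and CUDA overhead come on top.

```python
# Rough weight-memory estimate for a 7B-parameter model
params = 7_000_000_000

gb_fp16 = params * 2 / 1e9    # 2 bytes per weight (half precision)
gb_4bit = params * 0.5 / 1e9  # 0.5 bytes per weight (4-bit quantized)

print(f"fp16: {gb_fp16:.1f} GB, 4-bit: {gb_4bit:.1f} GB")
```

The fp16 weights alone (~14 GB) nearly fill a free-tier T4's 16 GB before any training state is allocated, which is why the notebook loads the model in 4 bits.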

Do you know how to run inference on it, or convert it on Google Colab to GGUF or another format to run it locally?

mrboltik
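
On converting to GGUF: one common route (a sketch, not from the video; the paths are placeholders, and the LoRA adapter must first be merged into the base weights, e.g. with peft's merge_and_unload) is llama.cpp's conversion script:

```shell
# Clone llama.cpp, which ships a Hugging Face-to-GGUF conversion script
git clone https://github.com/ggerganov/llama.cpp
pip install -r llama.cpp/requirements.txt

# Convert a merged (base + LoRA) checkpoint directory to a single GGUF file;
# ./merged-model is a placeholder for wherever you saved the merged model
python llama.cpp/convert_hf_to_gguf.py ./merged-model --outfile model.gguf
```

The resulting .gguf file can then be run locally with llama.cpp or tools built on it.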

Hey, many thanks for the nice example.
When I try to follow along using the notebook, I run into this problem:


ImportError                               Traceback (most recent call last)
in <cell line: 14>()
     12 )
     13 from peft import LoraConfig, PeftModel
---> 14 from trl import SFTTrainer

1 frames
in <module>
     21 import torch.nn.functional as F
     22 from torch.nn.utils.rnn import pad_sequence
---> 23 from transformers import top_k_top_p_filtering

ImportError: cannot import name 'top_k_top_p_filtering' from 'transformers'

Do you have an idea how to solve this?
Which versions of the package would be compatible?

Cheers,
Mirko

maindset.academy
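
Regarding the ImportError above: top_k_top_p_filtering was removed from recent transformers releases, while older trl versions still import it, so the two packages have drifted out of sync. One workaround (the version bound is my assumption, not from the video) is to align them:

```shell
# Either pin transformers to a release that still exports the helper...
pip install "transformers<4.41"
# ...or upgrade trl, whose newer releases no longer import it
pip install --upgrade trl
```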