DeepSeek Coder v2 Lite Instruct - Local Installation - Beats GPT-4 In Coding

This video shows how to install DeepSeek-Coder-V2 locally. It is an open-source Mixture-of-Experts (MoE) code language model that achieves performance comparable to GPT-4 Turbo on code-specific tasks.
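
For reference, a minimal local-inference sketch with Hugging Face Transformers, following the pattern on the published deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct model card (assumes a CUDA GPU with enough memory for the 16B Lite checkpoint in BF16):

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, trust_remote_code=True, torch_dtype=torch.bfloat16
).cuda()

# Build a chat-formatted prompt and generate a completion.
messages = [{"role": "user", "content": "Write a quicksort function in Python."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512, do_sample=False,
                         eos_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(outputs[0][inputs.shape[1]:], skip_special_tokens=True))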

🔥 Get a 50% discount on any A6000 or A5000 GPU rental; use the following link and coupon:

Coupon code: FahdMirza

#deepseekcoder #deepseekcoderv2

PLEASE FOLLOW ME:

RELATED VIDEOS:

All rights reserved © 2021 Fahd Mirza
Comments

I think DeepSeek Coder is way better than GPT-4o. I have a page with 1,276 lines of code. Usually GPT-4o cannot edit anything in this code without a billion errors. DeepSeek Coder did it for me in one go with zero errors.

ganian

Thanks for the update. Is it possible to test inference with AirLLM for the community?
For inference with DeepSeek-Coder-V2 in BF16 format, 8×80GB GPUs are required. That's a lot of resources for running inference on the model locally. I know the Lite version can be used, but can the over-200B-parameter version be utilized by AirLLM for inference? It would be magical if that were possible!

marilynlucas
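
On the AirLLM question above: AirLLM's trick is layer-streaming, loading one transformer layer's weights at a time so VRAM only ever holds a single layer. A conceptual sketch of that idea in plain PyTorch (this is not AirLLM's actual API; layer_paths and build_layer are hypothetical placeholders for per-layer checkpoint files and a layer constructor):

import torch

def streamed_forward(hidden, layer_paths, build_layer, device="cuda"):
    # Run a deep transformer stack one layer at a time so that only a
    # single layer's weights ever occupy GPU memory.
    for path in layer_paths:
        layer = build_layer()                            # empty layer skeleton
        layer.load_state_dict(torch.load(path, map_location="cpu"))
        layer.to(device)
        with torch.no_grad():
            hidden = layer(hidden)                       # forward through this layer
        del layer                                        # free VRAM before the next load
        torch.cuda.empty_cache()
    return hidden

Even if the full 236B model's layers fit this way, every decoding step would re-read all layer weights from disk, so throughput would be extremely low.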

Good tutorial! It looks fast in your video. What GPU were you using, and exactly how many tokens per second did you get?

zhaokangchen
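
For anyone who wants to measure this themselves, a small timing sketch, assuming the model and tokenizer from the snippet above are already loaded:

import time

prompt = "Write a function that checks whether a number is prime."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

start = time.perf_counter()
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
elapsed = time.perf_counter() - start

# Throughput = newly generated tokens divided by wall-clock time.
new_tokens = outputs.shape[1] - inputs["input_ids"].shape[1]
print(f"{new_tokens / elapsed:.1f} tokens/s")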

It would be great if you could demonstrate the DeepSeek API with memory, because it seems to have no memory for me.

ganian
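
Worth noting for the comment above: the DeepSeek chat-completions API, like most such APIs, is stateless, so "memory" has to be implemented client-side by resending prior turns on every call. A minimal sketch using the OpenAI SDK against DeepSeek's documented OpenAI-compatible endpoint (the DEEPSEEK_API_KEY variable name is a placeholder; the model name follows DeepSeek's docs):

import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["DEEPSEEK_API_KEY"],
                base_url="https://api.deepseek.com")

history = [{"role": "user", "content": "Remember the number 42."}]
reply = client.chat.completions.create(model="deepseek-coder", messages=history)
history.append({"role": "assistant", "content": reply.choices[0].message.content})

# The follow-up only works because the earlier turns are resent.
history.append({"role": "user", "content": "What number did I ask you to remember?"})
reply = client.chat.completions.create(model="deepseek-coder", messages=history)
print(reply.choices[0].message.content)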

Please make a video on deploying this model in a notebook environment like SageMaker or Colab.

chaithanyachaganti

I was overly hopeful, I guess. Coder V2 online can't even produce a 200-line script broken down into 4 parts; at 200 lines it times out, and it can't count to 50 lines either. It's basically a limited GPT API.

jasonn

It's working really slowly; I deleted it yesterday.

MeTuMaTHiCa