Phi-2 on Intel Meteor Lake - Coding question

What if we could run state-of-the-art open-source LLMs on a typical personal computer? Did you think it was a lost cause? Well, it's not!
In this post, thanks to the Hugging Face Optimum library, we apply 4-bit quantization to the 2.7-billion-parameter Microsoft Phi-2 model, and we run inference on a mid-range laptop powered by an Intel Meteor Lake CPU.
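The workflow described above can be sketched with the Optimum Intel integration, which exports the model to OpenVINO and applies 4-bit weight quantization at load time. This is a minimal sketch, not the exact script from the video: the model ID `microsoft/phi-2` is real, but the generation parameters and prompt are illustrative, and group size / ratio values are assumptions you should tune for your own quality/speed trade-off.

```python
# Sketch: 4-bit weight-only quantization of Phi-2 with Optimum Intel (OpenVINO backend).
# Requires: pip install optimum[openvino]
from optimum.intel import OVModelForCausalLM, OVWeightQuantizationConfig
from transformers import AutoTokenizer

model_id = "microsoft/phi-2"

# 4-bit weight quantization config; group_size and ratio are illustrative defaults.
q_config = OVWeightQuantizationConfig(bits=4, group_size=128, ratio=0.8)

# export=True converts the PyTorch checkpoint to OpenVINO IR on the fly;
# quantization is applied during the export.
model = OVModelForCausalLM.from_pretrained(
    model_id,
    export=True,
    quantization_config=q_config,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Run a coding question on the CPU, as in the demo.
prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

On a Meteor Lake machine, the quantized weights shrink the memory footprint enough for the model to fit comfortably in laptop RAM, which is what makes local inference practical here.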