Desktop Linux has an AI Problem

Today we discuss Desktop Linux and the new push for NPUs in desktops and laptops. If we don't use AI on Desktop Linux, will the NPU be used at all?

Comments

If your concern is about NPU support under Linux, it will land fine.

The Snapdragon X Elite is already being upstreamed.

fuseteam
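
For context on "landing fine": upstream NPU drivers surface through the kernel's compute-accelerator ("accel") subsystem, which exposes devices as /dev/accel/accelN. Below is a minimal sketch for checking what your kernel has enumerated; the sysfs layout is standard for that subsystem, but which drivers appear depends entirely on your kernel version and hardware.

```python
from pathlib import Path

def list_accel_devices():
    """Return (device, driver) pairs from the kernel 'accel' class."""
    devices = []
    for node in sorted(Path("/sys/class/accel").glob("accel*")):
        # The backing kernel driver sits behind the device/driver symlink.
        driver_link = node / "device" / "driver"
        driver = driver_link.resolve().name if driver_link.exists() else "unknown"
        devices.append((f"/dev/accel/{node.name}", driver))
    return devices

if __name__ == "__main__":
    found = list_accel_devices()
    if not found:
        print("No accel devices found (no NPU driver loaded, or no accel subsystem).")
    for dev, driver in found:
        print(f"{dev}: driver={driver}")
```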

The NPU is nothing new. I have used the NPU in the ARM Rockchip RK3588 with Linux to run Llama2:13b locally, and that chip is almost three years old.

LivingLinux
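
For reference, "using the NPU" on that board goes through Rockchip's RKNN-Toolkit Lite 2 Python runtime (LLMs like Llama 2 run via Rockchip's separate RKLLM stack on the same NPU). A minimal sketch, assuming a model already converted offline to .rknn format; the file name and input shape are placeholders.

```python
import numpy as np
from rknnlite.api import RKNNLite  # Rockchip's on-device NPU runtime

rknn = RKNNLite()
# "model.rknn" is a placeholder: models are converted from ONNX/TFLite
# on a PC with the full RKNN-Toolkit 2 before being copied to the board.
if rknn.load_rknn("model.rknn") != 0:
    raise RuntimeError("failed to load model")
# The RK3588 has three NPU cores; pick one (or let the runtime choose).
if rknn.init_runtime(core_mask=RKNNLite.NPU_CORE_0) != 0:
    raise RuntimeError("failed to init NPU runtime")

dummy = np.zeros((1, 3, 224, 224), dtype=np.float32)  # shape depends on the model
outputs = rknn.inference(inputs=[dummy])
print([o.shape for o in outputs])
rknn.release()
```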

The problem is NOT AI. The problem is that a closed-source AI or LLM has the potential for bias (ask an LLM whether it can give you a fully unbiased result: it's "interesting"). Any open-source model offers more potential for scrutiny.

maxanderson

I don't think there's an AI problem with Linux. I run several different models on my GPU. However, my outlook is that the NPU should be there if you want to use it; it shouldn't have to be used simply because it's there.

Maxume
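
Running models on a GPU under Linux is indeed routine at this point. A minimal sketch using llama-cpp-python; the model path is a placeholder for any downloaded GGUF file, and full GPU offload assumes a CUDA-, ROCm-, or Vulkan-enabled build.

```python
from llama_cpp import Llama

# The path is a placeholder for any GGUF model you have downloaded.
llm = Llama(
    model_path="./models/some-model.gguf",
    n_gpu_layers=-1,  # offload every layer to the GPU if the build supports it
    n_ctx=2048,       # context window size
)

result = llm(
    "Q: Does desktop Linux need AI? A:",
    max_tokens=64,
    stop=["Q:"],  # stop before the model invents the next question
)
print(result["choices"][0]["text"])
```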

Gentoo doesn't want LLM-generated code in its repos because the sources the LLM drew its code from are of dubious origin. You're conflating a lot of stuff.

fuseteam

I think the 'conundrum' Linux faces is what makes Linux great, in a 'free as in freedom' perspective. Microsoft will likely force users to use its AI 'tools,' so that it can harvest information from users and ultimately profit from that data (and who knows what else). Linux will allow users to choose if and how much AI they want to use... you install the tools you want. There may be a bit of a learning curve installing these tools (or even getting hardware set up to use them), at least at first, but people will still be able to choose AI tooling that is useful to them. I think this will be especially useful and beneficial for creatives who will find Microsoft's tooling invasive and Apple's cost of entry too high for what they get (a Mac Pro without the ability to add a discrete GPU?).

The big silicon companies (Nvidia, AMD, Intel, etc.) are all working on support for their NPUs (it would seem like a good idea for Nvidia to jump into this market, now that I think about it). Phoronix keeps pretty close tabs on AMD, Intel, and Qualcomm's progress in supporting their NPUs on Linux.

As far as Nvidia is concerned, I don't know if they will add AI support to future kernel drivers, considering their big competitive advantage is their proprietary CUDA platform, which many AI tools are tuned for out of the box.

CompellingBytes

You say "I don't want the NPU to be unused," but what do you want to use it for?

Train an LLM? That probably already works under Linux.
What else?

fuseteam
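
On the "probably already works" point: Linux is the default platform for mainstream training stacks. A toy PyTorch sketch (illustrative data and model only) that runs unchanged on CPU, CUDA, or ROCm.

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"  # ROCm also appears as "cuda"

# Toy regression: learn y = 3x + 1 from noisy samples.
x = torch.randn(256, 1, device=device)
y = 3 * x + 1 + 0.1 * torch.randn_like(x)

model = nn.Linear(1, 1).to(device)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

print(f"weight={model.weight.item():.2f}, bias={model.bias.item():.2f}")
```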

I bet games will start leaning on NPUs for offloading NPC AI eventually.

adambester
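
A plausible shape for that: a game ships a small ONNX decision model and lets ONNX Runtime pick an execution provider, preferring an NPU with a CPU fallback. A hedged sketch; the model file is hypothetical, and which NPU providers (e.g., QNN for Qualcomm, Vitis AI for AMD XDNA) are available depends on the vendor runtime installed.

```python
import numpy as np
import onnxruntime as ort

# Prefer NPU execution providers when present, falling back to CPU.
preferred = [
    "QNNExecutionProvider",      # Qualcomm NPUs
    "VitisAIExecutionProvider",  # AMD XDNA NPUs
    "CPUExecutionProvider",      # always available
]
providers = [p for p in preferred if p in ort.get_available_providers()]

# "npc_policy.onnx" is a hypothetical small decision model.
session = ort.InferenceSession("npc_policy.onnx", providers=providers)

state = np.random.rand(1, 16).astype(np.float32)  # hypothetical game-state vector
input_name = session.get_inputs()[0].name
outputs = session.run(None, {input_name: state})  # assumes a single output tensor
print("chosen action:", int(outputs[0].argmax()))
```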

Linux has a bigger problem in the Wayland Age - accessibility tools. I’d want to see these tools updated or rewritten before considering new features.

But I also know these are completely different realms of development that aren’t all that related unless we use the NPU to help fuel these tools…?

Redmage

I think this will be short-lived, maybe a decade at most, because of quantum chips. Linux will obviously be the first kernel ready to support these kinds of QPUs, and it will remain the most accessible way for startups to train models on quantum computers mixed with classical hardware. In the near future, we will be able to train and infer LLMs and other types of AI at unprecedented speeds.

We also need to consider the Majorana 1 project—it's a game-changer for material science. The development of topological quantum processors will accelerate rapidly, potentially making QPUs accessible and affordable. We might even reach a point where smartphones have built-in QPUs capable of running inference on entire 70B-parameter models or beyond.

Additionally, while Microsoft’s NPUs have limited support on Linux, Google’s TPUs seem like a better option for AI workloads. They are more cost-effective, work natively with Linux, and provide a powerful alternative for startups looking to scale AI training efficiently.

mully_cap

I have not yet seen any use that I have for AI. I learned how to organize a file directory in the MS-DOS days, and I know where I save things. I do my own graphics and write my own words. The only time I talk to my computer is when I'm cussing at it.

AI is developing into a powerful tool for some applications, and much good may it do people in those fields, but just as my refrigerator doesn't need an Internet connection, my computer doesn't need AI.

gregcampwriter

AI is like Cortana: nobody asked for it, nobody wanted it, but Microsoft forced it on everyone. The NPU can be used for a lot of things, and all AMD NPUs run on open drivers. On Linux it will be handled more on the app side than the system side, as it poses a security risk.

DJgregBrown

AMD Ryzen AI chips also have an NPU, their AMD XDNA, which appears to have been the first dedicated AI-processing silicon on a Windows x86 processor. Although it looks like you have to install an XDNA plugin/driver to use the NPU with Linux, or the NPU sits there dormant/unused?

Walking-Wanderers
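
For what it's worth, the amdxdna driver was merged into the mainline kernel's accel subsystem (around Linux 6.14), so whether the NPU sits dormant mostly comes down to whether that module is loaded and applications actually target it. A small check, assuming a recent kernel:

```python
from pathlib import Path

def module_loaded(name: str) -> bool:
    # /proc/modules lists every loaded kernel module, one per line.
    lines = Path("/proc/modules").read_text().splitlines()
    return any(line.split()[0] == name for line in lines)

print("amdxdna loaded:", module_loaded("amdxdna"))
print("accel nodes:", [p.name for p in Path("/dev/accel").glob("accel*")])
```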

Um, I know NPUs are kind of like a coprocessor. Can they be used for purposes outside of AI? Do they have a shader-like language I can play with?

aodfr

Creators have been on Linux for ages - I switched full-time in 2010 as a professional illustrator. In terms of extant AI: if you can barely write and you don't have a creative bone in your body, AI can help you suck less. If you're already competent at these things, it enshittifies your work.

dustanddeath

What about using the NPU for surround sound applications? Or upscaling audio quality for really old music recordings? A lot of audiophiles might enjoy this if it's done right.

nckrad

NPU support is already there. I am using several Turing RK1 and Coral devices to accelerate a lot of scientific research on desktop Linux.

codeconquerors
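
The Coral devices mentioned here have a documented Linux stack: Google's pycoral library wraps TensorFlow Lite with the Edge TPU delegate. A minimal sketch; the model file is a placeholder and must be pre-compiled for the Edge TPU.

```python
import numpy as np
from pycoral.utils.edgetpu import make_interpreter

# Placeholder: a TFLite model pre-compiled for the Edge TPU.
interpreter = make_interpreter("model_edgetpu.tflite")
interpreter.allocate_tensors()

# Feed a dummy input matching the model's declared shape and dtype.
inp = interpreter.get_input_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=inp["dtype"]))
interpreter.invoke()

out = interpreter.get_output_details()[0]
print("output shape:", interpreter.get_tensor(out["index"]).shape)
```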

The NPU doesn't have to be used explicitly for AI. Assuming so is a gross oversimplification. Those units can be put to other (arguably better) uses.

maxanderson
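
True in the narrow sense that an NPU is essentially a matrix/tensor engine: anything expressible as a tensor graph can run on it, AI or not. A toy illustration that builds an ONNX graph performing a plain matrix multiply, which any ONNX-capable NPU runtime could execute; no neural network is involved.

```python
import onnx
from onnx import TensorProto, helper

# A one-node graph computing C = A @ B: pure linear algebra, no weights.
A = helper.make_tensor_value_info("A", TensorProto.FLOAT, [64, 64])
B = helper.make_tensor_value_info("B", TensorProto.FLOAT, [64, 64])
C = helper.make_tensor_value_info("C", TensorProto.FLOAT, [64, 64])
node = helper.make_node("MatMul", inputs=["A", "B"], outputs=["C"])

graph = helper.make_graph([node], "plain_matmul", inputs=[A, B], outputs=[C])
model = helper.make_model(graph)
onnx.checker.check_model(model)
onnx.save(model, "matmul.onnx")  # loadable by any ONNX backend, NPU or not
```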

I'll let the NPU go to waste until the regulations and court cases are done. I have a lot of reasons for not wanting overhyped bots running on my system. We'll see, but for now, nope. I don't care if a chip sits dormant and unused.

bitterseeds

AI is great for tasks where the goal can be tightly constrained—I use AI image upscaling a lot and I'd love it to be more convenient—but I think AI just sucks for open-ended tasks. I was really impressed with generative and linguistic AI at first, but the more I've seen, the less useful any of it looks. The output is just unreliable and low quality. It's usually just good enough to make you waste a bunch of time trying to tweak it into something actually good. In the end I've decided it's easier just to do it myself.

It's much like my experience the first time I drove an automatic transmission car. I'd become so used to manual shift that I found the automatic's help entirely unhelpful. Even after over twenty years of driving, I still find it takes more mental effort to drive an automatic because I can't stop myself from trying to manipulate the stupid machine into doing what I want it to do. Maybe if I'd never learned manual shift and I'd started out with the idea of letting the machine do the thinking, it would be different. That's about how I feel about using AI to help me do things. I find the help unhelpful.

Anyway, my computer is several years old and I'm still using Windows—and still hating it—because I want to keep using all my weird software. If stuff like AI upscaling gets incorporated conveniently into more software, I can see the benefit of having built-in AI hardware, but right now that's still in the category of not missing what I've never had.

TonboIV