GPU Accelerated Machine Learning with WSL 2

Learn how Windows and WSL 2 now support GPU-accelerated machine learning (GPU compute) using NVIDIA CUDA, including TensorFlow and PyTorch, as well as all the Docker and NVIDIA Container Toolkit support available in a native Linux environment.

Clark Rahig will explain what it means to use your GPU to accelerate training of machine learning (ML) models, introduce concepts like parallelism, and then show how to set up and run a full ML workflow (including GPU acceleration) with NVIDIA CUDA and TensorFlow in WSL 2.
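
As a quick sanity check once that workflow is in place (a minimal sketch, not taken from the video, and assuming a TensorFlow 2.x build with GPU support is already installed inside your WSL 2 distribution), you can ask TensorFlow which devices it can see:

```python
# Minimal check that TensorFlow inside WSL 2 can see the CUDA GPU.
# Assumes a TensorFlow 2.x build with GPU support is already installed.
import tensorflow as tf

# An empty list here means the CUDA driver/toolkit is not visible from WSL 2 yet.
gpus = tf.config.list_physical_devices("GPU")
print("GPUs visible to TensorFlow:", gpus)

if gpus:
    # Run a small matrix multiply pinned to the first GPU to confirm it executes work.
    with tf.device("/GPU:0"):
        a = tf.random.normal((1024, 1024))
        b = tf.random.normal((1024, 1024))
        print("Result computed on:", (a @ b).device)
```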

Additionally, Clark will demonstrate how students and beginners can start building knowledge in the ML space on their existing hardware by using the TensorFlow with DirectML package.
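
To illustrate the DirectML path (a rough sketch, assuming the tensorflow-directml package has been installed with pip; it is built on TensorFlow 1.15, so it uses the 1.x API), a first test can be as small as:

```python
# Assumes: pip install tensorflow-directml (a TensorFlow 1.15-based package).
import tensorflow.compat.v1 as tf

# Log device placement so the output shows whether ops run on a DirectML (DML) device.
tf.enable_eager_execution(tf.ConfigProto(log_device_placement=True))

# A trivial op; the placement log should report a DML device if DirectML can reach the GPU.
print(tf.add([1.0, 2.0], [3.0, 4.0]))
```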

Learn more:

- Follow Clark Rahig on Twitter: @crahrig
Comments

This is difficult to follow. The installation instructions on NVIDIA's CUDA Toolkit documentation site now differ from the instructions in the video. Both use `apt-key`, which is deprecated and causes apt to complain. The installation instructions don't work.

The documentation site is difficult to parse. The (non-functional) instructions for installing CUDA on WSL 2 directly don't come up until the very end of 'Getting Started with CUDA on WSL 2', after all the other instructions for Docker, notebooks, etc.

This was a waste of my time. Please make this installation process work and be less painful.

sorenrichenberg

By far the most helpful guide I've seen so far, thank you! I've managed to get everything running (incl. Docker and the Docker extension in VS Code, though I'm not using DirectML), and the examples provided in the doc do make use of my GPU, but I don't know how to proceed next, as a newcomer: how do you actually get TensorFlow to detect your GPU, no matter your environment/code editor? In my case, I'm coding in a Jupyter notebook within VS Code with WSL 2 (Ubuntu 20.04), but it doesn't see my GPU (GeForce 970). All I want is to tell VS Code to use the Docker image when I'm coding so that the GPU will be detected, but what are the exact commands I need to type to accomplish that? The CUDA doc is great if you want to run a Jupyter notebook through your web browser, but it doesn't say anything about using your own code editor. I can't find the solution online. Mind giving me a hand? Cheers!
Sidenote: I can't wait for TensorFlow, NVIDIA, Docker, and Microsoft to standardize and stabilize all this stuff. It's so complex and overwhelming for a fledgling data scientist like myself, and the various docs you can find all seem to contradict one another in some way. I just wish there were one clear set of steps to follow for people like me who need everything explained like they're 5, haha.

Berutoron

Setting up projects with WSL 2 was a nightmare for me. Tasks running within WSL 2 do not recognize external file changes, and the subsystem is utterly slow. I used a top-shelf XPS 15 with 32 GB of RAM and an i7, and it nearly broke down running a Docker container and some simple bash scripts along with simple Magento and Typo3 installations.
It just doesn't work, and it will never be a seamless, frustration-free integration. Still, I highly appreciate the efforts that MS is making and the huge steps that Windows is taking toward the open-source Linux crowd. But it's not there yet. WSL 1 and WSL 2 are not yet ready to replace a native Linux environment.

inspectorchicken

I get the insufficient error... This guide only works for one specific scenario and is not helpful.

captainlennyjapan

Is cuDNN installation on WSL the same as on native Ubuntu, or is it done a different way?

Adroitbit

Is there a way to set this up without the Windows Insider Program?

senadkurtisi

Is it possible to use both CUDA and DirectML, so that I can leverage all of my GPUs?

TernaryM

What is the state of AMD GPU support with the OpenCL framework?

harshaldongre

Are the instructions the same for Ubuntu 20.04 in WSL 2?

backflipinspace

I have my GPU driver installed already. Do I need to install the CUDA driver as well?

baranaldemir

CUDA-supported ML installation is always a headache. I don't understand why they have to design a system with a "don't try me" face.

MuhammadTayyab

It's not as simple as in the video; the process is far too complicated.

talha_anwar

Please lower the volume of your title music; it hurts my ears.

mipmap

WSL 2 is buggy to the bone. Installing Ubuntu + MicroK8s is a pain in the butt, and the GPU does not work inside MicroK8s's Kubernetes. Worse, the Kubernetes that comes with Docker Desktop messes with volume mounts, etc. Anyway, this release is still a joke.

nguyenanhnguyen