Cannot make the GPU work for TensorFlow on WSL

Good day!
Problem: I cannot make the GPU work for TF in WSL. I have done CUDA/cuDNN installations before that worked, but this time I have run out of options.

My specs:
GPU: 3080 Ti
Ubuntu: 22.04
WSL: 2.0
base OS: Windows 10

The first thing I did was check compatibility here: Install TensorFlow with pip, which states I need:

  • CUDA 11.2
  • cuDNN 8.1.0

The first problem I ran into is that you cannot install CUDA 11.2 with WSL 2.0 and Ubuntu 22.04; Ubuntu needs to be 20.04. The error I got was ‘Failed to verify gcc version’. As I understood it, the required GCC version only works on 20.04.
So I went on to check whether anyone had installed CUDA/cuDNN with newer CUDA versions, and I saw people doing it with CUDA > 11.5, which supports my Ubuntu version. So I installed 11.8 with:

wget https://developer.download.nvidia.com/compute/cuda/11.8.0/local_installers/cuda_11.8.0_520.61.05_linux.run
sudo sh cuda_11.8.0_520.61.05_linux.run

My first question: was this understanding wrong? Is it possible to run TF with CUDA on the latest Ubuntu version?
I did install CUDA itself and was able to check it:

import tensorflow as tf

print(tf.test.is_built_with_cuda())

This returned True.
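As far as I understand, is_built_with_cuda() only says that the installed TensorFlow wheel was compiled with CUDA support; it does not check whether a GPU is actually usable at runtime. A quick runtime check (just a sketch) would be:

import tensorflow as tf

# Queries the driver/runtime for GPUs TensorFlow can actually see,
# rather than just reporting how the wheel was built.
print(tf.config.list_physical_devices('GPU'))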

I went on to install cuDNN (https://developer.nvidia.com/rdp/cudnn-download), selected the 11.x version, which should support all CUDA 11 releases, and copied the files:

sudo cp cuda/include/cudnn.h /usr/local/cuda/include/                                                                                 
sudo cp cuda/lib/libcudnn* /usr/local/cuda/lib64/
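As a sanity check (assuming cuDNN 8.x, so the shared object should be libcudnn.so.8, and that /usr/local/cuda/lib64 is on LD_LIBRARY_PATH), something like this should show whether the copied library is visible to the dynamic loader:

import ctypes

# Assumption: cuDNN 8.x was installed, so the shared object is libcudnn.so.8.
try:
    ctypes.CDLL('libcudnn.so.8')
    print('libcudnn.so.8 found by the dynamic loader')
except OSError as err:
    print('cuDNN not found:', err)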

I tried to run:

import tensorflow as tf
print(tf.keras.backend.backend() == 'cudnn')

But it returned False.
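As far as I can tell, tf.keras.backend.backend() just returns the name of the Keras backend (normally 'tensorflow'), so comparing it to 'cudnn' will always be False regardless of the GPU setup. A check that is probably closer to what I wanted, assuming the installed TF version provides tf.sysconfig.get_build_info(), would be:

import tensorflow as tf

# Reports the CUDA/cuDNN versions this TensorFlow wheel was built against,
# which need to be compatible with the versions installed in WSL.
info = tf.sysconfig.get_build_info()
print(info.get('cuda_version'), info.get('cudnn_version'))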

Likewise,

print(tf.test.is_gpu_available())

was also False.
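From what I have read, tf.test.is_gpu_available() is deprecated in newer releases in favour of tf.config.list_physical_devices('GPU'). Once a GPU does show up, a sketch like the following (the matmul sizes are arbitrary) should confirm that ops really land on it:

import tensorflow as tf

# Log which device each op is placed on; with a working setup the matmul
# below should be placed on /device:GPU:0.
tf.debugging.set_log_device_placement(True)

a = tf.random.normal([1000, 1000])
b = tf.random.normal([1000, 1000])
print(tf.matmul(a, b).device)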

I tried all kinds of different versions and combinations, but nothing worked for me… Could anyone please help with a hint? I’m desperate :smiley:

Hi @Nikita_Polovinkin,

Could you please try again by setting up NVIDIA® GPU support in WSL2 and following the other steps mentioned in the link?

Let us know if the issue still persists. Thank you.