How to use an RTX 4070 Ti for Machine Learning?

Hello,

I’m new to machine learning/deep learning, and I want to use my GPU to speed up training when working on deep learning models in Jupyter Notebook.

I heard I could use my RTX 4070 Ti for that. I also have an i7-13700KF.
I’m using the Anaconda interface.

How can I set up my working environment, step by step? I need to use Keras and TensorFlow.

Thanks a lot for helping a newcomer,
Best

Hi @Skhao, to use the GPU with TensorFlow you have to install CUDA and cuDNN on your machine. Please refer to the TensorFlow install guide (https://www.tensorflow.org/install/pip) for setting up TensorFlow with GPU support.
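
Since you’re on Anaconda and, presumably, Windows: note that TensorFlow 2.10 was the last release with native Windows GPU support; newer versions need WSL2. Assuming a native Windows setup, the documented approach is to create a fresh environment with `conda create -n tf python=3.10`, activate it, install the CUDA libraries with `conda install -c conda-forge cudatoolkit=11.2 cudnn=8.1.0`, and then `pip install "tensorflow<2.11"`.

Once that is installed, a quick sanity check is to ask TensorFlow which devices it can see, as in this minimal sketch:

```python
import tensorflow as tf

# Print the TensorFlow version and the GPUs it can use.
# An empty list usually means CUDA/cuDNN are missing or their
# versions don't match this TensorFlow build.
print(tf.__version__)
print(tf.config.list_physical_devices('GPU'))
```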
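
If a GPU shows up, Keras places computation on it automatically; nothing in your notebook code has to change. A tiny throwaway training run like the sketch below (the model shape and random data are made up purely for illustration) lets you confirm the card is actually being used, for example by watching `nvidia-smi` or Task Manager while it runs:

```python
import numpy as np
from tensorflow import keras

# Tiny throwaway model; with a visible GPU, TensorFlow runs
# these ops on the card without any extra code.
model = keras.Sequential([
    keras.Input(shape=(32,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Random data purely to exercise the device.
x = np.random.rand(1024, 32).astype("float32")
y = np.random.rand(1024, 1).astype("float32")
model.fit(x, y, epochs=2, batch_size=128)
```

Thank You.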