Comparison chart between GPU calculations using CUDA vs WebGL for TensorFlow.js?

Question

  • Is there any comparison chart between GPU calculations using CUDA vs WebGL for TensorFlow.js?

PS: I understand now that:

Since web browsers do not have direct access to CUDA APIs, TensorFlow.js in the browser relies on WebGL as its GPU backend.

So I guess this isn’t a fair comparison: for training we would ideally use CUDA (outside the browser), while for inference in the browser the only option is WebGL, provided we want to run it there and the user’s platform/hardware supports it.
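As a rough illustration (not from the original thread), this is how a browser app can ask TensorFlow.js for the WebGL backend and fall back to the CPU backend when WebGL isn’t available; `tf.setBackend`, `tf.ready`, and `tf.getBackend` are standard TensorFlow.js APIs:

```js
import * as tf from '@tensorflow/tfjs';

async function initBackend() {
  // Request the WebGL backend; setBackend resolves to false if it
  // could not be initialized (e.g. no WebGL support on the device).
  const ok = await tf.setBackend('webgl');
  if (!ok) {
    // Fall back to the pure-JS CPU backend.
    await tf.setBackend('cpu');
  }
  await tf.ready();
  console.log('Using backend:', tf.getBackend()); // 'webgl' or 'cpu'
}
```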

Correct. We recommend training via Node.js, which is better suited for training; the front end is optimized for inference. With technologies like WebGPU now available, the gap between the front end and the backend is certainly smaller, but the backend still has an edge. A sketch of that split is below.
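A minimal sketch of that split, assuming the CUDA-enabled `@tensorflow/tfjs-node-gpu` package is installed and using a hypothetical save path: training runs in Node.js, where the native TensorFlow backend can use CUDA, and the saved model can then be served to the browser for WebGL inference.

```js
// Node.js training script: requiring tfjs-node-gpu registers the
// native 'tensorflow' backend, which uses CUDA when a GPU is present.
const tf = require('@tensorflow/tfjs-node-gpu');

async function train(xs, ys) {
  const model = tf.sequential({
    layers: [tf.layers.dense({ units: 1, inputShape: [1] })],
  });
  model.compile({ optimizer: 'sgd', loss: 'meanSquaredError' });
  await model.fit(xs, ys, { epochs: 50 });

  // Save to disk (hypothetical path); the browser app can later load
  // this artifact with tf.loadLayersModel and run inference on WebGL.
  await model.save('file://./my-model');
}
```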
