Tfjs-node compatibility with modern AMD CPUs

I tried getting tfjs-node to run on my old PC but ran into issues that I believe were caused by my ancient CPU not supporting the AVX instructions.
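For anyone hitting the same thing, one way to confirm whether the flags are actually missing is to read `/proc/cpuinfo` from Node before installing anything. This is just a quick Linux-only sketch (on Windows/macOS you'd check with a tool like CPU-Z or `sysctl` instead):

```js
// check-avx.js — quick check for AVX/AVX2 CPU flags on Linux.
const fs = require('fs');

const cpuinfo = fs.readFileSync('/proc/cpuinfo', 'utf8');
const flagsLine = cpuinfo.split('\n').find((line) => line.startsWith('flags'));
const flags = flagsLine ? flagsLine.split(':')[1].trim().split(/\s+/) : [];

console.log('AVX supported: ', flags.includes('avx'));
console.log('AVX2 supported:', flags.includes('avx2'));
```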

I am planning on getting a new CPU and wanted to confirm before purchasing that modern AMD processors (released within the past year or two) are compatible with tfjs-node.

I understand that in order to use the GPU backend I will need NVidia rather than AMD.

@Jason can help here

According to the eng team, AMD CPUs use the x86-64 architecture (same as Intel), so they should have no issues. The only thing that might not work is really low-end AMD processors, which might not have the necessary AVX instructions to run the precompiled TF library, but this is also an issue on Intel.
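A quick way to check whether the precompiled binary will run on a given machine is just to load it and run one tiny op; on a CPU without AVX this typically aborts with an "Illegal instruction" error. A minimal sketch, assuming `@tensorflow/tfjs-node` is already installed:

```js
// sanity-check.js — loads the native TensorFlow binding and runs one small op.
// On a CPU without AVX the prebuilt binary usually dies with "Illegal instruction".
const tf = require('@tensorflow/tfjs-node');

const result = tf.tensor1d([1, 2, 3]).square();
result.print(); // expected output: [1, 4, 9]
console.log('Backend in use:', tf.getBackend()); // "tensorflow" for the native binding
```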


+1 to what Jen said. I haven't seen others have issues with AMD as long as AVX is supported, so you should be fine with your upgrade.

The bigger issue would be the graphics card: if you use the node-GPU build, you'd need an NVIDIA card for CUDA, just like in Python. On the server side, both Node and Python are wrappers around the TensorFlow C++ core.
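For reference, switching between the CPU and GPU builds is just a matter of which package you load; the JS API stays the same. A rough sketch (the GPU path assumes a working NVIDIA card plus matching CUDA/cuDNN installs):

```js
// CPU-only build: works on any x86-64 CPU with AVX (Intel or AMD).
// const tf = require('@tensorflow/tfjs-node');

// GPU build: requires an NVIDIA card plus CUDA/cuDNN, just like Python TensorFlow GPU.
const tf = require('@tensorflow/tfjs-node-gpu');

// The JS API is identical either way; only the native binding underneath changes.
tf.tensor2d([[1, 2], [3, 4]]).matMul(tf.tensor2d([[5, 6], [7, 8]])).print();
```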

TensorFlow.js in the browser doesn't have these issues, since it uses browser technology like WebGL instead.
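If you're curious which backend you ended up with in the browser, a quick check (assuming the standard `@tensorflow/tfjs` bundle) looks something like this:

```js
import * as tf from '@tensorflow/tfjs';

async function main() {
  await tf.ready();                                // wait for backend initialisation
  console.log('Active backend:', tf.getBackend()); // usually "webgl" in the browser
}

main();
```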

Thanks for the confirmation. I figured AMD would be okay, but wanted to verify before spending the money.

Which specific CPU are you looking to buy out of interest?

This wiki page seems to have some details on CPU support for AVX2, etc.:

I am looking to get as many cores as possible (on a budget) to improve parallelization.

I am thinking the AMD Ryzen 9 3900X will probably get me the most bang for my buck.

I am currently using the prehistoric AMD Phenom II X6 and have not really had a reason to upgrade until now.

Seems like this processor supports AVX and AVX2 according to this website:

Also, these folks seem to be planning to use the same CPU for ML purposes too:

and here:

So that looks promising. Though I admit I have never used AMD myself, the evidence suggests it works for everyone else who is talking about it.