tf.data prefetch autotune: set a maximum RAM usage

I'm using tf.data prefetch with AUTOTUNE to load my dataset, which is held entirely in RAM (950 GB). Unfortunately, the autotuner sometimes spikes above my RAM limit (I don't mean GPU memory) and the job gets canceled. Is there an option to set an absolute RAM limit for the autotuner?
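
For context, here's a minimal sketch of my pipeline (the file names, parse function, and buffer sizes are placeholders, not my real code):

```python
import tensorflow as tf

AUTOTUNE = tf.data.AUTOTUNE

def parse_fn(record):
    # Placeholder decode step; my real parsing is more involved.
    return tf.io.parse_tensor(record, out_type=tf.float32)

dataset = (
    tf.data.TFRecordDataset(["train.tfrecord"])  # placeholder input files
    .map(parse_fn, num_parallel_calls=AUTOTUNE)
    .cache()                # the dataset fits in host RAM, so it is cached there
    .shuffle(10_000)
    .batch(32)
    .prefetch(AUTOTUNE)     # the autotuned prefetch buffer is what spikes above my RAM limit
)
```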