prefetch autotune set max ram usage

I'm using prefetch with the autotuner to load my dataset, which is loaded into my RAM (950 GB). Unfortunately, the autotuner sometimes spikes above my RAM limit (I don't mean the GPU memory) and the job gets cancelled. Is there an option to set an absolute limit for the autotuner?


I just encountered the same issue when mapping over a dataset. It seems to go beyond the maximum RAM, and the process gets killed.

Can you tell me more about your problem and share your code? Maybe I can help.

Hi @munsteraner @Voxlz ,

Can you try the following autotune option: `ram_budget`? When autotuning is enabled, it determines the RAM budget to use. See if setting this option works for you.
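A minimal sketch of how this could be set via `tf.data.Options` (the dataset and the 900 GB figure are illustrative placeholders, not from this thread):

```python
import tensorflow as tf

# Assumption for illustration: cap the autotuner's RAM budget at 900 GB
# (in bytes), leaving headroom below a 950 GB machine limit.
RAM_BUDGET_BYTES = 900 * 1024**3

options = tf.data.Options()
options.autotune.enabled = True
options.autotune.ram_budget = RAM_BUDGET_BYTES

dataset = tf.data.Dataset.range(10)   # placeholder dataset
dataset = dataset.with_options(options)
dataset = dataset.prefetch(tf.data.AUTOTUNE)
```

The budget is given in bytes, so a headroom margin below the physical RAM size is usually a sensible choice.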


Hi Laxma,

As far as I know, this option is only for GPU RAM.

Hi @munsteraner ,

No, the ram_budget option is not only for GPU RAM. It can also be used to limit the amount of CPU RAM that is used by the autotuner.

Here are a few articles and source-code references for more details.

Hope this helps you.