GPU memory is fully occupied despite setting `set_memory_growth`

Hey,
I want to limit my GPU memory usage. I tried setting `set_memory_growth` and `tf.config.LogicalDeviceConfiguration`, but the GPU still consumes all the memory, even with batch size = 1. Is there any solution for this? Thanks

Hi @imayachita, could you please use `tf.config.LogicalDeviceConfiguration` to set a memory limit and check whether the full GPU memory is still occupied? Below is example code:

import tensorflow as tf

gpus = tf.config.list_physical_devices('GPU')
if gpus:
  # Restrict TensorFlow to only allocate 1GB of memory on the first GPU
  try:
    tf.config.set_logical_device_configuration(
        gpus[0],
        [tf.config.LogicalDeviceConfiguration(memory_limit=1024)])
    logical_gpus = tf.config.list_logical_devices('GPU')
    print(len(gpus), "Physical GPUs,", len(logical_gpus), "Logical GPUs")
  except RuntimeError as e:
    # Virtual devices must be set before GPUs have been initialized
    print(e)
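If you prefer on-demand allocation instead of a hard limit, memory growth is typically enabled like this (a minimal sketch; like the virtual-device configuration above, it must run before any GPU has been initialized):

```python
import tensorflow as tf

gpus = tf.config.list_physical_devices('GPU')
try:
  # Allocate GPU memory incrementally as needed, instead of
  # grabbing (almost) all of it at process start
  for gpu in gpus:
    tf.config.experimental.set_memory_growth(gpu, True)
except RuntimeError as e:
  # Memory growth must be set before GPUs have been initialized
  print(e)
```

Note that with memory growth enabled, TensorFlow still never releases memory back to the system, so usage can grow up to the full device over the course of training; only the logical-device `memory_limit` gives a hard cap.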

Thank You.