Caching a model from TensorFlow Hub

How can I cache and reuse a model from TensorFlow Hub instead of downloading and loading it again and again?


Hi @Fatema_Dalal, please refer to this document on caching model downloads from TF Hub. Thank you.
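
The gist of that document is to point TF Hub at a persistent cache directory before the first load. A minimal sketch (the cache path here is a placeholder; any writable directory works):

import os

# Set the cache directory *before* loading anything through TF Hub.
# "/my/tfhub_cache" is a placeholder path, not a required location.
os.environ["TFHUB_CACHE_DIR"] = "/my/tfhub_cache"

import tensorflow_hub as hub

# The first load downloads into the cache; later loads reuse the cached copy.
model = hub.load("https://tfhub.dev/google/imagenet/mobilenet_v2_140_224/feature_vector/5")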


Hi @Kiran_Sai_Ramineni, thanks for the reference. I have already referred to this document. I tried downloading the model and loading it from my local system, but I need help reducing the time it takes to load the model.

Hi @Fatema_Dalal, could you please try saving the model loaded from TF Hub and then loading the saved copy? For example,

import tensorflow as tf
import tensorflow_hub as hub

model_url = "https://tfhub.dev/google/imagenet/mobilenet_v2_140_224/feature_vector/5"
model = hub.load(model_url)

# Save the model to a local SavedModel directory
tf.saved_model.save(model, "./saved_model")

# Reload from the same local directory (no network access needed)
loaded_model = tf.saved_model.load("./saved_model")
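
To confirm the improvement, you can time the local load directly. A quick sketch, reusing the ./saved_model directory from above:

import time

import tensorflow as tf

start = time.perf_counter()
local_model = tf.saved_model.load("./saved_model")
print(f"Loaded local SavedModel in {time.perf_counter() - start:.2f}s")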

Thank You.

Thanks @Kiran_Sai_Ramineni, I have tried this as well and it does help reduce the time significantly. Is it possible to load the model at the beginning of the application when using FastAPI, and then just call the model at the time it is actually required?

Hi @Fatema_Dalal, generally the loading time depends upon the model size. May I know which model you are loading and how much time it is taking? Thank you.


Hi @Kiran_Sai_Ramineni, I am using a super-resolution model. It takes about 6-7 s to load.

Hi @Fatema_Dalal, as per my knowledge, 6-7 seconds is not an unusually long load time for a model of that kind.

Please refer to this document for loading the model at the beginning of the application when using FastAPI. Thank you.
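
For illustration, a minimal sketch of that pattern (the ./saved_model path, the /upscale endpoint, and the input shape are assumptions, not from the linked document):

import tensorflow as tf
from fastapi import FastAPI

app = FastAPI()
model = None  # populated once at startup

@app.on_event("startup")
def load_model():
    # Load the SavedModel a single time when the server starts,
    # so individual requests don't pay the multi-second load cost.
    global model
    model = tf.saved_model.load("./saved_model")

@app.post("/upscale")
def upscale():
    # Placeholder input; a real endpoint would decode an uploaded image.
    # The input shape and the direct model(x) call are assumptions that
    # depend on the signatures of the specific super-resolution model.
    x = tf.zeros((1, 64, 64, 3), dtype=tf.float32)
    y = model(x)
    return {"output_shape": [int(d) for d in y.shape]}

Newer FastAPI versions prefer a lifespan handler over on_event, but the idea is the same: load once at startup, reuse on every request.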

Okay. Thanks for sharing. I’ll have a look.

Okay, very interesting.