TF Hub Model deployment

I am trying to deploy a model I trained using BERT from TF Hub with Flask, but I keep getting this error:

Traceback (most recent call last):
  File "c:\Users\AnanyaAgrawal\Flask\app.py", line 14, in <module>
    model = tf.keras.models.load_model(model_path, custom_objects=custom_objects())
  File "C:\Users\AnanyaAgrawal\anaconda3\envs\myenv\lib\site-packages\keras\src\saving\saving_api.py", line 262, in load_model  
    return legacy_sm_saving_lib.load_model(
  File "C:\Users\AnanyaAgrawal\anaconda3\envs\myenv\lib\site-packages\keras\src\utils\traceback_utils.py", line 70, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "C:\Users\AnanyaAgrawal\anaconda3\envs\myenv\lib\site-packages\keras\src\engine\base_layer.py", line 870, in from_config 
    raise TypeError(
TypeError: Error when deserializing class 'KerasLayer' using config={'name': 'keras_layer', 'trainable': False, 'dtype': 'float32', 'handle': 'https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3'}.

This is my code:

from flask import Flask, request, render_template
import numpy as np
import tensorflow as tf
import tensorflow_hub as hub
from tensorflow import keras

app = Flask(__name__)

# Load the pre-trained BERT model
model_path = r"C:\Users\AnanyaAgrawal\Flask\data\modelnew.h5"
def custom_objects():
    return {'KerasLayer': hub.KerasLayer}

# Load the model with custom objects
model = tf.keras.models.load_model(model_path, custom_objects=custom_objects())
# Function to classify text
def classify_text(text):
    # Perform prediction using the loaded model
    prediction = model.predict(np.array([text]))
    # Assuming it's binary classification, you may need to adjust this logic
    if prediction > 0.5:
        return "Potentially Suspicious"
    else:
        return "Not Suspicious"

# List to store user inputs
user_inputs = []

@app.route('/')
def index():
    return render_template('index.html')

@app.route('/predict', methods=['POST'])
def predict():
    user_input = request.form['user_input']
    user_inputs.append(user_input)
    prediction = classify_text(user_input)
    return render_template('result.html', prediction=prediction)

@app.route('/inputs')
def inputs():
    return render_template('inputs.html', user_inputs=user_inputs)

if __name__ == '__main__':
    app.run(debug=True)

Hi @Ananya, as per the error log, the error occurs while loading the model. When you load a model from TF Hub, the download location defaults to a local temporary directory, so TensorFlow creates a temp directory to keep the downloaded models. However, after a few days or so the contents of that folder (the downloaded model) get deleted, which might be the cause of the error. Could you please try reading the hub model directly from remote storage (GCS) instead of downloading it locally, by setting

import os
os.environ["TFHUB_MODEL_LOAD_FORMAT"] = "UNCOMPRESSED"
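
For this setting to take effect it needs to be in place before the model is loaded. A minimal sketch of how the top of your app.py could look (the model path and custom_objects are taken from your code):

import os
# Read TF Hub models directly from remote storage (GCS) instead of
# downloading compressed archives into a local temp directory.
os.environ["TFHUB_MODEL_LOAD_FORMAT"] = "UNCOMPRESSED"

import tensorflow as tf
import tensorflow_hub as hub

model_path = r"C:\Users\AnanyaAgrawal\Flask\data\modelnew.h5"
model = tf.keras.models.load_model(
    model_path,
    custom_objects={"KerasLayer": hub.KerasLayer},
)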

To know more about caching model downloads from TF Hub, please refer to this document.
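
As an alternative covered in that document, you can also point TF Hub at a persistent cache directory so the downloaded model is not placed in a system temp folder that gets cleaned up. The directory path below is only an example; any writable, persistent location works:

import os
# Example only: use any writable, persistent directory as the TF Hub cache.
os.environ["TFHUB_CACHE_DIR"] = r"C:\tfhub_cache"

Thank you.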