Importing models directly into Python variables from model files stored in an S3 bucket

Hello,

I have been wondering for quite some time whether it is possible to load ML models directly into Python variables when the model file is not on the local disk where the code is running, but is instead stored in an S3 bucket.

Since we can load a model file/folder from the local disk using TensorFlow functions like tf.keras.models.load_model("<model_file_name>"), is there any way to load a model from model files stored in an S3 bucket?

There can be scenarios where the code loading the model runs inside a container with limited storage capacity, and since model files can be extremely large, it may not be feasible to store them inside the container or on persistent storage.

For example, I am able to use the code below to load a dataset (a CSV file) directly into memory (essentially into a Python variable) using the pandas and boto3 libraries:

import io

import boto3
import pandas as pd

filename = "dataset/items_dataset.csv"

s3 = boto3.client('s3',
                  aws_access_key_id='<AWS_ACCESS_KEY_ID>',
                  aws_secret_access_key='<AWS_SECRET_ACCESS_KEY>')

# Stream the object's bytes straight into a DataFrame, without touching the local disk
obj = s3.get_object(Bucket='<AWS_BUCKET_NAME>', Key=filename)
df = pd.read_csv(io.BytesIO(obj['Body'].read()))
df.head()

where dataset/items_dataset.csv is the path of the dataset file in the S3 bucket.

Thus, I wanted to know whether there is a similar way of loading a model file stored in an S3 bucket without having to download it to the local disk before loading it into memory through TensorFlow functions.
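
One approach that seems plausible for models saved in the single-file HDF5 (.h5) format is to reuse the same boto3 pattern and hand the in-memory buffer to h5py, since, as far as I can tell from the TensorFlow docs, tf.keras.models.load_model also accepts an h5py.File object, and h5py >= 2.9 can open file-like objects. A minimal sketch (the bucket name and the key models/my_model.h5 are placeholders):

import io

import boto3
import h5py
import tensorflow as tf

s3 = boto3.client('s3',
                  aws_access_key_id='<AWS_ACCESS_KEY_ID>',
                  aws_secret_access_key='<AWS_SECRET_ACCESS_KEY>')

# Read the .h5 object into an in-memory buffer, never touching the local disk
obj = s3.get_object(Bucket='<AWS_BUCKET_NAME>', Key='models/my_model.h5')
buffer = io.BytesIO(obj['Body'].read())

# h5py >= 2.9 can open a file-like object, and load_model accepts an h5py.File
with h5py.File(buffer, 'r') as f:
    model = tf.keras.models.load_model(f)

This only covers the HDF5 format. For the multi-file SavedModel format, I have read that installing tensorflow-io registers an s3:// filesystem with TensorFlow (for TF >= 2.6), so something like tf.keras.models.load_model("s3://<AWS_BUCKET_NAME>/models/my_model") might work directly with credentials supplied via the standard AWS environment variables, but I have not verified this.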

Also, if this is achievable, is there a way to do the reverse as well, i.e. to save a model directly to the S3 bucket?
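
If the loading approach above works, I imagine the reverse could use the same building blocks: serialize the model to HDF5 into an in-memory buffer and upload the bytes with put_object. A minimal sketch, assuming tf.keras 2.x (which, I believe, also accepts an h5py.File in model.save) and a placeholder bucket/key:

import io

import boto3
import h5py
import tensorflow as tf

# A toy model just to make the sketch self-contained
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

s3 = boto3.client('s3',
                  aws_access_key_id='<AWS_ACCESS_KEY_ID>',
                  aws_secret_access_key='<AWS_SECRET_ACCESS_KEY>')

# Serialize the model to HDF5 inside an in-memory buffer
buffer = io.BytesIO()
with h5py.File(buffer, 'w') as f:
    model.save(f)

# Upload the buffer's bytes as an object in the bucket
s3.put_object(Bucket='<AWS_BUCKET_NAME>',
              Key='models/my_model.h5',
              Body=buffer.getvalue())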


Hey, did you get any workaround to load a Keras model from an S3 bucket?