How to count the allocated memory size of the model

When we run the benchmark test, it displays "peak memory footprint (MB): init =   overall =   ", but the value shown differs from one run to the next.
We would like to know: is there any other way to measure the memory size of the model?

thanks

You can use the model profiler, which reports the model's memory requirement, number of parameters, FLOPs, etc.

Sample code using a toy example:

```python
from tensorflow.keras.applications import EfficientNetB3
from model_profiler import model_profiler

model = EfficientNetB3(include_top=True)
batch_size = 128

profile = model_profiler(model, batch_size)
print(profile)
```

Output:

```
Downloading data from https://storage.googleapis.com/keras-applications/efficientnetb3.h5
50102272/50095040 [==============================] - 1s 0us/step
```
| Model Profile                    | Value         | Unit    |
|----------------------------------|---------------|---------|
| Selected GPUs                    | None Detected | GPU IDs |
| No. of FLOPs                     | 1.5651        | BFLOPs  |
| GPU Memory Requirement           | 41.7385       | GB      |
| Model Parameters                 | 12.3205       | Million |
| Memory Required by Model Weights | 46.9991       | MB      |
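The "Memory Required by Model Weights" row can be cross-checked by hand: with ~12.3205 million float32 parameters at 4 bytes each, the weights come to roughly 47 MiB. A minimal sketch of that back-of-the-envelope calculation (the exact parameter count is an assumption rounded from the table above):

```python
def weight_memory_mb(num_params, bytes_per_param=4):
    """Estimate weight memory in MiB, assuming float32 (4 bytes) per parameter."""
    return num_params * bytes_per_param / (1024 ** 2)

# ~12.3205 million parameters, as reported for EfficientNetB3 above
params = 12_320_500
print(f"{weight_memory_mb(params):.4f} MB")  # ~46.999 MB, matching the profiler table
```

Note this only accounts for the weights themselves; activations, workspace buffers, and framework overhead push the runtime footprint well beyond this figure.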
Thanks. Actually, we want to measure the memory usage of the TFLite model while it runs on Android.

Is there any way to do that?