How can I measure the complexity of the model?

I need to measure the time and memory complexity of a Keras model (an image-captioning model built with Keras). How can I start?
Thanks

Check:

https://tensorflow-prod.ospodiscourse.com/t/how-to-find-out-keras-model-memory-size/5249

Thanks a lot for replying. Just to confirm whether I got the point: I read the links, and to measure time and memory consumption I need to add these lines to my model:

# TF1-style API; under TF2 use the compat layer:
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

run_metadata = tf.RunMetadata()
with tf.Session() as sess:
    _ = sess.run(train_op,
                 options=tf.RunOptions(trace_level=tf.RunOptions.FULL_TRACE),
                 run_metadata=run_metadata)

After training finishes, should I then run tf.profiler.profile?
And if I have already finished training and have the final .h5 file from epoch 20, is there another way to measure time and memory without starting training again?
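One way to measure inference time and memory on an already-saved model, without retraining, is a small timing harness around a single predict call. This is a sketch: the `predict` function below is a stub standing in for a real loaded model (in practice, `model = keras.models.load_model("model.h5")` and then `model.predict(batch)`), and note that `tracemalloc` only sees Python-level allocations, not TensorFlow's native or GPU memory.

```python
import time
import tracemalloc

# Stub standing in for a loaded Keras model's predict() call, e.g.
#   model = keras.models.load_model("model.h5")
#   model.predict(batch)
def predict(batch):
    return [x * 2 for x in batch]  # placeholder computation

batch = list(range(1000))

tracemalloc.start()                     # track Python heap allocations
start = time.perf_counter()             # wall-clock timer
result = predict(batch)
elapsed = time.perf_counter() - start
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"inference time: {elapsed:.4f} s")
print(f"peak Python memory during call: {peak / 1024:.1f} KiB")
```

For a realistic number, time several batches and average, and run one warm-up call first so one-time graph/kernel setup costs are excluded.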

You can approximate memory usage from the input shape, the number of parameters, the dtype, etc., but to really profile your model you need to run it.
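As a minimal sketch of that approximation: the weights alone occupy roughly (number of parameters) × (bytes per element of the dtype). The helper name and the example parameter count below are illustrative, not from the thread.

```python
DTYPE_BYTES = {"float16": 2, "float32": 4, "float64": 8}

def param_memory_bytes(num_params, dtype="float32"):
    """Rough lower bound: weights only. Ignores activations,
    gradients, optimizer slots, and framework overhead."""
    return num_params * DTYPE_BYTES[dtype]

# e.g. a hypothetical 10M-parameter float32 model:
mb = param_memory_bytes(10_000_000) / (1024 ** 2)
print(f"{mb:.1f} MiB for weights alone")  # ≈ 38.1 MiB
```

During training the real footprint is usually several times this figure, since gradients and optimizer state (e.g. Adam keeps two extra slots per weight) are also held in memory, plus activations that scale with batch size.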

Thanks a lot. Is there a tutorial for these calculations? What terms should I search for, please?

You can check Ability to calculate projected memory usage for a given model · Issue #36327 · tensorflow/tensorflow · GitHub

But I suggest estimating it at runtime with the previous solution.
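For the static projection discussed in that issue, the starting point is counting parameters per layer from the architecture alone. The layer shapes below are hypothetical, chosen only to illustrate the standard formulas (in real code, `model.count_params()` gives you the total directly).

```python
def dense_params(n_in, n_out):
    # weight matrix (n_in * n_out) plus one bias per output unit
    return n_in * n_out + n_out

def conv2d_params(kh, kw, c_in, c_out):
    # one kh x kw x c_in kernel per output channel, plus biases
    return kh * kw * c_in * c_out + c_out

# Hypothetical small stack: one conv layer and two dense layers
total = (conv2d_params(3, 3, 3, 64)
         + dense_params(64, 256)
         + dense_params(256, 10_000))
print(total, "parameters ->", total * 4, "bytes at float32")
```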


Thanks a lot, but excuse me, what is the difference between measuring at runtime and the other way?

Because at runtime you measure what the library really consumes on the machine, e.g. kernel memory requirements and any other allocations, data transfers, etc.
