I have some trained models in TF2 and I want to measure their performance during inference. I have seen that TensorFlow Lite has a benchmark tool for this, where you can measure:

- Initialization time
- Inference time of warmup state
- Inference time of steady state
- Memory usage during initialization time
- Overall memory usage
Is there something similar for regular TensorFlow? I am using the Python API on the Jetson platform (Linux aarch64).
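So far, the closest I have found is timing the calls manually. As a minimal sketch (the `infer_fn` callable is a stand-in for an actual `model(input_tensor)` call, and the warmup/run counts are arbitrary), something like this separates warmup from steady-state latency:

```python
import time
import statistics

def benchmark(infer_fn, warmup_runs=5, steady_runs=50):
    """Time a zero-argument inference callable, separating warmup from steady state."""
    # Warmup runs: the first calls typically include graph tracing,
    # kernel autotuning, and memory allocation, so they are slower.
    warmup_times = []
    for _ in range(warmup_runs):
        t0 = time.perf_counter()
        infer_fn()
        warmup_times.append(time.perf_counter() - t0)

    # Steady-state runs: latency after the model is fully initialized.
    steady_times = []
    for _ in range(steady_runs):
        t0 = time.perf_counter()
        infer_fn()
        steady_times.append(time.perf_counter() - t0)

    return {
        "warmup_mean_ms": 1000 * statistics.mean(warmup_times),
        "steady_mean_ms": 1000 * statistics.mean(steady_times),
        "steady_stdev_ms": 1000 * statistics.stdev(steady_times),
    }

# Dummy workload standing in for a real model call.
stats = benchmark(lambda: sum(i * i for i in range(10_000)))
print(stats)
```

This only covers latency, though, not the memory metrics from the TF Lite list, which is why I am asking whether a ready-made tool exists.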
Thank you very much!