I am running a subclassed model and updating its parameters with
tf.GradientTape(). While the model runs, I keep getting the following message every few seconds:
<tf.Variable 'UnreadVariable' shape=() dtype=int64, numpy=1>
The number after
numpy= keeps increasing. The model seems to run fine: after one epoch I get all my
print statements and can retrieve the model's loss, but the message keeps being printed with an ever-increasing integer at the end.
Is it something I should be concerned about, or is it some kind of feature? I found a Stack Overflow post about it, but the response there is not very clear to me.
Thank you for your help!
Hi @acraev ,
This is nothing to worry about. The variable you are seeing is the optimizer's internal step counter (optimizer.iterations), an int64 scalar. optimizer.apply_gradients() increments this counter and returns it, and TensorFlow labels it 'UnreadVariable' because it has just been written to but not yet read. If that return value is printed, or is the last expression in a notebook cell, its repr gets displayed, which is why the number after
numpy= goes up by one every training step: it is simply the number of optimization steps taken so far.
You can safely ignore the
UnreadVariable message, or silence it by not printing the return value of apply_gradients(), and continue your training.
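Here is a minimal sketch of how the message gets produced and how to silence it. The model and loss here are toy stand-ins (a single variable with loss w², not your actual model), just to show the mechanism:

```python
import tensorflow as tf

w = tf.Variable(2.0)  # stand-in for model parameters
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

def train_step():
    with tf.GradientTape() as tape:
        loss = tf.square(w)  # toy loss: w**2
    grads = tape.gradient(loss, [w])
    # apply_gradients() increments and returns the optimizer's
    # step counter -- this is the 'UnreadVariable' in the message.
    return optimizer.apply_gradients(zip(grads, [w]))

result = train_step()
# Printing the return value (or letting it be the last expression
# in a notebook cell) is what displays something like
# <tf.Variable 'UnreadVariable' shape=() dtype=int64, numpy=1>
print(result)

# It is the same counter as optimizer.iterations:
print(int(optimizer.iterations))

# To silence the message, just discard the return value:
train_step()
```

After two steps, optimizer.iterations reads 2, which matches the ever-increasing integer you are seeing: one increment per gradient update.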
Please let me know if it helps you.