Hi y’all… continuing the saga from my previous post. I’ve included code snippets for your viewing at the bottom. Let me know if you need more!
Currently, I’m trying to compute gradients with a GradientTape from the values returned by a custom loss function. It seems to be computing gradients for more than one set of variables, since I had to make the GradientTape persistent, or I got the following error:
RuntimeError: A non-persistent GradientTape can only be used to compute one set of gradients (or jacobians)
This workaround means I have to manually delete the GradientTape later, which of course I’m not the biggest fan of…
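For context, here's a minimal sketch of the persistent-tape pattern I'm describing (dummy variables, not my actual model): a non-persistent tape only allows one `gradient()` call, so to differentiate with respect to each variable separately you set `persistent=True` and drop the tape yourself afterwards.

```python
import tensorflow as tf

x = tf.Variable(3.0)
y = tf.Variable(2.0)

# persistent=True lets us call tape.gradient() more than once.
with tf.GradientTape(persistent=True) as tape:
    f = x * x * y  # f = x^2 * y

dx = tape.gradient(f, x)  # df/dx = 2*x*y = 12.0
dy = tape.gradient(f, y)  # df/dy = x^2   = 9.0

# A persistent tape holds onto its resources until it is deleted,
# hence the manual cleanup step.
del tape
```

A second `tape.gradient()` call on a non-persistent tape is exactly what raises the `RuntimeError` above.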
Thanks for reading and take care!
Some output:
model_2: print loss_value_tensor in get grad f’n: (48,)
model_2: print x shape in compute loss f’n: (48, 28, 28, 1)
model_2: print y shape in compute loss f’n: (48,)
model_2: print loss_value shape in compute loss f’n: ()
… And that will make sense when you see the code.
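As a sanity check on those printed shapes, here's a tiny sketch with dummy data (assuming a 10-class output, which is my guess, not taken from the actual model) of how a per-example loss of shape (48,) reduces to a scalar loss of shape ():

```python
import tensorflow as tf

# Dummy batch matching the printed shapes: 48 grayscale 28x28 images.
x = tf.zeros([48, 28, 28, 1])
y = tf.zeros([48], dtype=tf.int32)

logits = tf.zeros([48, 10])  # stand-in for the model's output

# Per-example losses: shape (48,), like loss_value_tensor above.
per_example = tf.keras.losses.sparse_categorical_crossentropy(
    y, logits, from_logits=True)

# Reduced to a scalar: shape (), like loss_value above.
loss_value = tf.reduce_mean(per_example)
```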
Error was: Shapes of all inputs must match: values.shape = [3,3,1,8] != values.shape = [Op:Pack] name: initial_value
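One common way to hit an `[Op:Pack]` error with `name: initial_value` (this is a guess at the cause, not taken from my actual code) is handing a list of differently-shaped tensors to `tf.Variable`, which tries to stack them into one tensor:

```python
import tensorflow as tf

a = tf.zeros([3, 3, 1, 8])  # e.g. a 3x3 conv kernel with 8 filters
b = tf.zeros([8])           # e.g. its bias vector

# A list passed as a variable's initial_value gets converted via
# tf.stack (the Pack op), which requires every element to have the
# same shape; mismatched shapes raise the error seen above.
try:
    v = tf.Variable([a, b])
    err = None
except Exception as e:
    err = e
```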
Here are the code snippets I think you’d need to diagnose the problem. The matrices can be filled with dummy data, since we’re just worried about the NumPy array shapes and the tensor shapes: