Is it possible to backpropagate through an integral placed in the loss function?

This question may be stupid, but I would like to ask it nonetheless.

Let's say my neural network predicts the value of a variable u(x), and I have two other trainable variables, alpha and beta. I know they are related through an equation of the form:

u(x) - int_0^x f(alpha, beta) dt = 0

so that I can add the residual to the loss function with add_loss. Is TensorFlow able to compute the gradient through the integral in order to update alpha and beta? Basically, I have this law that needs to be respected, but there is an integral in it and I don't know how to handle it.
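For what it's worth, one common workaround is to approximate the integral with a differentiable quadrature rule (e.g. the trapezoidal rule): the integral becomes a finite sum of ordinary TensorFlow ops, so autodiff can propagate gradients to alpha and beta without any special handling. A minimal sketch, where `f`, `alpha`, `beta`, and the stand-in value for u(x) are all hypothetical placeholders, not your actual model:

```python
import tensorflow as tf

alpha = tf.Variable(1.0)
beta = tf.Variable(0.5)

def f(t, alpha, beta):
    # Illustrative integrand; substitute your real f(alpha, beta).
    return alpha * tf.exp(-beta * t)

def integral_0_to_x(x, alpha, beta, n=100):
    # Trapezoidal rule on n+1 equally spaced points in [0, x].
    # Everything here is a TensorFlow op, so gradients flow through.
    t = tf.linspace(0.0, x, n + 1)
    y = f(t, alpha, beta)
    dt = x / tf.cast(n, tf.float32)
    return dt * (tf.reduce_sum(y) - 0.5 * (y[0] + y[-1]))

x = tf.constant(2.0)
u_pred = tf.constant(1.2)  # stand-in for the network's prediction u(x)

with tf.GradientTape() as tape:
    residual = u_pred - integral_0_to_x(x, alpha, beta)
    loss = tf.square(residual)

# Both gradients are defined because the quadrature sum is differentiable.
grads = tape.gradient(loss, [alpha, beta])
```

The same idea carries over to add_loss: compute the residual with the quadrature sum inside the model's call, and the optimizer will update alpha and beta along with the network weights.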
