Conv2D Backprop Input Implementation

I am trying to understand how the Conv2DBackpropInput op works. Since I could not find a description of the algorithm in TF’s API Docs, I am trying to implement it as sequential code and compare against the op itself:

input_backprop_gold = tf.raw_ops.Conv2DBackpropInput(
    input_sizes=[N, H, W, C],
    filter=filter_np,
    out_backprop=out_backprop_np,
    strides=[1, stride, stride, 1],
    padding="VALID",
)

input_backprop_np = np.zeros(shape=(N, H, W, C), dtype=np.float32)
assert input_backprop_np.shape == input_backprop_gold.shape
for n in range(N):
    for p in range(P):
        for q in range(Q):
            for k in range(K):
                for r in range(R):
                    for s in range(S):
                        for c in range(C):
                            input_backprop_np[n, stride * p + r, stride * q + s, c] += filter_np[r, s, c, k] * \
                                                                                       out_backprop_np[n, p, q, k]
assert np.allclose(input_backprop_np, input_backprop_gold)

This seems to produce correct results, but I don’t understand why: the description of conv2d backprop input given in “How does Backpropagation work in a CNN?” on Medium says it should be an ordinary convolution of the out_backprop with the filter rotated by 180°.
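The two views are in fact the same computation. The following minimal NumPy sketch (my own illustration, not TF code: single channel, batch 1, stride 1, VALID padding) shows that the scatter-style loop nest above equals a “full” correlation of out_backprop with the 180°-rotated filter:

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, R, S = 6, 6, 3, 3
P, Q = H - R + 1, W - S + 1  # VALID output size, stride 1

filt = rng.standard_normal((R, S))
dout = rng.standard_normal((P, Q))  # out_backprop

# View 1: scatter, as in the loop nest above (single channel)
dinput_scatter = np.zeros((H, W))
for p in range(P):
    for q in range(Q):
        for r in range(R):
            for s in range(S):
                dinput_scatter[p + r, q + s] += filt[r, s] * dout[p, q]

# View 2: zero-pad dout by (R-1, S-1) on each side, then correlate
# with the filter rotated by 180 degrees ("full" convolution)
padded = np.pad(dout, ((R - 1, R - 1), (S - 1, S - 1)))
flipped = filt[::-1, ::-1]  # 180-degree rotation
dinput_conv = np.zeros((H, W))
for h in range(H):
    for w in range(W):
        dinput_conv[h, w] = np.sum(padded[h:h + R, w:w + S] * flipped)

assert np.allclose(dinput_scatter, dinput_conv)
```

Writing out the indices makes the equivalence visible: the scatter writes `filt[r, s] * dout[p, q]` into position `(p + r, q + s)`, so gathering all contributions to position `(h, w)` sums `filt[r, s] * dout[h - r, w - s]`, which is exactly the flipped-filter correlation.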

Do you have any resources that describe how the Conv2DBackpropInput op is implemented in TF (docs, sequential code, …)?

Hi @richard_wwu ,

The Conv2DBackpropInput operation computes the gradients of a 2D convolution with respect to its input. It is equivalent to the tf.nn.conv2d_transpose operation in TensorFlow.
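For stride > 1 the equivalence to a transposed convolution can be sketched as follows (my own single-channel NumPy illustration, not TF source: out_backprop is dilated by inserting stride-1 zeros between elements, then correlated with the 180°-rotated filter):

```python
import numpy as np

rng = np.random.default_rng(1)
H, W, R, S, stride = 7, 7, 3, 3, 2
P = (H - R) // stride + 1  # VALID output size
Q = (W - S) // stride + 1

filt = rng.standard_normal((R, S))
dout = rng.standard_normal((P, Q))  # out_backprop

# Scatter view, as in the question's loop nest (single channel)
dinput_scatter = np.zeros((H, W))
for p in range(P):
    for q in range(Q):
        for r in range(R):
            for s in range(S):
                dinput_scatter[stride * p + r, stride * q + s] += filt[r, s] * dout[p, q]

# Transposed-convolution view: insert stride-1 zeros between dout
# elements, zero-pad, then correlate with the 180-degree-rotated filter
dilated = np.zeros((stride * (P - 1) + 1, stride * (Q - 1) + 1))
dilated[::stride, ::stride] = dout
padded = np.pad(dilated, ((R - 1, R - 1), (S - 1, S - 1)))
flipped = filt[::-1, ::-1]
dinput_t = np.zeros((H, W))
for h in range(H):
    for w in range(W):
        dinput_t[h, w] = np.sum(padded[h:h + R, w:w + S] * flipped)

assert np.allclose(dinput_scatter, dinput_t)
```

This zero-insertion step is why the gradient op and conv2d_transpose compute the same thing even for strided convolutions.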

You can find more details in this documentation and in the GitHub source code implementation.

Please let me know if this helps.


Hi @Laxma_Reddy_Patlolla ,

Many thanks for the links. Since the docs are rather general, do you know where to find the implementation of gen_nn_ops.conv2d_backprop_input? (tensorflow/ at 0db597d0d758aba578783b5bf46c889700a45085 · tensorflow/tensorflow · GitHub)