MirrorPadGrad unsupported in XLA/TPU?


Is reflection padding unsupported on TPUs?

Borrowing code from this blog post, I am trying to use a ReflectionPadding2D() layer inside my model and train it on a TPU.

The layer is defined like so:

from tensorflow import pad
from tensorflow.keras.layers import Layer

class ReflectionPadding2D(Layer):
    '''
    2D reflection padding.
    Attributes:
      - padding: (padding_width, padding_height) tuple
    '''
    def __init__(self, padding=(1, 1), **kwargs):
        self.padding = tuple(padding)
        super(ReflectionPadding2D, self).__init__(**kwargs)

    def compute_output_shape(self, input_shape):
        # NHWC: height (axis 1) grows by 2 * padding_height, width (axis 2)
        # by 2 * padding_width; batch and channel dimensions are unchanged.
        return (input_shape[0],
                input_shape[1] + 2 * self.padding[1],
                input_shape[2] + 2 * self.padding[0],
                input_shape[3])

    def call(self, input_tensor, mask=None):
        padding_width, padding_height = self.padding
        # Pad height (axis 1) and width (axis 2) in 'REFLECT' mode.
        return pad(input_tensor,
                   [[0, 0],
                    [padding_height, padding_height],
                    [padding_width, padding_width],
                    [0, 0]],
                   'REFLECT')
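
For context, here is a minimal sketch of how such a layer gets exercised on a TPU; the model, shapes, and data below are placeholders rather than the actual model from my project, and the failure only shows up once fit() triggers the backward pass:

import numpy as np
import tensorflow as tf

# Placeholder TPU setup; resolver arguments may differ on Colab vs. Cloud TPU.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    inputs = tf.keras.Input(shape=(32, 32, 3))
    x = ReflectionPadding2D(padding=(1, 1))(inputs)
    x = tf.keras.layers.Conv2D(8, 3)(x)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    outputs = tf.keras.layers.Dense(1)(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer='adam', loss='mse')

# Training (not inference) is what needs the gradient of MirrorPad.
model.fit(np.random.rand(8, 32, 32, 3), np.random.rand(8, 1), epochs=1)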

Here’s the gist of the error trace:

Is there a central place where TPU/XLA unsupported ops are discussed/tracked?


For CPU and GPU in tf2xla we had something like this (but it is unmaintained).


There is a list here: Available TensorFlow Ops | Cloud TPU | Google Cloud. MirrorPadGrad is not listed, but MirrorPad is.
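
If the failure really is the missing MirrorPadGrad lowering, one possible workaround (a sketch of my own, not from the original post) is to build the reflection pad out of slice/reverse/concat ops instead of calling tf.pad with 'REFLECT', so the backward pass never needs MirrorPadGrad. The function name reflect_pad_2d is illustrative:

import tensorflow as tf

def reflect_pad_2d(x, padding=(1, 1)):
    # Assumes NHWC layout and padding = (padding_width, padding_height),
    # matching the layer above. Like 'REFLECT' mode, the border row/column
    # itself is not repeated. Requires padding smaller than the spatial dims.
    pw, ph = padding
    # Pad height (axis 1).
    top = tf.reverse(x[:, 1:ph + 1, :, :], axis=[1])
    bottom = tf.reverse(x[:, -ph - 1:-1, :, :], axis=[1])
    x = tf.concat([top, x, bottom], axis=1)
    # Pad width (axis 2).
    left = tf.reverse(x[:, :, 1:pw + 1, :], axis=[2])
    right = tf.reverse(x[:, :, -pw - 1:-1, :], axis=[2])
    return tf.concat([left, x, right], axis=2)

Swapping this into ReflectionPadding2D.call should give the same forward values as tf.pad(..., 'REFLECT') for paddings smaller than the spatial dimensions, while keeping the graph to ops whose gradients XLA can lower.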