Broadcasting for Ragged Tensors

I am trying to change a function I wrote so that it works for ragged tensors in addition to normal (dense) tensors. One issue I ran into is that broadcasting does not seem to be supported for ragged tensors. As a minimal example (Python 3.7, TF 2.8.2), consider the following:

import tensorflow as tf

tf.linalg.matvec(tf.constant([[[0, 0, 1], [0, 1, 0], [1, 0, 0]]]),
                 tf.ragged.constant([[1, 1, 0], [0, 0, 1]]))
# InvalidArgumentError: Batch dimensions of `a` and `b` do not have the same size.
tf.linalg.matvec(tf.constant([[[0, 0, 1], [0, 1, 0], [1, 0, 0]]]),
                 tf.constant([[1, 1, 0], [0, 0, 1]]))
# <tf.Tensor: shape=(1, 2, 3), dtype=int32, numpy=array([[[0, 1, 1],[1, 0, 0]]], dtype=int32)>

My first question is: why does this not work in the ragged case? Is there a fundamental reason I am not seeing, or has it just not been implemented?
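For context, elementwise ops do seem to broadcast against ragged tensors, at least in the cases I tried, which is part of why the matvec behavior surprised me:

```python
import tensorflow as tf

x = tf.ragged.constant([[1, 2, 3], [4, 5]])  # ragged, shape [2, None]
print(x + 1)                          # a scalar broadcasts over all values
print(x + tf.constant([[10], [20]]))  # a [2, 1] dense tensor broadcasts per row
```

So at least some broadcasting machinery exists for ragged tensors; it just does not seem to reach the batch dimensions of matmul/matvec.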

In my case, I have a ragged tensor X of shape [B, None, None, C] representing a batch of B images of different sizes but the same number of channels C = 3. I'm trying to apply one linear transformation per image to every pixel of that image, represented by a dense tensor A of shape [B, 1, 1, C, C], as follows:

Y = tf.linalg.matvec(A, X)
# InvalidArgumentError: Batch dimensions of `a` and `b` do not have the same size.

So my second question is: what would be a good way to do this instead?
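For what it's worth, the best workaround I have found so far sidesteps broadcasting entirely: flatten the two ragged pixel dimensions, gather the per-image matrix for every pixel, and apply matvec to the flat values. A sketch with made-up toy data (the real A and X would come from my pipeline):

```python
import tensorflow as tf

# Toy stand-ins: B = 2 images, C = 3 channels.
A = tf.constant([
    [[[[0., 0., 1.], [0., 1., 0.], [1., 0., 0.]]]],  # image 0: reverse channels
    [[[[1., 0., 0.], [0., 1., 0.], [0., 0., 1.]]]],  # image 1: identity
])                                                   # shape [B, 1, 1, C, C]
X = tf.ragged.constant(
    [[[[1., 2., 3.], [4., 5., 6.]]],                 # image 0: 1x2 pixels
     [[[7., 8., 9.]], [[10., 11., 12.]]]],           # image 1: 2x1 pixels
    ragged_rank=2)                                   # shape [B, None, None, C]

pixels = X.merge_dims(1, 2)        # ragged [B, None, C]: one flat pixel axis
flat = pixels.flat_values          # dense [total_pixels, C]
ids = pixels.value_rowids()        # image index of each pixel, here [0, 0, 1, 1]
A_flat = tf.gather(tf.squeeze(A, axis=[1, 2]), ids)     # [total_pixels, C, C]
Y = X.with_flat_values(tf.linalg.matvec(A_flat, flat))  # same ragged shape as X
```

This gives the right values, but materializing one C x C matrix per pixel feels wasteful, so I would be happy to hear about something cleaner.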

Thanks for any advice!