output_shape() throws an error in tf.keras.layers.Attention

I am seeing the errors below. Am I doing something wrong, or should I file a bug report in the repo?

In [79]: attn = tf.keras.layers.Attention()
    ...: output = attn(inputs = [np.random.rand(1, 2, 2), np.random.rand(1, 2, 8), np.random.rand(1, 2, 2)])
    ...: output.output_shape()
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-79-0e91a2f8d1ae> in <module>
      1 attn = tf.keras.layers.Attention()
      2 output = attn(inputs = [np.random.rand(1, 2, 2), np.random.rand(1, 2, 8), np.random.rand(1, 2, 2)])
----> 3 output.output_shape()

/opt/conda/lib/python3.7/site-packages/tensorflow/python/framework/ops.py in __getattr__(self, name)
    511         from tensorflow.python.ops.numpy_ops import np_config
    512         np_config.enable_numpy_behavior()""".format(type(self).__name__, name))
--> 513     self.__getattribute__(name)
    514 
    515   @staticmethod

AttributeError: 'tensorflow.python.framework.ops.EagerTensor' object has no attribute 'output_shape'

In [80]: attn = tf.keras.layers.Attention()
    ...: output = attn(inputs = [np.random.rand(1, 2, 2), np.random.rand(1, 2, 8), np.random.rand(1, 2, 2)])
    ...: attn.output_shape()
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-80-9646783aedac> in <module>
      1 attn = tf.keras.layers.Attention()
      2 output = attn(inputs = [np.random.rand(1, 2, 2), np.random.rand(1, 2, 8), np.random.rand(1, 2, 2)])
----> 3 attn.output_shape()

/opt/conda/lib/python3.7/site-packages/keras/engine/base_layer.py in output_shape(self)
   2244     """
   2245     if not self._inbound_nodes:
-> 2246       raise AttributeError(f'The layer "{self.name}" has never been called '
   2247                            'and thus has no defined output shape.')
   2248     all_output_shapes = set(

AttributeError: The layer "attention_10" has never been called and thus has no defined output shape.

@Md_Iftekhar_Tanveer,

Welcome to the TensorFlow Forum!

You can view the output shape through the returned tensor's shape attribute, as shown below:

import tensorflow as tf
import numpy as np

# inputs = [query, value, key]; the result has shape
# (batch, query_timesteps, value_dim) = (1, 2, 8)
attn = tf.keras.layers.Attention()
output = attn(inputs=[np.random.rand(1, 2, 2), np.random.rand(1, 2, 8), np.random.rand(1, 2, 2)])
print(output.shape)

Output:

(1, 2, 8)
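
As a side note, output_shape is a property rather than a method on the tf.keras.layers.Layer base class shown in your traceback, and it is only defined once the layer has recorded an inbound node, for example when it is called on symbolic tf.keras.Input tensors. Here is a minimal sketch, assuming a recent TF 2.x release where the property is still available; the variable names are only for illustration:

import tensorflow as tf

# Symbolic inputs so the layer records an inbound node
query = tf.keras.Input(shape=(2, 2))
value = tf.keras.Input(shape=(2, 8))
key = tf.keras.Input(shape=(2, 2))

attn = tf.keras.layers.Attention()
output = attn([query, value, key])

print(attn.output_shape)  # (None, 2, 8) - a property, so no parentheses
print(output.shape)       # (None, 2, 8)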

Thank you for your reply.

I was getting the error in TensorFlow 2.8.0. Updating the package solved the problem.