Is it possible to add an embedding layer [n, N] with a dense layer [1, N]?

Or two embedding layers having the same number of rows but different column sizes, eg. embd1 [10, N] and embd2 [3, N]?

Yes, you can add a Dense layer after an Embedding layer.

**Working sample code**

```python
import tensorflow as tf
import numpy as np

model = tf.keras.Sequential()
model.add(tf.keras.layers.Embedding(1000, 64, input_length=10))
model.add(tf.keras.layers.Dense(64, activation='relu'))
model.add(tf.keras.layers.Dense(1))

input_array = np.random.randint(1000, size=(32, 10))
model.compile('rmsprop', 'mse')
output_array = model.predict(input_array)
print(output_array.shape)
```

**Output**

```
(32, 10, 1)
```

Oh, there is a slight misunderstanding. I meant embedding layer + dense layer, i.e. the addition of the two layers' outputs.

Sorry if I'm introducing even more misunderstanding.

If you want to add the outputs of two layers (meaning to sum them elementwise), you can use `tf.keras.layers.Add` (TensorFlow Core v2.8.0).

The tricky thing here is that it must be used on tensors of the same shape. But an Embedding layer outputs a 2D tensor per sample ([sequence_length, embedding_dim]), while a Dense layer usually outputs a 1D tensor per sample. So you have to either reshape the Dense output or apply average/mean pooling to the Embedding output before adding them together.
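A minimal sketch of the pooling approach, assuming a functional model with two hypothetical inputs (`int_in` for token IDs, `vec_in` for a plain feature vector): pooling the Embedding output over the sequence axis turns (batch, 10, 64) into (batch, 64), which matches the Dense output so `Add` accepts them.

```python
import numpy as np
import tensorflow as tf

# Two hypothetical inputs: integer tokens and a dense feature vector.
int_in = tf.keras.Input(shape=(10,), dtype="int32")
vec_in = tf.keras.Input(shape=(5,))

emb = tf.keras.layers.Embedding(1000, 64)(int_in)          # (batch, 10, 64)
pooled = tf.keras.layers.GlobalAveragePooling1D()(emb)     # (batch, 64)
dense = tf.keras.layers.Dense(64)(vec_in)                  # (batch, 64)

# Shapes now match, so elementwise addition is valid.
summed = tf.keras.layers.Add()([pooled, dense])            # (batch, 64)
model = tf.keras.Model([int_in, vec_in], summed)

out = model.predict([np.random.randint(1000, size=(32, 10)),
                     np.random.rand(32, 5).astype("float32")])
print(out.shape)  # (32, 64)
```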

Alternatively, you can concatenate the two tensors with `tf.keras.layers.concatenate` (TensorFlow Core v2.8.0).

Here you’ll face the same problem with different shapes and have to make them compatible.
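A sketch of the concatenation route, under the same hypothetical two-input setup: with concatenation the last-axis sizes may differ, but the remaining axes must still match, so the Embedding output is again pooled down to (batch, 64) before merging.

```python
import numpy as np
import tensorflow as tf

int_in = tf.keras.Input(shape=(10,), dtype="int32")
vec_in = tf.keras.Input(shape=(5,))

emb = tf.keras.layers.Embedding(1000, 64)(int_in)       # (batch, 10, 64)
pooled = tf.keras.layers.GlobalAveragePooling1D()(emb)  # (batch, 64)
dense = tf.keras.layers.Dense(32)(vec_in)               # (batch, 32)

# Last-axis sizes differ (64 vs 32); concatenation joins them along it.
merged = tf.keras.layers.concatenate([pooled, dense])   # (batch, 96)
model = tf.keras.Model([int_in, vec_in], merged)

out = model.predict([np.random.randint(1000, size=(8, 10)),
                     np.random.rand(8, 5).astype("float32")])
print(out.shape)  # (8, 96)
```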


Thanks for the reply. I was wondering about the same problem: adding a 3D embedding tensor to another 3D dense tensor works fine, but I'm not sure how to add a 3D embedding tensor to a 1D dense tensor.

The `tf.reduce_sum()` function can also be used to eliminate one or more dimensions of a multi-dimensional tensor.
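As a quick illustration (with made-up tensors standing in for the layer outputs), `tf.reduce_sum` over the sequence axis collapses a (batch, seq, dim) embedding output to (batch, dim), after which an elementwise add with a dense output is valid:

```python
import tensorflow as tf

emb_out = tf.random.normal((32, 10, 64))     # stand-in for an Embedding output
dense_out = tf.random.normal((32, 64))       # stand-in for a Dense output

# Sum over axis 1 (the sequence axis) to drop it: (32, 10, 64) -> (32, 64).
summed_emb = tf.reduce_sum(emb_out, axis=1)

# Shapes now match, so plain elementwise addition works.
total = summed_emb + dense_out
print(total.shape)  # (32, 64)
```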