Hi everyone!

I’m new to TensorFlow and I’m working with a particular representation. I need to add an external feature, but at the end of the model, for example:

```
input_1 = Input(shape=(train_s1_seq.shape[1],))
input_2 = Input(shape=(train_s2_seq.shape[1],))
input_3 = Input(shape=(dist_text.shape[1],))
```

I need input_3 to be joined in at the end via a concatenation.

```
common_embed = Embedding(name="synopsis_embedd",
                         input_dim=len(t.word_index) + 1,
                         output_dim=len(embeddings_index["x"]),
                         weights=[embedding_matrix],
                         input_length=train_s1_seq.shape[1],
                         trainable=False)

lstm_1 = common_embed(input_1)
lstm_2 = common_embed(input_2)
```

## Bi-LSTM

```
common_lstm = Bidirectional(LSTM(tot_neuronas_lstm, return_sequences=True,
                                 activation="relu"))

vector_1 = common_lstm(lstm_1)
vector_1 = Flatten()(vector_1)

vector_2 = common_lstm(lstm_2)
vector_2 = Flatten()(vector_2)
```

I can combine vector_1 and vector_2, which both come out of the shared Embedding and Bi-LSTM layers. The problem is with input_3: when I pass it directly to the concatenation, I get the following error:

```
ValueError: Graph disconnected: cannot obtain value for tensor KerasTensor(type_spec=TensorSpec(shape=(None, 3), dtype=tf.float32, name='input_144'), name='input_144', description="created by layer 'input_144'") at layer "concatenate_47". The following previous layers were accessed without issue: ['synopsis_embedd', 'synopsis_embedd', 'bidirectional_47', 'bidirectional_47', 'flatten_94', 'flatten_95', 'multiply_142', 'multiply_143', 'subtract_94', 'multiply_141', 'subtract_95', 'lambda_47']
```

## Concatenate

```
conc = Concatenate(axis=-1)([input_3, vector_1, vector_2])
```
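To make this reproducible, here is a minimal self-contained sketch of the architecture described above. The sizes (vocab_size, embed_dim, seq_len, n_extra, tot_neuronas_lstm) are dummy values chosen only for illustration, the embedding is left randomly initialized instead of loading embedding_matrix, and input_length is omitted since newer Keras versions no longer accept it:

```python
import numpy as np
from tensorflow.keras.layers import (Input, Embedding, LSTM, Bidirectional,
                                     Flatten, Concatenate)
from tensorflow.keras.models import Model

# Dummy dimensions, for illustration only
vocab_size = 100        # stands in for len(t.word_index) + 1
embed_dim = 8           # stands in for len(embeddings_index["x"])
seq_len = 10            # stands in for train_s1_seq.shape[1]
n_extra = 3             # stands in for dist_text.shape[1]
tot_neuronas_lstm = 4

input_1 = Input(shape=(seq_len,))
input_2 = Input(shape=(seq_len,))
input_3 = Input(shape=(n_extra,))

# Shared embedding (randomly initialized here instead of a pretrained matrix)
common_embed = Embedding(name="synopsis_embedd",
                         input_dim=vocab_size,
                         output_dim=embed_dim,
                         trainable=False)
lstm_1 = common_embed(input_1)
lstm_2 = common_embed(input_2)

# Shared Bi-LSTM applied to both embedded sequences
common_lstm = Bidirectional(LSTM(tot_neuronas_lstm, return_sequences=True,
                                 activation="relu"))
vector_1 = Flatten()(common_lstm(lstm_1))
vector_2 = Flatten()(common_lstm(lstm_2))

# Concatenate the extra feature vector with both sequence vectors
conc = Concatenate(axis=-1)([input_3, vector_1, vector_2])

# All three Input tensors are listed as model inputs
model = Model(inputs=[input_1, input_2, input_3], outputs=conc)
```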

Can you help me solve this problem? I really appreciate any help.

Greetings,

CarlosJ.