Problem "Graph disconnected"

Hi everyone!

I’m new to TensorFlow and I’m working with a particular representation. I need to add external features, but at the end of the model, for example:

input_1 = Input(shape=(train_s1_seq.shape[1],))
input_2 = Input(shape=(train_s2_seq.shape[1],))
input_3 = Input(shape=(dist_text.shape[1],))

I need input_3 to be added at the end, via concatenation.

common_embed = Embedding(name="synopsis_embedd", input_dim=len(t.word_index)+1, ...)  # remaining arguments were cut off in the post
lstm_1 = common_embed(input_1)
lstm_2 = common_embed(input_2)


common_lstm = Bidirectional(LSTM(tot_neuronas_lstm, return_sequences=True, activation="relu"))
vector_1 = common_lstm(lstm_1)
vector_1 = Flatten()(vector_1)
vector_2 = common_lstm(lstm_2)
vector_2 = Flatten()(vector_2)

I can concatenate vector_1 and vector_2, which come from the Embedding and LSTM layers. The problem is with input_3: when I pass it directly to Concatenate, I get the following error:

ValueError: Graph disconnected: cannot obtain value for tensor KerasTensor(type_spec=TensorSpec(shape=(None, 3), dtype=tf.float32, name='input_144'), name='input_144', description="created by layer 'input_144'") at layer "concatenate_47". The following previous layers were accessed without issue: ['synopsis_embedd', 'synopsis_embedd', 'bidirectional_47', 'bidirectional_47', 'flatten_94', 'flatten_95', 'multiply_142', 'multiply_143', 'subtract_94', 'multiply_141', 'subtract_95', 'lambda_47']


conc = Concatenate(axis=-1)([input_3, vector_1, vector_2])

Can anyone help me solve this problem? I really appreciate the help.


You can visualize the graph with tf.keras.utils.plot_model(model, show_shapes=True, show_dtype=True). It can help you find disconnected parts of the graph or errors in the model architecture.
You say the model has 3 input layers, but the ValueError complains about missing values for a layer named "input_144". It seems the model has created at least 144 Input layers, probably from re-running cells that create new Input tensors, so the tensor passed to Concatenate may not be the same input_3 tensor passed to the Model.
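If it helps, here is a minimal, hypothetical sketch (the vocabulary size, sequence length, embedding dimension, and layer sizes are invented for illustration) of how the three Input tensors can be wired so that the exact same input_3 tensor reaches both the Concatenate layer and Model(inputs=...):

```python
import tensorflow as tf
from tensorflow.keras.layers import (Input, Embedding, Bidirectional, LSTM,
                                     Flatten, Concatenate, Dense)
from tensorflow.keras.models import Model

# Hypothetical sizes, standing in for the real data shapes.
vocab_size, seq_len, dist_feats, lstm_units = 100, 10, 3, 16

input_1 = Input(shape=(seq_len,))
input_2 = Input(shape=(seq_len,))
input_3 = Input(shape=(dist_feats,))  # this exact tensor must reach Model(inputs=...)

# Shared embedding and shared BiLSTM, as in the original snippet.
common_embed = Embedding(input_dim=vocab_size, output_dim=32, name="synopsis_embedd")
common_lstm = Bidirectional(LSTM(lstm_units, return_sequences=True, activation="relu"))

vector_1 = Flatten()(common_lstm(common_embed(input_1)))
vector_2 = Flatten()(common_lstm(common_embed(input_2)))

conc = Concatenate(axis=-1)([input_3, vector_1, vector_2])
out = Dense(1, activation="sigmoid")(conc)

# All three Input tensors must be listed here; a stale Input created by an
# earlier cell run (e.g. "input_144") would leave the graph disconnected.
model = Model(inputs=[input_1, input_2, input_3], outputs=out)
model.summary()
```

The key point is that the Input object passed to Concatenate and the one listed in Model(inputs=...) must be the same Python object, not two Inputs with the same shape.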


Thanks, @Ekaterina_Dranitsyna :grin:. Now I can see the structure and I no longer have the same issue.

In the next step, it shows another issue :grimacing:

Epoch 1/15
InvalidArgumentError                      Traceback (most recent call last)
<ipython-input-148-dc3ca411f24b> in <module>()
     41   [train_s1_seq,train_s2_seq, train_dist],y_train.values.reshape(-1,1), epochs = num_epochs,
---> 42                                 batch_size=val_batch,validation_data=([val_s1_seq, val_s2_seq, test_dist],y_val.values.reshape(-1,1)))
     44             print("Train:")

6 frames
/usr/local/lib/python3.7/dist-packages/tensorflow/python/eager/ in quick_execute(op_name, num_outputs, inputs, attrs, ctx, name)
     58     ctx.ensure_initialized()
     59     tensors = pywrap_tfe.TFE_Py_Execute(ctx._handle, device_name, op_name,
---> 60                                         inputs, attrs, num_outputs)
     61   except core._NotOkStatusException as e:
     62     if name is not None:

InvalidArgumentError:  indices[1,2] = 62 is not in [0, 3)
	 [[node model_28/text_dist/embedding_lookup (defined at <ipython-input-148-dc3ca411f24b>:42) ]] [Op:__inference_train_function_57239]

Errors may have originated from an input operation.
Input Source operations connected to node model_28/text_dist/embedding_lookup:
 model_28/text_dist/embedding_lookup/54789 (defined at /usr/lib/python3.7/

Function call stack:

Any idea what’s going on?

My structure is:


The Embedding layer expects to receive values in the range [0, 3) but got the number 62 instead and doesn’t know what to do with it.

As @Ekaterina_Dranitsyna pointed out, your input values must be bounded within that range [0, 3). I’d suggest going through the documentation of the Embedding layer to thoroughly understand what each of its arguments expects in terms of values:
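To make the failure mode concrete, here is a small NumPy sketch (the table sizes are hypothetical, not taken from the actual model) of what an embedding lookup does with an out-of-range index, and how sizing input_dim from the data avoids it:

```python
import numpy as np

# An Embedding layer with input_dim=3 owns a lookup table with exactly
# 3 rows, so only indices 0, 1, and 2 are valid.
vocab_rows = 3
table = np.random.rand(vocab_rows, 8)  # shape: (input_dim, output_dim)

def embedding_lookup(indices):
    """Mimic the bounds check TensorFlow performs on embedding indices."""
    indices = np.asarray(indices)
    if indices.max() >= vocab_rows:
        raise ValueError(f"index {indices.max()} is not in [0, {vocab_rows})")
    return table[indices]

# Index 62 reproduces the InvalidArgumentError in spirit:
try:
    embedding_lookup([[0, 1, 62]])
except ValueError as e:
    print(e)  # index 62 is not in [0, 3)

# Fix: size the table from the data, e.g. input_dim = max index + 1.
vocab_rows = 63
table = np.random.rand(vocab_rows, 8)
out = embedding_lookup([[0, 1, 62]])
print(out.shape)  # (1, 3, 8)
```

In Keras terms, that means input_dim must be at least max(token_id) + 1 for every sequence fed to that Embedding layer, which is why len(t.word_index) + 1 is used for the text inputs.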


Thanks very much for the help, @Ekaterina_Dranitsyna and @Sayak_Paul. It runs fine now.