TypeError: dataset length is unknown tensorflow

I was playing with the tf.data function tf.data.Dataset.from_generator(), which takes an ImageDataGenerator object and turns it into a Dataset object. Everything was fine, but when I fit my model I can't use steps_per_epoch and validation_steps; including them throws an error:

TypeError: dataset length is unknown.

Then I commented them out and continued to fit the model, but it trains seemingly forever; there is no stopping point. I have attached my images.

When I use .from_generator() it converts the ImageDataGenerator to a Dataset object, but I'm not sure why it's still being treated as a generator and throwing the error.

Any help on this?


Looping in @Andrew_Audibert :+1:

I also checked for similar issues here (keyword: tf.data.Dataset.from_generator - Issues · tensorflow/tensorflow · GitHub) in case someone has encountered this before.


Just to extend the official example in the documentation:

import tensorflow as tf

def gen():
  ragged_tensor = tf.ragged.constant([[1, 2], [3]])
  yield 42, ragged_tensor

dataset = tf.data.Dataset.from_generator(
    gen,
    output_signature=(
        tf.TensorSpec(shape=(), dtype=tf.int32),
        tf.RaggedTensorSpec(shape=(2, None), dtype=tf.int32)))

You will see that the cardinality is -2, that is tf.data.experimental.UNKNOWN_CARDINALITY.
tf.data can't cheaply estimate the cardinality of datasets created with from_generator.

You can use tf.data.experimental.assert_cardinality to set the cardinality manually.
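A minimal sketch of inspecting and then asserting the cardinality, assuming a plain integer generator in place of the ImageDataGenerator pipeline:

```python
import tensorflow as tf

def gen():
    for i in range(100):
        yield i

dataset = tf.data.Dataset.from_generator(
    gen, output_signature=tf.TensorSpec(shape=(), dtype=tf.int32))

# from_generator can't know in advance how many elements gen() will yield:
cardinality = int(tf.data.experimental.cardinality(dataset))
print(cardinality)  # -2, i.e. tf.data.experimental.UNKNOWN_CARDINALITY

# Tell tf.data the true element count; len() now works.
dataset = dataset.apply(tf.data.experimental.assert_cardinality(100))
print(len(dataset))  # 100
```

Note that assert_cardinality does not count the elements for you; if the number you pass is wrong, an error is raised when the dataset is exhausted.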


Thanks, will do that.

Also, the weird thing is that I got it working by setting steps_per_epoch and validation_steps manually:

model.fit(train_dataset_gen, epochs=3,
          steps_per_epoch=62.5,
          validation_data=valid_dataset_gen,
          validation_steps=10)


Epoch 1/3
Found 2000 images belonging to 2 classes.
63/62 [==============================] - ETA: -1s - loss: 0.6973 - accuracy: 0.4990Found 1000 images belonging to 2 classes.
62/62 [==============================] - 189s 3s/step - loss: 0.6973 - accuracy: 0.4990 - val_loss: 0.6936 - val_accuracy: 0.5063
Epoch 2/3
63/62 [==============================] - ETA: -1s - loss: 0.6958 - accuracy: 0.4830Found 1000 images belonging to 2 classes.
62/62 [==============================] - 188s 3s/step - loss: 0.6958 - accuracy: 0.4830 - val_loss: 0.6921 - val_accuracy: 0.5250
Epoch 3/3
63/62 [==============================] - ETA: -1s - loss: 0.6950 - accuracy: 0.5025Found 1000 images belonging to 2 classes.
62/62 [==============================] - 188s 3s/step - loss: 0.6950 - accuracy: 0.5025 - val_loss: 0.6921 - val_accuracy: 0.5188
<tensorflow.python.keras.callbacks.History at 0x7f427aacfdd0>

But I'm not sure whether that's the right way of doing things.

All I know is that we can't use the len() function on a generator, but tf.data.Dataset.from_generator() returns a dataset, right?
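For reference, the len() limitation on plain Python generators can be seen without TensorFlow at all:

```python
# A plain Python generator has no __len__, so len() raises TypeError.
def gen():
    yield 1
    yield 2

g = gen()
try:
    n = len(g)
except TypeError:
    n = None

print(n)  # None: a generator can't report a length without being consumed
```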


Yes, it returns a dataset.


Then this error is really suspicious.


Do you mean the error about len() or something else?


The error about len().


It's the same cardinality issue explained above: you don't have len() available with from_generator.


Thanks a lot, Bhack; with your help I was able to fix it.

It seems that during the conversion of the generator to a Dataset object, the length of the dataset is unknown. We can inspect this with tf.data.experimental.cardinality(): for a dataset built with from_generator() it returns -2, i.e. UNKNOWN_CARDINALITY, rather than the actual number of elements.

We can fix this by stating the number of elements explicitly with tf.data.experimental.assert_cardinality(num_of_samples), and then we can even use the len() function.
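Putting it together, here is a sketch using the numbers from the training log above (2000 training images; a batch size of 32 is an assumption, since 2000 / 32 = 62.5 matches the steps_per_epoch value used earlier). A plain integer generator stands in for the real ImageDataGenerator pipeline:

```python
import tensorflow as tf

# Numbers from the log above; batch size 32 is assumed (2000 / 32 = 62.5).
num_images, batch_size = 2000, 32
print(num_images / batch_size)  # 62.5

# Stand-in generator dataset (the real one wraps an ImageDataGenerator).
def gen():
    for i in range(num_images):
        yield i

dataset = tf.data.Dataset.from_generator(
    gen, output_signature=tf.TensorSpec(shape=(), dtype=tf.int32))
dataset = dataset.batch(batch_size)

# 2000 images in batches of 32 gives 63 batches (the last one is partial).
dataset = dataset.apply(tf.data.experimental.assert_cardinality(63))
print(len(dataset))  # 63
```

With the cardinality asserted on the batched dataset, model.fit can infer the epoch length itself, so steps_per_epoch no longer needs to be passed manually.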

I have shared the link to the complete notebook here: Google Colab