NMT with attention bug

Hello, I have a problem with this tutorial.

I ran it a couple of times before (even yesterday) and everything worked, but when I tried running it today on Google Colab I got an error. Does anyone else have this problem? Can you please help? This is the error:

# Convert the target sequence, and collect the "[START]" tokens

example_output_tokens = output_text_processor(example_target_batch)

start_index = output_text_processor._index_lookup_layer('[START]').numpy()

first_token = tf.constant([[start_index]] * example_output_tokens.shape[0])


AttributeError                            Traceback (most recent call last)
<ipython-input> in <module>()
      2 example_output_tokens = output_text_processor(example_target_batch)
      3
----> 4 start_index = output_text_processor._index_lookup_layer('[START]').numpy()
      5 first_token = tf.constant([[start_index]] * example_output_tokens.shape[0])

2 frames
/usr/local/lib/python3.7/dist-packages/keras/layers/preprocessing/index_lookup.py in _standardize_inputs(self, inputs, dtype)
    734     if isinstance(inputs, (list, tuple, np.ndarray)):
    735       inputs = tf.convert_to_tensor(inputs)
--> 736     if inputs.dtype != dtype:
    737       inputs = tf.cast(inputs, dtype)
    738     return inputs

AttributeError: 'str' object has no attribute 'dtype'
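From the traceback, `_standardize_inputs` only converts lists, tuples, and NumPy arrays to tensors, so a bare Python `str` reaches the `inputs.dtype` check unconverted and fails. A likely workaround is to wrap the token in a tensor before calling the layer, or to avoid the private `_index_lookup_layer` entirely and use the public vocabulary instead. Here is a minimal sketch using a stand-in `StringLookup` layer (the vocabulary below is hypothetical, not the tutorial's):

```python
import tensorflow as tf

# Hypothetical stand-in vocabulary; the tutorial's output_text_processor
# builds its own vocabulary from the target corpus.
lookup = tf.keras.layers.StringLookup(
    vocabulary=['[START]', '[END]', 'hola'], mask_token=None)

# Passing a bare str can hit the AttributeError above; wrapping the token
# in a tensor sidesteps the str branch of _standardize_inputs.
start_index = int(lookup(tf.constant('[START]')).numpy())

# Alternatively, skip the private layer and use the public vocabulary:
# index 0 is the default [UNK] token, so [START] lands at index 1 here.
vocab = lookup.get_vocabulary()
start_index_alt = vocab.index('[START]')

print(start_index, start_index_alt)
```

In the tutorial's code this would correspond to something like `output_text_processor._index_lookup_layer(tf.constant('[START]'))`, or `output_text_processor.get_vocabulary().index('[START]')` if the processor is a `TextVectorization` layer.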

Please refer to the simplified and modernized version of the Neural machine translation with attention tutorial. Thank you.