Save Keras Tokenizer in distributed LSTM training

I have developed an LSTM model with TensorFlow and Keras.
The model is trained with the Ray framework on a distributed dataset.
I can save the model from the chief node.
But what about the Keras Tokenizer? It is created on every worker node.
If I save it only from the chief node, I lose part of the vocabulary, which affects serving.
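A simplified sketch of what each worker currently does (the helper name, the shard variable, and the file name are just placeholders, not my exact code):

```python
# Simplified sketch only -- names are illustrative, not the real training code.
from tensorflow.keras.preprocessing.text import Tokenizer

def train_on_shard(text_shard):
    # Each worker fits its own Tokenizer on its local data shard only,
    # so every worker ends up with a different word_index.
    tokenizer = Tokenizer(oov_token="<OOV>")
    tokenizer.fit_on_texts(text_shard)
    sequences = tokenizer.texts_to_sequences(text_shard)
    # ... build and train the LSTM on `sequences` ...
    return tokenizer

# The chief node sees only its own shard of the text data.
chief_text_shard = ["example sentence seen on the chief node"]
chief_tokenizer = train_on_shard(chief_text_shard)

# Saving only the chief's tokenizer loses words that appeared
# only on the other workers' shards.
with open("tokenizer.json", "w") as f:
    f.write(chief_tokenizer.to_json())
```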

Please help.

Hi @Ritapa_Kundu

Welcome to the TensorFlow Forum!

The given information is not enough to diagnose the problem. Could you please share reproducible code so we can replicate and understand the issue better? Thank you.