Distributed training with data dictionary input

I want to train a custom model on multiple GPUs using tf.distribute.MirroredStrategy.
My input data is generated by a subclass of keras.utils.Sequence, and each batch has the structure of a nested dictionary:
tf_batch_data: Dict[Text, Dict[Text, List[tf.Tensor]]]
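For reference, a minimal sketch of what one such batch looks like (the feature names and shapes here are placeholders, not my real ones):

```python
import numpy as np
import tensorflow as tf

def make_batch(batch_size=4):
    # One batch in the nested-dict shape described above:
    # Dict[Text, Dict[Text, List[tf.Tensor]]]
    return {
        "text": {
            "sentence": [tf.constant(np.zeros((batch_size, 10), dtype=np.float32))],
        },
        "label": {
            "ids": [tf.constant(np.zeros((batch_size, 1), dtype=np.int64))],
        },
    }

batch = make_batch()
# The model looks tensors up by string key, e.g.:
print(batch["text"]["sentence"][0].shape)  # (4, 10)
```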
However, as I read the documentation on distributing input data, a tf.data.Dataset is required in order to distribute data with tf.distribute.Strategy.experimental_distribute_dataset or tf.distribute.Strategy.distribute_datasets_from_function.
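As I understand it, tf.data.Dataset can itself carry nested dict structures, so one option might be to wrap the Sequence's output in a generator-backed dataset. A sketch of what I mean (feature names and shapes are placeholders):

```python
import numpy as np
import tensorflow as tf

def gen():
    # Stand-in for iterating over the Sequence; yields un-batched examples.
    for _ in range(8):
        yield {
            "text": {"sentence": np.zeros((10,), dtype=np.float32)},
            "label": {"ids": np.zeros((1,), dtype=np.int64)},
        }

# The output_signature mirrors the nested-dict structure, so string keys
# survive batching and distribution.
signature = {
    "text": {"sentence": tf.TensorSpec(shape=(10,), dtype=tf.float32)},
    "label": {"ids": tf.TensorSpec(shape=(1,), dtype=tf.int64)},
}

dataset = tf.data.Dataset.from_generator(gen, output_signature=signature).batch(4)

batch = next(iter(dataset))
print(batch["text"]["sentence"].shape)  # (4, 10) — keyed access still works

# This dataset could then presumably be passed to
# strategy.experimental_distribute_dataset(dataset).
```

But I'm not sure whether this is the intended approach, or whether it costs performance compared to a native tf.data pipeline.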
My question: is it possible to distribute my dictionary data to the model, given that the model indexes each batch by string key? Or does anyone know a better way to handle this? Thanks in advance!