Are tfrecords compatible with NLP teacher forcing?

Despite having read the documentation several times over and every ‘Unable to Parse’ post I could find, I am unable to make my NLP dataset work with tfrecords. My current theory is that the problem is the dictionary structure of a parsed tfrecord: teacher forcing requires elements shaped as (encoder_inputs, decoder_inputs), targets, and that tuple structure seems incompatible with dictionaries that require key access.
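For concreteness, this is the element structure I mean. A minimal sketch (toy tensors, made-up shapes) of what a seq2seq model expects from model.fit(), showing that tf.data itself has no problem with nested tuples:

```python
import tensorflow as tf

# Toy tensors; shapes are illustrative only.
encoder_inputs = tf.zeros((8, 20), dtype=tf.int64)  # (batch, src_len)
decoder_inputs = tf.zeros((8, 15), dtype=tf.int64)  # (batch, tgt_len), shifted right
targets        = tf.zeros((8, 15), dtype=tf.int64)  # (batch, tgt_len)

# Each dataset element comes out as ((encoder, decoder), target):
dataset = tf.data.Dataset.from_tensor_slices(
    ((encoder_inputs, decoder_inputs), targets)
)

for (enc, dec), tgt in dataset.take(1):
    print(enc.shape, dec.shape, tgt.shape)  # prints (20,) (15,) (15,)
```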

Is there a repo or tutorial which highlights this use-case?

I could post specific code, but I’m making this more about the topic: I cannot find a single example online that combines tfrecords with teacher forcing. If relevant, the repo is here:

Alright, the answer required looking into what exactly gets called, and with what arguments, when you use model.fit(). After switching to a manual training step, I learned that fit() passes the entire dataset element to train_step(), while call() only receives element [0] (the inputs). Knowing that, I was able to adjust the inputs to what they need to be. In the end, my parse function returned elements structured like this:

return ((
    parsed_example['encoder_input'],
    parsed_example['decoder_input']
), parsed_example['target'])
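For anyone landing on this later, here is a minimal end-to-end sketch of the approach. The feature names match my snippet above; the toy data, fixed feature lengths, and file path are made up for illustration:

```python
import os
import tempfile
import tensorflow as tf

def _int64_list(values):
    return tf.train.Feature(int64_list=tf.train.Int64List(value=values))

# --- Write a tiny TFRecord file with toy fixed-length int features ---
path = os.path.join(tempfile.mkdtemp(), "toy.tfrecord")
with tf.io.TFRecordWriter(path) as writer:
    for _ in range(4):
        example = tf.train.Example(features=tf.train.Features(feature={
            'encoder_input': _int64_list([1, 2, 3]),
            'decoder_input': _int64_list([0, 4, 5]),
            'target':        _int64_list([4, 5, 6]),
        }))
        writer.write(example.SerializeToString())

# --- Parse each record's dict into the ((enc, dec), target) tuple ---
feature_spec = {
    'encoder_input': tf.io.FixedLenFeature([3], tf.int64),
    'decoder_input': tf.io.FixedLenFeature([3], tf.int64),
    'target':        tf.io.FixedLenFeature([3], tf.int64),
}

def parse(serialized):
    parsed_example = tf.io.parse_single_example(serialized, feature_spec)
    return ((parsed_example['encoder_input'],
             parsed_example['decoder_input']),
            parsed_example['target'])

dataset = tf.data.TFRecordDataset(path).map(parse).batch(2)
```

The key point is that the dict only exists inside parse(): tf.io.parse_single_example gives you a dict, and you immediately restructure it into the tuple nesting that fit() expects, so teacher forcing and tfrecords are perfectly compatible.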