In Keras with TensorFlow 2 I can override the make_train_function() method (when creating a custom model, for instance) and set its force parameter to True. This lets me dynamically call a function (a random function in my code, just as a test) and add its return value to the loss at each batch.
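For context, the force argument in TF2 simply bypasses the cached train_function so that it is rebuilt on the next call. The caching pattern can be illustrated in plain Python; the class and method bodies below are a hypothetical sketch, not the actual Keras implementation:

```python
class TrainFunctionCache:
    """Illustrative sketch (NOT the Keras source): caches a compiled
    function and rebuilds it when force=True, mirroring the caching
    behaviour of tf.keras Model.make_train_function(force=...)."""

    def __init__(self):
        self.train_function = None
        self.build_count = 0  # how many times we (re)compiled

    def _compile(self):
        # Stand-in for the expensive K.function(...) compilation step.
        self.build_count += 1
        return lambda batch: sum(batch)  # dummy "train step"

    def make_train_function(self, force=False):
        # force=True discards the cached function, so any side effects
        # (e.g. an extra loss term) are picked up at the next rebuild.
        if self.train_function is None or force:
            self.train_function = self._compile()
        return self.train_function


m = TrainFunctionCache()
f1 = m.make_train_function()            # compiles once
f2 = m.make_train_function()            # cached, no recompilation
f3 = m.make_train_function(force=True)  # forced rebuild
```

The point is that without force=True the cached function is returned unchanged, which is why a per-batch dynamic value only shows up when the function is rebuilt.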
Problem: I am now using TensorFlow 1.15, and the only similar function to make_train_function() is _make_train_function() in keras/engine/training.py:

```python
def _make_train_function(self):
    if not hasattr(self, 'train_function'):
        raise RuntimeError(
            'You must compile your model before using it.')
    self._check_trainable_weights_consistency()
    #if self.train_function is None:
    inputs = (self._feed_inputs +
              self._feed_targets +
              self._feed_sample_weights)
    if self._uses_dynamic_learning_phase():
        inputs += [K.learning_phase()]

    with K.name_scope('training'):
        with K.name_scope(self.optimizer.__class__.__name__):
            training_updates = self.optimizer.get_updates(
                params=self._collected_trainable_weights,
                loss=self.total_loss)
        updates = (self.updates +
                   training_updates +
                   self.metrics_updates)
        # Gets loss and metrics. Updates weights at each call.
        self.train_function = K.function(
            inputs,
            [self.total_loss] + self.metrics_tensors,
            updates=updates,
            name='train_function',
            **self._function_kwargs)
```
I have tried to modify this function by removing the various conditions (the if and with blocks), but it still does not work. So is there a way to modify this function so that it takes into account the same "force" argument that exists for make_train_function() in TF2?
Thanks a lot