TFDF Custom metrics logging during training

How can I log the AUC score / custom metrics at every tree? I tried:

model = tfdf.keras.GradientBoostedTreesModel(
    num_trees=3000,
    features=all_features,
    exclude_non_specified_features=True,
    early_stopping_num_trees_look_ahead=30,
    task=tfdf.keras.Task.CLASSIFICATION,
    verbose=True,
    apply_link_function=False,
)

model.compile(metrics=tf.keras.metrics.AUC())

However, when calling model.make_inspector().training_logs(), only loss and accuracy are logged (for the classification task).

Maybe try passing it as a list:

metrics=[tf.keras.metrics.AUC()]

Hi, I also tried that, but it only computes the metrics after training is finished.

Hi An Tran,

The metrics returned by the inspector’s training logs are hardcoded; the TF-DF API does not currently allow customizing them.

If this is important to you, please file a feature request.

In the meantime, a (relatively inefficient) solution is to remove trees from the model (using the model inspector and model builder) and then evaluate each intermediate model with metrics=[tf.keras.metrics.AUC()].
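TF-DF itself does not expose per-tree AUC in its training logs, so as a rough illustration of the idea above (evaluating every intermediate model on a validation set), here is the analogous computation in scikit-learn, whose gradient boosting supports staged (per-tree) predictions directly. This is a sketch of the concept only, not TF-DF code; in TF-DF you would instead rebuild truncated models via the inspector/builder and evaluate each one.

```python
# Sketch: per-tree AUC via intermediate-model evaluation.
# Uses scikit-learn as a stand-in, since GradientBoostingClassifier
# exposes staged predictions (one set per added tree) out of the box.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(n_estimators=30, random_state=0)
model.fit(X_train, y_train)

# staged_predict_proba yields the predictions of every intermediate model,
# i.e. the ensemble after 1 tree, 2 trees, ..., n_estimators trees.
aucs = [
    roc_auc_score(y_valid, proba[:, 1])
    for proba in model.staged_predict_proba(X_valid)
]
print(len(aucs))  # one AUC value per tree
```

The TF-DF equivalent would loop over prefix lengths, build a model containing only the first k trees, and call evaluate() on each, which is why the answer above calls it relatively inefficient.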

Thanks for the info. I’m trying to find a way to make the model log the aucs metric when calling model.make_inspector().training_logs(). Currently, the output looks like this for the classification task:

 TrainLog(num_trees=11, evaluation=Evaluation(num_examples=233, accuracy=0.9570815450643777, loss=0.5311337378147846, rmse=None, ndcg=None, aucs=None)),

TL;DR: It is not possible to output AUC in model.make_inspector().training_logs() with the current code (without adding a new parameter and implementing the related logic).