I slightly modified the first example from this tutorial by saving the `y_pred` from each epoch:

```
import tensorflow as tf
from tensorflow import keras

class CustomModel(keras.Model):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.saved_pred = []

    def train_step(self, data):
        x, y = data
        with tf.GradientTape() as tape:
            y_pred = self(x, training=True)  # Forward pass
            loss = self.compiled_loss(y, y_pred, regularization_losses=self.losses)
        # Compute gradients
        trainable_vars = self.trainable_variables
        gradients = tape.gradient(loss, trainable_vars)
        # Update weights
        self.optimizer.apply_gradients(zip(gradients, trainable_vars))
        # Update metrics (includes the metric that tracks the loss)
        self.compiled_metrics.update_state(y, y_pred)
        # Save this batch's predictions
        self.saved_pred.append(y_pred)
        # Return a dict mapping metric names to current values
        return {m.name: m.result() for m in self.metrics}
```

```
import numpy as np

inputs = keras.Input(shape=(32,))
outputs = keras.layers.Dense(1)(inputs)
model = CustomModel(inputs=inputs, outputs=outputs)
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

x = np.random.random((1000, 32))
y = np.random.random((1000, 1))
model.fit(x, y, epochs=3)
```

Here's the output of `model.saved_pred`:

```
ListWrapper([<tf.Tensor 'custom_model_1/dense_5/BiasAdd:0' shape=(None, 1) dtype=float32>, <tf.Tensor 'custom_model_1/dense_5/BiasAdd:0' shape=(None, 1) dtype=float32>])
```

The tensors in this list have no `numpy` attribute, so I'm wondering whether it's possible to extract their values.
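For what it's worth, one experiment I can think of (an assumption on my part, not something from the tutorial): when the model is compiled with `run_eagerly=True`, `train_step` runs eagerly, so `y_pred` inside it is a concrete `EagerTensor` that does have a `.numpy()` method. A minimal sketch, assuming TF 2.x where `compiled_loss`/`compiled_metrics` are available, and using a hypothetical class name `EagerPredModel`:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

class EagerPredModel(keras.Model):  # hypothetical name, same structure as above
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.saved_pred = []

    def train_step(self, data):
        x, y = data
        with tf.GradientTape() as tape:
            y_pred = self(x, training=True)  # Forward pass
            loss = self.compiled_loss(y, y_pred, regularization_losses=self.losses)
        gradients = tape.gradient(loss, self.trainable_variables)
        self.optimizer.apply_gradients(zip(gradients, self.trainable_variables))
        self.compiled_metrics.update_state(y, y_pred)
        # With run_eagerly=True this is an EagerTensor, so .numpy() yields
        # a concrete array of this batch's predictions.
        self.saved_pred.append(y_pred.numpy())
        return {m.name: m.result() for m in self.metrics}

inputs = keras.Input(shape=(32,))
outputs = keras.layers.Dense(1)(inputs)
model = EagerPredModel(inputs=inputs, outputs=outputs)
# run_eagerly=True disables graph tracing, trading speed for inspectability.
model.compile(optimizer="adam", loss="mse", metrics=["mae"], run_eagerly=True)

x = np.random.random((64, 32)).astype("float32")
y = np.random.random((64, 1)).astype("float32")
model.fit(x, y, epochs=1, batch_size=32, verbose=0)
print(type(model.saved_pred[0]))
```

The trade-off, if I understand it correctly, is that eager execution is slower than graph mode, so this may only be suitable for debugging-sized runs.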