I have a **trained** Keras `Sequential` model. The last layer is a `Dense` layer with a softmax activation function:

```
model = keras.models.Sequential()
model.add(...)
model.add(...)
model.add(...)
model.add(keras.layers.Dense(50, activation='softmax'))
```

How can I get the output of the model before the `softmax`, without changing the model architecture? (I have a trained model which I can't change or retrain.)

I have tried with:

```
import numpy as np

probs = model.predict(X_train)
logits = probs - np.log(np.sum(np.exp(probs), axis=-1, keepdims=True))
```

but when I run softmax on these `logits`, I get results that differ from `probs`:

```
def softmax(x):
    e_x = np.exp(x - np.max(x))
    return e_x / e_x.sum(axis=1, keepdims=True)

probabilities = softmax(logits)
```
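For reference, here is a self-contained snippet that reproduces the mismatch, using a made-up logit vector in place of my real model's outputs (the `z` values are arbitrary, not from the trained model):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax along the last axis
    e_x = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e_x / e_x.sum(axis=-1, keepdims=True)

# arbitrary example logits and the probabilities the model would output
z = np.array([[1.0, 2.0, 3.0]])
probs = softmax(z)

# my attempted inversion of the softmax
logits = probs - np.log(np.sum(np.exp(probs), axis=-1, keepdims=True))

recovered = softmax(logits)
print(np.allclose(recovered, probs))  # prints False
```

So applying softmax to my recovered `logits` does not give back `probs`, which makes me think the inversion formula itself is wrong.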