# How do I get probability predictions from Keras?

Hi,
I’m pretty new to ML.
For a school project, I'm trying to optimize the TPR of my confusion matrix, but I'm using a Sequential model from Keras, and it doesn't have scikit-learn's `predict_proba`. How can I get the probabilities so I can modify the threshold?

```python
def perceptron(X, y):

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    model = km.Sequential()

    history = model.fit(X_train, y_train,
                        batch_size=10,
                        epochs=100,
                        validation_split=0.2,  # % of data being used for val_loss evaluation
                        verbose=0)

    ev = model.evaluate(X_test, y_test)
    return y_test, X_test, history, model
```
```python
def classification(y_test, X_test, history, model, aff):

    y_pred = model.predict(X_test)

    threshold = 0.5
    for i in range(len(y_pred)):
        if y_pred[i] < threshold:
            y_pred[i] = 0
        else:
            y_pred[i] = 1

    probabilities = tf.nn.softmax(X_test).numpy()
    pred = np.argmax(probabilities, axis=1)

    print(probabilities)
    print(pred)

    class_report = classification_report(y_test, y_pred)  # Additional evaluation metrics

    # Calculate confusion matrix
    conf_matrix = confusion_matrix(y_test, y_pred, labels=[0, 1])
```

Instead of `y_pred = model.predict(X_test)`, what can I do?

Thanks

Your current code applies sigmoid activation to the dense layers. (Maybe you should use `relu` activation for all but the last layer?) The last layer in the model outputs a single value between 0 and 1, which you can treat as a probability for binary classification. `model.predict` should therefore output a probability for each input. I’m not sure what you’re trying to do with `tf.nn.softmax` or if it’s necessary.
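As a minimal sketch of that idea (the architecture and input shape below are made up for illustration, not your actual model): with a sigmoid on the last layer, `model.predict` already returns probabilities in [0, 1], and you can then compare them against any threshold you like instead of the default 0.5.

```python
import numpy as np
from tensorflow import keras

# Hypothetical binary classifier: relu on hidden layers,
# sigmoid on the last layer so predict() returns probabilities.
model = keras.Sequential([
    keras.layers.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

X_test = np.random.rand(5, 4).astype("float32")

# predict() is the Keras equivalent of predict_proba here:
# each value is the predicted probability of class 1.
probs = model.predict(X_test).ravel()

# Apply a custom decision threshold, e.g. lower it to 0.3 to trade
# more false positives for a higher TPR.
threshold = 0.3
y_pred = (probs >= threshold).astype(int)
```

Then pass `y_pred` to `confusion_matrix` as before, and sweep `threshold` to find the TPR/FPR trade-off you want. No softmax is needed for a single sigmoid output.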