How to do the leaky ReLU backward pass?

Hello everyone, I’m trying to build a single-layer neural network in the function below. I’m avoiding Keras because I need to run over 400K instances of this network, and loading 400K Keras objects is very slow, so I’m writing my own implementation. I have forward propagation finished.

I don’t quite remember how to do backpropagation. I can figure out most of it, but I’m not sure whether it’s different with a leaky_relu activation. Would anyone be able to help me with the leaky_relu backward pass?
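For what it’s worth, my current understanding is that the only activation-specific piece is the derivative of leaky_relu, which should be piecewise: 1 where the pre-activation is positive and alpha where it is negative (alpha defaults to 0.2 in tf.nn.leaky_relu). If I have that right, the gradient helper would look something like this:

def leaky_relu_grad(z, alpha=0.2):
    # Derivative of leaky_relu: 1 for z > 0, alpha otherwise
    return tf.where(z > 0, tf.ones_like(z), alpha * tf.ones_like(z))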

Thank you!

import tensorflow as tf
import numpy as np

@tf.function
def predict(x, b, w):
    # Pre-activation: z = w @ x + b
    z = tf.tensordot(w, x, axes=1) + b
    # Leaky ReLU activation (default negative slope alpha = 0.2)
    return tf.nn.leaky_relu(z)

# Random parameters and a random input signal for testing
weights = tf.random.uniform((9, 9), dtype=tf.float32)
bias = tf.random.uniform([9], dtype=tf.float32)
signal = tf.random.uniform([9], dtype=tf.float32)
model_predicted = []

# Run the forward pass repeatedly (446,436 times in my use case)
for i in range(446436):
    model_predicted.append(predict(signal, bias, weights))
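And here is my rough attempt at the backward pass, to show where I’m stuck. The mean-squared-error loss, the target tensor, and the learning rate are placeholders I made up just to make the example concrete; the leaky_relu_grad line is the part I’m actually asking about:

# Hypothetical target and learning rate, only for illustration
target = tf.random.uniform([9], dtype=tf.float32)
lr = 0.01

def backward(x, y_true, b, w, alpha=0.2):
    # Recompute the forward pass to get the pre-activation
    z = tf.tensordot(w, x, axes=1) + b
    g = tf.nn.leaky_relu(z, alpha=alpha)
    # dL/dg for an assumed mean-squared-error loss
    dg = 2.0 * (g - y_true) / tf.cast(tf.size(g), tf.float32)
    # Chain rule through the activation: dL/dz = dL/dg * f'(z)
    dz = dg * leaky_relu_grad(z, alpha)
    # Since z = w @ x + b, dL/dw is the outer product of dz and x, and dL/db = dz
    dw = tf.tensordot(dz, x, axes=0)
    db = dz
    # One gradient-descent step
    return w - lr * dw, b - lr * db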