# Step function as activation during testing

Hello everyone,
I'm new to the forum and also new to working with TensorFlow/Keras. Some time ago I received an email from a friend (who is no longer reachable) with the following suggestion.

If you have a sigmoid function

**sigma(x) = 1/(1+exp(-x))**

then consider a modified sigmoid function

**sigma(x) = 1/(1+exp(-ax))**

where a is a parameter that controls the steepness of the curve. As a approaches infinity, the sigmoid function approaches a step function

**step(x) = 1 if x >= 0, and 0 if x < 0**

but this function is not differentiable at x = 0; moreover, its gradient is 0 everywhere else, so it is not useful for gradient descent.
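To see the limiting behaviour numerically, here is a small sketch (plain Python, no TensorFlow needed) showing that the scaled sigmoid saturates toward the step function as `a` grows:

```python
import math

def sigmoid(x, a=1.0):
    """Sigmoid with steepness parameter a: 1 / (1 + exp(-a*x))."""
    return 1.0 / (1.0 + math.exp(-a * x))

# Away from x = 0, the output saturates toward 0 or 1 as a grows.
for a in (1, 10, 100):
    print(f"a={a}: sigmoid(0.5)={sigmoid(0.5, a):.4f}, "
          f"sigmoid(-0.5)={sigmoid(-0.5, a):.4f}")
```

At a = 100 the outputs are already indistinguishable from 1 and 0 for any input away from zero.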

However, the step function outputs 0/1, while the sigmoid outputs a real number between 0 and 1. Hence we can convert a neuron with a step activation into a Boolean function, but not a neuron with a sigmoid activation.
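For example, a step-activated neuron computes a Boolean function of its inputs. A sketch with hypothetical weights (not from my network) implementing logical AND:

```python
def step(x):
    # step activation: 1 if x >= 0, else 0
    return 1 if x >= 0 else 0

def and_neuron(x1, x2):
    # Weights (1, 1) and bias -1.5: the sum 1*x1 + 1*x2 - 1.5
    # is non-negative only when both inputs are 1.
    return step(1 * x1 + 1 * x2 - 1.5)

print([and_neuron(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 0, 0, 1]
```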

Since the sigmoid approximates the step function, we can train with a sigmoid and then replace it with a step function (this is roughly what happens when we train perceptrons). However, we may lose some accuracy in doing so.

I trained an NN with 2 layers (14 neurons in the first hidden layer with linear activation, and 1 neuron in the last layer with sigmoid). My inputs and output are binary (0/1).
My question is: how do I replace the sigmoid activation with the step function during the testing phase, as my friend suggested?
I tried

```python
from tensorflow.keras import models, layers

class BNN(models.Sequential):
    def __init__(self):
        super().__init__()
        # 14-neuron hidden layer with linear activation
        self.add(layers.Dense(14, activation='linear'))
        # single sigmoid output neuron
        self.add(layers.Dense(1, activation='sigmoid',
                              name='output_layer', use_bias=True))
```

and then

```python
from tensorflow import nn
from tensorflow import round as kround

def step_sigm(x):
    return kround(nn.sigmoid(x))

model.layers[1].activation = step_sigm
```

but this does not seem to work: I didn't notice any change in the weights or in the model's outputs.
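For what it's worth, the 0/1 conversion I am after can also be expressed by thresholding the sigmoid outputs after prediction. A minimal NumPy sketch, with a hypothetical `probs` array standing in for the output of `model.predict(x)`:

```python
import numpy as np

def to_binary(probs, threshold=0.5):
    # sigmoid(x) >= 0.5 exactly when x >= 0, so thresholding the
    # sigmoid output at 0.5 is equivalent to applying step(x) to
    # the pre-activation.
    return (probs >= threshold).astype(int)

probs = np.array([0.10, 0.49, 0.50, 0.93])  # hypothetical model.predict output
print(to_binary(probs))  # [0 0 1 1]
```

But I would still like to know how to set the step function as the layer's activation itself.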

Could someone please help me set this step function as the activation during the testing phase using TensorFlow/Keras?

Many thanks for any help!