The official Neural Structured Learning webpage provides three tutorials, all of which focus on classification problems. Can NSL and its graph regularization also be applied to regression problems? If so, is there an example showing how that works?
Bhack
June 23, 2021, 9:18pm
#2
Yes, check this GitHub issue (opened 08 Apr 2020, closed 11 Apr 2020, labeled "question"):

> I was able to integrate adversarial loss into my TensorFlow model. My input data has a combination of categorical and continuous attributes. Is there any way to specify not to perturb categorical data while creating adversarial examples?
```
import numpy as np
import tensorflow as tf
import neural_structured_learning as nsl

tf.random.set_random_seed(100)

x = tf.placeholder(tf.float32, shape=(None, 20))
y = tf.placeholder(tf.float32, shape=(None, 1))

# Build a sample regression model and return its loss.
def model(x, y, is_training):
    with tf.variable_scope("regression_model", reuse=tf.AUTO_REUSE):
        layer0 = tf.layers.Dense(units=64, activation=tf.nn.relu)(x)
        layer1 = tf.layers.Dense(units=128, activation=tf.nn.relu)(layer0)
        output = tf.layers.Dense(units=1)(layer1)
        # Mean absolute error.
        loss = tf.reduce_mean(tf.abs(y - output))
        return loss

# Normal loss: mean absolute error.
regular_loss = model(x, y, True)

# Generate adversarial neighbors from the inputs and the regular loss.
adv_config = nsl.configs.AdvRegConfig()
adv_input, adv_weights = nsl.lib.gen_adv_neighbor(
    x, regular_loss, config=adv_config.adv_neighbor_config)
adv_loss = model(adv_input, y, True)
overall_loss = regular_loss + 0.2 * adv_loss

optim = tf.train.AdamOptimizer()
train_step = optim.minimize(overall_loss)

sess = tf.Session()
sess.run(tf.global_variables_initializer())

X = np.random.random((32, 20))  # batch of 32
Y = np.random.random((32, 1))
sess.run([train_step, adv_loss, regular_loss, overall_loss],
         feed_dict={x: X, y: Y})
```
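As for keeping categorical features fixed: NSL's `nsl.configs.AdvNeighborConfig` exposes a `feature_mask` field for this purpose (check the API docs for your NSL version). Conceptually, the perturbation is FGSM-style, `x_adv = x + step * sign(grad)`, and a 0/1 mask zeroes the step for protected columns. Here is a minimal NumPy sketch of that masking idea (not the NSL implementation; the mask, gradient values, and step size are illustrative):

```python
import numpy as np

def masked_adv_neighbor(x, grad, step_size, feature_mask):
    """FGSM-style perturbation that leaves masked-out features untouched.

    feature_mask: 1.0 for features that may be perturbed (continuous),
    0.0 for features to keep fixed (e.g. one-hot categorical columns).
    """
    return x + step_size * np.sign(grad) * feature_mask

# Toy batch: first 3 columns continuous, last 2 categorical (one-hot).
x = np.array([[0.2, 0.5, 0.1, 1.0, 0.0]])
grad = np.array([[0.3, -0.7, 0.0, 0.9, -0.4]])  # pretend d(loss)/dx
mask = np.array([1.0, 1.0, 1.0, 0.0, 0.0])

x_adv = masked_adv_neighbor(x, grad, step_size=0.05, feature_mask=mask)
# The two categorical columns come back unchanged: x_adv[:, 3:] == x[:, 3:]
```

The same effect is achieved in NSL by passing a mask through the adversarial-neighbor config rather than masking by hand.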
https://santhoshkolloju.medium.com/neural-structured-learning-4b8b3d6c0e70