Hi TF-fans!

I am getting more and more into TensorFlow but still trying to figure some things out.

I was about to write my own constraint and boundary condition function, but then noticed there is a `tf.keras.constraints` module, which got me curious. Are there helper methods like that which can be applied to the weight variables themselves, without needing a Keras model and layers? At the moment I am just calling an optimizer's `minimize` directly.
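To illustrate what I mean by applying something to the variables themselves: `tf.Variable` does accept a `constraint` callable, but as far as I can tell nothing outside Keras layers applies it automatically, so I end up calling it by hand. A minimal sketch (the simplex projection here is just an example I made up):

```python
import tensorflow as tf

# `tf.Variable` takes a `constraint` callable, but as far as I can
# tell it is only applied automatically inside Keras layers -- in a
# bare training loop you still have to invoke it yourself.
w = tf.Variable(
    tf.random.uniform([5]),
    constraint=lambda v: v / tf.reduce_sum(v),  # example: project onto the simplex
)

# Applying the stored constraint manually:
w.assign(w.constraint(w))
```

Is there a way to get something like this applied automatically, or is calling it manually the intended usage?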

If I need to constrain my variables manually:

Where do I place my constraints in the training loop?

What type of constraints should I use?
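To make the placement question concrete, my training loop looks roughly like this (simplified sketch: I've written it with `tf.GradientTape` and `apply_gradients`, and the loss is a stand-in):

```python
import tensorflow as tf

# Bare variable and optimizer, no Keras model or layers.
w = tf.Variable(tf.random.uniform([5]))
opt = tf.keras.optimizers.SGD(learning_rate=0.1)

for step in range(10):
    with tf.GradientTape() as tape:
        loss = tf.reduce_sum(tf.square(w))  # stand-in loss
    grads = tape.gradient(loss, [w])
    # (a) constrain w here, before applying the gradients?
    opt.apply_gradients(zip(grads, [w]))
    # (b) ...or project w back onto the feasible set here, after the step?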

Here are a few alternatives I have tried:

`w.assign(tf.divide(w, tf.reduce_sum(w)))`

`w.assign(tf.divide(tf.exp(w), tf.reduce_sum(tf.exp(w))))`

`w.assign(tf.nn.softmax(w))`

None of them seems to work: the individual weights are not in [0, 1] and do not sum to 1. Training also stalls immediately with all of them, but that may be due to some unrelated issue.
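For reference, here is a minimal, self-contained version of what I'm doing with the softmax variant (simplified sketch; the target and loss are stand-ins):

```python
import tensorflow as tf

w = tf.Variable(tf.random.uniform([5]))
target = tf.constant([0.1, 0.2, 0.4, 0.2, 0.1])
opt = tf.keras.optimizers.SGD(learning_rate=0.1)

for step in range(100):
    with tf.GradientTape() as tape:
        loss = tf.reduce_sum(tf.square(w - target))  # stand-in loss
    grads = tape.gradient(loss, [w])
    opt.apply_gradients(zip(grads, [w]))
    # Project back onto the probability simplex after each gradient step.
    w.assign(tf.nn.softmax(w))
```

One thing I notice writing it out: `softmax` isn't idempotent, so re-applying it every step keeps squashing the weights toward uniform, which might be related to the stalling — but I'd appreciate a sanity check on that.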

In general, my question is: what is the recommended way to manage constraints and boundary conditions in TF? I'd also be interested in your personal experience with the constraints above.

Copying in @Dennis, since you helped me out a lot a few days ago when I started working with TF.