Keras weight constraints including both the bias and the kernel matrix at the same time

My question relates to how keras implements weight constraints, in particular in dense layers. I mainly use R and its keras implementation, so I will use their notation here, but I would also be happy if someone has a solution in Python.

As far as I know, keras constraints like constraint_maxnorm() can be applied to a layer_dense() through the kernel_constraint or bias_constraint arguments, where the kernel refers to the weight matrix without the bias terms. However, in order to test new approaches, I think it would be useful to also be able to apply constraints to the full weight matrix, where the bias vector forms an extra row of the weight matrix. An example of this kind of constraint would be restricting to 1 the L1 norm of the weight vector incident to a neuron (counting the bias at that neuron as an element of that vector).
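For reference, this is how I currently apply separate constraints to the kernel and the bias (a minimal sketch; the layer sizes are arbitrary):

```r
library(keras)

model <- keras_model_sequential() %>%
  layer_dense(
    units = 32,
    input_shape = 10,
    # constrains only the kernel, i.e. the weight matrix without biases
    kernel_constraint = constraint_maxnorm(max_value = 2),
    # a second, independent constraint for the bias vector
    bias_constraint = constraint_maxnorm(max_value = 2)
  ) %>%
  layer_dense(units = 1)
```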

Custom constraints can be created, but the problem is that they receive only the kernel weights as input, not both the kernel weights and the bias, and their output is likewise only the projected kernel, not the full weight matrix.
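For instance, if I understand the API correctly, a custom constraint in R can simply be a function of a single weight tensor, so it never sees the bias. A sketch of the per-neuron L1 constraint from above, applied to the kernel alone (l1_unit_constraint is my own name, not a keras function):

```r
library(keras)

# Scale each column of w so that its L1 norm is at most 1. Note that w
# is the kernel only -- the bias never reaches this function.
l1_unit_constraint <- function(w) {
  # axis indexes are 1-based in the k_* backend functions, so axis = 1
  # sums over the input dimension, giving one norm per output neuron
  norms <- k_sum(k_abs(w), axis = 1, keepdims = TRUE)
  w / k_maximum(norms, 1)
}

layer_dense(units = 32, kernel_constraint = l1_unit_constraint)
```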

Is it possible to implement this kind of constraint in keras or tensorflow, so that it affects both the kernel and the bias at the same time? It would be enough to be able to create custom constraints that accept both the kernel weights and the bias as input, and then apply the desired constraint to both of them, the kernel and the bias.
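To make the question concrete, below is the kind of interface I am looking for. This is purely hypothetical: full_l1_constraint is my own function, and layer_dense() has no argument that would accept it:

```r
library(keras)

# Purely hypothetical signature: a constraint receiving both tensors and
# returning both, so the projection can be computed on the stacked matrix
full_l1_constraint <- function(kernel, bias) {
  # append the bias as an extra row of the weight matrix
  w_full <- k_concatenate(list(kernel, k_expand_dims(bias, axis = 1)), axis = 1)
  # one L1 norm per output neuron, now including that neuron's bias
  norms <- k_sum(k_abs(w_full), axis = 1, keepdims = TRUE)
  w_full <- w_full / k_maximum(norms, 1)
  # split the projected matrix back into kernel and bias
  n_in <- k_int_shape(kernel)[[1]]
  list(kernel = w_full[1:n_in, ], bias = w_full[n_in + 1, ])
}

# This is the missing part: 'full_constraint' is a made-up argument,
# layer_dense() only accepts kernel_constraint and bias_constraint
# layer_dense(units = 32, full_constraint = full_l1_constraint)
```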