Set up a Residual Neural Network block in tfjs

I would like to implement the behavior of a ResNet (Residual Neural Network). Would the following code do that? More specifically, does the concatenate layer in the code below function like the identity skip connection, effectively turning layers 3, 4, 5, and 6 into a residual block? Does TFJS automatically know how to pass the backpropagation signal through the concat layer?

  // `input` is assumed to be a SymbolicTensor defined earlier, e.g.:
  // const input = TENSORFLOW.input({ shape: [nFeatures] });
  const dense_layer_1 = TENSORFLOW.layers.dense({ units: 1600, activation: "relu", useBias: true }).apply(input);
  const dense_layer_2 = TENSORFLOW.layers.dense({ units: 800, activation: "relu", useBias: true }).apply(dense_layer_1);
  const dense_layer_3 = TENSORFLOW.layers.dense({ units: 800, activation: "relu", useBias: true }).apply(dense_layer_2);
  const dense_layer_4 = TENSORFLOW.layers.dense({ units: 800, activation: "relu", useBias: true }).apply(dense_layer_3);
  const dense_layer_5 = TENSORFLOW.layers.dense({ units: 800, activation: "relu", useBias: true }).apply(dense_layer_4);
  const dense_layer_6 = TENSORFLOW.layers.dense({ units: 800, activation: "relu", useBias: true }).apply(dense_layer_5);
  const concat_layer1 = TENSORFLOW.layers.concatenate().apply([dense_layer_2, dense_layer_6]);
  const dense_layer_7 = TENSORFLOW.layers.dense({ units: 800, activation: "relu", useBias: true }).apply(concat_layer1);
  const dense_layer_8 = TENSORFLOW.layers.dense({ units: 800, activation: "relu", useBias: true }).apply(dense_layer_7);
  const output = TENSORFLOW.layers.dense({ units: 1, activation: "linear", useBias: true }).apply(dense_layer_8);
  const residual_model = TENSORFLOW.model({ inputs: input, outputs: output });