TensorFlow.js: a simple runnable example that crashes in the browser

So I am following a course on Coursera and decided to play a bit with the code. Here is my simple runnable example, which you can run by launching a live server in VS Code:

<!DOCTYPE html>
<html lang="en">
  <script
    src="https://cdnjs.cloudflare.com/ajax/libs/tensorflow/4.2.0/tf.min.js"
    integrity="sha512-luqeEXU5+ipFs8VSUJZTbt6Iil1m7OT0bODSccqew2CN85iad5Mn//M9+CPVI4UGlo8kN51OWFSox+fYe4qgYQ=="
    crossorigin="anonymous"
    referrerpolicy="no-referrer"
  ></script>
  <body>
    <h1>Hello World</h1>
    <script>
      // train "model" on xs and ys
      async function doTraining(model, xs, ys) {
        const history = await model.fit(xs, ys, {
          epochs: 500,
          callbacks: {
            // onEpochEnd receives (epoch, logs); the loss value is in logs.loss
            onEpochEnd: async (epoch, logs) =>
              console.log("Epoch:", epoch, "Loss:", logs.loss),
          },
        });
      }

      // define the "keras" layered model
      function makeModel(units = 1, inputShape = [1], lr = 0.1) {
        const seq = tf.sequential();
        seq.add(tf.layers.dense({ units, inputShape }));
        seq.compile({
          loss: "meanSquaredError",
          optimizer: tf.train.sgd(lr),
        });
        seq.summary();
        return seq;
      }

      // little vectors
      const xs = tf.range(0, 10).reshape([10, 1]); // 10 examples, 1 feature each
      const ys = tf.range(1, 11).reshape([10, 1]); // ys = xs + 1
      // train model
      const seq = makeModel(1, [1], 0.1); // inputShape must be an array
      doTraining(seq, xs, ys).catch((e) => {
        console.log(e);
      });
    </script>
  </body>
</html>
  • If the lr is 0.1, it crashes
  • If the lr is 0.01, it gets to the minimum

Any ideas why this is so?

I am unfamiliar with the Coursera course, but I cover these areas in my YouTube series over on Google Developers for Web ML:

Essentially, 0.1 is probably too high, so gradient descent overshoots the minimum on every step and you never converge to a reasonable fit for your data. See my visuals in the video above around the 8-minute mark, but watch the whole video for the context.
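To see the overshooting concretely, here is a minimal hand-rolled sketch (plain JavaScript, not TF.js itself) of the same SGD update on the same data, y = x + 1 for x = 0..9. With these un-normalized inputs the steepest curvature of the MSE surface is roughly 58, so any learning rate above about 2/58 ≈ 0.034 makes each update overshoot by more than it corrects, which is why 0.1 "crashes" while 0.01 settles:

```javascript
// Hand-rolled gradient descent on y = w*x + b with MSE loss,
// mimicking what tf.train.sgd does on the data from the question.
const xs = Array.from({ length: 10 }, (_, i) => i); // 0..9
const ys = xs.map((x) => x + 1);                    // 1..10

// Run `steps` SGD updates at learning rate `lr`; return the final MSE.
function descend(lr, steps) {
  let w = 0, b = 0;
  for (let s = 0; s < steps; s++) {
    let dw = 0, db = 0;
    for (let i = 0; i < xs.length; i++) {
      const err = w * xs[i] + b - ys[i];
      dw += (2 / xs.length) * err * xs[i]; // d(MSE)/dw
      db += (2 / xs.length) * err;         // d(MSE)/db
    }
    w -= lr * dw;
    b -= lr * db;
  }
  return xs.reduce((acc, x, i) => acc + (w * x + b - ys[i]) ** 2, 0) / xs.length;
}

const loss01 = descend(0.1, 500);   // diverges: updates grow without bound
const loss001 = descend(0.01, 500); // converges toward zero
console.log({ loss01, loss001 });
```

Running this shows the loss for lr = 0.1 exploding to Infinity/NaN while lr = 0.01 shrinks steadily, which matches what you observed in the browser.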

Nice! I was trying to plot something like that, but didn't get there yet.

I wonder how this happens, though. I also found out that normalizing the data is essential…

Yep, I cover normalization here:

at around 4 min 54 sec.
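The normalization point can be sketched the same way (plain JavaScript, not TF.js): after standardizing the inputs to mean 0 and standard deviation 1, the loss surface is well-conditioned, and the same lr = 0.1 that diverged on the raw 0..9 inputs now converges:

```javascript
// Same hand-rolled SGD as before, but on standardized inputs z = (x - mean) / std.
const xs = Array.from({ length: 10 }, (_, i) => i); // 0..9
const ys = xs.map((x) => x + 1);                    // 1..10

const mean = xs.reduce((a, x) => a + x, 0) / xs.length;
const std = Math.sqrt(xs.reduce((a, x) => a + (x - mean) ** 2, 0) / xs.length);
const zs = xs.map((x) => (x - mean) / std); // mean 0, std 1

let w = 0, b = 0;
const lr = 0.1; // the rate that "crashed" on the raw data
for (let s = 0; s < 500; s++) {
  let dw = 0, db = 0;
  for (let i = 0; i < zs.length; i++) {
    const err = w * zs[i] + b - ys[i];
    dw += (2 / zs.length) * err * zs[i];
    db += (2 / zs.length) * err;
  }
  w -= lr * dw;
  b -= lr * db;
}
const loss = zs.reduce((a, z, i) => a + (w * z + b - ys[i]) ** 2, 0) / zs.length;
console.log(loss); // essentially zero
```

With standardized inputs the curvature in every direction is comparable, so one learning rate suits all parameters; that is why normalization made such a difference in your experiments.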