Corrupted configuration and batch_input_shape (Loading a pre-trained layers model in TensorFlow.js)

Hi! I’m trying to load my pre-trained layers model into a Node.js app. When loading, I got this error:

ValueError: An InputLayer should be passed either a `batchInputShape` or an `inputShape`.

I assumed this was because the InputLayer in my model.json file had the property “batch_shape” instead of “batch_input_shape”, so I changed it in all relevant layers. I am now getting this error:

ValueError: Corrupted configuration, expected array for nodeData: [object Object]
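For reference, the manual edit was roughly equivalent to running something like this over the file (a rough sketch; the exact nesting of keys here matches my model.json and may differ for other files):

import json

# Rough sketch of the manual edit: rename "batch_shape" to "batch_input_shape"
# in every layer config that has it.
with open("model.json") as f:
    model_json = json.load(f)

layers = model_json["modelTopology"]["model_config"]["config"]["layers"]
for layer in layers:
    config = layer.get("config", {})
    if "batch_shape" in config:
        config["batch_input_shape"] = config.pop("batch_shape")

with open("model.json", "w") as f:
    json.dump(model_json, f)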

Here are my questions:

  • Did manually changing batch_shape to batch_input_shape cause this error, or is this a separate error? (I’m leaning towards the latter, since renaming an attribute shouldn’t change its data type.)
  • What is causing the “Corrupted configuration” error? I don’t see “nodeData” anywhere in my model.json file.
  • Is there documentation for what type each attribute should be in a model.json file? Or does anyone have an example of a working model.json file for a layers model that I can check mine against?

My model.json file is too long to include here, so here’s a link to the file: https://www.file.io/BjSr/download/DQTe2Pqhilcq

Additional information:

  • The model was trained by following this tutorial almost exactly
  • The model was converted to JSON using these steps

Thank you so much for your help! Let me know if I should provide additional info

Hi! Has this been solved? I’m currently running into this error (when loading) as well.

Having the same problem here.
A working model was converted with tfjs.converters.save_keras_model(new_model, 'js-model'), resulting in model.json and group1-shard1of1.bin.

Loading it in Node.js with

// tf.io.fileSystem is the Node-only IO handler from @tensorflow/tfjs-node;
// this runs inside an async function
const handler = tf.io.fileSystem(MODEL_PATH);
const model = await tf.loadLayersModel(handler);

results in the “batchInputShape” error.

Hi @goblincat, @abimdanu, @boerme,

Welcome to TensorFlow Forum!

I apologize for the delayed response, and thank you for bringing this issue to our attention. If possible, could you please share a GitHub repo or CodePen example along with the converted TensorFlow.js model (@goblincat, I’m getting an error while accessing your model files) and the complete steps to replicate the behavior, so we can investigate this issue further from our end?

Thank you for your cooperation and patience.

I found out that I can’t even load the saved model with load_keras_model in the same environment:

tfjs.converters.save_keras_model(model, 'js-model')

new_model = tfjs.converters.load_keras_model('js-model/model.json')
# -> Exception encountered: Unrecognized keyword arguments: ['batch_shape']

Perhaps that would be a good place to start while some of us get something set up on GitHub.
Btw, I’m on a Mac M1 with Python 3.11, tensorflow 2.16.1, tensorflow-metal 1.1.0, and tensorflowjs 4.20.0.
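
In case it helps, the round trip can be reproduced with a tiny stand-in model like this (a minimal sketch, assuming the versions listed above; the toy model is just a placeholder for the real one):

import tensorflow as tf
import tensorflowjs as tfjs

# Toy stand-in for the real model; any Keras model with an input layer should do.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])

# Convert to the TF.js layers format (writes model.json plus a weight shard).
tfjs.converters.save_keras_model(model, "js-model")

# Round-trip through the converter; in this environment (TF 2.16 / Keras 3)
# the "Unrecognized keyword arguments: ['batch_shape']" exception should
# appear at this step.
new_model = tfjs.converters.load_keras_model("js-model/model.json")
new_model.summary()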

Hi Rahul!
Thanks for the response! I have reuploaded my model here. The function I’m trying to run is just this:

const tfn = require("@tensorflow/tfjs-node");
const handler = tfn.io.fileSystem("./model.json");
let model;

// Load the layers model once and cache it for reuse
async function getModel() {
  if (!model) {
    model = await tfn.loadLayersModel(handler);
  }
  return model;
}

Can you load the model and run inference from Python? Just to rule out whether it’s a conversion issue or a model/training/saving issue.
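
For example, something along these lines would be enough to check (a minimal sketch; the model path and input shape are placeholders for your actual model):

import numpy as np
import tensorflow as tf

# Placeholder path and input shape; substitute your saved model and its input.
model = tf.keras.models.load_model("my_model.keras")
dummy_input = np.zeros((1, 224, 224, 3), dtype=np.float32)
print(model.predict(dummy_input))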

I’ve been able to reload my model in Python, but it looks like Boerme in the earlier reply can’t.