Getting different accuracy when converting to a .tflite model

When I train the neural network using TensorFlow I get 75% accuracy, but once I convert the model to a .tflite file it gives biased results in the Android app. What could be the possible reason? Can anyone please advise me?

Hi @Doli_Hazarika

Please post whatever code you have available so we can understand the issue. It could be a difference in the pre- or post-processing of the data, for example.
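For instance, here is a rough sketch with made-up numbers (not taken from your project) showing how two common preprocessing choices turn the same raw row into very different model inputs:

import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler

# Made-up feature rows, purely for illustration
raw = np.array([[10.0, 200.0, 0.5],
                [12.0, 180.0, 0.7],
                [ 9.0, 220.0, 0.4]])

# Min-max scaling maps each column into [0, 1]
minmax_row = MinMaxScaler().fit_transform(raw)[0]

# Standardization maps each column to zero mean and unit variance
standard_row = StandardScaler().fit_transform(raw)[0]

print(minmax_row)    # values in [0, 1]
print(standard_row)  # values centred around 0, some negative

If the model was trained on one of these but the app feeds it the other, the predictions can shift dramatically.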

Regards

import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split
from sklearn import metrics
from tensorflow import keras
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.callbacks import ModelCheckpoint

# Load and preprocess data
fname = "E:/Doli/DATA/EO_EC/1min_EC_EO/Sub_2/lab_data.txt"
data = pd.read_csv(fname, header=None)
x = data.iloc[:, :-1]
y = data.iloc[:, -1]
scaler = MinMaxScaler()
x_scaled = scaler.fit_transform(x)

# Split into training and testing sets
x_train, x_test, y_train, y_test = train_test_split(x_scaled, y, test_size=0.33, random_state=42)

model2 = Sequential()
model2.add(Dense(1000, activation='relu', input_shape=(x.shape[1],)))
model2.add(Dropout(0.2))
model2.add(Dense(1000, activation='relu'))
model2.add(Dropout(0.2))
model2.add(Dense(1000, activation='relu'))
model2.add(Dropout(0.2))
model2.add(Dense(1, activation='sigmoid'))
model2.summary()

model2.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model2.summary()

# Train model
checkpointer = ModelCheckpoint(filepath='MLP.weights.best.hdf5', verbose=1, save_best_only=True)
hist = model2.fit(x_train, y_train, epochs=50, batch_size=24, validation_split=0.1, callbacks=[checkpointer], verbose=2, shuffle=True)

# Evaluate model on test set
score = model2.evaluate(x_test, y_test, verbose=1)
print("Accuracy:", score[1])

# Make predictions on test set
y_pred = model2.predict(x_test)
y_pred_binary = [1 if p > 0.5 else 0 for p in y_pred]

# Print evaluation metrics
print("Accuracy = {}\nPrecision = {}\nRecall = {}\nF1 Score = {}".format(
    metrics.accuracy_score(y_test, y_pred_binary),
    metrics.precision_score(y_test, y_pred_binary),
    metrics.recall_score(y_test, y_pred_binary),
    metrics.f1_score(y_test, y_pred_binary)))

import tensorflow as tf
from tensorflow import lite

# Save the trained Keras model
keras_file = "Method9.h5"
keras.models.save_model(model2, keras_file)

# Convert model to TFLite format with float32 quantization
converter = lite.TFLiteConverter.from_keras_model(model2)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float32]
tflite_quant_model = converter.convert()
open("quant_model_float32.tflite", "wb").write(tflite_quant_model)

The above code was used to train the TensorFlow model. Please let me know where I am going wrong.

Thank you for your time!

OK!

First of all, have you compared the inference results of the TensorFlow model with those of the TFLite model using Python, before importing the file into Android?

Second, can you also post the code for the Android inference?

Regards

How do I compare the inference results of the TensorFlow model with those of the TFLite model? Can you please help me out?

Try to feed both with the same input and observe the results.

Just to expand on George_Soloupis's suggestion, I outline the general approach below. From your original code, I saw that you saved the TF model as Method9.h5 and your TFLite model as quant_model_float32.tflite, so I kept your naming scheme in the example.

First, you want to set up your input data in the same way as before. I noticed you used a MinMaxScaler, so I copied your code into the code block below. Notice that in the last line I select only one input value to test. To be comprehensive, I recommend trying different values.

import tensorflow as tf
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

# Load and preprocess data following original code
fname = "E:/Doli/DATA/EO_EC/1min_EC_EO/Sub_2/lab_data.txt"
data = pd.read_csv(fname, header=None)
x = data.iloc[:, :-1]
y = data.iloc[:, -1]
scaler = MinMaxScaler()
x_scaled = scaler.fit_transform(x)

# Pick a single row (here, the first) to test
x_test = x_scaled[0:1]

Next, you’ll want to load the TF model you previously saved and predict on that single x_test value defined above.

# Load the TF model
model_tf = tf.keras.models.load_model("Method9.h5")

# Infer the single input test value with TF model
pred_tf = model_tf.predict(x_test)
print(pred_tf)

Third, you are going to want to load the TFLite model and run inference on the same input value again. I noticed your TFLite model uses float32, so I included a line to cast the input just in case. Loading TFLite models can be a little tricky (see this guide), so I tried to be more verbose in the example below.

# Load the TFLite model and allocate tensors
interpreter = tf.lite.Interpreter(model_path="quant_model_float32.tflite")
interpreter.allocate_tensors()

# Get input and output tensors
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Cast input to mirror float32 quantization
x_test_float32 = tf.cast(x_test, tf.float32)

# Pass the tensor into the interpreter
interpreter.set_tensor(input_details[0]['index'], x_test_float32)
interpreter.invoke()

# Infer the single input test value with TFLite model
pred_tflite = interpreter.get_tensor(output_details[0]['index'])
print(pred_tflite)
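As a side note, if set_tensor complains about a shape or dtype mismatch, it usually helps to print what the interpreter expects first. Here is a small sketch reusing the interpreter and x_test_float32 objects from above; the resize call is only needed if the batch dimension actually differs:

# Check what the interpreter expects before feeding data
print(input_details[0]['shape'])   # e.g. [1, num_features]
print(input_details[0]['dtype'])   # should be float32 for this model

# If the batch dimension differs, resize the input tensor and re-allocate
interpreter.resize_tensor_input(input_details[0]['index'], [1, x_test_float32.shape[1]])
interpreter.allocate_tensors()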

Finally, confirm that the two values are within rounding error.

print(pred_tf)
print(pred_tflite)
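If you prefer an explicit check over eyeballing the printouts, NumPy's allclose can do the comparison for you. Here is a small sketch reusing pred_tf and pred_tflite from above; you can also repeat it over several rows of x_scaled to be more thorough:

import numpy as np

# True if the two predictions agree to within floating-point rounding error
print(np.allclose(pred_tf, pred_tflite, atol=1e-5))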

The hope is that the two predictions are within rounding error. I tried it with some fake data and I seem to be getting the same results with both TF and TFLite models. If this is the case, then the problem lies in the Android app code rather than the saved models themselves.

Anyway, I hope this helps point you in the right direction!

Yes, I got the same output for pred_tf and pred_tflite, i.e. [[0.92405754]].

package trial.myapplication;

import androidx.appcompat.app.AppCompatActivity;

import android.content.res.AssetFileDescriptor;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.TextView;

import org.apache.commons.math3.stat.StatUtils;
import org.tensorflow.lite.DataType;
import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.support.tensorbuffer.TensorBuffer;

import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.util.Arrays;

public class MainActivity extends AppCompatActivity {

private double[] inputArray;
private Interpreter tflite;
private static final int NUM_CLASSES = 2;
private static final String[] CLASS_LABELS = {"Up", "Down"};
private String[] runPrediction;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    try {
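        // Load the TFLite model from assets and build the interpreter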
        ByteBuffer model = loadModelFile();
        Interpreter.Options options = new Interpreter.Options();
        tflite = new Interpreter(model,options);
    } catch (IOException e) {
        e.printStackTrace();
    }

    final TextView textView1 = findViewById(R.id.textView1);
    Button button1 = findViewById(R.id.button1);

    final TextView textView2 = findViewById(R.id.textView2);
    Button button2 = findViewById(R.id.button2);

    final TextView result = findViewById(R.id.result);
    Button predict = findViewById(R.id.predict);

    button1.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View view) {
            String str = "";
            try {
                InputStream inputStream = getAssets().open("Up_data.txt");
                int size = inputStream.available();
                byte[] buffer = new byte[size];
                inputStream.read(buffer);

                str = new String(buffer);
            } catch (IOException e) {
                e.printStackTrace();
            }

            String[] stringArray = str.split(",");
            inputArray = new double[stringArray.length];
            for (int i = 0; i < stringArray.length; i++) {
                inputArray[i] = Double.parseDouble(stringArray[i]);
            }
            String inputArrayString = Arrays.toString(inputArray);
            textView1.setText(inputArrayString);

        }
    });

    button2.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View view) {
            String str = "";
            try {
                InputStream inputStream = getAssets().open("Down_data.txt");
                int size = inputStream.available();
                byte[] buffer = new byte[size];
                inputStream.read(buffer);

                str = new String(buffer);
            } catch (IOException e) {
                e.printStackTrace();
            }
            String[] stringArray = str.split(",");
            inputArray = new double[stringArray.length];
            for (int i = 0; i < stringArray.length; i++) {
                inputArray[i] = Double.parseDouble(stringArray[i]);
            }
            String inputArrayString = Arrays.toString(inputArray);
            textView2.setText(inputArrayString);
        }
    });

    predict.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View view) {
            String predictedClassLabel = MainActivity.this.runPrediction(inputArray);
            result.setText(predictedClassLabel);
        }
    }
    );
}

public String runPrediction(double[] inputData) {
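    // StatUtils.normalize standardizes the input to zero mean and unit variance,
    // then the values are cast to float for the model's FLOAT32 input tensor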

    double[] normalizedArray = StatUtils.normalize(inputData);
    float[] floatArray = new float[normalizedArray.length];
    for (int i = 0; i< normalizedArray.length; i++) {
        floatArray[i] = (float) normalizedArray[i];
    }

    TensorBuffer inputBuffer = TensorBuffer.createFixedSize(new int[]{1, floatArray.length}, DataType.FLOAT32);
    inputBuffer.loadArray(floatArray);

    TensorBuffer outputBuffer = TensorBuffer.createFixedSize(new int[]{1, NUM_CLASSES}, DataType.FLOAT32);
    tflite.run(inputBuffer.getBuffer(), outputBuffer.getBuffer());

    float[] probabilities = outputBuffer.getFloatArray();
    Log.d("Probabilities", Arrays.toString(probabilities));
    int maxIndex = 0;
    for (int i = 0; i < probabilities.length; i++) {
        if (probabilities[i] < probabilities[maxIndex]) {
            maxIndex = i;
        }
    }
    String predictedClassLabel = CLASS_LABELS[maxIndex];
    return predictedClassLabel;
}


private ByteBuffer loadModelFile() throws IOException {
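    // Memory-map the TFLite model file bundled in the app's assets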
    AssetFileDescriptor fileDescriptor = getAssets().openFd("model27may.tflite");
    FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
    FileChannel fileChannel = inputStream.getChannel();
    long startOffset = fileDescriptor.getStartOffset();
    long declaredLength = fileDescriptor.getDeclaredLength();
    return fileChannel.map(FileChannel.MapMode.READ_ONLY,startOffset,declaredLength);
}

}

This is the Java Android code. Please let me know where I am going wrong.