How to cache the detector in the browser when using a face detection pretrained model

I’ve used a face detection pretrained model with tfjs in React like this:

      import * as faceDetection from "@tensorflow-models/face-detection";
      ...
      const model = faceDetection.SupportedModels.MediaPipeFaceDetector;
      const detectorConfig = {
        runtime: "tfjs",
        maxFaces: 5,
        modelType: "short",
      };

      const faceDetector = await faceDetection.createDetector(
        model,
        detectorConfig
      );

When the browser is refreshed, the detector loads again from scratch instead of from cache. How do I make the detector persist in the browser cache so that it loads from there the second, third, and subsequent times?

@irufano you can leverage browser storage mechanisms like localStorage to save and retrieve the model when the page is refreshed. Here’s an example of how you might modify your code to implement caching:


import React, { useEffect, useState } from "react";
import * as faceDetection from "@tensorflow-models/face-detection";

const FaceDetectionComponent = () => {
  const [faceDetector, setFaceDetector] = useState(null);

  useEffect(() => {
    const loadFaceDetector = async () => {
      // Check if the face detector is already in localStorage
      const cachedFaceDetector = localStorage.getItem("cachedFaceDetector");

      if (cachedFaceDetector) {
        // If cached, parse and set the face detector
        setFaceDetector(faceDetection.loadFromJSON(JSON.parse(cachedFaceDetector)));
      } else {
        // If not cached, create a new face detector and cache it
        const model = faceDetection.SupportedModels.MediaPipeFaceDetector;
        const detectorConfig = {
          runtime: "tfjs",
          maxFaces: 5,
          modelType: "short",
        };

        const newFaceDetector = await faceDetection.createDetector(model, detectorConfig);

        // Cache the face detector
        localStorage.setItem("cachedFaceDetector", JSON.stringify(newFaceDetector));

        setFaceDetector(newFaceDetector);
      }
    };

    loadFaceDetector();
  }, []); // Run only once on component mount

  // Rest of your component logic using faceDetector...

  return (
    <div>
      {/* Your component JSX */}
    </div>
  );
};

export default FaceDetectionComponent;

This modified code checks whether the face detector is already cached in localStorage. If it is, it loads the cached detector; otherwise, it creates a new one and caches it for future use. The caching is done using localStorage.setItem and retrieved using localStorage.getItem.

Keep in mind that caching the model in this way assumes that the model is static and does not change frequently. If the model is updated regularly, you may need to implement additional logic to handle model versioning and updates.
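For example, a minimal version-gating sketch (MODEL_VERSION and the key names are placeholders of my own, not part of any tfjs API):

const MODEL_VERSION = "v1"; // placeholder: bump this when you ship a new model
const versionKey = "faceDetectorVersion";

// Invalidate the cached detector whenever the version string changes
if (localStorage.getItem(versionKey) !== MODEL_VERSION) {
  localStorage.removeItem("cachedFaceDetector");
  localStorage.setItem(versionKey, MODEL_VERSION);
}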

Let me know if it works; otherwise we can dig deeper into this.

I got an error: the instance cannot be converted to JSON.

TypeError: Converting circular structure to JSON
    --> starting at object with constructor 'Object'
    |     property 'inputs' -> object with constructor 'Array'
    |     index 0 -> object with constructor 'Object'
    |     property 'children' -> object with constructor 'Array'
    --- index 0 closes the circle
    at JSON.stringify (<anonymous>)

And when I try without JSON.stringify, the saved detector is not an instance but becomes [object Object]. Then when I call estimateFaces I get this error:

TypeError: this.detector.estimateFaces is not a function
    at FaceDetector.detect

@irufano

Hmmm… the error you’re encountering, “TypeError: Converting circular structure to JSON,” indicates that there’s a circular reference within the object you’re trying to convert to JSON.

In the context of TensorFlow.js models, this happens because the model object contains references to other objects (layers reference their children, which reference back), creating a circular structure. To address this, you can use a custom serialization approach that selectively serializes only the necessary information from the model. TensorFlow.js LayersModel objects expose a toJSON method, but note that it captures only the model topology, not the weights.
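For what it’s worth, TensorFlow.js also ships its own browser persistence through indexeddb:// URLs, which sidesteps JSON serialization entirely. A minimal sketch, assuming you can get a handle on the underlying tf.LayersModel (the face-detection detector wraps its model internally, so this shows the general mechanism rather than a drop-in fix; the storage key and function names are just examples):

import * as tf from "@tensorflow/tfjs";

async function cacheModel(model) {
  // Writes topology + weights into the browser's IndexedDB
  await model.save("indexeddb://face-detector");
}

async function restoreModel() {
  // Reads the model back from IndexedDB on a later page load
  return tf.loadLayersModel("indexeddb://face-detector");
}

On the next refresh, restoreModel() returns without touching the network, which is effectively the caching behavior asked about above.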

By the way, I used the code below, based on the face detection demo examples in the official tfjs GitHub here:

const model = faceDetection.SupportedModels.MediaPipeFaceDetector;
const detectorConfig = {
  runtime: "tfjs",
  maxFaces: 5,
  modelType: "short",
};

const newFaceDetector = await faceDetection.createDetector(model, detectorConfig);

The question is: how do I use a pretrained model like face detection directly, without implementing createDetector as in the example demo?

@irufano

Here’s how you can modify the code to use a pre-trained model:

import * as tf from "@tensorflow/tfjs";

const modelCacheKey = "cachedFaceDetector";

// Check if the model is already in localStorage
const cachedModel = localStorage.getItem(modelCacheKey);

let faceDetector;

if (cachedModel) {
  // If cached, parse and set the face detector
  faceDetector = await tf.loadLayersModel(JSON.parse(cachedModel));
} else {
  // If not cached, load a pre-trained face detection model
  const model = await tf.loadLayersModel('path/to/pretrained/model.json');

  faceDetector = model;

  // Cache only the necessary information
  const serializedModel = await model.toJSON();
  localStorage.setItem(modelCacheKey, JSON.stringify(serializedModel));
}

// Use the faceDetector as needed

In this code, replace 'path/to/pretrained/model.json' with the actual path or URL to the pre-trained face detection model you want to use. This way, you can skip the explicit implementation of createDetector and directly load the pre-trained model.

Make sure the pre-trained model is compatible with the tf.loadLayersModel method, and its architecture matches the expectations of your application.
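If you end up using tfjs’s built-in model storage rather than raw localStorage, you can double-check what has actually been persisted with tf.io.listModels(). A small sketch (the indexeddb:// key in the comment is just an assumed example):

import * as tf from "@tensorflow/tfjs";

async function inspectLocalModels() {
  // Lists every model tfjs has persisted locally, keyed by storage URL
  const stored = await tf.io.listModels();
  console.log(stored); // e.g. { "indexeddb://face-detector": { ...metadata } }
}

This is handy for debugging cache hits versus misses across page refreshes.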

I got this error

http.ts:173 Uncaught (in promise) Error: Failed to parse model JSON of response from https://storage.googleapis.com/mediapipe-models/face_detector/blaze_face_short_range/float16/1/blaze_face_short_range.tflite. Please make sure the server is serving valid JSON for this request.
    at HTTPRequest.load (http.ts:173:1)
    at async loadLayersModelFromIOHandler (models.ts:293:1)
    at async loadModel (faceTf.js:18:1)

from this

 console.log("loadLayersModel");
    // If not cached, load a pre-trained face detection model
    const model = await tf.loadLayersModel(
      `https://storage.googleapis.com/mediapipe-models/face_detector/blaze_face_short_range/float16/1/blaze_face_short_range.tflite`
    );
    console.log("HERE ",model)

    faceDetector = model;
    
    // Cache only the necessary information
    const serializedModel = model.toJSON();
    
    localStorage.setItem(modelCacheKey, JSON.stringify(serializedModel));

It seems blaze_face_short_range.tflite is not valid JSON.

The model used in the demos is

export const DEFAULT_DETECTOR_MODEL_URL_SHORT =
    'https://tfhub.dev/mediapipe/tfjs-model/face_detection/short/1';

How do I access it with loadLayersModel from tfjs core?

Hey @irufano !

The issue is that tf.loadLayersModel expects the model architecture and weights to be served as JSON, but the provided URL points to a TFLite (TensorFlow Lite) model, which is a different, binary format.

To load a TensorFlow Lite model in TensorFlow.js, you can use the loadTFLiteModel function from the @tensorflow/tfjs-tflite package. Here’s how you can modify the code to load the TensorFlow Lite model:

import * as tflite from "@tensorflow/tfjs-tflite";

const modelCacheKey = "cachedFaceDetector";
const modelUrl = 'https://storage.googleapis.com/mediapipe-models/face_detector/blaze_face_short_range/float16/1/blaze_face_short_range.tflite';

// Check if the model has already been loaded once
// (only a flag is stored, since TFLite models cannot be serialized to JSON)
const cachedModel = localStorage.getItem(modelCacheKey);

let faceDetector;

if (cachedModel) {
  // Already loaded before; the browser's HTTP cache makes this fetch fast
  faceDetector = await tflite.loadTFLiteModel(modelUrl);
} else {
  // First load: fetch the pre-trained face detection model (TensorFlow Lite)
  const model = await tflite.loadTFLiteModel(modelUrl);
  console.log("HERE ", model);

  faceDetector = model;

  // TFLite models cannot be serialized with toJSON, so just record a flag
  localStorage.setItem(modelCacheKey, 'cached');
}

// Use the faceDetector as needed

In this modification, I replaced tf.loadLayersModel with tflite.loadTFLiteModel so the TensorFlow Lite model is loaded directly. TFLite models cannot be serialized with toJSON, so that step is skipped; the localStorage.setItem line is just a flag to indicate that the model has been loaded once, while the browser’s HTTP cache should serve the actual model file on repeat loads.

Make sure the TensorFlow.js version you are using is compatible with the @tensorflow/tfjs-tflite package. If you encounter any issues, check the TensorFlow.js documentation for the specific versions you are using.
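Also, regarding the TF Hub URL you posted earlier: that endpoint serves a tfjs graph model, not a Layers model, so tf.loadGraphModel with the fromTFHub option is the matching loader for it. A rough sketch, untested against this exact model (the function name and IndexedDB key are mine):

import * as tf from "@tensorflow/tfjs";

async function loadDemoDetectorModel() {
  // TF Hub serves this as a tfjs graph model, so loadGraphModel
  // (not loadLayersModel) is the matching loader
  const model = await tf.loadGraphModel(
    "https://tfhub.dev/mediapipe/tfjs-model/face_detection/short/1",
    { fromTFHub: true }
  );

  // Graph models can also be persisted locally, e.g. to IndexedDB
  await model.save("indexeddb://face-detection-short");
  return model;
}

Once it’s in IndexedDB, subsequent page loads can restore it with tf.loadGraphModel("indexeddb://face-detection-short") instead of hitting the network.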

Thank you bro for your help :pray:


@irufano you’re most welcome. If you ever need any assistance in the future, I am just a text away.