Error on predict() - High memory usage in GPU: 1179.94 MB, most likely due to a memory leak

Hello there :wave:,
I’m developing a simple React Native app (managed by Expo) that should detect/recognize text from the live stream coming from a TensorCamera.

I found these tflite models and, thanks to the amazing work of PINTO0309, I’ve converted them to json + bin files.

Following the official documentation, I’ve written the TensorCamera onReady callback like this:

const handleCameraStream = (images: IterableIterator<tf.Tensor3D>, updateCameraPreview: () => void, gl: ExpoWebGLRenderingContext) => {
		const loop = async () => {
			if(!images) return;

			if (frameCount % makePredictionsEveryNFrames === 0) {
				const imageTensor = images.next().value;
				if(!imageTensor) return;
				
				if (model) {
					const tensor4d = imageTensor.expandDims(0);

					const predictions = await model.predict(tensor4d.cast('float32'))
					console.log('🎉 - Predictions: ', predictions);

					tensor4d.dispose();
				}

				imageTensor.dispose();
			}

			frameCount++;
			frameCount = frameCount % makePredictionsEveryNFrames;
			
			requestAnimationFrameId = requestAnimationFrame(loop);
		};

		loop();
	}

TensorCamera:

let textureDims;
if (Platform.OS === 'ios') 
	textureDims = { height: 1920, width: 1080 };
else 
	textureDims = { height: 1200, width: 1600 };

<TensorCamera
	style={ styles.camera }
	cameraTextureHeight={textureDims.height}
	cameraTextureWidth={textureDims.width}
	useCustomShadersToResize={false}
	type={CameraType.back}
	resizeHeight={800}
	resizeWidth={600}
	resizeDepth={3}
	onReady={handleCameraStream}
	autorender={true}
/> 

Unfortunately I get a memory leak warning and then the app crashes!

WARN  High memory usage in GPU: 1179.94 MB, most likely due to a memory leak

I’ve tried both the tf.tidy() and tf.dispose() functions, but the error persists.

What am I doing wrong?
How can I improve memory handling?

Thank you :pray:

Hi @Giuseppe_Ravida,

While I’m not an expert, here are some suggestions to improve memory handling in your React Native app. Make sure you dispose of every TensorFlow tensor you create (imageTensor, the float32-cast input, and especially predictions, which is never released in your loop) and of the model itself once they are no longer needed. Use tf.tidy() to automatically dispose of intermediate tensors. Beyond that, consider shrinking the model, reducing the camera texture dimensions, and minimizing unnecessary tensor creation to ease memory pressure.

I hope this helps!

Thanks.

Hey :wave:,
thanks for your help. :hugs:

Honestly I’m a bit stuck. I’ve also tried tf.setBackend('cpu'); the error disappears, but the application freezes (on predict).

I’m struggling :sob:

Hi @Giuseppe_Ravida,

for debugging you can also try the tf.memory() function.
Feel free to share a short console log of model.summary() and tf.memory()
(before and after the predict call).
Looking forward,
Dennis