Memory Leaks with TensorFlow.js hand gestures

Hey guys,
Over the last few months, I’ve been using the same HandPose model that Minko Gechev, a software engineer on Google’s Angular team, used in his demo here: Controlling an Angular app with a camera via hand gestures - YouTube

I integrated the source code he shared on GitHub into my Angular application, with a slight change so that I can navigate between the different steps (of the order-creation and product-creation processes) instead of “open/close menu & choose a menu item”.

Here are the dependencies that I’ve added to my project for the hand gestures feature:
@tensorflow-models/handpose: 0.0.7
@tensorflow/tfjs-backend-webgl: 3.7.0
@tensorflow/tfjs-converter: 3.7.0
@tensorflow/tfjs-core: 3.7.0
fingerpose: 0.0.2

But I’ve run into a performance problem: memory consumption grows continuously while the application is in use (even after the hand-tracking model has finished loading and started working). I’ve published an article regarding this topic:

and video here: How TensorFlow.js Became a Performance Bottleneck for My App - YouTube

I’ve taken many heap snapshots and noticed that whenever I go to the order- or product-editing view (where hand-gesture tracking is present) and then come back to the products-list view (where there is no hand tracking), the snapshot size increases by almost 20 MB, as you can see in the attached screenshots.

I’m using a 13" MacBook Pro running macOS 11.5.2.

My app is available online here:
The source code is on GitHub here:

I’d appreciate any pointers that could help me fix this memory leak. Many thanks!

Welcome to the forum - let us take a look and get back to you on this. Thank you for sharing the working links and resources for us to check. Most appreciated.


Hello there. I took a look at your working application and found the following:

  1. I loaded your app; everything was fine so far.
  2. I went to the order page and, as expected, you load the hand model, which takes a moment as it is a few MB in size. Once it loaded, I took a heap snapshot: ~25 MB.
  3. I went back to the main screen and took another snapshot: still ~25 MB.
  4. I went back to make a new order; there was a pause again. I took another snapshot and, as you mentioned, it doubled to ~50 MB.

I figured this had something to do with the logic of your web app, as it seems to load a new copy of the model every time you change views. I confirmed this by checking the network tab here:

Essentially, it seems you are calling model.load() every time the view changes, which is not recommended. Instead you should load the model once on page load (or on first use), keep it in a constant, and after that only call model.estimateHands().

You do not need to load the model repeatedly - as you have seen, this causes memory issues if you do not release the old memory using model.dispose(). It can also make your app slow at that moment, because loading the model is a very intensive process that on some browsers can block GUI updates while it runs (our team is actually looking into making the load less intensive, but that is not the main issue here - the main issue is that the old memory is not being disposed). Essentially, the model should be loaded only once and then cached.

If you change the logic of your application so that it does not load the model on every view change, this should resolve your situation.
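A minimal sketch of the “load once, reuse” pattern described above. The `loadModel` argument here is an illustrative stand-in for handpose.load() from @tensorflow-models/handpose; the real call and model type belong in your app.

```typescript
// Cache the loading promise at module scope so the expensive load runs
// at most once for the lifetime of the page.
let cachedModel: Promise<unknown> | null = null;

// Every caller gets the same in-flight (or resolved) promise, so no view
// change ever triggers a second download/parse of the model.
function getModelOnce<T>(loadModel: () => Promise<T>): Promise<T> {
  if (cachedModel === null) {
    cachedModel = loadModel();
  }
  return cachedModel as Promise<T>;
}
```

Each view that needs hand tracking then awaits getModelOnce(...) and only calls estimateHands() on the resolved model.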


Many thanks Jason, that was a valuable insight!

In my project, I have a dynamic stepper implemented as an Angular library and it has the navigation between steps with hand tracking as a feature in it.

<lib-dynamic-stepper [steps]="steps"

DynamicStepperModule is used in the order-creation and product-details views and is the one responsible for loading and releasing the hand-pose model (via its HandGestureService), rather than the main app (AppComponent) as in Minko’s demo.

The solution that I see for my case is to release the model whenever the user leaves the details view. But I just couldn’t find the .dispose() method.

The source code I have calls handpose.load() (as you can see on line 50 in hand-gesture.service.ts), which returns a Promise, where “handpose” comes from @tensorflow-models. On line 62 there is a call to model.estimateHands(video), but neither “handpose” nor “HandPose” has a “dispose()” method.

Glad it was useful!

I would highly recommend not reloading the TensorFlow.js model on every view change, as the load is non-trivial and will block your GUI (and even when we make it less intensive, it is still going to use a lot of resources and potentially re-download the model every time when it does not need to, depending on browser settings).

I would suggest loading the TFJS model once, storing it in a constant that does not get destroyed between view changes, and then just calling the prediction function when needed. This will lead to a much more performant web app and save the bandwidth of re-downloading the model on every view change if you are not also caching it locally, e.g. in localStorage.
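In an Angular app this advice could look like a service that owns the model for the whole app lifetime. This is an illustrative sketch, not your actual HandGestureService: HandPoseLike and the injected loader are stand-ins for the real HandPose type and handpose.load(), and in Angular the class would be registered as an @Injectable({ providedIn: 'root' }) service so it survives view changes.

```typescript
// Stand-in for the real HandPose model type from @tensorflow-models/handpose.
interface HandPoseLike {
  estimateHands(video: unknown): Promise<unknown[]>;
}

class HandGestureService {
  private modelPromise: Promise<HandPoseLike> | null = null;

  constructor(private readonly loadModel: () => Promise<HandPoseLike>) {}

  // Caching the *promise* (not the resolved model) means that even if two
  // views request the model at the same time, only one load is started.
  getModel(): Promise<HandPoseLike> {
    if (!this.modelPromise) {
      this.modelPromise = this.loadModel();
    }
    return this.modelPromise;
  }
}
```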

Your suggested solution would fix the memory leak but not the performance issue, as the browser would have to re-parse everything every time and send it from CPU to GPU memory, which is very intensive.

Good luck!


Thanks Jason,
the memory leak is fixed :slight_smile: after saving the HandPose model in a variable (_handPoseModel: HandPose) in HandGestureService and wrapping model.estimateHands(video).then(..) in video.addEventListener('loadeddata', (e) => {..})
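For readers following along, here is a small sketch of the fix described above: cache the model in a field and only run estimateHands once the video element has frame data. The interfaces are illustrative stand-ins for HandPose (from @tensorflow-models/handpose) and for HTMLVideoElement, so the sketch stays self-contained.

```typescript
// Stand-ins for the real HandPose model and HTMLVideoElement.
interface HandPoseModel {
  estimateHands(video: VideoLike): Promise<unknown[]>;
}
interface VideoLike {
  addEventListener(type: string, listener: () => void): void;
}

class GestureTracker {
  private _handPoseModel: HandPoseModel | null = null;

  async start(video: VideoLike, load: () => Promise<HandPoseModel>): Promise<void> {
    // Load only on first use; later calls reuse the cached model.
    this._handPoseModel ??= await load();
    video.addEventListener('loadeddata', () => {
      // Safe to predict now: the video has started delivering frames.
      void this._handPoseModel!.estimateHands(video);
    });
  }
}
```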

The solution is available here:

and an article about it is available here.

Glad you got it solved and thanks for writing the follow up article :slight_smile: Good luck on your TensorFlow.js journey!


PS I just finished reading your new article. A few comments that may be of use to you:

  1. Your browser should usually cache downloaded files (depending on your browser settings), so you should see that the model loads faster after the first time (as you saw in your write-up); keeping the model in a singleton, as you mentioned, is correct for most use cases.

  2. You only need localStorage / IndexedDB for offline functionality, e.g. in a Progressive Web App or similar, if you want the model to work offline.

We have function calls in our library APIs that save to localStorage / IndexedDB etc. for you. See this page:

That should make your life easier when working with “raw TFJS models” (i.e. when working with model.json directly) in the future, beyond these pre-made, easy-to-use classes!


Thanks for the note. I had seen the “Save and load TensorFlow.js models” page before opting to save the model in a singleton, but I didn’t find a .save() method on my model object that would let me call: const saveResult = await'localstorage://my-model-1');

Yep, that is because the model you are using is one of our “showcase” pre-made models, which are wrapped in a nice, easy-to-use JavaScript class the TensorFlow.js team wrote. It makes things simple for JS devs - no need to worry about Tensors and such to convert the webcam image into the correct input format for the model.

The link above applies when you are working with the raw TensorFlow.js model itself - that is, when you load the model.json file directly yourself (which our class does for you behind the scenes). The wrapper class would have to expose that save method, and to the best of my knowledge it does not in its current form; these models are designed to run the way you currently have it - cached by the browser and running as a singleton in the web app.
