How to save a personalization model during on-device training

To the TFLite team:
How can I save a personalization model while it is being trained on device? If the saved model can be transmitted, it may be possible to implement federated learning while preserving user privacy and security. Is this feature supported, or will it be supported soon?


Does this also mean that if I exit the app, the trained TFLite model will be lost? Looking forward to a reply.

I think this can help you: On-Device Training with TensorFlow Lite
(there is a specific section on saving the trained weights on device)
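For reference, the pattern in that guide is to wrap the Keras model in a tf.Module and expose training, saving and restoring as named signatures. A minimal sketch along those lines (the layer sizes and names here are only illustrative, not your actual model); the important part is that the `save` signature enumerates all of the model's weights, so the whole model, not just one layer, goes into the checkpoint:

```python
import tensorflow as tf

IMG_SIZE = 28  # illustrative input size; use your own model's shape


class Model(tf.Module):
    """Wraps a Keras model and exposes train/infer/save/restore as signatures."""

    def __init__(self):
        self.model = tf.keras.Sequential([
            tf.keras.layers.Flatten(input_shape=(IMG_SIZE, IMG_SIZE)),
            tf.keras.layers.Dense(128, activation='relu'),
            tf.keras.layers.Dense(10),
        ])
        self.model.compile(
            optimizer='sgd',
            loss=tf.keras.losses.CategoricalCrossentropy(from_logits=True))

    @tf.function(input_signature=[
        tf.TensorSpec([None, IMG_SIZE, IMG_SIZE], tf.float32),
        tf.TensorSpec([None, 10], tf.float32),
    ])
    def train(self, x, y):
        # One gradient step over all trainable variables.
        with tf.GradientTape() as tape:
            prediction = self.model(x)
            loss = self.model.loss(y, prediction)
        gradients = tape.gradient(loss, self.model.trainable_variables)
        self.model.optimizer.apply_gradients(
            zip(gradients, self.model.trainable_variables))
        return {"loss": loss}

    @tf.function(input_signature=[
        tf.TensorSpec([None, IMG_SIZE, IMG_SIZE], tf.float32),
    ])
    def infer(self, x):
        return {"logits": self.model(x)}

    # Writes *every* weight in the model to one checkpoint file,
    # not just the last layer.
    @tf.function(input_signature=[tf.TensorSpec(shape=[], dtype=tf.string)])
    def save(self, checkpoint_path):
        tensor_names = [w.name for w in self.model.weights]
        tensors_to_save = [w.read_value() for w in self.model.weights]
        tf.raw_ops.Save(
            filename=checkpoint_path, tensor_names=tensor_names,
            data=tensors_to_save, name='save')
        return {"checkpoint_path": checkpoint_path}

    # Reads the same set of weights back from a checkpoint written by `save`.
    @tf.function(input_signature=[tf.TensorSpec(shape=[], dtype=tf.string)])
    def restore(self, checkpoint_path):
        restored = {}
        for var in self.model.weights:
            value = tf.raw_ops.Restore(
                file_pattern=checkpoint_path, tensor_name=var.name,
                dt=var.dtype, name='restore')
            var.assign(value)
            restored[var.name] = value
        return restored
```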


Thank you very much for your reply. I followed that section, but it only seems to save the parameters of the last layer of the model (as a .ckpt). Is there a way to save all of the model's parameters, ideally including the .tflite itself?

Won’t the checkpoint have all of the model’s parameters?
I can’t test it right now, but I’d expect it to contain the full model.
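If the `save` signature enumerates all the weights, the single .ckpt should hold the full model. A rough (untested here) sketch with the Python interpreter, assuming the signatures from the earlier sketch and placeholder paths; the Java/Kotlin Interpreter has an equivalent runSignature call for the same signature names:

```python
import numpy as np
import tensorflow as tf

# Assumes 'model.tflite' was converted from the Model sketched above,
# with its "train", "save" and "restore" signatures included.
interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()

# ... run the "train" signature over the on-device data ...

# Writing the checkpoint: every tensor enumerated in the `save`
# signature (i.e. all of model.weights) is written to this one file.
save = interpreter.get_signature_runner('save')
save(checkpoint_path=np.array('/tmp/model.ckpt', dtype=np.bytes_))

# Later (e.g. after the app restarts and the interpreter is recreated),
# load the trained weights back before running inference again.
restore = interpreter.get_signature_runner('restore')
restore(checkpoint_path=np.array('/tmp/model.ckpt', dtype=np.bytes_))
```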

Yes, after following the On-Device Training example I was only able to save the model parameters for the last layer. I'm not sure whether I got something wrong, but I did follow the example.

I’m very sorry, it was my mistake. I’ve figured out how to save the model as well as train it flexibly (by defining the relevant interpreter functions on the server side), thank you!
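For anyone who finds this thread later: the "defining the relevant functions on the server" step is essentially exporting the module's tf.functions as signatures before conversion, so the on-device interpreter can call them by name. A minimal sketch following the guide (paths and names are placeholders, not necessarily what was used here):

```python
import tensorflow as tf

m = Model()  # the tf.Module sketched earlier in the thread

SAVED_MODEL_DIR = '/tmp/saved_model'  # placeholder path

# Export every function the on-device interpreter should be able to
# call as a named signature.
tf.saved_model.save(
    m, SAVED_MODEL_DIR,
    signatures={
        'train': m.train.get_concrete_function(),
        'infer': m.infer.get_concrete_function(),
        'save': m.save.get_concrete_function(),
        'restore': m.restore.get_concrete_function(),
    })

converter = tf.lite.TFLiteConverter.from_saved_model(SAVED_MODEL_DIR)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,   # regular TFLite kernels
    tf.lite.OpsSet.SELECT_TF_OPS,     # needed for the Save/Restore TF ops
]
converter.experimental_enable_resource_variables = True
tflite_model = converter.convert()

with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
```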


Hi zirui_lian, can you please share how you saved the model?