Build Serving Image with Multiple Models

I don’t have a problem creating my own serving image in Docker with a single model, but when I try to build a serving image with multiple models it doesn’t work.

Here are the commands I used:

Create Serving Image

docker run -d --name serving_base tensorflow/serving

Copy SavedModels and Model Config

docker cp /home/model1 serving_base:/models/model1
docker cp /home/model2 serving_base:/models/model2
docker cp /home/model.config serving_base:/models/model.config
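
For reference, a multi-model model.config for TensorFlow Serving is normally a text protobuf that lists each model by name and base path. A minimal sketch, assuming the model names model1 and model2 and the /models paths used above:

model_config_list {
  config {
    name: 'model1'
    base_path: '/models/model1'
    model_platform: 'tensorflow'
  }
  config {
    name: 'model2'
    base_path: '/models/model2'
    model_platform: 'tensorflow'
  }
}

The name field is what the REST and gRPC endpoints use to address each model (e.g. /v1/models/model1).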

Commit

docker commit serving_base acvision

Stop Serving Base

docker kill serving_base

Checking the Docker image

docker run --rm --name serve_acvision -p8500:8500 -p8501:8501 -d acvision
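
One thing worth noting for the multi-model case: the tensorflow/serving entrypoint defaults to serving a single model, so the config file usually has to be passed as an extra argument when the container starts (arguments after the image name are forwarded to tensorflow_model_server). A sketch, assuming the config path copied above:

docker run --rm --name serve_acvision -p 8500:8500 -p 8501:8501 -d acvision --model_config_file=/models/model.config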

Checking whether the 1st model is working:

[screenshot]
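
As a quick check that does not need a screenshot, the status of each model can also be queried over the REST API (assuming the model names above):

curl http://localhost:8501/v1/models/model1
curl http://localhost:8501/v1/models/model2

Each call should return a model_version_status entry with state AVAILABLE once that model has loaded.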

Could you please refer to the similar issues link1 and link2?
Hope this helps. Thanks.


I’m able to run it with multiple models, but my problem is serving them from a Docker image.

Like this: [screenshot]

@IreneGi - Can you help?

@Robert_Crowe Thanks, I’m still facing the problem. I also added batching inference like this: [screenshot]

Then I tried to copy the batching parameters file into the container’s model folder, but I have no idea whether it worked.
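
For comparison, batching in TensorFlow Serving is normally enabled with --enable_batching plus a --batching_parameters_file flag, rather than by the file's location alone. A sketch under those assumptions (the /models/batching_parameters.txt path and the values are placeholders, not taken from this thread):

# contents of /models/batching_parameters.txt (text protobuf)
max_batch_size { value: 32 }
batch_timeout_micros { value: 1000 }
num_batch_threads { value: 4 }
max_enqueued_batches { value: 100 }

# start the committed image with batching enabled
docker run --rm -p 8500:8500 -p 8501:8501 -d acvision --model_config_file=/models/model.config --enable_batching=true --batching_parameters_file=/models/batching_parameters.txt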

Please open a GitHub issue (Issues · tensorflow/serving · GitHub), as it is easier to route and track there.