Virtual Environment vs Containers

I currently have a virtual environment inside which I have installed all the packages that I want (tensorflow, …) and I use it to work locally. I constantly update the packages to the latest releases; however, I am usually stuck on Python 3.6.
Is this the best approach for working locally? Or are containers a better approach, where I can easily update my packages, including Python's version?
If so, are there any guidelines on how to create my environment inside containers?




I found that while containers have more of a learning curve, they are much easier in the long run. TensorFlow even provides containers with the source, so you can build the TensorFlow versions you need without installing a bunch of tooling on your system. If you use a GPU, Nvidia-docker is by far the easiest way to get up and running with TensorFlow on a GPU. Best of all, you avoid the "it works on my machine" issues: you can ship your container to the cloud or hand it to other developers so they can pick it up and join your project.
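To make that concrete, here is a minimal Dockerfile sketch for building your own environment on top of a TensorFlow image. The `requirements.txt` file and the `latest` tag are assumptions for illustration; pick the image tag that pins the Python and TensorFlow versions you actually want.

```dockerfile
# Sketch: extend an official TensorFlow image with your own packages.
# The base tag ("latest" here) determines both the Python and TF versions.
FROM tensorflow/tensorflow:latest

# Install the rest of your dependencies from a (hypothetical) requirements.txt.
COPY requirements.txt /tmp/requirements.txt
RUN pip install --no-cache-dir -r /tmp/requirements.txt

# Copy your project code into the image.
WORKDIR /app
COPY . /app
```

You would build and enter it with something like `docker build -t my-tf-env .` followed by `docker run -it --rm my-tf-env bash` (the image name `my-tf-env` is made up here). Upgrading Python or TensorFlow then becomes a matter of changing the base tag and rebuilding, rather than mutating a long-lived venv.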


Venvs are very different from containers. One of the main differences is that a container/image relates to a whole OS, not only Python.
If you want to explore containers with TF, you can follow our official Docker guide and share with us any feedback or issues related to the docs.
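As a quick first step in the spirit of that guide, you can pull and run an image directly without writing any Dockerfile; the commands below are a sketch of the typical workflow (the `latest` and `latest-gpu` tags are just two of the published tags):

```shell
# Pull the current TensorFlow CPU image from Docker Hub.
docker pull tensorflow/tensorflow:latest

# Start a throwaway container and print the TensorFlow version inside it.
docker run -it --rm tensorflow/tensorflow:latest \
    python -c "import tensorflow as tf; print(tf.__version__)"

# For GPU support (requires the NVIDIA Container Toolkit on the host):
# docker run --gpus all -it --rm tensorflow/tensorflow:latest-gpu bash
```

This gives you a clean, disposable environment per run; mounting your code with `-v "$PWD":/app` lets you work on local files from inside the container.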