What does your coding environment look like?

100% Colab for ML-related stuff, and I think Colab Pro is a pretty awesome deal as well.

3 Likes

Colab pro for the win! :slight_smile:

3 Likes

Colab and VS Code with a TensorFlow dev container

5 Likes

Colab Pro is my favorite for day-to-day development needs, but I also love keeping a VS Code workspace always open and ready to quickly test ideas.
As for my general workflow: I develop and verify that my model can learn with the setup described above, then I clean up and convert my code into a Python package using VS Code and push it to a private repo. From there I can easily go serverless for training with Cloud AI Platform (and Cloud TPU for giant networks). I always try to make the TensorFlow IO API the only dependency of my data pipeline, which boosts portability across filesystems and accelerators :blush:
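A minimal sketch of that portability idea using plain `tf.data` (the poster's actual TensorFlow IO usage may differ; the file pattern and tuning knobs here are illustrative): because TF resolves the filesystem from the path prefix, the same pipeline code reads from local disk, `gs://` buckets, or `hdfs://` paths unchanged.

```python
import tensorflow as tf

def make_dataset(file_pattern, batch_size=32):
    """Filesystem-agnostic input pipeline sketch.

    tf.data picks the filesystem from the path prefix, so the same code
    reads local files, gs:// buckets, or hdfs:// paths unchanged.
    The pattern and batch size are illustrative.
    """
    files = tf.data.Dataset.list_files(file_pattern, shuffle=False)
    ds = tf.data.TFRecordDataset(files, num_parallel_reads=tf.data.AUTOTUNE)
    return ds.batch(batch_size).prefetch(tf.data.AUTOTUNE)
```

Swapping a local glob for a `gs://bucket/...` pattern is then the only change needed when moving the same package from a workstation to Cloud AI Platform.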

2 Likes

I do a lot of prototyping in Colab, including capability-uplift work, but switch to VSCode as quickly as I can. Very, very rarely do I work with a client that has a greenfield setup; it’s always an existing mix of technologies from AWS, GCP & Azure. Emacs for all my hobby work.

2 Likes

We have standard VS Code dev containers available at vscode-dev-containers/repository-containers/github.com/tensorflow at main · microsoft/vscode-dev-containers · GitHub
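For anyone who hasn't tried them: a `.devcontainer/devcontainer.json` roughly along these lines is enough to open a TF environment in VS Code or Codespaces. This is an illustrative sketch only (image tag and extension list are my own choices); the linked repo has the maintained definitions.

```json
{
  "name": "tensorflow-sketch",
  "image": "tensorflow/tensorflow:latest-gpu-jupyter",
  "customizations": {
    "vscode": {
      "extensions": ["ms-python.python"]
    }
  },
  "runArgs": ["--gpus", "all"]
}
```

Drop `runArgs` if the host has no GPU; the CPU-only `tensorflow/tensorflow:latest` image works the same way.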

5 Likes

I used to be on Ubuntu; now I code on my own GPU workstation with Windows + WSL2 Ubuntu. Both systems share the same disk storage, so I can write code in VS Code on Windows and run projects from the Linux command line. CUDA is also supported natively, and TF seems to be working well so far!

4 Likes

Totally depends on the use case:

  • Colab if it’s open source ML work or something Python-related that doesn’t require my local resources
  • Kaggle kernels if I am doing something with data on Kaggle or public BigQuery datasets
  • cloud notebooks for most ML/data tasks related to work
  • VS Code for almost everything else, from normal coding to documentation to jotting down ideas, etc.

My ideal tool would be something which is an amalgamation of

  • GitHub Workspaces-style repo-level environment setup (which I think is built on top of VS Code),
  • where that environment can use Colab/Kaggle/cloud notebook/normal VM backends,
  • lets me expose a local directory for local resources (mount-local-disk-over-the-network type of scenario),
  • and lets me launch this complete environment on different OSes.

2 Likes

github workspaces

Do you mean GitHub Codespaces?

In that case it could be supported by the dev containers we have at
https://tensorflow-prod.ospodiscourse.com/t/what-does-your-coding-environment-look-like/33/9

2 Likes

Usually Colab or an AI Platform Notebook.

When I need access to TPUs, I usually create TFRecords (if the dataset is pretty large) inside an AI Platform Notebook and then use them inside a Colab notebook to take advantage of its free TPUs. Using this approach, I have been able to keep costs sane and still train large-scale models.
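For readers new to the format, the TFRecord step might look something like this. The feature names (`"image"`, `"label"`) and shapes are my own illustration, not from the post; TFRecords just store serialized `tf.train.Example` protos.

```python
import tensorflow as tf

def serialize_example(image_bytes, label):
    """Pack one (image, label) pair into a serialized tf.train.Example."""
    feature = {
        "image": tf.train.Feature(bytes_list=tf.train.BytesList(value=[image_bytes])),
        "label": tf.train.Feature(int64_list=tf.train.Int64List(value=[label])),
    }
    return tf.train.Example(features=tf.train.Features(feature=feature)).SerializeToString()

def parse_example(record):
    """Inverse of serialize_example, for use with dataset.map(...)."""
    spec = {
        "image": tf.io.FixedLenFeature([], tf.string),
        "label": tf.io.FixedLenFeature([], tf.int64),
    }
    return tf.io.parse_single_example(record, spec)
```

Writing goes through `tf.io.TFRecordWriter`; reading back is `tf.data.TFRecordDataset(path).map(parse_example)`, which is exactly what a Colab TPU run can consume from a GCS bucket.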

To clean up the work and modularize things, I defer to PyCharm.

Fun fact: I prefer not to set things up manually. This is why I LOVE working with Colab and AI Platform Notebooks so much.

4 Likes

I mostly use a mixture of PyCharm and IntelliJ depending on what language I’m working in. It’s a pity that neither PyCharm nor IntelliJ understand C/C++ code as it means I have to switch to CLion or VScode to be able to trace something through all of TF or TF-Java.

When developing models I prefer to do that in an IDE as well; the mutability of notebooks and their incompatibility with source control mean that they tend to cause issues. I admit I’m not that familiar with Colab, as other clouds are not allowed at work. Does it have good version control solutions?

1 Like

Typically work in IntelliJ + WSL2 preview w/ CUDA support in terminal. Occasionally I’ll jump to vscode or CLion if needed.

I use a Colab scratchpad for small tests and verifications in the browser when I don’t want to clutter my Drive.

2 Likes

I like to use a text editor such as Atom with the console. I have my own custom-built deep learning workstation at home. :grinning:

I do enjoy Colab when teaching others about TensorFlow. My students enjoy the environment there.

2 Likes

Probably in order of usage: Vim, Emacs, Colab (it’s nice for making slides too!). Mmm, does Google Sheets count as a coding environment? :slight_smile:

3 Likes

Mostly VS Code + a TensorFlow-with-GPU Docker container.

1 Like

My approach

  1. Bookmark the one and only Colab notebook, Untitled.ipynb, used for everything. Prototype new ideas there, and nowhere else.
  2. Decide from time to time: a) delete notebook cells because they’re rubbish, or b) keep them.
  3. If 2b), start to add docstrings and dream up some possible unit tests in Colab (e.g. checking dimensions of intermediate results).
  4. Start coding an actual Python package with unit tests, setup.py, and so forth.

Step 1 is what I would call “data science” or “ML research”. With steps 3 and 4 we enter software development or “ML engineering” territory. Imo it’s best to stay as long as possible in steps 1 and 3, where creativity happens (= changes), which is something you don’t really want in step 4 anymore.
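The kind of lightweight dimension check mentioned in step 3 can be sketched in plain Python; the function name and shapes below are made up purely to illustrate checking intermediate results before the code graduates from a notebook into a package.

```python
def flatten_batch(batch):
    """Flatten each 2-D sample in a batch of nested lists to 1-D.

    batch: list of H x W samples -> list of length-(H*W) vectors.
    """
    return [[v for row in sample for v in row] for sample in batch]

# The sort of throwaway assertions one jots down in a Colab cell:
batch = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]   # 2 samples of shape 2 x 2
flat = flatten_batch(batch)
assert len(flat) == len(batch)                 # batch dimension preserved
assert all(len(v) == 4 for v in flat)          # 2 * 2 -> 4 features each
```

In step 4 these asserts would move into a proper test module under `pytest` or `unittest` rather than living in the notebook.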

1 Like

I have Ubuntu under WSL, so I use VS Code + Jupyter notebooks to work locally; I can work, leave, and pick up later without worrying about losing a session.
However, I do not have a local GPU, so for some tasks I rely on Colab.
If it weren’t for sessions expiring, I would probably be doing everything on Colab.

2 Likes

You can evaluate our TF Cloud library for a remote GPU

1 Like

We are trying to introduce Dev Container and GitHub Codespaces support in the repository.
Please send us feedback in the PR if you are a Codespaces beta tester or a VS Code user:

1 Like

If you like Vim and Jupyter, you may like nbterm :+1:

https://blog.jupyter.org/nbterm-jupyter-notebooks-in-the-terminal-6a2b55d08b70

1 Like