What does your coding environment look like?

You can evaluate our TF Cloud library for a remote GPU

1 Like

We are trying to introduce Dev Container and GitHub Codespaces support in the repository.
Please send us feedback in the PR if you are a Codespaces beta tester or a VS Code user:

1 Like

If you like Vim and Jupyter, you may like nbterm :+1:

https://blog.jupyter.org/nbterm-jupyter-notebooks-in-the-terminal-6a2b55d08b70

1 Like

For me, I do exploration and reporting in Jupyter, then all the project work inside PyCharm Pro. We run all our code inside Docker containers. For some client work we’ll be on AWS-hosted notebooks to keep data inside their cloud.

2 Likes

I’m an old-school person

  • IDE: vim
  • Training environment: local GPU for small training - local GPUs for big training - cloud for huge training
  • Python 3.9
  • Arch Linux
  • TensorFlow also in C++ :slight_smile:
2 Likes

I use Colab, Jupyter Notebook, and VS Code; sometimes Kaggle notebooks too

1 Like

Colab for experiments, sometimes a local Jupyter notebook, and PyCharm for the final code. For training the final model for production I use GCP AI Platform.

1 Like

I have an Anaconda environment that I keep up to date and the latest Community build of PyCharm, and I never use them because I end up doing all of my work on Colab.

2 Likes

Nah, you are not the only one. :joy:

1 Like

Steps I follow to understand an issue:
(1) I create a gist (or use an existing one) in Colab to understand where the exact issue is.
(2) I go back to Visual Studio Code and write a test that fails because of the issue.
(3) I run bazel test and reproduce the issue.
(4) I then add the fix in the code wherever required and run bazel test again.
(5) Repeat step 4 until the bazel test passes.
(6) Push the code and raise a PR.

Open to suggestions to improve this process, if you have any.
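The inner loop in steps (3)–(5) looks roughly like the command sketch below; the test target name is hypothetical and should be replaced with whichever target covers your change:

```
# Reproduce the failure once (hypothetical target name)
bazel test //tensorflow/python/kernel_tests:your_test

# ...edit the code to add the fix...

# Re-run the same target; repeat until it passes, then push and raise the PR
bazel test //tensorflow/python/kernel_tests:your_test
```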

2 Likes

How does the TF Bazel build time impact your contribution routine? Do you iterate on tests locally, or do you wait for the TF team to manually kick off CI tests when you push commits?

3 Likes

It’s quite funny, but I analyse the issue I’m going to work on for the next few days before going to sleep. Just after waking up, I sync with upstream and kick off bazel build, and by the time I start my work (1–1.5 hours after waking up), the build has usually completed. During the day it takes very little time to build thanks to caching, and I usually don’t sync the branch during the day. :laughing:

@Bhack, any suggestions for speeding up the Bazel build?
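One thing that can help with repeated builds after a sync is Bazel's persistent disk cache, so action outputs survive across invocations. A minimal .bazelrc sketch, assuming a cache directory and job count of your choice:

```
# Keep build action outputs on disk so rebuilds after a sync can reuse them
build --disk_cache=~/.cache/bazel-disk
# Optionally cap parallelism so the machine stays responsive while building
build --jobs=8
```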

2 Likes

I was trying to add a GitHub Action to continuously monitor and speed up the contributor build experience inside our official tensorflow/tensorflow:devel Docker image:

But we need to have a GCS cache to bootstrap the process. See

We also need to see whether we want to wait for the WIP TF Dockerfiles refactoring in SIG Build

2 Likes

Visual Studio Code, the perfect IDE for what I do: Python, PHP, JSON (as little as possible), SQL, HTML5 / CSS3 (like JSON, this stuff is a punishment)

2 Likes

For my part, I set up a Kubeflow cluster for training my models locally

2 Likes

Definitely Colab Pro, awesome product

2 Likes

@Bhack, is there a way I can configure a GitHub workflow on my fork that runs bazel test, without the workflow files getting pushed in the PR? My system becomes too slow when running bazel test while I work on something else in parallel.

1 Like

Yes, you can copy into your repo something like this PR

But you need to bootstrap the remote cache (e.g. on GCS) with a first build on your machine in exactly the same environment (e.g. the tensorflow/tensorflow:devel Docker image).

Because, as you can see from that PR’s GitHub Action execution log, if you start the build from scratch without a pre-populated remote cache, the GitHub Action will time out, since it takes too much compile time.
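A minimal sketch of what such a fork-side workflow could look like (the file name, bucket URL, and test target are all hypothetical placeholders; the real setup is in the PR mentioned above):

```
# Hypothetical .github/workflows/bazel-test.yml on a fork
name: bazel-test
on: push
jobs:
  test:
    runs-on: ubuntu-latest
    container: tensorflow/tensorflow:devel
    steps:
      - uses: actions/checkout@v2
      - name: Run tests against a pre-populated remote cache
        run: |
          bazel test \
            --remote_cache=https://storage.googleapis.com/YOUR_BUCKET \
            --remote_upload_local_results=false \
            //tensorflow/python/...
```

Keeping this only on your fork's default branch (not on the feature branches you open PRs from) avoids the workflow file ending up in the PR diff.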

You can find more info on how to add the remote cache param (e.g. on GCS) at

https://docs.bazel.build/versions/master/remote-caching.html
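For reference, the remote cache params from that page boil down to something like this .bazelrc fragment (the bucket name is a placeholder):

```
# HTTP remote cache backed by a GCS bucket (placeholder bucket name)
build --remote_cache=https://storage.googleapis.com/YOUR_BUCKET
# Authenticate with Application Default Credentials for read/write access
build --google_default_credentials
```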

My goal is to have this available automatically for all contributors, so that we could have a public read-only remote cache that is updated on every master commit/merge.

3 Likes

@ashutosh1919 In case you want to share your experience exploring this solution, please open a new thread in the forum.

3 Likes

with GitHub Copilot :eyes: :fire: