Constructing a loop

I am trying to construct a while loop from the elementary operations Enter, Merge, Switch, NextIteration, Exit. Normally, when I build a Graph, I add the operations one by one following the flow of tensors, i.e. when I create an operation I can name the tensor used as an input (coming as an output of another operation). The problem is with the loop: by its nature there is always one operation (whichever one I add first, regardless of where I start building the loop) whose input tensor I cannot specify yet, and without an input I cannot add the operation.
Somewhere I read about a "workaround": use a dummy tensor and later change it to the output of the last element of the loop. But I cannot do that either, as a Graph is append-only; no existing operation can be changed.
Can someone direct me to a page where I can learn how to make a while loop (in TF 1.x, i.e. without tf.while_loop)?


TF 1.x is EOL.

For TF 2 please take a look at:


Thanks, but I do not see the relevance of that article. I would like to construct a loop in a Graph from the five elementary operations, whereas the linked article does not even mention them.
I may also be in the dark regarding versions, but my understanding is that new TF2 features like tf.while_loop actually generate, under the hood, the same old elementary operations that were used in TF1. Maybe the terminology I used was incorrect, but the emphasis is on building the Graph from elementary operations.
Also, the linked article is about Python code, while I want more direct access: I am working on a Pascal wrapper around the C calls. So let me simplify my question: how do I make a while loop using the C API? I have not even found a C++ solution; in the TF2 C++ API, control_flow_ops still contains only the elementary operations.
Yet another approach to the same question: I can use the 800+ operations in ops.pbtxt. Those include the five mentioned ops, but not while_loop. Maybe an elementary while could be used somehow?


Can someone direct me to a page where I can learn how to make a while loop without tf.while_loop

My reply was related to your final question.
Yes, it was in Python, and AutoGraph will create the graph for conditionals and loops as in the linked example.

Are you looking for something like:


Thank you.
Probably the answer to my question is hidden around here, or even more in tensorflow/while_loop.cc at 5dcfc51118817f27fad5246812d83e5dccdc5f72 · tensorflow/tensorflow · GitHub, but I am not sure. What I would like to understand is the concept of how to do it.
What I normally do when adding an operation node (call it AddXXXX) is the following; a minimal sketch of these steps comes after the list:

  • Create an elementary operation with TF_NewOperation
  • Set all inputs (TF_AddInput), inputlists (TF_AddInputList)
  • Set all attributes (e.g. TF_SetAttrBool)
  • Finalise it (TF_FinishOperation).
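
In code, the four steps above might look like this for a simple unary op (a minimal sketch; the helper name is illustrative and TF_Status checking is omitted):

    // Hypothetical helper following the four steps above: wraps an existing
    // output `in` in a new unary operation of type `op_type`.
    TF_Operation* AddUnaryOp(TF_Graph* graph, const char* op_type,
                             const char* name, TF_Output in, TF_Status* status) {
      TF_OperationDescription* desc = TF_NewOperation(graph, op_type, name);  // 1
      TF_AddInput(desc, in);                                                  // 2
      // 3: attributes would be set here, e.g. TF_SetAttrType(desc, "T", TF_FLOAT)
      //    (for many ops, T is simply inferred from the input)
      return TF_FinishOperation(desc, status);                                // 4
    }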

The problem is with TF_AddInput. It requires a TF_Output structure, and that structure includes the already-existing operation that produces the output. In a linear Graph this is not a problem:

  • AddPlaceholder (name: PH)
  • AddSquare (input: PH, name: RES)

Then I can create a session, run it with one input tensor, and use RES as the output.
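
Concretely, with the C API the linear case might look like this (a sketch, keeping attributes to the minimum):

    TF_Status* status = TF_NewStatus();
    TF_Graph* graph = TF_NewGraph();

    // AddPlaceholder (name: PH)
    TF_OperationDescription* ph_desc = TF_NewOperation(graph, "Placeholder", "PH");
    TF_SetAttrType(ph_desc, "dtype", TF_FLOAT);
    TF_Operation* ph = TF_FinishOperation(ph_desc, status);

    // AddSquare (input: PH, name: RES): the input is a TF_Output referring
    // to an operation that already exists, so this is unproblematic.
    TF_OperationDescription* sq_desc = TF_NewOperation(graph, "Square", "RES");
    TF_Output ph_out = {ph, 0};
    TF_AddInput(sq_desc, ph_out);
    TF_Operation* res = TF_FinishOperation(sq_desc, status);
    // A session can now feed {ph, 0} and fetch {res, 0}.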

For a loop it should be something like:
AddPlaceholder(name: INP)
AddEnter(input: INP, name: ENT)
AddMerge(input: ENT, input: NEXT, name: MERGE)
[make a COND output]
AddSwitch(input: MERGE, input: COND, name: SWITCH)
AddExit(input: SWITCH[false])
[body using SWITCH[true], returning BODY]
AddNextIteration(input: BODY, name: NEXT)

The problem is that the Merge operation cannot be constructed: it would need NEXT as an input, but at that step NextIteration has not been added yet. And if I trick it with some dummy input, I cannot change it later to the output of NEXT.
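
In C API terms the dead end is exactly here (a sketch of the failing step; `graph`, `status` and the Enter operation `ent` are assumed from the previous steps):

    TF_OperationDescription* merge_desc = TF_NewOperation(graph, "Merge", "MERGE");
    TF_Output merge_inputs[2];
    merge_inputs[0].oper = ent;    // Enter output: fine, ENT already exists
    merge_inputs[0].index = 0;
    // merge_inputs[1] should be the NextIteration output, but there is no
    // TF_Operation* for NEXT yet -- NEXT cannot be created before MERGE,
    // so TF_AddInputList has nothing to point at for the back edge.
    // TF_AddInputList(merge_desc, merge_inputs, 2);   // cannot be completed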

I think the trick is somewhere around NextIterationName and CreateMerge (lines 55 and 66 in while_loop.cc), but I am lost.
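
For reference: if I read the sources correctly, that file is the C++ BuildWhileLoop helper, and it also backs the public C API entry points TF_NewWhile / TF_FinishWhile / TF_AbortWhile declared in tensorflow/c/c_api.h, which construct exactly this Enter/Merge/Switch/NextIteration/Exit structure for you. A rough, untested sketch of driving them (`graph`, `status` and the placeholder `inp` are assumed; the loop squares the value until it reaches 100):

    // Helper: scalar float constant (hypothetical, for the loop condition).
    TF_Operation* ScalarConst(TF_Graph* g, const char* name, float v,
                              TF_Status* s) {
      TF_Tensor* t = TF_AllocateTensor(TF_FLOAT, nullptr, 0, sizeof(float));
      *static_cast<float*>(TF_TensorData(t)) = v;
      TF_OperationDescription* d = TF_NewOperation(g, "Const", name);
      TF_SetAttrTensor(d, "value", t, s);
      TF_SetAttrType(d, "dtype", TF_FLOAT);
      TF_Operation* op = TF_FinishOperation(d, s);
      TF_DeleteTensor(t);
      return op;
    }

    // One loop variable, initialised from the placeholder INP.
    TF_Output init[1] = { {inp, 0} };
    TF_WhileParams params = TF_NewWhile(graph, init, 1, status);
    params.name = "loop";

    // Condition subgraph: cond_inputs[0] < 100.0f.
    TF_Operation* limit = ScalarConst(params.cond_graph, "limit", 100.0f, status);
    TF_OperationDescription* less =
        TF_NewOperation(params.cond_graph, "Less", "less");
    TF_AddInput(less, params.cond_inputs[0]);
    TF_AddInput(less, TF_Output{limit, 0});
    params.cond_output = TF_Output{TF_FinishOperation(less, status), 0};

    // Body subgraph: next value = Square(body_inputs[0]).
    TF_OperationDescription* sq =
        TF_NewOperation(params.body_graph, "Square", "sq");
    TF_AddInput(sq, params.body_inputs[0]);
    params.body_outputs[0] = TF_Output{TF_FinishOperation(sq, status), 0};

    // Stitches the Enter/Merge/Switch/NextIteration/Exit nodes into `graph`.
    TF_Output results[1];
    TF_FinishWhile(&params, status, results);
    // results[0] refers to the Exit output; feed {inp, 0}, fetch results[0].

So even if the five primitives cannot be wired up through TF_AddInput alone, the C API appears to offer this packaged route.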


@markdaoust do you have any hint on this use case of the low-level internals?

Enter, Merge, Switch, NextIteration, Exit

These were a mistake. IIUC, one of the major changes for TF2 is control_flow_v2, which switched all of these to a new implementation based on functions (PartitionedCall, StatefulPartitionedCall), so that downstream compilers have a chance to understand the original code's structure.

Skye explains it in this TF-Internals talk:

At some point the runtime does lower it to the old version for execution, but my understanding was that TF2 never generates these into the graph anymore (does it? It's explained in the video somewhere).

We never delete deprecated ops, so graphs containing these will continue to work.
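
To make the v2 shape concrete: in control_flow_v2 the loop becomes a single functional node, a While op whose cond and body attributes name functions, a sibling of the PartitionedCall ops above. From the C API, one plausible route is the following rough, untested sketch under that assumption (`cond_graph`/`body_graph` with their input/output TF_Output arrays `cond_in`/`cond_out`/`body_in`/`body_out`, plus `graph`, `inp` and `status`, are assumed to exist):

    // Turn the two subgraphs into named functions. num_opers = -1 with
    // opers = nullptr means "use all operations in the graph".
    TF_Function* cond_fn = TF_GraphToFunction(
        cond_graph, "my_cond", /*append_hash_to_fn_name=*/0,
        /*num_opers=*/-1, /*opers=*/nullptr,
        /*ninputs=*/1, cond_in, /*noutputs=*/1, cond_out,
        /*output_names=*/nullptr, /*opts=*/nullptr,
        /*description=*/nullptr, status);
    TF_Function* body_fn = TF_GraphToFunction(
        body_graph, "my_body", 0, -1, nullptr, 1, body_in, 1, body_out,
        nullptr, nullptr, nullptr, status);

    // Make the functions visible to the main graph.
    TF_GraphCopyFunction(graph, cond_fn, /*grad=*/nullptr, status);
    TF_GraphCopyFunction(graph, body_fn, /*grad=*/nullptr, status);

    // One functional node replaces the whole Enter/.../Exit cycle.
    TF_OperationDescription* w = TF_NewOperation(graph, "While", "loop");
    TF_Output loop_vars[1] = { {inp, 0} };
    TF_AddInputList(w, loop_vars, 1);
    TF_SetAttrFuncName(w, "cond", "my_cond", 7);  // 7 = strlen("my_cond")
    TF_SetAttrFuncName(w, "body", "my_body", 7);
    TF_Operation* while_op = TF_FinishOperation(w, status);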

Thank you. I shall study PartitionedCall and StatefulPartitionedCall and how they can be used to make a loop. I have not yet found a C++ description of that, and did not get much wiser from the Python description either, but I will make some more effort.

Nonetheless, your reply has left me perhaps even more confused:

  • I had watched Skye's video a few times earlier (and asked the same question there in a comment), but there she also talks about tf.while_loop being broken down into these five elementary operations (see the chart at 2:20), and does not even mention PartitionedCall. The details at 9:53 also show the classic set-up.
  • Maybe you do not delete deprecated ops, but at least they are marked as such in ops.pbtxt (e.g. AdjustContrast). These five ops are not marked as deprecated, so I would assume they are still meant to be used.
  • Even if these five were a mistake and there are now better ways, I still wonder how they were supposed to work, when an op cannot be created without its inputs, but one of its inputs is only declared later.
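
A possible answer to that last point, judging from the CreateMerge helper in the while_loop.cc linked earlier: the internal C++ NodeBuilder, unlike TF_AddInput, accepts an input given only by name (plus output index and dtype) for a node that does not exist yet, so the Merge can be finalized with a dangling back edge, and the real edge is wired up once the NextIteration node is created. A paraphrased sketch, not the verbatim source:

    #include "tensorflow/core/graph/graph.h"
    #include "tensorflow/core/graph/node_builder.h"

    using tensorflow::Graph;
    using tensorflow::Node;
    using tensorflow::NodeBuilder;
    using tensorflow::Status;

    // Merge gets its second input by NAME ONLY: "next" does not exist yet.
    Status CreateMergeSketch(Graph* g, Node* enter, Node** merge) {
      NodeBuilder::NodeOut enter_in(enter, 0);
      NodeBuilder::NodeOut back_edge("next", 0, enter->output_type(0));
      return NodeBuilder("merge", "Merge", g->op_registry())
          .Input({enter_in, back_edge})
          .Finalize(g, merge);
    }

    // Later, once the NextIteration node exists, while_loop.cc wires the
    // real edge into Merge input 1 via the Graph edge API, e.g. roughly
    // g->AddEdge(next_iteration, 0, *merge, 1);

This by-name back edge is exactly what the append-only C API surface does not expose, which would explain why the five primitives feel unusable on their own there.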