Hi Everyone,

We just published a quick new TensorFlow basics overview page.

Let us know what you think.


I was thinking that a section like "Frequently Asked Questions" could be useful, so that people would search it before asking a basic question that has already been answered.

It helps to know why a technology was created. An introduction should not leave me reading page after page wondering, "but what was the point of all this?"

A Tensor is not what is interesting about Tensors. Tensors exist to create a chain of Tensors, which represents a series of computations that can be run over and over again. It would help to include a demonstration and maybe an illustration of a chain of two Tensors. Without this concept of a data structure representing a chain, there is no reason for Tensors to exist.

I'm not familiar with this terminology. What do you mean by "chain" here?

"Chain" was the wrong word, apologies.

Tensors are about creating directed graphs of computations. The directed graph represents a series of computations that can be run over and over again. The Tensor data type is an output node of this graph. A computation graph can have more than one output node.

To explain this well requires a diagram as well as words.
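In the absence of a diagram, the idea can also be sketched in plain Python. This is an illustrative toy, not the TensorFlow API: a directed graph of computations is built once, can have more than one output node, and can be run over and over with different inputs.

```python
import operator

class Node:
    """One computation in a directed graph; inputs are other Nodes,
    constants, or string names fed in at run time (a toy sketch,
    not TensorFlow's actual data structures)."""
    def __init__(self, op, *inputs):
        self.op = op
        self.inputs = inputs

    def run(self, feed):
        # Recursively evaluate input nodes; look up named placeholders
        # in the feed dict; pass constants through unchanged.
        vals = [i.run(feed) if isinstance(i, Node)
                else feed.get(i, i)
                for i in self.inputs]
        return self.op(*vals)

# Build the graph once: "x" and "y" are placeholder names.
total = Node(operator.add, "x", "y")    # x + y
scaled = Node(operator.mul, total, 2)   # (x + y) * 2

# The same graph runs repeatedly with new inputs, and both
# `total` and `scaled` are output nodes of the one graph.
print(total.run({"x": 1, "y": 2}))   # 3
print(scaled.run({"x": 1, "y": 2}))  # 6
print(scaled.run({"x": 5, "y": 5}))  # 20
```

The point the sketch tries to make is exactly the one above: the interesting object is not a single node but the reusable graph connecting them.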

Thanks, I thought that might be what you meant, but I just wanted to be sure.