# What is the correct way to retrieve the value of a specific index of a 3-D tensor?

Hello.
I am trying to create a custom metric for my sequential model. It takes two probability distributions, y_true and y_pred, and for every 1 in y_true it sums the value from y_pred at the same index, then returns the result normalized.
ex.

```
y_true: [[0, 0, 1], [1, 0, 0]]
y_pred: [[0.2, 0.2, 0.6], [0.3, 0.4, 0.3]]
returned value: 0.45  ((0.6 + 0.3) / 2)
```
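As an aside, the per-index sum described above can also be computed without any loops or slicing: since y_true is one-hot, multiplying it element-wise by y_pred zeroes out everything except the probability at each correct index. A minimal sketch (the function name is mine):

```python
import tensorflow as tf

def forgiving_accuracy(y_true, y_pred):
    # y_true is one-hot, so y_true * y_pred keeps only the predicted
    # probability at each correct index; sum those and divide by the
    # number of rows to normalize.
    correct = tf.reduce_sum(y_true * y_pred)
    total = tf.cast(tf.shape(y_true)[0], y_pred.dtype)
    return correct / total

y_true = tf.constant([[0., 0., 1.], [1., 0., 0.]])
y_pred = tf.constant([[0.2, 0.2, 0.6], [0.3, 0.4, 0.3]])
print(float(forgiving_accuracy(y_true, y_pred)))  # 0.45
```

Because everything here is a TensorFlow op, this version also works when the metric is traced into a graph function, with no py_function wrapper needed.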

So I created a function and passed it as the metric for the model. It did not work: I could not convert the tensors to NumPy, because the function was being traced into a graph function.

Then I wrapped it in a tf.py_function() (itself wrapped in a lambda) and passed that instead. Now it seems to be retrieving the correct values, but I do not know how to use tf.slice() correctly.

Before, I was getting an error telling me that slice expected a certain size and mine was too big.
I changed it, so slice now runs, but it tells me there are multiple values inside the sliced tensor, so it cannot retrieve a single truth value.

When slicing a single value out of a 3-D tensor with shape [1, 3, 7], what is the correct size argument?

Currently my function looks like this:

```python
def forgivingAccuracy(y_true, y_pred):
    """y_pred and y_true are both 2-D tensors in a categorical format, with shape [3, 7].

    Both represent 3 mood scores ranging from 1 to 7, but
    y_pred is a probability distribution.

    Returns the sum of probabilities guessed correctly, normalized.
    e.g. if 3 is the correct score and it is predicted with 0.3 certainty, 0.3 is added;
    then the sum is divided to normalize.
    """
    correct = float(0)
    total = 3
    row = 0

    # Loops through all scores and adds together the chances predicted correctly.
    # e.g. the correct answer is "Good" and the model predicted it with 60% certainty,
    # so 0.6 is added to the "correct" variable.
    # Then, the sum is divided by the total probability (1 * 3).
    for mood in y_true:
        column = 0
        for score in mood:
            # Ugly code, but what it does is slice the tensor to get index [row][column]
            # and then access the content.
            # Here is the bug. What is the correct size for this to return a single float?
            target = tf.slice(y_pred, begin=[0, row, column], size=[0, 0, 1])
            print("target is:")
            tf.print(target)
            if score == float(1) and score.size > 0:
                correct = correct + score.eval()
        row = row + 1
        print("row is: %d" % row)
    return correct / total
```

Thank you in advance. If there is any other information I should post, tell me and I will add it as soon as possible.

I just found the answer. There were two problems.

1. I was misunderstanding the shape of y_true and y_pred. It is [batch_size, (input_shape)],
which means the shapes were not [3,7], but rather [32,3,7].

2. The size is supposed to be [1,1,1]. Why I was getting an error for this before, I do not know.
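To illustrate point 2: tf.slice needs a size of 1 along every axis to pull out a single element of a 3-D tensor. A small sketch with a made-up [1, 3, 7] tensor:

```python
import tensorflow as tf

# Dummy tensor with shape [1, 3, 7]: batch of 1, 3 moods, 7 scores.
y_pred = tf.reshape(tf.range(21, dtype=tf.float32), [1, 3, 7])

# size=[1,1,1] keeps exactly one element along each of the three axes;
# size=[0,0,1] would ask for zero elements along the first two axes.
target = tf.slice(y_pred, begin=[0, 1, 2], size=[1, 1, 1])  # shape [1, 1, 1]
value = float(tf.squeeze(target))  # 9.0 (batch 0, row 1, column 2)
```

tf.squeeze removes the three size-1 dimensions so the result can be converted to a plain Python float.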

I did not know that custom metrics get compiled into a tf.function, but once I discovered that, all I had to do to debug this was call
tf.config.run_functions_eagerly(True).
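For anyone else debugging this, the eager switch can be toggled like so (a sketch; the metric name is made up, and you should turn the flag back off afterwards since eager execution is slower):

```python
import tensorflow as tf

tf.config.run_functions_eagerly(True)  # tf.functions now execute eagerly

@tf.function
def metric_debug(x):
    # With eager execution forced on, .numpy() works inside the metric,
    # so you can inspect intermediate values while debugging.
    return x.numpy().sum()

print(metric_debug(tf.constant([1.0, 2.0])))  # 3.0

tf.config.run_functions_eagerly(False)  # restore graph execution
```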

I hope this helps anybody running into the same problem.