Slice Of A Variable Returns Gradient None
I've been playing around with the tf.gradients() function and came across a behavior I didn't expect: it seems unable to calculate the gradient of a sliced Variable.
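The kind of setup that triggers this looks roughly like the following. This is only a minimal sketch, assuming TensorFlow 1.x graph-mode semantics (via tf.compat.v1); the variable names and the loss are illustrative, not the asker's actual code:

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Illustrative setup: build a loss from a Variable, then slice that Variable
# and ask for the gradient with respect to the slice.
v = tf.Variable([[1.0, 2.0], [3.0, 4.0]])
loss = tf.reduce_sum(v * v)   # loss depends on v
s = v[:, 0]                   # slicing produces a new tensor downstream of v

print(tf.gradients(loss, v))  # a real gradient tensor
print(tf.gradients(loss, s))  # [None] -- loss does not depend on s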
Solution 1:
d = c[:, ]
creates a different tensor than a, b, and c. If you look at the dependency graph, d depends on c, not the other way around, which is why the gradient comes back as None here: grad(y, x) works only if y depends on x, not the other way around.
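To make the dependency direction concrete, here is a minimal sketch, again assuming TensorFlow 1.x graph-mode semantics (tf.compat.v1) and using the a, b, c, d names from the solution:

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

a = tf.Variable([[1.0, 2.0], [3.0, 4.0]])
b = tf.Variable([[5.0, 6.0], [7.0, 8.0]])
c = a + b     # c depends on a and b
d = c[:, 0]   # d is a new tensor downstream of c

# d depends on c (and, through c, on a and b), so these gradients exist:
grads = tf.gradients(d, [a, b, c])

# c does not depend on d, so the reverse direction returns [None]:
print(tf.gradients(c, d))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(grads))    # ones in the sliced column, zeros elsewhere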