
Basic Function Minimisation And Variable Tracking In Tensorflow 2.0

I am trying to perform the most basic function minimisation possible in TensorFlow 2.0, exactly as in the question Tensorflow 2.0: minimize a simple function, however I cannot get it to work.

Solution 1:

You need to call minimize multiple times, because minimize only performs a single step of your optimisation.

The following should work:

import tensorflow as tf

x = tf.Variable(2, name='x', trainable=True, dtype=tf.float32)

# List of variables the optimiser will update
trainable_variables = [x]

# To use minimize you have to define your loss computation as a function
class Model():
    def __init__(self):
        self.y = 0

    def compute_loss(self):
        self.y = tf.math.square(x)
        return self.y

opt = tf.optimizers.Adam(learning_rate=0.01)
model = Model()
for i in range(1000):
    train = opt.minimize(model.compute_loss, var_list=trainable_variables)

print("x:", x)
print("y:", model.y)
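If you prefer not to wrap the loss in a class, the same per-step optimisation can be written with tf.GradientTape, which is what minimize does under the hood. This is a minimal sketch of that equivalent loop, using the same variable, loss, and optimiser as above:

```python
import tensorflow as tf

x = tf.Variable(2.0, name='x', trainable=True)
opt = tf.optimizers.Adam(learning_rate=0.01)

for i in range(1000):
    # Record the forward computation so gradients can be taken
    with tf.GradientTape() as tape:
        y = tf.math.square(x)
    # Compute dy/dx and apply one optimiser step
    grads = tape.gradient(y, [x])
    opt.apply_gradients(zip(grads, [x]))

print("x:", x.numpy())
print("y:", y.numpy())
```

Note that here too the loop is essential: each apply_gradients call is a single step, so x only approaches the minimum at 0 after many iterations.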
