Estimators optimize loss
Fixes #78 (closed)
Activity
added 1 commit
- 04bb72e3 - Use tf.contrib.layers.optimize_loss in estimators
To use tf.contrib.layers.optimize_loss together with learning rate decay and the moving average optimizer, I had to do this:
```python
import tensorflow as tf
from functools import partial

from tensorflow.contrib.layers import optimize_loss


class Optimizer:
    """Keeps a reference to the created optimizer so it can be wrapped and reused."""

    def __call__(self, lr):
        print("Encapsulating the optimizer with the MovingAverageOptimizer")
        # nnet_optimizer comes from the surrounding config
        if nnet_optimizer == "sgd":
            self.optimizer = tf.train.GradientDescentOptimizer(learning_rate=lr)
            self.optimizer = tf.contrib.opt.MovingAverageOptimizer(self.optimizer)
        elif nnet_optimizer == "adam":
            self.optimizer = tf.train.AdamOptimizer(learning_rate=lr)
            self.optimizer = tf.contrib.opt.MovingAverageOptimizer(self.optimizer)
        return self.optimizer


optimizer = Optimizer()


def learning_rate_decay_fn(learning_rate, global_step):
    # learning_rate_decay_steps and learning_rate_decay_rate come from the config
    return tf.train.exponential_decay(
        learning_rate,
        global_step=global_step,
        decay_steps=learning_rate_decay_steps,
        decay_rate=learning_rate_decay_rate,
        staircase=True,
    )


# bake the decay function into optimize_loss
optimize_loss = partial(optimize_loss, learning_rate_decay_fn=learning_rate_decay_fn)
learning_rate_for_optimize_loss = learning_rate

...

estimator = Logits(
    architecture=architecture,
    optimizer=optimizer,
    loss_op=loss_op,
    n_classes=n_classes,
    config=run_config,
    model_dir=model_dir,
    add_histograms=add_histograms,
    extra_checkpoint=extra_checkpoint,
    vat_loss=vat_loss,
    balanced_loss_weight=balanced_loss_weight,
    use_sigmoid=use_sigmoid,
    labels_are_one_hot=labels_are_one_hot,
    optimize_loss_learning_rate=learning_rate_for_optimize_loss,
    optimize_loss=optimize_loss,
)
```
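For context, tf.contrib.layers.optimize_loss accepts a callable optimizer and invokes it with the (decayed) learning rate, which is how the Optimizer class above gets called. A minimal sketch of that call site, assuming a hypothetical loss tensor inside the estimator's model_fn (the wiring below is illustrative, not the estimator's actual code):

```python
# Hypothetical call site: loss is an assumed tensor inside the model_fn.
# optimize_loss calls optimizer(lr) with the decayed learning rate, i.e.
# Optimizer.__call__ above, and uses the returned MovingAverageOptimizer.
train_op = optimize_loss(
    loss=loss,
    global_step=tf.train.get_or_create_global_step(),
    learning_rate=learning_rate_for_optimize_loss,
    optimizer=optimizer,
)
```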
The problem is that the optimizer is normally created inside that function, so we have no access to it to wrap it. By using the class here, I can keep track of it.
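Keeping that reference also matters because MovingAverageOptimizer checkpoints should be written with its swapping_saver(), which swaps in the averaged variables. A hedged sketch of how the tracked instance could be reached after optimize_loss has consumed it (the Scaffold wiring is an assumption, not something this MR does):

```python
# optimizer.optimizer is the MovingAverageOptimizer created in __call__ above.
# swapping_saver() is the tf.contrib.opt API for saving the averaged weights;
# passing it through a Scaffold is one possible wiring, shown for illustration.
saver = optimizer.optimizer.swapping_saver()
scaffold = tf.train.Scaffold(saver=saver)
```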
Edited by Amir MOHAMMADI

added 2 commits
assigned to @tiago.pereira
mentioned in commit 8e62d6e7