
Estimators optimize loss

Merged Amir MOHAMMADI requested to merge estimators_optimize_loss into master

Fixes #78 (closed)

Merge request reports


Activity

  • Amir MOHAMMADI added 1 commit

    • 04bb72e3 - Use tf.contrib.layers.optimize_loss in estimators

  • To be able to use tf.contrib.layers.optimize_loss together with learning rate decay and the moving average optimizer, I had to use this:

    from functools import partial

    import tensorflow as tf
    from tensorflow.contrib.layers import optimize_loss

    # nnet_optimizer, learning_rate, learning_rate_decay_steps, and
    # learning_rate_decay_rate are defined elsewhere in the configuration.


    class Optimizer:
        """Callable that builds the optimizer and keeps a reference to it."""

        def __call__(self, lr):
            print("Encapsulating the optimizer with the MovingAverageOptimizer")

            if nnet_optimizer == "sgd":
                self.optimizer = tf.train.GradientDescentOptimizer(learning_rate=lr)
            elif nnet_optimizer == "adam":
                self.optimizer = tf.train.AdamOptimizer(learning_rate=lr)
            else:
                raise ValueError("Unknown nnet_optimizer: {}".format(nnet_optimizer))
            self.optimizer = tf.contrib.opt.MovingAverageOptimizer(self.optimizer)
            return self.optimizer


    optimizer = Optimizer()


    def learning_rate_decay_fn(learning_rate, global_step):
        return tf.train.exponential_decay(
            learning_rate,
            global_step=global_step,
            decay_steps=learning_rate_decay_steps,
            decay_rate=learning_rate_decay_rate,
            staircase=True,
        )


    optimize_loss = partial(
        optimize_loss, learning_rate_decay_fn=learning_rate_decay_fn
    )
    learning_rate_for_optimize_loss = learning_rate

    ...
    estimator = Logits(
        architecture=architecture,
        optimizer=optimizer,
        loss_op=loss_op,
        n_classes=n_classes,
        config=run_config,
        model_dir=model_dir,
        add_histograms=add_histograms,
        extra_checkpoint=extra_checkpoint,
        vat_loss=vat_loss,
        balanced_loss_weight=balanced_loss_weight,
        use_sigmoid=use_sigmoid,
        labels_are_one_hot=labels_are_one_hot,
        optimize_loss_learning_rate=learning_rate_for_optimize_loss,
        optimize_loss=optimize_loss,
    )
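    For reference, learning_rate_decay_fn relies on tf.train.exponential_decay with staircase=True, which computes learning_rate * decay_rate ** floor(global_step / decay_steps). A minimal pure-Python sketch of that schedule (the function name is mine, no TensorFlow needed):

```python
def staircase_exponential_decay(learning_rate, global_step, decay_steps, decay_rate):
    # With staircase=True the exponent is floor(global_step / decay_steps),
    # so the rate drops by a factor of decay_rate every decay_steps steps
    # and stays constant inside each window.
    return learning_rate * decay_rate ** (global_step // decay_steps)


print(staircase_exponential_decay(0.1, 0, 1000, 0.5))     # 0.1
print(staircase_exponential_decay(0.1, 999, 1000, 0.5))   # still 0.1, same window
print(staircase_exponential_decay(0.1, 1000, 1000, 0.5))  # 0.05
```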
    

    The problem is that the optimizer is created inside that function, so we have no access to it to wrap it. By using the class, I can keep a reference to it.
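    The pattern itself is plain Python and can be sketched without TensorFlow. In the sketch below, OptimizerFactory and optimize_loss_like are stand-in names, not real API; optimize_loss_like mimics a function that calls the factory internally and never hands the built object back to the caller:

```python
class OptimizerFactory:
    """Callable that remembers whatever it builds on the instance."""

    def __call__(self, lr):
        # Build the "optimizer" and also store it on self, so the caller
        # can still reach it after the factory was invoked elsewhere.
        self.optimizer = {"kind": "sgd", "lr": lr}
        return self.optimizer


def optimize_loss_like(loss, optimizer_factory, lr):
    # Stand-in for a function such as optimize_loss: it constructs the
    # optimizer internally and only returns the train value.
    optimizer_factory(lr)
    return loss * 0.9  # pretend training step


factory = OptimizerFactory()
optimize_loss_like(1.0, factory, lr=0.01)
# After the call we can still reach the optimizer that was created inside,
# e.g. to ask a MovingAverageOptimizer for its swapping saver.
print(factory.optimizer)  # {'kind': 'sgd', 'lr': 0.01}
```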

    Edited by Amir MOHAMMADI
  • Amir MOHAMMADI added 2 commits

    • 7e2d37d7 - Use tf.contrib.layers.optimize_loss in estimators
    • 88eea1d2 - Trun nitpicky back on!

  • Amir MOHAMMADI added 1 commit

  • Amir MOHAMMADI added 1 commit

  • Looks good to me, thanks

  • mentioned in commit 8e62d6e7
