
Resolve "exponential decay learning rate is not working"

Merged Amir MOHAMMADI requested to merge 31-exponential-decay-learning-rate-is-not-working into master
1 unresolved thread
1 file changed: +2 −2
@@ -56,10 +56,10 @@ The example consists in training a very simple **CNN** with `MNIST` dataset in 4
 >>>
 >>> loss = BaseLoss(tf.nn.sparse_softmax_cross_entropy_with_logits, tf.reduce_mean)
 >>>
->>> optimizer = tf.train.GradientDescentOptimizer(0.001)
->>>
 >>> learning_rate = constant(base_learning_rate=0.001)
 >>>
+>>> optimizer = tf.train.GradientDescentOptimizer(learning_rate)
+>>>
 >>> trainer = Trainer
 Now that you have defined your data, architecture, loss and training algorithm you can save this in a python file,
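The point of the change is that the optimizer now receives the learning-rate object rather than a float literal, which is what allows a schedule such as exponential decay to take effect during training. As a rough illustration only (not the library's code; the function name and signature below are made up for this sketch), the exponential-decay schedule follows the formula lr = base_learning_rate * decay_rate ** (global_step / decay_steps):

```python
def exponential_decay(base_learning_rate, global_step, decay_steps, decay_rate,
                      staircase=False):
    """Sketch of an exponential-decay schedule:
    lr = base_learning_rate * decay_rate ** (global_step / decay_steps)."""
    if staircase:
        # Integer division: the rate drops in discrete jumps once per period.
        exponent = global_step // decay_steps
    else:
        # Continuous decay: the rate shrinks smoothly at every step.
        exponent = global_step / decay_steps
    return base_learning_rate * (decay_rate ** exponent)

# At step 0 the schedule returns the base rate unchanged.
print(exponential_decay(0.001, 0, decay_steps=1000, decay_rate=0.96))
# After one full decay period the rate has been multiplied by decay_rate once.
print(exponential_decay(0.001, 1000, decay_steps=1000, decay_rate=0.96))
```

With a fixed float such as `0.001`, none of this decay ever happens, which is the bug this merge request resolves in the documentation example.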