#!/usr/bin/env python2
# -*- coding: utf-8 -*-
import tensorflow as tf

my_var = tf.Variable(0.)
# num_updates counter; not trainable, so an optimizer would never touch it
step = tf.Variable(0, trainable=False)
ema = tf.train.ExponentialMovingAverage(0.99, step)
# apply() creates a shadow copy of my_var and an op that updates it
maintain_average_op = ema.apply([my_var])

with tf.Session() as sess:
    init_op = tf.global_variables_initializer()
    sess.run(init_op)
    # How the shadow variable changes: step stays 0 here, so the
    # effective decay is min(0.99, (1 + 0) / (10 + 0)) = 0.1
    for i in range(1, 6):
        print sess.run([my_var, ema.average(my_var)])
        sess.run(my_var.assign_add(i))
        sess.run(maintain_average_op)
        print sess.run([my_var, ema.average(my_var)])
        print "==="
    print "----------------"
    # How the average behaves as num_updates (i.e. step) grows
    sess.run(my_var.assign(5.))
    for i in range(1, 20, 3):
        print sess.run([my_var, ema.average(my_var)])
        sess.run(step.assign_add(i))
        sess.run(maintain_average_op)
        print sess.run([my_var, ema.average(my_var)])
        print "==="
The moving average model
shadow_variable = decay * shadow_variable + (1 - decay) * variable
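To see what this recurrence does, it can be replayed in plain Python (a minimal sketch for illustration; the decay value and the input sequence here are made up, not taken from the script above):

decay = 0.9                       # assumed value for illustration
shadow = 0.0                      # shadow starts at the variable's initial value
for variable in [1.0, 3.0, 6.0]:  # hypothetical raw variable values
    shadow = decay * shadow + (1 - decay) * variable
    print(shadow)                 # 0.1, 0.39, 0.951 -- the shadow lags the raw value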
Reasonable values for decay are close to 1.0, typically in the multiple-nines range: 0.999, 0.9999, etc.
The apply() method adds shadow copies of trained variables and adds ops that maintain a moving average of the trained variables in their shadow copies. It is used when building the training model.
The optional num_updates parameter allows one to tweak the decay rate dynamically. It is typical to pass the count of training steps, usually kept in a variable that is incremented at each step, in which case the decay rate is lower at the start of training. This makes moving averages move faster. If passed, the actual decay rate used is:
min(decay, (1 + num_updates) / (10 + num_updates))
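A quick way to see the effect is to evaluate this expression for a few step counts (plain Python sketch; 0.99 matches the decay used in the script above):

decay = 0.99
for num_updates in [0, 10, 100, 1000]:
    effective = min(decay, (1.0 + num_updates) / (10.0 + num_updates))
    print("%d -> %.3f" % (num_updates, effective))
# 0 -> 0.100, 10 -> 0.550, 100 -> 0.918, 1000 -> 0.990 (capped at decay)

Early in training the effective decay is small, so the shadow variable tracks the raw variable closely; as num_updates grows, the rate approaches the configured decay and the average stabilizes.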
Original article: http://blog.51cto.com/13959448/2327983