
tensorflow - shadow variable values

Posted: 2018-12-08 23:53:45


```python
#!/usr/bin/env python2
# -*- coding: utf-8 -*-
import tensorflow as tf

my_var = tf.Variable(0.)
step = tf.Variable(0, trainable=False)          # passed as num_updates
ema = tf.train.ExponentialMovingAverage(0.99, step)
maintain_average_op = ema.apply([my_var])       # creates the shadow variable

with tf.Session() as sess:
    init_op = tf.global_variables_initializer()
    sess.run(init_op)
    # How the shadow variable tracks the variable as it changes
    for i in range(1, 6):
        print sess.run([my_var, ema.average(my_var)])
        sess.run(my_var.assign_add(i))
        sess.run(maintain_average_op)
        print sess.run([my_var, ema.average(my_var)])
        print "==="
    print "----------------"
    # How num_updates (i.e. step) changes the effective decay rate
    sess.run(my_var.assign(5.))
    for i in range(1, 20, 3):
        print sess.run([my_var, ema.average(my_var)])
        sess.run(step.assign_add(i))
        sess.run(maintain_average_op)
        print sess.run([my_var, ema.average(my_var)])
        print "==="
```

The moving average model

shadow_variable = decay * shadow_variable + (1 - decay) * variable
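The update rule can be sketched in plain Python, independent of TensorFlow. The `ema_update` helper and the starting values below are illustrative, not part of the original script:

```python
# Plain-Python sketch of the shadow-variable update rule above.
def ema_update(shadow, variable, decay=0.99):
    # One EMA step: the shadow value moves a small fraction
    # (1 - decay) of the way toward the current variable value.
    return decay * shadow + (1 - decay) * variable

shadow = 0.0      # shadow starts at the variable's initial value
variable = 5.0    # current value of the tracked variable
shadow = ema_update(shadow, variable)
print(shadow)     # 0.99 * 0.0 + 0.01 * 5.0 = 0.05
```

Each step keeps most of the old shadow value, which is why the shadow variable lags behind a rapidly changing variable.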

Reasonable values for decay are close to 1.0, typically in the multiple-nines range: 0.999, 0.9999, etc.
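One way to read these values: an EMA with decay d spreads its weight over roughly 1 / (1 - d) recent steps, so 0.99 averages over about the last 100 updates and 0.999 over about the last 1000. A small sketch of this intuition (the loop and the sample decay values are illustrative, not from the original post):

```python
# For each decay d, estimate the effective averaging window and how much
# weight is left on a value that is `window` steps old (about 1/e).
for d in (0.99, 0.999, 0.9999):
    window = round(1.0 / (1.0 - d))   # ~ effective averaging window in steps
    remaining = d ** window           # weight left on a `window`-steps-old value
    print(d, window, round(remaining, 3))
```

Larger decays therefore give a smoother but slower-moving average.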

The apply() method adds shadow copies of trained variables and adds ops that maintain a moving average of the trained variables in their shadow copies. It is used when building the training model.

The optional num_updates parameter allows one to tweak the decay rate dynamically. It is typical to pass the count of training steps, usually kept in a variable that is incremented at each step, in which case the decay rate is lower at the start of training. This makes moving averages move faster. If passed, the actual decay rate used is:

min(decay, (1 + num_updates) / (10 + num_updates))
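A quick sketch of how this dynamic rate grows with the step count (the `actual_decay` helper and the sample step counts are illustrative; decay=0.99 matches the script above):

```python
# Effective decay rate used by the EMA when num_updates is supplied.
def actual_decay(decay, num_updates):
    return min(decay, (1.0 + num_updates) / (10.0 + num_updates))

# Early in training the rate is small, so the shadow variable catches up
# quickly; once (1+n)/(10+n) exceeds `decay`, the fixed rate takes over.
for n in (0, 10, 100, 1000, 10000):
    print(n, actual_decay(0.99, n))
```

This is why, in the second loop of the script, incrementing step changes how fast the shadow variable moves toward my_var.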


Original article: http://blog.51cto.com/13959448/2327983
