The ordering of batch normalization and the activation function
In the original paper (Ioffe & Szegedy, 2015), batch normalization is applied before the activation function, but in practice placing it after the activation is often reported to work somewhat better.
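As an illustration, here is a minimal PyTorch sketch of the two orderings; the layer sizes and batch size are made up for the example:

```python
import torch
import torch.nn as nn

# Ordering from the original paper:
# Linear -> BatchNorm -> Activation
bn_before_act = nn.Sequential(
    nn.Linear(128, 64, bias=False),  # bias is redundant, BN's shift absorbs it
    nn.BatchNorm1d(64),
    nn.ReLU(),
)

# Alternative ordering often found to work better in practice:
# Linear -> Activation -> BatchNorm
bn_after_act = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.BatchNorm1d(64),
)

x = torch.randn(32, 128)           # a batch of 32 samples
print(bn_before_act(x).shape)      # torch.Size([32, 64])
print(bn_after_act(x).shape)       # torch.Size([32, 64])
```

Note that when BN directly follows a linear or convolutional layer, the preceding layer's bias can be dropped, since BN's learned shift parameter plays the same role.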
Should batch normalization be placed before or after the nonlinear activation layer? - Zhihu
https://www.zhihu.com/question/283715823
Batch Normalization before or after ReLU?
Original post: https://www.cnblogs.com/lzida9223/p/10972768.html