
Information Entropy



That transfer of information, from what we don’t know about the system to what we know, represents a change in entropy. Insight decreases the entropy of the system. Get information, reduce entropy. This is information gain. And yes, this type of entropy is subjective, in that it depends on what we know about the system at hand. (Fwiw, information gain is synonymous with Kullback-Leibler divergence, which we explored briefly in this tutorial on restricted Boltzmann machines.)
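
To make the entropy-drop idea concrete, here is a minimal sketch in NumPy (the two distributions are invented for illustration): a uniform prior has maximal entropy, a peaked posterior has less, and the KL divergence from the posterior to the prior measures the information gained.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p_i * log2(p_i), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                            # treat 0 * log(0) as 0
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """KL(p || q) = sum p_i * log2(p_i / q_i), in bits."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

prior = [0.25, 0.25, 0.25, 0.25]            # before looking: maximal uncertainty
posterior = [0.7, 0.1, 0.1, 0.1]            # after observing the system

print(entropy(prior))                       # 2.0 bits
print(entropy(posterior))                   # ~1.36 bits: entropy fell
print(kl_divergence(posterior, prior))      # ~0.64 bits of information gain
# Because the prior is uniform, KL(posterior || prior) here equals exactly
# the entropy drop: log2(N) - H(posterior) = 2.0 - 1.36.
```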

So each principal component cutting through the scatterplot represents a decrease in the system’s entropy, in its unpredictability.
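
As a hedged illustration of that point, scikit-learn's PCA reports an explained-variance ratio per component; on a synthetic correlated cloud (invented here), the first component alone accounts for nearly all of the spread, i.e. most of the unpredictability.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Correlated 2-D cloud: most variance lies along one diagonal direction.
x = rng.normal(size=500)
data = np.column_stack([x, 0.8 * x + 0.3 * rng.normal(size=500)])

pca = PCA(n_components=2).fit(data)
print(pca.explained_variance_ratio_)  # e.g. [~0.97, ~0.03]: the first component
                                      # removes almost all of the uncertainty
```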

It so happens that explaining the shape of the data one principal component at a time, beginning with the component that accounts for the most variance, is similar to walking data through a decision tree. The first component of PCA, like the first if-then-else split in a properly formed decision tree, will be along the dimension that reduces unpredictability the most.
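
A small sketch of that analogy, on an invented dataset: a depth-1 decision tree (a "stump") trained with the entropy criterion places its single if-then-else split on the feature and threshold that maximize information gain, much as PCA's first component lies along the direction of greatest variance.

```python
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=5, n_informative=2,
                           random_state=0)
stump = DecisionTreeClassifier(criterion="entropy", max_depth=1,
                               random_state=0).fit(X, y)

# The root split sits on the feature whose test reduces class-label
# entropy (unpredictability) the most.
print("root feature:", stump.tree_.feature[0])
print("threshold:", stump.tree_.threshold[0])
```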

On KL divergence, a post worth translating:

https://www.countbayesie.com/blog/2017/5/9/kullback-leibler-divergence-explained


Original source: http://www.cnblogs.com/xinping-study/p/7058803.html
