Tags: scikit-learn nmf nnmf matrix factorization non-negative
I have written two earlier posts on this topic:
1) a survey of matrix factorization: scikit-learn: 2.5. Matrix factorization problems
2) a short introduction to TruncatedSVD: scikit-learn: implementing LSA (latent semantic analysis) with TruncatedSVD
Today I found that NMF is also a very practical model, so here is a brief introduction. It likewise belongs to scikit-learn's section 2.5 on matrix factorization problems.
NMF is an alternative decomposition method that assumes the data matrix is non-negative. In cases where the data matrix contains no negative values, NMF can be plugged in instead of PCA or its variants. It decomposes X into two non-negative factors W and H by optimizing:

argmin_{W, H} (1/2) ||X - W H||_Fro^2,  with W >= 0, H >= 0

where ||.||_Fro is the Frobenius norm of the reconstruction error.
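As a minimal sketch (my own toy example, not from the original post), this is how the W and H factors are obtained with scikit-learn's NMF class; the matrix and parameter values here are illustrative assumptions:

```python
import numpy as np
from sklearn.decomposition import NMF

# A small non-negative data matrix (6 samples, 5 features).
X = np.abs(np.random.RandomState(0).randn(6, 5))

# NMF minimizes the squared Frobenius norm ||X - WH|| by default.
model = NMF(n_components=2, init='random', random_state=0, max_iter=500)
W = model.fit_transform(X)   # shape (6, 2): sample representations
H = model.components_        # shape (2, 5): non-negative basis vectors

# Both factors are non-negative, and W @ H approximates X.
print(W.shape, H.shape)
print(bool((W >= 0).all() and (H >= 0).all()))
```

The non-negativity of W and H is what makes the learned components easy to interpret as additive parts of the data.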
This norm is an obvious extension of the Euclidean norm to matrices. (Other optimization objectives have been suggested in the NMF literature, in particular Kullback-Leibler divergence, but these are not currently implemented.)
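Note that the "not currently implemented" remark in the quoted docs reflects the scikit-learn version at the time of writing; recent scikit-learn releases do support the Kullback-Leibler objective through the multiplicative-update solver. A minimal sketch, assuming a recent scikit-learn and a toy non-negative matrix of my own:

```python
import numpy as np
from sklearn.decomposition import NMF

# Toy non-negative data (illustrative assumption, not from the post).
X = np.abs(np.random.RandomState(0).randn(6, 5))

# The KL objective requires the multiplicative-update ('mu') solver.
model = NMF(n_components=2, solver='mu', beta_loss='kullback-leibler',
            init='random', random_state=0, max_iter=1000)
W = model.fit_transform(X)
H = model.components_
print(W.shape, H.shape)
```

The `beta_loss` parameter also accepts `'frobenius'` (the default) and `'itakura-saito'`, so the same estimator covers the family of objectives the NMF literature discusses.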
Back to work now. To be continued...
Copyright notice: this is an original post by the blogger; do not repost without the blogger's permission.
scikit-learn: implementing LSA (latent semantic analysis) with Non-negative matrix factorization (NMF or NNMF)
Original post: http://blog.csdn.net/mmc2015/article/details/47802463