
Classical Methods of Machine Learning

Posted: 2014-07-19 18:05:19


1. PCA (Principal Component Analysis)

PCA works by performing an eigendecomposition of the covariance matrix to obtain the data's principal components (the eigenvectors) and their weights (the corresponding eigenvalues).

PCA is the simplest eigenvector-based method for analyzing a multivariate statistical distribution. Its result can be interpreted as an explanation of the variance in the original data: along which direction do the data values vary the most? In other words, PCA provides an effective way to reduce the dimensionality of the data: if the analyst removes the components corresponding to the smallest eigenvalues, the resulting low-dimensional data is optimal in the sense that this reduction loses the least information.
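The procedure described above (center the data, eigendecompose the covariance matrix, keep the components with the largest eigenvalues) can be sketched in a few lines of numpy. This is a minimal illustration, not a production implementation; the function name `pca` and its interface are assumptions for the example.

```python
import numpy as np

def pca(X, n_components):
    """PCA via eigendecomposition of the covariance matrix.

    X: (n_samples, n_features) data matrix.
    Returns the projected data and the kept eigenvectors.
    """
    # Center the data so the covariance is taken about the mean
    Xc = X - X.mean(axis=0)
    # Covariance matrix (n_features x n_features)
    C = np.cov(Xc, rowvar=False)
    # eigh: eigendecomposition for symmetric matrices, ascending eigenvalues
    eigvals, eigvecs = np.linalg.eigh(C)
    # Sort by decreasing eigenvalue (variance explained) and keep the top ones
    order = np.argsort(eigvals)[::-1]
    components = eigvecs[:, order[:n_components]]
    return Xc @ components, components
```

Dropping the eigenvectors with the smallest eigenvalues is exactly the "remove the components corresponding to the smallest eigenvalues" step from the paragraph above.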

2. k-means

The kernel k-means problem is an extension of the k-means problem where the input data points are mapped non-linearly into a higher-dimensional feature space via a kernel function $k(x_i,x_j) = \phi^T(x_i)\phi(x_j)$.
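Because $\phi$ may map into a very high-dimensional space, kernel k-means never computes $\phi(x)$ explicitly: the squared distance from a point to a cluster centroid in feature space expands into kernel evaluations only, $\|\phi(x_i)-\mu_c\|^2 = K_{ii} - \frac{2}{|c|}\sum_{j\in c}K_{ij} + \frac{1}{|c|^2}\sum_{j,l\in c}K_{jl}$. Below is a minimal Lloyd-style sketch assuming a precomputed kernel matrix; the function name `kernel_kmeans` and the random initialization are illustrative choices, not part of the original text.

```python
import numpy as np

def kernel_kmeans(K, k, n_iter=50, seed=0):
    """Kernel k-means on a precomputed kernel matrix K (n x n).

    Each point is assigned to the cluster whose feature-space
    centroid is nearest, using only kernel evaluations.
    """
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, n)
    for _ in range(n_iter):
        dist = np.zeros((n, k))
        for c in range(k):
            idx = np.flatnonzero(labels == c)
            if idx.size == 0:
                dist[:, c] = np.inf  # empty cluster: never chosen
                continue
            # ||phi(x_i) - mu_c||^2 =
            #   K_ii - 2/|c| * sum_{j in c} K_ij + 1/|c|^2 * sum_{j,l in c} K_jl
            dist[:, c] = (np.diag(K)
                          - 2.0 * K[:, idx].mean(axis=1)
                          + K[np.ix_(idx, idx)].mean())
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels
```

With a linear kernel $k(x_i,x_j)=x_i^Tx_j$ this reduces to ordinary k-means; a non-linear kernel (e.g. RBF) lets it separate clusters that are not linearly separable in the input space.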

3. Bayes

4. spectral clustering

In multivariate statistics and the clustering of data, spectral clustering techniques make use of the spectrum (eigenvalues) of the similarity matrix of the data to perform dimensionality reduction before clustering in fewer dimensions. The similarity matrix is provided as an input and consists of a quantitative assessment of the relative similarity of each pair of points in the dataset.

Broadly speaking, any algorithm that relies on an SVD or eigendecomposition can be called a spectral algorithm; everything from the venerable PCA/LDA to the more recent Spectral Embedding/Clustering belongs to this family.

Its core idea is to treat clustering as a graph-partitioning problem.

The various spectral clustering algorithms differ mainly in how they compute the Laplacian matrix. The general procedure is:

  1. Compute the similarity matrix S (an edge connects similar points);
  2. Compute the Laplacian matrix L (a concept from graph theory);
  3. Compute the eigenvectors of L (note: the ones with the k smallest eigenvalues) and stack them into a transformation matrix;
  4. Reduce the dimensionality of the data with this matrix;
  5. Cluster the embedded points (e.g. with k-means).

Given a simple graph G with n vertices, its Laplacian matrix $L:=(\ell_{i,j})_{n \times n}$ is defined as:

$L = D - A.$
That is, it is the difference of the degree matrix D and the adjacency matrix A of the graph. In the case of directed graphs, either the indegree or outdegree might be used, depending on the application.
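The five steps above, using the unnormalized Laplacian L = D - A just defined, can be sketched as follows. This is a minimal illustration in numpy; the Gaussian similarity, the farthest-point k-means initialization, and the function name `spectral_clustering` are all assumptions for the example, not part of the original.

```python
import numpy as np

def spectral_clustering(X, k, sigma=1.0, n_iter=100):
    """Unnormalized spectral clustering, following the five steps above."""
    n = X.shape[0]
    # 1. Similarity (adjacency) matrix: Gaussian similarity between points
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    A = np.exp(-sq / (2.0 * sigma ** 2))
    np.fill_diagonal(A, 0.0)  # no self-loops
    # 2. Laplacian L = D - A (degree matrix minus adjacency matrix)
    D = np.diag(A.sum(axis=1))
    L = D - A
    # 3-4. Embedding: eigenvectors with the k smallest eigenvalues
    #      (eigh returns eigenvalues in ascending order)
    _, vecs = np.linalg.eigh(L)
    U = vecs[:, :k]  # each row is a point in the low-dimensional embedding
    # 5. Plain k-means (Lloyd's algorithm) on the rows of U,
    #    with a deterministic farthest-point initialization
    centers = [U[0]]
    for _ in range(1, k):
        d = np.min(((U[:, None, :] - np.array(centers)[None, :, :]) ** 2)
                   .sum(-1), axis=1)
        centers.append(U[d.argmax()])
    centers = np.array(centers)
    for _ in range(n_iter):
        d = ((U[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(axis=1)
        new_centers = np.array([
            U[labels == c].mean(axis=0) if (labels == c).any() else centers[c]
            for c in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels
```

For a graph with k nearly disconnected components, the k smallest eigenvectors of L are approximately piecewise constant on the components, so the embedded points collapse into k tight groups that k-means separates easily.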

5. SVM

6. EM 

To be learned:

7. Deep learning

8. Spark


Original post: http://www.cnblogs.com/linyx/p/3854949.html
