
Machine Learning Notes (Washington University) - Clustering Specialization - Week Six


1. Hierarchical clustering

  • Avoids choosing the number of clusters beforehand
  • Dendrograms help visualize different clustering granularities (no need to rerun the algorithm)
  • Most algorithms allow the user to choose any distance metric (k-means restricted us to Euclidean distance)
  • Can often find more complex cluster shapes than k-means or a Gaussian mixture model

Divisive (top-down):

Start with all the data in one big cluster and recursively split it (e.g. recursive k-means). Key design decisions (a sketch follows this list):

  • which algorithm to use for the recursive splits
  • how many clusters per split
  • when to split vs. stop: a max cluster size, a max cluster radius, or a specified number of clusters
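
As an illustration of these choices, here is a minimal divisive-clustering sketch using scikit-learn's KMeans. The 2-way split and the max-cluster-size stopping rule (with its default of 20) are illustrative assumptions, not the course's prescribed settings.

```python
import numpy as np
from sklearn.cluster import KMeans

def divisive_cluster(X, max_cluster_size=20):
    """Recursively split X with 2-means until every cluster is small enough."""
    # Stopping rule: a max cluster size; a max radius or a target
    # number of clusters would slot in here the same way.
    if len(X) <= max_cluster_size:
        return [X]
    # One choice from the list above: k-means with k=2 at each split.
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(X)
    # Guard against a degenerate split (all points on one side).
    if len(set(labels)) < 2:
        return [X]
    return (divisive_cluster(X[labels == 0], max_cluster_size)
            + divisive_cluster(X[labels == 1], max_cluster_size))

# Example: cluster 200 random 2-D points.
clusters = divisive_cluster(np.random.rand(200, 2))
print(len(clusters), "clusters")
```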

 

Agglomerative (bottom-up):

Start with each data point as its own cluster and merge clusters until all points are in one big cluster. Single linkage is the classic example.

Single linkage (see the sketch after these steps):

  • initialize each point to be its own cluster
  • define the distance between two clusters to be the minimum distance between any point in cluster one and any point in cluster two
  • merge the two closest clusters
  • repeat the merge step until all points are in one cluster
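
A minimal single-linkage sketch using SciPy's hierarchy module; the random 2-D data and the cut distance t=0.1 are arbitrary, illustrative values.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

X = np.random.rand(50, 2)  # 50 points in 2-D

# Single linkage: the distance between two clusters is the minimum
# pairwise distance between their points. Any SciPy metric works
# here, not just Euclidean.
Z = linkage(X, method='single', metric='euclidean')

# Cut the tree at a chosen distance to recover flat clusters.
labels = fcluster(Z, t=0.1, criterion='distance')
print(labels)
```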

 

Dendrogram

The x axis shows the data points (carefully ordered).

The y axis shows the distance at which pairs of clusters merge.

The path from a leaf to the root shows all clusters to which that point belongs and the order in which those clusters merge.
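
A minimal sketch that plots such a dendrogram with SciPy and matplotlib (the random data is just for illustration):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram

X = np.random.rand(20, 2)
Z = linkage(X, method='single')

# Leaves (x axis) are automatically reordered so merging clusters
# sit side by side; each U-shaped link is drawn at the height
# (y axis) of the corresponding merge.
dendrogram(Z)
plt.xlabel('data points (carefully ordered)')
plt.ylabel('distance between merged clusters')
plt.show()
```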

 


Original source: http://www.cnblogs.com/climberclimb/p/6935542.html
