
Hierarchical Clustering



Presentation Transcript


  1. Hierarchical Clustering

  2. Two Types of Clustering • Partitional algorithms: Construct various partitions and then evaluate them by some criterion • Hierarchical algorithms: Create a hierarchical decomposition of the set of objects using some criterion [Figures: a partitional clustering and a hierarchical clustering of the same data]

  3. (How-to) Hierarchical Clustering • The number of possible dendrograms with n leaves is (2n − 3)! / [2^(n−2) (n − 2)!] • Number of leaves → number of possible dendrograms: 2 → 1; 3 → 3; 4 → 15; 5 → 105; …; 10 → 34,459,425 • Since we cannot test all possible trees, we have to search the space of possible trees heuristically. We could do this: Bottom-Up (agglomerative): Starting with each item in its own cluster, find the best pair to merge into a new cluster. Repeat until all clusters are fused together. Top-Down (divisive): Starting with all the data in a single cluster, consider every possible way to divide the cluster into two. Choose the best division and recursively operate on both sides.
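A quick numeric check of the formula above, as a MATLAB sketch (the function-handle name is ours, for illustration only):

    % (2n-3)! / [2^(n-2) * (n-2)!] = number of rooted dendrograms with n leaves
    numDendrograms = @(n) factorial(2*n - 3) / (2^(n - 2) * factorial(n - 2));
    for n = 2:10
        fprintf('%2d leaves -> %d dendrograms\n', n, numDendrograms(n));
    end
    % prints 1, 3, 15, 105, ... and 34459425 for n = 10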

  4. We begin with a distance matrix which contains the distances between every pair of objects in our database. [Figure: a 5 × 5 distance matrix over the five objects, with zeros on the diagonal] D( , ) = 8 D( , ) = 3

  5. A generic technique for measuring similarity: to measure the similarity between two objects, transform one of the objects into the other, and measure how much effort it took. The measure of effort becomes the distance measure. The distance between Patty and Selma: change dress color, 1 point; change earring shape, 1 point; change hair part, 1 point. D(Patty, Selma) = 3. The distance between Marge and Selma: change dress color, 1 point; add earrings, 1 point; decrease height, 1 point; take up smoking, 1 point; lose weight, 1 point. D(Marge, Selma) = 5. This is called the “edit distance” or the “transformation distance”.
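The same transform-and-count idea is easy to make concrete for strings, where each single-character insertion, deletion, or substitution costs one point. A minimal MATLAB sketch of this Levenshtein-style edit distance (the function name is ours, not from the slides):

    function d = editDistance(s, t)
    % Minimum number of single-character insertions, deletions,
    % and substitutions needed to turn string s into string t.
    m = numel(s); n = numel(t);
    D = zeros(m + 1, n + 1);
    D(:, 1) = (0:m)';   % delete all of s
    D(1, :) = 0:n;      % insert all of t
    for i = 1:m
        for j = 1:n
            cost = double(s(i) ~= t(j));                 % 0 if chars match
            D(i + 1, j + 1) = min([D(i, j + 1) + 1, ...  % deletion
                                   D(i + 1, j) + 1, ...  % insertion
                                   D(i, j) + cost]);     % substitution
        end
    end
    d = D(m + 1, n + 1);
    end

For example, editDistance('marge', 'selma') returns 5.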

  6. Agglomerative clustering algorithm • Most popular hierarchical clustering technique • Basic algorithm • Compute the distance matrix between the input data points • Let each data point be a cluster • Repeat • Merge the two closest clusters • Update the distance matrix • Until only a single cluster remains • Key operation is the computation of the distance between two clusters • Different definitions of the distance between clusters lead to different algorithms
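A from-scratch MATLAB sketch of this basic algorithm, using the single-link (minimum) distance between clusters. It is illustrative only: pdist and squareform come from the Statistics and Machine Learning Toolbox, and in practice you would call the built-in linkage instead.

    function merges = naiveAgglomerative(X)
    % Rows of X are data points. Returns one row per merge:
    % [index of cluster A, index of cluster B, merge distance].
    n = size(X, 1);
    D = squareform(pdist(X));     % n-by-n distance matrix
    clusters = num2cell(1:n);     % each point starts as its own cluster
    merges = [];
    while numel(clusters) > 1
        best = inf; bi = 0; bj = 0;
        for i = 1:numel(clusters) - 1
            for j = i + 1:numel(clusters)
                % single link: closest pair of points across the two clusters
                d = min(min(D(clusters{i}, clusters{j})));
                if d < best, best = d; bi = i; bj = j; end
            end
        end
        merges(end + 1, :) = [bi, bj, best];            %#ok<AGROW>
        clusters{bi} = [clusters{bi}, clusters{bj}];    % merge the two
        clusters(bj) = [];
    end
    end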

  7. Bottom-Up (agglomerative): Starting with each item in its own cluster, find the best pair to merge into a new cluster. Repeat until all clusters are fused together. Consider all possible merges… Choose the closest …

  8. Bottom-Up (agglomerative): Starting with each item in its own cluster, find the best pair to merge into a new cluster. Repeat until all clusters are fused together. Consider all possible merges… Choose the closest … Consider all possible merges… Choose the closest …

  9. Bottom-Up (agglomerative): Starting with each item in its own cluster, find the best pair to merge into a new cluster. Repeat until all clusters are fused together. Consider all possible merges… Choose the closest … Consider all possible merges… Choose the closest … Consider all possible merges… Choose the closest …

  10. Bottom-Up (agglomerative): Starting with each item in its own cluster, find the best pair to merge into a new cluster. Repeat until all clusters are fused together. Consider all possible merges… Choose the closest … Consider all possible merges… Choose the closest … Consider all possible merges… Choose the closest … Consider all possible merges… Choose the closest …

  11. We know how to measure the distance between two objects, but defining the distance between an object and a cluster, or between two clusters, is not obvious. • Single linkage (nearest neighbor): In this method the distance between two clusters is determined by the distance of the two closest objects (nearest neighbors) in the different clusters. • Complete linkage (furthest neighbor): In this method, the distances between clusters are determined by the greatest distance between any two objects in the different clusters (i.e., by the "furthest neighbors"). • Group average linkage: In this method, the distance between two clusters is calculated as the average distance between all pairs of objects in the two different clusters. The sketch below shows how to select each criterion in MATLAB.
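These three definitions correspond to the method argument of MATLAB's built-in linkage function (a sketch, assuming Y is a condensed distance vector produced by pdist):

    Zs = linkage(Y, 'single');    % nearest neighbor
    Zc = linkage(Y, 'complete');  % furthest neighbor
    Za = linkage(Y, 'average');   % group average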

  12. [Figures: dendrograms produced by single linkage and by average linkage on the same data]

  13. Summary of Hierarchical Clustering Methods • No need to specify the number of clusters in advance. • The hierarchical nature maps nicely onto human intuition for some domains. • They do not scale well: time complexity of at least O(n²), where n is the total number of objects. • As with any heuristic search algorithm, local optima are a problem. • Interpretation of results is (very) subjective.

  14. Hierarchical Clustering in MATLAB. Given the points (1, 2), (2.5, 4.5), (2, 2), (4, 1.5), (4, 2.5), build a hierarchical clustering of these points. Answer: the matrix X stores the points.
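In MATLAB, X holds one point per row:

    X = [1   2;
         2.5 4.5;
         2   2;
         4   1.5;
         4   2.5];   % one point per row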

  15. Next, compute the distance between points 1 and 2, points 1 and 3, and so on, until the distance between every pair of points is known. The MATLAB function for this is pdist. To make the distance matrix Y easier to read, it can be transformed as follows (element (1,1) is the distance from point 1 to point 1, i.e. 0, and so on).
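The transformation described here is squareform, which turns the condensed distance vector into a readable square matrix:

    Y = pdist(X);        % 1-by-10 vector of pairwise Euclidean distances
    D = squareform(Y)    % 5-by-5 symmetric matrix, zeros on the diagonal
    % e.g. D(1,3) = 1, D(4,5) = 1, D(3,4) = 2.0616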

  16. Next, perform hierarchical clustering with the linkage function. The resulting matrix Z is read as follows: Row 1: objects 4 and 5, at distance 1, are clustered. Row 2: objects 1 and 3, at distance 1, are clustered. Row 3: object 6 (the cluster from row 1) and object 7 (the cluster from row 2) are clustered, at distance 2.0616. Row 4: object 2 and object 8 (the cluster from row 3) are clustered, at distance 2.5. This can be seen more clearly in the figure above.
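With the default (single-linkage) method, the call and the Z matrix described above are (the two distance-1 rows could appear in either order):

    Z = linkage(Y)   % default method is 'single'
    % Z =
    %     4.0000    5.0000    1.0000
    %     1.0000    3.0000    1.0000
    %     6.0000    7.0000    2.0616
    %     2.0000    8.0000    2.5000

Each row lists the two cluster indices merged and their distance; newly formed clusters are numbered n+1, n+2, … (here 6, 7, 8).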

  17. Create a dendrogram from the result matrix Z with dendrogram(Z), which produces the following dendrogram figure.
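Putting the whole example together (a sketch; pdist, linkage, and dendrogram are in the Statistics and Machine Learning Toolbox):

    X = [1 2; 2.5 4.5; 2 2; 4 1.5; 4 2.5];
    Y = pdist(X);       % pairwise distances
    Z = linkage(Y);     % single-linkage hierarchy
    dendrogram(Z)       % draws the tree; leaves 4 & 5 and 1 & 3 join first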
