Hierarchical clustering using mutual information


  • dc.contributor.author Kraskov, Alexander
  • dc.contributor.author Stögbauer, Harald
  • dc.contributor.author Andrzejak, Ralph Gregor
  • dc.contributor.author Grassberger, Peter
  • dc.date.accessioned 2021-02-05T11:08:42Z
  • dc.date.available 2021-02-05T11:08:42Z
  • dc.date.issued 2005
  • dc.description.abstract We present a conceptually simple method for hierarchical clustering of data, called the mutual information clustering (MIC) algorithm. It uses mutual information (MI) as a similarity measure and exploits its grouping property: the MI between three objects X, Y, and Z is equal to the sum of the MI between X and Y, plus the MI between Z and the combined object (XY). We use this both in the Shannon (probabilistic) version of information theory and in the Kolmogorov (algorithmic) version. We apply our method to the construction of phylogenetic trees from mitochondrial DNA sequences and to the output of independent components analysis (ICA) as illustrated with the ECG of a pregnant woman. (An illustrative code sketch of the grouping-property idea follows this record.)
  • dc.format.mimetype application/pdf
  • dc.identifier.citation Kraskov A, Stogbauer H, Andrzejak RG, Grassberger P. Hierarchical clustering using mutual information. Europhys Lett. 2005 Mar 25;70:278-84. DOI: 10.1209/epl/i2004-10483-y
  • dc.identifier.doi http://dx.doi.org/10.1209/epl/i2004-10483-y
  • dc.identifier.issn 0295-5075
  • dc.identifier.uri http://hdl.handle.net/10230/46366
  • dc.language.iso eng
  • dc.publisher IOP Publishing Ltd.
  • dc.relation.ispartof Europhysics Letters. 2005 Mar 25;70:278-84
  • dc.rights © Institute of Physics (IOP) https://doi.org/10.1209/epl/i2004-10483-y
  • dc.rights.accessRights info:eu-repo/semantics/openAccess
  • dc.title Hierarchical clustering using mutual information
  • dc.type info:eu-repo/semantics/article
  • dc.type.version info:eu-repo/semantics/acceptedVersion
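
The sketch below is a minimal, hypothetical illustration of the clustering scheme described in the abstract: clusters are merged greedily by mutual information, and a merged cluster is simply the combined object (XY), so MI between groups can be evaluated directly. It is not the authors' implementation; in particular, the closed-form Gaussian MI estimate used here is an illustrative stand-in for the MI estimators discussed in the paper, and the names gaussian_mi and mic_tree are invented for this example.

    import numpy as np

    def gaussian_mi(data, group_a, group_b):
        """MI (in nats) between two groups of columns of `data`, assuming the
        variables are jointly Gaussian:
            I(A; B) = 0.5 * log( det(C_A) * det(C_B) / det(C_AB) ).
        This closed form is only an illustrative stand-in for the estimators
        used in the paper."""
        cols = list(group_a) + list(group_b)
        c = np.cov(data[:, cols], rowvar=False)
        k = len(group_a)
        det_a = np.linalg.det(c[:k, :k])
        det_b = np.linalg.det(c[k:, k:])
        det_ab = np.linalg.det(c)
        return 0.5 * np.log(det_a * det_b / det_ab)

    def mic_tree(data):
        """Greedy agglomerative clustering with MI as the similarity measure.
        At each step the two clusters with the highest estimated MI are merged;
        the merged cluster is the union of their columns, i.e. the combined
        object (XY) of the grouping property."""
        clusters = [(i,) for i in range(data.shape[1])]
        merges = []
        while len(clusters) > 1:
            best = None
            for i in range(len(clusters)):
                for j in range(i + 1, len(clusters)):
                    mi = gaussian_mi(data, clusters[i], clusters[j])
                    if best is None or mi > best[0]:
                        best = (mi, i, j)
            mi, i, j = best
            merged = clusters[i] + clusters[j]
            merges.append((clusters[i], clusters[j], mi))
            clusters = [c for k, c in enumerate(clusters) if k not in (i, j)]
            clusters.append(merged)
        return merges

    if __name__ == "__main__":
        # Hypothetical toy data: three noisy copies of one source plus one
        # independent channel; the three related channels should merge first.
        rng = np.random.default_rng(0)
        s = rng.normal(size=(2000, 1))
        x = np.hstack([s + 0.1 * rng.normal(size=(2000, 1)) for _ in range(3)]
                      + [rng.normal(size=(2000, 1))])
        for a, b, mi in mic_tree(x):
            print(f"merge {a} + {b}  (MI ~ {mi:.2f} nats)")

Treating a merged cluster as the plain union of its columns is what the grouping property licenses: no information is lost by clustering, it is only redistributed between within-cluster and between-cluster terms.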