Dimensionality reduction
This is more of an overview for my own use than for teaching or exercise.
As a wide concept, it overlaps with ordination, factor analysis, and multivariate analysis.
✎ This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.
Factor Analysis, Principal Component Analysis (PCA), and variants
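For a quick feel: PCA projects data onto the directions of largest variance. A minimal sketch using scikit-learn (the data here is made up purely for illustration):

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Made-up data: 200 samples, 10 features with low-dimensional structure
X = rng.normal(size=(200, 3)) @ rng.normal(size=(3, 10))

pca = PCA(n_components=2)
X2 = pca.fit_transform(X)             # project onto the first two principal components

print(X2.shape)                       # (200, 2)
print(pca.explained_variance_ratio_)  # fraction of variance kept per component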
Correspondence Analysis (CA)
Conceptually similar to PCA, but uses a chi-square distance, making it more applicable to nominal data (where PCA applies to continuous data).
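The computation is close to PCA via SVD, just with chi-square weighting by row and column masses. A minimal numpy sketch, assuming the input is a contingency table of counts (the table below is made up):

import numpy as np

# Hypothetical contingency table (rows = groups, columns = categories)
N = np.array([[20, 10,  5],
              [10, 15, 10],
              [ 5, 10, 20]], dtype=float)

P = N / N.sum()                       # correspondence matrix
r = P.sum(axis=1)                     # row masses
c = P.sum(axis=0)                     # column masses

# Standardized residuals (chi-square scaling)
S = np.diag(1 / np.sqrt(r)) @ (P - np.outer(r, c)) @ np.diag(1 / np.sqrt(c))

U, sv, Vt = np.linalg.svd(S, full_matrices=False)

# Principal coordinates of rows and columns
row_coords = (U * sv) / np.sqrt(r)[:, None]
col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]

print(row_coords[:, :2])              # first two CA dimensions
print(col_coords[:, :2])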
See also:
Multidimensional scaling (MDS)
Input
Result evaluation
Algorithms
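For example, classical (metric) MDS, also known as Torgerson scaling (see the references below), double-centers the squared distance matrix and takes the top eigenvectors. A minimal numpy sketch, assuming the input distances are Euclidean (function name and data are just for illustration):

import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: embed points in k dimensions
    from a matrix of pairwise Euclidean distances D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered squared distances
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1]        # largest eigenvalues first
    L = np.maximum(eigvals[order][:k], 0)    # clip tiny negative eigenvalues
    V = eigvecs[:, order][:, :k]
    return V * np.sqrt(L)                    # embedded coordinates

# Hypothetical use: distances between four points on a line
X = np.array([[0.0], [1.0], [3.0], [6.0]])
D = np.abs(X - X.T)
print(classical_mds(D, k=1))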
See also
- WS Torgerson (1958) Theory and Methods of Scaling
- JB Kruskal and M Wish (1978) Multidimensional Scaling
- I Borg and P Groenen (2005) Modern Multidimensional Scaling: theory and applications
- TF Cox and MAA Cox (1994) Multidimensional Scaling
Generalized MDS (GMDS)
A generalization of metric MDS where the target domain is non-Euclidean.
See also:
Singular value decomposition (SVD)
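In the context of dimensionality reduction, the point is usually the truncated form: keeping the top k singular vectors gives the best rank-k approximation in the least-squares sense (and, on centered data, amounts to PCA). A minimal numpy sketch with made-up data:

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))        # hypothetical data: 100 samples, 20 features

# Center, then take the SVD
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2
scores = U[:, :k] * s[:k]             # projection onto the first k components
X_approx = scores @ Vt[:k]            # best rank-k reconstruction

print(scores.shape)                   # (100, 2)
print(np.linalg.norm(Xc - X_approx))  # reconstruction error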
See also:
- http://www.kwon3d.com/theory/jkinem/svd.html
- http://en.wikipedia.org/wiki/Singular_value_decomposition
- http://www.netlib.org/lapack/lug/node53.html
- http://public.lanl.gov/mewall/kluwer2002.html
Nonlinear dimensionality reduction / Manifold learning
Isomap
Locally Linear Embedding (LLE)
Hessian Eigenmapping / Hessian LLE
Sammon’s (non-linear) mapping
Spectral Embedding
Local Tangent Space Alignment (LTSA)
Non-metric MDS
t-SNE
UMAP (Uniform Manifold Approximation and Projection)
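Several of these are implemented in scikit-learn's manifold module behind a common fit_transform interface (UMAP lives in the separate umap-learn package but follows the same pattern). A minimal sketch on the classic swiss-roll toy dataset:

import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap, LocallyLinearEmbedding, TSNE

X, color = make_swiss_roll(n_samples=1000, random_state=0)

# Each of these unrolls the 3D swiss roll into 2D in a different way
embeddings = {
    "Isomap": Isomap(n_neighbors=10, n_components=2),
    "LLE":    LocallyLinearEmbedding(n_neighbors=10, n_components=2),
    "t-SNE":  TSNE(n_components=2, perplexity=30, random_state=0),
}

for name, model in embeddings.items():
    Y = model.fit_transform(X)
    print(name, Y.shape)              # (1000, 2) each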
Expectation Maximisation (EM)
✎ This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.
A broadly applicable iterative method for maximum-likelihood estimation (MLE) in models with latent/hidden variables: it alternates an Expectation step (estimate the hidden quantities given the current parameters) with a Maximization step (re-estimate the parameters by ML given those estimates).
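As a concrete case, fitting a two-component 1D Gaussian mixture alternates an E-step (soft assignment of points to components) with an M-step (ML re-estimation of weights, means, and variances). A minimal numpy sketch with made-up data:

import numpy as np

rng = np.random.default_rng(0)
# Made-up 1D data drawn from two Gaussians
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 200)])

# Initial guesses for mixture weights, means, and standard deviations
w  = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
sd = np.array([1.0, 1.0])

for _ in range(50):
    # E-step: responsibility of each component for each point
    dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    resp = dens / dens.sum(axis=1, keepdims=True)

    # M-step: maximum-likelihood updates given those responsibilities
    nk = resp.sum(axis=0)
    w  = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print(w, mu, sd)   # should end up near [0.6, 0.4], [-2, 3], [1, 1]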