Math notes / Linear algebra

From Helpful
This is more for overview (my own) than for teaching or exercise.


Linear algebra studies things like linear spaces (a.k.a. vector spaces), linear functions (a.k.a. linear maps, linear transforms), and systems of linear equations, in part because this fairly specific focus has relatively wide application. A lot of those uses relate to, informally, "when a matrix multiplies a vector, it does something meaningful", and a decent area of linear algebra studies the various useful things that multiplication can do.

Vectors and matrices

Notation and conventions

Contents/uses, properties, operations related to properties


Matrices can hold any tabular sort of data. Many uses are more specific and constrained; in many cases the entries are real numbers.

You may sometimes see irregular (ragged) matrices or sparse matrices, often in computing. In general, though, matrices are assumed to be rectangular and dense.

Uses of matrices

Matrices are used for various kinds of bookkeeping of nontrivial data, and so have many specific (and specifically named) uses, including:

In linear algebra

  • representing certain numerical problems, to be solved in specific ways (may also be a linear-equation thing)
Commonly the coefficients of a set of linear equations
Often done because the method keeps track of numbers via their positions in something matrix-like
  • Transformation matrix [1]
storing a linear transform in a matrix, so that matrix multiplication (typically with a coordinate vector) applies that transformation
see e.g. OpenGL, and various introductions to 3D graphics
Some single-purpose transformation matrix examples:
Rotation matrix - [2]
Shift matrix - (ones only on the subdiagonal or superdiagonal)
Shear matrix - [3]
Centering matrix - [4]
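As an illustrative sketch of the transformation-matrix idea above (using NumPy, which the page itself does not assume), here is a 2D rotation matrix applied to a coordinate vector by plain matrix multiplication:

```python
import numpy as np

# A 2D rotation matrix, one of the single-purpose transformation
# matrices listed above.
theta = np.pi / 2  # rotate 90 degrees counterclockwise

R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])   # unit vector along the x axis
rotated = R @ v            # matrix multiplication applies the transform
# rotated is approximately [0, 1]: the x axis mapped onto the y axis
```

The same pattern (build a matrix once, apply it to many vectors) is what 3D graphics pipelines such as OpenGL rely on.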

In graph theory and such

  • distance matrix - distances between all given points. E.g. used for graphs, but also for other things where there is a sensible metric.
  • adjacency matrix - [5]
  • further matrices assisting graph theory, including degree matrix[6], incidence matrix[7],
  • Similarity matrix [8]
  • Substitution matrix [9]
  • Stochastic matrix -
    • a.k.a. probability matrix, transition matrix, substitution matrix, Markov matrix
    • stores the transitions in a Markov chain
    • [10]
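The "stores the transitions in a Markov chain" point can be sketched as follows; the two-state chain here is a made-up example, not from the page:

```python
import numpy as np

# Hypothetical two-state Markov chain. Row i holds the probabilities of
# moving from state i to each state, so every row sums to 1
# (a row-stochastic matrix).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

p0 = np.array([1.0, 0.0])   # start with certainty in state 0
p1 = p0 @ P                 # distribution after one step
p2 = p1 @ P                 # distribution after two steps
```

Each multiplication by P advances the probability distribution by one step of the chain.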

In multivariate analysis, statistical analysis, eigen-analysis

  • Covariance matrix - used in multivariate analysis. Stores the covariance between all input variables [11]
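A minimal sketch of computing a covariance matrix, assuming NumPy and made-up data (observations in rows, variables in columns):

```python
import numpy as np

# Hypothetical data: 3 variables observed 100 times.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
X[:, 1] += X[:, 0]          # make variable 1 correlate with variable 0

# np.cov treats rows as variables by default, so pass rowvar=False
# when observations are in rows.
C = np.cov(X, rowvar=False)
# C is 3x3 and symmetric; C[i, j] is the covariance of variables i and j
```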


  • representing differential equations


  • Many are defined in such a way that certain operations are meaningful (though the meaning of operations can obviously vary).
For example, multiplying a graph's adjacency matrix with itself expresses connections reachable in that many steps: the (i, j) entry of the k-th power counts the walks of length k from node i to node j.

  • Real vector spaces ℝn (often ℝ2 or ℝ3 in examples) are quite common for vectors, and common to various matrix uses.
For example, when solving linear equations you often have row vectors in ℝn, and many (though not all) operations
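The adjacency-matrix-powers point above can be sketched concretely (NumPy, with a made-up three-node path graph):

```python
import numpy as np

# Hypothetical path graph 0 - 1 - 2, as an adjacency matrix.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])

A2 = A @ A
# (A^k)[i, j] counts walks of length k from node i to node j:
# A2[0, 2] == 1: exactly one two-step walk from 0 to 2 (via 1)
# A2[1, 1] == 2: two two-step walks from 1 back to itself (1-0-1, 1-2-1)
```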


Augmenting - an augmented matrix appends the columns of one matrix to another (the two must have the same number of rows). Seen in application to systems of linear equations.
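A quick sketch of augmenting, assuming NumPy and a made-up 2x2 system:

```python
import numpy as np

# Hypothetical system: x + 2y = 5, 3x + 4y = 6
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([[5.0],
              [6.0]])

# Augmenting appends b's column to A; both must have the same row count.
aug = np.hstack([A, b])     # shape (2, 3)
```

Row-reduction methods for linear systems typically operate on this augmented form so the right-hand side is carried along automatically.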


Some supporting terms and properties


Common concepts around vectors and matrices


Linear combination

Span, basis, and subspace

Simultaneous equations / systems of equations

Linear independence



Vector spaces and related concepts

Orthogonality, orthonormality

Angle between two vectors
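A minimal sketch of the standard computation here, via the dot product (NumPy, made-up vectors): cos(theta) = (a · b) / (|a| |b|).

```python
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0])

cos_theta = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))  # clip guards rounding
# theta is pi/4 (45 degrees) for these two vectors
```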

More complex concepts around here

Eigenvectors and eigenvalues

Eigenvalue algorithms

Power method / power iteration
Deflation Method
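The power method listed above can be sketched as follows (NumPy, made-up symmetric matrix): repeatedly multiply by the matrix and normalize, which for most starting vectors converges to the eigenvector of the eigenvalue largest in magnitude.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvalues 3 and 1

v = np.array([1.0, 0.0])     # arbitrary nonzero starting vector
for _ in range(50):
    v = A @ v                # amplify the dominant eigendirection
    v = v / np.linalg.norm(v)

eigenvalue = v @ A @ v       # Rayleigh quotient estimate, here ~3
```

Deflation then subtracts out the found eigenpair so the same iteration can recover the next-largest eigenvalue.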




On the decomposed matrices' sizes

Definition / main properties

In more detail

Further properties, observations, and uses

See also