Math notes / Algebra
This is more an overview for my own use than for teaching or exercise.
Algebra as a concept
Algebra is "the study of mathematical symbols and the rules for manipulating these symbols" (Wikipedia).
A little more practically,
an algebra defines types, and operations to work on them.
It tries to tie together lots of mathematical concepts into a coherent system.
You could define many different algebraic systems if you wish,
but the most useful ones are the ones taught in school: natural numbers and real numbers,
and addition, subtraction, multiplication, and division.
These are so universally useful they are called elementary algebra.
When a class is called 'algebra' in school, it tends to teach elementary algebra plus a few other widely applicable concepts,
like variables and expressions and what is valid to do on them.
Schools may touch on some basics of linear algebra, which studies linear spaces and linear functions, because having a basic sense of these is useful in various engineering fields.
There is linear algebra, multilinear algebra, the algebra of sets, and more.
Not a lot more.
A lot of things you can think of either can be expressed in an algebra that already exists,
or exist as an algebra that isn't generally applicable enough to be commonly known.
Elementary algebra
Elementary algebra or basic algebra points at the more commonly applicable parts of algebra, those regularly taught in secondary education - typically the more easily understood parts, and those that help move into calculus (and excluding the separately useful linear algebra, and the often less directly useful abstract algebra).
To some degree, the concept of basic algebra seems to exist to contrast it with the more complex material typically left until university (though basic algebra courses tend to introduce concepts that are only explored later).
Algebra assumes knowledge of arithmetic, and introduces some concepts that are central to it, like
- the concept of variables
- how to play with expressions and how they are affected by certain changes,
- polynomial equations (linear equation, quadratic equation, etc.)
- factorization, root determination, and such (a small sketch follows this list)
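As a concrete sketch of those last items, using Python's sympy library (the quadratic here is an arbitrary example):

import sympy

x = sympy.symbols('x')
expr = x**2 - 5*x + 6               # an arbitrary quadratic expression

print(sympy.expand((x + 1)**2))     # x**2 + 2*x + 1  (manipulating an expression)
print(sympy.factor(expr))           # (x - 2)*(x - 3)  (factorization)
print(sympy.solve(expr, x))         # [2, 3]  (root determination)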
Linear algebra
Linear algebra studies things like linear spaces (a.k.a. vector spaces) and linear functions (a.k.a. linear maps, linear transforms), systems of linear equations.
...in part because this fairly specific focus has relatively wide application.
A lot of said uses relate to, informally, "when a matrix multiplies a vector, it does something meaningful", and a decent area of linear algebra studies the various useful things that you can do.
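For example, a rotation matrix multiplied with a coordinate vector rotates that point around the origin. A minimal sketch in Python with numpy (the angle is an arbitrary choice):

import numpy as np

theta = np.pi / 2                         # rotate by 90 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])                  # a point on the x axis
print(R @ v)                              # approximately [0, 1]: rotated onto the y axis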
Vectors and matrices
Conventions: Dimensions and addressing
Matrix dimensions are mentioned as row-by-column.
For example, the following is a 3-by-4 matrix:
    [1 2 3 4]
M = [5 6 7 8]
    [9 8 7 6]
Conventionally, matrices are shown with box brackets (the above being one approximation), and vectors with parentheses.
Picking out items is often done with subscript (sometimes superscript).
For example, for that last matrix, M3,1 points at the third row, first column - the cell with value 9. Using literal numbers really only happens in examples; algorithms might use Mi,j.
Vectors are basically lists. A vector of n elements often implies a coordinate vector in n-dimensional space, but there are other uses.
Vectors can be seen as 1-by-something sized matrices - row vectors.
Or, less commonly, as something-by-1 matrices - column vectors.
The preference for one in some contexts (e.g. 3D stuff in computers)
comes mostly from what happens around matrix multiplication.
For a vector you have only one dimension, so v4 picks out the fourth element from a vector, either way.
In generic definitions you may see vectors described like:
v = (v1, v2, ..., vn)
You may see a matrix-style notation like:
(a b c)
You may also see a symbolic notation for vectors like:
v = ai + bj          (two dimensions)
v = ai + bj + ck     (three dimensions)
...where a, b, and c refer to the values in each dimension, and i, j, and k to unit vectors along the dimensions themselves.
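A quick sketch of these conventions in numpy; note that numpy indexes from 0 rather than 1, so the M3,1 from the example above becomes M[2, 0]:

import numpy as np

M = np.array([[1, 2, 3, 4],
              [5, 6, 7, 8],
              [9, 8, 7, 6]])

print(M.shape)      # (3, 4): row count first, then column count
print(M[2, 0])      # 9: third row, first column

v = np.array([10, 20, 30, 40])
print(v[3])         # 40: the fourth element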
Terminology: Properties of some basic matrices
...and a few common terms to go along, like that the main diagonal[1] of a matrix is the diagonal through the elements with the same row index as column index (top left to bottom right). (Regularly seen mentioned more for square matrices, but it exists for rectangular matrices as well.)
A diagonal matrix[2] is a square matrix in which all elements outside the main diagonal are zero, e.g.
1 0 0
0 2 0
0 0 2
A square matrix[3] is one that has as many rows as columns. (A number of operations only make sense on a square matrix and not on a rectangular one - which is often relatively intuitive because of what the operation tries to do, or because of what the contents represent.)
An identity matrix[4] is a square diagonal matrix that has 1 on the main diagonal, 0 elsewhere, for example
1 0 0
0 1 0
0 0 1
Notes:
- identity matrices, when used in matrix multiplication, do not change the matrix they are applied to. For example, for any m-by-n matrix A:
Im A = A
A In = A
- (In is often used to refer to an n-by-n identity matrix, so that example was I3. Sometimes it's just I, with size left up to context)
- identity matrices help in certain further definitions, like...
An invertible matrix (also: non-singular, non-degenerate)
- A square matrix is invertible iff its determinant is nonzero
- A square matrix A is invertible iff there exists a matrix B so that AB = BA = I
- (also, a square matrix is invertible iff the linear map that the matrix represents is an isomorphism)
- If a square matrix has a left inverse or a right inverse then it is invertible (see invertible matrix for other equivalent statements).
A singular matrix (also: non-invertible, degenerate)
- a matrix is singular iff its determinant is 0
Being invertible / being nonsingular / having a nonzero determinant is often a useful check (a small sketch follows this list), e.g. of
- whether a transformation matrix has an inverse that reverses it (e.g. not when it zeroes something out)
- whether rows/columns are linearly independent (when seeing it as a system of linear equations, whether one can be reduced to 0)
- whether a numeric method can find a (unique) solution
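A minimal sketch of such checks in Python with numpy (the matrices are arbitrary examples); this also demonstrates the identity-matrix behaviour mentioned above:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
print(np.linalg.det(A))                    # 5.0 (up to float error): nonzero, so invertible
Ainv = np.linalg.inv(A)
print(np.allclose(A @ Ainv, np.eye(2)))    # True: A times its inverse gives I

B = np.array([[1.0, 2.0],
              [2.0, 4.0]])                 # second row is twice the first
print(np.linalg.det(B))                    # ~0: singular, rows are linearly dependent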
Getting to more specific matrices
Some operations common to those
Terminology: Wider uses of matrices
As matrices can be used for bookkeeping of almost any nontrivial data, they have many specific uses.
Including:
In linear algebra
- representing certain numerical problems,
- for example, and commonly, the coefficients of a set of linear equations (each row being one equation)
- in part just a data storage thing, but there are some matrix properties/transforms that make sense
- Transformation matrix [5]
- storing linear transforms in matrices
- ...so that matrix multiplication, typically on coordinate vectors, will apply that transformation to that vector
- see e.g. the workings of OpenGL, and various introductions to 3D graphics
- Some single-purpose transformation matrix examples (a small sketch follows this list):
- Rotation matrix - [6]
- Shift matrix - http://en.wikipedia.org/wiki/Shift_matrix (ones only on the subdiagonal or superdiagonal)
- Shear matrix - [7]
- Centering matrix - [8]
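A minimal sketch of the transformation-matrix idea, here using a 2D shear as an arbitrary example:

import numpy as np

# a horizontal shear: x' = x + 0.5*y, y' = y  (the 0.5 is an arbitrary choice)
S = np.array([[1.0, 0.5],
              [0.0, 1.0]])

v = np.array([2.0, 2.0])
print(S @ v)        # [3. 2.]: the point slides sideways, proportionally to its height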
In graph theory and such
- distance matrix - distances between all given points. E.g. used for graphs, but also for other things where there is a sensible metric.
- adjacency matrix - [9]
- note that multiplying this with itself will express connections in as many steps: the k-th power counts the walks of length k between each pair of nodes (see the sketch after this list)
- further matrices assisting graph theory, including the degree matrix[10] and incidence matrix[11]
- Similarity matrix [12]
- Substitution matrix [13]
- Stochastic matrix [14]
- a.k.a. probability matrix, transition matrix, substitution matrix, Markov matrix
- stores the transitions in a Markov chain
In multivariate analysis, statistical analysis, eigen-analysis
- Covariance matrix (a.k.a. auto-covariance matrix, dispersion matrix, variance matrix, or variance–covariance matrix)
- used in multivariate analysis
- Stores the covariance between each pair of input variables (see the sketch after this list)
- http://en.wikipedia.org/wiki/Covariance_matrix
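A minimal sketch with numpy's np.cov, on some arbitrary made-up data (np.cov treats each row as one variable by default):

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2 * x + 0.1                              # strongly related to x
z = np.array([3.0, 1.0, 4.0, 1.0, 5.0])      # unrelated values

C = np.cov(np.vstack([x, y, z]))
print(C.shape)      # (3, 3): covariance between each pair of variables
print(C[0, 1])      # 5.0: x and y vary together strongly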
Other
- Confusion matrix
- a visualisation of the performance of a classification algorithm
- rows are predicted class, columns are known class
- numbers are the fraction of the cases categorized that way, with the diagonal being '...correctly'
- the closer this is to an identity matrix, the better the classifier performed (see the sketch after this list)
- representing differential equations
- Many others, see e.g. http://en.wikipedia.org/wiki/List_of_matrices
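A minimal sketch of building a confusion matrix in numpy, following the row/column convention above; the data is made up, and it is normalized per known class here so that a perfect classifier would give exactly the identity matrix:

import numpy as np

known     = np.array([0, 0, 0, 1, 1, 2, 2, 2])   # true class per case
predicted = np.array([0, 0, 0, 1, 2, 2, 2, 2])   # classifier output per case

# rows are predicted class, columns are known class, as described above
counts = np.zeros((3, 3))
for p, k in zip(predicted, known):
    counts[p, k] += 1

# normalize each column; diagonal-heavy output means a good classifier
fractions = counts / counts.sum(axis=0, keepdims=True)
print(fractions)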
Common concepts around linear algebra
Space
Linear combination
Span, basis, and subspace
Orthogonality, orthonormality
Simultaneous equations / systems of equations
Linear independence; Basis
Rank
Rank is the maximum number of linearly independent rows (equivalently, columns) in a given matrix.
(Side note: the rank is therefore at most the minimum of the row count and column count.)
http://en.wikipedia.org/wiki/Rank_(linear_algebra)
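A minimal sketch with numpy (the matrix is an arbitrary example with one dependent row):

import numpy as np

M = np.array([[1, 2, 3],
              [4, 5, 6],
              [5, 7, 9]])     # third row is the sum of the first two

print(np.linalg.matrix_rank(M))    # 2: only two linearly independent rows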
Angle between two vectors
More complex concepts around here
Eigenvectors and eigenvalues
Eigenvalue algorithms
Power method / power iteration
Deflation Method
Eigendecomposition
Applications
SVD
On the decomposed matrices' sizes
Definition / main properties
In more detail
Further properties, observations, and uses
See also
Others
Abstract algebra
Abstract algebra studies the possible generalizations within algebra.
It concerns structures like groups, rings, fields, modules, and vector spaces, and their interrelations.