Norms
Induced Norms
n-Norm
Frobenius Norm
Spectral Norm
Derivatives
Consider the following set of linear equations:
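For concreteness, take three equations in three unknowns (the 3 × 3 size is only an illustration; everything below works the same way for m equations in n unknowns):

$$
\begin{aligned}
a_{11}x_1 + a_{12}x_2 + a_{13}x_3 &= b_1 \\
a_{21}x_1 + a_{22}x_2 + a_{23}x_3 &= b_2 \\
a_{31}x_1 + a_{32}x_2 + a_{33}x_3 &= b_3
\end{aligned}
$$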
We can define the matrix A to represent the coefficients, the vector b to represent the results (the right-hand sides), and the vector x to represent the variables:
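For the illustrative system above, these are:

$$
A = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix},
\qquad
b = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix},
\qquad
x = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}.
$$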
Rewriting the system in terms of these matrices, we get:
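In compact form, with the constant vector b written on the left:

$$
b = A x
$$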
Now, let's say we want the derivative of this equation with respect to the vector x. The vector b is constant (it does not depend on x), so the derivative of the left-hand side of the equation is zero. Analyzing the right-hand side shows us:
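Using the convention (numerator layout) that the derivative of an m-vector with respect to an n-vector is the m × n matrix of partial derivatives, the right-hand side differentiates to A itself:

$$
\frac{\partial (A x)}{\partial x} = A,
\qquad \text{since} \qquad
\frac{\partial}{\partial x_j} \sum_{k} a_{ik} x_k = a_{ij}.
$$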
Pseudo-Inverses
There are special matrices known as pseudo-inverses, which satisfy some of the properties of an inverse but not others. To recap: if A and B are two square n × n matrices and the following equation holds, we say that A is the inverse of B and B is the inverse of A:
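Writing $I_n$ for the n × n identity matrix:

$$
A B = B A = I_n
$$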
Right Pseudo-Inverse
Consider the following matrix:
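One standard construction, assuming A is m × n with m ≤ n and full row rank (so that $A A^{T}$ is invertible), is:

$$
R = A^{T} \left( A A^{T} \right)^{-1}
$$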
We call this matrix R the right pseudo-inverse of A, because:
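with R defined as above, multiplying A by R on the right gives the m × m identity:

$$
A R = A A^{T} \left( A A^{T} \right)^{-1} = I_m
$$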
but
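the product in the opposite order is, in general, not the identity: it is only the orthogonal projection onto the row space of A, whose rank is m, so it cannot equal $I_n$ whenever m < n:

$$
R A = A^{T} \left( A A^{T} \right)^{-1} A \neq I_n \quad \text{(in general)}
$$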
We will denote the right pseudo-inverse of A as $A^{+}$ (notation varies between texts; $A^{\dagger}$ is also common).
Left Pseudo-Inverse
Consider the following matrix:
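By analogy with the right pseudo-inverse, assume now that A is m × n with m ≥ n and full column rank (so that $A^{T} A$ is invertible), and take:

$$
L = \left( A^{T} A \right)^{-1} A^{T}
$$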
We call L the left pseudo-inverse of A because
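multiplying A by L on the left gives the n × n identity:

$$
L A = \left( A^{T} A \right)^{-1} A^{T} A = I_n
$$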
but
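the product in the opposite order is only the orthogonal projection onto the column space of A, whose rank is n, so it cannot equal $I_m$ whenever n < m:

$$
A L = A \left( A^{T} A \right)^{-1} A^{T} \neq I_m \quad \text{(in general)}
$$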
We will denote the left pseudo-inverse of A as $A^{+}$ as well; the shape of A makes clear which formula is meant.
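As a quick numerical sanity check, here is a minimal NumPy sketch (the example matrices are arbitrary full-rank choices) verifying the one-sided identities above:

```python
import numpy as np

# A wide matrix (2 x 3, full row rank) and a tall matrix (3 x 2, full column rank).
A_wide = np.array([[1.0, 2.0, 3.0],
                   [0.0, 1.0, 4.0]])
A_tall = A_wide.T

# Right pseudo-inverse of the wide matrix: R = A^T (A A^T)^{-1}
R = A_wide.T @ np.linalg.inv(A_wide @ A_wide.T)
print(np.allclose(A_wide @ R, np.eye(2)))   # True:  A R is the 2 x 2 identity
print(np.allclose(R @ A_wide, np.eye(3)))   # False: R A is only a 3 x 3 projection

# Left pseudo-inverse of the tall matrix: L = (A^T A)^{-1} A^T
L = np.linalg.inv(A_tall.T @ A_tall) @ A_tall.T
print(np.allclose(L @ A_tall, np.eye(2)))   # True:  L A is the 2 x 2 identity
print(np.allclose(A_tall @ L, np.eye(3)))   # False: A L is only a 3 x 3 projection
```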