## Archive for April 2009

### Linear algebra demystified

28 April 2009

Read through Linear Algebra Demystified and worked through the problems. It was very helpful for me as a self-teaching source.

### Linear algebra summary

27 April 2009

On the heels of how successful it was for me to work with Advanced Calculus Demystified, I picked up Linear Algebra Demystified from the library and worked through it from April 11 to April 27. This book was similarly effective for me: a cursory, wide review with lots of worked examples. David McMahon does a good job, and has apparently written a number of these books.

**Chapter 1 – Systems of linear equations** I learned how to carefully and correctly perform Gauss-Jordan elimination on a matrix. For me the trick again is just to be very, very organized, and not to let a lack of space on the page compromise the steps. Everything must be double-checked, and that is only easy with well-organized work.

The *Hermitian conjugate* of $A$ is denoted $A^\dagger$ and is the transpose and complex conjugate, $A^\dagger = (A^T)^*$. A matrix with an inverse is called *nonsingular* (note: no information is lost when $A$ operates on a vector).
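The Gauss-Jordan procedure above can be sketched in a few lines of Python with numpy (the example system is my own, not from the book):

```python
import numpy as np

def gauss_jordan(A, b):
    """Reduce the augmented matrix [A | b] to reduced row-echelon form
    and return the solution of A x = b (A assumed square, nonsingular)."""
    M = np.hstack([A.astype(float), b.reshape(-1, 1).astype(float)])
    n = len(b)
    for col in range(n):
        # Partial pivoting: swap in the row with the largest pivot entry.
        pivot = np.argmax(np.abs(M[col:, col])) + col
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]          # scale the pivot row so the pivot is 1
        for row in range(n):           # eliminate the column everywhere else
            if row != col:
                M[row] -= M[row, col] * M[col]
    return M[:, -1]

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = gauss_jordan(A, b)   # solves 2x + y = 3, x + 3y = 5
```

The partial-pivoting swap is not strictly part of the hand procedure, but it is the same organization discipline in code form.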

**Chapter 2 – Matrix algebra** *Cramer's rule* uses determinants to solve systems of linear equations: for $A\mathbf{x} = \mathbf{b}$ with $A$ square and nonsingular, $x_i = \det(A_i)/\det(A)$, where $A_i$ is $A$ with column $i$ replaced by $\mathbf{b}$.
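A quick numpy sketch of Cramer's rule (illustrative system of my own choosing):

```python
import numpy as np

def cramer(A, b):
    """Solve A x = b by Cramer's rule: x_i = det(A_i) / det(A),
    where A_i is A with column i replaced by b."""
    d = np.linalg.det(A)
    x = np.empty(len(b))
    for i in range(len(b)):
        Ai = A.copy()
        Ai[:, i] = b                   # replace column i with b
        x[i] = np.linalg.det(Ai) / d
    return x

A = np.array([[1.0, 2.0], [3.0, 5.0]])
b = np.array([5.0, 13.0])
x = cramer(A, b)
```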

Calculating the inverse of a matrix (of any size) can be accomplished by calculating the determinant and the adjugate of the matrix. The *minor* $M_{mn}$ is the determinant of the submatrix created by eliminating row $m$ and column $n$. The *cofactor* $C_{mn} = (-1)^{m+n} M_{mn}$ is the *signed minor* for $(m,n)$. The *adjugate* of $A$ is the transpose of the matrix of cofactors, so $A^{-1} = \operatorname{adj}(A)/\det(A)$.
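The cofactor construction translates directly into numpy (a toy implementation of my own; `np.linalg.inv` is the practical route, since this scales terribly):

```python
import numpy as np

def adjugate_inverse(A):
    """Inverse via cofactors: A^{-1} = adj(A) / det(A)."""
    n = A.shape[0]
    C = np.empty((n, n))
    for m in range(n):
        for k in range(n):
            # minor M_mk: delete row m and column k, take the determinant
            minor = np.delete(np.delete(A, m, axis=0), k, axis=1)
            C[m, k] = (-1) ** (m + k) * np.linalg.det(minor)
    return C.T / np.linalg.det(A)    # adjugate = transpose of cofactors

A = np.array([[2.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])
invA = adjugate_inverse(A)
```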

**Chapter 3 – Determinants**

**Chapter 4 – Vectors**

**Chapter 5 – Vector spaces** A vector space is a set of elements that is closed under addition and scalar multiplication. Addition is associative and commutative, and there exist an identity element and inverses under addition. Scalar multiplication is associative and distributive, and there exists an identity element for scalar multiplication. Given a vector space $V$, a subset $W$ is a *subspace* if $W$ is also a vector space. This is verifiable by checking only that the zero vector is in $W$ and that $W$ is closed under addition and scalar multiplication. E.g. $\mathbb{R}^3$ has the $xy$-plane as a subspace. Is it true that, generally, $\mathbb{R}^n$ has subspaces of each lower dimension, or is there an infinite number of subspaces?

A matrix $A$ in row-echelon form reveals the row space and column space of $A$. The *null space* of a matrix is found from $A\mathbf{x} = \mathbf{0}$, and $\operatorname{rank}(A) + \operatorname{nullity}(A) = n$, the number of columns. The *closure relation* or *completeness* means that we can write the identity in terms of outer products of a set of basis vectors, $I = \sum_i \mathbf{e}_i \mathbf{e}_i^T$.
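The rank-nullity relation is easy to check numerically. A numpy sketch (my own example matrix, whose third row is the sum of the first two so the rank is deficient):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])   # row 3 = row 1 + row 2
n = A.shape[1]                          # number of columns
rank = np.linalg.matrix_rank(A)

# Null space from the SVD: the right singular vectors beyond the rank
# (those paired with zero singular values) span the null space.
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[rank:]                  # rows spanning the null space
nullity = null_basis.shape[0]

assert rank + nullity == n              # rank-nullity theorem
assert np.allclose(A @ null_basis.T, 0)
```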

**Chapter 6 – Inner product spaces** The *inner product* is given by $\langle \mathbf{u}, \mathbf{v} \rangle = \sum_i u_i v_i$ (with a complex conjugate on one factor in complex spaces).

*Inner products* on function spaces can be used to check for orthogonality of functions! The Gram-Schmidt process can generate an orthonormal basis from an arbitrary basis.
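A minimal Gram-Schmidt sketch in numpy (classical variant, on an example basis of my own):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: build an orthonormal basis from an
    arbitrary basis by subtracting projections onto earlier vectors."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for q in basis:
            w = w - (q @ v) * q          # remove the component along q
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

Q = gram_schmidt(np.array([[1.0, 1.0, 0.0],
                           [1.0, 0.0, 1.0],
                           [0.0, 1.0, 1.0]]))
# Rows of Q are orthonormal, so Q @ Q.T is the identity.
```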

**Chapter 7 – Linear transformations**

**Chapter 8 – Eigenvalues** The *characteristic polynomial* of a matrix $A$ is given by $\det(A - \lambda I)$, and by setting this to zero you have the *characteristic equation*. Two matrices $A$ and $B$ are *similar* if $B = S^{-1} A S$ for some invertible $S$. Similar matrices have the same eigenvalues. The eigenvectors of a symmetric or Hermitian matrix form an orthonormal basis. A unitary matrix has the property that $U^\dagger U = I$. When two or more eigenvectors share the same eigenvalue, the eigenvalue is called *degenerate*. The number of eigenvectors that have the same eigenvalue is the *degree of degeneracy*.
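The claim that similar matrices share eigenvalues can be checked numerically (example matrices are my own):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
S = np.array([[1.0, 1.0], [0.0, 1.0]])    # any invertible matrix
B = np.linalg.inv(S) @ A @ S              # B is similar to A

# Similar matrices have the same characteristic polynomial,
# hence the same eigenvalues (sorted, to compare independent of order).
eig_A = np.sort(np.linalg.eigvals(A))
eig_B = np.sort(np.linalg.eigvals(B))
```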

**Chapter 9** – **Special matrices** A matrix $A$ is *symmetric* if $A^T = A$. Also note that $(AB)^T = B^T A^T$. A symmetric matrix $S$ can be written as $S = \frac{1}{2}(A + A^T)$, so any $A$ can be used to construct a symmetric matrix. A *skew-symmetric* matrix $K$ has the properties that $K^T = -K$ and that its diagonal elements are zero. The product of two symmetric matrices is symmetric if they commute. The product of two skew-symmetric matrices is skew-symmetric if they anti-commute. The *Hermitian conjugate* is defined by $A^\dagger = (A^T)^*$.

Note that $(A^\dagger)^\dagger = A$, $(AB)^\dagger = B^\dagger A^\dagger$, and $(A + B)^\dagger = A^\dagger + B^\dagger$. A *Hermitian matrix* $A$ is one for which $A^\dagger = A$. A Hermitian matrix has the properties that its diagonal elements are all real numbers, that all its eigenvalues are real, and that its eigenvectors are orthogonal and form a basis. For an *anti-Hermitian matrix* $A$ ($A^\dagger = -A$), the diagonal elements and the eigenvalues are all imaginary. For an *orthogonal matrix* $P$, $P^T P = I$.
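These Hermitian properties can be verified in numpy (toy matrix of my own):

```python
import numpy as np

# A small Hermitian matrix: equal to its conjugate transpose.
H = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
assert np.allclose(H, H.conj().T)

evals, evecs = np.linalg.eigh(H)       # eigh is specialized for Hermitian
assert np.allclose(evals.imag, 0.0)    # eigenvalues are real
# Eigenvectors are orthonormal and form a basis (evecs is unitary):
assert np.allclose(evecs.conj().T @ evecs, np.eye(2))
```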

**Chapter 10 – Matrix decomposition** In $A = LU$ decomposition, a matrix $A$ is expressed as the product of a lower-triangular matrix $L$ (zeros above the diagonal) and an upper-triangular matrix $U$ (zeros below). The matrix $L$ can be formed from $L = E_1^{-1} E_2^{-1} \cdots E_k^{-1}$,

where the $E_i$ are the simple row operations that take $A$ to $U$; $A$ must not be singular. SVD is singular value decomposition, $A = U \Sigma V^T$, useful when $A$ is singular or nearly singular. QR decomposition, $A = QR$, is for $A$ non-square with full column rank, where $R$ is a square upper-triangular matrix and $Q$ is an orthogonal matrix.
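numpy provides QR and SVD directly, so the decompositions can be checked on a non-square example (matrix is mine, and this uses numpy's routines rather than the hand construction of $L$ from row operations):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])   # 3x2, full column rank

# QR: Q has orthonormal columns, R is square upper triangular.
Q, R = np.linalg.qr(A)
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.allclose(R, np.triu(R))
assert np.allclose(Q @ R, A)

# SVD: A = U diag(s) V^T, well defined even when A is singular.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vt, A)
```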

### All the mathematics you missed

18 April 2009

I have to put in a plug for this book, All the Mathematics You Missed. It is a terse overview of general college math, placing subjects in context and relation and summarizing results. If you've taken a class in one of the subjects described, it is a good refresher; if not, it is a good introduction, with leads to more complete books. I like this book so much that when I thought I had lost it, I went and bought another. Now I have two for reference.

### Partial derivative notation

10 April 2009

There is another notation for partial derivatives that was new to me: $f_x = \frac{\partial f}{\partial x}$ and $f_y = \frac{\partial f}{\partial y}$. This also works for mixed partials, but note that the order of the subscripts is reversed: $f_{xy} = (f_x)_y = \frac{\partial^2 f}{\partial y \, \partial x}$.

### Bose condensate to fermion gas

5 April 2009

The energy and spatial distribution of a Fermi gas is governed by its temperature, unlike a Bose-Einstein condensate. However, it is possible for pairs of fermions (spin 1/2) to become bosonic (integer spin). What happens when a “pseudo” Bose condensate comprised of paired fermions has the pairing process suddenly decay, with a large number of fermions suddenly “appearing” in a (non-physically-allowed) constrained region of phase space?

I’ve since found that this is called a fermionic condensate, and perhaps my question can be more simply stated as, “what happens when a fermionic condensate undergoes a reverse BCS transition?”

Update – I read that there is a concept of “conservation of statistics” which may bear on this. I don’t understand it yet, however.

### San Francisco trip

5 April 2009

### Advanced calculus demystified

4 April 2009

Most of my time recently was spent completing a self-review of vector calculus using the text Advanced Calculus Demystified. This study refreshed my knowledge of the mechanics and added a couple of new tricks. I completed nearly 75 of the problems in about three weeks of study. (This book was particularly good because all the problems were not just answered but worked out in the back of the book.)

I learned a couple of meta-lessons in this.

- There is no substitute for doing the problems. I picked up on tics in my approach, and it was confidence-building to get the correct answers.
- Almost all mistakes came from hasty leaps in the math (or combining leaps), and almost all hasty leaps were dictated by *space on the paper*! (An emotional issue here: not wanting to waste paper, or taking shortcuts because “only sluggards have to write out the details”.)
- I actually (half) remember more than I thought – my judgment of my current ability was a mistake in perception.

A lot of these still condense down to the maxim I was holding close a year ago – **no more shortcuts**.