Wednesday, February 12, 2014

What is dreamed and math that should never be (part 2)

Dear Diary,

It just so happens that my two best friends from high school have math Ph.D.s.  One of them responded with this:

"I started thinking about what you wrote, and it was pretty interesting, so I couldn't resist...

Your strategy for finding eigenvalues works, but you have to be careful with the intuition that 2 equations in 2 unknowns will give you a solution. Let's look at the 2x2 case:
xy = d (the determinant)
x + y = t (the trace)

This gives y = t - x so x(t-x) = d, and this is a quadratic in x: you should expect 2 solutions, and you won't know which one is right. But here's a tricky observation: the two equations are symmetric, meaning that if you swap the variables x and y, you get the same equations. So if (x = a, y = b) is a solution, so is (x = b, y = a) - the two solutions you get are actually the same pair of eigenvalues.
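This trace-and-determinant recipe is easy to try in a few lines of Python - a minimal sketch, assuming a matrix with real eigenvalues (the symmetric matrix below is a made-up example, not from the exchange):

```python
import math

def eigs_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the trace/determinant trick.

    x + y = t and x*y = det give the quadratic x**2 - t*x + det = 0,
    whose two roots are the eigenvalue pair (in either order).
    """
    t = a + d                # trace
    det = a * d - b * c      # determinant
    disc = t * t - 4 * det   # assumed nonnegative (e.g. symmetric matrix)
    root = math.sqrt(disc)
    return (t + root) / 2, (t - root) / 2

# The made-up matrix [[2, 1], [1, 2]] has eigenvalues 3 and 1.
x, y = eigs_2x2(2, 1, 1, 2)
print(x, y)  # -> 3.0 1.0
```

Swapping the two roots gives the other of Bezout's two solutions - the same eigenvalue pair with the names exchanged.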

Now you'll have to forgive me for taking a mathematical tangent... The fact that you get 2 solutions here instead of 1 is an example of Bezout's theorem, which says roughly that if you have two curves in the plane, given by polynomial equations of degrees d_1 and d_2 (the degree is the largest total degree of any one monomial - the sum of its exponents - so xy has degree 2), you should expect them to intersect (d_1)(d_2) times. This generalizes to higher dimensions: if you have n hypersurfaces in n-space (a hypersurface is the solution set of a polynomial equation in n variables), you should expect them to intersect a number of times equal to the product of their degrees.

Now this might not happen: they might not intersect at all (this corresponds to their intersection being "at infinity" - think of parallel lines), or they might intersect fewer times because some intersection points have "higher multiplicity" (think of a parabola that just brushes the x-axis at the origin - this intersection point has "multiplicity 2"). But there's a precise way of asserting that "these situations almost never happen".
This explains something special about linear algebra: a generic system of n linear equations in n unknowns has exactly one solution - all the equations have degree 1, so Bezout's theorem tells us to multiply a bunch of 1s together. (The exceptions are the degenerate systems: the solution escapes "at infinity", as with parallel lines, or the equations overlap and there are infinitely many solutions.)
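The degree-1 case can be sketched with Cramer's rule on a made-up pair of lines; the determinant test below is exactly the check for the degenerate "at infinity" situation:

```python
def solve_2x2(a, b, c, d, e, f):
    """Solve a*x + b*y = e, c*x + d*y = f by Cramer's rule.

    Two degree-1 curves (lines) should meet 1 * 1 = 1 time, and they do
    whenever the determinant a*d - b*c is nonzero.
    """
    det = a * d - b * c
    if det == 0:
        raise ValueError("parallel or overlapping lines: no unique solution")
    return (e * d - b * f) / det, (a * f - e * c) / det

# Made-up example: x + y = 3 and x - y = 1 meet exactly once, at (2, 1).
print(solve_2x2(1, 1, 1, -1, 3, 1))  # -> (2.0, 1.0)
```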

Now let's look at the 3x3 case. You have 3 equations:

xyz = d (the determinant) - degree 3
x + y + z = t (the trace) - degree 1
1/x + 1/y + 1/z = s (the trace of the inverse) - not a polynomial

The last equation isn't a polynomial, but we can replace it with yz + xz + xy = sd (which has degree 2) by multiplying by xyz = d without losing any information, as long as d is nonzero.
This gives 3 equations in 3 variables of degrees 1, 2, and 3. Bezout's theorem says that for almost all d, t, and s, there will be 6 solutions to this system of equations. So your strategy cuts down the number of possible eigenvalue triples to 6... but now, miraculously, we can apply the same observation as in the 2x2 case: your three equations are symmetric under all permutations of the three variables, and there are 6 of these. So the 6 solutions we get are really all the same solution, just with their names permuted.
To check this, I selected 3 random values of d, t, and s, and asked Wolfram Alpha to solve the equations:

Yep, 6 solutions! And (up to some very small error that Alpha gets from solving numerically) they're all the same set of 3 eigenvalues. Phew, math works.
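The same check can be scripted without Alpha - a small sketch that starts from a made-up eigenvalue triple, derives d, t, and s from it, and confirms that all 6 permutations solve the system:

```python
from itertools import permutations

# Made-up eigenvalue triple; d, t, s are derived from it.
x, y, z = 1.0, 2.0, 4.0
d = x * y * z               # determinant
t = x + y + z               # trace
s = 1 / x + 1 / y + 1 / z   # trace of the inverse

def satisfies(triple):
    """Check that the triple solves all three equations (up to rounding)."""
    a, b, c = triple
    return (abs(a * b * c - d) < 1e-9
            and abs(a + b + c - t) < 1e-9
            and abs(b * c + a * c + a * b - s * d) < 1e-9)

# All 6 permutations of (x, y, z) solve the system: the "6 solutions"
# Bezout predicts are one eigenvalue set with the names shuffled.
print(all(satisfies(p) for p in permutations((x, y, z))))  # -> True
```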

What if d = 0? Then we have a problem, because we don't have 3 equations: the matrix has no inverse, so we can't take the trace of its inverse."
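To see the d = 0 breakdown concretely, here is a small sketch with a made-up singular matrix: its rows are linearly dependent, so the determinant vanishes, one eigenvalue is 0, and the trace-of-the-inverse equation is simply unavailable.

```python
def det_3x3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along row 0."""
    (a, b, c), (p, q, r), (u, v, w) = m
    return a * (q * w - r * v) - b * (p * w - r * u) + c * (p * v - q * u)

# Made-up rank-deficient matrix: row2 = 2*row1 - row0, so d = 0 and
# 1/x + 1/y + 1/z is undefined (one eigenvalue is 0).
m = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
print(det_3x3(m))  # -> 0
```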

I hadn't thought about what happens when the determinant is zero.  You're right!  What an insight! See, Diary, math is cool!
