06-1350/Class Notes for Thursday October 19
Matrix exponentiation is harder than you think
You think you know matrix exponentiation? OK, here's a question to test your knowledge. Fix a natural number $n$. From here on, the word "matrix" means "an $n\times n$ matrix with rational entries". Let $E$ be a linear combination of words whose letters are either matrices or meaningless symbols of the form $e^{tA}$, where $A$ is a matrix and $t$ is a formal variable. Say that $E\equiv 0$ if $E$ becomes $0$ when it is interpreted as a sum of products of formal power series of matrices and polynomials in the formal variables $s,t,\dots$, where $e^{tA}$ is interpreted to mean $\sum_{k=0}^{\infty}\frac{t^kA^k}{k!}$. (Note that the number of formal parameters need not equal the number of matrices involved.)
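To make the formal interpretation concrete, here is a minimal sketch (my own illustration, not part of the notes; the function names, the single formal variable $t$, and the truncation degree are all my choices) that computes the degree-$N$ truncation of $e^{tA}$ for a rational matrix $A$, keeping exact rational arithmetic throughout:

```python
from fractions import Fraction

def mat_mul(X, Y):
    """Product of two square matrices with Fraction entries."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_scale(c, X):
    """Scale every entry of X by the rational number c."""
    return [[c * x for x in row] for row in X]

def identity(n):
    return [[Fraction(1) if i == j else Fraction(0) for j in range(n)]
            for i in range(n)]

def exp_series(A, N):
    """Coefficients [C_0, ..., C_N] with e^{tA} = sum_k C_k t^k + O(t^{N+1}),
    i.e. C_k = A^k / k!, computed exactly over the rationals."""
    coeffs = [identity(len(A))]
    for k in range(1, N + 1):
        coeffs.append(mat_scale(Fraction(1, k), mat_mul(coeffs[-1], A)))
    return coeffs

if __name__ == "__main__":
    A = [[Fraction(0), Fraction(1)], [Fraction(0), Fraction(0)]]  # nilpotent
    print(exp_series(A, 3))  # the series terminates: e^{tA} = I + tA exactly
```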
Thus for example, if the matrices $A$ and $B$ are known to commute, then $e^{tA}e^{tB}-e^{t(A+B)}\equiv 0$. In fact, it is even true that $e^{sA}e^{tB}-e^{sA+tB}\equiv 0$.
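One can at least test such identities order by order. The sketch below (again my own, using sympy and a commuting pair of my choosing, $B=2I+3A$ so that $AB=BA$) checks that every coefficient of $e^{tA}e^{tB}-e^{t(A+B)}$ up to degree $N$ vanishes; this only verifies the identity to the chosen order, it is not a decision procedure:

```python
from sympy import Matrix, symbols, eye, zeros, factorial

t = symbols('t')

def exp_truncated(A, N):
    """The degree-N truncation  sum_{k=0}^{N} t^k A^k / k!  of e^{tA}."""
    result, power = zeros(A.rows, A.rows), eye(A.rows)
    for k in range(N + 1):
        result += (t**k / factorial(k)) * power
        power = power * A
    return result

# A commuting pair of rational matrices: B = 2*I + 3*A, so AB = BA.
A = Matrix([[0, 1], [0, 0]])
B = Matrix([[2, 3], [0, 2]])

N = 8
diff = (exp_truncated(A, N) * exp_truncated(B, N) - exp_truncated(A + B, N)).expand()
print(all(diff[i, j].coeff(t, k) == 0            # True: every coefficient of t^k
          for i in range(2) for j in range(2)    # vanishes for k = 0, ..., N
          for k in range(N + 1)))
```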
Question. Provide a reasonable algorithm to decide whether or not $E\equiv 0$. In other words, provide a reasonable algorithm to decide whether a certain equality between rational matrices and their exponentials holds or not.
Note that an unreasonable algorithm for this question is readily available. Over the algebraic closure $\bar{\mathbb{Q}}$ of $\mathbb{Q}$, exponentiations such as $e^{tA}$ can be carried out explicitly using diagonalization (or, when $A$ is not diagonalizable, the Jordan form), and then matrices can be multiplied or added as required, and the question is settled. The reason this is unreasonable is that the algebraic closure of $\mathbb{Q}$ is easy to speak of but very hard to work with. Many of you must have seen computer programs that manipulate rational numbers; this is easy. Have you ever seen a computer program that manipulates arbitrary algebraic numbers?
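For a taste of what the unreasonable route involves even at size $2$, here is a sketch using sympy (my choice of tool, not something the notes mention): diagonalize $A$ exactly, exponentiate the diagonal, and conjugate back. Already here the eigenvalues are $\pm\sqrt{2}$, so the computation leaves the rationals and works in an algebraic extension of $\mathbb{Q}$:

```python
from sympy import Matrix, symbols, exp, diag, simplify

t = symbols('t')
A = Matrix([[0, 1], [2, 0]])        # rational, with irrational eigenvalues +/- sqrt(2)

P, D = A.diagonalize()              # exact diagonalization: A = P D P^{-1}
expD = diag(*[exp(t * D[i, i]) for i in range(D.rows)])   # exponentiate the diagonal
E = simplify(P * expD * P.inv())    # e^{tA} = P e^{tD} P^{-1}
print(E)                            # entries mix sqrt(2) with exp(+-sqrt(2)*t)
```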
And of course, I could have asked the question about matrices with entries in some ring that doesn't have an algebraic closure. The question still makes sense, but the approach with explicit exponentiation does not.
In reality I care about an analog of this question in which the underlying algebra isn't necessarily a matrix algebra. For any algebra $\mathcal{A}$ and any $a\in\mathcal{A}$, $e^{ta}$ makes sense within the algebra $\mathcal{A}[[t]]$ of formal power series in $t$ with coefficients in $\mathcal{A}$. Hence our question makes sense even if $\mathcal{A}$ isn't a matrix algebra.
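To illustrate that nothing matrix-specific is needed, here is a sketch (my own example; the notes don't single out any particular algebra) computing the coefficients of the truncated $e^{ta}$ in $\mathcal{A}[[t]]$, where $\mathcal{A}$ is the rational quaternion algebra, described only by its multiplication table:

```python
from fractions import Fraction

BASIS = "1ijk"
# Multiplication table of the quaternions: TABLE[x][y] = (sign, basis element).
TABLE = {
    "1": {"1": (1, "1"), "i": (1, "i"),  "j": (1, "j"),  "k": (1, "k")},
    "i": {"1": (1, "i"), "i": (-1, "1"), "j": (1, "k"),  "k": (-1, "j")},
    "j": {"1": (1, "j"), "i": (-1, "k"), "j": (-1, "1"), "k": (1, "i")},
    "k": {"1": (1, "k"), "i": (1, "j"),  "j": (-1, "i"), "k": (-1, "1")},
}

def mult(a, b):
    """Product of two quaternions, each given as a dict basis letter -> Fraction."""
    out = {e: Fraction(0) for e in BASIS}
    for x, cx in a.items():
        for y, cy in b.items():
            sign, e = TABLE[x][y]
            out[e] += sign * cx * cy
    return out

def scale(c, a):
    return {e: c * v for e, v in a.items()}

def exp_coeffs(a, N):
    """Coefficients C_0, ..., C_N (elements of the algebra) of
    e^{ta} = sum_k C_k t^k + O(t^{N+1}), i.e. C_k = a^k / k!."""
    one = {e: Fraction(1) if e == "1" else Fraction(0) for e in BASIS}
    coeffs = [one]
    for k in range(1, N + 1):
        coeffs.append(scale(Fraction(1, k), mult(coeffs[-1], a)))
    return coeffs

if __name__ == "__main__":
    a = {"1": Fraction(0), "i": Fraction(1), "j": Fraction(1), "k": Fraction(0)}
    for k, c in enumerate(exp_coeffs(a, 4)):
        print(k, c)
```

For $a=i+j$ one has $a^2=-2$, so every coefficient stays rational, exactly as in the matrix case; only the multiplication of $\mathcal{A}$ was used.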