Jay-qu Posted March 3, 2007

So the semester has started and with it new subjects :) Only a week in and I have found something I cannot do. We have been asked to prove that for any square matrix A:

i) A + A^T is symmetric
ii) The product of two upper triangular matrices is also upper triangular.

The first can apparently be done via a "capital letter" proof: since the definition of a symmetric matrix is that it equals its transpose, to complete the proof you show that

[math]A+A^T = (A+A^T)^T[/math]

I can see that this works (in my head), but I can't knead out the capital letter proof.

The second can't be done via this method; an element-by-element proof in terms of [math]A_{ij}[/math] is needed. It's a little more tedious, and I'm not sure how to represent matrices with LaTeX (if it indeed can be done), but it's easy enough to see how it works.

Another cool piece of matrix math: when you multiply a matrix A by an elementary matrix (an n x n matrix that can be derived from the identity matrix via just one row operation), it is equivalent to performing that row operation on A! Try it ;)

So does anyone else have some interesting matrix proofs, or would someone like to try and solve i) from above?

J
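A quick numpy sketch of that last elementary-matrix observation (the matrix and the row operation here are made up purely for illustration; a numerical check is of course not a proof):

[code]
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])

# Elementary matrix: the 3x3 identity with one row operation applied,
# here "add 2 times row 0 to row 2".
E = np.eye(3)
E[2, 0] = 2.0

# Left-multiplying by E performs the same row operation on A.
left = E @ A

by_hand = A.copy()
by_hand[2, :] += 2.0 * by_hand[0, :]

print(np.allclose(left, by_hand))  # True
[/code]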
Erasmus00 Posted March 3, 2007

i) is easy enough.

[math](A+A^{T})^{T}=A^T+(A^T)^T[/math]

This follows from the distributive nature of the transpose operation.

[math] = A^T + A [/math]

because two transposes get you back to where you started. Hence the transpose is equal to the original, and hence we have a symmetric matrix.

Perhaps it is easier to see in index notation: [math]A+A^T[/math] in index form is [math]A_{ij}+A_{ji}[/math]. This is obviously symmetric; transposing swaps i and j, which clearly leaves it unchanged.

-Will
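A two-line numpy sanity check of the same fact (the matrix is random, so this only illustrates the identity rather than proving it):

[code]
import numpy as np

A = np.random.rand(4, 4)  # any square matrix will do
S = A + A.T

# S equals its own transpose, i.e. S is symmetric.
print(np.allclose(S, S.T))  # True
[/code]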
Jay-qu Posted March 3, 2007

Cool, I was unaware of the distributive nature of the transpose operation :) Is there a proof for this?
Erasmus00 Posted March 3, 2007

Cool, I was unaware of the distributive nature of the transpose operation :) Is there a proof for this?

It's implicit in the definition of the transpose. Imagine transposing A + B:

[math] A_{ij}+B_{ij} \to A_{ji}+B_{ji} [/math]

This is clearly A transpose + B transpose.

-Will
sanctus Posted March 3, 2007

For ii), can't you do it by induction? For n = 1 it is trivial, and for n = 2 as well. Supposing it is true for an n x n matrix, we need to show it is true as well for an (n+1) x (n+1) matrix. Writing the product as

[math]C_{ik}=\sum_{j=1}^n A_{ij}B_{jk}+A_{i,n+1}B_{n+1,k}[/math]

by the induction hypothesis we know that the sum over j = 1, ..., n gives an upper triangular contribution. Now if k is different from n+1 you always have [math]B_{n+1,k}=0[/math], so we only have to study [math]A_{i,n+1}B_{n+1,n+1}[/math]. For any i this gives a term in the ith row and (n+1)th column, ergo the (n+1) x (n+1) matrix resulting from the multiplication of two upper triangular (n+1) x (n+1) matrices is also upper triangular. QED
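A quick numerical spot-check of ii) with numpy (random matrices, so again illustration rather than proof):

[code]
import numpy as np

# np.triu zeroes everything below the main diagonal, giving
# two random upper triangular matrices.
A = np.triu(np.random.rand(5, 5))
B = np.triu(np.random.rand(5, 5))

C = A @ B
# C is upper triangular iff zeroing its sub-diagonal entries changes nothing.
print(np.allclose(C, np.triu(C)))  # True
[/code]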
Qfwfq Posted March 3, 2007

Induction isn't necessary. I was going to give him a hint, but then realized he was only musing about the impossibility of a "capital letter" proof, which I find quite obvious.
Qfwfq Posted March 5, 2007

I meant to suggest reasoning on the orthogonality of the canonical basis, in working out the X in:

[math]\forall (i, j): i > j \Rightarrow A_{ij}=0 \wedge B_{ij}=0[/math]

[math]\Downarrow[/math]

[math]\forall (i, j): i > j \Rightarrow (AB)_{ij} = X (= 0)[/math]

But I s'pose Jay had already got it.
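For completeness, one way to work out the X (with the usual convention that i indexes rows and j indexes columns):

[math](AB)_{ij}=\sum_k A_{ik}B_{kj}[/math]

For i > j, split the sum: every term with k < i has [math]A_{ik}=0[/math] (that entry sits below the diagonal of A), and every term with k \ge i satisfies k \ge i > j, so [math]B_{kj}=0[/math]. Every term vanishes, hence X = 0 and the product is upper triangular.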