Reputation: 3461
Suppose a change of basis given by the eigenvectors of the exchange matrix J (the anti-diagonal permutation matrix). Now suppose I generate a random matrix H such that H = H^T. For example, my H is:
[-4.339231145150657 -1.9513538676596298 0.022375597517700463 0.0
-1.9513538676596298 -1.292344753373925 1.3152010547965873 0.022375597517700463
0.022375597517700463 1.3152010547965873 -1.1096459842204194 4.229348916735498
0.0 0.022375597517700463 4.229348916735498 -3.79113483769014]
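For context, here is a minimal sketch of how such a setup can be reproduced; the construction of J and the symmetrization of H are my assumptions, since the question doesn't show that code (on Julia versions from that era, eig(J)[2] plays the role of eigen(J).vectors):

using LinearAlgebra   # needed on Julia >= 0.7; these names lived in Base before

n = 4
# Exchange matrix: ones on the anti-diagonal, zeros elsewhere.
J = [i + j == n + 1 ? 1.0 : 0.0 for i in 1:n, j in 1:n]

# J is symmetric, so its eigenvector matrix is orthogonal: evecJ' * evecJ ≈ I.
evecJ = eigen(J).vectors

# One common way to build a random symmetric matrix (an assumption on my part).
A = randn(n, n)
H = A + A'            # guarantees H == H'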
Then, if I want to change to the "J" basis, I have to apply the linear transformation evecJ^T*H*evecJ (where evecJ is the eigenvector matrix of J). The inverse transformation should then bring me back to the original value of H. However, this is not the case: I get different results if I do
*(evecJ', H, evecJ)
*(evecJ, H, evecJ')
compared to when I do
evecJT = evecJ'
*(evecJT, H, evecJ)
*(evecJ, H, evecJT)
Is there some reason why these two approaches behave differently?
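For concreteness, here is the full round-trip comparison as a sketch, reusing the hypothetical evecJ and H from above; copy materializes the transpose the way a bare ' did on pre-0.7 Julia:

Hj1 = *(evecJ', H, evecJ)          # transform with the transpose written inline
Hback1 = *(evecJ, Hj1, evecJ')     # inverse transform

evecJT = copy(evecJ')              # transpose materialized into its own binding
Hj2 = *(evecJT, H, evecJ)
Hback2 = *(evecJ, Hj2, evecJT)

Hback1 == H          # typically false: bitwise equality is too strict here
Hback1 ≈ H           # true: equal up to floating-point roundoff
Hback1 == Hback2     # may be false if the two spellings take different code paths
Hback1 ≈ Hback2      # true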
Upvotes: 0
Views: 104
Reputation: 7874
It's hard to know without more information, but my guess is that it's a combination of two factors:
1. One of the few cases where the semantics of Julia don't quite follow from the syntax is multiplication or division with transposed arguments, i.e. operations of the form A' * B: Julia will try to replace these with functions such as Ac_mul_B in order to call the appropriate BLAS routines while avoiding intermediate allocations. However, it can't do this if the transpose occurs elsewhere. In other words, J' * H and JT = J'; JT * H will actually call different functions internally.
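On Julia 0.6 and earlier you can watch this rewrite happen by lowering the expression yourself (the exact output is version dependent; A and B here are placeholder names):

expand(:(A' * B))    # Julia <= 0.6: returns :(Ac_mul_B(A, B)) -- the fused form
expand(:(AT * B))    # stays an ordinary call to *(AT, B) -- no rewrite possible

From Julia 0.7 on, the Ac_mul_B family is gone: A' builds a lazy Adjoint wrapper, both spellings dispatch to the same * method, and this particular discrepancy disappears.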
2. Matrix-matrix multiplication involves a lot of intermediate rounding of floating-point numbers, so the exact result depends on the order in which the operations are performed. What seems likely here is that the orders used in the non-transposed and transposed cases are actually different.
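A quick illustration of the non-associativity involved, with plain scalars rather than matrices:

x = (0.1 + 0.2) + 0.3    # 0.6000000000000001
y = 0.1 + (0.2 + 0.3)    # 0.6
x == y                   # false: the two summation orders round differently
x ≈ y                    # true: isapprox is the right comparison for such results

This is why the two multiplication orders can return matrices that differ in their last bits while both being correct to machine precision.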
Upvotes: 1