2019 Academic Year, Semester 1

Engineering Mathematics 1, Department of Semiconductor Engineering

Main text: Erwin Kreyszig, Advanced Engineering Mathematics, 10th Edition

Supplementary text: Sang-Gu Lee and 4 others, Contemporary Engineering Mathematics I, 1st Edition

Class hours: Engineering Mathematics 1, Tue 09:00-10:15, Thu 10:30-11:45

Instructor: Dr. Eung-Ki Kim

Week 15


8.3: Symmetric, Skew-Symmetric, and Orthogonal Matrices

1.7 Similarity, Diagonalization of Matrices, Quadratic Forms

8.3 Symmetric, Skew-Symmetric, and Orthogonal Matrices

Definition  Symmetric, Skew-Symmetric, and Orthogonal Matrices

A real square matrix A is called

symmetric if transposition leaves it unchanged,

thus

A^T = A,

skew-symmetric if transposition gives the negative of A,

thus

A^T = -A,

orthogonal if transposition gives the inverse of A,

thus

A^T = A^(-1).

Example 1  Symmetric, Skew-Symmetric, and Orthogonal Matrices

A =
[-3  1  5]
[ 1  0 -2]
[ 5 -2  4]

is a symmetric matrix.

B =
[  0   9 -12]
[ -9   0  20]
[ 12 -20   0]

is a skew-symmetric matrix.

C =
[ 2/3  1/3  2/3]
[-2/3  2/3  1/3]
[ 1/3  2/3 -2/3]

is an orthogonal matrix.

Sage Coding

A = matrix([[-3, 1, 5], [1, 0, -2], [5, -2, 4]])
print A == A.transpose()
print A.is_symmetric()
B = matrix([[0, 9, -12], [-9, 0, 20], [12, -20, 0]])
print -B == B.transpose()
print B.is_skew_symmetric()
C = matrix([[2/3, 1/3, 2/3], [-2/3, 2/3, 1/3], [1/3, 2/3, -2/3]])
print C.transpose() == C.inverse()

True

True

True

True

True

Any real square matrix A may be written as the sum of a symmetric matrix R and a skew-symmetric matrix S:

A = R + S, where R = (1/2)(A + A^T) is symmetric and S = (1/2)(A - A^T) is skew-symmetric.

Example 2  Symmetric and Skew-Symmetric Parts

Decompose

A =
[9  5  2]
[2  3 -8]
[5  4  3]

into its symmetric part R and its skew-symmetric part S.

Sage Coding

A = matrix([[9, 5, 2], [2, 3, -8], [5, 4, 3]])
R = (A + A.transpose())/2
S = (A - A.transpose())/2
print "symmetric part of A ="
print R
print "skew symmetric part of A ="
print S

symmetric part of A =

[  9 7/2 7/2]

[7/2   3  -2]

[7/2  -2   3]

skew symmetric part of A =

[   0  3/2 -3/2]

[-3/2    0   -6]

[ 3/2    6    0]
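The decomposition can also be double-checked outside Sage. The following NumPy sketch (an added illustration, not part of the original lab) verifies that R is symmetric, S is skew-symmetric, and that the two parts sum back to the matrix A of Example 2.

```python
import numpy as np

# Matrix A from Example 2, entered as floats for NumPy.
A = np.array([[9.0, 5.0, 2.0],
              [2.0, 3.0, -8.0],
              [5.0, 4.0, 3.0]])
R = (A + A.T) / 2   # symmetric part
S = (A - A.T) / 2   # skew-symmetric part

print(np.allclose(R, R.T))    # True: R equals its transpose
print(np.allclose(S, -S.T))   # True: S equals minus its transpose
print(np.allclose(R + S, A))  # True: the two parts sum back to A
```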

Theorem 1  Eigenvalues of Symmetric and Skew-Symmetric Matrices

(a) The eigenvalues of a symmetric matrix are real.

(b) The eigenvalues of a skew-symmetric matrix are pure imaginary or zero.

Example 3  Eigenvalues of Symmetric and Skew-Symmetric Matrices

The following matrix is symmetric and has real eigenvalues:

A =
[5 3]
[3 5]

Solution

Find the eigenvalues of the matrix A.

The characteristic equation is det(A - λI) = (5 - λ)^2 - 9 = λ^2 - 10λ + 16 = 0.

The solutions of this quadratic equation are λ1 = 8, λ2 = 2.

Sage Coding

A = matrix([[5, 3], [3, 5]])
A.eigenvalues()

[8, 2]

Find the eigenvalues of the matrix

A =
[-5  2]
[ 2 -2]

The characteristic equation is det(A - λI) = (-5 - λ)(-2 - λ) - 4 = λ^2 + 7λ + 6 = 0.

The solutions of this quadratic equation are λ1 = -1, λ2 = -6.

Sage Coding

A = matrix([[-5, 2], [2, -2]])
A.eigenvalues()

[-1, -6]

The following skew-symmetric matrix has the eigenvalues 0, 25i, and -25i:

B =
[  0   9 -12]
[ -9   0  20]
[ 12 -20   0]

Sage Coding

A = matrix([[0, 9, -12], [-9, 0, 20], [12, -20, 0]])
A.eigenvalues()

[0, -25*I, 25*I]

The following matrix has the real eigenvalues 5 and 1 but is not symmetric, so the converse of Theorem 1(a) does not hold:

A =
[3 4]
[1 3]

Solution

The characteristic equation is det(A - λI) = (3 - λ)^2 - 4 = λ^2 - 6λ + 5 = 0.

The solutions of this quadratic equation are λ1 = 5, λ2 = 1.

Sage Coding

A = matrix([[3, 4], [1, 3]])
A.eigenvalues()

[5, 1]

Orthogonal transformations and Orthogonal Matrices

Orthogonal transformations are transformations

y = Ax,

where A is an orthogonal matrix.

With each vector x in R^n such a transformation assigns a vector y in R^n.

The plane rotation through an angle θ,

y = Ax with A =
[cos θ  -sin θ]
[sin θ   cos θ],

is an orthogonal transformation.

It can be shown that any orthogonal transformation in the plane or in three-dimensional space is a rotation (possibly combined with a reflection in a straight line or a plane, respectively).

Theorem 2  Invariance of Inner Product

An orthogonal transformation preserves the value of the inner product of vectors a and b in R^n, defined by

a · b = a^T b.

That is, for any a and b in R^n, orthogonal n × n matrix A, and u = Aa, v = Ab, we have

u · v = a · b.

Hence the transformation also preserves the length or norm of any vector a, given by

||a|| = √(a · a) = √(a^T a).

Proof

Let A be orthogonal.

Let u = Aa and v = Ab.

We must show that u · v = a · b.

Now u · v = u^T v = (Aa)^T (Ab) = a^T A^T A b, and A^T A = A^(-1) A = I since A is orthogonal.

Hence u · v = a^T I b = a^T b = a · b.

From this the invariance of ||a|| follows if we set b = a.
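As a numerical companion to the proof, the following NumPy sketch (an addition; the angle and the vectors are arbitrary choices) checks that a plane rotation, which is orthogonal, leaves inner products and norms unchanged.

```python
import numpy as np

# A plane rotation matrix is orthogonal; t and the vectors a, b are arbitrary.
t = 0.7
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
a = np.array([3.0, -1.0])
b = np.array([2.0, 5.0])
u = A @ a
v = A @ b

print(np.isclose(u @ v, a @ b))                          # True: u.v = a.b
print(np.isclose(np.linalg.norm(u), np.linalg.norm(a)))  # True: ||u|| = ||a||
```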

Theorem 3  Orthonormality of Column and row Vectors

A real square matrix is orthogonal if and only if its column vectors a1, …, an (and also its row vectors) form an orthonormal system, that is,

aj · ak = aj^T ak = 0 if j ≠ k,  and  aj · aj = aj^T aj = 1.
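Theorem 3 can be checked numerically for the orthogonal matrix C of Example 1. The NumPy sketch below (an added illustration) confirms that the columns are pairwise orthogonal unit vectors, which is exactly the statement C^T C = I.

```python
import numpy as np

# The orthogonal matrix C of Example 1 (entries are multiples of 1/3).
C = np.array([[ 2.0,  1.0,  2.0],
              [-2.0,  2.0,  1.0],
              [ 1.0,  2.0, -2.0]]) / 3.0

# Orthonormal columns is exactly the statement C^T C = I.
print(np.allclose(C.T @ C, np.eye(3)))  # True

# The same fact, column pair by column pair.
for j in range(3):
    for k in range(3):
        dot = C[:, j] @ C[:, k]
        assert np.isclose(dot, 1.0 if j == k else 0.0)
```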

Theorem 4  Determinant of an orthogonal matrix

The determinant of an orthogonal matrix has the value +1 or -1.

Proof

From det(AB) = det A det B and det(A^T) = det A,

we get for an orthogonal matrix A

1 = det I = det(A A^(-1)) = det(A A^T) = det A det(A^T) = (det A)^2,

hence det A = +1 or -1.

Example 4  Illustration of Theorems 3 and 4

The orthogonal matrix

C =
[ 2/3  1/3  2/3]
[-2/3  2/3  1/3]
[ 1/3  2/3 -2/3]

of Example 1 has determinant -1, and the rotation matrix

B =
[cos t  -sin t]
[sin t   cos t]

has determinant cos^2 t + sin^2 t = 1.

Sage Coding

var('t')
A = matrix([[2/3, 1/3, 2/3], [-2/3, 2/3, 1/3], [1/3, 2/3, -2/3]])
print A.det()
B = matrix([[cos(t), -sin(t)], [sin(t), cos(t)]])
print B.det().simplify_full()

-1

1

Theorem 5  Eigenvalues of an orthogonal matrix

The eigenvalues of an orthogonal matrix A are real or complex conjugates in pairs and have absolute value 1.

Example 5  Eigenvalues of an orthogonal matrix

The orthogonal matrix C of Example 1 has the characteristic equation

λ^3 - (2/3)λ^2 - (2/3)λ + 1 = 0.

Any real eigenvalue must be +1 or -1. Trying, we find λ = -1 is a root. Division by (λ + 1) gives λ^2 - (5/3)λ + 1 = 0 and the two eigenvalues (5 + i√11)/6 and (5 - i√11)/6, which have absolute value 1.

Sage Coding

A = matrix([[2/3, 1/3, 2/3], [-2/3, 2/3, 1/3], [1/3, 2/3, -2/3]])
print A.charpoly()
print solve(x^3 - 2/3*x^2 - 2/3*x + 1, x)
print abs(-1/6*I*sqrt(11) + 5/6).simplify_full()
print abs(1/6*I*sqrt(11) + 5/6).simplify_full()

x^3 - 2/3*x^2 - 2/3*x + 1

[x == -1/6*I*sqrt(11) + 5/6, x == 1/6*I*sqrt(11) + 5/6, x == -1]

1

1

General properties of eigenvectors

The eigenvectors of an n × n matrix A may form a basis for R^n. If we are interested in a transformation y = Ax, such an "eigenbasis" (basis of eigenvectors), if it exists, is of great advantage.

In that case we can represent any x in R^n uniquely as a linear combination of the eigenvectors x1, x2, …, xn:

x = c1 x1 + c2 x2 + … + cn xn.

Denoting the corresponding eigenvalues of the matrix A by λ1, λ2, …, λn, we obtain

(1)  y = Ax = A(c1 x1 + … + cn xn) = c1 λ1 x1 + c2 λ2 x2 + … + cn λn xn.
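Formula (1) says that in an eigenbasis the map x → Ax simply scales each coefficient by the corresponding eigenvalue. A small NumPy sketch (added here; the symmetric matrix from the examples above is reused) makes this concrete:

```python
import numpy as np

A = np.array([[5.0, 3.0],
              [3.0, 5.0]])
lam, X = np.linalg.eigh(A)   # columns of X are orthonormal eigenvectors
x = np.array([2.0, -1.0])    # an arbitrary vector

c = np.linalg.solve(X, x)    # coefficients of x in the eigenbasis
y = X @ (lam * c)            # c1*lam1*x1 + c2*lam2*x2, as in (1)

print(np.allclose(y, A @ x))  # True: (1) reproduces A x
```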

Theorem 1  Basis of Eigenvectors

If an n × n matrix A has n distinct eigenvalues, then A has a basis of eigenvectors x1, x2, …, xn for R^n.

Example 1  Eigenbases. Nondistinct Eigenvalues. Nonexistence

Find the eigenvalues and eigenvectors of the matrix

A =
[5 3]
[3 5]

Solution

Eigenvalues

The characteristic equation is det(A - λI) = (5 - λ)^2 - 9 = λ^2 - 10λ + 16 = 0.

The roots (eigenvalues of A) are λ1 = 8, λ2 = 2.

Eigenvector of A corresponding to λ1 = 8.

The first equation of (A - 8I)x = 0 is -3x1 + 3x2 = 0, whose solution is x2 = x1.

This determines an eigenvector corresponding to λ1 = 8 up to a scalar multiple.

We choose x1 = 1, so x2 = 1.

The eigenvector is

x1 = (1, 1)^T.

Eigenvector of A corresponding to λ2 = 2.

The first equation of (A - 2I)x = 0 is 3x1 + 3x2 = 0, whose solution is x2 = -x1.

This determines an eigenvector corresponding to λ2 = 2 up to a scalar multiple.

We choose x1 = 1, so x2 = -1.

The eigenvector is

x2 = (1, -1)^T.

Sage Coding

A = matrix([[5, 3], [3, 5]])
A.eigenvectors_right()


[(8, [(1, 1)], 1), (2, [(1, -1)], 1)]

Even if not all eigenvalues are distinct, a matrix A may still provide an eigenbasis for R^n.

On the other hand, A may not have enough linearly independent eigenvectors to make up a basis. For instance,

A =
[k 1]
[0 k]

has the characteristic equation (λ - k)^2 = 0, the double eigenvalue λ = k, and only one linearly independent eigenvector (x1, 0)^T (x1 arbitrary).
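The failure can be seen numerically. The NumPy sketch below uses a hypothetical defective matrix (chosen here purely for illustration): the eigenvalue 3 is repeated, but its eigenspace is only one-dimensional, so no eigenbasis of R^2 exists.

```python
import numpy as np

# A hypothetical defective matrix: double eigenvalue, one-dimensional eigenspace.
A = np.array([[3.0, 1.0],
              [0.0, 3.0]])
lam = np.linalg.eigvals(A)
print(np.allclose(np.sort(lam), [3.0, 3.0]))  # True: eigenvalue 3 is repeated

# dim of the eigenspace = 2 - rank(A - 3I) = 1: only one independent eigenvector.
rank = np.linalg.matrix_rank(A - 3.0 * np.eye(2))
print(2 - rank)  # 1
```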

Theorem 2  Symmetric Matrices

A symmetric matrix has an orthonormal basis of eigenvectors for R^n.

Example 2  Orthonormal Basis of Eigenvectors

The matrix

A =
[5 3]
[3 5]

of Example 1 is symmetric, and an orthonormal basis of eigenvectors is

(1/√2)(1, 1)^T,  (1/√2)(1, -1)^T.

Sage Coding

A = matrix(QQbar, [[5, 3], [3, 5]])   # gram_schmidt(orthonormal=True) is not available over QQ
P = A.eigenmatrix_right()[1]   # matrix whose columns are eigenvectors
print P
G, M = P.transpose().gram_schmidt(orthonormal=True)   # gram_schmidt works on row vectors
print G.transpose()


[ 1  1]

[ 1 -1]

[ 0.7071067811865475?  0.7071067811865475?]

[ 0.7071067811865475? -0.7071067811865475?]

Similarity of Matrices, Diagonalization

Definition  Similar Matrices. Similarity Transformation

An n × n matrix Â is similar to an n × n matrix A if

Â = P^(-1) A P

for some (nonsingular) n × n matrix P.

This transformation, which gives Â from A, is called a similarity transformation.

Theorem 3  Eigenvalues and Eigenvectors of Similar Matrices

If Â is similar to A, then Â has the same eigenvalues as A. Furthermore, if x is an eigenvector of A, then y = P^(-1)x is an eigenvector of Â corresponding to the same eigenvalue.

Proof

From Ax = λx (λ an eigenvalue, x ≠ 0) we get P^(-1)Ax = λP^(-1)x.

Now I = P P^(-1).

By this "identity trick" the previous equation gives

P^(-1)Ax = P^(-1)A P P^(-1)x = (P^(-1)AP) P^(-1)x = Â (P^(-1)x) = λ P^(-1)x.

Hence λ is an eigenvalue of Â and P^(-1)x a corresponding eigenvector.

Indeed, P^(-1)x = 0 would give x = Ix = P P^(-1)x = P0 = 0, contradicting x ≠ 0.
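The theorem is easy to test numerically. In this added NumPy sketch, A is the matrix used in the Sage cell of the next example, and P is a sample invertible matrix (an assumption made here for illustration; any nonsingular P works): the similar matrix P^(-1)AP has the same eigenvalues as A, and P^(-1)x is one of its eigenvectors.

```python
import numpy as np

A = np.array([[6.0, -3.0],
              [4.0, -1.0]])
P = np.array([[1.0, 3.0],
              [1.0, 4.0]])   # sample invertible matrix (det P = 1)
P_inv = np.linalg.inv(P)
A_hat = P_inv @ A @ P

ev_A = np.sort(np.linalg.eigvals(A))
ev_H = np.sort(np.linalg.eigvals(A_hat))
print(np.allclose(ev_A, ev_H))  # True: same eigenvalues

x = np.array([1.0, 1.0])        # eigenvector of A for lambda = 3 (A x = 3 x)
y = P_inv @ x
print(np.allclose(A_hat @ y, 3 * y))  # True: y is an eigenvector of A_hat
```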

Example 3  Eigenvalues and Eigenvectors of Similar Matrices

Let

A =
[6 -3]
[4 -1]

and

P =
[1 3]
[1 4].

Then

Â = P^(-1)AP =
[ 4 -3] [6 -3] [1 3]   [3 0]
[-1  1] [4 -1] [1 4] = [0 2].

Here P^(-1) was obtained using det P = 1. We see that Â has the eigenvalues λ1 = 3, λ2 = 2.

The characteristic equation of A is (6 - λ)(-1 - λ) + 12 = λ^2 - 5λ + 6 = 0.

Its roots (the eigenvalues of A) are λ1 = 3, λ2 = 2, the same as those of Â.

From the first component of (A - λI)x = 0 we have (6 - λ)x1 = 3x2.

For λ1 = 3 this gives 3x1 = 3x2, say, x1 = (1, 1)^T.

For λ2 = 2 this gives 4x1 = 3x2, say, x2 = (3, 4)^T.

We have

y1 = P^(-1)x1 = (1, 0)^T,  y2 = P^(-1)x2 = (0, 1)^T.

These are eigenvectors of the diagonal matrix Â. We see that x1 and x2 are the columns of P. This suggests the general method of transforming a matrix A to diagonal form by using P = X, the matrix with eigenvectors as columns.

Sage Coding

A = matrix([[6, -3], [4, -1]])
A.eigenmatrix_right()


(
[3 0]  [  1   1]
[0 2], [  1 4/3]
)

Theorem 4  Diagonalization of a Matrix

If an n × n matrix A has a basis of eigenvectors, then

(5)  D = X^(-1) A X

is diagonal, with the eigenvalues of A as the entries on the main diagonal. Here X is the matrix with these eigenvectors as column vectors. Also,

(5*)  D^m = X^(-1) A^m X  (m = 2, 3, …).

Example 4  Diagonalization

Diagonalize

A =
[  7.3  0.2 -3.7]
[-11.5  1.0  5.5]
[ 17.7  1.8 -9.3].

Solution

The characteristic equation is det(A - λI) = -λ^3 - λ^2 + 12λ = 0.

The roots (eigenvalues of A) are λ1 = 3, λ2 = 0, λ3 = -4.

By Gauss elimination applied to (A - λI)x = 0 with λ = λ1, λ2, λ3,

we find the eigenvectors, and then X^(-1) by Gauss-Jordan elimination.

The results are

x1 = (1, -3, 1)^T,  x2 = (1, 1/2, 2)^T,  x3 = (1, -1, 3)^T,

X =
[ 1   1  1]
[-3 1/2 -1]
[ 1   2  3].

Calculating AX and multiplying by X^(-1) from the left, we thus obtain

D = X^(-1)AX =
[3 0  0]
[0 0  0]
[0 0 -4].

Sage Coding

A = matrix(QQ, [[7.3, 0.2, -3.7], [-11.5, 1.0, 5.5], [17.7, 1.8, -9.3]])
print A.is_diagonalizable()
A.eigenvectors_right()


True

[(3, [(1, -3, 1)], 1), (0, [(1, 1/2, 2)], 1), (-4, [(1, -1, 3)], 1)]

P = matrix([[1, -3, 1], [1, 1/2, 2], [1, -1, 3]]).transpose()
P.inverse()*A*P


[ 3  0  0]

[ 0  0  0]

[ 0  0 -4]

Quadratic Forms. Transformation to Principal Axes

By definition, a quadratic form Q in the components x1, …, xn of a vector x is a sum of n^2 terms, namely

(7)  Q = x^T A x = Σ_{j=1}^n Σ_{k=1}^n a_jk x_j x_k.

A = (a_jk) is called the coefficient matrix of the form. We may assume that A is symmetric,

because we can take the off-diagonal terms together in pairs and write the result as a sum of two equal terms.

Example 5  Quadratic Form. Symmetric Coefficient Matrix

Let

Q = x^T A x with A =
[3 4]
[6 2].

Then Q = 3x1^2 + 4x1x2 + 6x2x1 + 2x2^2 = 3x1^2 + 10x1x2 + 2x2^2.

Here 4 + 6 = 10 = 5 + 5. From the corresponding symmetric matrix C = (c_jk), where c_jk = (1/2)(a_jk + a_kj),

thus c11 = 3, c12 = c21 = 5, c22 = 2, we get the same result:

Q = x^T C x = 3x1^2 + 5x1x2 + 5x2x1 + 2x2^2 = 3x1^2 + 10x1x2 + 2x2^2.
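The symmetrization step can be verified directly. The NumPy sketch below (added here, using a sample nonsymmetric coefficient matrix) checks that replacing A by C = (A + A^T)/2 leaves the value of the quadratic form unchanged.

```python
import numpy as np

# Sample nonsymmetric coefficient matrix and its symmetrized version.
A = np.array([[3.0, 4.0],
              [6.0, 2.0]])
C = (A + A.T) / 2           # [[3, 5], [5, 2]]

x = np.array([1.7, -0.4])   # an arbitrary vector
print(np.isclose(x @ A @ x, x @ C @ x))  # True: same quadratic form
```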

By Theorem 2 the symmetric coefficient matrix A of (7) has an orthonormal basis of eigenvectors. Hence if we take these as column vectors, we obtain a matrix X that is orthogonal, so that

X^(-1) = X^T.

From (5) we thus have A = XDX^(-1) = XDX^T, and substitution into (7) gives

(8)  Q = x^T X D X^T x.                              

If we set X^T x = y, then, since X^(-1) = X^T, we get

(9)  x = Xy.                                               

Furthermore, in (8) we have x^T X = (X^T x)^T = y^T and X^T x = y, so that Q becomes simply

(10)  Q = y^T D y = λ1 y1^2 + λ2 y2^2 + … + λn yn^2.                 

Theorem 5  Principal Axes Theorem

The substitution (9), x = Xy, transforms a quadratic form

Q = x^T A x = Σ_{j=1}^n Σ_{k=1}^n a_jk x_j x_k  (a_kj = a_jk)

to the principal axes form or canonical form (10), where λ1, λ2, …, λn are the (not necessarily distinct) eigenvalues of the (symmetric) matrix A, and X is an orthogonal matrix with corresponding eigenvectors x1, x2, …, xn, respectively, as column vectors.

Example 6  Transformation to Principal Axes. Conic Sections

Find out what type of conic section the following quadratic form represents and transform it to principal axes:

Q = 17x1^2 - 30x1x2 + 17x2^2 = 128.

Solution

We have Q = x^T A x, where

A =
[ 17 -15]
[-15  17],
x = (x1, x2)^T.

The characteristic equation is (17 - λ)^2 - 15^2 = 0.

Its roots (the eigenvalues of A) are λ1 = 2, λ2 = 32.

Hence (10) becomes

Q = 2y1^2 + 32y2^2.

Q = 128 then represents the ellipse 2y1^2 + 32y2^2 = 128, that is,

y1^2/8^2 + y2^2/2^2 = 1.

To find the direction of the principal axes in the x1x2-coordinates,

we determine normalized eigenvectors from (A - λI)x = 0 with λ = λ1 = 2 and λ = λ2 = 32,

namely (1/√2, 1/√2)^T and (-1/√2, 1/√2)^T, and then use (9), x = Xy. We get

x1 = y1/√2 - y2/√2,        x2 = y1/√2 + y2/√2.

This is a rotation (through 45°).

Sage Coding

① Computing the eigenvalues of A

A = matrix(2, 2, [17, -15, -15, 17])
print A.eigenvalues()

Evaluate

[32, 2]

② Computing the eigenvectors of A

 print A.eigenvectors_right()


[(32, [(1, -1)], 1), (2, [(1, 1)], 1)]

③ Computing the orthogonal diagonalizing matrix P

G = matrix([[1, -1], [1, 1]])  # constructing a matrix whose columns are eigenvectors
P = matrix([1/G.row(j).norm()*G.row(j) for j in range(0, 2)])
# normalizing the row vectors (the orthogonality follows from the
# fact that the eigenvalues are distinct)
P = P.transpose()  # constructing a matrix whose columns are orthonormal eigenvectors
print P


[ 1/2*sqrt(2)  1/2*sqrt(2)]

[-1/2*sqrt(2)  1/2*sqrt(2)]

④ Sketching two ellipses simultaneously

var('u, v')
s = vector([u, v])
B = P.transpose()*A*P
p1 = implicit_plot(s*A*s == 128, (u, -10, 10), (v, -10, 10), axes='true')
p2 = implicit_plot(s*B*s == 128, (u, -10, 10), (v, -10, 10), color='red', axes='true')
show(p1 + p2)  # plotting the two graphs simultaneously


[Hanbit Academy] Engineering Mathematics with Sage:

[Authors] Sang-Gu Lee, Young-Rok Kim, Jun-Hyun Park, Eung-Ki Kim, Jae-Hwa Lee

Contents

A. Engineering Mathematics 1 – Linear Algebra, Ordinary Differential Equations + Lab

Chapter 01 Vectors and Linear Algebra http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-1.html

Chapter 02 Understanding Differential Equations http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-2.html

Chapter 03 First-Order Ordinary Differential Equations http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-3.html

Chapter 04 Second-Order Ordinary Differential Equations http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-4.html

Chapter 05 Higher-Order Ordinary Differential Equations http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-5.html

Chapter 06 Systems of Differential Equations, Nonlinear Differential Equations http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-6.html

Chapter 07 Series Solutions of ODEs, Special Functions http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-7.html

Chapter 08 Laplace Transforms http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-8.html

B. Engineering Mathematics 2 – Vector Calculus, Complex Analysis + Lab

Chapter 09 Vector Differentiation: Gradient, Divergence, Curl http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-9.html

Chapter 10 Vector Integration, Integral Theorems http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-10.html

Chapter 11 Fourier Series, Integrals, and Transforms http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-11.html

Chapter 12 Partial Differential Equations http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-12.html

Chapter 13 Complex Numbers and Functions, Complex Differentiation http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-13.html

Chapter 14 Complex Integration http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-14.html

Chapter 15 Series, Residues http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-15.html

Chapter 16 Conformal Mapping http://matrix.skku.ac.kr/EM-Sage/E-Math-Chapter-16.html

Made by Prof. Sang-Gu LEE  sglee at skku.edu

http://matrix.skku.ac.kr/sglee/   with Dr. Jae Hwa LEE