<Linear Algebra by SGLee>  (Linear Algebra PBL Report)  http://matrix.skku.ac.kr/sglee/

LA PBL (Problem Based Learning) class report

[Syllabus: Linear Algebra]

Linear Algebra Syllabus (Course Plan)

<Lectures Recorded>

http://matrix.skku.ac.kr/2017-Album/2017-Spring-Lectures.htm

http://matrix.skku.ac.kr/2015-LA-FL/Linear-Algebra-Flipped-Class-SKKU.htm

A. 2017, Linear Algebra,

○ Lectures and Problem solving, 2017

(English Textbook) LA Free Textbook (e-book) : http://goo.gl/t3JcNP

Linear Algebra Lecture Note (English) http://matrix.skku.ac.kr/LA/

Linear Algebra Lecture Note (Korean) http://matrix.skku.ac.kr/LA-K/

Linear Algebra Simulations:  http://matrix.skku.ac.kr/LinearAlgebra.htm

Mathematics Education in the Big Data Era (University Mathematics Education and Computational Thinking), Coding Education https://youtu.be/nZlOKfW6XVY


○ LA Contents

Linear Algebra, First Class, Syllabus-Review  https://youtu.be/43nhECDzfiE

[Linear Algebra, <Lectures Recorded> in English]

Chapter 1. Vectors

*Sec 1.1 Vectors in n-space and *1.2 Inner product and Orthogonality  https://youtu.be/f6eKIuLE-Ko

Sec 1.3 Vector Equations of Lines and Planes, and 1.4 Exercises https://youtu.be/MR1md8R1T_g

Sec 2.1  Linear System of Equations, https://youtu.be/JbfSo5G6JR0

Week 2 Review, https://youtu.be/nhYG5uuGHqU

Chapter 2. Linear system of equations

Sec 2.1 Linear system of equations  https://youtu.be/JbfSo5G6JR0

Sec 2.2  Gaussian and Gauss-Jordan elimination, https://youtu.be/ySncbrZTdMk

Sec 2.2 -2.3 Exercise https://youtu.be/ySncbrZTdMk https://youtu.be/khbfoZBFfvA

Sec 2.4  Exercises, https://youtu.be/khbfoZBFfvA

Chapter 3. Matrix and Matrix Algebra

Sec 3.1 Matrix operation https://youtu.be/rDt3EOGl9lg

Sec 3.2 Inverse matrix https://youtu.be/o2iT6ZT5WIU

Sec 3.3 Elementary matrix  https://youtu.be/DubGO81dTAI

Sec 3.4 (part 1) Subspace  https://youtu.be/gNm7yzk8ess

Sec 3.4 (part 2) Linear independence https://youtu.be/LswYaDbj4ds

Sec 3.5 Solution set of a linear system and matrix https://youtu.be/7zZDGgPGE4s

Sec 3.6 Special matrices and Sec 3.8  Exs/Sol https://youtu.be/gve7cYW3W9I

(* Linear Algebra Sec  *3.7 LU-decomposition - http://youtu.be/lKJPnLCiAVU )

Student Review : Ch3-Ch2-Ch1   https://youtu.be/m4ZEHknJMZY

Chapter 4. Determinant

Sec 4.1 Definition and Properties of the Determinants  https://youtu.be/ltxi0hCUILg

Sec 4.2 Cofactor Expansion/ Appl of Determinants https://youtu.be/Yn5qu_062sA

Sec 4.3 Cramer's Rule https://youtu.be/rAmfnERqfU8

Sec *4.4 Application of Determinant  https://youtu.be/APsZ33BBOVs

Sec 4.5 Eigenvalues/Eigenvectors, 4.6 Exercises https://youtu.be/s1OI74nr660

Chapter 5. Matrix Model

5.1 Lights out Game (http://youtu.be/_bS33Ifa29s )

5.2 Power Method (http://youtu.be/CLxjkZuNJXw )

Project: http://youtu.be/coNq48CW6Pg

- Ch 5 Matrix Model, Student Presentation https://youtu.be/4u9LtmX7lvk

Chapter 6. Linear Transformations

Sec 6.1 Matrix as a Function (Transformation)  https://youtu.be/Es4BfHnIq7g

Sec 6.2 Geometric Meaning of LT (part 1) https://youtu.be/V6m0PKQm6es

(part 2) https://youtu.be/qeRmhJQIphI

Sec 6.3 Kernel and Range  https://youtu.be/7OfNTNl6IjI

Sec 6.4 Composition of LT and Invertibility   https://youtu.be/Im7uaogKySw

Sec *6.5 Computer Graphics with Sage  https://youtu.be/45zSkGN7inw

Sec 6.6 Exercises  https://youtu.be/pJgIaHpIhsM

Chapter 6. QnA Review https://youtu.be/snQsn2J_tuA

LA Midterm PBL 1 Presentation https://youtu.be/WG-HFdER5Ro

LA PBL and Ch6 and Ch4 Student Review https://youtu.be/xq4YyRtzTKg

Sample Midterm Exam: http://matrix.skku.ac.kr/LA/2016-S-LA-Midterm-Final-Solution.pdf

LA Midterm Exam Sol http://matrix.skku.ac.kr/2017-Album/LA-Midterm-Exam-Solution.htm  (image)

2017 Spring LA Midterm Exam Review https://youtu.be/DOhahD4Nb44

Chapter 7. Dimension and Subspaces

Sec 7.1 and 7.2 (Review)  Bases and dimensions, Basic spaces  https://youtu.be/45J08qGSzmk

Sec 7.3,7.4, 7.5, Rank-Nullity theorem, Rank theorem, Projection theorem

Sec *7.6 Least square solution  https://youtu.be/GwHh5lh5wEs

Sec 7.7 Gram-Schmidt orthonormalization process  https://youtu.be/Px6Gaks9fXQ

Sec * 7.8 QR-Decomposition; Householder ...

Sec 7.9 Coordinate vectors  https://youtu.be/VR9FoZDQmAo

Chapter 8. Diagonalization

Sec 8.1 Matrix Representation of LT  https://youtu.be/LpIR47W_stw

Sec 8.2 Similarity and Diagonalization  https://youtu.be/wqrLcfSeL8Q

Sec 8.3 Diagonalization with orthogonal matrix https://youtu.be/5Sg-Edczw_g

Sec 8.4 Quadratic forms and Sec *8.5 Appl. https://youtu.be/mjAr3ddevE8

Sec 8.6 SVD and Pseudo-Inverse  https://youtu.be/KU5l-XWDJuo

Sec 8.7 Complex eigenvalues and eigenvectors  https://youtu.be/-l7uTfYHjFU

Sec 8.8 Hermitian, Unitary, Normal Matrices  https://youtu.be/NRTmmmC-L9k

Sec *8.9 Linear system of differential equations http://www.hanbit.co.kr/EM/sage/1_chap6.html

Chapter 9. General Vector Spaces

Sec 9.1 Axioms of Vector Space, https://youtu.be/RnKjspG65AM

Sec 9.2 Inner product spaces; *Fourier Series, https://youtu.be/J0s8AkP4E38

Sec 9.3 Isomorphism, https://youtu.be/WiZZtF0c1hY

Chapter 10. Jordan Canonical Form

10.1 Finding the Jordan Canonical Form with a Dot Diagram (https://youtu.be/8fwPPOg8LW0  )

*10.2 Jordan Canonical Form and Generalized Eigenvectors, https://youtu.be/YrRnCByzxNM

10.3 Jordan Canonical Form and CAS, https://youtu.be/YrRnCByzxNM   (http://youtu.be/LxY6RcNTEE0  )

(Student problem solving, https://youtu.be/y4173MpjoxE , Section 10-1 http://youtu.be/9-G3Fd2xOW0 )

(Math for Big Data, Lecture 10, Finding JCF using Dot Diagram, https://youtu.be/1E3wXN1oZyc  )

(Math for Big Data, Lecture 11, Generalized Eigenvectors and Matrix Function)

Solution Book for Linear Algebra

Project Presentation (Project 발표) http://youtu.be/cxdj7hDWk08

2017 S LA Final Exam Solution for Grading

Appendix

SKKU Sage Matrix Calculator by SGLee https://youtu.be/Yx_llWB8qCY

SKKU Sage Visual Math  by SGLee  https://youtu.be/8S7bNgmiQeM

SKKU Sage Linear Algebra Lab by SGLee  https://youtu.be/gb1BsNdqB1Y

SKKU Sage Linear Algebra knowls by SGLee    https://youtu.be/IW05qYfeT2Y

SKKU Sage Calculus Book by SGLee   https://youtu.be/NU41B3SqLM8

Welcome to ICME 12, Seoul https://youtu.be/HdInOrAB8rU

[Sample Exam]

2017 LA Midterm Exam http://matrix.skku.ac.kr/2017-Album/2017-S-LA-Midterm-Exam-Final-3.pdf

Reference video: http://youtu.be/CLxjkZuNJXw

http://matrix.skku.ac.kr/3d-print/

[Linear Algebra, Korean Lectures (videos)]

Introduction to Linear Algebra (Lecture) - Prof. Sang-Gu Lee, SKKU http://matrix.skku.ac.kr/sglee/

(Korean course description) PBL - Flipped Learning http://youtu.be/Mxp1e2Zzg-A

Lecture 1 Introduction http://youtu.be/w7IzR4nGa3Q

Section 1.1 Vectors and 1.2 Inner Product http://youtu.be/aeLVQoPQMpE

Section 1.3 Vector Equations http://youtu.be/4UGACWyWOgA

Section 2.1 Linear System of Equations http://youtu.be/CiLn1F2pmvY

Section 2.2 Gauss-Jordan Elimination http://youtu.be/jnC66zvqHJI

Section 3.1 Matrix Algebra http://youtu.be/DmtMvQR7cwA

Section 3.2, Section 3.3 Inverse Matrix and Elementary Matrix http://youtu.be/GCKM2VlU7bw

Section 3.4 Subspaces http://youtu.be/HFq_-8B47xM

Section 3.5 Solution Space and 3.6 Special Matrices http://youtu.be/daIxHJBHL_g

Section 4.1 Determinant http://youtu.be/DM-q2ZuQtI0

Section 4.2 Cofactor Expansion and Inverse http://youtu.be/XPCD0ZYoH5I

Section 4.3 Cramer's Rule, 4.4 Applications, 4.5 Eigenvalues and Eigenvectors http://youtu.be/OImrmmWXuvU

Section 6.1 Linear Transformation http://youtu.be/YF6-ENHfI6E

Section 6.2 Geometric Meaning of LT http://youtu.be/cgySDj-OVlM

Section 6.3 Kernel and Range, SKKU http://youtu.be/9YciT9Bb2B0

Section 6.4 Composition and Inverse of LT http://youtu.be/EOlq4LouGao

LA Midterm Exam http://youtu.be/R3F3VNGH8Oo

Section 7.1 Basis and Dimension, SKKU http://youtu.be/or9c97J3Uk0

Section 7.2 Fundamental Subspaces, SKKU https://youtu.be/BC9qeR0JWis

Section 7.3 Rank-Nullity Theorem, SKKU http://youtu.be/ez7_JYRGsb4

Section 7.4 Rank Theorem, SKKU http://youtu.be/P4cmhZ3X7LY

Section 7.5 Projection Theorem, SKKU http://youtu.be/GlcA4l8SmlM

Section 7.6* Least Squares Solution http://www.youtube.com/watch?v=BC9qeR0JWis

Section 7.7 Gram-Schmidt Orthonormalization Process, SKKU http://youtu.be/gt4-EuXvx1Y

Section 7.8* QR-Decomposition, Householder Transformations http://www.youtube.com/watch?v=crMXPi2lgGs

Section 7.9 Coordinate Vectors, SKKU http://youtu.be/M4peLF7Xur0

Section 8.1 Matrix Representation of LT http://youtu.be/gn5ve1tXD7k and http://youtu.be/jfMcPoso6g4

Section 8.2 Similarity and Diagonalization http://youtu.be/xirjNZ40kRk

Section 8.3 Orthogonal Diagonalization http://youtu.be/jimlkBGAZfQ

Section 8.5* Applications of Quadratic Forms http://youtu.be/cOW9qT64e0g

Section 8.6 Singular Value Decomposition https://youtu.be/ejCge6Zjf1M

Section 8.7 and 8.8 Complex Eigenvalues and Eigenvectors; Hermitian, Unitary, and Normal Matrices http://youtu.be/8_uNVj_OIAk

Section 9.1 and 9.2 General Vector Spaces, Inner Product Spaces http://youtu.be/m9ru-F7EvNg

Section 9.3 Isomorphism, LA http://youtu.be/frOcceYb2fc

Section 10.1 Jordan Canonical Form http://youtu.be/NBLZPcWRHYI

Section 10.3 Jordan Canonical Form with Sage http://youtu.be/LxY6RcNTEE0

(Week 15) Review and Project Presentations

Math, Art and 3D Printing  http://youtu.be/olTfft1cuGw

SKKU LA 2015 S PBL Report Presentation by 김병찬 & 우시명 http://youtu.be/hUDuQ8e8HsU

SKKU Linear Algebra PBL Report Presentation by 손홍철 http://youtu.be/woyS_EYWiDs

SKKU Linear Algebra PBL Report PPT Presentation by 박민 http://youtu.be/E-5m65-8Ea8

SKKU Linear Algebra Project: Hessian Matrix PPT Presentation by 전승준 http://youtu.be/JHT6aTQhr-A

SKKU Linear Algebra Computer Graphics with Sage by 김태용, 이학현, 이종화 http://youtu.be/JFVM4KRr2nc

[PBL Report]

2017 Fall

Linear Algebra

PBL Report (Final)

Professor: LEE, Sang-Gu

Sungkyunkwan University (SKKU)

Suwon, South Korea

Name: ***

Major: ***

Student Number: ***

E-mail or cell phone number:  ***

Personal Reflection Note

Subject: Linear Algebra (LA) / Major: *** / Name: *** / Year: 2017 Fall / Learning contents: Matrix theory with applications

Self-Checking Activity (rated Excellent / Good / Fair):

1. I contributed to generating the ideas and facts needed to resolve the issue. ●
2. I proposed learning issues associated with the learning. ●
3. When studying alone, I used a variety of learning materials. ●
4. I provided new information and knowledge in this class. ●
5. I was actively involved in the discussions, and I asked many questions in order to understand them. ●
6. I contributed to the learning activities of our class. ●

※ Please record the following items, considering your learning process.

Do you understand most of the contents of this learning process?
Yes, I was able to understand the content, and there is some content I am still interested in learning to improve myself.

What kind of learning materials have you used to study?
I-Campus online lectures, Khan Academy, Q&A, material shared by the professor and students, and online videos (YouTube, LA online videos).

What did you learn through the learning activities of this course?
This course gave me a lot of knowledge, especially about vector operations, determinants, linear transformations, the CAS system, dimension and subspaces, diagonalization, orthogonal matrices, orthogonal similarity, and general vector spaces.

What have you learned from other colleagues?
I learned several things, such as how to find the cofactor and how to use Sage to check my solutions.

How do you apply newly learned information to real-world problems?
First, I will apply this knowledge in my research, such as analysis of fluid flow and heat and mass transfer, where we deal with fluids and the diffusion of molecules in three dimensions using vectors x, y, and z. I will also use Sage to check my solutions when dealing with real-world problems.

Self-Evaluation for Q/A Activities: 1, 2

Evaluation of other students:
The students were very active, challenging, and helpful for me. I see a good future in them.

Self-Evaluation

Subject: Linear Algebra (LA) / Major: *** / Name: *** / Year: 2017 Fall

Activity (rated Worst / Bad / Not Bad / Not Good / Good / Excellent):

1. Regular online and offline attendance ●
2. Active participation in Q/A discussions ●
3. Providing appropriate questions and responses in Q/A ●
4. Providing information and knowledge to help colleagues ●
5. Respecting the opinions of other colleagues ●
6. Positive contribution to the Q/A consensus-building process ●
7. I would like to study with these classmates again. ●

[Your Opinions]

▶ Satisfaction according to the self-evaluation: I am satisfied.

▶ Regrets according to the self-evaluation: Time was an issue for me, but that did not stop me from working hard and improving myself, which allowed me to improve compared to the midterm. I am now satisfied with myself, I have gained a lot through this subject, and I plan to review some of the course content in the future.

Subject: Linear Algebra (LA) / Colleague name: Sungkyunkwan University / Your name: ***

Activity (rated Worst / Bad / Not Bad / Not Good / Good / Excellent):

1. Regular online and offline attendance ●
2. Active participation in Q/A discussions ●
3. Providing appropriate questions and responses in Q/A ●
4. Providing information and knowledge to help colleagues ●
5. Respecting the opinions of other colleagues ●
6. Positive contribution to the Q/A consensus-building process ●
7. I would like to study with these classmates again. ●

[Your Opinion]

▶ Satisfaction according to the evaluation: The students were good at the activities; especially at the end of the semester they were more engaged.

▶ Regrets according to the evaluation: Time was a risk for all of us, but that never stopped us from continuing to study and learn more.

PBL Feedback for Students

Contents, Evaluation (●) (rated Excellent / Good / Fair / Bad / Worst):

1. Active participation in this class ●
2. The professor helped students improve their ability. ●
3. I acquired new knowledge and improved my level of knowledge in this class. ●
4. I improved my reasoning skills in this class. ●
5. I acquired self-directed learning skills in this class. ●
6. I acquired problem-solving skills in this class. ●
7. I acquired learning-management skills in this class. ●
8. I acquired expertise in this class. ●
9. This process was similar to an actual research situation. ●
10. Fairness of the evaluation methods ●
11. While solving problems, I became more aware of the learning topics. ●
12. Active communication on the topics ●
13. Learning outcomes were derived through the problem-solving process. ●
14. Effectiveness of problem-based learning ●
15. I want to participate in other PBL lessons again. ●

16. What is the merit of this PBL class?

This PBL class is helpful and serves as a good reference for each of us when reviewing material in the future. It has also improved me a lot and taught me new things.

17. What would you like to improve in this PBL system?

Solve-Revise-Finalize (and Final OK)

Participation

Weekly:

Number of my QnA participations over all weeks. Answers: 70, Replies: 39.

1) Summary

Chapter 7

Theorem 7.1.1

1-  Rank theorem

In this section, we study the relationship between the rank of a matrix A and theorems related to the dimensions of the subspaces associated with A.

2-  Projection theorem

In Chapter 1, we studied the orthogonal projection in R^2 and R^3, where the vectors and their projections can be visualized. In this section, we generalize the concept of projection to R^n. We also show that the projection is a linear transformation and find its standard matrix, which will be crucial for studying the Gram-Schmidt Orthogonalization and the QR-Decomposition.

Gram-Schmidt Orthogonalization

3- Least square solution

Previously, we studied how to solve the linear system Ax = b when the linear system has a solution. In this section, we study how to find an optimal solution using projection when the linear system does not have any solution.

4- Gram-Schmidt Orthonormalization process

Every basis of R^n has n elements, but the bases themselves can differ. In this section, we show that every nontrivial subspace of R^n has a basis, and how to find an orthonormal basis from a given basis.

5- QR-Decomposition; Householder Transformations

If an m×n matrix A has k linearly independent columns, then the Gram-Schmidt Orthogonalization can be used to decompose A in the form A = QR, where the columns of Q are the orthonormal vectors obtained by applying the Gram-Schmidt process to the columns of A, and R is an upper triangular matrix. The QR-decomposition is widely used to compute numerical solutions to linear systems, least-squares problems, and eigenvalue and eigenvector problems. In this section, we briefly introduce the QR-decomposition.
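The Gram-Schmidt construction of A = QR described above can be sketched in plain Python (rather than Sage); the 3×2 test matrix here is a hypothetical example, not one from the textbook.

```python
import math

def gram_schmidt_qr(cols):
    """QR-decompose a matrix given as a list of linearly independent columns,
    using classical Gram-Schmidt. Returns (Q_cols, R): Q_cols are orthonormal
    columns and R is upper triangular with R[i][j] = q_i . a_j."""
    n = len(cols)
    q_cols, r = [], [[0.0] * n for _ in range(n)]
    for j, a in enumerate(cols):
        v = list(a)
        for i, q in enumerate(q_cols):
            r[i][j] = sum(qk * ak for qk, ak in zip(q, a))   # projection coefficient
            v = [vk - r[i][j] * qk for vk, qk in zip(v, q)]  # subtract the projection
        r[j][j] = math.sqrt(sum(vk * vk for vk in v))        # norm of the residual
        q_cols.append([vk / r[j][j] for vk in v])
    return q_cols, r

# Columns of a small test matrix A (hypothetical values)
A_cols = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]]
Q, R = gram_schmidt_qr(A_cols)
```

Multiplying Q and R back together reproduces the original columns, and R stays upper triangular because each column is only projected onto the q's that precede it.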

Comment: When an isomorphism holds, the linear-algebraic properties that hold in one set also hold in the other, which makes it a very useful concept. Ex) When analyzing social phenomena, properties that apply to one group can be carried over to another group to which it is isomorphic.

Comment: Proof 1) If T_A is one-to-one, then ker T_A = {0}, which can be obtained as the unique solution set of Ax = 0; the existence of a unique solution of Ax = 0 means that the columns of A are linearly independent.

Proof 2) If T_A is onto, then Im T_A = R^m ⇔ RREF(A) has m leading ones ⇔ the row rank of A is m ⇔ the m row vectors are linearly independent.

Questions:

Why can Sarrus' method not be applied to matrices of order 4 or higher?

As I understood it, I need 4! = 24 permutations to compute the determinant of a 4×4 matrix.

But using Sarrus' method, only 8 products are formed.

Here are other methods for computing determinants of order 4 or higher, uploaded by the TA.

Comment: As seen above, Sarrus' method is a special technique that applies only to 2×2 and 3×3 matrices; computing the determinant of a larger matrix requires other methods.
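The 4! count above can be made concrete with the Leibniz (permutation) formula for the determinant. A plain-Python sketch (not Sage), applied to the 4×4 matrix from the Chapter 4 exercise below, shows that a 4×4 determinant really is a signed sum of 24 products, not the 8 that a Sarrus-style scheme would produce:

```python
from itertools import permutations

def sign(perm):
    # Parity of a permutation via its inversion count
    inv = sum(1 for i in range(len(perm)) for j in range(i + 1, len(perm))
              if perm[i] > perm[j])
    return -1 if inv % 2 else 1

def leibniz_det(M):
    """Determinant as the signed sum over all n! permutations:
    sum over sigma of sgn(sigma) * prod_i M[i][sigma(i)]."""
    n = len(M)
    total, terms = 0, 0
    for p in permutations(range(n)):
        prod = 1
        for i in range(n):
            prod *= M[i][p[i]]
        total += sign(p) * prod
        terms += 1
    return total, terms

M4 = [[2, 6, 6, 2], [2, 7, 3, 6], [1, 5, 0, 1], [3, 7, 0, 7]]
det4, n_terms = leibniz_det(M4)   # 4! = 24 signed terms
```

For a 3×3 matrix the same function uses exactly 3! = 6 terms, which is why Sarrus' diagonal scheme happens to work there and nowhere higher.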

cited https://en.wikipedia.org/wiki/Rotation_matrix

In case of clockwise rotation, you can just invert the sign of theta from positive to negative.

As the professor answered, each of the four points in 김지윤's question must be transformed by the shear transformation.

Let u = [3,1], v = [2,4], and u + v = [5,5].

The points to which the shear transformation is applied are

[3,1]

[2,4]

[5,5] .

Rendering this graphically, we can confirm the result shown below.

Before the transformation

After the transformation (shear transformation along the x-axis with scale 3)
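As a cross-check in plain Python (the original figures are not reproduced here), the shear along the x-axis with scale 3, T(x, y) = (x + 3y, y), can be applied to the three points above; the check at the end also illustrates linearity, T(u) + T(v) = T(u + v):

```python
def shear_x(point, k):
    """Shear along the x-axis with scale k: (x, y) -> (x + k*y, y)."""
    x, y = point
    return (x + k * y, y)

u, v = (3, 1), (2, 4)
uv = (u[0] + v[0], u[1] + v[1])            # u + v = (5, 5)
images = [shear_x(p, 3) for p in (u, v, uv)]
# Linearity check: T(u) + T(v) == T(u + v)
linear = (images[0][0] + images[1][0], images[0][1] + images[1][1]) == images[2]
```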

3) Solving problems

Chapter-1

Linear Algebra, Chapter 1 Exercise, Problem 10.

Problem 10 [Projection] For a = (4, -2, 3) and b = (2, -1, 4), find the vector projection of b onto a and the component of b orthogonal to a.

1) Vector projection

p = proj_a(b) = ((a · b)/(a · a)) a = (22/29)(4, -2, 3) = (88/29, -44/29, 66/29)

2) Orthogonal component

w = b - proj_a(b) = (2, -1, 4) - (88/29, -44/29, 66/29) = (-30/29, 15/29, 50/29)

Computation by Sage [ Mohammed ] http://math3.skku.ac.kr/home/pub/520/

Input

a=vector([4, -2, 3])

b=vector([2, -1, 4])

ab=a.inner_product(b)

aa=a.inner_product(a)

p=ab/aa*a;w=b-p

print "p=", p

print "w=", w

Output

p= (88/29, -44/29, 66/29)

w= (-30/29, 15/29, 50/29)

Comment: I learned how to find the vector projection and the scalar projection.

Chapter-2

Linear Algebra, Chapter 2 Exercise, Problem 1 (new)

Problem 1 Answer the questions for the following linear system:

3x + 6y - 12z = 3
 x - 2y +  3z = 17
3x - 6y +   z = 1

(1) Find the coefficient matrix.

(2) Express the linear system in the form Ax = b.

(3) Find its augmented matrix.

1) The coefficient matrix A has rows (3, 6, -12), (1, -2, 3), (3, -6, 1).

2) In the form Ax = b: let A be the coefficient matrix above, x = (x, y, z), and b = (3, 17, 1).

3) The augmented matrix [A | b] is shown in the Sage output below.

Computation by Sage ( Mohammed )

Input

A=matrix(3, 3, [3,6,-12,1,-2,3,3,-6,1])

b=vector([3,17,1])

print A.augment(b,subdivide=True)

Output

[  3   6 -12|  3]

[  1  -2   3| 17]

[  3  -6   1|  1]

Comment: I reviewed how to write a linear system in the form Ax = b and find the augmented matrix.

Chapter-3

Linear Algebra, Chapter 3 Exercise, Problem 6

Problem 6: Find the 4 × 4 elementary matrix corresponding to each elementary row operation.

Solution:

First, start from the 4 × 4 identity matrix.

Comment: I learned how to obtain an elementary matrix from the identity matrix by applying a single elementary row operation (ERO).

Linear Algebra, Chapter 3 Exercise, Problem 9

Problem 9: Determine if the given set W is a subspace.

Solution:

Let u, v ∈ W and let k be a scalar.

1) Closed under vector addition: here, W is closed under vector addition.

2) Closed under scalar multiplication: here, W is closed under scalar multiplication.

Answer: Because W satisfies both conditions, W is a subspace.

Comment: To show that W is a subspace, we let W be a non-empty subset of R^n; then W is called a subspace of R^n if it satisfies the two conditions from the definition of [Subspace] (page 112): closed under vector addition and closed under scalar multiplication.

Chapter-4

Linear Algebra, Chapter 4 Exercise, Problem 2

Problem 2 Find the following determinants, for the matrices A (3×3) and B (4×4) given in the Sage input below.

1) det(A)              2) det(B)

1) det(A) = 25

2) det(B) = -168

Computation by Sage [ Mohammed ]

Input

A=matrix(QQ, 3, 3, [2, 1, 5, 1, -2, 0, 4, 3, 6])

B=matrix(QQ, 4, 4, [2, 6, 6, 2, 2, 7, 3, 6, 1, 5, 0, 1, 3, 7, 0, 7])

print A.det()

print B.det()

Output

25

-168

Comment: 1) For a 2×2 matrix, the determinant is ad - bc.

2) For a 3×3 matrix, multiply a by the determinant of the 2×2 matrix that is not in a's row or column; likewise for b and c, but remember that b carries a negative sign.

3) The pattern continues for larger matrices: multiply each entry of a row by the determinant of the submatrix that is not in that entry's row or column, continuing across the whole row with the alternating (+, -, +, -) sign pattern.
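The cofactor pattern described in the comment can be written directly as a short recursive function in plain Python (not Sage), and checked against the two Sage determinants above:

```python
def det_cofactor(M):
    """Determinant by cofactor (Laplace) expansion along the first row,
    with the alternating (+, -, +, -) sign pattern."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]  # delete row 0, column j
        total += (-1) ** j * M[0][j] * det_cofactor(minor)
    return total

A = [[2, 1, 5], [1, -2, 0], [4, 3, 6]]
B = [[2, 6, 6, 2], [2, 7, 3, 6], [1, 5, 0, 1], [3, 7, 0, 7]]
```

Running `det_cofactor` on A and B reproduces the Sage outputs 25 and -168.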

Chapter-6

Linear Algebra, Chapter 6 Exercise, Problem 8 (New)

Problem 8 Let x be mapped by two linear transformations T and S (defined in the Sage code below).

Find (S ∘ T)(x).

Solution

See the Sage computation below.

Computation by Sage [ Mohammed ]

Input

x1,x2,x3,z1,z2,z3=var('x1 , x2 , x3, z1 , z2 , z3')

T(x1,x2,x3)=(2*x1+4*x2+(-8)*x3,4*x2+(-2)*x3,x3)

S(z1,z2,z3)=(4*z1+(-2)*z2+z1,z1+8*z2,z1)

x(x1,x2,x3)=(x1,x2,x3)

A=linear_transformation(QQ^3,QQ^3,T)

B=linear_transformation(QQ^3,QQ^3,S)

print '(S dot T)(x)='

print  (B.matrix(side='right')*A.matrix(side='right'))*(x)

Output

(S dot T)(x)=

(x1, x2, x3) |--> (2*x1 - 32*x2 - 16*x3, 2*x1 + 32*x2 - 48*x3, 2*x1 - 32*x3)

Comment:
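The general fact used here is that the standard matrix of a composition S ∘ T is the matrix product [S][T]. A minimal plain-Python sketch, using small hypothetical 2×2 maps rather than the T and S of this exercise, verifies that applying the product matrix agrees with applying T and then S:

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) for j in range(len(B[0]))]
            for i in range(len(A))]

def apply(M, x):
    """Apply the linear map with standard matrix M to the vector x."""
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

# Hypothetical standard matrices for two linear maps T, S on R^2
T = [[2, 1], [0, 3]]
S = [[1, -1], [4, 0]]
ST = matmul(S, T)        # standard matrix of the composition S o T
x = [1, 2]
```

Here `apply(ST, x)` equals `apply(S, apply(T, x))`, which is exactly what the Sage call `B.matrix(side='right') * A.matrix(side='right')` computes for the exercise.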

Chapter-7

Linear Algebra, Chapter 7, Problem 1 (new)

Use the determinant to check whether the following vectors are linearly independent or not:

x1 = (1,1,-3,2,0), x2 = (0,2,1,3,0), x3 = (0,-1,0,-4,1), x4 = (0,-1,0,5,2), x5 = (2,2,-6,4,0)

Solution.

By Theorem 7.1.1, the five vectors above in R^5 are linearly dependent because det(A) = 0, where A is the matrix with these vectors as columns.

Sage http://math3.skku.ac.kr/home/pub/476 by 황훌

x1=vector([1,1,-3,2,0])

x2=vector([0,2,1,3,0])

x3=vector([0,-1,0,-4,1])

x4=vector([0,-1,0,5,2])

x5=vector([2,2,-6,4,0])

A=column_matrix([x1,x2,x3,x4,x5])

print A.det()

Comment: In R^5, the five vectors above form a basis (are linearly independent) if and only if det(A) ≠ 0.

However, in this problem det(A) = 0, so the five vectors are not linearly independent. (Indeed, x5 = 2·x1.)

LA Chapter 7 Exercises 2 (new)

Problem 2. Determine if the given set S is a basis for R^4:

S = {(1,2,3,4), (0,1,1,1), (-1,2,1,1), (0,0,-1,-1)}

Solution

We solve the problem using only Sage, because Sarrus' method cannot be applied to an n × n matrix with n > 3.

To determine whether S is a basis for R^4, check whether the determinant of the matrix A, whose columns are the vectors of S, is zero or not.

A is the 4 × 4 matrix in the Sage input below.

Since det(A) = 1 ≠ 0, S = {(1,2,3,4), (0,1,1,1), (-1,2,1,1), (0,0,-1,-1)} is linearly independent. By Theorem 7.1.4, S is a basis for R^4.

■

Double checked by Sage [노경아] http://math3.skku.ac.kr/home/pub/472

Input

A=matrix(4,4,[1,0,-1,0,2,1,2,0,3,1,1,-1,4,1,1,-1])

print A.det()

Output

Det(A) = 1

Comment: If a subset S of R^n satisfies the following two conditions, then S is called a basis for R^n: 1. S is linearly independent. 2. span(S) = R^n.

LA Chapter 7 Exercises 3(new)

Find a basis for the subspace of R^5 spanned by the solutions of the equation x1 + 2x2 + 3x3 + 4x4 + 5x5 = 0.

Sol)

Let A = [1 2 3 4 5], so the solution set is the null space of A in R^5.

A basis of the null space of A is computed below.

Double checked by Sage, 노경아 http://math3.skku.ac.kr/home/pub/485

Input

A=matrix(ZZ, 1, 5, [1,2,3,4,5])

A.right_kernel()  # basis of the solution space

A.right_nullity() # dimension of the solution space = nullity(A) = 4

Output

Free module of degree 5 and rank 4 over Integer Ring Echelon basis matrix:

[ 1  0  0  1 -1]

[ 0  1  0  2 -2]

[ 0  0  1  3 -3]

[ 0  0  0  5 -4]

4 = nullity(A); the four rows above form a basis of the null space.

Comment: span() = span()
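The Sage echelon basis above can be verified in plain Python: each basis vector must satisfy a · v = 0 for the row a = (1, 2, 3, 4, 5), and there must be nullity(A) = 4 of them.

```python
a = [1, 2, 3, 4, 5]      # the 1 x 5 coefficient matrix, as a single row
basis = [
    [1, 0, 0, 1, -1],
    [0, 1, 0, 2, -2],
    [0, 0, 1, 3, -3],
    [0, 0, 0, 5, -4],
]
# Every basis vector must lie in the null space, i.e. satisfy a . v = 0
in_kernel = all(sum(ai * vi for ai, vi in zip(a, v)) == 0 for v in basis)
```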

LA Chapter 7 Exercises (new)

Problem 6. For the following matrix A, find a basis for its column space Col(A) and compute the column rank c(A). (While the matrix A in our textbook is square, I constructed a new matrix which is rectangular.)

http://math3.skku.ac.kr/home/pub/474

(1) Find rank(A) directly with Sage.

Input

A=matrix(QQ,[[10,2,-1],[0,-5,1],[0,1,2],[0,6,0]])

print A

print A.rank()

Output

[10  2 -1]
[ 0 -5  1]
[ 0  1  2]
[ 0  6  0]

3

(2) A basis of the column space of matrix A is the same as a basis of the row space of the matrix A^T.

Input

B=matrix(QQ,[[10,0,0,0],[2,-5,1,6],[-1,1,2,0]])

print B

print B.echelon_form()

print B.right_nullity()

print B.rank()

Output

[10  0  0  0]

[ 2 -5  1  6]

[-1  1  2  0]

[     1      0      0      0]

[     0      1      0 -12/11]

[     0      0      1   6/11]

1

3

Now the nonzero rows of the echelon form of A^T form a basis S for Col(A), so c(A) = 3.

Comment: Through the solution above, we can see that the rank of a matrix equals the rank of its transpose, and confirm that the Rank-Nullity theorem holds.

Linear Algebra, Chapter 7 Exercise, Problem 8 (New)

Problem 8) Check if rank(A) = rank(A^T).

1) Solution

We want to show rank(A) = rank(A^T) by finding RREF(A) and RREF(A^T).

[      1       0       0       0 -274/41]

[      0       1       0       0  -84/41]

[      0       0       1       0  139/41]

[      0       0       0       1  -10/41]

Thus rank(A) = 4.

Next, find RREF(A^T):

[1 0 0 0]

[0 1 0 0]

[0 0 1 0]

[0 0 0 1]

[0 0 0 0]

Rank = 4.

2) Computation by Sage

Input

A=matrix(QQ,4,5,[2,-1,4,1,2,0,1,2,3,4,2,1,6,12,2,-1,4,0,2,-2])

B=A.transpose()

print A.echelon_form()

print B.echelon_form()

print A.rank()

print B.rank()

print A.rank()==B.rank()

The output solution obtained by Sage computation check

[      1       0       0       0 -274/41]

[      0       1       0       0  -84/41]

[      0       0       1       0  139/41]

[      0       0       0       1  -10/41]

[1 0 0 0]

[0 1 0 0]

[0 0 1 0]

[0 0 0 1]

[0 0 0 0]

4

4

True

This confirms Theorem 7.3.1: rank(A) = rank(A^T). Comment: the rank equals the number of pivot entries (leading ones) in RREF(A).

LA Chapter 7 Exercises (new)

Problem 14. For x = (1, 2, 3, 4, 5) and a = (9, 9, 9, 9, 10), express x as x1 + x2, where x1 is in the direction of a and x2 is orthogonal to a.

Input

x=vector([1,2,3,4,5])

a=vector([9,9,9,9,10])

x1=(x.inner_product(a))/(a.inner_product(a))*a

x2=x-x1

print x1

print x2

print x2.inner_product(x1)

print x==x1+x2  #checking if x=x1+x2

Output

(315/106, 315/106, 315/106, 315/106, 175/53)  # x1

(-209/106, -103/106, 3/106, 109/106, 90/53)  # x2

0

True

Comment: We confirmed that the given vector x can be expressed as the sum of a vector in the direction of a and a vector orthogonal to a.
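The same decomposition can be reproduced outside Sage with Python's exact rational arithmetic; this sketch mirrors the Sage computation above for Problem 14 and yields the same fractions.

```python
from fractions import Fraction

def proj(x, a):
    """Orthogonal projection of x onto a, in exact rational arithmetic."""
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    c = Fraction(dot(x, a), dot(a, a))
    return [c * ai for ai in a]

x = [1, 2, 3, 4, 5]
a = [9, 9, 9, 9, 10]
x1 = proj(x, a)                          # component in the direction of a
x2 = [xi - pi for xi, pi in zip(x, x1)]  # component orthogonal to a
```

The checks x1 · x2 = 0 and x1 + x2 = x hold exactly, with no floating-point error, because every entry is a `Fraction`.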

LA Chapter 8 Exercises, Problem 1 (New).

Suppose a linear transformation T : R^2 → R^2 is defined by T(x, y) = (x - y, x + y), and β = {(1,2), (-1,1)} is an ordered basis for R^2.

Find the matrix representation of T relative to the ordered basis β.

Solution

We row-reduce [β | T(β)] to RREF and read off the right-hand block, as in the Sage computation below.

http://math3.skku.ac.kr/home/pub/478

Input

x,y = var('x, y')

h(x,y) = [x-y, x+y]

T = linear_transformation(QQ^2, QQ^2, h)

x1 = vector([1,2])

x2 = vector([-1,1])

y1 = vector([1,2])

y2 = vector([-1,1])

B = column_matrix([y1, y2, T(x1), T(x2)])

print B

print

C = B.echelon_form()

print C

print

A = C.submatrix(0,2,2,2)

print A

Output

[ 1 -1 -1 -2]

[ 2  1  3  0]

[   1    0  2/3 -2/3]

[   0    1  5/3  4/3]

[ 2/3 -2/3]

[ 5/3  4/3]

By converting to RREF I obtained the coordinate vectors relative to the basis, and in particular I learned the meaning of submatrix: submatrix(0, 2, 2, 2) means the submatrix consisting of 2 rows and 2 columns starting from the (1, 3) entry of the matrix.

Linear Algebra, Chapter 7 Exercise, Problem 12.

(Original) For and , find the standard matrix for .

Solution

(P) = .

, and

Find the

Comment: the projection allows me to visualize the image of an object; for example, in my major, when I draw organic molecules such as amino acid structures, I can distinguish amino types by using the projection model.

Linear Algebra, Chapter 7 Exercise, Problem 14.

For x = (4, 0, 1) and a = (2, 1, 4), express x as x1 + x2, where x1 is in the direction of a and x2 is perpendicular to a.

1) Solution

We use the equation of Theorem 7.5.1 below to find the orthogonal projection.

x1 = proj_a(x) = ((x · a)/(a · a)) a = (12/21)(2, 1, 4) = (8/7, 4/7, 16/7)

Solve for x2:

x2 = x - x1 = (4, 0, 1) - (8/7, 4/7, 16/7) = (20/7, -4/7, -9/7)

x = x1 + x2 = (8/7, 4/7, 16/7) + (20/7, -4/7, -9/7)

2) Computation by Sage

Input

x=vector([4,0,1])

a=vector([2,1,4])

x1=(x.inner_product(a))/(a.inner_product(a))*a

x2=x-x1

print x1

print

print x2

print

print x2.inner_product(x1)

print

print x==x1+x2   # Checking if x=x1+x2

Output

(8/7, 4/7, 16/7)

(20/7, -4/7, -9/7)

0

True

Comment: What I have learned: in the orthogonal projection in R^3, the vectors and their projections can be visualized.

Linear Algebra, Chapter 7 Exercise, Problem 16 (New)

Find the least squares curve passing through the given points; from the Sage computation below, the data points are (1, 2), (4, 1), (3, -3), (2, 1).

Solution: Let y = c0 + c1*t + c2*t^2 + c3*t^3.

Substituting each data point gives the linear system A c = b, where A is the matrix below with rank(A) = 4.

The least squares solution is c = (A^T A)^(-1) A^T b.

2) Computation by Sage

Input

A=matrix(4, 4, [1, 1, 1, 1, 1, 4, 16, 64, 1, 3, 9, 27, 1, 2, 4, 8,])

b=matrix(4 , 1, [ 2, 1, -3, 1])

B=A.transpose()

C=B*A

D=C.inverse()

D*B*b

Output

[  -11]

[ 71/3]

[-25/2]

[ 11/6]

Answer: y = c0 + c1*t + c2*t^2 + c3*t^3, where c0 = -11, c1 = 71/3, c2 = -25/2, c3 = 11/6.

Comment: What I have learned: when the linear system for the curve does not have an exact solution, the above theory and code still give us the least squares solution.
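The normal-equations computation above (D*B*b in Sage) can be reproduced in plain Python with exact rationals; the Gauss-Jordan solver here is a generic sketch, not code from the course, and it recovers the same coefficients -11, 71/3, -25/2, 11/6.

```python
from fractions import Fraction

def solve(M, rhs):
    """Solve the square system M x = rhs by Gauss-Jordan elimination,
    in exact rational arithmetic."""
    n = len(M)
    aug = [[Fraction(v) for v in row] + [Fraction(r)] for row, r in zip(M, rhs)]
    for col in range(n):
        piv = next(r for r in range(col, n) if aug[r][col] != 0)   # pivot search
        aug[col], aug[piv] = aug[piv], aug[col]
        aug[col] = [v / aug[col][col] for v in aug[col]]           # scale pivot row
        for r in range(n):
            if r != col and aug[r][col] != 0:
                f = aug[r][col]
                aug[r] = [rv - f * cv for rv, cv in zip(aug[r], aug[col])]
    return [row[-1] for row in aug]

# Fit a cubic c0 + c1*t + c2*t^2 + c3*t^3 to the data points (t_i, y_i)
ts, ys = [1, 4, 3, 2], [2, 1, -3, 1]
A = [[1, t, t**2, t**3] for t in ts]
AtA = [[sum(A[k][i] * A[k][j] for k in range(4)) for j in range(4)] for i in range(4)]
Atb = [sum(A[k][i] * ys[k] for k in range(4)) for i in range(4)]
coeffs = solve(AtA, Atb)    # normal equations (A^T A) c = A^T b
```

With four points and four coefficients the fit is exact here, but the same normal-equations code gives the least squares solution when there are more points than coefficients.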

Linear Algebra, Chapter 7 Exercise, Problem 19 (New)

Show that the following set of vectors is linearly independent, and find its corresponding orthogonal set:

{x1 = (0, 0, 2, 0), x2 = (2, 0, 1, 2), x3 = (2, 1, 2, 1)}.

Solution

Let c1x1 + c2x2 + c3x3 = 0 for some scalars c1, c2, c3; this forces c1 = c2 = c3 = 0.

∴ The vectors are linearly independent.

Using the Gram-Schmidt orthonormalization process, we obtain the orthonormal set below.

2) Computation by Sage [ Mohammed ]

Input

x1=vector([0,0,2,0])

x2=vector([2,0,1,2])

x3=vector([2,1,2,1])

A=matrix([x1,x2,x3])

[G,mu]=A.gram_schmidt()

B=matrix([G.row(i)/G.row(i).norm() for i in range(0,3)]);B

Output

[             0              0              1              0]

[   1/2*sqrt(2)              0              0    1/2*sqrt(2)]

[ 1/3*sqrt(3/2)  2/3*sqrt(3/2)              0 -1/3*sqrt(3/2)]

Comment: 1) I learned how to show linear independence. 2) I learned how to use the Gram-Schmidt orthogonalization process. 3) I learned how to use Sage.

Linear Algebra, Chapter 7 Exercise, Problem 22

For u1 = (1,2), u2 = (1,3), v1 = (1,3), v2 = (1,4), let α = {u1, u2} and β = {v1, v2}, which are bases for R^2.

(1) Find the transition matrix from α to β.

(2) Find the transition matrix from β to α.

(3) Suppose [w]_α = (2, 2). Find [w]_β using the transition matrix.

(4) Suppose [w]_β = (1, 2). Find [w]_α using the transition matrix.

Solution

Row-reduce [β | α] to RREF to obtain the transition matrix from α to β, and [α | β] for the transition matrix from β to α; then multiply each given coordinate vector by the corresponding transition matrix.

Comment: I learned how to find the transition matrix.

Chapter-8

LA Chapter 8 Exercises(new)

Problem 16. The following matrix has full column rank. Find its pseudo-inverse.

A is the 3 × 3 matrix in the Sage input below.

1) Solution

Since A has full column rank,

A^T A is invertible, and

A† = (A^T A)^(-1) A^T.

2) Double checked by Sage 노경아 http://math3.skku.ac.kr/home/pub/492/

Input

A=matrix(QQ,[[1,1,2],[0,2,3],[3,7,0]])

print A.rank()

B=A.transpose() * A

print B

Pseudo=B^-1*A.transpose()

print Pseudo

Output

3  = rank(A)

[10 22  2]

[22 54  8]

[ 2  8 13]

[  7/8 -7/12  1/24]

[ -3/8   1/4   1/8]

[  1/4   1/6 -1/12]

Comment: When A has full column rank, we can obtain the pseudo-inverse as A† = (A^T A)^(-1) A^T. This plays a useful role when using the least squares solution.

LA Chapter 8, Problem 5 (new)

(1) Find the eigenvalues of A.

(2) Find the algebraic and geometric multiplicity of each eigenvalue of A. Then determine whether A is diagonalizable or not.

Solution

(1) The eigenvalues of A are -2, 1, 1 (see the Sage output below).

(2) Eigenvalue 1 has algebraic multiplicity 2 and geometric multiplicity 1.

Eigenvalue -2 has algebraic multiplicity 1 and geometric multiplicity 1.

The matrix does not have three linearly independent eigenvectors (that is, the sum of the geometric multiplicities of the eigenvalues of A is less than n = 3).

The matrix is not diagonalizable.

Double checked by Sage [ http://math3.skku.ac.kr/home/pub/503 ]  by 김도형

Input

A=matrix(QQ,3,3,[3,0,1,4,-2,8,-4,0,-1])

print(A)

print

B=A.eigenvalues()

print(B)

print

Output

[ 3  0  1]

[ 4 -2  8]

[-4  0 -1]

[-2, 1, 1]

Comment: I learned how to determine whether a matrix A is diagonalizable by finding the eigenvalues and their corresponding algebraic and geometric multiplicities.
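The geometric multiplicity of an eigenvalue λ is n - rank(A - λI). A plain-Python cross-check (not Sage), using an exact-rational rank function, confirms that for this A both eigenvalues 1 and -2 have geometric multiplicity 1, so the multiplicities sum to 2 < 3 and A is not diagonalizable.

```python
from fractions import Fraction

def rank(M):
    """Row rank by Gaussian elimination in exact rational arithmetic."""
    M = [[Fraction(v) for v in row] for row in M]
    r = 0
    for col in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if piv is None:
            continue                                  # no pivot in this column
        M[r], M[piv] = M[piv], M[r]
        M[r] = [v / M[r][col] for v in M[r]]          # scale pivot row
        for i in range(len(M)):
            if i != r and M[i][col] != 0:
                f = M[i][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

A = [[3, 0, 1], [4, -2, 8], [-4, 0, -1]]
n = 3

def shifted(A, lam):
    # A - lam * I
    return [[A[i][j] - (lam if i == j else 0) for j in range(n)] for i in range(n)]

geo_mult_1 = n - rank(shifted(A, 1))    # geometric multiplicity of eigenvalue 1
geo_mult_m2 = n - rank(shifted(A, -2))  # geometric multiplicity of eigenvalue -2
```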

Chapter-9

Linear Algebra, Chapter 9 Exercise, Problem 1 (New)

When we define the addition and the scalar multiplication as follows, check whether the resulting sets are vector spaces.

1) , .

2) , .

3)

4)

Solution:

The vector sum ( a+b )and the scalar multiplication by

1)

,

we let,

2)

,

we let, .

3)

4)

1) Is not a vector space, because the scalar multiplication law does not hold.

2) Is a vector space.

3) Is not a vector space, because the addition laws do not hold.

4) Is a vector space, because it satisfies the two basic laws.

{ Addition, and Scalar Multiplication, , }

Comment: I learned how to check whether a set is a vector space under the addition and scalar multiplication laws.

Linear algebra. Chapter 9 Exercise problem 3 (New)

Problem 3 Let be a vector in . Write as a linear combination of

, =, =.

Solution:

Let,

So,

Computation by Sage [ Mohammed ] [Lee Dong Ho]

Input:

A= matrix(3,3,[1,1,1,1,1,2,1,-1,3])

b=vector([2,6,2])

B=A^(-1)*b

print B

Output:

(-6, 4, 4)

Comment: I learned how to express a given vector as a linear combination of the original vectors.
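The coefficients of the linear combination can be found by solving the linear system directly; a NumPy sketch (an assumption, mirroring the Sage computation above) is:

```python
import numpy as np

# Coefficient matrix and right-hand side from the Sage computation above.
A = np.array([[1, 1, 1],
              [1, 1, 2],
              [1, -1, 3]], dtype=float)
b = np.array([2, 6, 2], dtype=float)

# Solve A c = b for the combination coefficients (Sage used A^(-1)*b).
c = np.linalg.solve(A, b)
print(c)
```

This reproduces the Sage output (-6, 4, 4).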

Linear algebra. Chapter 9 Exercise problem 4 (New)

Determine if the given vectors in a given vector space are linearly independent or linearly dependent.

(1) :, , , .

(2)

(3)

Solution

(1) Form the coefficient matrix from the given vectors.

The determinant of the coefficient matrix is −829 ≠ 0.

(2) Let,

(3) Let,

(1) The vectors are linearly independent.

(2) The vectors are linearly independent.

(3) The vectors are linearly dependent.

Computation by Sage [ Mohammed ]

B=matrix(QQ, 4, 4, [3, 0, 4, 8, -6, 2, 0, 3, 9, -3, 3, -2, 1, 6, 9, 0])

print B.det()

Output

-829 ( Determinant )

Comment: I learned how to determine whether given vectors are linearly independent or dependent.
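Since a nonzero determinant of the coefficient matrix implies linear independence, the Sage check in part (1) can be reproduced with NumPy (an assumption; not part of the original report):

```python
import numpy as np

# Coefficient matrix from the Sage computation above.
B = np.array([[3, 0, 4, 8],
              [-6, 2, 0, 3],
              [9, -3, 3, -2],
              [1, 6, 9, 0]], dtype=float)

d = np.linalg.det(B)
# A nonzero determinant means the four vectors are linearly independent.
print(round(d))
```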

Linear algebra. Chapter 9 Exercise problem 5 (New)

Problem 5 Let be the complex inner product space with the Euclidean inner product. Let . Answer the following.

1) Compute

2) Compute .

3) Confirm the Cauchy-Schwarz inequality.

4) Confirm the triangle inequality.

Solution

1)

=

2)

3)

implies

4)

Triangle inequality holds.

Comment: I reviewed the Euclidean inner product.

Linear algebra. Chapter 9 Exercise problem 14 (New)

Show by Theorem 9.1.3 that 5, e^(4x), e^x are linearly independent.

Solution

By the Wronskian test, we only need to show that the Wronskian W(x) is nonzero for some x.

W(x) = −60e^(5x) ≠ 0 for every x.

So, the given functions are linearly independent.

Computation by Sage [ Mohammed ]

Input in http://sage.skku.edu/

var('x')

W=wronskian(5, e^(4*x), e^x) # wronskian(f1(x), f2(x), f3(x))

print W

Output

-60*e^(5*x)

Comment: I learned how to check the linear independence of several continuous functions.
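A numeric spot-check of this Wronskian (NumPy assumed; the report uses Sage's wronskian()) evaluates the determinant of the function/derivative matrix at a point:

```python
import numpy as np

def wronskian_at(x):
    """Wronskian of f1=5, f2=e^(4x), f3=e^x at the point x.
    Rows are (f1, f2, f3), (f1', f2', f3'), (f1'', f2'', f3'')."""
    e4, e1 = np.exp(4 * x), np.exp(x)
    W = np.array([[5.0, e4, e1],
                  [0.0, 4 * e4, e1],
                  [0.0, 16 * e4, e1]])
    return np.linalg.det(W)

# At x = 0 this is -60*e^0 = -60: nonzero, so the functions are
# linearly independent.
print(wronskian_at(0.0))
```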

Linear algebra. Chapter 9 p1 (New)

If W1 and W2 are subspaces of a vector space V, show that the intersection W1 ∩ W2 is a subspace of V.

Solution

Let x, y ∈ W1 ∩ W2 and let k be a scalar.

Then x, y ∈ W1, so x + y ∈ W1 and kx ∈ W1 (since W1 is a subspace of V), and

x, y ∈ W2, so x + y ∈ W2 and kx ∈ W2 (since W2 is a subspace of V).

Hence x + y ∈ W1 ∩ W2 and kx ∈ W1 ∩ W2.

Therefore W1 ∩ W2 is a subspace of V.

Comment: I learned how to prove that a given subset of a vector space is a subspace.

[Midterm Exam]

Spring 2017, LA Midterm Exam Solution (for grading) (50 min in-class exam)

Course: Linear Algebra   Prof. Sang-Gu Lee   Class # (mark): 41 or 42   Major:   Student No.:   Name:

Notice   http://matrix.skku.ac.kr/LA/

1. Fill out the above boxes before you start this exam. (Fill in your student number and name; the proctor signs.)

2. Honor Code: Cheating on this exam means the course grade will be an "F" and the case may be referred to the disciplinary committee.

3. You can go out only after the permission from proctors. (You may not leave the exam room before the proctor's instruction; when instructed to leave, submit your answer sheet to the proctor first.)

4. You may use the following in your answers. (Up to the midterm, answers in Korean are also OK.)

Total Score (100 pt): Offline Exam 86, Participation 14

var('a, b, c, d')                  # Define variables
eq1=3*a+3*b==12                    # Define equation 1
eq2=5*a+2*b==13                    # Define equation 2
solve([eq1, eq2], a, b)            # Solve the equations
A=matrix(QQ, 3, 3, [3, 0, 0, 0, 0, 2, 0, 3, 4])    # Matrix
x=vector([3, 1, 2])                # Define vector x
A.augment(x)                       # [A : x]
A.echelon_form()  or  A.rref()     # Find RREF
A.inverse()                        # Find inverse
A.det()                            # Find determinant
A.adjoint()                        # Find adjoint matrix
A.charpoly()                       # Find characteristic polynomial
A.eigenvalues()                    # Find eigenvalues
A.eigenvectors_right()             # Find eigenvectors
A.rank()                           # Find rank of A
A.right_nullity()                  # Find nullity of A
var('t')                           # Define variable
x=2+2*t                            # Define a parametric equation
y=-3*t-2
bool(A == B)                       # Are A and B the same?
var('x, y')                        # Define variables
f = 7*x^2 + 4*x*y + 4*y^2 - 23     # Define a function
implicit_plot(f, (x, -10, 10), (y, -10, 10))           # Implicit plot
parametric_plot((x, y), (t, -10, 10), rgbcolor='red')  # Plot
plot3d(y^2+1-x^3-x, (x, -pi, pi), (y, -pi, pi))        # 3D plot
A=random_matrix(QQ, 7, 7)          # random matrix of size 7 over Q
F=random_matrix(RDF, 7, 7)         # random matrix of size 7 over R
P,L,U=A.LU()                       # LU decomposition (P: permutation matrix; L, U: factors)
print P, L, U
h(x, y, z) = [x+2*y-z, y+z, x+y-2*z]
T = linear_transformation(U, U, h) # Linear transformation
print T.kernel()                   # Find a basis for kernel(T)
C=column_matrix([x1, x2, x3])
D=column_matrix([y1, y2, y3])
aug=D.augment(C, subdivide=True)
Q=aug.rref()
[G,mu]=A.gram_schmidt()            # Gram-Schmidt
B=matrix([G.row(i)/G.row(i).norm() for i in range(0,4)]); B
A.H                                # conjugate transpose of A
A.jordan_form()                    # Jordan canonical form of A

I. (1pt x 20= 20pt)  True(T) or False(F).  Let be a set of vectors in .

1. (  F  ) The set of all linear combinations of two vectors  and in is a plane.   (two nonzero vectors)

2. (  T  ) A set of vectors in that contains a zero vector is linearly dependent.

3. (  F  ) If {​, ​, ​} is a linearly independent set, then so is the set {​, ​, ​} for any scalar . (nonzero)

4. (  F  ) Any matrix can be written as a product of elementary matrices. (nonsingular)

5. (  F  )    is a linear transformation. (not LT)

6. (  F  ) If is surjective, . (injective)

7. (  F ) If is invertible, is an eigenvalue of (nonzero)

8. (  F  ) is an odd permutation. (even)

9. (  F  )  If , , in are independent vectors, then (nonzero)

10. ( T ) If is a real orthogonal matrix, then the linear mapping   preserves length.

11. (  F ) For a transformation , if ,  then it is called surjective.  (injective)

12 (  F ) If is linearly independent and is a subset of , then is linearly independent. (if is a subset of )

13 (  F ) For any matrix with ,    .   (only if nonsingular)

14 (  F ) Let . Then, columns   of span a row space of . (column space)

15 (  F ) In , vectors are always linearly independent.  (dependent)

16 (  F ) A line   forms a subspace through and parallel to . (only if is a zero vector)

17 (  F ) A normal vector is same as the orthogonal complement of . (no)

18. (  F  )  A normal vector of is . (no)

19. (  T  ) Let be an matrix. For any ( ) . ( : cofactors)

20. (  F ) The homogeneous system   for   always has a nontrivial solution if . (if )

II. (3pt x 5 = 15pt) State or Define (Fill the boxes and/or state).

1.  Vector Equation : A plane in can be uniquely obtained by passing through a point   and three nonzero vectors , and   in that are linearly independent. Let   be any point on , then can be expressed as a linear combination of , and .

=>

where , and   are parameters in (i.e. )

2.  For a point   and a plane , the distance from the point to the plane is

=>

3.   For vectors  ,   in , tell me how you can define the angle between and .

There exist ,       such that

=>

4. [Determinant] The determinant of an matrix is defined as

Let be an matrix. We denote the determinant of matrix as or and define it as follows.

■

5. [kernel] Let be a linear transformation. Then

The kernel of =

The set of all vectors in , whose image becomes by ,

is called kernel of and is denoted by . That is, .

III. (3pt x 13 = 39pts)  Find, Compute or Explain (Fill the boxes) :

1. Find of parabolic equation   which passes through   and . Vandermonde matrix.

Form a LSE and use Vandermonde matrix.  , , . Then .

A=matrix(QQ, 2, 2, [-1, 1, 2, 1])
A.det()
B=A.inverse()
y=matrix(QQ, 3, 1, [3, 5, 5])
By=B*y
print By

[-3]
[-4]
[-2]

A=matrix(QQ, 3, 3, [1, 1, 1, 4, 2, 1, 9, 3, 1])
A.det()
B=A.inverse()
y=matrix(QQ, 3, 1, [3, 5, 5])
By=B*y
print By

[-1]
[ 5]
[-1]

=> ,           ∴      ,           ,

2. Find the volume of parallelepiped which is generated by three vectors, , , and ..

Let , , .  The volume of parallelepiped is               ■

Double checked by Sage. http://math3.skku.ac.kr/home/pub/289

x1=matrix(2,1,[6,3])
x2=matrix(2,1,[2,7])
y1=matrix(3,1,[8,0,3])
y2=matrix(3,1,[0,-4,6])
y3=matrix(3,1,[4,2,-2])
A=x1.augment(x2)
B=y1.augment(y2).augment(y3)
a=A.det()
b=B.det()
print a.abs()
print b.abs()

36
16

3. Find the degree 3 polynomial     which passes through the following four points.

=>

A=matrix(3, 3, [1, 1, 1, 8, 4, 2, 27, 9, 3])
b=vector([-2, -2, 6])
Ai=A.inverse()
print "x=", Ai*b
print "x=", A.solve_right(b)

x= (1, -2, -1)

=>     ,       ,      ,

■

4. Let the characteristic polynomial of matrix be   . Find eigenvalues of matrix .

The eigenvalues of the matrix are   ,    ,    , and    .

5.  If   is a linear transformation, then the standard matrix    of T has the following relation for     where

6. You are given the eigenspaces of corresponding to the eigenvalues 0 and 5.

Show that they are orthogonal to each other in the plane.

The eigenspace of corresponding to   is = .

The eigenspace of corresponding to    is = .

For any   in     and any    in ,    [ Show   <, >  =  0 ]

= and =  for some t and s resp., then

<, > =< , >  =

=> and are orthogonal.

and are orthogonal to each other in the plane.         ■

7. Let be an matrix. Find the 2 statements which are not equivalent to "the matrix is invertible." (Choose two)

(1) Column vectors of are linearly dependent.

(2) Row vectors of are linearly independent.

(3)   has a unique solution .

(4) For any vector ,   has a unique solution.

(5) and are row equivalent.

(6) and are column equivalent.

(7)

(8)   is an eigenvalue of .

(9) : by   is injective

(10) : by   is surjective.

Answer:  (1)  and  (8).      ■

8. Fill out a Sage command and answer box to find eigenvalues of and corresponding eigenvectors.

A=matrix([[4, -1, 0, -1], [-6, -3, 6, -1], [0, -2, 4, -2], [6, 5, -6, 3]])

print A.eigenvectors_right()

[(2, [(1, 1, 2, 1)], 1), (-2, [(0, 1, 0, -1)], 1), (4, [(1, 0, 1, 0), (0, 1, 1, -1)], 2)]

Answer   eigenvalues of   =   2,  -2, 4,  4.

corresponding eigenvectors :  (1, 1, 2, 1), (0, 1, 0, -1), (1, 0, 1, 0), (0, 1, 1, -1)  in the order
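Each listed eigenpair can be verified directly from the definition A v = λ v; a NumPy sketch (an assumption, outside the original Sage session) is:

```python
import numpy as np

A = np.array([[4, -1, 0, -1],
              [-6, -3, 6, -1],
              [0, -2, 4, -2],
              [6, 5, -6, 3]], dtype=float)

# Eigenpairs reported by Sage's eigenvectors_right() above.
pairs = [(2, [1, 1, 2, 1]),
         (-2, [0, 1, 0, -1]),
         (4, [1, 0, 1, 0]),
         (4, [0, 1, 1, -1])]

for lam, v in pairs:
    v = np.array(v, dtype=float)
    assert np.allclose(A @ v, lam * v)   # A v = lambda v
print("all eigenpairs verified")
```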

9. Let   be moved by two linear transformations , where
,        .
Find .

,   =>     .

10.  Find the dimension of the null space of the following matrix
where   .

the dimension of the null space  =           ■

11. Let . Find .

12. Consider where   and . You were asked to find

(1) the augmented matrix [A : y], (2) the RREF, (3) the determinant, (4) the inverse of A, (5) the characteristic polynomial of A, (6) all eigenvalues of A, and (7) all eigenvectors of A. The following is your answer. Fill out the blanks to find each.

1) Step 1: Browse http://math3.skku.ac.kr or http://math1.skku.ac.kr/ (or https://cloud.sagemath.com etc.)
2) Step 2: Type the class/your ID: (     2017LA      )  and  PW: (      ****           )
3) Step 3: Click the "New worksheet" button.
4) Step 4: Define a matrix in the first cell over the rational (QQ) field.
   A = matrix(QQ,5,5,[7,-2,-2,1,0,3,0,-2,1,2,12,-4,-3,2,0,6,-8,-4,6,4,1,-2,-2,1,6])  and
   y = matrix(QQ, 5, 1, [0, 2, 4, 2, 1] )
5) Step 5: Type a command to find the augmented matrix [A : y]  (    A.augment(y)    )  and evaluate.
6) Step 6: Type a command to find the RREF              (      A.rref()       )  and evaluate.
7) Step 7: Type a command to find the determinant of A  (      A.det()        )  and evaluate.
8) Step 8: Type a command to find the inverse of A      (      A.inverse()    )  and evaluate.
9) Step 9: Type a command to find the characteristic polynomial of A  (    A.charpoly()    )  and evaluate.
...
10) Last step: Give a 'print' command to see what you want to read.

Now we have some output from Sage.

RREF =  Identity matrix of size 5

det  = 144

inverse =

[   -1   1/6   2/3 -1/12     0]

[ -4/3   2/3   2/3 -1/12  -1/6]

[   -4   1/3   7/3  -1/6     0]

[ -8/3   5/6   4/3  1/12  -1/3]

[ -7/6   1/6   2/3 -1/12   1/6]

characteristic polynomial of = x^5 - 16*x^4 + 95*x^3 - 260*x^2 + 324*x - 144

eigenvalues of = {  6 , 4 ,  3 ,  2 ,  1 }

eigenvectors  = [( 6 , [(0, 1, 0, 2, 2)], 1), (4, [(1, 1, 2, 3, 1)], 1), (3, [(1, 1, 2, 2, 1)], 1), (2, [(0, 1, 0, 2, 0)], 1), (1, [(1, 1, 3, 2, 1)], 1)]

Write what (   4,   [  (1, 1, 2, 3, 1)  ]  ,     1  ) means for Eigenvectors of :

(1, 1, 2, 3, 1) is the only linearly independent eigenvector of A corresponding to the eigenvalue 4. (That is, 4 = eigenvalue, (1, 1, 2, 3, 1) = corresponding eigenvector, 1 = algebraic multiplicity of the eigenvalue 4.)

V. (5 pt x 3 = 15pt)  Explain or give a sketch of proof.

1. Define a transformation :   by   . Show it is a matrix transformation. (so a LT)

where     .  So it is a LT.

2. Let a linear transformation transform any vector to its symmetric point with respect to the line that passes through the origin with slope . Find the transformation matrix with the aid of the following pictures.

Picture: The image of the standard basis by a symmetric transformation to the line with slope .

(Sol)       .

3.  Linear transformation (Linear operator): Let's define as the projection transformation, which maps any vector in to its projection on a line that passes through the origin and makes an angle with the -axis. For the given transformation , let's define as the corresponding standard matrix. As shown in the picture on the right-hand side,    <same direction with half length>. Now, by using the matrix representation of the symmetric transformation   ,   find the standard matrix for .

Figure   The relationship between symmetric  transformation  and projective transformation to the line with slope

(Sol)  =>

=>

[Final Exam]

Spring 2017, LA Final Comprehensive Exam (1 hour in-class exam), answer key for grading

Course: Linear Algebra GEDB003   Prof. Sang-Gu Lee   Class # (mark): 41 or 42   Year:   Student No.:   Name:

1. Fill out the above boxes before you start this exam. (Fill in your student number and name; the proctor signs.)

2. Honor Code: Cheating on this exam means the course grade will be an "F" and the case may be referred to the disciplinary committee.

3. You can go out only after the permission from proctors. (You may not leave the exam room before the proctor's instruction; when instructed to leave, submit your answer sheet to the proctor first.)

4. You may use the following in your answers. (Use only Math and English in your Final!!)

28 + 15 + 13 + 35 + 24 = 115 pts   Total Score: Offline Exam 115, Participation 15

I. (2pt x 14= 28pt)  True(T) or False(F).

1. (  T  )  Let be an matrix. For any ( ) where are cofactors.

2. (  F  ) Let be a dimensional subspace of . Then for some nonzero vector .

(Let be a dimensional subspace of . Then for some nonzero vector .)

3. (  T  ) If , then rank()rank()rank().

4. (  F ) Let be a basis for . For , any subset of is linearly dependent.

( Let be a basis for . For , any subset of is linearly dependent. )

5. ( T )  Let be a square matrix of order . Then is diagonalizable if and only if the sum of the geometric multiplicities of eigenvalues of is equal to if and only if have linearly independent eigenvectors if and only if each eigenvalue of has the same algebraic and geometric multiplicity.

6. ( F ) Suppose is a basis for . If is an singular matrix of order , show that the set is also a basis for .

(Suppose is a basis for . If is an invertible matrix of order , show that the set is also a basis for .)

7. ( T ) Let the decomposition be the singular value decomposition (SVD) of an matrix where are positive diagonal entries of and is nonsingular. Then = and can be expressed as where the columns of , , are the left singular vectors of and the columns of , , are the right singular vectors of . Also the matrix is a pseudo-inverse of , where . If has full column rank, then the least squares solution to is  .

8. ( F ) Suppose has the Euclidean inner product and is a unitary matrix. Then for and for each and all eigenvalues of .

(Suppose has the Euclidean inner product and is a unitary matrix. Then for and for each and all eigenvalues of .)

9. ( T )  If () is skew-Hermitian, then every eigenvalue of is a pure imaginary number.

10. ( F ) The set of invertible matrices of order is a subspace of the vector space .

(The set of invertible matrices of order is not a subspace of the vector space . )

11. ( T ) If the columns of are orthonormal in , then the column space of is an -dimensional subspace of .

12. ( F ) If are times differentiable on the interval and there exists such that Wronskian is not zero, then these functions are linearly dependent. Conversely if for some in , then are linearly independent.

(If are times differentiable on the interval and there exists such that Wronskian is not zero, then these functions are linearly independent. Conversely if for every in , then are linearly dependent.)

13. ( T ) If is an symmetric and positive definite matrix, then defines an inner product on . The well known Euclidean inner product is its special case when .

14. ( F ) Any -dimensional real vector space is isomorphic to and and .

(Any -dimensional complex vector space is isomorphic to and .)

II. (3pt x 5 = 15pt) State or Define.

1. State more than 5 things that you know, can do, or can find after studying in our LA class.

...

Finding the shortest distance between a point and a plane,

Finding solutions of LSE using Gauss-Jordan Elimination and/or inverse matrix and/or Cramer's rule,

Finding a basis for a given vector space,

Finding a matrix representation of a given linear transformation,

Finding eigenvalues and eigenvectors,

Finding the singular value decomposition (SVD) of an matrix .

...

2. Let be a basis for . Then we can obtain an orthonormal basis for from by the Gram-Schmidt orthonormalization process. Fill in the gaps (boxes) in the process.

We first derive an orthogonal basis for from the basis as follows:

[Step 1] Take .

[Step 2] Let be a subspace spanned by and let

.

[Step 3] Let be a subspace spanned by and and let

.

[Step 4] Repeat the same procedure to get

where

.

=> is orthogonal. By taking , we have an orthonormal basis for .
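The steps above can be sketched in code (NumPy assumed; the exam itself uses Sage's gram_schmidt()):

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis spanning the same space as `vectors`
    (assumes the input vectors are linearly independent)."""
    ortho = []
    for v in vectors:
        w = np.asarray(v, dtype=float)
        for u in ortho:
            # Step 2/3/4: subtract the projection onto each earlier u.
            w = w - np.dot(w, u) * u
        ortho.append(w / np.linalg.norm(w))   # normalize
    return ortho

# Sample (hypothetical) basis of R^3.
Q = np.array(gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]]))
print(np.round(Q @ Q.T, 10))   # identity => the rows are orthonormal
```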

3. [Procedure for diagonalizing a diagonalizable matrix ]

Step 1 : Find linearly independent eigenvectors of .

Step 2 : Construct a matrix whose columns are in this order.

Step 3 : The matrix diagonalize and

where are eigenvalues of .

The matrix has eigenvalues and corresponding eigenvectors . Find matrix diagonalizing and the associated diagonal matrix such that .

where

.      ■
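The three-step diagonalization procedure can be illustrated numerically (NumPy assumed; the 2x2 sample matrix below is hypothetical, since the exam's matrix entries did not survive in this copy of the report):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])        # sample matrix with eigenvalues 1 and 3

# Steps 1-2: eigenvectors of A form the columns of P.
eigvals, P = np.linalg.eig(A)

# Step 3: P diagonalizes A, i.e. P^(-1) A P = D = diag(eigenvalues).
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))
```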

4. [(General) Vector space]

If a set has two well-defined binary operations, 'vector addition (A)' and 'scalar multiplication (SM)', and for any and the two basic laws  A. .  SM. .

and the following eight laws hold, then we say that the set forms a vector space over with the given two operations, and we denote it by (simply if there is no confusion). Elements of are called vectors.

A1. .                    A2. .

A3.For any , there exists a unique element in such that  .

A4. For each element of , there exists a unique such that .

SM1. .             SM2. .

SM3. .         SM4. .

5. [Jordan Canonical Form] Write what you know about JCF and how are you going to find a JCF of a given matrix.

If a given matrix is diagonalizable, most computational problems involving that matrix can be handled easily and the desired conclusions obtained. However, not every matrix is diagonalizable. In this case, we use a method for finding the Jordan Canonical Form of a non-diagonalizable matrix by a similarity transformation.

For every square matrix (not necessarily diagonalizable), one can obtain a block-diagonal matrix called the Jordan canonical form matrix that is similar to .

Reference video: https://youtu.be/8fwPPOg8LW0 https://youtu.be/djd1XktKViA

Practice site: http://matrix.skku.ac.kr/LA/Ch-10/                 http://matrix.skku.ac.kr/JCF/

Theorem 10.1.1  Let be an matrix with ( ) linearly independent eigenvectors. Then is similar to a matrix

(or , where for some unitary matrix ). Furthermore, we have

,   ( , )

where each , called a Jordan block, corresponds to an eigenvalue of . The block diagonal matrix is called the Jordan canonical form of , and each is called a Jordan block of .

We can find the JCF in a Sage cell at http://sage.skku.edu/ (or http://math3.skku.ac.kr , http://math1.skku.ac.kr/ , https://cloud.sagemath.com etc.) by using the following commands (or other ways ...):

A = matrix(QQ, 5, 5, [7, -2, -2, 1, 0, 3, 0, -2, 1, 2, 12, -4, -3, 2, 0, 6, -8, -4, 6, 4, 1, -2, -2, 1, 6] )

A.jordan_form()   # Find Jordan Canonical Form of A

...

III. (8+5=13pts)  Find, Compute or Explain (Fill the spaces) :

1. We defined a matrix and did other work as follows in a Sage cell at http://sage.skku.edu/.

And you found the following output. Please explain what you did [in empty spaces] as much as you can.

A = matrix(QQ, 5, 5, [7, -2, -2, 1, 0, 3, 0, -2, 1, 2, 12, -4, -3, 2, 0, 6, -8, -4, 6, 4, 1, -2, -2, 1, 6] )
#A = random_matrix(QQ, 100, 100)            # 1. What does this mean?
y = matrix(QQ, 5, 1, [1, -1, 0, 1, 2])
F = A.augment(y)
print F
print F.echelon_form()
print A.det()
print A.inverse()                           # 2. What does this mean?
print A.charpoly()
print A.eigenvalues()
print A.eigenvectors_right()                # 3. What does this mean?
print A.jordan_form()                       # 4. What does this mean?
var('x, y')
f = 3*x^2 - sqrt(3)*x*y - y^2 - 25
implicit_plot(f==0, (x,-10,10), (y,-10,10))
[G,mu] = A.gram_schmidt()                   # 5. What does this command mean?
B = matrix([G.row(i)/G.row(i).norm() for i in range(0,4)]); B

[ 7 -2 -2  1  0  :  1]
[ 3  0 -2  1  2  : -1]
[12 -4 -3  2  0  :  0]
[ 6 -8 -4  6  4  :  1]
[ 1 -2 -2  1  6  :  2]      # augmented matrix [ A : y ]

[     1      0      0      0      0   -5/4]
[     0      1      0      0      0 -29/12]
[     0      0      1      0      0   -9/2]
[     0      0      0      1      0 -49/12]
[     0      0      0      0      1 -13/12]

144      # det of A

[   -1   1/6   2/3 -1/12     0]
[ -4/3   2/3   2/3 -1/12  -1/6]
[   -4   1/3   7/3  -1/6     0]
[ -8/3   5/6   4/3  1/12  -1/3]
[ -7/6   1/6   2/3 -1/12   1/6]

x^5 - 16*x^4 + 95*x^3 - 260*x^2 + 324*x - 144      # 6. What is the above?

[6, 4, 3, 2, 1]      # eigenvalues of A

[(6, [(0, 1, 0, 2, 2)], 1), (4, [(1, 1, 2, 3, 1)], 1), (3, [(1, 1, 2, 2, 1)], 1), (2, [(0, 1, 0, 2, 0)], 1), (1, [(1, 1, 3, 2, 1)], 1)]
# 7. What are the algebraic and geometric multiplicities of the eigenvalue 1 of A?

[ 6 | 0 | 0 | 0 | 0 ]
[ 0 | 4 | 0 | 0 | 0 ]
[ 0 | 0 | 3 | 0 | 0 ]
[ 0 | 0 | 0 | 2 | 0 ]
[ 0 | 0 | 0 | 0 | 1 ]      # Jordan form of A

# 8. What do the following vectors mean?
[      29/1158*sqrt(1158)         7/579*sqrt(1158)         1/579*sqrt(1158)         1/193*sqrt(1158)        -3/386*sqrt(1158)]
[-301/2684*sqrt(1342/579)  127/1342*sqrt(1342/579) 1277/2684*sqrt(1342/579) -801/2684*sqrt(1342/579)   -75/244*sqrt(1342/579)]
[-859/9224*sqrt(2306/671) -235/4612*sqrt(2306/671) 2445/9224*sqrt(2306/671) 4219/9224*sqrt(2306/671) -143/9224*sqrt(2306/671)]
[     71/24*sqrt(10/1153)      -7/20*sqrt(10/1153)    611/120*sqrt(10/1153)   -251/120*sqrt(10/1153)     349/40*sqrt(10/1153)]

2. The graph of can be drawn with a Sage command in http://sage.skku.edu.

var('x, y')
f = 7*x^2 + 6*x*y + 7*y^2 - 200
implicit_plot(f==0, (x,-10,10), (y,-10,10))

And where => => Unit eigenvectors

, and     =>   (Answer the following questions))

(1) What is a new axis ? (2) Explain this graph of as much as you can.

(1) The new axes are obtained by rotating the given axes 45 degrees counterclockwise.

(2) The graph is the ellipse with semi-minor axis and semi-major axis , rotated 45 degrees clockwise; that is, in the new axes.

IV. (5pt x 7 = 35pts)  Find or Explain (Fill the boxes) :

1. Using the table below compute the dimension of for matrix :

 (a) (b) (c) (d) (e) Size of 3 13 2 3 4

, , .

 (a) 3 3 (b) 13 13 (c) 2 2 (d) 3 3 (e) 4 4

■

2. Consider  the linear transformation and defined by

and   (so and ).

respectively. Find the matrix representation of the composition transformation .

■

3. For , , , , , , , , let , , which are ordered bases for .

(1) Find the transition matrix .

(2) Find using the transition matrix when .

(1) Let  . comes from RREF of

(2)

4. Find the SVD(Singular Value Decomposition) of where and .

The singular values of are and .

Unit eigenvectors of corresponding to eigenvalues are , respectively.

Unit eigenvectors of corresponding to eigenvalues are , , , resp. =>,

=>     ■

5. The SVD of is given. Find its pseudo-inverse .

Then .   (where and .)       =    ■
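The recipe A† = V Σ† Uᵀ can be sketched numerically (NumPy assumed; the small matrix below is a hypothetical stand-in, since the exam's concrete matrix did not survive in this copy):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])                 # sample full-column-rank matrix

# SVD: A = U diag(s) V^T (thin form).
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Pseudo-inverse: invert the nonzero singular values and transpose back.
sigma_plus = np.diag(1.0 / s)
A_plus = Vt.T @ sigma_plus @ U.T

# For full column rank, A_plus @ A is the identity.
print(np.round(A_plus @ A, 10))
```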

6. Find the least squares line passing through the four points (1,2), (2,3), (3,4), (4,5).

From , we have a linear system . Let , , and .     =                  ■
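For these four points the normal-equation computation can be checked in code (NumPy assumed); the fit here is exact, y = x + 1:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 3.0, 4.0, 5.0])

# Model y = c0 + c1*x: design matrix with a column of ones and a column of x.
A = np.column_stack([np.ones_like(x), x])

# Least squares via the normal equations A^T A c = A^T y.
coef = np.linalg.solve(A.T @ A, A.T @ y)
print(coef)
```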

7. Let be the set of all continuous functions from the interval to the complex set . If the addition and scalar multiple of these functions are defined as for , then is a complex vector space with respect to these operations. Then in where are continuous functions from to .

For , define the following inner product . Then is a complex inner product space. Let . With the above inner product, the Cauchy-Schwarz inequality is given by

.

V. (6pt x 4 = 24pts) Give a sketch of proof or Explain or Fill the Box :

1.  A linear system has a solution if and only if .

Let , , . Then the linear system can be written as

.    (1)

Hence we have the following:

has a solution.

There exist satisfying the linear system (1).

is a linear combination of the columns of .

Col

.

2. For matrices A, B with the product AB defined, show Null(B) ⊆ Null(AB).

[ Show x ∈ Null(B) ⟹ x ∈ Null(AB) ]

Let x ∈ Null(B), so Bx = 0. Then (AB)x = A(Bx) = A0 = 0. This implies Null(B) ⊆ Null(AB).

3. Let be a square matrix. Then is orthogonally diagonalizable if and only if the matrix is symmetric.

(⇒) Suppose A is orthogonally diagonalizable. Then there exists an orthogonal matrix P and a diagonal matrix D such that P^T A P = D.

[ Show A^T = A ]

Since P^T = P^(-1), we have A = P D P^T.

Hence

A^T = (P D P^T)^T = P D^T P^T = P D P^T = A.

Therefore, A is symmetric.
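A numeric illustration of the forward direction (NumPy assumed; the symmetric sample matrix is hypothetical):

```python
import numpy as np

A = np.array([[7.0, 3.0],
              [3.0, 7.0]])                 # symmetric sample matrix

# eigh returns an orthogonal eigenvector matrix P for symmetric input.
eigvals, P = np.linalg.eigh(A)

# Orthogonal diagonalization: P^T A P is diagonal.
D = P.T @ A @ P
print(np.round(D, 10))
```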

4.  If (Hermitian), then all the eigenvalues of are real numbers.

Let λ be an eigenvalue of A; that is, there exists a nonzero vector x such that Ax = λx. Multiplying both sides on the left by x* (the conjugate transpose of x), we get x*Ax = λ x*x. Hence λ = (x*Ax)/(x*x). Since x*x = ||x||² is a nonzero real number, we just need to show that x*Ax is a real number. [ Show (x*Ax)* = x*Ax ]

(x*Ax)* = x*A*x = x*Ax    (x*Ax is a scalar, and A* = A)

so x*Ax equals its own conjugate, that is, x*Ax is real.

Therefore, λ = (x*Ax)/(x*x) and all the eigenvalues of A are real numbers.
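This can be illustrated numerically (NumPy assumed; the 2x2 Hermitian matrix below is a hypothetical example):

```python
import numpy as np

A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])              # A equals its conjugate transpose

assert np.allclose(A, A.conj().T)          # Hermitian check

eigvals = np.linalg.eigvals(A)
# Imaginary parts are zero (up to round-off): the eigenvalues are real.
print(np.max(np.abs(eigvals.imag)))
```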

Final  (What do you think?)

◆ State one page of your final comment after you finish this PBL reports.

First of all, I would like to thank my professor and all my colleagues for making this semester wonderful and amazing, even though it was very challenging for me. I have learned a lot through this course, and the system of this course was very helpful to me; even though this environment is totally different from ours, it added a lot of useful things for my future.

Secondly, the PBL report is the best way of evaluating our work and activities over the whole semester. The PBL report is also one of the best learning methods; after preparing this work I realized how much work I have put into this report, and what I gained through it was very helpful to me and improved my skills. I have also made this PBL report a reference for the future, for any study related to Linear Algebra.

Finally, I have learned so many things related to mathematics which can be used in our daily life. The course was totally new for me, and after taking this class I am totally satisfied with all the activities. In the future I will be able to apply most of the techniques and ideas which I have learned through this class and this system.

After finishing the LA PBL report

[September PBL]

While writing the PBL report over the Chuseok holiday, many thoughts came to mind. In the first week or two, my graduate course registration was processed late, so I could not access i-Campus. At the time I also thought my registration had ultimately failed, so I missed the second class. For these reasons it took me longer than my fellow students to adapt to the new style of class and learning, and my class participation was quite low. Writing this PBL report let me see my class participation and the state of my learning objectively, and it became an occasion for much self-reflection. Participating more actively in the Q&A in order to write the report also helped me become more familiar with the class format. Moreover, it was not merely a report for submission: the process of producing it taught me a great deal and organized what I had learned so far, so it was truly a valuable time.

In graduate school I major in economics, and I am particularly interested in analysis using dynamic programming in macroeconomics. I still have much of LA left to study, but learning in this course this semester has given me a deeper understanding of the concepts and techniques I had actually used in my analyses, and I look forward to more. If I have time to spare while studying, I plan to practice applying what I learned to my own field. Once again, I thank the professor, who always gives good lectures with passion.

[October PBL]

After submitting the PBL before the first midterm, I could not invest as much time as usual in studying linear algebra because of other exam preparation and a thesis project. While writing the PBL and counting my class participation, I could see this more clearly and reflect on it. Previously, when programming, I did most of my analysis with pre-written code, so I did not really understand how linear algebra fit into it. But as I continued studying my major, I came to understand better how linear algebra applies to my field.

[November PBL]

As the professor began assigning homework in earnest, this was a time of solving more problems on my own. Even before, the flipped-class format enabled exchanges with other students and self-directed learning, but it shone even more while doing the assignments. Things that I had understood somewhat abstractly before encountering many problems became more concrete as I solved them myself. It was also a chance to check again the parts I had not fully understood.

[December PBL]

I first decided to take linear algebra in order to understand my own field a little better. In studying, I cannot fully learn something unless I understand it in my own way; by understanding I mean starting from the question of why it is used and why I need it. The linear algebra theory I had learned piecemeal while studying economics lacked that kind of understanding, and I was able to fill that gap while studying linear algebra this semester. In most complicated analyses one mostly ends up using code already written by leading scholars, but now I believe I use that code understanding why the theory inside it is needed. Because of the new learning method and the new theories the semester was a bit hard, but I gained just as much. Once again, I thank the professor for showing the students so much passion and affection throughout the semester.

**************************************

September: It had been a long time since my undergraduate classes, and this was my first experience of a flipped class. From the first lecture I was confused and had difficulty figuring out how to participate. At first I did not know the video lectures were available on i-Campus, so I remember searching for the videos on Google and studying from the lecture notes. Now I am gradually getting used to this class. Since the professor uploads a lot of material, I could study much more easily by previewing the lessons with it. Seeing other students post on the Q&A board also motivated me, and reading those posts helped me quickly firm up concepts I did not know.

October: Now we have had the first class of October, and since I have a bit more breathing room than at the start of the semester, I feel I should participate more diligently and post my three questions each week without putting them off. This new style of class is a bit demanding, but being able to ask many questions is probably its foremost strength.

November: As I gradually adapted to the class, posting on the Q&A board became a bit easier. Through the homework I could apply what we learned in class to actual problem solving. Also, while doing the Chapter 7 and Chapter 8 problem assignments, my teammates and I corrected each other's problems, and in doing so I learned about systematic, mathematically established methods of solution.

December: I think I learned a lot while correcting and finalizing other students' problems. I also shared with classmates how to typeset formulas, how to build matrices, and how to publish Sage code, and seeing replies saying it was helpful made me glad I could contribute to the class, however modestly.

***********************************

[September-October]

Unlike the traditional format of listening to the professor's lecture in class, the flipped-learning format, in which we study in advance online and then ask questions and discuss with the professor and students in the offline class, felt somewhat awkward early in the semester. However, through the weekly process of summarizing the lecture content or asking questions, I tried to understand the lectures more actively, and seeing other students' questions and answers on the online board taught me in detail even things I had not known, so I gradually came to consider it a format with many advantages. Also, rather than simply solving math problems, it was a good time to understand abstract mathematical concepts together through discussion and to think about problems I would have passed over had I studied alone.

[November-December]

This was a period in which I actively tried to make up for my somewhat sluggish participation on the online board before the midterm. Making the most of the flipped-learning format by solving and sharing problems with other students was a good way to organize the concepts learned in linear algebra toward the end of the semester. Also, presenting our individual reports in class let me see how other students solved and organized problems. I think the linear algebra I took this semester will be a great help in my future study of economics. Toward the end of the semester I regret not having participated even more actively, but thanks to the professor, who encouraged the students' active participation to the very end, I did not give up and learned a great deal. Thank you.

[Evaluation, 2017 Spring ]

College: Natural Science / Non-major / International (English-medium) / Enrolled: 24 / Offline / Course No.: GEDB003, Section ** / Course: Linear Algebra / Respondents: 17 / Evaluation score: 88

Response scale: Strongly disagree / Disagree / Neutral / Agree / Strongly agree

Q4. The professor conducted the course as stated in the syllabus throughout the semester.
    Mean 8.60, SD 1.50: 0 (0.0%) / 0 (0.0%) / 3 (17.6%) / 6 (35.3%) / 8 (47.1%)
Q5. I was able to gain new knowledge and perspectives in this class.
    Mean 8.60, SD 1.70: 0 (0.0%) / 0 (0.0%) / 4 (23.5%) / 4 (23.5%) / 9 (52.9%)
Q6. Assignments, readings, papers, quizzes, and exams were pertinent to the subject and enhanced learning.
    Mean 8.40, SD 1.80: 0 (0.0%) / 0 (0.0%) / 5 (29.4%) / 4 (23.5%) / 8 (47.1%)
Q7. The professor's enthusiasm and commitment stimulated my motivation and increased my interest in this class.
    Mean 8.10, SD 1.80: 0 (0.0%) / 1 (5.9%) / 3 (17.6%) / 7 (41.2%) / 6 (35.3%)
Q8. The means to communicate with the professor was always available.
    Mean 8.70, SD 1.60: 0 (0.0%) / 0 (0.0%) / 3 (17.6%) / 5 (29.4%) / 9 (52.9%)
Q9. Attending this course extended my knowledge and gave better insights into the subject matter.
    Mean 8.50, SD 1.80: 0 (0.0%) / 0 (0.0%) / 5 (29.4%) / 3 (17.6%) / 9 (52.9%)

Course: Linear Algebra (선형대수학), GEDB003, section **, College of Natural Sciences, non-major, English-medium, offline class; 34 enrolled, 25 respondents; overall evaluation score 90.
Responses are listed as: Strongly disagree / Somewhat disagree / Neutral / Somewhat agree / Strongly agree.

Q4. The professor conducted the course as stated in the syllabus throughout the semester.
    Mean 8.10, SD 2.30; 1 (4.0%) / 2 (8.0%) / 3 (12.0%) / 8 (32.0%) / 11 (44.0%)
Q5. I was able to gain new knowledge and perspectives in this class.
    Mean 8.60, SD 1.90; 0 (0.0%) / 1 (4.0%) / 5 (20.0%) / 5 (20.0%) / 14 (56.0%)
Q6. Assignments, readings, papers, quizzes, and exams were pertinent to the subject and enhanced learning.
    Mean 8.20, SD 2.30; 1 (4.0%) / 2 (8.0%) / 3 (12.0%) / 6 (24.0%) / 13 (52.0%)
Q7. The professor's enthusiasm and commitment stimulated my motivation and increased my interest in this class.
    Mean 8.20, SD 2.40; 1 (4.0%) / 2 (8.0%) / 4 (16.0%) / 5 (20.0%) / 13 (52.0%)
Q8. The means to communicate with the professor were always available.
    Mean 8.60, SD 1.70; 0 (0.0%) / 1 (4.0%) / 3 (12.0%) / 8 (32.0%) / 13 (52.0%)
Q9. Attending this course extended my knowledge and gave better insights into the subject matter.
    Mean 8.20, SD 2.20; 1 (4.0%) / 1 (4.0%) / 4 (16.0%) / 7 (28.0%) / 12 (48.0%)

[Evaluation, 2017 Fall]

Fall 2017 (2nd semester) final evaluation, Linear Algebra, Prof. Sang-Gu Lee (이상구), English class.
Course: Linear Algebra (선형대수학), GEDB003, section 01, College of Natural Sciences (Humanities and Social Sciences Campus), English-medium, offline class; 13 enrolled, 7 respondents; overall evaluation score 91.
Responses are listed as: Strongly disagree / Somewhat disagree / Neutral / Somewhat agree / Strongly agree.

Q4. The professor conducted the course as stated in the syllabus throughout the semester.
    Mean 8.60, SD 3.00; 1 (14.3%) / 0 (0.0%) / 0 (0.0%) / 1 (14.3%) / 5 (71.4%)
Q5. I was able to gain new knowledge and perspectives in this class.
    Mean 8.60, SD 3.00; 1 (14.3%) / 0 (0.0%) / 0 (0.0%) / 1 (14.3%) / 5 (71.4%)
Q6. Assignments, readings, papers, quizzes, and exams were pertinent to the subject and enhanced learning.
    Mean 8.60, SD 3.00; 1 (14.3%) / 0 (0.0%) / 0 (0.0%) / 1 (14.3%) / 5 (71.4%)
Q7. The professor's enthusiasm and commitment stimulated my motivation and increased my interest in this class.
    Mean 8.60, SD 3.00; 1 (14.3%) / 0 (0.0%) / 0 (0.0%) / 1 (14.3%) / 5 (71.4%)
Q8. The means to communicate with the professor were always available.
    Mean 8.90, SD 3.00; 1 (14.3%) / 0 (0.0%) / 0 (0.0%) / 0 (0.0%) / 6 (85.7%)
Q9. Attending this course extended my knowledge and gave better insights into the subject matter.
    Mean 8.90, SD 3.00; 1 (14.3%) / 0 (0.0%) / 0 (0.0%) / 0 (0.0%) / 6 (85.7%)

Suggestions (건의사항):
- Thank you so much, my Prof. ^^
- Nothing to add, because the way the class is run is already very helpful.
- Having a project/presentation in addition to exams and the PBL report would be better for some. This course is really good and also really enjoyable with Professor SG Lee. Thank you so much, Professor.
- Great!!
- Fine.

Other math courses I had taken felt impractical because the exams only asked for rote problem solving; this class, by contrast, felt like learning how mathematics is actually practiced in modern society.

Everything I learned was excellent; I hope to have the same kind of learning experience in my next classes.

The teaching method is good, but the gap between stronger and weaker students seems rather severe, so I think more support is needed so that weaker students can keep up.

First of all, I often felt that the course covered too much material; it was hard to keep it all organized. Also, class ends too late.

It would be good if you gave us more time to think after asking a question.

In addition to delivering the textbook content, I think an approach that applies software tools is also needed.

The professor's lectures were very good and easy to understand; in particular, I greatly appreciated how he continually reminded us of the subject's relevance and its applications, which kept us motivated to learn.



What led me to choose the linear algebra course was a close high-school friend of mine who went into the humanities track but loved mathematics. He had finished the entire science-track high-school math curriculum before even entering high school and was studying university mathematics in his first year. That got me thinking about linear algebra, the subject he was studying, so I chose it. Thanks to that, it has been a great help in the differential equations part of engineering mathematics.

Taking the linear algebra course felt like one long stretch of tension: writing Q&A posts, and worrying that I might not be able to answer when the professor called on me. Thanks to that, however, I was able to prepare for the exams with more ease. The exam problems were also not like those in other courses, where you can get by just knowing how to solve problems without understanding the concepts. (Rather than using the exam to measure how badly students slip up in their calculations, as if to say "you don't even know this," the exams asked students to show what they had learned and what problems, calculations, and tasks they could now do that they could not do before.) In other words, what will stay with me for a long time is that we were evaluated mainly on whether we understood the overall concepts, definitions, theorems, and proofs, whether we could explain them logically, and how well we understood the computational skills and algorithms needed to solve real-world problems.