[PBL report Form Download(보고서 양식 다운로드)]
Korean (HWP): http://matrix.skku.ac.kr/PBL/PBL-Report-Form-Korean.hwp  <-- update this form for your semester
English (MS Word): http://matrix.skku.ac.kr/PBL/PBL-Report-Form-English.docx  <-- update this form for your semester
We have learned 'Basic Mathematics (matrices, derivatives, statistics)'
http://matrix.skku.ac.kr/intro-math4ai/W1/
http://matrix.skku.ac.kr/intro-math4ai/W2/
http://matrix.skku.ac.kr/intro-math4ai/W3/
http://matrix.skku.ac.kr/intro-math4ai/W4/
http://matrix.skku.ac.kr/intro-math4ai/W5/
http://matrix.skku.ac.kr/intro-math4ai/W6/
http://matrix.skku.ac.kr/intro-math4ai/W7/
http://matrix.skku.ac.kr/intro-math4ai/W8/
http://matrix.skku.ac.kr/intro-math4ai/W9/
http://matrix.skku.ac.kr/intro-math4ai/W10/
http://matrix.skku.ac.kr/intro-math4ai/W11/
http://matrix.skku.ac.kr/intro-math4ai/W12/
http://matrix.skku.ac.kr/intro-math4ai/W13/
http://matrix.skku.ac.kr/intro-math4ai/W14/
in order to understand, and be able to talk about, the following concepts during the 14 weeks (days) of this semester.
1. SVD(Singular Value Decomposition)
2. GDM(Gradient Descent Method)
3. Data and Covariance Matrix
4. PCA(Principal Components Analysis)
5. Rank Reduction and the role of SVD in PCA
6. BP(Back-Propagation) algorithm in ML(Machine Learning) and ANN(Artificial Neural Network)
● Sample Student HW/Reports [examples: questions/answers/activity records from students of previous semesters]
Math4AI-Summary : http://matrix.skku.ac.kr/Math4AI-Summary/
2021 Fall PBL report by two Freshmen (English) : http://matrix.skku.ac.kr/2021-Math4AI-Fall-PBL/
2021 Summer PBL report (English) : http://matrix.skku.ac.kr/2021-Final-PBL-E/
2021 Summer PBL report (Korean) : http://matrix.skku.ac.kr/2021-Final-PBL/
2020 Fall PBL report (Korean) : http://matrix.skku.ac.kr/2020-Math4AI-PBL/ (Basic Math for AI)
Sample 1 (Challenge Semester, week-7 final report): http://matrix.skku.ac.kr/2020-Math4AI-Final-pbl2/
Sample 2 (Challenge Semester, week-7 final report): http://matrix.skku.ac.kr/2020-math4ai-final-pbl/
Sample 3 (Challenge Semester, week-4 midterm report): http://matrix.skku.ac.kr/2020-Mid-PBL-2/
Sample 4 (Challenge Semester, week-4 midterm report): http://matrix.skku.ac.kr/2020-Mid-PBL-1/
Linear Algebra PBL report (English): http://matrix.skku.ac.kr/2018-album/LA-PBL.htm
Linear Algebra PBL record (Korean, PDF): http://matrix.skku.ac.kr/2015-album/2015-F-LA-Sep-Record.pdf
Calculus 1 PBL report: http://matrix.skku.ac.kr/Cal-Book1/Calculus-1/
Calculus 2 PBL report: http://matrix.skku.ac.kr/Cal-Book1/Calculus-2/
[ We can practice our code at http://matrix.skku.ac.kr/KOFAC/
○ {High School Math Review} Math Lab (practice lab)
9th grade Math (middle school year 3): http://matrix.skku.ac.kr/9th-Grade/
10th grade Math (high school year 1): http://matrix.skku.ac.kr/10th-Grade/
11th grade Math 1 (high school year 2, Math 1): http://matrix.skku.ac.kr/11th-Grade-1/
11th grade Math 2 (high school year 2, Math 2): http://matrix.skku.ac.kr/11th-Grade-2/
12th grade Math, Calculus (high school year 3, Calculus): http://matrix.skku.ac.kr/12th-Grade-1/
12th grade Math, Probability and Statistics (high school year 3, Probability and Statistics): http://matrix.skku.ac.kr/12th-Grade-2/ ]
(1) State more than 10 math definitions and concepts that you learned in the first 14 weeks (days).
http://matrix.skku.ac.kr/intro-math4ai/
1. Polynomial functions: A function f(x) that is a polynomial in x is called a polynomial function. The best-known examples are linear functions, quadratic functions, and, in general, nth-degree polynomials.
2. Rational functions: A rational function f(x) is a ratio P(x)/Q(x) of two polynomials P(x) and Q(x).
3. Vector and scalar: A quantity that does not depend on direction is called a scalar quantity. Vector quantities have two characteristics, a magnitude and a direction; scalar quantities have only a magnitude.
4. Rules for matrix operations: A + B = B + A, 1A = A, (ab)C = a(bC), (a + b)C = aC + bC, a(B + C) = aB + aC, …
5. Classification: Classification is the problem of identifying to which category a new data point belongs, based on the characteristics of the given data.
6. Gauss-Jordan elimination: Gaussian elimination is an algorithm for solving systems of linear equations by row-reducing the augmented matrix; Gauss-Jordan elimination continues the reduction to reduced row echelon form, so that when the system has a unique solution the left side of the augmented matrix becomes the identity matrix.
7. Least Squares Problem: Least-squares problems fall into two categories: linear or ordinary least squares and nonlinear least squares, depending on whether or not the residuals are linear in all unknowns. The linear least-squares problem occurs in statistical regression analysis; it has a closed-form solution.
8. SVD: The SVD (singular value decomposition) exists for every rectangular or square matrix; this is its key feature. The singular value decomposition of an m x n matrix A is the factorization A = U Σ V^T, where U is an m x m orthogonal matrix, V is an n x n orthogonal matrix, and Σ is an m x n rectangular (generalized) diagonal matrix with non-negative real numbers on the main diagonal. (A short NumPy sketch is given right after this numbered list.)
9. Limits of functions and optimal solutions: An optimal solution is a point at which a function defined on a set attains its maximum or minimum value. Finding an optimal solution uses limits and the generalized concepts and operations of derivatives, and techniques for finding approximate solutions help when an exact optimal solution is hard to obtain.
10. Local maximum and minimum: The derivative can be used to determine whether a function increases or decreases, based on the sign of the slope at a point (or on a given interval). A point where the slope changes (e.g., from decreasing to increasing, or from increasing to decreasing) is called a critical point, and for a differentiable function the derivative is zero at a critical point. The second derivative can be used to determine whether the function has a local maximum or a local minimum at a critical point, and it also helps when checking whether a function attains its absolute maximum or minimum on a given interval.
11. Fermat's theorem for extrema: Fermat's theorem essentially says that every local extremum (i.e., local maximum or minimum) of a function that occurs at a point where the function is differentiable must be a stationary point.
12. Gradient descent method: Gradient descent is an iterative optimization algorithm for finding a local minimum of a function. To find a local minimum using gradient descent, we take steps proportional to the negative of the gradient (i.e., we move in the direction opposite to the gradient) of the function at the current point.
13. Conditional probability: Conditional probability is an essential concept in data analysis, which is the probability that an event B occurs under the condition that an event A occurred.
14. Bayes' theorem: Bayes' theorem describes the probability of an event based on prior knowledge of conditions related to the event.
15. PCA: PCA converts a data set from a high-dimensional space into a lower-dimensional, easier-to-handle space while preserving the distribution of the original data as much as possible.
16. Artificial Neural Network (ANN): a computing system inspired by the biological neural networks that constitute animal brains. An ANN is based on a collection of connected units or nodes called artificial neurons, which loosely model the neurons in a biological brain.
17. Backpropagation (BP): an algorithm for the supervised learning of artificial neural networks using gradient descent. Given an artificial neural network and an error function, the method calculates the gradient of the error function with respect to the neural network's weights.
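A minimal NumPy sketch of the SVD factorization described in item 8 above; the 2 x 3 matrix here is only an illustration, not one from the course materials.

import numpy as np

A = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])        # any rectangular matrix works

U, s, Vt = np.linalg.svd(A)             # s holds the singular values
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)    # rebuild the m x n "diagonal" Sigma

print(np.allclose(A, U @ Sigma @ Vt))   # True: A = U * Sigma * V^T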
- Vector is a geometric object that has magnitude (or length) and direction. It can be represented as an ordered set of numbers arranged in columns.
- Matrix is a rectangular array or table of numbers, symbols, or expressions, arranged in rows and columns, which is used to represent a mathematical object or a property of such an object.
- Distance similarity measures how similar two data points are by the distance between the corresponding vectors. It is often used for data sets in which similar items take similar values.
- Cosine similarity measures data similarity by the angle between two vectors. It is often used for data sets in which the overall trend (direction) of the data matters.
- The least squares problem is the problem of minimizing the sum of the squares of the residuals, i.e., the differences between the observed data and the values predicted by each individual equation. It is often used in data fitting.
- QR decomposition is a decomposition of a matrix A into a product A = QR of an orthogonal matrix Q and an upper triangular matrix R.
- LU decomposition factors a matrix as the product of a lower triangular matrix and an upper triangular matrix.
- SVD is a factorization of a real or complex matrix. It generalizes the diagonalization of a square normal matrix with an orthonormal eigenbasis to any rectangular matrix.
- Limit is the value that a function (or sequence) approaches as the input (or index) approaches some value.
- Derivative measures the sensitivity to change of the function value (output value) with respect to a change in its argument (input value).
- GDM is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function.
- Expectation of a random variable X, often denoted E(X), is a generalization of the weighted average, and is intuitively the arithmetic mean of a large number of independent realizations of X.
- Variance is the expectation of the squared deviation of a random variable from its population mean or sample mean. Variance is a measure of dispersion, meaning it is a measure of how far a set of numbers is spread out from their average value.
- Standard deviation is a measure of the amount of variation or dispersion of a set of values. A low standard deviation indicates that the values tend to be close to the mean of the set, while a high standard deviation indicates that the values are spread out over a wider range.
- Bayes’ theorem describes the probability of an event, based on prior knowledge of conditions that might be related to the event.
- Covariance is a measure of the joint variability of two random variables. If the greater values of one variable mainly correspond with the greater values of the other variable, the covariance is positive. In the opposite case, when the greater values of one variable mainly correspond to the lesser values of the other, the covariance is negative. The sign of the covariance therefore shows the tendency in the linear relationship between the variables.
- Covariance matrix is a square matrix giving the covariance between each pair of elements of a given random vector.
- PCA is the process of computing the principal components and using them to perform a change of basis on the data, sometimes using only the first few principal components and ignoring the rest. It is commonly used for dimensionality reduction by projecting each data point onto only the first few principal components to obtain lower-dimensional data while preserving as much of the data's variation as possible. (A small NumPy sketch follows this list.)
- Linear regression is a linear approach to modelling the relationship between a scalar response and one or more explanatory variables. More simply, it can be understood as fitting the data with an appropriate line.
- ANNs are computing systems inspired by the biological neural networks that constitute animal brains. An ANN consists of an input layer, one or more hidden layers, and an output layer. Signals travel from the first layer (the input layer) to the last layer (the output layer), possibly after traversing intermediate layers multiple times.
- Back Propagation is a widely used algorithm for training feedforward neural networks. In fitting a neural network, backpropagation computes the gradient of the loss function with respect to the weights of the network for a single input–output example, and does so efficiently, unlike a naive direct computation of the gradient with respect to each weight individually.
- MNIST is a large database of handwritten digits that is commonly used for training various image processing systems. The database is also widely used for training and testing in the field of machine learning.
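A minimal NumPy sketch of the covariance-matrix / PCA idea described above; the small 2-D data set is made up purely for illustration.

import numpy as np

# Toy 2-D data: 5 samples (rows) x 2 features (columns), made up for illustration
X = np.array([[2.5, 2.4],
              [0.5, 0.7],
              [2.2, 2.9],
              [1.9, 2.2],
              [3.1, 3.0]])

Xc = X - X.mean(axis=0)                 # center the data
C = np.cov(Xc, rowvar=False)            # 2 x 2 covariance matrix

eigvals, eigvecs = np.linalg.eigh(C)    # eigenvectors of C are the principal components
order = np.argsort(eigvals)[::-1]       # sort by decreasing variance
pc1 = eigvecs[:, order[0]]              # first principal component

Z = Xc @ pc1                            # project onto PC1: 2-D -> 1-D reduction
print(eigvals[order], Z)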
(2) State more than 10 things that you know / can do / have found ... after studying the first 14 weeks (days).
1. Sketch graphs of polynomial functions
2. Make a composite function
3. Find solutions of equations
4. Perform matrix operation by applying operations on vectors and matrices
5. Check inverse and transpose of a matrix
6. Find inner product and angle of two vectors
7. Solve a system of linear equations
8. Find the SVD of a matrix
9. Find the derivative of a differentiable function numerically using a while loop
10. Find local maximum, local minimum, absolute maximum, and absolute minimum of a function
11. Find the minimum value of a function with the GDM code (a minimal sketch is given after this list).
12. Draw graphs of various differentiable functions, determine a small interval containing the local minimum identified from the figure, and apply the GDM code to each interval with a reasonable initial point x1.
13. Explain how SVD is used in PCA
14. Take a data matrix and apply PCA
15. Carry out dimension reduction using the principal components of the covariance matrix
16. Describe an ANN and the backpropagation algorithm simply, as I understand them
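A minimal gradient descent sketch in plain Python, corresponding to items 11 and 12 above; the function f(x) = x^4 - 3x^3 + 2, the starting point, and the learning rate are my own illustrative choices, not the course's GDM code.

def f(x):
    return x**4 - 3*x**3 + 2

def fprime(x):
    return 4*x**3 - 9*x**2

x = 3.0            # initial point x1, chosen near the local minimum seen on a graph
rate = 0.01        # learning rate
for _ in range(10000):
    step = rate * fprime(x)
    x = x - step   # move in the direction opposite to the gradient
    if abs(step) < 1e-8:
        break

print(x, f(x))     # converges to the local minimum near x = 2.25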
Fill in the below for your self-assessment and your project/term paper.
(A) Briefly describe your contributions through Q&A for yourself and fellow students in our "Basic Math4AI" class!
(A1) Quantity:
- Check your participation numbers in QnA for each day:
12/13: 0 12/14: 3 12/15: 3 12/16: 3
12/17: 2 12/18: 2 12/19: 0 12/20: 0
12/21: 5 12/22: 5 12/23: 1 12/24: 0
12/25: 1 12/26: 1 12/27: 0 12/28: 1
12/29: 0 12/30: 5
- Total number of sessions (Q: 2 , A: 26 , Others: 4 )
- Number of online attendances: ( 34 / 34 )
- Off-line attendance and number of absences: ( 3 / 3 ) (0 absence)
- Check your participation numbers in QnA for each week (Saturday to Friday):
Week(day) 1: 0 Week 2: 3 Week 3: 0 Week 4: 0
Week 5: 10 Week 6: 1 Week 7: 3 Week 8: 0
Week 9: 1 Week 10: 0 Week 11: 0 Week 12: 2
Week 13: 0 Week 14: 3 Week 15: 0
- Total number of sessions (Q: 1 , A: 13 , Others: 9 )
- Number of online attendances: ( 35 / 35 )
- Off-line attendance and number of absences: ( 3 / 3) (0 absence)
· Others Include some announcements or course related posts.
· I actively participated in the Q&A open problems and frequently shared my solutions with the class.
· I shared the study materials I found while studying the concepts with the class. This might have helped some colleagues understand the concepts better.
· I actively commented on other colleagues' solutions and contributed to finalizing them.
Ng Zhi Wei Self Introduction / Motivation
[Question] Open problem 11 by 박우현
Problem 1 by Ng Zhi Wei
Problem 2 by Ng Zhi Wei
Problem 3 by Ng Zhi Wei
Problem 14 by 안성준
Problem 15 by 안성준
Problem 5 by random_matrix() by Ng Zhi Wei
Problem 6 by Ng Zhi Wei
Problem 1 by 남상현
Day 7 Open Problem 1 by Ng Zhi Wei 응즈웨이 (using while loop) [Comment by: 우건주, 고재윤, 남상현, 윤상수]
[News] The Future of Jobs in the Era of AI (인공지능 시대의 진로와 직업)
Explained SVD in class (2nd WebEx meeting)
[Question] What exactly does 'Linear Regression' mean?
A brief summary of the Gradient Descent Method, and what I learned from the question post and comments by 윤상수 about the initial point of GDM
Finalise Code a Neural Network with Backpropagation In Python by 이우흠
using SVD to perform PCA
(A3) Number of "Final OK by SGLee" problems (and/or completed discussions/questions) in QnA in which your name is included.
22 problems
Final OK by SGLee [Finalized by 이예진] Dec. 14th, Summary of 1st WebEx meeting - How this class will proceed and who will get an A in this class. by 윤상수, 우건주, 김명규, 박우현, 고재윤, 안성준, 김지훈, 도경근, 이규리, 황세진, 응즈웨이, 이예진, 정운섭, 이우흠, 인유진, 김지훈, 신유정, 도경근, 남궁보민, 우건주, 강민지, 남상현, 이우흠, 인유진.
[Final OK by SGLee] open problem 4 solution by 정운섭, 황세진, 응즈웨이 (finding solutions to various equation using sage) and question
Open Problem 4 by Ng Zhi Wei (Cont. from [Final OK by SGLee] open problem 4 solution by 정운섭, 황세진 (finding solutions to various equation using sage) and question) [Comment by: 이예진]
ALL Must read This!! [Sample, Re-Finalized OK by SGLee] Open Problem 16 [Re-Finalized by Ng Zhi Wei(응즈웨이), 이규식, 남상현, 이예진, 고윤진, 고재윤, 도경은, 황세윤, 신유정, 박우현, 이예진, 남상현, 도경근, 이규식] (Solutions and Comments) Find SVD of a rectangular 4 by 5 matrix which is bigger than 3 by 3.
OK : the 2nd WebEx meeting (The student who wrote the original post should edit the title and body to include the names of the classmates who commented, so that it becomes easier for others to search.) 고재윤 / Electronic and Electrical Engineering, 우건주 / Advanced Materials Science and Engineering (Woo Gun Joo), Library and Information Science 2018 이규리 (Lee Kyuri), Systems Management Engineering 황세진 (Hwang Sejin), Electronic and Electrical Engineering 김지훈 (Kim Jihoon), Mechanical Engineering 박우현 (Park Woohyeon), 2019 응즈웨이, 2020 인유진, 2021 강민지, 2018 정운섭, 2020 윤상수, 2018 남상현, 2015 김명규, 이우흠-2018 (Yi), 안성준 2016, 2016 안성준, 박우현_2016, 2018 이규식, 정운섭_2018, 2018 이규리, 2015_김명규, 고재윤 2020, 2015_황세진, 안성준 2016, 2015_김명규, 2018 이우흠, 안성준 2016, 2018 도경근, 김지훈_2020
Open Problem 16 by 이우흠 Re-Finalized OK by SGLee] Open Problem 16 [Re-Finalized by Ng Zhi Wei(응즈웨이), 이규식, 남상현, 이예진, 고윤진, 고재윤, 도경은, 황세윤, 신유정, 박우현, 이예진, 남상현, 도경근, 이규식] (Solutions and Comments) Find SVD of a rectangular 4 by 5 matrix which is bigger than 3 by 3.
[Final OK by SGLee] Summary for 3rd Webex Meeting (12/28), Dear 응즈웨이 Ng Zhi Wei Software 3rd year and 이우흠/ LIYuxin/
(B) Quality of Your Participation:
· I especially remember the SVD. When I first listened to the lecture, there were many parts I did not understand, but I came to understand many of them through the explanations and discussions of my classmates. In addition, the part the professor explained further in the last WebEx meeting helped complete my understanding of the concept.
· Sketch graphs of polynomial functions
· Make a composite function
· Find solutions of equations
· Perform matrix operation by applying operations on vectors and matrices
· Check inverse and transpose of a matrix
· Find inner product and angle of two vectors
· Solve a system of linear equations
· Find the SVD of a matrix
· Find the derivative of a differentiable function numerically using a while loop
· Find local maximum, local minimum, absolute maximum, and absolute minimum of a function
· Find the minimum value of a function with the GDM code.
· Draw graphs for various differentiable functions and determine a small interval containing the local minimum identified from the figure, and apply the GDM code to each interval with a reasonable x1.
· Explain how SVD is used in PCA
· Take a data matrix and apply PCA
· Carry out dimension reduction using the principal components of the covariance matrix
· Describe an ANN and the backpropagation algorithm simply, as I understand them (a tiny numerical sketch follows this list)
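A tiny, self-contained backpropagation sketch in NumPy (one hidden layer, XOR data made up for illustration); it only illustrates the idea of propagating the error gradient backwards through the layers and is not the course's own code.

import numpy as np

rng = np.random.default_rng(0)

# Made-up training data: the XOR function, learned by a 2-4-1 network
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
lr = 0.5

for _ in range(20000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: apply the chain rule layer by layer (squared-error loss)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient descent step on every weight and bias
    W2 -= lr * (h.T @ d_out);  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);    b1 -= lr * d_h.sum(axis=0)

print(out.round(2))   # typically ends up close to [[0], [1], [1], [0]]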
(B2) What did you learn or feel while learning Basic Math4AI (Action Learning/PBL) with your classmates?
I received great help in understanding the lectures by exchanging opinions on problem solutions through discussions with my classmates. Problems that were difficult to understand alone became easier to understand, and it was also good to receive my classmates' opinions on the solutions I wrote.
I learned that visualizing an example or a process helps build intuitive understanding of a problem or concept. Some of my colleagues did very well at visualizing what they were explaining.
It was a good experience to study mathematics with students from such diverse backgrounds; I could learn from my colleagues' background knowledge in various fields.
I think sharing our answers in QnA, discussing them, and asking questions is a good way to learn: we gain a better understanding of the concepts and also see how other people approach the questions.
(B3) Write the names of YOUR PBL team members and team leader. I now understand the concepts below:
SVD decomposition, LU decomposition, QR decomposition, Euclidean distance, norm, vector-plane projection, determinants, limits, differentiation, least-squares solution, matrix and vector operations.
Subject: Basic Math4AI    Major: Software
Name/ID: Ng Zhi Wei 2019313851    Year: 3
Learning contents:
1. SVD (Singular Value Decomposition)
2. GDM (Gradient Descent Method)
3. Data and Covariance Matrix
4. PCA (Principal Components Analysis)
5. Rank Reduction and the role of SVD in PCA
6. BP (Back-Propagation) algorithm in ML (Machine Learning) and ANN (Artificial Neural Network)

Self-Checking (each activity rated Excellent / Good / Fair)
1. I have contributed to generating ideas and facts needed to resolve the issue. — Excellent
2. I proposed learning issues associated with our learning. — Excellent
3. When I studied alone, I used a variety of learning materials. — Excellent
4. I provided new information and knowledge in this class. — Excellent
5. I was actively involved in the discussions, and I asked many questions in order to understand them. — Excellent
6. I have made a contribution to the learning activities of our class. — Excellent
※ Please record the following items, considering your learning process.
1. Do you understand most of the contents of this learning process? Yes.
2. What kind of learning materials have you used to study? The lecture notes and lab at http://matrix.skku.ac.kr/intro-math4AI/, the lecture videos, the QnA board, and online resources.
3. What did you learn through the learning activities of this course? Using code, I solved equations and drew graphs. I learned to generate matrices and vectors and to calculate the angle or distance between two data points. I also learned about transposes, inverse matrices, and invertible matrices, and about the SVD, which is the most important topic. I used code to find the limit and derivative of a function and to draw its graph, and I learned how to find local minimum, local maximum, absolute maximum, and absolute minimum values, and how GDM finds critical values. I learned how AI techniques find the most efficient way to solve equations, learned the basics of the data-mining technology behind current AI, and saw various ways of learning from, searching, and compressing data efficiently. As I worked on the open problems, I learned the concepts and how to approach the questions in Sage.
4. What have you learned from the other colleagues? I found solutions to problems I did not know how to solve from the solutions of other classmates, and through discussion I learned how to approach problems from various perspectives. From the QnA board I learned a lot from how differently people see and approach the same questions.
5. Self-evaluation for Q/A activities: My score is 7/10. I think I contributed a fair amount in terms of giving answers, asking questions, and acknowledging other people's answers when I understood them.
6. Evaluation of other students: Many students participated actively. I personally think 고재윤 did a great job of visualizing concepts. 남상현: I think he has the largest number of useful QnA posts, which I read and learned from, so I chose him as my first peer-evaluation student; I think he deserves extra points. 이규리: She explained to me what was needed in the midterm PBL report last week, because many of the discussions were in Korean and I needed her help for clarification; I chose her as my second peer-evaluation student, and she also deserves extra points. Many classmates' solutions helped me, but the solutions of 고재윤, 윤상수, and 응즈웨이 are especially memorable; in particular, 응즈웨이's SVD problem solution was a great help.
Self-Evaluation 2
Subject: Basic Math4AI    Major: Software
Name/ID: Ng Zhi Wei 2019313851

Evaluation Items (Strongly disagree / Disagree / Mostly disagree / Mostly agree / Agree / Strongly agree)
1. I participated actively in both online and offline classes. — Strongly agree
2. I participated actively in the Q&A activity. — Agree
3. My questions and replies made on Q&A are relevant. — Agree
4. The information provided by my activity was useful for other students in the class. — Mostly agree
5. I enthusiastically took into consideration other students' opinions or points of view. — Agree
6. I contributed to the class by participating in Q&A discussions. — Agree
7. I am enthusiastic about taking another class with the same students I am taking this class with. — Agree
[Opinion]
► Satisfaction according to the Self-Evaluation: I am satisfied that I attended most of the lectures and participated a lot in Q&A, and I am satisfied with my contribution to this class. I am very happy with the way this class is conducted; it is rather new to me, but I think it is an effective way of learning.
► Sorrow according to the Self-Evaluation: There are still gaps in my understanding. Although I participated actively in the class, I want to participate even more actively and contribute more. Also, the QnA board is a little messy; rather than a single QnA board, I think using a discussion board split by day would work better.
Self-Evaluation 3
Subject: Basic Math4AI
Colleague's name: 고재윤
Name of evaluator: 윤상수, 강민지

Evaluation Items (Strongly disagree / Disagree / Mostly disagree / Mostly agree / Agree / Strongly agree)
1. I participated actively in both online and offline classes. — Strongly agree
2. I participated actively in the Q&A activity. — Agree
3. My questions and replies made on Q&A are relevant. — Agree
4. The information provided by my activity was useful for other students in the class. — Agree
5. I enthusiastically took into consideration other students' opinions or points of view. — Strongly agree
6. I contributed to the class by participating in Q&A discussions. — Agree
7. I am enthusiastic about taking another class with the same students. — Agree
[Opinion]
► Satisfaction according to the Evaluation: The solutions that 고재윤 wrote were a great help in understanding how to solve the problems. He visualized his thoughts well, and it helped the other classmates.
► Sorrow according to the Evaluation: I would like to try harder so that I can write as many solutions as 고재윤 did. There was no particular sorrow.
Self-Evaluation 3
Subject: Basic Math4AI    Major: Software
Name/ID: Ng Zhi Wei 2019313851

Evaluation Items (Strongly disagree / Disagree / Mostly disagree / Mostly agree / Agree / Strongly agree)
1. I participated actively in both online and offline classes. — Strongly agree
2. I participated actively in the Q&A activity. — Agree
3. My questions and replies made on Q&A are relevant. — Agree
4. The information provided by my activity was useful for other students in the class. — Mostly agree
5. I enthusiastically took into consideration other students' opinions or points of view. — Agree
6. I contributed to the class by participating in Q&A discussions. — Agree
7. I am enthusiastic about taking another class with the same students I am taking this class with. — Agree
[Opinion]
► Satisfaction according to the Self-Evaluation: I am very happy with the way this class is conducted; it is rather new to me, but I think it is an effective way of learning.
► Sorrow according to the Self-Evaluation: The QnA board is a little messy. Rather than using the QnA board, I think using a discussion board split into the different days would be better.
Self-Evaluation 4
Self-Introduction / Motivation for Taking the Course
Hello. I am 강민지, a 2021-entry student in the engineering track.
I have not yet entered a major, and while exploring which major would suit me I became interested in artificial intelligence. Although I have only taken Calculus so far, I enrolled in this course because I wanted to build on it and explore the mathematics underlying artificial intelligence.
I will participate diligently in the course activities and study the material faithfully. Thank you.
#Open Problem 1 (solved by 강민지)
I am sharing graphs that I plotted by modifying the code of the Week 1 textbook, http://matrix.skku.ac.kr/intro-math4ai/w1/, at http://matrix.skku.ac.kr/KOFAC/.
<Graphing a function>
1. Use the plot function.
2. For f(x) with domain a < x < b, enter it in the form plot(f(x), (x, a, b)).
3. The graph is drawn in blue by default; a color can be specified by adding color='...' at the end of the call.
4. If the function diverges, the output range can be limited with ymin and ymax at the end of the call.
Q (강민지): Is it correct that detect_poles='show' means the asymptotes are drawn?
A (Prof. 이상구): Yes. detect_poles='show' means that the asymptotes are displayed.
Comment: I wrote this post at the very beginning of the course, while still getting used to the unfamiliar class format, so at first I was not sure whether I had done it correctly. However, the replies saying that it had helped several classmates gave me confidence, and this became the starting point for leaving many more posts on the QnA board.
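As a rough illustration of steps 3 and 4 above, here is a hypothetical plain-Python/matplotlib equivalent; the course itself uses the Sage plot command at http://matrix.skku.ac.kr/KOFAC/, and tan(x) is only an example function, not the one from the original post.

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-5, 5, 2000)
y = np.tan(x)                    # a function that diverges near its poles
y[np.abs(y) > 50] = np.nan       # break the curve at the poles instead of drawing spikes

plt.plot(x, y, color='red')      # step 3: choose a color
plt.ylim(-10, 10)                # step 4: limit the output range, like ymin/ymax in Sage
plt.title('tan(x) with a clipped y-range')
plt.show()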
#Open Problem 2 (solved by 이예진)
(To help with understanding the class, I summarized part of the professor's lecture.)
The learning goal of Day 2 is to understand tuples, vectors, matrices, and tensors,
and, furthermore, to understand the meaning and uses of the various matrix operations.
Linear algebra is the branch of mathematics that deals with matrix operations, and it is one of the subjects that anyone who wants to learn Artificial Intelligence must take.
A tuple is a way of representing data as an ordered group of values; data consisting of n entries is called an n-tuple.
Ex) Kim's height, weight, age, and gender data = (160, 80, 19, 1) (a 4-tuple)
The main vector operations are the addition of vectors and multiplication by a scalar.
The following are the properties of vector operations.
◩ Open Problem 2
Sketch the graph for the function .
I plotted a function expressed as the product of a cosine function and a polynomial.
This time, unlike HW1 where only the range of x was specified, I plotted with a range for y as well as x. However, as shown below, the graph exceeded the specified ymin and ymax.
So I adjusted the range of y as follows.
I could confirm that the graph oscillates between about -0.3 and 0.3 in x and between about -3 and 3 in y, converges to 0 to the left of that region, and diverges to the right. The next figure shows the overall shape of the graph.
Also, from the graph drawn with the following code, after adjusting the range of x, we can see that the function has a great many zeros near the origin.
Thank you.
Comment: Narrowing the range of x made the overall shape of the function much clearer, which was helpful. When working through the graphing examples I used to simply run the code, but after reading this post I started changing the range of x to examine the shape of a function more effectively.
#Open Problem 3 (solved by 고재윤)
Open Problem 3
Make a composite function from the functions that you learned and draw a graph of it. [Hint: plot(sin(e^(1/3)^x), (x, -1.2, 10))]
I went to http://matrix.skku.ac.kr/KOFAC/ and wrote the code below to plot sin(e^(1/3)^x) from -1.2 to 10.
plot(sin(e^(1/3)^x), (x, -1.2, 10))
in http://matrix.skku.ac.kr/KOFAC/ gives
As you can see, it is coded simply in one line. The general form is plot(function, (variable, start of interval, end of interval)).
As a result, the graph below was produced.
To understand the function better, I also plotted the graph of e^(1/3)^x with the code below and examined the output.
plot(e^(1/3)^x)
in http://matrix.skku.ac.kr/KOFAC/ gives
Looking at the x > 0 part of the graph of e^(1/3)^x, we can see that it decreases and converges toward y = 1.
From this, I could infer why the x > 0 part of sin(e^(1/3)^x) comes out as a converging curve.
Conversely, for x < 0, the y values increase exponentially.
Since sine is a periodic function, I could infer why the x < 0 part of sin(e^(1/3)^x) is drawn as an oscillation.
Comment: I had approached the composite-function problem by simply plotting and analyzing the composite function itself, but I learned that separating and analyzing each component function is also a great help for understanding.
#Open Problem 4 (solved by 안성준)
I chose the equation sin(x) + sin(x^2) = exp(x).
When I entered the code as shown below, the equation was too complicated for the solution to be obtained directly.
To estimate the root from the shape of the graph, I entered the following.
I set the plotting range to (-pi, pi), one period of the sine function.
The result was an interesting graph.
To estimate the root more precisely, I narrowed the range.
After narrowing the range once more, the root could be seen even more clearly.
We can confirm that the root is close to -2.624.
Finally, here is the result of finding the root with find_root.
That is all. Thank you!
Comment: I knew that when an equation's roots are hard to obtain with the solve function, they can be estimated from a graph, but solving the problem directly in code made the concept much clearer. I also realized that drawing the graph is not the goal in itself: the ultimate purpose of drawing it is to find the roots and then analyze the max/min values.
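Since the original Sage code appears only as screenshots above, here is a minimal pure-Python bisection sketch of the same idea; the bracketing interval [-2.7, -2.5] is my own choice, based on the value -2.624 reported in the post.

import math

def g(x):
    # root of g(x) = 0  <=>  sin(x) + sin(x^2) = exp(x)
    return math.sin(x) + math.sin(x**2) - math.exp(x)

a, b = -2.7, -2.5          # g(a) > 0 and g(b) < 0, so a root lies in between
for _ in range(60):        # simple bisection
    m = (a + b) / 2
    if g(a) * g(m) > 0:
        a = m
    else:
        b = m

print((a + b) / 2)         # approximately -2.624, matching the graph-based estimate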
#Open Problem 5 (solved by 강민지)
<Solving equations>
Use the solve function: enter an equation in x in the form solve(equation, x).
The output lists the roots of the equation in the form [x == ...].
When the values are large or too complicated to handle, drawing a graph and finding an approximate value is a more efficient approach.
The graph can be drawn with the plot function, as in the previous lecture.
<Vector operations>
When data can be written as an ordered pair or a tuple, it can be treated as 2-dimensional or 3-dimensional (or n-dimensional) data, that is, as a point in the coordinate plane or in coordinate space (or in n-dimensional space).
To handle that point, we use a vector.
Vectors make visualization easier, and their properties allow various calculations.
Use the vector function: enter it in the form vector(coordinates).
Using (vector and print) or (var and plot3d), the result can be visualized and inspected.
1) For f(x) = 5/1-x^2, draw a red curve from 0.3 to 3^(1/2)+0.05, and shade the region between f(x) and the x-axis from 0.5 to 3^(1/2).
Here, sqrt computes the square root.
2) Integrate f(x) over the shaded region.
3) Print the approximate value to 5 digits.
1) Using the var function, declare the variables 'x, y, z' for 3-dimensional coordinates.
2) Using the plot3d function, draw the graph of x + y == 5 over the region (x, -5, 5), (y, -5, 5), (z, -5, 5).
Q (강민지): In the first problem, what does the simplify_full() function mean?
A (Prof. 이상구): The simplify_full() command expresses the complicated number above in a nicer, easier-to-understand, easier-to-use, simpler form.
Comment: In the first problem, marking the integral as a shaded region on the graph visualized the computation and made it easier to understand. In the second problem, the code helped me understand how a vector specifies a single point and how such points together produce a region. There was also some unfamiliar code, and I resolved my questions through the Q&A with the professor.
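As a small plain-Python/SymPy sketch of the same solve-then-approximate workflow; the equation x^2 - 2 = 0 and the interval [0, 1] are only illustrations, not the ones from the original post.

import sympy as sp

x = sp.symbols('x')

# Analogue of Sage's solve(equation, x): the roots come back as symbolic values.
roots = sp.solve(sp.Eq(x**2 - 2, 0), x)      # [-sqrt(2), sqrt(2)]

# Integrate over an interval and print a 5-digit approximation,
# like steps 2) and 3) of the post.
area = sp.integrate(x**2 - 2, (x, 0, 1))
print(roots, sp.N(area, 5))                  # [-sqrt(2), sqrt(2)] -1.6667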
#Open Problem 6 (solved by 강민지)
With the 6x6 matrix
[0 0 0 1 0 0]
[0 0 1 0 0 0]
[4 1 4 4 1 0]
[0 0 0 0 0 1]
[2 2 3 0 2 1]
[1 0 0 1 0 0]
I confirmed that its transpose exists but its inverse does not.
- Transpose:
exists.
- Inverse:
does not exist.
Comment: I built a 6x6 matrix and checked whether its transpose and inverse exist. Since I have only taken Calculus so far, this was my first encounter with matrices, and understanding the transpose and the inverse was a bit difficult, but through the lectures and internet searches I was able to understand them and solve the problem.
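The post's Sage output is not reproduced above, but the conclusion can be checked with a small NumPy sketch: the 2nd and 5th columns of the matrix are identical, so the matrix is singular and has no inverse.

import numpy as np

A = np.array([[0, 0, 0, 1, 0, 0],
              [0, 0, 1, 0, 0, 0],
              [4, 1, 4, 4, 1, 0],
              [0, 0, 0, 0, 0, 1],
              [2, 2, 3, 0, 2, 1],
              [1, 0, 0, 1, 0, 0]])

print(A.T.shape)                         # (6, 6): the transpose always exists
print(np.allclose(A[:, 1], A[:, 4]))     # True: columns 2 and 5 are identical
print(np.linalg.matrix_rank(A))          # 5 < 6, so A is singular
# np.linalg.inv(A) would raise LinAlgError, since det(A) = 0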
#Open Problem 7 (solved by 고재윤)
◩ Open Problem 7
What kinds of data can you apply the similarity measures we just discussed to?
"What kinds of data can you apply the similarity measures we just discussed (distance similarity) to?"
Using distance similarity, we can measure color similarity: define 3-dimensional vectors as RGB triples and then calculate the Euclidean distance between them.
One way to compare two or more pieces of data is to compute the distance between them.
The 'Euclidean distance' formula, d(A, B) = sqrt((a1-b1)^2 + (a2-b2)^2), gives the distance between the points A(a1, a2) and B(b1, b2) in the coordinate plane.
Based on this formula, we can measure the relatedness or similarity between two data points. A representative example: using the RGB table, we can easily find the similarity between two or more colors of light.
The figure above shows the RGB table.
The colors we see on electronic devices are combinations of the three primary colors of light, red (R), green (G), and blue (B); with these three, many colors such as those in the figure above can be expressed.
In other words, color data is represented as a 3-dimensional vector, and we can judge the similarity between color data by the distance between the vectors (distance similarity).
Suppose we have three color data points: color1 = [1, 2, 3], color2 = [11, 22, 33], and color3 = [12, 23, 34].
The components of each vector are [R, G, B], and the size of a component means the intensity of that channel.
To find out which of color1 and color2 is more similar to color3 (that is, to compute the distance similarity), I wrote the code below.
Here, the bool() command prints True if the condition inside the parentheses is true and False if it is false.
Running the code above produced the output below.
Therefore, we can see that color2 is more similar to color3 than color1 is, i.e., color2 has the higher similarity.
Comment: I could see how vector operations are brought in to compute RGB color similarity. I had found it hard to see how vector operations apply to real life, but after reading this post I understood, at least partially, what kinds of data a similarity measure can be applied to.
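The Sage code from the post is shown only as a screenshot, so here is an equivalent NumPy sketch of the same comparison:

import numpy as np

color1 = np.array([1, 2, 3])
color2 = np.array([11, 22, 33])
color3 = np.array([12, 23, 34])

d13 = np.linalg.norm(color1 - color3)   # Euclidean distance between color1 and color3
d23 = np.linalg.norm(color2 - color3)   # Euclidean distance between color2 and color3

print(d13, d23)          # about 39.03 and 1.73
print(bool(d23 < d13))   # True: color2 is closer to (more similar to) color3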
#Open Problem 8 (solved by 문태의)
[Open Problem 8]
For what kind of data can you not use this distance-based similarity measure? Are there any other measures you can think of? (Hint: vectors/data in the same direction)
"Can you think of data to which you cannot apply the similarity measure we just discussed (distance similarity)?"
One kind of data to which the 'Euclidean distance' approach cannot be applied well for measuring similarity is sentences.
For example, suppose we have the following sentences.
At a glance, the most similar sentences are 1 and 3.
Let us try to measure the similarity of these three sentences using the 'Euclidean distance' approach.
First, let us make a table of how many times each word appears in each sentence.
Sentence \ Word | 인공지능 (AI) | 수학 (math) | 과학 (science) | 어려워 (difficult)
Sentence 1 | 1 | 1 | 0 | 1
Sentence 2 |
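A minimal NumPy sketch of the cosine-similarity idea hinted at in the problem; only the sentence-1 counts come from the table above, and the vectors for sentences 2 and 3 are hypothetical.

import numpy as np

def cosine_similarity(u, v):
    # cos(theta) = (u . v) / (|u||v|): depends only on direction, not on length
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Word-count vectors over the words (인공지능, 수학, 과학, 어려워)
s1 = np.array([1, 1, 0, 1])   # sentence 1, from the table above
s2 = np.array([0, 1, 1, 1])   # hypothetical counts for sentence 2
s3 = np.array([2, 2, 0, 2])   # hypothetical: sentence 3 uses the same words as sentence 1, twice each

print(cosine_similarity(s1, s3))   # 1.0: same direction, so maximal similarity
print(cosine_similarity(s1, s2))   # about 0.67, even though s2 is closer to s1 in Euclidean distance than s3 is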