Syllabus
Unit I
Vector Spaces: Vector spaces – Subspaces – Linear independence – Basis – Dimension; Inner Product Spaces: Inner products – Orthogonality – Orthogonal basis – Gram-Schmidt Process – Change of basis – Orthogonal complements – Projection onto a subspace – Least-Squares Principle – QR Decomposition.
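A minimal NumPy sketch of the Gram-Schmidt process covered in this unit (the `gram_schmidt` helper is illustrative only, not part of the prescribed material):

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A via the classical Gram-Schmidt process."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            # Subtract the projection of column j onto each earlier basis vector.
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)
    return Q

A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(2)))  # True: columns of Q are orthonormal
```

The same orthonormal columns (up to sign) appear as the Q factor of `np.linalg.qr(A)`, which connects this unit's Gram-Schmidt topic to its QR-decomposition topic.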
Unit II
Functions of several variables: Functions, limit and continuity. Partial differentiation, total derivatives, differentiation of implicit functions and transformation of coordinates by the Jacobian. Taylor's series for two variables. Vector Differentiation: Vector and Scalar Functions, Derivatives, Curves, Tangents, Arc Length, Curves in Mechanics, Velocity and Acceleration, Gradient of a Scalar Field, Directional Derivative, Divergence of a Vector Field, Curl of a Vector Field.
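The partial-derivative and gradient topics above can be sketched numerically with central differences; the `grad` helper below is a hypothetical illustration, not a prescribed implementation:

```python
import numpy as np

def grad(f, p, h=1e-6):
    """Numerical gradient of a scalar field f at point p via central differences."""
    p = np.asarray(p, dtype=float)
    g = np.zeros_like(p)
    for i in range(len(p)):
        e = np.zeros_like(p)
        e[i] = h
        # Central-difference approximation of the i-th partial derivative.
        g[i] = (f(p + e) - f(p - e)) / (2 * h)
    return g

f = lambda p: p[0]**2 + p[1] * p[2]   # scalar field f(x, y, z) = x^2 + yz
print(grad(f, [1.0, 2.0, 3.0]))       # analytic gradient is (2x, z, y) = (2, 3, 2)
```

The directional derivative along a unit vector u is then the dot product `grad(f, p) @ u`.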
Unit III
Eigenvalues and Eigenvectors: Eigenvalues and Eigenvectors, Diagonalization, Orthogonal Diagonalization, Quadratic Forms, Diagonalizing Quadratic Forms, Conic Sections. Similarity of linear transformations – Diagonalization and its applications – Jordan form and rational canonical form. Case Studies: Applications of least squares and image transformations.
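The orthogonal-diagonalization topic above can be illustrated with NumPy's symmetric eigensolver; this is a sketch under the assumption of a real symmetric matrix, not part of the prescribed material:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # real symmetric matrix
vals, vecs = np.linalg.eigh(A)      # eigenvalues ascending; eigenvectors orthonormal
# Spectral decomposition: A = Q diag(lambda) Q^T with orthogonal Q.
A_rec = vecs @ np.diag(vals) @ vecs.T
print(vals)                         # eigenvalues 1 and 3
print(np.allclose(A, A_rec))        # True: A is orthogonally diagonalizable
```

The same factorization diagonalizes the quadratic form x^T A x: in the eigenvector coordinates it becomes a pure sum of squares weighted by the eigenvalues.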
Objectives and Outcomes
Course Objectives
To understand the basic concepts of vector space, subspace, basis and dimension; also to understand orthogonality concepts and apply them to various problems in computer science.
Course Outcomes
CO1: To understand the basic concepts of vector space, subspace, basis and dimension.
CO2: To understand the basic concepts of inner product space, norm, angle, orthogonality and projection, to implement the Gram-Schmidt process, and to compute least-squares solutions.
CO3: To understand and compute linear transformations.
CO4: To compute eigenvalues and eigenvectors and apply them to transformation problems.
CO5: To perform case studies on least squares and image transformations.
CO-PO Mapping
| CO \ PO/PSO | PO1 | PO2 | PO3 | PO4 | PO5 | PO6 | PO7 | PO8 | PO9 | PO10 | PO11 | PO12 | PSO1 | PSO2 |
|-------------|-----|-----|-----|-----|-----|-----|-----|-----|-----|------|------|------|------|------|
| CO1         | 2   | 2   |     |     | 3   |     |     |     |     |      |      |      |      |      |
| CO2         | 2   | 2   |     |     | 2   |     |     |     |     |      |      |      |      |      |
| CO3         | 3   | 3   |     |     | 2   |     |     |     |     |      |      |      |      |      |
| CO4         | 2   | 2   |     |     | 1   |     |     |     |     |      |      |      |      |      |
| CO5         | 3   | 2   |     |     | 2   |     |     |     |     |      |      |      |      |      |
Evaluation Pattern
Evaluation Pattern: 70:30
| Assessment                    | Internal | External                     |
|-------------------------------|----------|------------------------------|
| Midterm                       | 20       |                              |
| *Continuous Assessments (CA)  | 50       |                              |
| **End Semester                |          | 30 (50 marks; 2-hour exam)   |
*CA – can be quizzes, assignments, lab practice, projects, and reports
**End Semester – can be a theory examination or a lab-based examination
Text Books / References
Textbook(s)
Howard Anton and Chris Rorres, “Elementary Linear Algebra”, Tenth Edition, John Wiley & Sons, 2010.
Reference(s)
Nabil Nassif, Jocelyne Erhel, Bernard Philippe, “Introduction to Computational Linear Algebra”, CRC Press, 2015.
Sheldon Axler, “Linear Algebra Done Right”, Springer, 2014.
Gilbert Strang, “Linear Algebra and Learning from Data”, Wellesley-Cambridge Press, 2019.
Kenneth Hoffman and Ray Kunze, “Linear Algebra”, Second Edition, Prentice Hall, 1971.
Mike X. Cohen, “Practical Linear Algebra for Data Science”, O’Reilly Media, 2022.
Lab Experiments
- Matrix operations, Generation of random matrices with given rank
- Solution to linear system of equations, Left Inverse, Right Inverse, Pseudo Inverse
- Revision of curve and surface plots using parametric representations
- Span of a set (scatter plots for span of different sets)
- Finding basis for row space, column space, null space and left null space
- Finding the orthogonal complement of a given vector space
- QR decomposition
- Projections onto subspaces, Least Square Approximation, Linear Regression
- Eigenvalues, eigenvectors, characteristic polynomial
- Similar matrices, diagonalization, Cayley-Hamilton Theorem
- Scaling, Shifting, Rotation of images using Linear Transformations
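As one possible sketch of the last experiment (scaling and rotation as linear transformations), assuming NumPy and representing an image's corner points as column vectors; the helper names are illustrative, not prescribed:

```python
import numpy as np

def rotation(theta):
    """2x2 rotation matrix for angle theta in radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

def scaling(sx, sy):
    """2x2 scaling matrix stretching x by sx and y by sy."""
    return np.diag([sx, sy])

# Corners of a unit square as columns of a 2x4 matrix.
square = np.array([[0.0, 1.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0, 1.0]])
rotated = rotation(np.pi / 2) @ square    # 90-degree counterclockwise rotation
scaled  = scaling(2.0, 0.5) @ square      # stretch x, shrink y
print(np.allclose(rotated[:, 1], [0.0, 1.0]))  # True: (1, 0) maps to (0, 1)
```

Shifting (translation) is not a linear map on these 2-vectors; in practice it is handled with homogeneous coordinates, i.e. 3x3 matrices acting on (x, y, 1).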