Singular Value Decomposition (SVD): A Complete Guide
Table of Contents
- What is Singular Value Decomposition (SVD)?
- Why Is SVD Used in Dimensionality Reduction?
- Singular Value Decomposition Applications
- SVD for Image Compression
- SVD for Removing Background from Videos
Linear algebra works as a bridge between theory and the practical implementation of concepts. So, it’s important to have a fair understanding of linear algebra to deepen your knowledge of machine learning algorithms, some of which involve tricky concepts. One such topic you must learn about is the singular value decomposition of a matrix for dimensionality reduction.
SVD is everywhere, especially in data science or while dealing with dimensionality reduction. But what does it mean, and how does it work?
This blog will answer all such questions and more in detail.
What is Singular Value Decomposition (SVD)?
Singular value decomposition in machine learning means the factorization of a matrix into a product of three matrices. It has some useful algebraic properties and conveys theoretical and geometrical insight about linear transformations.
As we learn about the SVD of a matrix, we will often come across the term ‘rank of the matrix’. So, first, let’s get familiar with the concept of rank of a matrix.
Rank of a Matrix
The rank of a matrix means the maximum number of linearly independent column or row vectors in the matrix.
If vector ‘r’ can’t be expressed as a linear combination of vectors r1 and r2, it is linearly independent of r1 and r2.
Consider three matrices, A, B, and C, with the following row relationships:
In matrix A, row 2 (r2) is a multiple of row 1 (r1), i.e., r2 = 2r1. So there is just one independent row, and Rank(A) = 1.
In matrix B, row 3 (r3) is the sum of row 1 and row 2, i.e., r3 = r1 + r2, while r1 and r2 are independent. So, Rank(B) = 2.
In matrix C, all three rows are independent of each other. So, Rank(C) = 3.
In simple words, the rank of a matrix represents the amount of unique information the matrix contains. The higher the rank, the more information it carries.
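The three cases above can be checked numerically. Here is a minimal NumPy sketch; the specific matrix entries are assumptions chosen only to match the row relationships described (r2 = 2r1 in A, r3 = r1 + r2 in B, all rows independent in C):

```python
import numpy as np

# Illustrative matrices; only the row relationships matter.
A = np.array([[1, 2],
              [2, 4]])        # r2 = 2 * r1  ->  rank 1

B = np.array([[1, 0, 1],
              [0, 1, 1],
              [1, 1, 2]])     # r3 = r1 + r2 ->  rank 2

C = np.array([[1, 0, 0],
              [0, 1, 0],
              [0, 0, 1]])     # all rows independent -> rank 3

for name, M in [("A", A), ("B", B), ("C", C)]:
    print(name, np.linalg.matrix_rank(M))  # A 1, B 2, C 3
```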
Singular value decomposition of a matrix deals with the decomposition of a matrix A into a product of three matrices: A = U S Vᵀ.
Let’s say the dimensions of A are m x n, then:
U = m x m matrix of left singular vectors.
S = m x n rectangular diagonal matrix of singular values that are arranged in descending order.
V = n x n matrix of right singular vectors.
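These three factors can be computed directly with NumPy. The sketch below uses a small random matrix as a stand-in for A (the shape 4 x 3 is an assumption for illustration) and verifies that the factors multiply back to the original:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))      # m = 4, n = 3

# full_matrices=True returns U (m x m), s (min(m, n),), Vt (n x n)
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Singular values come back sorted in descending order
assert np.all(np.diff(s) <= 0)

# Rebuild the m x n rectangular diagonal matrix S from the values
S = np.zeros((4, 3))
np.fill_diagonal(S, s)

# Reconstruction: A = U S V^T
print(np.allclose(A, U @ S @ Vt))    # True
```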
Why Is SVD Used in Dimensionality Reduction?
One may wonder why we should go through this tedious decomposition and take so much trouble. An alternate representation of the decomposition makes the reason clearer.
With decomposition, we can express the original matrix as a sum of rank-one matrices, each weighted by a singular value: A = σ1·u1·v1ᵀ + σ2·u2·v2ᵀ + … + σr·ur·vrᵀ, where ui and vi are the columns of U and V, and σi are the singular values.
Now, talking about practical applications, typically only the first few singular values (say, the first k) are large, while the remaining singular values approach zero. Therefore, we can drop all but the first k terms without missing out on much of the information.
So, we can conclude:
With singular value decomposition, we can represent large matrix A by three smaller matrices, i.e., U, S, and V.
This comes in handy in large computations.
To find a rank-k approximation of A, we select the first k singular values and truncate the three matrices accordingly.
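The truncation step can be sketched as a small helper function. The example matrix below is an assumption for demonstration; it happens to have rank 2, so its rank-2 approximation recovers it exactly:

```python
import numpy as np

def rank_k_approx(A, k):
    """Rank-k approximation of A via truncated SVD:
    keep the first k singular values and the matching
    columns of U and rows of V^T."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Each row is an arithmetic progression, so this matrix has rank 2.
A = np.arange(20, dtype=float).reshape(4, 5)
A2 = rank_k_approx(A, 2)

print(np.linalg.matrix_rank(A2) <= 2)  # True
print(np.allclose(A2, A))              # True: A already had rank 2
```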
Singular Value Decomposition Applications
Common applications of singular value decomposition are as follows:
It is widely used in data science, most notably for dimensionality reduction.
The SVD algorithm has some exceptional algebraic characteristics and offers relevant geometrical and theoretical insights associated with linear transformations.
It has great applications in science and engineering.
It is used in mathematics as well, such as calculating matrix approximation or the rank of a matrix.
It is used in statistics too, such as process control and least-squares fitting of data.
Before we discuss SVD applications in detail, here are a few key points to know:
The rank of a matrix is a measure of the unique information stored in it. The higher the rank, the more information it contains.
SVD of a matrix is the decomposition of a matrix into three matrices.
Eigenvectors of a matrix refer to the directions of maximum spread or data variance.
Singular values indicate the relative importance of the different features (directions) captured in a matrix.
SVD for Image Compression
We keep clicking beautiful pictures on our smartphones until, one day, we realize there is no memory or space left. We have all been through this situation, right? Hence, we use image compression to shrink the image to an acceptable quality, which lets us store more pictures in the same space as before.
We use singular value decomposition in image processing to compress images. It leverages the fact that only the first few singular values obtained from SVD are large. We can truncate the three matrices to those first few singular values and get a compressed approximation of the original picture.
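As a minimal sketch of the idea: a grayscale image is just a matrix of pixel intensities, so truncating its SVD compresses it. The synthetic 64 x 64 "image" and the choice k = 10 below are assumptions for illustration; in practice you would load real pixel data:

```python
import numpy as np

# Synthetic grayscale "image" (stand-in for real pixel data).
rng = np.random.default_rng(1)
img = rng.random((64, 64))

U, s, Vt = np.linalg.svd(img, full_matrices=False)

k = 10  # keep only the first k singular values
compressed = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Storage cost: k*(m + n + 1) numbers instead of m*n pixels.
original_size = img.size                                  # 4096
compressed_size = k * (img.shape[0] + img.shape[1] + 1)   # 1290
print(original_size, compressed_size)
```

Storing the truncated U, s, and Vᵀ needs only k(m + n + 1) numbers instead of m·n, which is where the space saving comes from.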
SVD for Removing Background from Videos
We often marvel at the lavish, cool backgrounds behind actors in commercials and programs. Many replace backgrounds manually, but some opt for machine learning to avoid that effort. The background of a video is static with little movement, while the foreground changes from frame to frame; we can exploit this difference to separate background and foreground using SVD.
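A minimal sketch of this idea: stack each flattened video frame as a column of a matrix, so the static background dominates the leading rank-one term of the SVD and the residual captures the moving foreground. The frame sizes, frame count, and the synthetic one-pixel "foreground" below are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
h, w, n_frames = 8, 8, 30

# Static background shared by every frame.
background = rng.random((h, w))

# Each frame = background + a small moving foreground blob.
frames = []
for _ in range(n_frames):
    frame = background.copy()
    r, c = rng.integers(0, h), rng.integers(0, w)
    frame[r, c] += 1.0
    frames.append(frame.ravel())

M = np.stack(frames, axis=1)           # shape (h*w, n_frames)

U, s, Vt = np.linalg.svd(M, full_matrices=False)

# Rank-1 term ~ static background; residual ~ moving foreground.
bg_est = s[0] * np.outer(U[:, 0], Vt[0, :])
fg_est = M - bg_est
print(fg_est.shape)                    # (64, 30)
```

Real pipelines use more robust variants (the foreground also leaks into the top singular vectors), but the rank-one split shown here is the core of the SVD-based approach.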