<h2 class="post-title"><a href="http://localhost:1313/tidbits/svd/">Singular Value Decomposition</a></h2>
<div class="post-content">
<script id="MathJax-script" async src="https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js"></script>
<p>Singular Value Decomposition (SVD) is probably one of the coolest concepts in mathematics I have learned so far. Seen by many as the grand finale of an introductory linear algebra course, SVD combines many pervasive topics seen throughout physics and applied math, including eigenvalues/eigenvectors and unitary matrices. Although SVD was not covered in my introductory mathematical physics course, I tried my best to develop a basic understanding of this factorization tool so that I could comfortably use it when decomposing transient absorption data in my work at the Leone Group.</p>
<p>Note: As a prerequisite to this tidbit, I recommend watching 3Blue1Brown&rsquo;s Essence of Linear Algebra series to gain intuition into how matrices act as linear transformations on some vector space. His visualizations are truly unmatched!</p>
<p>I think it&rsquo;s best to begin with eigenvalue decomposition. Suppose we have some matrix \(A \in \mathbb{C}^{n \times n}\) with \(n\) linearly independent eigenvectors. Eigenvalue decomposition lets us represent this linear transformation as a composition of three other matrices, namely</p>
$$A = Q \Lambda Q^{-1}$$
<p>As is convention in matrix multiplication, we interpret this composition from right to left. \(Q\) is an \(n \times n\) matrix whose columns are the eigenvectors of \(A\) written in terms of the original basis, \(\Lambda\) is the diagonal matrix containing the corresponding eigenvalues, and \(Q^{-1}\) converts coordinates in the original basis into coordinates in the eigenvector basis. Put differently, when applied to some arbitrary vector, \(Q^{-1}\) first re-expresses that vector in the eigenvector basis. Then, because eigenvectors solely scale under the transformation, the action of \(A\) in this basis reduces to the pure scaling performed by \(\Lambda\). Finally, applying \(Q\) returns us to the vector space described by our original basis.</p>
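<p>As a concrete check, here is a minimal NumPy sketch (matrix size, seed, and variable names are all illustrative) that computes this factorization and verifies \(A = Q \Lambda Q^{-1}\):</p>
<pre><code class="language-python">import numpy as np

# A random matrix; with probability 1 it has n linearly independent eigenvectors
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# eig may return complex eigenpairs even for a real matrix
eigvals, Q = np.linalg.eig(A)   # columns of Q are the eigenvectors of A
Lam = np.diag(eigvals)          # diagonal matrix of eigenvalues

# Reconstruct A as Q @ Lam @ Q^{-1}
A_rec = Q @ Lam @ np.linalg.inv(Q)
print(np.allclose(A, A_rec))    # True
</code></pre>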
<p>Remember that eigenvectors and their corresponding eigenvalues only exist for square matrices because non-square ones encode some dimensionality reduction or extension. How then are we able to generalize this idea to non-square matrices? Enter singular value decomposition.</p>
<p>A quick search online often returns a definition similar to this: any matrix \(M \in \mathbb{C}^{m \times n}\) can be unconditionally decomposed into the product of three matrices,</p>
$$M = U \Sigma V^{\dagger}$$
<p>where \(U\) is an \(m \times m\) unitary matrix, \(\Sigma\) is an \(m \times n\) diagonal matrix, and \(V^{\dagger}\) is the conjugate transpose of an \(n \times n\) unitary matrix. But what do these matrices do, what do they contain, and why?</p>
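<p>NumPy exposes this factorization directly as <code>np.linalg.svd</code>, which returns \(U\), the singular values, and \(V^{\dagger}\). A minimal sketch (the shapes and seed are arbitrary) showing the three factors and the reconstruction:</p>
<pre><code class="language-python">import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 3))        # a non-square matrix, m=5, n=3

# full_matrices=True: U is m x m, Vh is n x n, s holds min(m, n) singular values
U, s, Vh = np.linalg.svd(M, full_matrices=True)

Sigma = np.zeros(M.shape)              # embed the singular values in an m x n matrix
Sigma[:len(s), :len(s)] = np.diag(s)

print(np.allclose(M, U @ Sigma @ Vh))  # True: M = U Σ V†
</code></pre>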
<p>Well, it turns out there is a very intuitive interpretation if we just step back and think about how we can find similarities within a dataset in the first place. Naturally, it makes sense to take the inner product between each pair of columns of our matrix \(M\) in order to quantify the orthogonality of each of our data points with respect to one another. \(M^{\dagger}M\) does exactly this, creating a column-wise correlation matrix for \(M\). Equally motivated, we do the same for the rows of \(M\), namely \(MM^{\dagger}\).</p>
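<p>In code, these two correlation matrices are just a pair of matrix products. The sketch below (with illustrative data) also confirms they are Hermitian and positive semi-definite, two facts we lean on shortly:</p>
<pre><code class="language-python">import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 3))

C_cols = M.conj().T @ M   # n x n column-wise correlation matrix, M†M
C_rows = M @ M.conj().T   # m x m row-wise correlation matrix, MM†

print(np.allclose(C_cols, C_cols.conj().T))           # Hermitian
print(np.all(np.linalg.eigvalsh(C_cols) >= -1e-12))   # eigenvalues nonnegative (PSD)
</code></pre>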
<p>Great, we have these column-wise and row-wise correlation matrices for \(M\), but how do we extract the most meaningful information from them? It makes sense, then, to find the eigenvectors and eigenvalues of these matrices.</p>
<p>The diagonal elements of a correlation matrix represent the variance of each variable, while the off-diagonal elements represent the covariance between variables. When applied to some vector space, a correlation matrix acts as a linear transformation, stretching or shrinking hyper-dimensional space in each direction depending on whether each pair of dimensions is highly or weakly correlated. Because eigenvectors only scale and do not change direction under a transformation, the eigenvectors of a correlation matrix are the principal axes of the data. In other words, they are the directions in which the data has the most variance. The eigenvectors of the row-wise correlation matrix \(MM^{\dagger}\) (called the left singular vectors) form the columns of our matrix \(U\), and the eigenvectors of the column-wise correlation matrix \(M^{\dagger}M\) (called the right singular vectors) form the columns of \(V\), whose conjugate transpose is \(V^{\dagger}\). Lastly, our diagonal matrix is formed by the square roots of the nonzero eigenvalues shared by the two correlation matrices (the square roots are real because both matrices are positive semi-definite).</p>
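<p>A quick NumPy sketch (again with illustrative data) makes this concrete: eigendecomposing \(M^{\dagger}M\) reproduces the squared singular values and, up to a sign per vector, the right singular vectors:</p>
<pre><code class="language-python">import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((5, 3))

U, s, Vh = np.linalg.svd(M)

# Eigendecompose the column-wise correlation matrix M†M
evals, evecs = np.linalg.eigh(M.conj().T @ M)   # eigh sorts eigenvalues ascending
evals, evecs = evals[::-1], evecs[:, ::-1]      # reorder descending to match the SVD

print(np.allclose(evals, s**2))                 # eigenvalues are squared singular values
print(np.allclose(np.abs(evecs.T), np.abs(Vh))) # eigenvectors match V† up to sign
</code></pre>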
<p>Assuming the decomposition \(M = U \Sigma V^{\dagger}\) exists, a simple calculation ties all of this together.</p>
$$ M^{\dagger}M = V \Sigma^{\dagger} U^{\dagger} U \Sigma V^{\dagger} = V \Sigma^{\dagger}\Sigma V^{\dagger} \implies M^{\dagger}M V = V \Sigma^{\dagger}\Sigma $$
$$ MM^{\dagger} = U \Sigma V^{\dagger} V \Sigma^{\dagger} U^{\dagger} = U \Sigma\Sigma^{\dagger} U^{\dagger} \implies MM^{\dagger} U = U \Sigma\Sigma^{\dagger} $$
<p>Note: Both \(U\) and \(V\) are unitary matrices because their columns are orthonormal eigenvectors of their respective correlation matrices, and the eigenvectors can be chosen orthonormal because the correlation matrices are Hermitian.</p>
<p>We see clearly then that our final expressions take the form of the eigenvalue equation, where the columns of \(V\) and \(U\) are the eigenvectors and the diagonal entries of \(\Sigma^{\dagger}\Sigma\) and \(\Sigma\Sigma^{\dagger}\) (the squared singular values) are the eigenvalues.</p>
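<p>Both identities are easy to verify numerically; a short sketch with illustrative data:</p>
<pre><code class="language-python">import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((5, 3))
U, s, Vh = np.linalg.svd(M, full_matrices=False)  # thin SVD: U is 5x3, Vh is 3x3
V = Vh.conj().T

# M†M V = V Σ†Σ  and  MM† U = U ΣΣ†  (in the thin SVD, Σ†Σ = ΣΣ† = diag(s²))
S2 = np.diag(s**2)
print(np.allclose(M.conj().T @ M @ V, V @ S2))
print(np.allclose(M @ M.conj().T @ U, U @ S2))
</code></pre>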
<p>Throughout my math courses, I have always preferred geometric intuition over any other kind. However, in the case of SVD, I find it most satisfying and complete to see it as a decomposition that captures the dominant correlations between each variable in a dataset. Sure, SVD can be visualized as first a rotation/reflection governed by \(V^{\dagger}\), followed by a scaling and dimensionality reduction/extension governed by \(\Sigma\), followed by a final rotation/reflection governed by \(U\). But I feel this interpretation does not emphasize enough the construction behind these unitary and diagonal matrices.</p>
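<p>That &ldquo;dominant correlations&rdquo; view is what makes truncated SVD useful for data like the transient absorption matrices mentioned above. A hypothetical sketch (synthetic data, arbitrary sizes): build a matrix from two underlying patterns plus noise, and the first two singular components recover nearly all of it.</p>
<pre><code class="language-python">import numpy as np

rng = np.random.default_rng(5)
# Synthetic data: 30 noisy observations mixing just two underlying patterns
patterns = rng.standard_normal((2, 50))
weights = rng.standard_normal((30, 2))
M = weights @ patterns + 0.05 * rng.standard_normal((30, 50))

U, s, Vh = np.linalg.svd(M, full_matrices=False)

k = 2                                          # keep the dominant components
M_k = U[:, :k] @ np.diag(s[:k]) @ Vh[:k, :]

print(s[:4])                                        # first two singular values dominate
print(np.linalg.norm(M - M_k) / np.linalg.norm(M))  # small relative error
</code></pre>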
<p>In closing, many of the data analysis techniques that we normally abstract away by calling some NumPy function have far deeper meanings than we would oftentimes expect. I have barely scratched the surface of this topic, and would love to learn more eventually. One of the most important lessons I have learned throughout the past year is to balance and assess the opportunity costs of the concepts one dedicates time to understand. If you would like any clarification on these topics, I highly recommend watching <a href="https://youtube.com/playlist?list=PLMrJAkhIeNNSVjnsviglFoY2nXildDCcv&amp;si=3Inxq5egolTj3Lij" target="_blank" rel="noopener">Steve Brunton&rsquo;s series</a> along with <a href="https://www.youtube.com/watch?v=vSczTbgc8Rc" target="_blank" rel="noopener">Visual Kernel&rsquo;s video</a>. For those seeking more, I found <a href="https://stats.stackexchange.com/questions/2691/making-sense-of-principal-component-analysis-eigenvectors-eigenvalues" target="_blank" rel="noopener">this</a>, <a href="https://stats.stackexchange.com/questions/217995/what-is-an-intuitive-explanation-for-how-pca-turns-from-a-geometric-problem-wit" target="_blank" rel="noopener">this</a>, and <a href="https://stats.stackexchange.com/questions/10251/what-is-the-objective-function-of-pca/10256#10256" target="_blank" rel="noopener">this</a> helpful.</p>