* pca fix
* revert (n -1)
* fix build error for bayes_nonconj
* fix the same issue for MCMC plots
* revert bayes_nonconj to main
---------
Co-authored-by: Humphrey Yang <[email protected]>
Co-authored-by: Matt McKay <[email protected]>
lectures/svd_intro.md (12 additions & 11 deletions)
@@ -544,7 +544,7 @@ $$
 X = \begin{bmatrix} X_1 \mid X_2 \mid \cdots \mid X_n\end{bmatrix}
 $$
 
-where for $j = 1, \ldots, n$ the column vector $X_j = \begin{bmatrix}X_{1j}\\X_{2j}\\\vdots\\X_{mj}\end{bmatrix}$ is a vector of observations on variables $\begin{bmatrix}x_1\\x_2\\\vdots\\x_m\end{bmatrix}$.
+where for $j = 1, \ldots, n$ the column vector $X_j = \begin{bmatrix}x_{1j}\\x_{2j}\\\vdots\\x_{mj}\end{bmatrix}$ is a vector of observations on variables $\begin{bmatrix}X_1\\X_2\\\vdots\\X_m\end{bmatrix}$.
 
 In a **time series** setting, we would think of columns $j$ as indexing different __times__ at which random variables are observed, while rows index different random variables.
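This hunk swaps which symbols carry subscripts: columns of $X$ are observations, rows are variables. A minimal NumPy sketch of that shape convention, illustrative only (the sizes `m = 3`, `n = 5` and the random data are assumptions, not part of the lecture or the diff):

```python
import numpy as np

# Rows index the m variables, columns index the n observations (e.g. time periods).
m, n = 3, 5                       # hypothetical sizes for illustration
rng = np.random.default_rng(0)
X = rng.standard_normal((m, n))   # data matrix X with columns X_1, ..., X_n

X_1 = X[:, 0]                     # the column vector X_1: one observation of all m variables
print(X.shape, X_1.shape)         # (3, 5) (3,)
```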
@@ -561,14 +561,14 @@ Because our data matrix may hold variables of different units and scales, we fir
 First by computing the average of each row of $X$.
 
 $$
-\bar{X_j}= \frac{1}{m} \sum_{i = 1}^{m} x_{i,j}
+\bar{X_i}= \frac{1}{n} \sum_{j = 1}^{n} x_{ij}
 $$
 
 We then create an average matrix out of these means:
 And subtract out of the original matrix to create a mean centered matrix:
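The point of this hunk is that the averaging runs over the $n$ observations in each row, not over the $m$ rows. A hedged sketch of the corrected step (the names `X_bar` and `B` follow the surrounding text; the sizes and data are made up for illustration):

```python
import numpy as np

# Step 1 (as amended): average each row of X over its n observations,
# then subtract to obtain the mean-centered matrix B.
rng = np.random.default_rng(1)
m, n = 3, 5                              # illustrative sizes only
X = rng.standard_normal((m, n))

X_bar = X.mean(axis=1, keepdims=True)    # \bar{X_i} = (1/n) * sum_j x_{ij}, shape (m, 1)
B = X - X_bar                            # mean-centered data matrix

print(np.allclose(B.mean(axis=1), 0.0))  # True: every row of B now averages to zero
```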
@@ -583,27 +583,28 @@ $$
 Then because we want to extract the relationships between variables rather than just their magnitude, in other words, we want to know how they can explain each other, we compute the covariance matrix of $B$.
 
 $$
-C = \frac{1}{{n}} B^\top B
+C = \frac{1}{n} BB^{\top}
 $$
 
 **Step 3: Decompose the covariance matrix and arrange the singular values:**
 
-If the matrix $C$ is diagonalizable, we can eigendecompose it, find its eigenvalues and rearrange the eigenvalue and eigenvector matrices in a decreasing other.
+Since the matrix $C$ is positive definite, we can eigendecompose it, find its eigenvalues, and rearrange the eigenvalue and eigenvector matrices in a decreasing order.
 
-If $C$ is not diagonalizable, we can perform an SVD of $C$:
+The eigendecomposition of $C$ can be found by decomposing $B$ instead. Since $B$ is not a square matrix, we obtain an SVD of $B$:
 
 $$
 \begin{aligned}
-B^T B &= V \Sigma^\top U^\top U \Sigma V^\top \cr
-&= V \Sigma^\top \Sigma V^\top
+B B^\top &= U \Sigma V^\top (U \Sigma V^{\top})^{\top}\\
+&= U \Sigma V^\top V \Sigma^\top U^\top\\
+&= U \Sigma \Sigma^\top U^\top
 \end{aligned}
 $$
 
 $$
-C = \frac{1}{{n}} V \Sigma^\top \Sigma V^\top
+C = \frac{1}{n} U \Sigma \Sigma^\top U^\top
 $$
 
-We can then rearrange the columns in the matrices $V$ and $\Sigma$ so that the singular values are in decreasing order.
+We can then rearrange the columns in the matrices $U$ and $\Sigma$ so that the singular values are in decreasing order.
 
 **Step 4: Select singular values, (optional) truncate the rest:**
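As a quick numerical check on the identity introduced in this hunk, $C = \frac{1}{n} B B^\top = \frac{1}{n} U \Sigma \Sigma^\top U^\top$, so the squared singular values of $B$ divided by $n$ should coincide with the eigenvalues of $C$. A sketch of Steps 2-4 under assumed sizes and random data (none of it from the lecture itself):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 4, 10                              # illustrative sizes only
B = rng.standard_normal((m, n))
B -= B.mean(axis=1, keepdims=True)        # mean-centered, as in Step 1

C = (B @ B.T) / n                         # Step 2: covariance matrix

U, s, Vt = np.linalg.svd(B, full_matrices=False)   # Step 3: SVD of B, s is descending
eig_C = np.linalg.eigvalsh(C)[::-1]                # eigenvalues of C, descending

print(np.allclose(s**2 / n, eig_C))       # True: Sigma Sigma^T / n matches the spectrum of C

r = 2                                     # Step 4: keep the r largest singular values/components
U_r, s_r = U[:, :r], s[:r]
```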