\tilde U^T X' \tilde V \tilde \Sigma^{-1} = \tilde A .
$$
Note that because we are now working with a reduced SVD, $\tilde U \tilde U^T \neq I$.
Consequently,
$$
\hat A \neq \tilde U \tilde A \tilde U^T,
$$
and we can't simply recover $\hat A$ from $\tilde A$ and $\tilde U$.
Nevertheless, we hope for the best and proceed to construct an eigendecomposition of the
$p \times p$ matrix $\tilde A$:
$$
\tilde A = \tilde W \Lambda \tilde W^{-1} .
$$ (eq:tildeAeigenred)
Mimicking our procedure in Representation 2, we cross our fingers and compute an $m \times p$ matrix
$$
\tilde \Phi_s = \tilde U \tilde W
$$

$$
\hat A \phi_i = \lambda_i \phi_i .
$$
This equation confirms that $\phi_i$ is an eigenvector of $\hat A$ that corresponds to eigenvalue $\lambda_i$ of both $\tilde A$ and $\hat A$.
This concludes the proof.
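The chain of equalities above is easy to check numerically. The following sketch (NumPy, random snapshot data; the variable names are illustrative, not taken from the lecture's code) builds $\hat A$, $\tilde A$, and $\Phi$ from a reduced SVD and verifies that each column of $\Phi$ is an eigenvector of $\hat A$ paired with the corresponding eigenvalue of $\tilde A$:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 5
data = rng.standard_normal((m, n + 1))
X, Xp = data[:, :-1], data[:, 1:]        # snapshot matrices X and X'

# reduced (economy) SVD of X
U, s, Vt = np.linalg.svd(X, full_matrices=False)
V, Sig_inv = Vt.T, np.diag(1 / s)

A_hat = Xp @ V @ Sig_inv @ U.T           # \hat A = X' \tilde V \tilde \Sigma^{-1} \tilde U^T
A_til = U.T @ Xp @ V @ Sig_inv           # \tilde A = \tilde U^T X' \tilde V \tilde \Sigma^{-1}
lam, W = np.linalg.eig(A_til)            # eigendecomposition of \tilde A
Phi = Xp @ V @ Sig_inv @ W               # \Phi = X' \tilde V \tilde \Sigma^{-1} \tilde W

# A_hat @ Phi[:, i] == lam[i] * Phi[:, i] for every i
print(np.allclose(A_hat @ Phi, Phi * lam))
```

Because $\hat A \Phi = \Phi \Lambda$ holds exactly whenever the reduced SVD retains all nonzero singular values, the check passes at machine precision.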

$$
\check b = (\Phi^T \Phi)^{-1} \Phi^T X
$$ (eq:checkbform)
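As a numerical sanity check of this formula (a sketch with random data; variable names are illustrative), we can verify that the residual $X - \Phi \check b$ is orthogonal to $\Phi$, as least squares requires:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 6, 5
data = rng.standard_normal((m, n + 1))
X, Xp = data[:, :-1], data[:, 1:]        # snapshot matrices X and X'

U, s, Vt = np.linalg.svd(X, full_matrices=False)   # reduced SVD of X
V, Sig_inv = Vt.T, np.diag(1 / s)
lam, W = np.linalg.eig(U.T @ Xp @ V @ Sig_inv)     # eigendecomposition of \tilde A
Phi = Xp @ V @ Sig_inv @ W                          # DMD modes (possibly complex)

# \check b = (Phi^T Phi)^{-1} Phi^T X, using the plain transpose, as in the text
b_check = np.linalg.solve(Phi.T @ Phi, Phi.T @ X)
eps = X - Phi @ b_check                             # least squares errors
print(np.abs(eps.T @ Phi).max())                    # ~ 0: orthogonality holds
```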
The $p \times n$ matrix $\check b$ is recognizable as a matrix of least squares regression coefficients of the $m \times n$ matrix
$X$ on the $m \times p$ matrix $\Phi$ and consequently
$$
\check X = \Phi \check b
$$

or

$$
X = \Phi \check b + \epsilon
$$ (eq:Xbcheck)
where $\epsilon$ is an $m \times n$ matrix of least squares errors satisfying the least squares
orthogonality conditions $\epsilon^T \Phi =0 $ or

$$
(X - \Phi \check b)^T \Phi = 0
$$

which implies formula {eq}`eq:checkbform`.

### A useful approximation
There is a useful way to approximate the $p \times 1$ vector $\check b_t$ instead of using formula
{eq}`eq:decoder102`.
In particular, the following argument adapted from {cite}`DDSE_book` (page 240) provides a computationally efficient way
to approximate $\check b_t$.
For convenience, we'll do this first for time $t=1$.
For $t=1$, from equation {eq}`eq:Xbcheck` we have
$$
\check X_1 = \Phi \check b_1
$$ (eq:X1proj)
where $\check b_1$ is a $p \times 1$ vector.
Recall from representation 1 above that $X_1 = U \tilde b_1$, where $\tilde b_1$ is a time $1$ basis vector for representation 1 and $U$ is from a full SVD of $X$.
It then follows from equation {eq}`eq:Xbcheck` that
$$
U \tilde b_1 = X' \tilde V \tilde \Sigma^{-1} \tilde W \check b_1 + \epsilon_1
$$
where $\epsilon_1$ is a least-squares error vector from equation {eq}`eq:Xbcheck`.
It follows that
$$
\tilde b_1 = U^T X' \tilde V \tilde \Sigma^{-1} \tilde W \check b_1 + U^T \epsilon_1
$$
Replacing the error term $U^T \epsilon_1$ by zero, and replacing $U$ from a full SVD of $X$ with
$\tilde U$ from a reduced SVD, we obtain an approximation $\hat b_1$ to $\tilde b_1$:
$$
\hat b_1 = \tilde U^T X' \tilde V \tilde \Sigma^{-1} \tilde W \check b_1
$$
Recall that from equation {eq}`eq:tildeAverify`, $ \tilde A = \tilde U^T X' \tilde V \tilde \Sigma^{-1}$.
It then follows that
$$
\hat b_1 = \tilde A \tilde W \check b_1
$$
and therefore, by the eigendecomposition {eq}`eq:tildeAeigenred` of $\tilde A$, we have

$$
\check b_1 = (\tilde W \Lambda)^{-1} \tilde U^T X_1
$$ (eq:beqnsmall)

which is a computationally efficient approximation to the following instance of equation {eq}`eq:decoder102` for the initial vector $\check b_1$:
$$
\check b_1= \Phi^{+} X_1
$$ (eq:bphieqn)
(To highlight that {eq}`eq:beqnsmall` is an approximation, users of DMD sometimes call components of the basis vector $\check b_t = \Phi^+ X_t $ the **exact** DMD modes.)
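A short numerical sketch (random data, illustrative names) shows why the approximation works: the identity $\tilde U^T \Phi = \tilde W \Lambda$ holds exactly, so the cheap formula recovers $\check b_1$ exactly when it is applied to the decoded value $\check X_1 = \Phi \check b_1$; the only error comes from the least squares residual in $X_1$ itself.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 6, 5
data = rng.standard_normal((m, n + 1))
X, Xp = data[:, :-1], data[:, 1:]

U, s, Vt = np.linalg.svd(X, full_matrices=False)
V, Sig_inv = Vt.T, np.diag(1 / s)
lam, W = np.linalg.eig(U.T @ Xp @ V @ Sig_inv)
Phi = Xp @ V @ Sig_inv @ W

# key identity behind the shortcut: \tilde U^T \Phi = \tilde W \Lambda
print(np.allclose(U.T @ Phi, W * lam))

x1 = X[:, 0]
b_exact = np.linalg.pinv(Phi) @ x1               # \check b_1 = \Phi^+ X_1
b_cheap = np.linalg.solve(W * lam, U.T @ x1)     # the cheap approximation

# applied to the decoded \check X_1 = \Phi \check b_1, the shortcut is exact
print(np.allclose(np.linalg.solve(W * lam, U.T @ (Phi @ b_exact)), b_exact))
```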
Conditional on $X_t$, we can compute our decoded $\check X_{t+j}, j = 1, 2, \ldots $ from
either

$$
\check X_{t+j} = \Phi \Lambda^j \Phi^{+} X_t
$$ (eq:checkXevoln)
or use the approximation
$$
\check X_{t+j} = \Phi \Lambda^j (\tilde W \Lambda)^{-1} \tilde U^T X_t .
$$
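The approximate forecasting formula can be sketched as follows (random data, illustrative names). Because $\hat A \Phi = \Phi \Lambda$ exactly, the closed-form $j$-step forecast agrees with iterating $\hat A$ on the decoded state:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 6, 5
data = rng.standard_normal((m, n + 1))
X, Xp = data[:, :-1], data[:, 1:]

U, s, Vt = np.linalg.svd(X, full_matrices=False)
V, Sig_inv = Vt.T, np.diag(1 / s)
A_hat = Xp @ V @ Sig_inv @ U.T
lam, W = np.linalg.eig(U.T @ Xp @ V @ Sig_inv)
Phi = Xp @ V @ Sig_inv @ W

def forecast(x_t, j):
    """Decoded j-step-ahead forecast Phi Lambda^j (W Lambda)^{-1} U^T x_t."""
    b = np.linalg.solve(W * lam, U.T @ x_t)
    return (Phi * lam**j) @ b            # same as Phi @ diag(lam**j) @ b

# iterating \hat A on the decoded state reproduces the closed-form forecast
x1 = X[:, 0]
print(np.allclose(A_hat @ A_hat @ forecast(x1, 0), forecast(x1, 2)))
```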