Commit b7349e4

FIX: Deprecation and Future Warnings (#444)
* updates to perm_income_cons
* check JAX deprecation fix
* fix deprecation in jax
* fix linear_models
* fix kalman.md
* remove testing variable
* kalman - remove debug
* fix deprecation warnings in kalman_2
* fix markov_perf deprecations
* review kesten_processes and looking OK
* fix pandas_panel deprecations
* fix string formatting warning
* fix missing solution-end markers
* fix missing solution end
* ensure latex is installed for rendering of plot (collab)
* Revert "ensure latex is installed for rendering of plot (collab)". This reverts commit b62d770.
* move texlive install for collab to action
* check pickled environment failure
* [linear_models] final deprecation notice
* remove debug
* add gpu backend code to the top of the lecture
1 parent: ec27970

11 files changed: +202 −201 lines

.github/workflows/collab.yml

Lines changed: 6 additions & 5 deletions
@@ -10,6 +10,12 @@ jobs:
       - uses: actions/checkout@v4
         with:
           ref: ${{ github.event.pull_request.head.sha }}
+      # Install build software
+      - name: Install Build Software & LaTeX (kalman_2)
+        shell: bash -l {0}
+        run: |
+          pip install jupyter-book==1.0.3 quantecon-book-theme==0.8.2 sphinx-tojupyter==0.3.0 sphinxext-rediraffe==0.2.7 sphinxcontrib-youtube==1.3.0 sphinx-togglebutton==0.3.2 arviz sphinx-proof sphinx-exercise sphinx-reredirects
+          apt-get install dvipng texlive texlive-latex-extra texlive-fonts-recommended cm-super
       - name: Check nvidia drivers
         shell: bash -l {0}
         run: |
@@ -28,11 +34,6 @@ jobs:
           branch: main
           name: build-cache
           path: _build
-      # Install build software
-      - name: Install Build Software
-        shell: bash -l {0}
-        run: |
-          pip install jupyter-book==1.0.3 quantecon-book-theme==0.8.2 sphinx-tojupyter==0.3.0 sphinxext-rediraffe==0.2.7 sphinxcontrib-youtube==1.3.0 sphinx-togglebutton==0.3.2 arviz sphinx-proof sphinx-exercise sphinx-reredirects
       # Build of HTML (Execution Testing)
       - name: Build HTML
         shell: bash -l {0}

lectures/back_prop.md

Lines changed: 8 additions & 13 deletions
@@ -4,9 +4,9 @@ jupytext:
     extension: .md
     format_name: myst
     format_version: 0.13
-    jupytext_version: 1.11.5
+    jupytext_version: 1.16.7
 kernelspec:
-  display_name: Python 3
+  display_name: Python 3 (ipykernel)
   language: python
   name: python3
 ---
@@ -22,6 +22,12 @@ kernelspec:
 !pip install --upgrade jax
 ```
 
+```{code-cell} ipython3
+import jax
+## to check that gpu is activated in environment
+print(f"JAX backend: {jax.devices()[0].platform}")
+```
+
 In addition to what's included in base Anaconda, we need to install the following packages
 
 ```{code-cell} ipython3
@@ -604,15 +610,4 @@ Image(fig.to_image(format="png"))
 # notebook locally
 ```
 
-```{code-cell} ipython3
-## to check that gpu is activated in environment
-
-from jax.lib import xla_bridge
-print(xla_bridge.get_backend().platform)
-```
 
-```{note}
-**Cloud Environment:** This lecture site is built in a server environment that doesn't have access to a `gpu`
-If you run this lecture locally this lets you know where your code is being executed, either
-via the `cpu` or the `gpu`
-```
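
For anyone silencing the same warning in their own notebooks, here is a minimal sketch of the replacement this diff makes. The old `xla_bridge` import is shown only as a comment; exactly when it starts warning depends on your JAX version.

```python
import jax

# New-style backend check (as added at the top of the lecture):
# query the device list directly instead of going through xla_bridge.
print(f"JAX backend: {jax.devices()[0].platform}")  # 'cpu', 'gpu', or 'tpu'

# Removed pattern, kept for reference only -- it triggers a
# DeprecationWarning on recent JAX releases:
#   from jax.lib import xla_bridge
#   print(xla_bridge.get_backend().platform)
```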

lectures/cass_fiscal.md

Lines changed: 4 additions & 6 deletions
@@ -4,7 +4,7 @@ jupytext:
     extension: .md
     format_name: myst
     format_version: 0.13
-    jupytext_version: 1.16.6
+    jupytext_version: 1.16.7
 kernelspec:
   display_name: Python 3 (ipykernel)
   language: python
@@ -750,7 +750,7 @@ def plot_results(solution, k_ss, c_ss, shocks, shock_param,
     R_bar_path = compute_R_bar_path(shocks, k_path, model, S)
 
     axes[2].plot(R_bar_path[:T], linestyle=linestyle, label=label)
-    axes[2].set_title('$\overline{R}$')
+    axes[2].set_title(r'$\overline{R}$')
     axes[2].axhline(1 / model.β, linestyle='--', color='black')
 
     η_path = compute_η_path(k_path, model, S=T)
@@ -1041,7 +1041,7 @@ Indeed, {eq}`eq:euler_house` or {eq}`eq:diff_second` indicates that a foreseen i
 crease in $\tau_{ct}$ (i.e., a decrease in $(1+\tau_{ct})$
 $(1+\tau_{ct+1})$) operates like an increase in $\tau_{kt}$.
 
-The following figure portrays the response to a foreseen increase in the consumption tax $\tau_c$. 
+The following figure portrays the response to a foreseen increase in the consumption tax $\tau_c$.
 
 ```{code-cell} ipython3
 shocks = {
@@ -1101,7 +1101,6 @@ The figure shows that:
 - Transition dynamics push $k_t$ (capital stock) toward a new, lower steady-state level. In the new steady state:
   - Consumption is lower due to reduced output from the lower capital stock.
 - Smoother consumption paths occur when $\gamma = 2$ than when $\gamma = 0.2$.
-
 
 +++
 
@@ -1111,8 +1110,6 @@ foreseen one-time change in a policy variable (a "pulse").
 
 **Experiment 4: Foreseen one-time increase in $g$ from 0.2 to 0.4 in period 10, after which $g$ returns to 0.2 forever**
 
-
-
 ```{code-cell} ipython3
 g_path = np.repeat(0.2, S + 1)
 g_path[10] = 0.4
@@ -1136,6 +1133,7 @@ The figure indicates how:
 - Before $t = 10$, capital accumulates as interest rate changes induce households to prepare for the anticipated increase in government spending.
 - At $t = 10$, the capital stock sharply decreases as the government consumes part of it.
 - $\bar{R}$ jumps above its steady-state value due to the capital reduction and then gradually declines toward its steady-state level.
+
 +++
 
 ### Method 2: Residual Minimization
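
The `set_title` change above is the standard fix for Python's invalid-escape-sequence warning: `\o` in the non-raw literal `'$\overline{R}$'` is not a valid string escape, so recent Python versions emit a SyntaxWarning (older ones a DeprecationWarning). A minimal sketch, independent of the lecture code:

```python
import matplotlib.pyplot as plt

fig, ax = plt.subplots()

# ax.set_title('$\overline{R}$')  # '\o' is an invalid escape: warns on recent Python
ax.set_title(r'$\overline{R}$')   # raw string: backslash reaches mathtext intact

plt.show()
```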

lectures/kalman.md

Lines changed: 22 additions & 23 deletions
@@ -3,8 +3,10 @@ jupytext:
 text_representation:
   extension: .md
   format_name: myst
+  format_version: 0.13
+  jupytext_version: 1.16.7
 kernelspec:
-  display_name: Python 3
+  display_name: Python 3 (ipykernel)
   language: python
   name: python3
 ---
@@ -29,10 +31,9 @@ kernelspec:
 
 In addition to what's in Anaconda, this lecture will need the following libraries:
 
-```{code-cell} ipython
----
-tags: [hide-output]
----
+```{code-cell} ipython3
+:tags: [hide-output]
+
 !pip install quantecon
 ```
 
@@ -54,9 +55,8 @@ Required knowledge: Familiarity with matrix manipulations, multivariate normal d
 
 We'll need the following imports:
 
-```{code-cell} ipython
+```{code-cell} ipython3
 import matplotlib.pyplot as plt
-plt.rcParams["figure.figsize"] = (11, 5)  #set default figure size
 from scipy import linalg
 import numpy as np
 import matplotlib.cm as cm
@@ -122,10 +122,9 @@ $2 \times 2$ covariance matrix. In our simulations, we will suppose that
 
 This density $p(x)$ is shown below as a contour map, with the center of the red ellipse being equal to $\hat x$.
 
-```{code-cell} python3
----
-tags: [output_scroll]
----
+```{code-cell} ipython3
+:tags: [output_scroll]
+
 # Set up the Gaussian prior density p
 Σ = [[0.4, 0.3], [0.3, 0.45]]
 Σ = np.matrix(Σ)
@@ -186,7 +185,7 @@ def bivariate_normal(x, y, σ_x=1.0, σ_y=1.0, μ_x=0.0, μ_y=0.0, σ_xy=0.0):
 
 def gen_gaussian_plot_vals(μ, C):
     "Z values for plotting the bivariate Gaussian N(μ, C)"
-    m_x, m_y = float(μ[0]), float(μ[1])
+    m_x, m_y = float(μ[0,0]), float(μ[1,0])
     s_x, s_y = np.sqrt(C[0, 0]), np.sqrt(C[1, 1])
     s_xy = C[0, 1]
     return bivariate_normal(X, Y, s_x, s_y, m_x, m_y, s_xy)
@@ -213,15 +212,15 @@ The good news is that the missile has been located by our sensors, which report
 The next figure shows the original prior $p(x)$ and the new reported
 location $y$
 
-```{code-cell} python3
+```{code-cell} ipython3
 fig, ax = plt.subplots(figsize=(10, 8))
 ax.grid()
 
 Z = gen_gaussian_plot_vals(x_hat, Σ)
 ax.contourf(X, Y, Z, 6, alpha=0.6, cmap=cm.jet)
 cs = ax.contour(X, Y, Z, 6, colors="black")
 ax.clabel(cs, inline=1, fontsize=10)
-ax.text(float(y[0]), float(y[1]), "$y$", fontsize=20, color="black")
+ax.text(float(y[0].item()), float(y[1].item()), "$y$", fontsize=20, color="black")
 
 plt.show()
 ```
@@ -284,7 +283,7 @@ This new density $p(x \,|\, y) = N(\hat x^F, \Sigma^F)$ is shown in the next fig
 
 The original density is left in as contour lines for comparison
 
-```{code-cell} python3
+```{code-cell} ipython3
 fig, ax = plt.subplots(figsize=(10, 8))
 ax.grid()
 
@@ -298,7 +297,7 @@ new_Z = gen_gaussian_plot_vals(x_hat_F, Σ_F)
 cs2 = ax.contour(X, Y, new_Z, 6, colors="black")
 ax.clabel(cs2, inline=1, fontsize=10)
 ax.contourf(X, Y, new_Z, 6, alpha=0.6, cmap=cm.jet)
-ax.text(float(y[0]), float(y[1]), "$y$", fontsize=20, color="black")
+ax.text(float(y[0].item()), float(y[1].item()), "$y$", fontsize=20, color="black")
 
 plt.show()
 ```
@@ -391,7 +390,7 @@ A
 Q = 0.3 * \Sigma
 $$
 
-```{code-cell} python3
+```{code-cell} ipython3
 fig, ax = plt.subplots(figsize=(10, 8))
 ax.grid()
 
@@ -415,7 +414,7 @@ new_Z = gen_gaussian_plot_vals(new_x_hat, new_Σ)
 cs3 = ax.contour(X, Y, new_Z, 6, colors="black")
 ax.clabel(cs3, inline=1, fontsize=10)
 ax.contourf(X, Y, new_Z, 6, alpha=0.6, cmap=cm.jet)
-ax.text(float(y[0]), float(y[1]), "$y$", fontsize=20, color="black")
+ax.text(float(y[0].item()), float(y[1].item()), "$y$", fontsize=20, color="black")
 
 plt.show()
 ```
@@ -577,7 +576,7 @@ Your figure should -- modulo randomness -- look something like this
 :class: dropdown
 ```
 
-```{code-cell} python3
+```{code-cell} ipython3
 # Parameters
 θ = 10  # Constant value of state x_t
 A, C, G, H = 1, 0, 1, 1
@@ -598,7 +597,7 @@ xgrid = np.linspace(θ - 5, θ + 2, 200)
 
 for i in range(N):
     # Record the current predicted mean and variance
-    m, v = [float(z) for z in (kalman.x_hat, kalman.Sigma)]
+    m, v = [float(z) for z in (kalman.x_hat.item(), kalman.Sigma.item())]
     # Plot, update filter
     ax.plot(xgrid, norm.pdf(xgrid, loc=m, scale=np.sqrt(v)), label=f'$t={i}$')
     kalman.update(y[i])
@@ -641,7 +640,7 @@ Your figure should show error erratically declining something like this
 :class: dropdown
 ```
 
-```{code-cell} python3
+```{code-cell} ipython3
 ϵ = 0.1
 θ = 10  # Constant value of state x_t
 A, C, G, H = 1, 0, 1, 1
@@ -657,7 +656,7 @@ y = y.flatten()
 
 for t in range(T):
     # Record the current predicted mean and variance and plot their densities
-    m, v = [float(temp) for temp in (kalman.x_hat, kalman.Sigma)]
+    m, v = [float(temp) for temp in (kalman.x_hat.item(), kalman.Sigma.item())]
 
     f = lambda x: norm.pdf(x, loc=m, scale=np.sqrt(v))
     integral, error = quad(f, θ - ϵ, θ + ϵ)
@@ -745,7 +744,7 @@ Observe how, after an initial learning period, the Kalman filter performs quite
 :class: dropdown
 ```
 
-```{code-cell} python3
+```{code-cell} ipython3
 # Define A, C, G, H
 G = np.identity(2)
 H = np.sqrt(0.5) * np.identity(2)