@@ -412,19 +412,19 @@ $\beta_0$ (the OLS parameter estimates might be a reasonable
guess), then

1. Use the updating rule to iterate the algorithm
-
+
$$
\boldsymbol{\beta}_{(k+1)} = \boldsymbol{\beta}_{(k)} - H^{-1}(\boldsymbol{\beta}_{(k)})G(\boldsymbol{\beta}_{(k)})
$$
where:
-
+
$$
\begin{aligned}
G(\boldsymbol{\beta}_{(k)}) &= \frac{d \log \mathcal{L}(\boldsymbol{\beta}_{(k)})}{d \boldsymbol{\beta}_{(k)}} \\
H(\boldsymbol{\beta}_{(k)}) &= \frac{d^2 \log \mathcal{L}(\boldsymbol{\beta}_{(k)})}{d \boldsymbol{\beta}_{(k)} \, d \boldsymbol{\beta}'_{(k)}}
\end{aligned}
$$
-
+
1. Check whether $| \boldsymbol{\beta}_{(k+1)} - \boldsymbol{\beta}_{(k)} | < tol$
    - If true, then stop iterating and set
      $\hat{\boldsymbol{\beta}} = \boldsymbol{\beta}_{(k+1)}$
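
To make the updating rule concrete, here is a minimal sketch on a toy concave log-likelihood, $\log \mathcal{L}(\beta) = -(\beta - 3)^2$; the objective, names, and tolerance are all illustrative rather than part of the lecture's code:

```python
import numpy as np

# Toy concave log-likelihood: log L(β) = -(β - 3)², so
# G(β) = -2(β - 3) and H(β) = -2; Newton-Raphson should
# land on the maximizer β = 3 in a single step.
β = np.array([0.0])   # initial guess β_(0)
tol = 1e-8

for k in range(100):
    G = -2 * (β - 3)                      # gradient of the log-likelihood
    H = np.array([[-2.0]])                # Hessian (1×1 here)
    β_new = β - np.linalg.inv(H) @ G      # β_(k+1) = β_(k) - H⁻¹G
    if np.all(np.abs(β_new - β) < tol):   # |β_(k+1) - β_(k)| < tol
        β = β_new
        break
    β = β_new

print(β)   # [3.], the maximum of the toy log-likelihood
```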
@@ -506,7 +506,7 @@ def newton_raphson(model, tol=1e-3, max_iter=1000, display=True):
    while np.any(error > tol) and i < max_iter:
        H, G = model.H(), model.G()
        β_new = model.β - (np.linalg.inv(H) @ G)
-       error = β_new - model.β
+       error = np.abs(β_new - model.β)
        model.β = β_new

        # Print iterations
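
The switch to `np.abs` matters because the stopping test is `np.any(error > tol)`: a large *negative* component of the signed difference compares as less than `tol`, so the loop could halt long before convergence. A small standalone check (the numbers are made up):

```python
import numpy as np

tol = 1e-3
step = np.array([-0.5, 2e-4])        # one large (negative) move, one tiny move

print(np.any(step > tol))            # False: signed error stops the loop too early
print(np.any(np.abs(step) > tol))    # True: absolute error keeps iterating
```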
@@ -547,7 +547,7 @@ poi = PoissonRegression(y, X, β=init_β)
```

As this was a simple model with few observations, the algorithm achieved
- convergence in only 6 iterations.
+ convergence in only 7 iterations.

You can see that with each iteration, the log-likelihood value increased.
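
One way to verify that claim is to re-run the updating rule step by step and record the log-likelihood; this sketch assumes the `PoissonRegression` class exposes a log-likelihood method, written `logL()` here for illustration:

```python
# Re-run the Newton-Raphson steps and record the log-likelihood.
# `logL()` is an assumed method name for the Poisson log-likelihood.
poi = PoissonRegression(y, X, β=init_β)
logL_path = [poi.logL()]

for _ in range(7):                          # the 7 iterations reported above
    H, G = poi.H(), poi.G()
    poi.β = poi.β - np.linalg.inv(H) @ G    # one Newton-Raphson update
    logL_path.append(poi.logL())

# The recorded values should be non-decreasing
assert all(a <= b for a, b in zip(logL_path, logL_path[1:]))
```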
@@ -973,4 +973,4 @@ print(Probit(y, X).fit().summary())
```

```{solution-end}
- ```
+ ```
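
For reference, the `Probit` call in this last hunk's context comes from `statsmodels`; here is a self-contained sketch with made-up data (not the lecture's dataset):

```python
import numpy as np
from statsmodels.api import Probit

# Made-up data purely to make the call runnable; the first
# column of X is a constant term.
y = np.array([0, 1, 1, 0, 1])
X = np.array([[1.0, 2.0],
              [1.0, 1.0],
              [1.0, 4.5],
              [1.0, 0.5],
              [1.0, 3.0]])

print(Probit(y, X).fit().summary())
```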