Commit d0aa4b4 (parent 490c565)

Rename var to avoid confusion

Rename n to m, as n tends to be used for the number of parameters rather than the sample size

1 file changed: +4 −4 lines
Diff for: machine_learning/local_weighted_learning/local_weighted_learning.py

@@ -47,7 +47,7 @@ def weight_matrix(point: np.ndarray, x_train: np.ndarray, tau: float) -> np.ndar
              decreases as the distance from the prediction point increases
 
     Returns:
-        n x n weight matrix around the prediction point, where n is the size of
+        m x m weight matrix around the prediction point, where m is the size of
         the training set
     >>> weight_matrix(
     ...     np.array([1., 1.]),
@@ -58,9 +58,9 @@ def weight_matrix(point: np.ndarray, x_train: np.ndarray, tau: float) -> np.ndar
            [0.00000000e+000, 0.00000000e+000, 0.00000000e+000],
            [0.00000000e+000, 0.00000000e+000, 0.00000000e+000]])
     """
-    n = len(x_train)  # Number of training samples
-    weights = np.eye(n)  # Initialize weights as identity matrix
-    for j in range(n):
+    m = len(x_train)  # Number of training samples
+    weights = np.eye(m)  # Initialize weights as identity matrix
+    for j in range(m):
        diff = point - x_train[j]
        weights[j, j] = np.exp(diff @ diff.T / (-2.0 * tau**2))
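For context, the function being renamed builds the Gaussian kernel weights used in locally weighted regression: an m x m diagonal matrix with one weight per training sample, decaying with squared distance from the prediction point. A minimal runnable sketch of the renamed version is below; the surrounding docstring and the trailing `return weights` are assumed from the rest of the file and not shown in this diff.

```python
import numpy as np


def weight_matrix(point: np.ndarray, x_train: np.ndarray, tau: float) -> np.ndarray:
    """Gaussian weights around `point`; m is the number of training samples."""
    m = len(x_train)  # Number of training samples
    weights = np.eye(m)  # Initialize weights as identity matrix
    for j in range(m):
        diff = point - x_train[j]
        weights[j, j] = np.exp(diff @ diff.T / (-2.0 * tau**2))
    return weights  # assumed from context; the diff hunk ends before this line


# Same inputs as the doctest in the diff
x_train = np.array([[16.99, 10.34], [21.01, 23.68], [24.59, 25.69]])
w = weight_matrix(np.array([1.0, 1.0]), x_train, 0.6)
print(w.shape)  # m x m, one diagonal weight per training sample
```

Using `m` here avoids clashing with the convention that `n` denotes the number of model parameters, which is exactly the confusion the commit message cites.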

0 commit comments