
Commit 0ffe506

Authored by Humzafazal72, pre-commit-ci[bot], and tianyizheng02
added mean absolute percentage error (#10464)
* added mean absolute percentage error
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* added mean_absolute_percentage_error
* added mean_absolute_percentage_error
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* added mean_absolute_percentage_error
* added mean_absolute_percentage_error
* added mean absolute percentage error
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* added mean absolute percentage error
* added mean absolute percentage error
* added mean absolute percentage error
* added mean absolute percentage error
* added mean absolute percentage error
* Update machine_learning/loss_functions.py

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Tianyi Zheng <[email protected]>
1 parent e1e5963 commit 0ffe506

File tree

1 file changed (+45 −0 lines)


Diff for: machine_learning/loss_functions.py

@@ -297,6 +297,51 @@ def mean_squared_logarithmic_error(y_true: np.ndarray, y_pred: np.ndarray) -> float:
     return np.mean(squared_logarithmic_errors)
 
 
+def mean_absolute_percentage_error(
+    y_true: np.ndarray, y_pred: np.ndarray, epsilon: float = 1e-15
+) -> float:
+    """
+    Calculate the Mean Absolute Percentage Error between y_true and y_pred.
+
+    Mean Absolute Percentage Error calculates the average of the absolute
+    percentage differences between the predicted and true values.
+
+    Formula = (Σ|(y_true[i] - y_pred[i]) / y_true[i]|) / n
+
+    Source: https://stephenallwright.com/good-mape-score/
+
+    Parameters:
+        y_true (np.ndarray): Numpy array containing true/target values.
+        y_pred (np.ndarray): Numpy array containing predicted values.
+
+    Returns:
+        float: The Mean Absolute Percentage Error between y_true and y_pred.
+
+    Examples:
+    >>> y_true = np.array([10, 20, 30, 40])
+    >>> y_pred = np.array([12, 18, 33, 45])
+    >>> mean_absolute_percentage_error(y_true, y_pred)
+    0.13125
+
+    >>> y_true = np.array([1, 2, 3, 4])
+    >>> y_pred = np.array([2, 3, 4, 5])
+    >>> mean_absolute_percentage_error(y_true, y_pred)
+    0.5208333333333333
+
+    >>> y_true = np.array([34, 37, 44, 47, 48, 48, 46, 43, 32, 27, 26, 24])
+    >>> y_pred = np.array([37, 40, 46, 44, 46, 50, 45, 44, 34, 30, 22, 23])
+    >>> mean_absolute_percentage_error(y_true, y_pred)
+    0.064671076436071
+    """
+    if len(y_true) != len(y_pred):
+        raise ValueError("The length of the two arrays should be the same.")
+
+    y_true = np.where(y_true == 0, epsilon, y_true)
+    absolute_percentage_diff = np.abs((y_true - y_pred) / y_true)
+
+    return np.mean(absolute_percentage_diff)
+
+
 if __name__ == "__main__":
     import doctest
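As a quick sanity check, the added function can be exercised outside the repository. The sketch below reproduces the logic from the diff; the trailing float() cast is my addition (so the return type matches the annotation; np.mean itself returns np.float64):

```python
import numpy as np


def mean_absolute_percentage_error(
    y_true: np.ndarray, y_pred: np.ndarray, epsilon: float = 1e-15
) -> float:
    # Guard against mismatched input lengths.
    if len(y_true) != len(y_pred):
        raise ValueError("The length of the two arrays should be the same.")
    # Replace exact zeros in y_true with a tiny epsilon to avoid division by zero.
    y_true = np.where(y_true == 0, epsilon, y_true)
    # Element-wise absolute percentage differences, then their mean.
    return float(np.mean(np.abs((y_true - y_pred) / y_true)))


print(
    mean_absolute_percentage_error(
        np.array([10, 20, 30, 40]), np.array([12, 18, 33, 45])
    )
)  # prints 0.13125, matching the doctest in the diff
```

Note the epsilon substitution only papers over exact zeros in y_true; MAPE is still ill-behaved when true values are at or near zero, since the percentage error explodes.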
